ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 596:  Wednesday, January 21, 2004

  • "Open-Source E-Voting Heads West"
    Wired News (01/21/04); Zetter, Kim

    Open-source software developed in Australia is the basis for a new electronic voting system to be devised by Scott Ritchie at the University of California at Davis. The 19-year-old college student announced the launch of the nonprofit Open Vote Foundation before the California secretary of state's Voting Systems Panel on Jan. 15; the foundation's mission will be to modify the Electronic Voting and Counting System (eVACS) from Australia's Software Improvements to meet California election standards and make the software freely available to any voting vendors, including those outside of California. The eVACS source code was released online under a general public license, and the system was employed in a 2001 election in the Australian Capital Territory. The Australian system does not use proprietary, secret software developed by private companies, but was produced in collaboration with an independent government entity that issued draft and final versions of the source code online for public review. All e-voting machines in California must include a voter-verified paper trail by July 2006 in accordance with the California secretary of state's mandate, and Ritchie intends to include such a feature in the eVACS modification. The college student insists that the cost of building eVACS-based voting machines will be much lower than that of e-voting systems created by private companies, since the basic materials are inexpensive and the programmers working on the system are volunteers. Software Improvements engineer Matt Quinn welcomes the project, arguing that there is plenty of room in the e-voting machine market for several open-source products as well as proprietary offerings. However, Rebecca Mercuri of Harvard University warns that open-source cannot solve fundamental e-voting security problems by itself, and calls the printed audit trail an essential backup component.
    Click Here to View Full Article

    To learn more about ACM's activities regarding e-voting, visit http://www.acm.org/usacm/Issues/EVoting.htm.

  • "What Will the PC of the Future Look Like?"
    NewsFactor Network (01/20/04); Valentine, Lisa

    PC boxes are set to morph in ways similar to how the computer display has changed over the last few years. Bulky CRT monitors have been replaced with sleek flat-screen designs that are larger while taking up less desktop real estate; meanwhile, PC design has mostly changed color from beige to black. At the Hewlett-Packard Design Center for Business PC Products, innovators are working to make the PC itself less visible and emphasize user interface aspects, says director Randall Martin. He explains that the mouse, display, video camera, and telephone will become more important in PC design as component divisions collaborate more closely, and notes that voice and video integration will mean major PC design changes. Today, consumers care less about CPU speed than in years past, but incremental increases will be needed to take advantage of Microsoft's next-generation operating system, Longhorn. Martin notes that chip makers Intel and AMD do their best to create demand for faster processors, but most users do not think in terms of CPU speed or hard drive capacity; instead, consumers simply want machines that can adequately perform needed tasks. Sun Microsystems is working on new computing technologies that will ensure that processor performance will stay ahead of application demand in the future: Multithreading chips under development, for example, will handle 32 processes in parallel. The chip design is simpler, but replicated in one unit so as to dramatically increase overall throughput, says Sun Microsystems vice president William Vass; he notes that this will lead to "multiple computers on a chip and making the box quite small," while these future PCs will also boast a next-generation bus that operates wirelessly or at light speed.
    Click Here to View Full Article

  • "New Tech Products Mean New Tech Jobs"
    Investor's Business Daily (01/20/04) P. A1; Deagon, Brian

    The Bureau of Labor Statistics estimates that roughly 13,000 jobs for computer system designers were filled from September through November 2003, and Quantit economist Mat Johnson says these figures imply that companies are once again planning to re-staff higher-level IT positions after several years of declining employment. "It suggests that corporations are coming back to the store, looking to buy, and tech firms are reacting to that," he notes. Corporate tech spending is expected to rise 7 percent this year from last, according to an average of various research firms' estimates. Challenger, Gray & Christmas indicates that high-paying industries are taking on more managers and executives, while analysts say tech employers desire professionals who possess skills in teamwork and other intangibles in addition to a solid math and science education. Federal Reserve Bank of New York economist Erica Groshen explains that many industries once regarded as innovative are maturing, which has led to a lot of tech job offshoring; greater innovation is essential to a resurgence in tech job growth. A recent survey sponsored by the Institute of Electrical and Electronics Engineers finds that demand is expected to increase for people skilled in wireless and optical communications, security applications, sensors, and information theory. Engineers listed biomechanical engineering as the hottest field projected for growth, followed by nanotechnology, megacomputing, and robotics; IEEE Spectrum editor Susan Hassler adds that future tech workers will have to be more willing to relocate and retrain as needed. The Bureau of Labor Statistics predicts that technology will account for eight out of the 10 fastest-growing occupations between 2000 and 2010, with application software engineers, computer support specialists, and computer systems engineers comprising the three hottest occupations.

  • "Software Piracy Is in Resurgence, With New Safeguards Eroded by File Sharing"
    New York Times (01/19/04) P. C9; Heingartner, Douglas

    Software piracy is making a comeback, thanks to the advent of free and convenient peer-to-peer (P2P) file-trading programs and the emergence of high-speed Internet access. The Business Software Alliance reckons that software piracy added up to more than $13 billion in lost revenues worldwide in 2002. Tracking pirates down is becoming increasingly difficult for a number of reasons: Some pirates are becoming craftier at concealing their identities, and can churn out pirated versions of programs on P2P networks before the programs are officially released. Motive is another factor--Drew McManus, director of Adobe's antipiracy operation, notes that most digital pirates are trying to make names for themselves. "It's about being the first or the best at cracking," he explains. Software companies' antipiracy strategies vary: Symantec, Adobe, and others have embedded mandatory online activation in some of their products; Symantec's William Plante acknowledges that such registration schemes are not foolproof, but points out that using pirated versions of the software is "extremely inconvenient." Smaller software firms with tight wallets may choose to look upon P2P piracy with ambivalence, even going so far as to promote illegal copying in the hopes of building a customer base. Still another solution involves finding a middle ground between software companies and P2P networks, such as using the networks to promote their copyrighted products; Forrester analyst Josh Bernoff frowns upon this practice, arguing that it legitimizes P2P services.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Of Ants and Online Pirates"
    ABCNews.com (01/20/04); Eng, Paul

    Though peer-to-peer (P2P) file-sharing has taken a hit thanks to the Recording Industry Association of America's aggressive litigation against suspected online music pirates, programmer Jason Rohrer could conceivably give the practice a shot in the arm with MUTE, a freely available P2P software program that thwarts file tracing by relying on the same swarm intelligence exhibited by ants. Tracking files passed along current P2P networks such as Grokster and KaZaA is simple because the files are directly exchanged between computers in small packets using Internet Protocol addresses. The address for each computer on a MUTE network is represented by a random string of digits and characters, and a new random address is created every time a computer links to the MUTE network. A file search request is sent from the originating computer on the MUTE network only to nearby machines the program is aware of; if the file does not reside on those computers, their MUTE software sends out the request to the next group of computers they know about. Much like ants leaving a pheromone trail that other ants can follow to find food, the computer that has the file relays the message along the chain of computers that sent the request all the way back to the originating computer. Rohrer explains that this scheme improves one's chances of locating and retrieving specific files, and boosts the user's anonymity. However, the programmer acknowledges that MUTE, which currently only has about 32,000 users, could lose efficiency as more and more computers join the network. Furthermore, the MUTE network may not escape the scrutiny of the music industry, which has pledged to continue its legal war against file sharers and the providers of file-sharing tools.
    Click Here to View Full Article
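
    The routing scheme described above is easy to sketch. The following toy simulation is invented for illustration (the class names, topology, and address format are assumptions, not MUTE's actual code), but it shows the two key ideas: each node carries a fresh random virtual address per session and knows only its immediate neighbors, and a search floods outward hop by hop while a hit's reply retraces the chain of forwarders.

```python
import random
import string

class Node:
    """Toy node in a MUTE-style overlay: knows only its neighbors."""
    def __init__(self, files=()):
        # A fresh random virtual address each session; no stable IP is revealed.
        self.addr = "".join(random.choices(string.ascii_lowercase + string.digits, k=12))
        self.neighbors = []
        self.files = set(files)

def flood_search(origin, filename, ttl=5):
    """Flood a request neighbor-to-neighbor; return the path to a hit, or None.

    Each node only learns which neighbor handed it the request, so the reply
    can retrace that chain, much as ants follow a pheromone trail back to food.
    """
    frontier = [(origin, [origin])]   # (node, path taken so far)
    seen = {origin.addr}
    for _ in range(ttl):
        next_frontier = []
        for node, path in frontier:
            if filename in node.files:
                return path           # the reply flows back along this chain
            for nb in node.neighbors:
                if nb.addr not in seen:
                    seen.add(nb.addr)
                    next_frontier.append((nb, path + [nb]))
        frontier = next_frontier
    return None                       # not found within the hop limit

# Build a small line topology: a - b - c, with c holding the file.
a, b, c = Node(), Node(), Node(files={"song.ogg"})
a.neighbors = [b]; b.neighbors = [a, c]; c.neighbors = [b]
path = flood_search(a, "song.ogg")
print(path[-1] is c)  # True: the last hop on the path holds the file
```

    Because the reply travels back along the forwarding chain rather than directly between endpoints, no node on the path can tell whether its predecessor originated the request or was merely relaying it, which is the source of the anonymity the article describes.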

  • "Wireless E-Voting Machines Raise Concern"
    New Scientist (01/20/04); Biever, Celeste

    The potential for Diebold Electronic Voting Systems' AccuVote-TSx machines to wirelessly transmit vote tallies is causing concern among computer scientists. Though the TSx units lack the cards required to establish wireless network connections, they do possess the PCMCIA slots that the cards can be inserted into, and Mark Radke of Diebold acknowledges that sliding in the card and configuring the machine is all that is necessary to deploy wireless capability "if required by the jurisdiction." If certified by the federal government, the TSx will be used in the upcoming November presidential election. E-voting advocates assert that wireless e-voting speeds up elections and ensures fairness: A central server could rapidly collate all the votes from a polling station, sparing officials the headache of physically collecting the memory cards from all e-voting terminals, while software could also be updated remotely via a wireless link. But though Doug Jones of the University of Iowa admits that wireless e-voting's benefits would be significant, he maintains that there are far more drawbacks than advantages. Some researchers note that the presence of a PCMCIA slot raises the danger of a fake election official or voter clandestinely putting in a wireless card, which would complicate monitoring for hackers. Even those who think encryption could be the key to transmitting critical information securely harbor doubts that this will ever come to pass. "What concerns me is that the poll workers would need to be technically competent and sensitive to sophisticated machines," notes Robert Kibrick of the University of California, Santa Cruz.
    Click Here to View Full Article

    To learn more about ACM's concerns regarding e-voting, visit http://www.acm.org/usacm/Issues/EVoting.htm.

  • "Spam Fighters Compare Notes"
    IDG News Service (01/16/04); Roberts, Paul

    Anti-spam weapons were the focus of the second annual MIT Spam Conference, where topics ranged from the effectiveness of Bayesian spam filters to litigation to user authentication. John Praed of the Internet Law Group explained that strong lawsuits against spammers can only be built through close collaboration between spam-filter writers and law enforcement, while UnSpam cofounder Matthew Prince advised email providers and law enforcement to become more proficient at leveraging existing statutes to halt email harvesting and other practices that support spammers. Conference attendees said Bayesian filters did little to stop the onslaught of junk email, and discussions were held on how their performance and accuracy could be improved through such strategies as deploying them on servers instead of on email clients. Yahoo! representative Laura Yecies supported the implementation of user authentication measures, particularly domain keys that verify senders of messages by employing public key encryption technology at the domain level; she explained that ISPs can permit authenticated emails to circumvent spam filters via domain keys, while even small online businesses would be able to avail themselves of the technology, which is free. The majority of 2004 Spam Conference attendees doubted that spam would be eliminated in the near future. Praed noted that spammers can mask the source of their email by using networks of susceptible home computers, and evade litigation by moving their operations to offshore locations. Nevertheless, there was general consensus that spammers are being impacted by the anti-spam movement--for example, spam lawsuits, filtering, and other efforts are definitely raising the cost of doing business.
    Click Here to View Full Article
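
    For readers unfamiliar with the Bayesian filters debated at the conference, the following sketch shows the core idea in miniature. It is an illustrative naive-Bayes scorer with invented training data, not any vendor's implementation: token frequencies from labelled spam and legitimate mail are combined, in log space with add-one smoothing, into a per-message spam probability.

```python
import math
from collections import Counter

class BayesianFilter:
    """Minimal naive-Bayes spam filter (illustrative sketch only)."""
    def __init__(self):
        self.spam_tokens = Counter()
        self.ham_tokens = Counter()
        self.n_spam = 0
        self.n_ham = 0

    def train(self, text, is_spam):
        tokens = text.lower().split()
        if is_spam:
            self.spam_tokens.update(tokens); self.n_spam += 1
        else:
            self.ham_tokens.update(tokens); self.n_ham += 1

    def spam_probability(self, text):
        # Work in log space with add-one smoothing so unseen tokens
        # cannot zero out the whole product.
        log_spam = math.log(self.n_spam / (self.n_spam + self.n_ham))
        log_ham = math.log(self.n_ham / (self.n_spam + self.n_ham))
        s_total = sum(self.spam_tokens.values())
        h_total = sum(self.ham_tokens.values())
        vocab = len(set(self.spam_tokens) | set(self.ham_tokens))
        for tok in text.lower().split():
            log_spam += math.log((self.spam_tokens[tok] + 1) / (s_total + vocab))
            log_ham += math.log((self.ham_tokens[tok] + 1) / (h_total + vocab))
        # Convert the two log scores back to a probability.
        return 1 / (1 + math.exp(log_ham - log_spam))

f = BayesianFilter()
f.train("cheap pills buy now", True)
f.train("meeting agenda attached", False)
print(f.spam_probability("buy cheap pills") > 0.5)   # True
```

    Running the same scoring logic on a server rather than in each email client, as discussed at the conference, changes only where the token counts live, not the arithmetic.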

  • "Software Gives Cell Customers Say Over Who's Tracking Them"
    Associated Press (01/19/04); Meyerson, Bruce

    Bell Labs researchers will detail a network software engine that allows cell phone users to pick and choose when, where, and with whom to share location data, as well as what specific data should be shared, at this week's 2004 IEEE International Conference on Mobile Data Management. This personalized location information sharing can be achieved without overtaxing the network's computing power by utilizing a "rules-driven" programming strategy, the researchers claim. The breakthrough could be an important step toward the introduction of wireless "location-based services" that customers will appreciate for their convenience and non-intrusiveness. Examples of such services include restaurants and other businesses sending text messages to cell phones when users come into close proximity, or services that locate customers and co-workers. Most cell phone owners, however, do not like the idea of their movements being monitored 24/7, which is what makes personalization so desirable. The flexibility offered by such personalization is highly sought after by wireless companies that wish to service numerous customers on a single network. Bell Labs declares that negotiations are underway with wireless operators to test the technology, which could be ready for the commercial market by 2005.
    Click Here to View Full Article
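
    The "rules-driven" approach the researchers describe can be illustrated with a minimal sketch. Everything below (the Rule fields, the granularity levels, the example rules) is an assumption made for illustration, since the article does not describe the Bell Labs rule language itself; the point is that each location request reduces to a cheap scan over declarative user-authored rules rather than arbitrary per-user code, which is what keeps the per-request cost low on a shared network.

```python
from dataclasses import dataclass
from datetime import time

@dataclass
class Rule:
    """One user-authored sharing rule: who may see location, when, at what detail."""
    requester: str     # party asking for the location
    start: time        # window during which sharing is allowed
    end: time
    granularity: str   # "exact", "city", or "none" (levels are assumed)

def resolve(rules, requester, now):
    """Return the location detail to disclose, defaulting to no sharing."""
    for r in rules:
        # First matching rule wins; unmatched requests reveal nothing.
        if r.requester == requester and r.start <= now <= r.end:
            return r.granularity
    return "none"

rules = [
    Rule("boss", time(9, 0), time(17, 0), "city"),      # work hours only, coarse
    Rule("family", time(0, 0), time(23, 59), "exact"),  # always, precise
]
print(resolve(rules, "boss", time(12, 0)))       # city
print(resolve(rules, "boss", time(22, 0)))       # none
print(resolve(rules, "advertiser", time(12, 0))) # none
```

    The default-deny behavior mirrors the preference the article notes: users are not tracked unless a rule they wrote explicitly permits it.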

  • "Is the War on File Sharing Over?"
    Salon.com (01/15/04); Manjoo, Farhad

    Recording Industry Association of America (RIAA) officials are confident that the hundreds of lawsuits the organization filed against individual users of file-trading services have led to a significant decrease in song swapping, while growth in legitimate online digital music sales has started to reverse the erosion of music industry revenues. But critics dismiss these conclusions and claim song swapping has actually risen since the lawsuits began, while strong sales numbers for the iTunes Music Store--considered to be the standard-bearer of legit digital music services--are not representative of the music business as a whole. The RIAA bases its findings on studies such as a 2003 Pew Internet & American Life Project telephone survey of over 1,300 Internet users, estimating that the percentage of people who admitted to downloading songs off the Internet fell from 29 percent in March to just 14 percent in November and December; however, other groups take such estimates to task, arguing that many respondents gave false information. The odds are good that RIAA lawsuits will become less effective in 2004, considering that the music industry is now required by law to present its case to a judge whenever it plans to sue a file trader. In addition, the threat of litigation is unlikely to deter the legions of engineers developing more advanced peer-to-peer offerings. If, however, the RIAA's claims are accurate, critics such as Electronic Frontier Foundation attorney Fred von Lohmann doubt that authorized digital song outlets will offer music fans as much freedom as free music services such as Napster. He proposes that a blanket license system be implemented in which consumers pay a flat fee to download as many copyrighted tracks as they want, with the recording industry splitting the money according to standard downloading measurements. Von Lohmann is confident that a blanket license scheme will emerge by year's end as music fans become more estranged and music revenues continue to fall off.
    Click Here to View Full Article
    (Access to full article available to paid subscribers only.)

  • "IT Industry Watches Iowa"
    CNet (01/19/04); McCullagh, Declan

    Democratic presidential hopefuls have had little to say on technology issues, but with the Iowa caucuses and the forthcoming New Hampshire primary, lobbyists are keeping a close eye on candidates' positions on IT offshore outsourcing, a hot-button topic that many companies support as a competitiveness-sustaining measure. Reps. Dennis Kucinich (D-Ohio) and Richard Gephardt (D-Mo.), along with Sen. John Kerry (D-Mass.), are taking an anti-offshoring stance: Kerry's campaign Web site promises that Kerry would, if elected, attempt to slow down offshoring; the Massachusetts senator also introduced legislation last November calling for employees of offshore call centers to identify their location. Information Technology Association of America President Harris Miller argues that such a policy threatens to erode the United States' leadership position on trade issues, and expresses concern that, "in their eagerness to create a policy issue, some [candidates] have engaged in a lot of antitrade rhetoric and antiglobalization rhetoric." Electronic Industries Alliance President Dave McCurdy singles out Sen. Joseph Lieberman (D-Conn.) for being more simpatico with industry on tech issues than the Bush administration; prior to the 2000 election, Lieberman favored continuing the Internet tax moratorium, removing the H-1B visa cap, promoting antispam bills, and extending the research and development tax credit. McCurdy also praises Kerry for his savvy on tech issues, but says Gephardt's stance on trade and outsourcing puts him out of step with the industry. Howard Dean stands out for an Internet policy paper calling for the protection of the Internet's end-to-end nature while at the same time promoting greater federal spending on universal Internet access, while former Army Gen. Wesley Clark's Web site features no technology section. Sen. John Edwards' (D-N.C.) opt-in spyware bill gives him a strong privacy record, according to Chris Hoofnagle of the Electronic Privacy Information Center, but political experts generally agree that technology will not be a major theme of the presidential election. Adam Thierer of the Cato Institute predicts that both President Bush and the Democratic candidates "will probably just play it safe and stick to bland platitudes and generalities about how 'technology is vital to the U.S. economy.'"
    Click Here to View Full Article

  • "LWM Speaks With Richard Stallman"
    LinuxWorld (01/19/04); Bedell, Kevin

    GNU Project founder Richard Stallman says the free software movement is based on the desire for technical and community freedom, and requires an entirely non-proprietary system: That is the fundamental reason for the GNU Project and GNU operating system. Proprietary software inhibits people's freedom to help themselves by modifying products, and help others by distributing their improvements; in contrast, free software is written entirely for the benefit of users without any hidden agenda. Stallman says proprietary software products often have hidden features that actually go against users' interests, but are impossible to remove except by the vendor. The GNU Project is analogous to a new continent where programmers can build new enterprises without artificial inhibitions: People can contribute to the free software movement by writing new programs or technical manuals for existing programs. Stallman says there is tremendous need for technical manuals, while political organization and advocacy are important contributions as well, as is convincing one's own company to adopt free software solutions. Another important step people can take to help the free software movement is to clarify the difference between free software and open-source software, and to clarify the difference between the GNU system and Linux. Linux is a kernel that came to be used with the GNU system, and it is released under the GNU Project's free software license, the GNU General Public License. Open-source advocates promote free software for technical reasons, not ethical ones. Stallman says the free software movement is fundamentally different because it is premised on freedom, not just technical superiority, and he complains that the open-source philosophy sees no harm in adulterating free systems with non-free software.
    Click Here to View Full Article
    (Access to article available to paying subscribers only.)

  • "Software Repairs Itself on the Go"
    Technology Research News (01/21/04); Patch, Kimberly

    MIT researchers have developed an ad-hoc method of data repair that permits a system to continue computing while repairs take place, using funding from the National Science Foundation, the Defense Advanced Research Projects Agency, and the Singapore-MIT Alliance. The technique is less costly and more resource-efficient than current log-rollback-and-replay methods, which means it could be applied to more applications. The method analyzes abstract representations of data to ascertain relationships between data objects rather than searching for computer code errors. MIT's Martin Rinard says that upon the identification of an inconsistency the system performs a series of repair actions: Such actions include data reconstruction, removal of corrupted elements, initialization and insertion of missing components, the creation of self-consistency by variable value shifts, or a combination of the above. By making an inconsistent state consistent, the method allows a system to continue functioning even in the face of errors that would normally cause it to fail, Rinard notes; in addition, the software overhead of failure recovery is reduced. "One of the goals of our research is to automate the technique to the extent that it becomes feasible to apply routinely to all kinds of software," says Rinard. "Our research may turn out to [enable] the systems to detect and repair damage to themselves so that they can continue to execute successfully without human intervention." On the other hand, the MIT researcher acknowledges that data could be destroyed by error, in which case it makes more sense to permit system failures and wait for human intervention.
    Click Here to View Full Article
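
    A minimal sketch of detect-and-repair in this spirit follows; the data model and the repair actions are invented for illustration and are not the MIT implementation. The idea: state a consistency property over a structure, and when it is violated, patch the structure into a consistent state (drop corrupted references, re-insert missing components, shift illegal values back into range) instead of halting the program.

```python
def repair_inventory(inv):
    """Restore consistency in-place and report what was repaired.

    Assumed consistency model: every name in inv["index"] must exist in
    inv["items"], every item must appear in the index, and counts must be
    non-negative. Violations are repaired rather than treated as fatal.
    """
    actions = []
    # Remove index entries pointing at missing items (corrupted references).
    for name in list(inv["index"]):
        if name not in inv["items"]:
            inv["index"].remove(name)
            actions.append(f"dropped dangling index entry {name!r}")
    # Insert missing index entries for items that exist (missing components).
    for name in inv["items"]:
        if name not in inv["index"]:
            inv["index"].append(name)
            actions.append(f"re-indexed {name!r}")
    # Shift out-of-range values back into the legal domain.
    for name, count in inv["items"].items():
        if count < 0:
            inv["items"][name] = 0
            actions.append(f"reset negative count for {name!r}")
    return actions

inv = {"items": {"bolt": -3, "nut": 7}, "index": ["bolt", "ghost"]}
log = repair_inventory(inv)
print(inv["items"]["bolt"])     # 0
print("ghost" in inv["index"])  # False
print("nut" in inv["index"])    # True
```

    As the article notes, the repaired state is merely a consistent one, not necessarily the state a correct execution would have produced, which is why the researchers concede that some errors are better left to human intervention.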

  • "Companies Tossing Aside Consumers' Freedoms"
    SiliconValley.com (01/18/04); Gillmor, Dan

    Dan Gillmor writes that technology companies' fealty to avaricious entertainment cartels that seek to protect intellectual property at the expense of consumer freedoms was evident at this month's Consumer Electronics Show (CES), although some consumer-friendly products did stand out. Microsoft, Intel, and Hewlett-Packard showcased technologies and initiatives that adhere to the entertainment industry's restrictions on the distribution of copyrighted digital content. HP CEO Carly Fiorina officially promised that her company would remain loyal to entertainment providers by blocking the use of unauthorized content in forthcoming consumer electronic products; Microsoft, meanwhile, is bundling content-management controls and other digital usage limitations into its products, while Intel's effort to enhance PC security could also mean less customer-friendly wares. Gillmor moderated a panel at CES where panelists agreed that users of high-definition TV would enjoy limited usage rights. TiVo will provide a service this year that lets customers view, archive, and replay content in multiple home locations and even on PCs in certain situations, but the devices will not be employed to their full potential. Gillmor writes that it is "all too likely" that the entertainment industry will attempt to impose its usage restrictions on more consumer-friendly offerings spotlighted at CES, such as Elgato Systems' Eye TV line, which allows users to view, record, edit, and store TV on the Macintosh.
    Click Here to View Full Article

  • "The Forest vs. the Trees"
    Computerworld (01/12/04) Vol. 32, No. 2, P. 40; Hoffman, Thomas

    Economists and academics say current measures of IT-driven productivity do not take into account intangible aspects such as increased value to the customer. Macro-level research that looks at government statistics does not help companies make decisions, according to business observers such as Computerworld columnist and former CIO Paul A. Strassmann; he and colleagues at IT consultancy Alinean surveyed 20,000 firms to find the ratio of sales, general, and administrative (SG&A) costs to cost of goods. That specific figure provides a measure of a company's efficiency, but has remained flat for the last 10 years due to dot-com-era IT investments, rapidly increasing health care costs, and higher salaries. "Build-and-junk" cycles also prevent companies from measuring the full effect of their investments. Such specific measures do not account for added customer value, argue academics such as Cap Gemini Ernst & Young CTO John Jordan. General Motors chief strategy officer Daniel G. McNicholl says IT investments at his company have reduced conflicts between the manufacturer and its dealers--an intangible benefit not taken into account by micro-level studies. Strassmann responds by saying that company boards of directors cannot effectively interpret broad research into strategy; more detailed studies could lend insight into how instant messaging and Web surfing put a drag on productivity in the workplace, for example. Recently, Harvard University's Dale Jorgenson reported that IT has boosted productivity in the services sector where investment has been heaviest. Harvard Business School senior associate dean F. Warren McFarlan says there is a need for both macro- and micro-level studies, but that effects of new products and efficiencies require micro-level scrutiny.
    Click Here to View Full Article

  • "U.S. Stays on Top"
    CIO (01/01/04) Vol. 17, No. 6, P. 44; Overby, Stephanie

    Although there is little room for doubt that American IT workers will be in less demand by the end of the decade, there is no evidence to suggest that the entire U.S. IT workforce will be replaced by lower-wage professionals in other countries: Application development, system maintenance, programming, legacy, call center operations, and other low-level IT positions are likely to be offshored, but high-level jobs such as strategy development and business process improvement will remain in-country, although professionals would be wise to broaden their skills while CIOs should press for a bigger business education component in U.S. IT degree programs. "A strong investment in education in the U.S. and intelligent U.S. government policies will produce aggressive investment and innovation, and the U.S. will benefit along with the rest of the world from the economic benefits of IT," contends Intel CIO Doug Busch. Gartner estimates that the U.S. IT workforce will have shrunk 25 percent by 2008, while AMR Research's Lance Travis predicts that IT workers whose expertise includes business processes, architecture, strategy, and project management will be retained--and be even more highly valued. He describes such employees as "50/50 professionals," which Gartner VP Diane Morello defines as versatile IT workers skillful in both business and technical matters. Not everyone, however, will want to remain in IT, given what is required of them to attain this versatility. Though the government has a responsibility to foster research and development to encourage innovation, Busch says private-sector CIOs should not wait for this to happen and support their own enterprise R&D investments. Talented IT staff and a healthy R&D focus are critical to maintaining the United States' innovation superiority, but its capitalist democratic model is also a key strength. "The free flow of information and the freedom to develop products is really the U.S.'s ace in the hole," asserts Meta Group's Maria Schafer.
    Click Here to View Full Article

  • "Linux Looks for New Worlds to Conquer"
    Network World (01/19/04) Vol. 21, No. 3, P. 47; Hochmuth, Phil

    Linux will face more challenging business issues than technical ones in the coming year, according to industry experts. The software has established credibility in the data center and has enough implementation references to start really building momentum. A number of independent software vendors are also working to add their enterprise applications to the Linux fold, including SAP and Oracle. IBM Linux general manager Jim Stallings says his company now has about 5,000 applications in its Linux database, but that the operating system needs about 20,000 applications available to most effectively serve enterprise customers. International Data Corp. vice president Dan Kusnetzky says that discussion about Linux adoption needs to move up several tiers in the business organization and that Linux companies need to lay out a good business case as to why Linux is the most economical platform. As Linux grows in the upper IT echelons, it is also reaching out to lower-level devices such as the desktop, mobile phones, personal digital assistants, security devices, and home entertainment devices. The governments of Brazil, China, and Germany have already adopted Linux as an alternative to the U.S.-based Microsoft, and Gartner analyst Michael Silver says that companies can selectively begin to switch to Linux on PCs with limited business functions, such as data entry. With Linux distributions consolidating, application development will become more focused, says Novell vice president Jeff Hawkins, whose company last year bought Linux firms SuSE and Ximian. Linux's flexibility makes it a contender for the small device and embedded systems markets; its superior networking capabilities make it a good choice for next-generation mobile phones, and it is extremely secure when stripped down for use on security switches, firewall boxes, or intrusion-detection boxes.
    Click Here to View Full Article

  • "Automatons Invade Vegas"
    Software Development (01/04) Vol. 12, No. 1, P. 21; Morales, Alexandra Weber

    The Palo Alto Research Center's (PARC) PolyBot project, funded by the Defense Advanced Research Projects Agency, was partly conceived as an initiative to build robust, versatile, and reconfigurable robots for disaster recovery. The PolyBots, which are currently in their third generation, are modular machines that can automatically reconfigure themselves into different shapes and that boast affordable, easily replaceable components. "A robot that can adapt to an unknown environment is ideal--it can roll, climb and reconfigure itself to form a protective cage around the person under the rubble," notes PARC researcher Craig Eldershaw. First-generation PolyBots are screwed together, Generation II PolyBots use infrared sensors to help facilitate automatic reconfiguration, and Generation III modules are outfitted with a Motorola PowerPC 555 embedded chip with 1 MB of external RAM. Generation I PolyBots employ the multihop RS485 serial protocol for communications, while the performance limitations of RS485 have been overcome in Generation II and III PolyBots with the Controller Area Network (CAN) protocol and a dynamic routing algorithm to split the network into subnets. Each PolyBot module moves according to a "Phase Automata" specified in the user's program via an XML-based language that PARC plans to license; before the machines can be modified for higher-intelligence operations, PARC language researcher Ying Zhang explains that automatic reconfiguration, docking, and reconfiguration planning must be addressed.

  • "The New Enterprise Portal"
    InfoWorld (01/12/04) Vol. 26, No. 2, P. 42; Knorr, Eric

    New kinds of integration and application development are converging at browser-based, business-to-employee (B2E) portals that are quickly assuming the role of enterprise user interface and yielding dramatic increases in worker productivity. Portals have sustained their momentum throughout the economic downturn, while portal server offerings have matured into the current standard of composite control panels, built on existing data and apps, that can be tailored to individual users. At one end of the B2E portal spectrum are enterprise-wide home pages with a small set of functions; at the other are targeted portals designed to handle clusters of related business processes, and these narrower, deeper portals have outclassed "broad and shallow" portals of filtered news feeds and corporate announcements in terms of functionality and employee preference. Applications manifest as "portlets"--customizable graphical objects--in the browser-based portal window, and the homogeneous graphical Web apps environment offered by portal servers makes them ideal as an architecture for future app development. Companies offering portal server suites usually allow advanced users to customize portals as far as simple composite app development while providing an integrated development environment for deeper work; IBM's Larry Bowden believes the best place for portal app development is "where it's a transient kind of integration, and it's an ad-hoc approach to structuring a little mini process." Gartner VP Gene Phifer points out that enterprise application vendors expect to increase revenues by offering the functionality of their apps to more users via portals. Meanwhile, portal servers are building portfolios of features such as search, collaboration, and content management, and independent portlet players will benefit from standardization.
    Click Here to View Full Article

  • "7 Hot Projects"
    Technology Review (01/04) Vol. 106, No. 10, P. 52; Jonietz, Erika

    A diverse array of technologies on the edge of commercialization promises to dramatically improve products and services. IBM's automatic speech translator, which could show up in personal digital assistants or laptops by the middle of next year, is currently in the prototype stage; researchers have developed a laptop that employs speech recognition software to convert spoken words into text, while translation algorithms turn the text into a second language and text-to-speech technology vocalizes the translated phrase. Cynthia Dwork of Microsoft Research conceived of a spam blocker that requires any networked computer to solve a math problem for every email message it sends, and Dwork and several colleagues have worked to adapt the technique to depend on memory latency rather than chip speed so that both newer and older computer systems can benefit. As part of a Defense Advanced Research Projects Agency initiative to create a next-generation supercomputer, Sun Microsystems is working on capacitive coupling, a wireless chip-to-chip communication method in which the movement of a charge through a transistor on one chip disturbs the surrounding electrical field enough to cause an identical charge to pass through the same type of transistor on the facing chip; the technique, estimated to be up to 60 times faster than the fastest existing system, could be introduced into computers within five years. Electrical engineers at Hewlett-Packard Labs are attempting to facilitate end-to-end media delivery across all networks and devices by moving digitized files more quickly--adjusting their transmission format to the devices receiving them--and by placing more content at the network edge. The researchers have developed nodes that can be added to current networks: the nodes transfer media files off individual Web servers and place them closer to users; ascertain the optimal routes to send files; discover what nearby users are watching, determine their viewing and listening preferences, and "pre-fetch" data they will probably want; and detect the specific devices receiving the transmissions and appropriately calibrate the media stream.
    Click Here to View Full Article
    (Access to article available to paying subscribers only.)
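Dwork's "math problem per message" scheme can be illustrated with a hashcash-style proof-of-work sketch. This shows the simpler CPU-bound variant; the adaptation described above instead makes the puzzle memory-bound so older machines are not unfairly penalized. The function names and difficulty setting are illustrative, not Microsoft Research's actual design:

```python
import hashlib
from itertools import count

DIFFICULTY_BITS = 16  # sender-side work; verification stays a single hash

def solve(message: bytes, bits: int = DIFFICULTY_BITS) -> int:
    """Sender: search for a nonce whose digest falls below the target,
    i.e. whose SHA-256 hash starts with `bits` zero bits."""
    target = 1 << (256 - bits)
    for nonce in count():
        digest = hashlib.sha256(message + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def verify(message: bytes, nonce: int, bits: int = DIFFICULTY_BITS) -> bool:
    """Receiver: one cheap hash checks the 'postage stamp'."""
    digest = hashlib.sha256(message + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - bits))

stamp = solve(b"hello, world")
assert verify(b"hello, world", stamp)
```

The asymmetry is the point: a legitimate sender pays a fraction of a second per message, while a spammer sending millions of messages pays millions of fractions of a second, and the receiver's check remains essentially free.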
