Timely Topics for IT Professionals
About ACM TechNews
ACM TechNews is published every week on Monday, Wednesday, and Friday.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.
To send comments, please write to firstname.lastname@example.org.
Volume 4, Issue 438: Monday, December 23, 2002
- "Bush Plan to Monitor Net Raises Stir"
MSNBC (12/20/02); Sullivan, Bob
Among the goals outlined in the National Strategy to Secure Cyberspace is the formation of a Cyberspace Network Operations Center, a hub where ISPs would share information about network traffic in order to forestall cyberattacks. The September draft of the national strategy indicated that this central clearinghouse would be run by industry, but the New York Times reported on Dec. 20 that the plan has been revised to allow for federal control, a maneuver that has provoked worry among ISPs and others that the center could become a tool for online wiretapping. However, a Bush administration official who is close to the matter insists that there are no plans to take control of the center away from the private sector. He did acknowledge that government agencies that focus on cybersecurity--the FBI's National Infrastructure Protection Center and Carnegie-Mellon's Computer Emergency Response Team, for instance--might be merged together under the plan. Nevertheless, Washington lawyer Stewart Baker comments that ISPs are under "an awful lot of pressure" to be included in a central Internet monitoring center, although he has doubts about its effectiveness. Rather than focus on ordinary traffic for signs of terrorist activity, security experts will probably devote their attention to bigger traffic disruptions that could herald online assaults. Baker adds that ISPs are skeptical that the facility's potential security strengths outweigh the risks to people's privacy. An amended version of the National Strategy to Secure Cyberspace will be submitted by the White House in early 2003.
- "Many Tools of Big Brother Are Now Up and Running"
New York Times (12/23/02) P. C1; Markoff, John; Schwartz, John
The government already has eyes and ears observing people's everyday activities, but that data is not currently gathered or analyzed comprehensively in a way that would be useful to intelligence agencies. The Internet, along with new Web technologies such as XML, has helped link thousands of information warehouses holding different types of data--email, cellular phone usage records, toll booth data, e-commerce statistics, and banking information. The controversial Total Information Awareness project would enable the government to tap into these various sources, compare the data, and alert authorities to possible terrorist activity. Besides drawing on conventional digital transactions, the Total Information Awareness project also uses commercially available technology, such as the Groove collaboration software from Lotus Notes creator Ray Ozzie. Groove enables real-time remote collaboration between intelligence analysts at different agencies and hooks up various data analysis software. Critics say such a system compromises civil liberties, while others argue it is unworkable. Dorothy Denning, a professor in the Naval Postgraduate School's Department of Defense Analysis, doubts the government can connect the right dots fast enough to avert a terrorist strike since it does not know exactly what to look for. The idea for the project was conceived at the Defense Advanced Research Projects Agency (DARPA). A DARPA-sponsored advisory group of policy, technology, and intelligence experts from both government and the private sector debated the project in three meetings after the Sept. 11 attacks. Don Upson, WebMethods senior vice president and former technology secretary for Virginia, says the debate over the project is healthy, since it will produce the data analysis policies needed in the future.
(Access to this site is free; however, first-time visitors must register.)
- "File Swapper Eluding Pursuers"
Washington Post (12/21/02) P. A1; Cha, Ariana Eunjung
File-trading software Kazaa is proving a slippery adversary for the music and movie industries seeking to shut it down through legal means. The system was built in Estonia by three freelance computer programmers who were commissioned by a man in the Netherlands, and is now owned by a company incorporated in the island nation of Vanuatu but operated from Australia. The multitude of legal targets, all situated in different legal jurisdictions, is part of a strategy meant to avoid prosecution, according to the industry plaintiffs. Those same forces were able to shut down Napster, which was more centralized in terms of management and technology than Kazaa. Kazaa has built a far greater following than Napster ever had, with over 3 million users online at any time sharing files--twice the number Napster was able to claim in its heyday. In their suit, the Recording Industry Association of America and the Motion Picture Association of America persuaded a U.S. judge to rule that the programmers who made Kazaa must speak to the plaintiffs' representatives. The goal is to obtain technical details of the Kazaa program as well as ways that it might possibly be shut down. However, an Estonian judge has ruled that the programmers do not need to respond to the summonses. Meanwhile, a Russian programmer known as Yuri created Kazaa Lite, a rogue version of the program without built-in advertising capabilities. And Niklas Zennstrom, one of the two Scandinavian businessmen who commissioned the creation of the original Kazaa, has hired technology firm Altnet to create a new system that would allow for content distribution control, such as the ability to charge fees, assign limited use, and find files from the nearest network node.
- "Electronics Makers Give Little Respect to Consumers' Rights"
SiliconValley.com (12/23/02); Gillmor, Dan
Electronics providers routinely ban consumers from modifying the products they buy, a practice that is "blatantly anti-competitive," writes Dan Gillmor. For example, DVD movies typically contain software code that limits which players they will run on and in which regions. This is an example of how the entertainment cartel is attempting to control all distribution of its products, as well as the resulting profits. Games, movies, and other content are released only when and where the industry decides, and the price for what are basically the same products can vary from region to region. This is nothing more than market manipulation, Gillmor charges. Meanwhile, in Hong Kong, Sony, Nintendo, and Microsoft are suing Lik Sang for selling hardware modification chips that allow consumers to reconfigure game consoles to play unauthorized copies of copyrighted games as well as other kinds of software. The console manufacturers and the court appear to be ignoring that such devices could be used for legitimate purposes. Gillmor notes that the recent acquittal of a Russian software company accused of violating the Digital Millennium Copyright Act by selling a program that disables the copy safeguards in e-book software could set a precedent for challenging the law. On the other hand, he fears it could trigger a flood of similar corporate prosecutions that seek to suppress consumers' reasonable and eminently legal use of products.
- "Digital Copyright: A Law Defanged?"
Business Week Online (12/19/02); Salkever, Alex
Although cyber-libertarians declare the acquittal of Russian software firm ElcomSoft by the U.S. District Court in San Jose on charges of violating the Digital Millennium Copyright Act (DMCA) a triumph for their side, it does not really set a precedent for challenging the interpretation of the law. For that to happen, a similar case would have to be brought before an appellate court. However, the ruling may make federal prosecutors exercise caution before deciding to charge so-called DMCA violators. This could encourage researchers to pursue tech innovations without fear of lawsuit. To successfully convict someone of breaking the DMCA, prosecutors must demonstrate that the offender "willfully" made technology to bypass copyright protection, but proving this in the ElcomSoft case was difficult because there is no Russian counterpart to the DMCA. Another factor that may have worked against the federal case was the trial's location in Silicon Valley, where the jury was more likely to be sympathetic to technology interests. "This is a good reminder that the statute is more limited than people think," notes George Washington University professor Orin S. Kerr.
To read more about DMCA, visit http://www.acm.org/usacm.
- "Nanotech Pioneer Looks Ahead"
Investor's Business Daily (12/23/02) P. A5; Tsuruoka, Doug
Nanotechnology pioneer and IBM Fellow Don Eigler believes electronic devices that exist on the molecular scale will one day revolutionize mankind's way of life, although researchers have only just begun to tap nanotech's vast potential. He explains that his team's development of the new molecule-cascade technique shows that nanometer-scale computation is feasible, as demonstrated by an atom-sized switch created with the method. Although many people say a practical use for nanotech will not emerge for several decades, Eigler notes that nanoscale structures are already in use, including semiconductor components and tiny disk drive recording heads. He says nanotech will allow computers to continue shrinking in size while raising their performance, integration levels, and power efficiency. This could eventually lead to innovations such as a wristwatch-sized device with the processing power of 100,000 or even a million desktops. Eigler cannot predict exactly when such a device will be developed, given that science itself is inherently unpredictable. There are claims that producing nanotech devices commercially is too expensive, but he thinks it will one day become possible to mass-produce nanotech relatively cheaply. Although he says, "We're very much in the age of exploration when it comes to nanotechnology," he also says, "We've surprised ourselves with the progress we've been able to make."
- "Computers Just Doing What Comes Naturally"
Melbourne Age (12/20/02); Barker, Garry
Computer technology needs to become easier to use, not necessarily more powerful, according to Telstra Research Laboratories CEO Hugh Bradlow. Telstra's IT research facilities are the largest in Australia, and workers there are developing new and better computer interfaces. The latest product is Lyrebird, a speech-recognition program that translates spoken commands into actionable computer code. The program works in tandem with other speech applications, and requires speech samples from the user in order to build a grammatical foundation it can understand. Bradlow says Telstra researchers are also developing a common sign-on capability for multiple platforms, such as mobile devices, PCs, and TVs. The idea is to allow users to receive the same customized information and messages in whatever format is convenient. Other efforts focus on making computing more accessible to people with physical and cognitive disabilities. Bradlow says miniaturization led to lower costs, which in turn allowed technology to be used in new and creative ways. Wireless base stations available today for a few hundred dollars, for example, would have cost millions 20 years ago. Bradlow says this type of commoditization will continue and will make today's high-end capabilities readily available to everyone in decades to come.
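The article does not describe Lyrebird's internals, so the following is only a toy illustration of the general idea behind grammar-based command interpreters: recognized phrases are matched against a small grammar, and named slots in each matched phrase are filled into a command template. The patterns and command names here are invented for the sketch, not Lyrebird's actual grammar.

```python
import re

# Toy grammar: each pattern maps a spoken phrase to a command template.
# Patterns and command names are hypothetical, for illustration only.
GRAMMAR = [
    (re.compile(r"open (?P<app>\w+)"), "launch('{app}')"),
    (re.compile(r"call (?P<name>\w+)"), "dial(contact='{name}')"),
]

def interpret(utterance):
    """Translate a recognized phrase into a command string, or return
    None when the phrase falls outside the grammar."""
    normalized = utterance.lower().strip()
    for pattern, template in GRAMMAR:
        match = pattern.fullmatch(normalized)
        if match:
            # Fill the template's slots from the pattern's named groups.
            return template.format(**match.groupdict())
    return None
```

For example, interpret("Open mail") yields the command string "launch('mail')", while an out-of-grammar phrase yields None; a real system would of course build its grammar from the user-supplied speech samples the article mentions.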
- "An Aria With Hiccups: The Music of Data Networks"
New York Times (12/19/02) P. E7; Eisenberg, Anne
Stanford University music professor Chris Chafe has developed a new acoustic method of monitoring network performance. Just as a guitar string sounds a higher pitch when its vibrating length is shorter, network signals are assigned higher pitches the faster they travel to and fro across the network. When data is lost, the acoustic diagnostic system renders the network "jitter" as staccato pulses. The result, say Chafe and other experts familiar with the work, is a system that promises to add another, more intuitive dimension to network performance monitoring. Network connections currently can be checked with programs that offer visualizations, or by measuring latency with a "ping," a test signal sent to the destination and returned. Network performance is a critical factor in collaborative Internet applications, such as telesurgery. Doctors performing remote operations, for example, would be able to more easily factor in network latency while doing their job. Likewise, videoconferences are often hampered by lagging audio signals, negating the feeling of immediacy and closeness the technology is supposed to provide, according to McGill University researcher Jeremy R. Cooperstock. He is working with Chafe to integrate audio diagnostics into his own high-performance teleconferencing work. The two researchers recently organized a collaborative musical performance for musicians at their respective locations in California and Montreal.
(Access to this site is free; however, first-time visitors must register.)
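The core mapping the article describes--faster round trips sound higher, lost data becomes gaps--can be sketched in a few lines. This is not Chafe's system; it is a minimal illustration in which round-trip time is measured with an ordinary TCP connection (a stand-in for "ping") and converted to a frequency, with the base pitch and reference latency chosen arbitrarily.

```python
import socket
import time

def measure_rtt(host, port=80, timeout=2.0):
    """Measure the round-trip time of a TCP handshake to `host`,
    in seconds; return None when the connection fails (a 'lost' sample)."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            pass
    except OSError:
        return None
    return time.monotonic() - start

def rtt_to_pitch(rtt, base_hz=220.0, ref_rtt=0.5):
    """Map a round-trip time to a frequency: faster round trips sound
    higher; halving the RTT raises the pitch by an octave. A lost
    sample maps to None, i.e. the silence of a staccato gap."""
    if rtt is None:
        return None
    return base_hz * (ref_rtt / max(rtt, 1e-4))
```

Feeding a stream of such pitches to a synthesizer would yield a continuous tone that climbs as the link speeds up and sputters into staccato gaps when samples are lost, which is the intuition the article attributes to the system.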
- "Agencies Seek Stronger Controls on Trade in Dual-Use Technologies"
National Journal's Technology Daily (12/18/02); New, William
Agencies in the Bush administration want to curb the export of dual-use technologies--commercial products that have military applications as well--to unfriendly nations by raising awareness among "transshipment" countries that lack effective export controls. Terrorists' increasing use of digital tools is adding urgency to the initiative. Targeted dual-use technologies the United States restricts to certain countries include electronics, information security and telecommunications, computers, and lasers and sensors. John Schlosser, director of the Office of Export Control Cooperation at the U.S. State Department's Bureau of Nonproliferation, said in Bangkok last week that arms dealers take advantage of such technologies' dual nature to hide their military capabilities and fool government officials and legitimate businesses at transshipment ports. To combat this, the United States is helping equip such nations with technology that monitors shipments, such as radiation detectors, handhelds, and cargo-imaging devices. Meanwhile, the Transshipment Countries Export Control Initiative (TECI), run by the Commerce Department's Bureau of Industry and Security, is a program designed to help those nations develop effective export-control policies via government-to-government and government-to-industry collaboration. The Commerce Department's Kharan Bhatia explained in Bangkok last week that TECI's goals are to boost awareness, establish communications channels, and devise "best practices."
- "New Heights for Wireless Net Access?"
ABCNews.com (12/20/02); Eng, Paul
Sanswire Technologies, among other companies, is working on high-altitude solutions to increase the range of high-speed Internet access, using new airships to deliver wireless communications services. In collaboration with Canada's 21st Century Airships, Sanswire is developing the Stratellite, a spherical blimp designed to carry as much as 4,000 pounds of telecommunications equipment and float it up to 13 miles above the ground, where it can deliver Wi-Fi voice and data services across 300,000 square miles. At that altitude, the Stratellite's operations will not be disrupted by air traffic or weather; furthermore, the lack of external gondolas or control fins will make the vehicle highly aerodynamic and efficient, while its spherical configuration obviates the need for wide turns, unlike the cigar shape of current airships, notes Sanswire CEO Michael K. Molen. In addition, all of the communications gear is incorporated into the Stratellite's Kevlar fabric. 21st Century Airships CEO Hans Colting reports that a spherical craft that can reach heights in excess of 7.5 miles will be tested next year, but it will be some time before they can test the vehicle at the 13-mile altitude. Analyst Tim Barajin notes that several factors could work against the concept, notably the FCC's hesitancy to allow Wi-Fi operations to increase their range, and possible moves by the Department of Defense to limit Wi-Fi systems on the grounds that the transmissions could affect radar systems. Back in July, SkyTower demonstrated a solar-powered plane capable of transmitting voice, data, and video images to handheld cell phones from an altitude of over 12 miles. Before the technology can take off, the aircraft will have to be equipped with new power systems--possibly modified fuel cells--to ensure that the communications equipment can function for long periods.
- "Voice Holds the Key"
BBC News (12/21/02); Hardy, Ian
Speech recognition technology carries much more weight as a result of Sept. 11 and its aftermath, which includes a new focus on security, especially at the corporate level. Thanks to recent advances, voice recognition companies such as Phonetic Systems are marketing biometric voice-pattern authentication systems to enhance building security. With Phonetic's system, a person speaks his or her name into a telephone in the lobby, providing a voice sample that is authenticated, thus authorizing access. "That's why we have verification where you don't have to say a specific password, where you can just talk," notes voice biometrics expert Dr. Judith Markowitz. Meanwhile, a tool from Nuance requires a person to speak into a telephone handset, where a computer compares the voiceprint against a vast database, using sound markers to find a match. Requiring a live voice rather than a recording preserves security, since tape recordings cannot precisely replicate a voiceprint. Speechworks International's Stuart Patterson acknowledges that changes in voice--due to a cold, for instance--could make voiceprint recognition difficult. That possibility would necessitate a secondary means of identification, such as asking for the person's Social Security number.
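The matching step the article describes--comparing a voiceprint against a database of enrolled speakers and accepting only a sufficiently close match--can be sketched abstractly. Real systems such as Nuance's use sophisticated acoustic models; this toy version assumes each voiceprint has already been reduced to a numeric feature vector, and simply picks the most similar enrolled speaker by cosine similarity, rejecting matches below a threshold.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def best_match(sample, database, threshold=0.9):
    """Return the enrolled speaker whose stored voiceprint is most
    similar to `sample`, or None if no score clears the threshold
    (the equivalent of denying building access)."""
    scores = {name: cosine_similarity(sample, print_)
              for name, print_ in database.items()}
    name, score = max(scores.items(), key=lambda kv: kv[1])
    return name if score >= threshold else None
```

The threshold captures the trade-off Patterson alludes to: set it high and a speaker with a cold is falsely rejected and must fall back on a secondary identifier; set it low and impostors slip through.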
- "The Notebook Vs. Desktop Popularity Contest"
NewsFactor Network (12/18/02); Zager, Masha
Notebook computer purchases are on the increase, both because of their decline in price, says IDC analyst Alan Promisel, and because of their rapid performance improvements. Notebooks now rival desktops in speed and processing capacity. Notebooks have fallen in price from an average of $3,000 in 1999 to an average of $1,500 today, and commensurate with falling prices, notebooks have jumped from 17 percent of computer devices shipped in 1999 to 25 percent in 2002's third quarter, according to IDC. During third quarter 2002, the number of notebooks shipped was 18 percent higher than in third quarter 2001, while desktops gained only 5 percent in volume over the same period. Promisel says there is "still a technology gap" separating notebook performance from desktop performance, but that the gap has been greatly reduced. Promisel notes that consumer spending has driven the surge in notebook purchases, though consumer purchases still do not outnumber corporate ones. Consumers are more responsive to falling prices, says Promisel, while Forrester Research analyst Jed Kolko says consumers are turning to notebooks when looking for a second computer. Forthcoming Tablet PCs will attract a great deal of attention, says Promisel, and both analysts predict that Tablet PCs will appeal mostly to the corporate sector and will not elbow out notebooks.
- "The 10 Best Hype Jobs of 2002"
ZDNet (12/17/02); Oltsik, Jon
The marketing of technology again suffered from too much hype this year, as a number of technologies were promoted beyond their ability to deliver, writes Jon Oltsik. Oltsik's picks for the 10 most overhyped technologies of the year are, in order: ROI, the tech recovery, storage management software, self-regulating data centers, iSCSI, 10 Gigabit Ethernet, security, unbreakable Oracle 9i, CRM, and Web services. Web services made the list because the technology is still very underdeveloped, with uncertainties surrounding issues such as data definitions and business semantics, and a lack of application-level security. However, Web services eventually could do wonders for application development and integration, and even revolutionize the use of software. Technology companies also heavily marketed customer relationship management, but the concept was overhyped because people and business processes are also needed if companies are to successfully implement applications that will help them understand and serve their customers better. CRM does not fix business processes or train people, but it can be used to support such efforts. The tech recovery and return on investment were the most overhyped of all: Analysts were still forecasting a tech recovery into the fourth quarter, and tech companies are now saying new technology will help lower costs and increase productivity, rather than claiming IT will drive revenue.
- "Move Over, Silicon"
Economist (12/14/02) Vol. 365, No. 8303, P. 20
The quest to make cheap and flexible electronics has fueled advances in the use of plastic as a semiconductor, and yielded some interesting breakthroughs. Electro-luminescent light-emitting polymers (LEPs) hold great promise as flat panel displays: They have the functionality of silicon chips but are more conductive; they require no backlighting; and they are lighter and consume less power than other display technologies. Furthermore, LEP displays can be sprayed onto a flexible surface via inkjet printing, as demonstrated two years ago by Cambridge Display Technology and Seiko Epson. In the meantime, E-Ink is developing electronic ink that will facilitate the production of flexible displays with a bistable molecular structure that enables the devices to retain information even when they are turned off. This discovery has prompted Hewlett-Packard, Intel, and other chip makers to pursue polymer memories, often in collaboration with firms such as Thin Film Electronics and Coatue. It is hoped that plastic circuitry can also one day be printed on consumer items such as T-shirts and beverage containers. Meanwhile, Bell Labs is researching circuitry that can be stamped onto plastic rolls, a technique that offers greater resolution than inkjet printing, according to Bell Labs' John Rogers. However, for all their advantages, plastic transistors cannot match the switching speed of their silicon counterparts, and the technology will probably find its niche in displays, hybrid plastic-silicon devices, and disposable products.
- "Patenting the Process"
InformationWeek (12/16/02) No. 919; Soat, John; Kontzer, Tony
There is a growing trend of companies claiming ownership of patented e-commerce processes and filing infringement suits against users in order to collect usage fees they feel they are owed. Sheppard, Mullins, Richter, & Hampton attorney Jonathan Hangartner says the Internet's omnipresence and the sheer number of e-commerce patents mean that many users are unaware a given process is patented. Enterprise application-integration tools and services vendor Divine has filed lawsuits against 15 e-commerce firms, alleging infringement of a shopping-cart patent it acquired when it bought Open Market last October, according to assistant general counsel Rich Nawracaj. Meanwhile, Acacia Media Technologies has filed patent-infringement lawsuits against 27 adult-entertainment companies for allegedly violating several digital media transmission patents. Acacia's counsel Robert Berman says the company thoroughly researched the patents over the last year to make sure there was no prior art that could invalidate its claim. He adds that Acacia is entitled to 1.75 percent to 2.25 percent of the revenue generated by the patented technology. Many e-commerce patents are criticized as overly simplistic and too general in their language, but Bromberg & Sunstein Chairman Bruce Sunstein claims this is done so that the processes the patent applicants describe are clear. Many companies opt to settle such lawsuits out of court because they cannot afford to pursue litigation.
- "2002 Year in Review"
InfoWorld (12/16/02) Vol. 24, No. 50, P. 40; Schwartz, Ephraim; Shafer, Scott Tyler; Conolly, P.J.
The past year has witnessed significant developments in the areas of pervasive computing, edge computing, Web services, open source and open standards, and virtualization. The expansion of handhelds, wireless networks, and mobile infrastructure and standards from the likes of IBM, BEA, and Microsoft enabled wireless technologies to penetrate the enterprise even further, an important step on the road to pervasive or ubiquitous computing; the introduction of the 10GbE standard was a notable development, while wireless LAN technology matured. Edge computing emerged in 2002 as an important complement to centralized computing: Aberdeen Group research director Dana Gardner explains that centralized/decentralized computing hybrids will boost productivity and network value, while edge computing and identity management have started to converge to expedite intercompany information exchange and clarify data management procedures. Although Web services still require improved security, they advanced this past year thanks to the development of tools from Google and Macromedia designed to expand their usability, and moves to standardize back-end operations so that Web services can seamlessly interoperate. Open-source technologies, Linux in particular, made progress in 2002 thanks to several trends, including the creation of the Web Services Interoperability Organization. Meanwhile, companies such as Hewlett-Packard and IBM unveiled products and services that support Linux and other open-source technologies. Finally, virtualization made a sizable impact in the storage sector, and is making inroads into data centers. A key breakthrough was the debut of virtualization across heterogeneous systems, although the technology still lacks a universal description.
- "Future Tech: Thinking Machines"
Discover (12/02) Vol. 23, No. 12; Walter, Chip
The development of a thinking machine has repeatedly eluded researchers who follow a top-down approach whereby a computer is used as a model for the brain, but scientists such as neuromorphic engineer Kwabena Boahen theorize that better results can be obtained by patterning circuits after the brain and its neuronal functions. In his opinion, most artificial intelligence efforts failed because researchers did not consider the brain's inherent ability to change its neural pathways in response to the information it receives. At the University of Pennsylvania's Neuroengineering Research Laboratory, Boahen is working on chips that reprogram themselves in a similar fashion, using routers to determine the optimal information pathways. His lab has produced a retinomorphic chip that contains almost 6,000 photoreceptors and 4,000 artificial nerve connections, yet runs on just 0.06 watt of power. Within two years, this microprocessor could be implanted inside an eyeball, Boahen says. Meanwhile, Bruce McCormick of Texas A&M University's Brain Networks Laboratory expects to one day furnish a 3D neural schematic of the human brain with a microscopic camera known as the Brain Tissue Scanner; he anticipates that it will take about two decades to achieve this, once software that can represent vast amounts of visual data from every angle is developed. McCormick adds that building a crude version of the human brain by integrating neuronal maps with self-switching chips will take at least 20 years. A team led by Russell Jacobs at Caltech's Human Brain Project is capturing internal views of fetal animal brains using noninvasive methods such as magnetic resonance imaging to gain insights on the changes brain structure undergoes during development.