Timely Topics for IT Professionals
About ACM TechNews
ACM TechNews is published every week on Monday, Wednesday, and Friday.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.
To send comments, please write to firstname.lastname@example.org.
Volume 5, Issue 477: Wednesday, April 2, 2003
- "Does Security Mean Sacrificing Privacy?"
IDG News Service (04/01/03); Pruitt, Scarlet
ACM's Computers, Freedom, and Privacy (CFP) conference, now underway in New York City, is highlighting how government electronic surveillance efforts have accelerated in the wake of Sept. 11, and topics of discussion will include the Total Information Awareness project, the USA Patriot Act, and the proposed follow-up, Patriot II. Each of these initiatives involves the use of technology to extend law enforcement's powers to gather data and monitor the movements of citizens. Other items to be discussed at the conference include the National Crime Information Center criminal database and a passenger profiling system. The running theme of this year's CFP will be how civil liberties are being impacted by proposed and approved measures in the name of security. Speakers will include members of the ACLU, the Electronic Privacy Information Center, the Center for Democracy and Technology, and various academic institutions, but CFP chair Barry Steinhardt notes there will be a paucity of government representatives. Government officials who will be delivering addresses include former Rep. Bob Barr (R-Ga.) and Rep. Jerrold Nadler (D-N.Y.). Steinhardt adds that stricter U.S. visa regulations will discourage international participation. "It's a shame because CFP's great strength is as a forum of debate," he says. Scheduled sessions at CFP include "The Great Firewall of China--Internet filtering and Free Expression," and "Human Rights and the Internet."
For more information about ACM's CFP03, visit http://www.cfp.org.
- "'Big Iron' Retains Lustre"
Financial Times--FTIT Survey (04/02/03) P. 5; Foremski, Tom
Enterprise applications, scientific research, and other factors are fueling demand for mainframes and supercomputers; Bill Zeitler of IBM Enterprise Systems notes that the mainframe market has remained more or less the same over the last several years, while the server market declined 14 percent and 12 percent in 2001 and 2002, respectively. Zeitler attributes the mainframe's continuing popularity to several trends, including core customers' familiarity with mainframe architecture, server consolidation, and "technology substitution around Linux." Mainframes are also desirable for their security, easy manageability, durability, fault tolerance, and proven operating systems. The impending retirement of a significant portion of IT workers with mainframe skills could put a damper on the market, given the lack of talent needed to replace them; nevertheless, IBM will continue to develop and debut new mainframe systems and apply mainframe technologies to other areas, including grid computing. Another important element of grid computing is supercomputers: Hewlett-Packard's Sara Murphy comments that without grid computing, researchers would have a difficult time sharing supercomputing resources. She notes that robust supercomputers are easier to assemble out of commercial components using clustering software, of which IBM and HP are lead suppliers. Supercomputing's chief market is the life sciences sector, which has used such technology for efforts as diverse as the human genome project, drug discovery, and protein modeling. Supercomputers have also proven useful in product design and development, while the U.S. Energy Department is exploiting some of the most powerful machines to model weather systems and nuclear weapon stockpiles under the aegis of the Accelerated Strategic Computing Initiative (ASCI).
- "ISMA Pushes DRM for MPEG-4"
InternetNews.com (04/01/03); Naraine, Ryan
In an effort to develop digital rights management (DRM) capabilities to shield multimedia content formatted in MPEG-4, the Internet Streaming Media Alliance (ISMA) is moving forward with its Content Protection specification, which provides an encryption blueprint for streaming media and file downloading that can integrate with various key and rights management technology and licensed content protection devices. ISMA President Tom Jacobs says the standard is based on the 128-bit AES encryption spec outlined by the National Institute of Standards and Technology (NIST). "[It's] unencumbered by any additional royalty fees and intellectual property concerns and compatible with established Internet Engineering Task Force [IETF] specifications," he adds. ISMA has released the DRM spec for peer review and has sent out a call for network security, content protection, and cryptography experts to evaluate it; the alliance expects to have the standard finalized in June. The ISMA Content Protection spec is based on the v1.0 standard issued two years ago, and is considered by many to be a major step toward the widespread adoption of compatible, multi-vendor streaming media products and services. Jacobs notes that the spec "builds upon existing open standards and provides a core technical foundation for the protection of digital content."
- "DMCA Critics Decry State-Level Proposals"
CNet (03/28/03); McCullagh, Declan
Critics of the Digital Millennium Copyright Act (DMCA) are alarmed over indications that state legislators are considering proposals that would place even broader restrictions on the circumvention of digital copy-protection safeguards. The Association of Research Libraries, the American Association of Law Libraries, and the American Library Association fired off a letter to Colorado and Arkansas politicians, warning that the amendments they are debating could seriously impair libraries' provision of information services. The Colorado bill would limit the distribution of hardware or software deemed "capable of defeating or circumventing" anti-copying measures, while the government of Texas is considering a proposal that would restrict anonymous Web browsing or email encryption by forcing Internet users to disclose all their communications to communication service providers. "This has obvious implications for the right of anonymity online and the right to create tools such as proxy servers that have a lot of legitimate uses aside from their ability to facilitate anonymous speech," declared Fred von Lohmann of the Electronic Frontier Foundation. The Motion Picture Association of America (MPAA), chief supporter of the state bills, denies that the legislation is similar to the federal DMCA, claiming that it refurbishes cable and satellite protection regulations so they are better equipped to deal with the latest hacking methods. In Maryland, such proposals have been passed into law: It is now illegal for residents in that state to own or distribute software that can convert Web-transmitted music or video to another format without the express permission of the copyright holder. Unlike the DMCA, it features no exemptions for reverse engineering. Virginia, Delaware, Michigan, and Illinois have adopted similar legislation, according to Vans Stevenson of the MPAA.
- "Proposed Encryption Laws Could Prove Draconian, Many Fear"
Associated Press (03/31/03); Jesdanun, Anick
Critics are decrying Justice Department draft legislation that calls for stiffer penalties on the use of encryption in the commission of a crime, arguing that it would negatively impact legitimate applications of cryptography and make little headway in the war on terrorism. "Why should the fact that you use encryption have anything to do with how guilty you are and what the punishment should be?" inquires Stanton McCandlish of the CryptoRights Foundation. "Should we have enhanced penalties because someone wore an overcoat?" If the law were passed, first offenders convicted of crimes involving encryption could face up to five years of imprisonment, while previous offenders could receive up to 10 years. Opponents such as former Justice Department computer crimes prosecutor Mark Rasch worry that the law's provisions could be applied too broadly, to the point that people who accidentally file erroneous electronic tax returns could face such penalties, as could people who fail to pay state taxes attached to online retail transactions. The proposed legislation is a component of Patriot II, a follow-up to the 2001 USA Patriot Act. The proposal is likely to be a topic for discussion at the annual Computers, Freedom, and Privacy conference hosted by ACM in New York this week, which will focus on law enforcement measures being considered in the wake of Sept. 11.
- "High-Performance Computing Clusters Have Gridlike Features"
Tech Update (03/25/03); Enck, John
Enterprise applications will not be suited for high-performance computing clusters (HPCCs) until 2008, but there are several key indicators showing readiness, including processor advances, server density, and application development tools. Unlike grid computing, which involves many computers of different platforms in a dynamic environment, HPCCs are servers of the same general type working on specific applications or computing tasks. Increasingly, such systems run on Linux and are based on Intel architectures, although some make use of reduced instruction set computer (RISC) architecture. Server vendors are keen to sell HPCCs to enterprises because they involve large numbers of servers as well as installation, integration, and application development services opportunities. Intel's 64-bit Itanium architecture is the most advanced processor type available for HPCCs, but is reserved for floating-point intensive parallel applications, while cheaper 32-bit IA-32 architectures are used for integer-oriented applications. HPCC hardware usually uses just one processor instead of multi-processor units because of the parallel nature of the task. Server density is another issue companies will have to consider, and new developments in blade server setups, as well as tighter server rack configurations, are important considerations. Application development tools addressing the HPCC environment have not yet been created, leaving company programmers to build programs by hand, using de facto standards.
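Since HPCC programs are still built largely by hand, the partitioning pattern the article alludes to is worth a concrete illustration. The sketch below is a toy stand-in using Python's standard library rather than an actual HPCC toolkit such as MPI (the function and variable names are our own): it splits an integer-oriented workload into chunks and farms them out to worker processes, much as a cluster programmer would partition work across nodes and combine the partial results.

```python
# Toy sketch of cluster-style work partitioning: split an integer-oriented
# job into chunks and hand each chunk to a separate worker process.
# For illustration only -- not a real HPCC framework.
from multiprocessing import Pool

def partial_sum(bounds):
    """Compute one worker's share of the job: sum the integers in [lo, hi)."""
    lo, hi = bounds
    return sum(range(lo, hi))

if __name__ == "__main__":
    n, workers = 1_000_000, 4
    step = n // workers
    # Partition [0, n) into equal chunks, one per worker.
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(workers) as pool:
        total = sum(pool.map(partial_sum, chunks))  # combine partial results
    print(total)
```

On a real cluster the chunks would travel over a message-passing interconnect rather than to local processes, but the decompose/compute/combine shape is the same.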
- "Are We Doomed Yet?"
Salon.com (03/31/03); Pacotti, Sheldon
Sheldon Pacotti writes that the computerization of information and the spread of networking could lead to what Sun Microsystems' Bill Joy terms "knowledge-enabled mass destruction," in which freely disseminated information accessible to anyone could have deadly consequences. Such threats would most likely take the form of self-replicating technologies such as nanotech plagues, and Joy's proposed solution is to "reconfigure" the concept of limitless knowledge-sharing. This would be in keeping with one of two surveillance models envisioned by science-fiction author David Brin, in which government officials can access a centralized network to keep track of citizens' movements--a setup that is vulnerable to abuse. Pacotti favors the second model, whereby the network can be accessed by anyone regardless of position, and checks and balances are distributed equally. He writes that Brin's scheme can be applied to the Internet and the Information Age, in which the choice people face is strict government regulation of knowledge or unrestricted access to information for everyone. In the end, it will be impossible to utterly eradicate self-replicating threats, but Pacotti believes that an open society model would support much faster and effective responses to those threats. The closed society, which relies on the suppression of information, would suffer much greater damage in such a scenario because only a privileged few would be allowed to develop a solution. "Our best chance of survival lies not in criminalizing certain kinds of expertise or knowledge but in disseminating that knowledge as widely as possible, so that any attack will be met by the widest possible resistance, a citizens' militia of knowledge workers, rather than a handful of cronies in an intelligence agency," argues Pacotti.
- "First American Open in Robot Soccer"
Carnegie Mellon University is hosting the International RoboCup Federation's first American Open in late April to early May, where over 150 researchers and their soccer-playing robots will congregate. The seven-year-old international research and sports initiative aims to push the boundaries of artificial intelligence and intelligent robotics, with the long-term goal of fielding a team of autonomous robots that can beat the human world-champion soccer team by 2050. Most of the teams are from North and South America; leagues include the Small-Size Robot League, the Sony Legged Robot League, and the Simulation League. The event will include a competition and demonstration of urban search-and-rescue robots on a permanent site that will be used by researchers to develop new robots, and will host ASIMO, billed as the world's most advanced humanoid robot. There will also be exhibitions, challenge events, and a robot talent show, according to event chair Professor Manuela Veloso. She notes that the Americas are overdue for a robot open of their own, and points out that Germany, Japan, and Australia have been holding opens for some time now.
- "Big Bang Project Sparks Cosmic Response"
CNet (04/01/03); Fried, Ian
The CERN research institute in Switzerland is the site of the Large Hadron Collider, a particle accelerator designed to test the "big bang" theory by generating particles believed to have existed when the universe was born, if they existed at all. The testing itself will not get underway until 2007, giving scientists four years to build and test a computer network powerful enough to process and store the vast amounts of data that the experiment is expected to yield. The Openlab grid computing network was developed by CERN scientists to evaluate the type of equipment that is likely to be standardized by the time the collider's work begins. IBM Research Fellow Jai Menon says the researchers expect the experiment to generate perhaps 5 petabytes of data annually once the accelerator is fully operational, while Openlab development manager Francois Grey says the collider itself should run steadily for 10 years. He says the network must be ready by 2005 so that researchers can test the simulation software used to measure and re-create the data the collider is supposed to produce. Each of the private companies contributing to the Openlab effort has a vested interest: For IBM, it will be an opportunity to test its new Storage Tank storage management software; Hewlett-Packard and Intel want to publicize Itanium as a next-generation mainstream server technology; and Enterasys Networks will supply 10 Gbps networking equipment. General manager of IBM's grid computing business Tom Hawk explains that both CERN and IBM's business clients face the same conundrum--how to cost-effectively manage a large-scale computing project by tapping into existing computing resources. The Sageza Group's Charles King adds that inclusion in the CERN project is a validation of participating companies' products, and offers an ideal testbed for their technologies.
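For a sense of scale, the quoted figure implies a substantial sustained data rate. The back-of-envelope conversion below is our own arithmetic (decimal units, leap years ignored), not a figure from CERN or IBM:

```python
# Average sustained rate implied by 5 petabytes of experiment data per year.
# Our own arithmetic, not CERN's published figures.
PETABYTE = 10 ** 15
SECONDS_PER_YEAR = 365 * 24 * 3600  # leap years ignored

def implied_rate_mb_per_s(petabytes_per_year):
    """Convert an annual data volume in petabytes to an average MB/s rate."""
    return petabytes_per_year * PETABYTE / SECONDS_PER_YEAR / 10 ** 6

print(implied_rate_mb_per_s(5))  # roughly 160 MB/s, around the clock
```

Sustaining that rate continuously for a decade is what makes the storage-management side of the project a meaningful testbed for the participating vendors.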
- "A Vision of Superefficient Displays"
Business Week (04/01/03); Kharif, Olga
Ching Tang, who bears the title of Distinguished Inventor at Eastman Kodak, is a pioneer of organic light-emitting diode (OLED) technology, having developed a breakthrough technique in 1985. Tang discovered that sandwiching certain organic compounds allowed those materials to emit different colors when charged with low levels of electricity. Because the screen film itself is the source of the light, instead of the back or side light used in liquid-crystal display (LCD) screens, for example, OLED displays today use as little as 30 percent of the energy required for other display technologies. A number of other groups have since jumped onto the technology and are finding ways to make OLEDs cheaper and better. Princeton University researchers are working on print manufacturing methods that could dramatically reduce the cost to make OLEDs in as little as two years. At General Electric, advanced technology leader Anil Duggal says OLEDs are one of six flagship research projects, and could replace incandescent and fluorescent lights. Entire walls or ceilings could be charged to light buildings, shaving 10 percent off U.S. energy consumption, GE predicts. Kodak's Tang is also working to increase the lifespan of OLED displays from approximately 3.4 years of daily, eight-hour use to around 16 years under the same circumstances. His work has meant adding four more layers to the original two-layer OLED, creating a sandwich that is still less than 0.1 micron thick, imperceptible without visual aid. Tang says OLED technology will challenge other display technologies for dominance in TVs and handheld devices within a decade.
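The lifespan figures translate into the total-operating-hour terms in which display longevity is usually quoted; the conversion below is our own arithmetic, assuming the article's eight hours of daily use:

```python
# Convert the article's lifespan figures (years of eight-hour daily use)
# into total operating hours. Our arithmetic, not Kodak's published specs.
HOURS_PER_DAY = 8
DAYS_PER_YEAR = 365

def operating_hours(years):
    """Total operating hours for a given number of years of daily use."""
    return years * DAYS_PER_YEAR * HOURS_PER_DAY

print(operating_hours(3.4))  # about 9,900 hours
print(operating_hours(16))   # 46,720 hours
```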
- "Molecule Toggle Makes Nano Logic"
Technology Research News (04/02/03); Smalley, Eric
Hewlett-Packard Laboratories researcher Pavel Kornilovitch has been working on a toggle switch that can open and close a circuit much like a household light switch--except that this switch exists on the molecular level. For computing, molecular-scale toggle switches could function in very high-capacity computer memory, with the toggle switch's "on" and "off" signals acting as "1"s and "0"s. This innovation would enable computer memory to hold up to 1.5 TB per square inch, 40 times the capacity of an average DVD today. Kornilovitch says that his envisioned toggle switches could be used for reconfigurable electric circuits that "create adaptive computer logic that would react [to] damage, or artificial brains where reconfiguration would facilitate the process of learning." The design being researched uses a stator fixed between two electrodes and connecting to a knob-shaped rotor that is attached to the stator by an atom, and with this atomic bond the rotor can rotate. Three benzene rings could act as a stator, for instance, with the rotor being just a hydrogen, carbon, or oxygen atom; an electric charge is used to establish the position of the rotor. "One end of the rotor carries an excess of positive charge and the other end carries an excess of negative charge," explains Kornilovitch, and this design acts like a magnet compelling the rotor to align itself in relation to the stator. Applying an electric current to the stator forces the rotor to realign in the opposite direction, and this movement toggles the molecular switch from "0" to "1." Kornilovitch explains that striking a balance between the molecule's switchability and temperature stability is the most daunting task that lies ahead, while another challenge is maintaining uniformity among the molecule/electrode links.
- "IBM, Government Talk Big Iron"
CNet (03/28/03); Kanellos, Michael
IBM reports that company executives met with representatives from the Homeland Security Department, the Energy Department, the National Science Foundation, Lawrence Livermore National Laboratory, and other federal outfits this week to discuss upgrading specific supercomputing applications, as well as issues related to high-performance computing. IBM's Nick Donofrio says that his company and the federal agencies are trying to work out a future supercomputing agenda, but there is sometimes a cultural gap between the private and public sectors. He notes that his firm will only pursue projects that utilize commercially viable technology, while government officials favor custom-made machines designed to fulfill specific research functions, so that performance will be optimized. National Coordinating Office for Information Technology Research and Development (NITRD) director David Nelson delivered a slide presentation last week on how systems that follow the off-the-shelf assembly approach would suffer from a longer development cycle and limited memory and input/output performance. IBM's Kathleen Kingscott counters that computers fabricated from off-the-shelf technology have a track record of superiority over specially-tailored systems. Nelson's presentation also noted that a federal task force will release a five-year federal supercomputing development and purchasing strategy in August. There are indications that Washington legislators are worried that the United States' supercomputing lead is threatened by international efforts such as NEC's Earth Simulator in Japan, but Kingscott says such concerns may be overblown.
- "Yeast Protein Wires Supercomputers"
United Press International (03/31/03); Choi, Charles
Handheld supercomputers equipped with nanoscale wires could one day become a reality thanks to the efforts of researchers at the Whitehead Institute for Biomedical Research. Such wires could be fashioned from highly durable fibers derived from genetically engineered yeast proteins. Ironically, the amyloid prion that forms the basis of such fibers is thought to be the chief cause of degenerative conditions such as mad cow disease. The researchers were able to create self-assembling 10-nm-wide strands by whirling the prion molecules in a centrifuge and varying the rate of spin or the concentration of protein in solution; subjecting the fibers to ultrasound vibrations also had significant results. The protein surface was bespeckled with the amino acid cysteine, which allowed the fibers to spontaneously bond to gold nanoparticles when placed between electrodes. The gold in turn attracted silver ions from the surrounding liquid solution, covering the strands in a highly conductive silver-gold sheath. University of Florida at Gainesville nanoscientist Charles Martin says this method could be used to create self-assembling circuitry, particularly through interaction between protein systems. The Whitehead researchers posted their findings in the March 31 online edition of the Proceedings of the National Academy of Sciences.
Forbes (04/14/03); Woolley, Scott
The rapid adoption of the Wi-Fi standard is the sole bright spot in the bleak economic climate hovering over Silicon Valley, but the Institute of Electrical & Electronics Engineers (IEEE) recently released a new standard, Wider-Fi, that promises to outperform Wi-Fi in terms of range, penetration, usability, and security. In anticipation of the standard's impact, companies such as Nokia, Ensemble Communications, and Proxim are designing equipment to support Wider-Fi, while the FCC plans to relax spectrum limitations so that metropolitan wireless networks have more airwave access. Such a move could trigger a dramatic devaluation in cell phone companies' spectrum licenses, and hamper their plans to upgrade wireless networks to faster third-generation (3G) technology. Wider-Fi could also allow wireless ISPs to cheaply deploy broadband connectivity throughout homes and supplant phone lines, thus eating into the revenues of local phone companies and cable operators. Another IEEE working group is busy developing 802.20, a standard that will supposedly add mobility to wireless broadband and allow Wi-Fi communications from rapidly-moving vehicles. Telcos and cellular carriers dismiss such technologies as overhyped. Verizon Wireless CTO Richard Lynch believes that it could be as long as five years before a true Wi-Fi successor is ready for commercialization.
- "Cyber-War Tools Still on the Shelf"
eWeek (03/31/03) Vol. 20, No. 13, P. 1; Fisher, Dennis; Musich, Paula
Experts from the security and defense sectors say chances are slim that the U.S. military will use cyber-warfare to disrupt Iraq's infrastructure in the current conflict. Mark Rasch of Solutionary says the government has been wrestling with cyber-warfare rules of conduct for a dozen years, but doubts remain that a successful cyberattack can be carried out without incurring collateral damage. Government officials are worried that such forms of aggression could hamper future U.S. operations--or worse, provoke terrorists to launch cyberattacks of their own against American interests. However, observers such as Skaion co-founder Steve Durst believe that a major attack on U.S. networks is unavoidable, regardless of the military's restraint. "There's no reason to think that since we've used soft power [such as emailing and calling Iraqi military officers] that we won't use hard power," he contends. Meanwhile, parties outside the U.S. military have been waging a low-level, anti-Iraq cyber-campaign of their own over the past 10 days, most notably by launching a denial-of-service attack against the English- and Arabic-language Web sites of the Al Jazeera Arab news network. The U.S. military is shoring up its own networks through efforts such as the Navy Marine Corps Intranet (NMCI) project, which NMCI Office staff director Capt. Chris Christopher claims has successfully repelled worm and virus attacks.
- "Smart Dust"
Computerworld (03/24/03) Vol. 37, No. 12, P. 32; Hoffman, Thomas
Microelectromechanical systems (MEMS) form the basis of "smart dust," a sophisticated wireless sensor network composed of minuscule, autonomous "motes" that could collect data for many diverse operations, including patient and traffic monitoring, battlefield observation, inspection of manufacturing processes, and the tracking of appliance power consumption. The commercialization of smart dust technology faces a number of technical hurdles, including successfully integrating MEMS and electronics on a single chip, a task that Carnegie Mellon University's Gary Fedder describes as "superhuman." Kristofer Pister of the University of California at Berkeley notes that smart dust motes currently measure about 5mm on a side, and shrinking them to 1mm should make them viable for commercialization. He also predicts that technical breakthroughs will allow the cost of individual motes to drop from between $50 and $100 today to $1 in five years. The goal of both researchers and commercial contractors is to deploy motes that are independent of power sources, and researchers at UC Berkeley, UCLA, and MIT have spent the past two years trying to accomplish this in part by focusing on low-power ad hoc routing protocols. Crossbow Technology CEO Mike Horton foresees two major technical boosts coming up--the consolidation of the two-chip mote operating system into one chip within two years, and the emergence of "scavenger" energy technologies that tap into ambient power within five years. The latter technology is being researched by Shad Roundy of UC Berkeley, Horton notes.
- "The Wired War Has Arrived"
Business Week (03/31/03) No. 3826, P. 36; Balfour, Frederik; Ante, Spencer E.
The U.S. Army expects that communications and computer gear will prove to be beneficial to its Third Infantry Division (3ID) as it spearheads the Army's push to Baghdad. The Army has equipped 3ID armored vehicles with a system designed to provide a real-time map of the entire battlefield as well as offer email communication over a secure intranet. Unit commanders say the technology will allow the division to finesse attack plans, provide updates of enemy movements, and warn of chemical or biological attacks. Unit officers will be able to use laptops to gain an onscreen view of U.S. positions (blue icons) in relation to Iraqi positions (red icons) thanks to aerial surveillance and global positioning satellite (GPS) units. Nonetheless, the system is unproven technology: given its recent installation, there are questions about the compatibility of the off-the-shelf products it uses. The 3ID will become the "first division to use digitized warfare systems in a combat environment," says division automation officer Major Bradley K. Bragg, but the Fourth Infantry Division is the prototype for the new digital system. The 4ID unit, which has more wired vehicles and troops that are better trained on the new equipment, was stranded for several weeks off Turkey, which prevented it from becoming the first unit to use the system during combat.
- "Pushing the Edge"
InfoWorld (03/24/03) Vol. 25, No. 12, P. 1; Shafer, Scott Tyler
Thrifty enterprises are turning to network edge appliances as an efficient, inexpensive alternative to more costly software deployments in order to handle fluid security needs, as well as accommodate incoming XML data and support single networks that convey multiple traffic streams. Interest in edge appliances from companies and vendors is growing because the devices enable organizations to adhere to strict technology budget parameters while simultaneously setting aside room to grow. Network edge devices take the burden off less scalable software when traffic volumes surge; their specialization eases manageability, and their use of ASICs allows them to reach higher performance levels than software run on general-purpose CPUs. The biggest plus of edge appliances for many companies is performance and security upgrades. An InfoWorld CTO Network Security Survey finds that 54 percent of respondents chose security as the most highly desired edge functionality they would be willing to invest in. The performance benefits of edge appliances are derived from their ability to scan traffic patterns and inspect packets directly. Vendors aim to enhance the intelligence of network edge devices in order to satisfy customer demands for additional functionality. However, this could lead to overcomplicated appliances that undermine one of the technology's biggest customer draws--simplicity.
- "Surveillance Nation"
Technology Review (04/03) Vol. 106, No. 3, P. 34; Farmer, Dan; Mann, Charles C.
The era of the surveillance society is rapidly approaching due to increasing technological sophistication--speedier networking, more powerful microprocessors, improved software, cheaper electronics, bigger hard drives, and so on. Unmonitored public space is expected to become nonexistent thanks to the convergence and meshing of thousands of personal, commercial, law enforcement, and government databases and surveillance systems, not to mention the tens of millions of surveillance cameras deployed throughout the world. Although critics charge that omnipresent surveillance is a tool of political tyranny and corporate greed, the fact remains that citizens desire it for its convenience and security; as a result, individuals and small groups will determine the growth rate of surveillance networks. But while these individual systems may be positive for their primary users, Purdue University's Gene Spafford warns that their overlapping could have disastrous consequences: As more and more data repositories intersect, their data will become less reliable, making quality control and verifying conclusions all the more difficult. Surveillance database management should not present any major headaches, but collating valid data will be a much tougher proposition, according to Stanford University's Rajeev Motwani. The integration of surveillance and computing power will move continuous monitoring out of the workplace and into other areas of daily life, such as stores, schools, and hospitals. Database analysts will need to be thoroughly familiar with the way their systems work, while Spafford says an even more critical component will be a model for the patterns such systems are supposed to look for; this model is better supported by reactive, rather than proactive, search strategies.
(Access to this site is free; however, first-time visitors must register.)