Timely Topics for IT Professionals
About ACM TechNews
ACM TechNews is published every week on Monday, Wednesday, and Friday.
ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either IBM or ACM.
To send comments, please write to firstname.lastname@example.org.
Volume 5, Issue 536: Friday, August 22, 2003
- "Marketers Say They Intend To Join Effort To Fight Spam"
New York Times (08/22/03) P. C1; Schwartz, John; Markoff, John
The Direct Marketing Association (DMA) is appealing to members for extra funds for Operation Slam Spam, which would reinforce federal, state, and local law enforcement in their fight against spammers. In its letter to members, the DMA says the $65,000 "participation fee" will help advance more favorable spam legislation, as opposed to further-reaching measures that would hinder marketing activities. The DMA supports legislation that would, for instance, require junk emailers to provide opt-out links and to honor recipients' opt-out requests. Operation Slam Spam would help the FBI, the Justice Department, and other law enforcement groups identify the most egregious spammers, and would bring in expertise and training from the National White Collar Crime Center. The program could also affect the eight different spam bills awaiting congressional consideration after the current recess. FTC Chairman Timothy J. Muris has taken a critical view of spam legislation, noting that no single law or measure is likely to stop people already flouting current laws; ideas such as a national do-not-spam registry, he says, would only tie up needed resources. Meanwhile, Electronic Privacy Information Center executive director Marc Rotenberg says that self-regulation is not a good solution, and that effective spam legislation has overwhelming public support.
(Access to this site is free; however, first-time visitors must register.)
- "Record Computer Infections Slow U.S., Private Work"
Washington Post (08/22/03) P. E1; Duhigg, Charles; Krebs, Brian
Computer viruses that have proliferated at record rates over the past 10 days appear to be tapering off slightly, according to security firms such as MessageLabs. However, this news hardly breeds optimism for federal agencies--the Small Business Administration, the Department of Commerce, and the FCC among them--reporting productivity and operational slowdowns, computer outages, and unprecedented numbers of infected emails attributed to worms such as Sobig.F, Blaster, and Welchia, whose global reach encompasses at least 1 million residential, business, and government computers. Department of Commerce CIO Tom Pyke says that his department's virus-defense systems intercepted 40,000 Sobig.F-laden messages before Commerce computers were compromised on Aug. 21, and between 500 and 750 emails are being quarantined every hour. Though the damage caused by these viruses is repairable, computer experts say the worms could easily be programmed for more malevolent tasks, and are worried about the next epidemic. Sallie McDonald of the Homeland Security Department notes that both her agency and Microsoft warned in July of the Windows vulnerability the viruses are exploiting, but the record spread of the worms is a clear indication that few people heeded the warning or applied the patch that was issued. She adds, "If industries and agencies don't start regulating themselves, Congress may put in legislative requirements."
- "Technology Key to Anticipating Outages"
Associated Press (08/22/03); Jesdanun, Anick
The hope is that, with the deployment of sophisticated monitoring technology, the antiquated national grid can be upgraded to anticipate power failures such as those that caused the recent cascading blackout, although such a vision is 10 years away and could cost tens of billions of dollars. Luther Dow of the Electric Power Research Institute explains that the goal is to develop an intelligent, self-repairing grid capable of monitoring and evaluating its performance, as well as taking steps to eliminate reliability problems. PJM Interconnection uses computers that abstract thousands of power-flow measurements into a graphic representation and run simulations of outages. PJM general manager Robert Hinkel thinks a cascading power failure triggered by local problems could be prevented by implementing automated sharing between neighboring utilities, while advanced artificial intelligence that can project sudden power load changes would also be beneficial. SmartSignal, IBM, and others are also developing improved methods for data analysis using wireless sensors. Meanwhile, Tom Glock of Arizona Public Service reports that his company has been testing computer systems that give operators a wider perspective on grid operations, rather than requiring them to track myriad lines and substations across separate displays.
- "Strong Attackers, Weak Software"
Washington Post (08/21/03) P. E1; Duhigg, Charles
Computer security experts posit that the recent upswing in fast-spreading virus epidemics is the apex of a long-gestating trend: The skills and daring of virus programmers have increased while the quality of software security has decreased. A rise in virus activity at this time of year is often attributed to college students on summer break who are out to make a name for themselves, but Ken Dunham of iDefense says the motivation of virus authors is changing: No longer content with notoriety, some programmers are writing malicious code to be used for ID theft, financial scams, or political statements. MessageLabs CTO Mark Sunner thinks that profit may be one of the motives behind the Sobig.F worm, which installs a Trojan horse program that spammers could use to distribute their spam from infected machines. Fred B. Schneider of Cornell University's Information Assurance Institute warns that even more insidious viruses may be on the horizon. "There's nothing stopping someone from taking Blaster or Sobig.F and making it delete all your files or change software on your computer so it no longer works," he explains. Even more helpful to virus writers, however, is the prevalence of poorly designed software, which results from a lack of thorough testing and vendors' eagerness to add bells and whistles. Though companies such as Microsoft hope to address this problem by slowing down software development, there is a trade-off: Schneider observes that more secure software is harder to use. Analysts also note that building more security into software could add to consumer costs and slow the pace of technological innovation. Technology adviser David Sklar predicts that, should a "software Chernobyl" take place, "We'll start putting up more walls, and thinking that computers should have the same level of reliability we demand from food or cars or fire-retardant pajamas."
- "The Global State of Supercomputers"
CNet (08/21/03); Kanellos, Michael
Supercomputing projects are proceeding apace, partly thanks to NEC's Earth Simulator, which currently owns the title of the world's fastest supercomputer. Designed to model climate, the Earth Simulator takes up three floors and comprises 5,120 custom-made processors, 10 TB of memory, 640 TB of disk storage capacity, and a 1.5-petabyte ancillary storage system. Other supercomputing efforts aim to dramatically shrink the cost and size of supercomputers while raising performance to new levels. Cray is working on Red Storm, a machine for the U.S. Energy Department that holds 10,368 Opteron processors and features single- rather than multi-processor computing nodes in order to reduce costs, according to hardware architect Robert Alverson. Cray is also investigating whether Red Storm could be expanded to contain 30,000 processors. Meanwhile, Sun Microsystems has won $50 million in funding from the Defense Advanced Research Projects Agency to develop a supercomputer that can hold 100,000 customized processors and fit inside a room, explains Sun laboratory director Jim Mitchell. The size reduction could be made possible by the processors, which rely on asynchronous logic and thus consume less power than current chips.
- "The Quiet War Over Open-Source"
Washington Post (08/21/03) P. E1; Krim, Jonathan
The general public is oblivious to a fierce battle being waged among companies, technologists, academics, and government officials over the place of open-source software in the world of intellectual property. This acrimonious debate flared recently at the World Intellectual Property Organization (WIPO) after an official, Francis Gurry, suggested hosting a meeting for developing nations on alternative licenses for open-source products. The proposal originated in a letter signed by almost 60 technology and economic experts, organized by James Love of the Consumer Project on Technology. After the WIPO official was quoted in the journal Nature, software trade groups acted quickly to bring political pressure on WIPO, a group of about 180 nations that coordinates intellectual property issues worldwide. Business Software Alliance lobbyist Emery Simon said his group was not against open-source software per se, but that it necessarily had to defend the importance of strong intellectual property law; he added that Microsoft and the other large software vendors making up the BSA governing board unanimously opposed the idea of a WIPO open-source meeting. U.S. Patent and Trademark Office international relations director Lois Boland said the proposal to weaken intellectual property protections ran counter to the stated goals of WIPO and that another organization might be better suited to host the meeting. WIPO says it no longer intends to hold any such conference. Open-source advocates say that traditional intellectual property rules need to be altered to accommodate the growing open-source movement, which they believe will one day provide a new way of sharing technology and art, as well as conducting commerce.
- "Three Companies Reach Second Phase of Pentagon's Supercomputer Competition"
TechNewsWorld (08/21/03); Weisman, Robert
The Defense Advanced Research Projects Agency (DARPA) has awarded approximately $150 million to Sun Microsystems, IBM, and Cray to separately develop a high-productivity computer system that is faster, more versatile, and more powerful than any other machine. The system would form the basis of next-generation supercomputers for national security applications such as intelligence gathering and weapons design, as well as industrial applications. DARPA is after a system that boasts durability, adaptability, reliability, portability, and easy programmability, while Robert B. Graybill of the agency's Information Processing Technology Office says the initiative could yield a new "value-based metric system" for assessing computer performance. "What we'd like to do is to build more of a general-purpose system that can morph or change itself so it can run different applications efficiently," explains Michael Rosenfield of IBM's Austin Research Laboratory. Five companies were commissioned by DARPA last year to come up with concepts for a high-performance system, but only IBM, Sun, and Cray won contracts; once they have produced prototype designs, DARPA will grant engineering and development funding to no more than two companies. IBM's code name for its project is PERCS (productive, easy-to-use, reliable computing system), while Sun calls its project Hero, and Cray has dubbed its project Cascade. The U.S. supercomputing effort received a wake-up call in 2002 when Japan's Earth Simulator took the lead as the world's fastest supercomputer, and the government has scrambled over the last year to put the United States back on top.
- "IT Security in Energy Sector to Come Under Scrutiny"
Computerworld Online (08/20/03); Verton, Dan
The recent cascading power failure that blacked out much of the northeastern United States and areas of Canada has added credibility to security experts' persistent warnings about the power grid's susceptibility to cyberattacks. Congress is planning several hearings in the coming weeks to find the cause of the outage and to study the overall power grid security issue in more detail. The key vulnerability of U.S. electrical grids is their integration and reliance on all other regional grids, which are maintained by Supervisory Control and Data Acquisition (SCADA) systems. Experts say that more and more SCADA systems are moving away from a proprietary architecture and are being implemented with commercial components that depend on public Internet protocols and links. A 2002 report from Dartmouth's Institute for Security Technology Studies states that "The [energy] sector has always contained security vulnerabilities, but these vulnerabilities have been compounded by the introduction of new networking technologies, deregulation, and structural changes in the industry." The report notes many incidents in which SCADA systems have been affected electronically, and recommends more support for research into the potential effects of a cyberattack on the U.S. grid. Former chairman of the President's Critical Infrastructure Protection Board Howard Schmidt recently said that the IT security technology for safeguarding real-time control systems from network intruders is currently nonexistent, adding that off-the-shelf tools cannot operate in the power grid's real-time control environment. Schmidt noted that the National Strategy to Secure Cyberspace called for a research and development program in SCADA system encryption and validation, and expressed concern that power failures caused by cyberterrorists could lead to a loss of human life if they are coupled with physical attacks in areas affected by the outages.
- "Patent Awarded for Method of Making Nanobatteries"
University of Tulsa chemistry professor Dale Teeters and former students Lane Fisher and Nina Korzhova have been awarded a patent for a process to fabricate, charge, and test nanoscale batteries. Thus far the research team has manufactured nanobatteries small enough that over 40 units could be stacked across the width of a human hair, and the researchers are working to shrink the batteries even further. The process involves immersing an aluminum sheet in an acid solution while an electric current is applied, forming an aluminum oxide membrane; dissolving the metal produces a porous, honeycomb structure that is filled with a polymer electrolyte. Once filled, the pores are capped on both sides with ceramic or carbon electrodes. The researchers use a scanning electron microscope and an atomic force microscope to assemble the battery arrays. The atomic force microscope is also used to charge the batteries, which can yield up to 3.5 volts each. The research was funded with a $446,559 grant from the Oklahoma State Regents for Higher Education and the Department of the Navy's Office of Naval Research. Teeters says a viable miniaturized power source is needed for microelectromechanical systems, one example being a microscopic drug-delivery system that travels through the body, much like the microbe-sized submarine in the movie "Fantastic Voyage."
- "CIOs, Experts Cite Urgent Need for U.S. Infrastructure Upgrade"
Computerworld (08/18/03) P. 1; Verton, Dan
Executives of energy companies and industry analysts are currently offering their opinions about what steps should be taken to prevent a repeat of the blackout that left millions of customers in Canada and the United States without power. First Energy Corp.'s CIO, Ali Jamshidi, emphasizes the need for software upgrades that offer real-time information and response capabilities to officials overseeing the electricity grid. He also called for greater involvement by federal authorities and for development of technological tools to help predict potential problems. Powerware Corp. president Mark Ascolese said more investment is needed in transmission and distribution equipment to ensure a stable power supply, but Jamshidi believes the outage has called attention to a greater threat--cyberterrorists or hackers infiltrating the system. Kema Consulting analyst Joe Weiss notes the grid is plagued by aging equipment and said the interconnectivity of the grid presents a security threat in itself because of the far-reaching implications of power disruptions. Weiss said the National Research Council's proposal to develop an "intelligent grid" and other initiatives are still in their early stages. Howard Schmidt, who previously chaired the President's Critical Infrastructure Protection Board, identified advanced information-technology systems for enhancing power distribution and security as a vital issue facing the power industry.
- "Privacy Advocates Call for RFID Regulation"
CNet (08/18/03); Gilbert, Alorie
State Sen. Debra Bowen (D-Calif.), chair of the state legislative subcommittee on new technologies, recently held a hearing on radio frequency identification (RFID) technology and its privacy implications in the commercial sector. Allusions to the futuristic movie "Minority Report" were made, indicating that advanced RFID tags could allow marketers and the government to track people and their possessions. Many attendees called for legislation that would regulate use of RFID, similar to current "fair use" laws, requiring clear labeling, the ability to disable the technology, and access to stored records upon request. Privacy Rights Clearinghouse director Beth Givens said the hearing was "an important first step" in dealing with RFID and its threat to privacy. UCLA professor Greg Pottie, who is part of that school's Center for Embedded Networked Sensing, said RFID technology could be used in such a way as to eliminate consumer privacy completely and that now is the time to assess the technology. Major retailers and manufacturers are exploring RFID implementation because of its potential to simplify inventory management and reduce theft: Wal-Mart told its top 100 suppliers to prepare pallets and cases with RFID chips by 2005, and will extend that requirement to all of its approximately 25,000 suppliers by the end of 2006. Association for Automatic Identification and Data Capture Technologies head Dan Mullen said the business case for RFID still needs to be proven and that it would not make sense for some products. Bowen, who is also pushing California anti-spam legislation, said the hearing was simply intended to gain a better understanding of the RFID issue, and that any bill she drafts would not outlaw RFID.
- "Internet, Communications Networks Survive Massive Blackout"
NewsFactor Network (08/15/03); Wrolstad, Jay
The massive cascading power failure that blacked out most of the northeastern United States and parts of the Midwest did not seriously affect the Internet and communications networks in those regions. Most IT systems were undamaged, though mobile phone communications were reportedly affected by problems at wireless cell sites. AT&T, Verizon, and Sprint all reported dramatic surges in phone traffic, at least in the early stages of the blackout. Dave Johnson of AT&T said his company's primary data-routing, Web-hosting, and voice-switching systems were unscathed thanks to built-in redundancy in the form of automated backup power sources supplied by batteries and diesel generators. Verizon's Jim Smith noted that self-reliance is embedded in his company's voice and data networks, where a battery-based power supply buttressed by generators is programmed to kick in during emergencies. These systems immediately took over when 10 percent of Verizon's cell sites experienced a loss of power. Sprint's Dan Wilinsky said that his company's long-distance and PCS switches sustained no damage, although some customers were prevented from making and receiving calls because of a power loss at certain PCS cell sites. Yankee Group analyst Zeus Kerravala argued that the blackout should give businesses a much-needed jolt about the value of deploying contingency plans. "Problems like this one aren't common, but they do happen, and companies need to examine the network services, application services, and data-storage systems they are buying to determine how to resolve continuity issues and protect the infrastructure," he explained.
- "Spam Technology Seeks Acceptance"
TechNewsWorld (08/15/03); Fontana, John
Sieve, a proposed IETF standard filtering technology designed to organize email and mitigate message overload, is being tapped by vendors such as Brightmail and ActiveState as a tool that enables customers to write personalized spam filters. Sieve author Tim Showalter explains, "Email overload has not been the result of receiving too much legitimate email. It has been because of spam." Nevertheless, he is surprised that some vendors altered Sieve for use with their anti-spam engines. Vircom, for example, employs Sieve as the cornerstone of its ModusSieve product, and the company reports that it has devised 13,000 lines of Sieve scripts that are updated 24-7 and enhanced by scripts from clients who have organized into the Vircom Anti-Spam Coalition. "We can quickly modify scripts to react to spammers and share those scripts throughout the coalition," notes ModusSieve product manager Daniel Roy. Rockliffe has deployed Sieve in the Web-mail interface of its MailSite Express messaging server, and Sieve will be included in an upcoming Rockliffe anti-spam filtering product featuring a policy editor extracted from ActiveState's PureMessage anti-spam software. The policy editor features a graphical interface designed to make Sieve scripting easier for users. Brightmail CTO Ken Schneider says that Sieve is not a core ingredient of the company's anti-spam engine, but is used to address more site- or platform-specific problems.
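A Sieve script is essentially a declarative list of tests and actions applied to each incoming message. As a rough illustration of the kind of routing logic such a script encodes, here is a minimal sketch in Python rather than Sieve itself; the rule keywords, folder names, and addresses are invented for the example and are not drawn from any vendor's ruleset:

```python
# Sketch of Sieve-style routing: each rule tests one header for a
# substring and files matching messages into a folder.
RULES = [
    {"header": "subject", "contains": "free offer", "fileinto": "Junk"},
    {"header": "from", "contains": "mailinglist", "fileinto": "Lists"},
]

def route(message: dict) -> str:
    """Return the folder a message is filed into; default is the inbox."""
    for rule in RULES:
        value = message.get(rule["header"], "").lower()
        if rule["contains"] in value:
            return rule["fileinto"]
    return "INBOX"

print(route({"subject": "FREE OFFER inside!", "from": "ads@example.com"}))
# -> Junk
```

A real Sieve implementation evaluates a richer set of tests (address matching, size checks, vacation responses) defined by the standard, but the first-matching-rule structure is the same.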
- "Total Information Overload"
Technology Review (08/03) Vol. 106, No. 6, P. 68; Jonietz, Erika
Privacy advocates allege that the Defense Department's Terrorism Information Awareness (TIA) project would merge public and private databases into a vast "metabase" that would be mined to gather data on innocent American citizens, but Robert L. Popp of the Defense Advanced Research Projects Agency's (DARPA) Information Awareness Office denies these allegations, insisting that TIA's purpose "is developing a variety of information technologies into a prototype system/network to detect and preempt foreign terrorist attacks." He explains that DARPA is supplying operational agencies within the Defense Department and the intelligence community with analytical counterterrorism tools, adding that these agencies are using only the data and databases that existing legislation, policies, and regulations give them access to. Popp says TIA is not devising data-mining technologies to sift through transactional data such as the purchase of plane tickets to potential sites of terrorist attacks, emails, phone conversations, and newswire stories; instead, TIA is focused on the development and integration of tools that facilitate collaboration, analytics, and decision support, as well as biometrics, security, pattern recognition and predictive modeling, and foreign-language translation. He discusses the two threads that make up TIA activity--an operational thread and a pure R&D thread. The operational thread is built upon the premise that government-owned databases already contain the data needed for an effective counterterrorism strategy, while the R&D thread seeks to determine whether that strategy could be improved if the government had wider access to the information space, as well as address any related privacy issues. Popp attributes the privacy community's backlash against TIA to a misinterpretation of the project's purpose picked up by many news outlets and Web sites last November, yet admits that DARPA ought to have been more straightforward with Congress and the public.
To read more about TIA, visit http://www.acm.org/usacm.
- "Quantum Cryptography's Reach Extended"
IEEE Spectrum (08/03); Mullins, Justin
Quantum cryptographic researchers are developing techniques to extend the range of entangled photonic devices, which leverage the strange physics unique to the quantum world to protect encoded messages. By repeating the entangled signal, researchers expect to be able to send quantum cryptographic signals as far as the chain of repeating devices extends. Devices now being marketed can send a quantum cryptographic signal up to 10 kilometers using regular photons, technology that is good enough for metropolitan use and to create a sizeable market. In 2001, scientists at the University of Innsbruck in Austria and at Harvard University first proposed using repeaters to create long-distance quantum cryptographic links, in which entangled links would be maintained in segments. At the California Institute of Technology, researchers are using an ensemble of millions of atoms that assumes the entangled state of a single transmitted photon. According to previous theory, an entangled photon trapped in a mirrored cavity would transmit its entangled state to a single atom, but that process required near-perfect mirrors that would reflect the photon millions of times. Harvard University scientists have also simplified the repeater design using rubidium atoms and light pulses. Team leader Mikhail Lukin, one of the physicists who originally suggested the repeater scheme, says demonstration repeater technology should arrive in the next few years.
- "Totally Random"
Wired (08/03) Vol. 11, No. 8, P. 88; McNichol, Tom
Encryption by randomization is the linchpin of computer security, but producing the random number sequences that uphold data encryption is an arduous process that requires a random number generator (RNG). Existing computers are characterized as pseudo-RNGs because they are inherently deterministic, whereas true RNGs are supposed to generate seeds from multiple unpredictable sources. An even better method is to embed an entirely random source of entropy into the hardware, such as a lava lamp. This is what SystemExperts cryptographer Landon Noll and two Silicon Graphics colleagues did in 1996 with their development of the patented Lavarand system. Noll has teamed up with encryption expert Simon Cooper to devise an even more sophisticated RNG whose entropic source is a Webcam with its lens cap on. The operating principle of LavaRnd, as the new approach is called, involves the digitization of the thermal "noise" generated by the Webcam and its processing by a hash algorithm that ultimately produces the random numbers. The open-source LavaRnd will not be patented or licensed, and will be free to anyone who can build a LavaRnd server. "We're trying to give people the ability to generate random numbers themselves," Noll explains. "The Webcam is a low-cost, readily available stimulus that's not predictable."
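The general idea of condensing noisy sensor data into uniform random bytes with a hash can be sketched in Python. The function below is a simplified illustration of that hash-the-noise principle, not the LavaRnd algorithm itself, and `os.urandom` merely stands in for a frame of dark-webcam pixel data:

```python
import hashlib
import os

def extract_random(noise: bytes, n_bytes: int = 16) -> bytes:
    """Condense raw sensor noise into uniform-looking bytes via a hash."""
    out = b""
    counter = 0
    while len(out) < n_bytes:
        # Salting each round with a counter lets one noise frame
        # yield more output than a single digest provides.
        h = hashlib.sha256(counter.to_bytes(4, "big") + noise)
        out += h.digest()
        counter += 1
    return out[:n_bytes]

# Stand-in for a lens-capped webcam frame dominated by thermal noise.
frame = os.urandom(4096)
print(extract_random(frame).hex())
```

The hash acts as an entropy condenser: even if individual noise bits are biased, the digest output is computationally indistinguishable from uniform as long as the input carries enough total entropy.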
- "Information in the Holographic Universe"
Scientific American (08/03) Vol. 289, No. 2, P. 58; Bekenstein, Jacob D.
The holographic principle proposed in 1993 by Gerard 't Hooft of the University of Utrecht theorizes that the universe is akin to a hologram: The universe, which appears to be three-dimensional, could actually be composed of alternative quantum fields and physical laws that are "painted" on a massive surface, in much the same way that a 3D image can be encoded on a 2D surface in holography. The truth of this principle is hinted at by black hole physics, which establishes that the maximum entropy or information content of any region is determined by its surface area, not by its volume. By this reasoning, the information content of an immense aggregation of computer chips cannot keep growing in proportion to its volume: Once the pile becomes large enough, it collapses into a black hole, whose entropy scales with its surface area. The holographic principle can render two universes of different dimensions, obeying different physical laws, completely equivalent. The hypothesis goes that a five-dimensional anti-de Sitter spacetime in which superstring theory holds is recorded on the four-dimensional surface along its perimeter, which is described by quantum field theory. A black hole in the 5-D universe is equivalent to hot radiation on the hologram. No experiment can tell the difference between these two universes, even though they conform to contrasting physical laws. Physicists hope to glean insight into the ultimate theory of reality from these conclusions.
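The area-scaling limit described above is usually written as the Bekenstein-Hawking entropy bound. In standard notation, with A the bounding surface area and ℓ_P the Planck length, the maximum entropy of a region is:

```latex
S_{\max} = \frac{k_B c^3 A}{4 G \hbar} = \frac{k_B A}{4\,\ell_P^2},
\qquad \ell_P^2 = \frac{G\hbar}{c^3} \approx 2.6 \times 10^{-70}\ \mathrm{m}^2
```

Because ℓ_P² is so tiny, the bound is astronomically far above everyday information densities; it becomes binding only at black-hole scales, which is why the argument above invokes gravitational collapse.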
- "Helping the Group to Think Straight"
Darwin (08/03); Chapman, Rod
Group decision support systems (GDSS)--software tools designed to enhance collaboration and boost productivity in face-to-face meetings--are growing more popular and rewriting the rules of decision-making at the executive level. GDSS encourages equal participation in conferences by allowing participants to remain anonymous and by imposing a turn-taking scheme, with the result being more sensible and unbiased decisions. A GDSS architecture usually consists of a meeting facilitator and a set of computers, connected over a local area network, running software that streamlines collaborative jobs such as brainstorming, the classification and appraisal of ideas, voting, and assigning importance to alternative concepts. Electronic brainstorming via GDSS enables participants to contribute ideas concurrently, allowing more ideas to be offered in less time than in traditional conferences. A typical GDSS package displays the results of such collaborative sessions on a large screen and at participants' individual workstations so that group input can be viewed as a whole. Items are prioritized with the help of consensus-building tools, while an agenda-setting element can narrow the group's focus. Thoughts and ideas are collected, organized, and edited through a multi-window setup, and the results can be published immediately after the meeting wraps up. The drawbacks of GDSS include low participation rates among people who type slowly or dislike technology, the loss of body language and other nonverbal cues that are important communicative signposts, and confusion sown by the lack of a skilled facilitator.
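One consensus-building step a GDSS might automate is turning participants' individual rankings of ideas into a single group priority list. The sketch below uses a simple Borda count in Python; the voting rule and the example ideas and ballots are illustrative assumptions, not features of any particular GDSS product:

```python
# Borda count: each ballot lists ideas best-first; an idea earns more
# points the higher it is ranked, and totals determine group priority.
from collections import defaultdict

def borda_rank(ballots: list[list[str]]) -> list[str]:
    """Return ideas ordered by total Borda score, highest first."""
    scores: dict[str, int] = defaultdict(int)
    for ballot in ballots:
        n = len(ballot)
        for position, idea in enumerate(ballot):
            scores[idea] += n - position  # top choice earns the most points
    return sorted(scores, key=scores.get, reverse=True)

ballots = [
    ["upgrade grid", "hire staff", "new software"],
    ["new software", "upgrade grid", "hire staff"],
    ["upgrade grid", "new software", "hire staff"],
]
print(borda_rank(ballots))
# -> ['upgrade grid', 'new software', 'hire staff']
```

A rank-aggregation rule like this rewards broadly acceptable ideas over polarizing ones, which matches the GDSS goal of surfacing consensus rather than the loudest voice.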