Timely Topics for IT Professionals
About ACM TechNews
ACM TechNews is published every week on Monday, Wednesday, and Friday.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.
To send comments, please write to firstname.lastname@example.org.
Volume 4, Issue 400: Wednesday, September 18, 2002
- "Czar of Cyber Security Defends Easing of Rules"
Los Angeles Times (09/18/02) P. A9; Piller, Charles; Shiver Jr., Jube
White House cybersecurity adviser Richard Clarke answered critics of his national cybersecurity plan yesterday by declaring that federal regulation would only exacerbate the situation--instead, the report recommends that industries voluntarily help secure cyberspace. Market pressure for improved security products and services will be generated by spreading public awareness and personal responsibility. Clarke contended that the private sector has better knowledge of the IT infrastructure than the government, and added that a centralized strategy does not apply to a problem as intricate as cybersecurity. VeriSign's Michael A. Aisenberg noted that the Bush administration has eliminated several recommendations from the plan; among them were a limit on the use of wireless networks because of security considerations, and a policy requiring ISPs to bundle security technology with their software. Critics claimed that Clarke dropped these efforts due to pressure from technology companies. Security experts said the threat of cyberterrorism--a major impetus for Clarke's plan--has been overrated by federal officials, but Clarke himself proclaimed that the plan calls for an emphasis on vulnerabilities rather than threats. SANS Institute research director Alan Paller praised such a policy, but doubted that a voluntary approach would be enough to counter security threats. Hacker intrusions are growing partly because "whenever there is a perceived conflict between self interest and the national interest, industry has acted in its self interest," he explained.
(Access to this site is free; however, first-time visitors must register.)
- "Who's Running the Digital Show?"
Wired News (09/18/02); King, Brad
Consumers are losing more and more ground in terms of what they can do with electronic devices as hardware vendors and even the government espouse the media and software industries' vision of a controlled digital media environment. A draft report from the federal government on cybersecurity contains provisions that would take away even more online freedoms from users. Already, Congress has shown a willingness to restrict consumers' use of technology through the Digital Millennium Copyright Act (DMCA), which makes it illegal to bypass technological media controls that prevent media copying, for example. Microsoft has been a partner in the creation of this controlled environment, even as it moves forward with new technologies that make digital media easier to use. Its agreement with set-top box chip manufacturers, announced this month, embeds support for its Windows Media Video 9 Series, a proprietary format. Putting restrictions on hardware, coupled with the DMCA, makes it impossible for owners of DVD players, for instance, to copy movies onto VHS tapes. Open-source software advocates are trying to encourage hardware vendors not to accommodate these restrictions, because such restrictions will eventually hamper sales. Instead, an open-source option should always be made available to consumers, such as PCs built with the Linux operating system pre-installed.
- "Nanotech Bill: Big Money for Tiny Tech"
ZDNet (09/16/02); McCullagh, Declan
Sen. Ron Wyden (D-Ore.) intends to propose a bill today calling for a National Nanotechnology Research Program that would involve further government funding for early-stage nanotech research. "This is a field with almost unlimited potential, and America needs to stay at the forefront of this field," says Wyden's press secretary, Carol Guthrie. A major purpose of the initiative would be to align research goals with the technology's ethical and societal challenges through the establishment of a center. Furthermore, the bill would allocate federal nanotech grants to U.S. regions with high rates of unemployment. The program will have a budget of approximately $446 million, some of which will come from other parts of the federal budget. Wyden will also convene a hearing today where industry figures will showcase emerging nanotech products such as quantum dots, nano-flat screens, stain-resistant apparel, etc. Although the Nanobusiness Alliance forecasts that nanotech startups will have over $1.2 billion in venture capital by next year, vice president Nathan Tinker insists that Wyden's bill is critical, because venture capitalists are currently too reluctant to invest. A spokeswoman announced that Sen. Joseph Lieberman is expected to co-sponsor Wyden's proposal.
- "State Lags in High-Tech Education"
Contra Costa Times Online (09/17/02); Lee, Ellen
A study from the Milken Institute indicates that California is producing fewer home-grown high-tech employees and investing less venture capital than many other states, and its educational system is identified as the primary culprit. California ranked third in the institute's science and technology index, behind Massachusetts and Colorado, but other states have made much more progress than California in certain areas. The state ranks 24th in terms of the average percentage of science and engineering graduates, necessitating a reliance on workers from other states and countries. Such findings have forced the Bay Area Regional Technology Alliance to realize that producing more graduates must be a priority, according to president Judy Pacult. The number of California-based companies receiving venture capital funding jumped approximately 30 percent between 1999 and 2000, and there was a 70 percent increase in overall venture capital investment during the same period. However, such figures are minuscule compared to Idaho and Alabama's respective investment growth rates of 6,200 percent and 410 percent. The high-tech industry accounts for almost 11 percent of the state's workforce and 19 percent of payroll wages. The other top 10 states in the Milken report were Maryland, Virginia, Washington, New Jersey, Connecticut, Utah, and Minnesota, while Arkansas was ranked last.
- "Intel Unfurls Experimental 3D Transistors"
CNet (09/16/02); Kanellos, Michael
Intel intends to disclose more details about its experimental Tri-Gate transistor this week at the International Solid State Device and Materials Conference in Japan. By featuring two additional gates, the transistor exhibits more three-dimensional behavior and raises performance levels by boosting the amount of current, according to Intel's Gerald Marcyk. By diverting current into three channels, leakage is also reduced. Marcyk notes that conventional transistors are basically flat and support a planar electron flow, whereas Tri-Gates feature components that bulge out from the silicon wafer. Furthermore, the transistors can be manufactured using current lithographic techniques, so new assembly breakthroughs are not needed. Intel competitors IBM and Advanced Micro Devices (AMD) are investigating the possibilities of double-gate transistors, and the former has gone so far as to build an entire chip out of such transistors. Neither Intel, IBM, nor AMD has yet made a commitment to incorporate such devices into future microprocessors. "This is a second-half-of-the-decade kind of thing," insists Marcyk.
- "Neutron Beam Reveals New Spin on Magnetism"
EE Times Online (09/16/02); Brown, Chappell
Researchers at the National Institute of Standards and Technology's (NIST) Center for Neutron Research, working in conjunction with physicists from Rutgers University and Johns Hopkins University, have discovered a new form of magnetism in the crystal lattice of exotic zincochromite (ZnCrO4) that could be applied to quantum computing. The unusual magnetism stems from the lattice's geometric shape, which makes the configuration of its magnetic spins unstable. ZnCrO4 features tetrahedral corners, preventing the spin vectors of the lattice's atoms from aligning in stable pairs and keeping the material's spins in a liquid-like state even at low temperatures, observes Seung-Hun Lee of the NIST facility, who defines such systems as "frustrated magnets." His group found that such systems are good candidates for "emergent" behavior, which is responsible for diverse types of superconductivity and other exotic electronic properties. Using a neutron beam, Lee's team discovered a hexagonal array of six anti-parallel spins in the ZnCrO4 spin liquid; this protectorate is capable of self-orientation that produces a single stable spin direction. "Of course, it is too soon to talk about applications, but the emergence of stable patterns of many different quantum levels might be useful in the design of quantum computers," notes Lee. An emergent protectorate of quantum spins could be key to finding a way to preserve a system's quantum coherence long enough to carry out an algorithm.
- "The Supercomputing Speed Barrier"
NewsFactor Network (09/13/02); Lyman, Jay
Joel Tendler of IBM's technology assessment and server group says that the limits of supercomputing speed will be bypassed in the short term, but this will only set up thresholds that will have to be dealt with later on. He adds that economic barriers such as costs and financing could further inhibit technological progress. Still, supercomputer speed continues to rise, despite low demand: Los Alamos National Laboratory scientists are using the $215 million Q supercomputer, which operates at 30 teraflops, or approximately 6,000 times faster than 1990's fastest supercomputer, according to Los Alamos officials. Researchers there also say that the Q unit and the Metropolis Center, which form part of a three-lab coalition coordinated by the National Nuclear Security Administration, will one day achieve speeds higher than 100 teraflops, while Jim Danneskiold of Los Alamos notes that his lab has announced plans for reaching 200 teraflops. The growth of the Q supercomputer will require a facility where cooling equipment and other gear will reside, which illustrates that heat generation and power consumption represent key physical barriers to supercomputing. However, Tendler and others observe that molecular computers and nanoscale technologies could help overcome many of the physical hurdles. "[Researchers] are all doing the same thing--can we make silicon and chips smaller and smaller down to atom separation?" explains Tendler. "The fact that some of these technologies are on the horizon is a very positive sign."
- "MIPS: Measuring Environmental Impact of IT"
ZDNet UK (09/16/02); Loney, Matt
Hewlett-Packard, EMI, and other major companies are trying to measure the environmental impact of their products and services by studying Material Input Per Service (MIPS) ratings. In computing, MIPS usually denotes millions of low-level machine-code instructions a processor can carry out per second; the environmental rating instead tracks the movement of materials. The EU's Digital Europe study, which researches the social and environmental ramifications of IT and e-business, is coordinating the environmental MIPS project. Barclays bank has been using the rating system to determine the environmental impact of online banking compared to conventional banking, explains group environmental director Phil Case. He notes that the bank learned that paying bills by check involved the movement of 2.87 kg of materials, while online payment only caused 0.26 kg to be moved. Meanwhile, EMI's Kate Dunning says that purchasing a CD online, downloading music off the Internet, or downloading it and then burning it onto a CD have much lower MIPS ratings than buying a CD in a store. However, Case and Dunning both acknowledge that these findings are preliminary. HP, the world's largest computer manufacturer, is investigating how to design more environmentally friendly products, but thus far is unsatisfied with the findings of its own MIPS study, since so many materials are involved, according to environmental officer Zoe McMahon.
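The Barclays comparison above reduces to a simple ratio. The sketch below is purely illustrative (the function name and structure are hypothetical, not part of the Digital Europe study); it uses the two figures quoted in the article to compute the fraction of material movement an online payment avoids.

```python
# Illustrative environmental-MIPS comparison using the Barclays figures
# quoted above (kg of material moved per bill payment). Hypothetical
# helper, not part of the actual study methodology.

def material_savings(baseline_kg: float, alternative_kg: float) -> float:
    """Return the fraction of material movement avoided by the alternative."""
    return 1 - alternative_kg / baseline_kg

check_payment_kg = 2.87   # paying a bill by check
online_payment_kg = 0.26  # paying the same bill online

savings = material_savings(check_payment_kg, online_payment_kg)
print(f"Online payment avoids {savings:.0%} of the material movement")  # 91%
```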
- "Linux Worm Hits the Network"
Wired News (09/16/02); Delio, Michelle
A new worm called Linux.Slapper is targeting Linux Web servers and creating a network of tens of thousands of drone machines that can be used in a distributed denial-of-service (DDoS) attack. The worm can be used to remotely scan for and recover email addresses on contaminated servers. The Slapper worm had already spread to 11,249 servers running the Apache Web server software on Monday morning, according to F-Secure, which infiltrated the network with a dummy machine. F-Secure antivirus manager Mikko Hypponen notes that Code Red, one of the most damaging viruses to date, had infected only a few hundred computers in a similar time frame. However, London-based systems administrator Tim Rice said that Slapper so far was not proving much of an annoyance, but worried that too many Web servers were unprotected. "An assemblage of those insignificant servers, networked together, could certainly be transformed into a highly annoying threat," he argued. Slapper uses a vulnerability discovered only in August, and works only on Apache Linux machines with OpenSSL technology. An OpenSSL upgrade has been made available since the flaw was discovered.
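Since Slapper only works against unpatched OpenSSL builds, one practical defense mentioned above is simply verifying the installed version. The sketch below is a hedged illustration of that check: it parses a version banner of the form produced by `openssl version` and compares it against a fixed baseline (0.9.6e is assumed here as the patched release; administrators should rely on the actual OpenSSL advisory for the authoritative version).

```python
# Hedged sketch: flag OpenSSL builds older than an assumed patched
# release. Banner format like "OpenSSL 0.9.6d 9 May 2002" is assumed;
# consult the vendor advisory for the real patched version.
import re

def parse_openssl_version(banner: str):
    """Turn a banner like 'OpenSSL 0.9.6d 9 May 2002' into a comparable tuple."""
    m = re.search(r"(\d+)\.(\d+)\.(\d+)([a-z]?)", banner)
    if not m:
        raise ValueError(f"unrecognized banner: {banner!r}")
    major, minor, patch, letter = m.groups()
    return (int(major), int(minor), int(patch), letter)

def is_vulnerable(banner: str, fixed: str = "0.9.6e") -> bool:
    """True if the build predates the assumed patched release."""
    return parse_openssl_version(banner) < parse_openssl_version(fixed)

print(is_vulnerable("OpenSSL 0.9.6d 9 May 2002"))  # True: unpatched build
```

Tuple comparison makes the letter suffix sort naturally, so 0.9.6d < 0.9.6e < 0.9.7.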
- "Big Trouble in the World of 'Big Physics'"
Salon.com (09/16/02); Cassuto, Leonard
Reports from Bell Labs physicist Jan Hendrik Schon claiming that nonconducting molecules can be converted into semiconductors, lasers, and light-absorbing devices were called into question when Princeton researchers noticed that the published outcomes of different experiments were identical, prompting Bell Labs officials to appoint an independent committee to investigate possible scientific misconduct; furthermore, no other physicist has yet been able to reproduce the results. Dozens of other papers authored by Schon and colleagues have been called into question since then. The scandal has raised doubts about the accountability of expensive physics research that is mostly funded by taxpayers. The United States is the country with the most researchers engaged in costly experiments, and the physics community is based on an interdependent relationship between academia and corporate labs. A peer review system is in place to judge research findings as worthy of publication and of outside funding; it is also used to maintain legislators' assurance that such research is being conducted responsibly. The journals that published Schon's findings, Science and Nature, have been criticized for not reviewing the material deeply enough--in fact, Princeton physicist Philip Anderson contends that both publications' rivalry to publish cutting-edge research "compromised the review process in this instance." The reputations of scientists such as Schon can also influence reviewers, according to Arthur Hebard of the University of Florida. Dan Ralph of Cornell University warns that the Schon affair could potentially become "the biggest fraud in the history of physics."
- "Dan Gillmor: Issues That Will Shape the Internet"
SiliconValley.com (09/15/02); Gillmor, Dan
SiliconValley.com technology columnist Dan Gillmor writes that legislation currently under consideration threatens the freedom and openness of the Internet. For instance, Congress has already passed copyright laws that grant the entertainment industry unprecedented authority in determining how digital content can be used, while legislation in development would extend the cartel's powers even further. In addition, the entertainment sector has teamed up with myopic technology companies to modify their products so consumer use is limited, and standards bodies are being urged to promote patent-limited technologies that will let them build what Gillmor calls "royalty tollbooths on everyone who uses networks." He writes that Internet use by criminals and malicious parties is spurring governments and vendors to build more secure technologies and products, but without anonymity and privacy protections, such products will promote surveillance rather than liberty. Gillmor also warns that Congress and regulatory agencies such as the FCC are giving in to demands from telecommunications companies that serve to limit competition and consumer choice. Such demands include the right to refuse to deploy broadband unless granted control of data traffic; and a lowering or elimination of FCC-mandated ownership regulations that would allow companies to own near-limitless numbers of major media outlets in a given community, severely curtailing competitors' right to create and distribute their own content. Meanwhile, the ultimate decision over Microsoft's antitrust case on both sides of the Atlantic will likely have significant ramifications for companies' control over personal computing.
- "U.S. Will Renew ICANN's Authority"
TechNews.com (09/13/02); McGuire, David
Commerce Department undersecretary Nancy Victory says that she is pleased with the progress of ICANN reform and that "at this point we do anticipate that there will be an extension" of ICANN's contract for managing the DNS. Victory adds that the extension will be "designed to ensure that we continue progress forward with the reform effort." ICANN President Stuart Lynn says that ICANN is discussing extension terms with the Commerce Department right now, while Victory did not say when a new contract would be finalized or how long it would last. Some critics of ICANN's reform process say that ICANN plans to lock ordinary users out of participation, while others say that Commerce must continue to pressure ICANN for reforms. Victory says that at an International Telecommunications Union meeting later this month she will oppose ITU efforts at increased Internet regulation.
- "When Software Patents Go Bad"
InternetNews.com (09/13/02); Naraine, Ryan
The United States Patent and Trademark Office (USPTO) has come under fire for approving bad software patents. "People who are spending a lot of time and money innovating need to know they are not going to be copied as soon as they get into the marketplace," explains IP.com CEO Thomas Colson. "The problem we have, and it's a big problem, is that the right patents are not being issued." He says that too many patent claims are being filed, and this has strained the USPTO's capacity. Colson estimates that over 350,000 claims will be filed this year, up from about 180,000 nine years ago. Understaffing and a lack of upgrades make the office ill-equipped to accommodate the increase. These problems have led to incidents such as the disqualification of ActiveBuddy's right to enforce its instant messaging bot patent when examples of prior art were made available, and such a glaring error illustrates the poor state of USPTO's operations. "Strained conditions under which patent examiners do their jobs [lead to many patents of questionable quality being issued]," notes BustPatents, a Web site that studies the upsurge in Internet, e-commerce, and bioinformatics patents.
- "Radio ID Locks Lost Laptops"
Technology Research News (09/11/02); Smalley, Eric
Researchers from the University of Michigan have developed a laptop security system that does not even require the user to be aware of it. The system involves users wearing a hardware token--such as a watch or piece of jewelry--that communicates with a computer via encrypted radio signals; as long as the token stays within a few feet of the machine, the files remain unlocked. The files are then locked whenever the token is moved out of signal range. The system eliminates the headache of re-entering passwords or using cards with magnetic stripes that users often leave in the card reader, notes the University of Michigan's Brian Noble. Instead, users only enter their password once to establish a connection between the token and the computer. A layered encryption scheme speeds the locking and unlocking: having the token directly lock and unlock every file would take too long, so the token instead protects the keys used to encrypt them. Noble estimates that it could take as long as five years to develop practical applications for the technology, with embedding sufficient battery life within the token being the chief hurdle. Noble and colleague Mark Corner will disclose their research, funded with the help of the Defense Advanced Research Projects Agency, the National Science Foundation, and others, in late September at the International Conference on Mobile Computing and Networking.
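The core proximity logic described above can be sketched in a few lines: files stay usable while the token has been heard recently, and lock once it falls silent. The class name and timeout below are hypothetical illustrations of the idea, not the Michigan system's actual protocol.

```python
# Minimal sketch of proximity-based locking: an authenticated beacon
# from the wearable token keeps the machine unlocked; silence beyond a
# timeout locks it. Hypothetical design, for illustration only.

class ProximityLock:
    def __init__(self, timeout_s: float = 2.0):
        self.timeout_s = timeout_s
        self.last_beacon = None  # time of last authenticated beacon

    def heard_token(self, now: float) -> None:
        """Record an authenticated radio beacon from the token."""
        self.last_beacon = now

    def is_locked(self, now: float) -> bool:
        """Locked when no beacon has arrived within the timeout window."""
        if self.last_beacon is None:
            return True
        return (now - self.last_beacon) > self.timeout_s

lock = ProximityLock(timeout_s=2.0)
lock.heard_token(now=0.0)
print(lock.is_locked(now=1.0))  # False: token still in range
print(lock.is_locked(now=5.0))  # True: token silent, files lock
```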
- "Accessibility Breakthroughs Broaden Web Horizons"
NewsFactor Network (09/17/02); Martin, Mike
Web sites are supposed to be accessible to users with disabilities per federal and international regulations, but accessibility consultant Mike Paciello says that this quality offers advantages for both handicapped and non-handicapped users. Most PCs come with built-in accessibility tools, notes Dr. Scott Standifer of the University of Missouri--keyboard-based menu commands and "sticky keys" are an alternative for users who cannot use a mouse, while screen enlargement and simple speech recognition are being incorporated into standard PCs and the XP version of Microsoft Office, respectively. Other creative solutions developed recently include a computer mouse and a wearable system for the visually handicapped. Standifer also lists examples of "accommodation software" as tools disabled people often use: Screen readers that translate on-screen text into spoken language can help the blind, while a microphone that uses speech recognition software to issue commands or dictate text can aid motor-challenged users. Standifer further notes that blind computer users, with the right skills, can listen to screen-reader output at speeds that outpace the reading ability of their sighted counterparts. There are also inexpensive or free programs available to Web designers that support the creation and maintenance of accessible sites. Many disabled people, despite their eagerness to Web surf, are reluctant to do so out of worries that sites will be inaccessible, or will be rendered inaccessible later when enhanced features are deployed, according to the University of Missouri's Gary Wunder.
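A toy example of the kind of inexpensive tooling mentioned above: scanning HTML for image tags that lack alt text, one of the most common gaps screen readers run into. This is a sketch only--real accessibility audits cover far more (form labels, contrast, keyboard navigation, and so on)--and the checker class is a hypothetical illustration, not any particular product.

```python
# Sketch of a minimal accessibility check: count <img> tags with no alt
# attribute, which leaves screen-reader users with no description.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Hypothetical checker: tallies images missing alt text."""

    def __init__(self):
        super().__init__()
        self.missing_alt = 0

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) pairs
        if tag == "img" and "alt" not in dict(attrs):
            self.missing_alt += 1

checker = AltTextChecker()
checker.feed('<p><img src="logo.gif"><img src="chart.gif" alt="Sales chart"></p>')
print(checker.missing_alt)  # 1: the logo image lacks alt text
```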
- "A New Way to Compute"
Newsweek (09/16/02) Vol. 140, No. 12, P. 34J; Foroohar, Rana
The evolution of the Internet into the grid--an architecture in which computer tasks are distributed over a network of servers, enabling computing power to be tapped like electricity--will mark the beginning of "the post-technology era," according to Tom Hawk of IBM. Grid technology promises many benefits, including faster, cheaper processing; multitasking electronic devices; complex simulation of many systems and phenomena, such as earthquakes, viral infections, etc.; and self-coordinating apparel. Other applications facilitated by the connection of companies, governments, consumers, and other agencies via the grid include sophisticated teleconferencing and ubiquitous wireless. But privacy advocates fear that such advancements could usher in a new era of surveillance and an erosion of civil liberties, despite promises from experts such as Accenture's Stan Taylor that the United States will probably adopt a European privacy model and allow individuals to control the dissemination of their personal data. One of the problems of the grid is that firewalls alone will be useless in such a highly networked environment, while legal, billing, and business issues are also complicating matters. The launch of the global grid could also be postponed by telecoms fighting over the proliferation of broadband. Furthermore, a unified, industry-wide grid language specification needs to be adopted, and companies such as Microsoft and IBM acknowledge the importance of open standards. The grid's success will depend on the incorporation of applications capable of self-diagnosis, self-management, and self-repair.
- "Real Time"
InformationWeek (09/16/02) No. 906, P. 34; Foley, John
Many businesses have only just begun to take full advantage of the speed offered by cutting-edge computer systems, and success in this area involves a dramatic restructuring of business processes and corporate culture so that they can support a real-time business model. Some companies are striving to meet these challenges because they are under increasing pressure to make themselves more adaptable to fluctuating conditions, achieve faster inventory turnover, become less susceptible to fraud, and improve customer service. Other firms are doing so to satisfy an SEC directive requiring public companies to disclose financial data faster, and a mandate from the brokerage industry to institute "straight-through processing." Federal Reserve Board Chairman Alan Greenspan delivered a speech in August that cited businesses' migration to real-time operations as essential to boosting the resiliency of the American economy. One success story is the Amberwood Homes construction company, which invested in a real-time application to shave three weeks off the average five-month home development cycle; superintendents use handhelds to file progress reports that are incorporated into a master schedule that subcontractors can access online. Supply chains are an especially heavy area of concentration for businesses because of the vast amount of untapped productivity they promise, according to Bob Betts of SAP. The potential benefits in this sector are driving movements to develop "adaptive" supply-chain software. Integrating data culled from myriad sources via multiple applications from different vendors that run on different operating systems and server platforms remains a formidable challenge.
- "They Might Be Giants"
IEEE Spectrum (09/02); Goldstein, Harry; Gardner, Alan; Kramer, David
High-tech companies are investing heavily in research and development as a matter of survival, and five technologies in particular are seen as essential to the growth of well-established markets and the development of new ones within the next five years. Increasing competition among firms is also prodding them to embark on joint ventures, licensing agreements, and standardization projects designed to help those new technologies proliferate. Joint projects have characterized the Semantic Web effort in which 500 members of the World Wide Web Consortium are collaborating on technology that has the potential to vastly improve search mechanisms and personal agents. Diagnosis, performance optimization, and resource allocation of computer systems offer tremendous efficiency gains, but the huge size of the preexisting market is prompting many tech companies to adopt strategies similar to that of IBM, which is funneling its R&D into open standards and new labor- and cost-saving products. Over 50 companies spanning a host of industries are investing in organic light-emitting diode (OLED) displays, with cross-industry collaborations stemming from the need to overcome technical hurdles. Meanwhile, Nichia is hoping to get on the ground floor of the white-light LED market, but competitors such as General Electric and Philips Lighting are fighting back by allying themselves with other companies to research solid-state lighting. Finally, the speech recognition technology market has only just started to emerge, spurring many companies that jealously guard their intellectual property to race each other to commercialize products. Economist William J. Baumol maintains that consumers should pay taxes to fund basic research, because such research is key to growth.
- "ACM Professional Development Centre Offers Free Courses"
ACM has introduced an education program to help professionals advance their technical knowledge in their current specialties or learn new skills in related fields. The association's new Professional Development Centre offers professional members unlimited access to nearly 200 Web-based technology training courses at no charge. ACM's new program, in partnership with Sun Educational Services, also provides more than 500 additional courses to professional members at a discount price. The courses are available around the clock and offer individualized instruction and the latest multimedia learning tools as well as chat rooms, quizzes, and mentoring options. Among the topics offered are Java, C and C++ programming, object-oriented programming, Web publishing, telecommunications, e-business applications, networking and security, and project management.
- "Printing Meets Lithography"
Industrial Physicist (09/02) Vol. 8, No. 4, P. 16; Michel, Bruno
An IBM team has developed a new soft lithography method based on classic flexography printing that could serve as a low-cost, high-resolution alternative to optical lithography. Almost every classical printing scheme--relief, gravure, screen/stencil, inkjet, lithographic, and electrophotographic--is involved in the technique, which could benefit biotechnology research and analysis. This is because it improves molecule extraction from fluids and the arbitrary patterning of those molecules. A structured-silicon master is covered by a poly(dimethylsiloxane) prepolymer, which hardens to form an elastomeric stamp; the stamp is either immersed in ink or pressed onto an ink pad. The pattern is printed onto the substrate, creating a self-assembled monolayer that is selectively etched onto the substrate. The stamp materials, which are harder than commercial siloxane, are capable of producing feature sizes as tiny as 80 nm. One printing tool consists of a thin-film stamp wrapped around a printing cylinder with a 400-mm radius, while another tool that involves an arc segment attached to a modified mask aligner is used for printing 100 x 100 mm areas. The team believes that approximately 50-nm feature sizes can be achieved by integrating the new stamp assembly technique with an optimized copper electroless deposition process.