HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.

ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 4, Issue 399: Monday, September 16, 2002

  • "Report: Nations Need Open Source"
    ZDNet UK (09/13/02); Loney, Matt

The Commission on Intellectual Property Rights has issued a report recommending that developing nations promote IT education and the use of low-cost software by avoiding restrictive legislation and adopting open-source technology. Specifically, the commission's report said that smaller, less developed countries should avoid being drawn into international agreements such as the WIPO Copyright Treaty, which is reminiscent of the United States' Digital Millennium Copyright Act and the EU's Copyright Directive. The report said that developing countries should instead make legal software more accessible by getting involved in software licensing, even voiding restrictive or unfair licenses for shrink-wrapped software when necessary. The commission also noted that open-source software would help people in poorer countries avoid pirating proprietary software because it would overcome an all-important cost barrier. So that they can adapt software to meet local needs, developing nations should ensure that their copyright laws permit reverse engineering of software while still adhering to the international agreements they have entered into. "Stronger protection and enforcement of copyright rules may well reduce access to knowledge required by developing countries to support education and research, and access to copyrighted products such as software," the commission cautions. "This would have damaging consequences for developing their human resources and technological capacity, and for poor people."

  • "Chip Size to Reach Its Next Milestone"
    SiliconValley.com (09/15/02); Takahashi, Dean

On Sunday, Intel will unveil the details of a new manufacturing process that will be used to mass produce chips with 90-nm-wide transistors by the second half of 2003, a breakthrough in chip size that ushers in the age of nanotechnology. The new manufacturing process should lead to speedier chips that will shrink the size of electronic devices such as cell phones and digital cameras and eventually lead to chips built at the atomic level. Intel's Sunlin Chou says a 90-nm chip can theoretically contain 1 billion transistors, whereas the upcoming Madison chip, which uses 130-nm-wide transistors, can hold nearly 500 million. The new process involves the integration of 90-nm technology with a silicon germanium hybrid that melds computing and communications operations. Intel's Tony Stelliga says this latter feature will eliminate much of the need to equip electronic devices with analog elements, thus saving space and money, while CTO Pat Gelsinger adds that the company will be able to build smart antennas. Meanwhile, IBM, Advanced Micro Devices, and Texas Instruments claim they will compete with Intel for the 90-nm chip market, because they will transition to similar manufacturing processes at the same time. There will, however, be differences: Whereas Intel's process uses strained silicon technology that it says boosts processing speeds 20 percent, IBM's uses silicon-on-insulator (SOI) technology that's designed to reduce power requirements and thus allow more transistors on a single chip.
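    The transistor counts cited above are roughly consistent with simple area scaling: shrinking the feature size from 130 nm to 90 nm lets about (130/90)^2 times as many transistors fit in the same die area. A minimal back-of-the-envelope sketch (an idealized rule of thumb, not Intel's actual design math):

    ```python
    # Area scaling rule of thumb: density grows with the square of the
    # feature-size ratio when a process shrinks from 130 nm to 90 nm.
    old_nm, new_nm = 130, 90
    density_gain = (old_nm / new_nm) ** 2
    print(f"density gain: {density_gain:.2f}x")  # about 2.09x

    # Applied to the article's figures: ~500 million transistors at 130 nm
    # scales to roughly a billion at 90 nm, matching Chou's claim.
    print(f"scaled count: {500e6 * density_gain:.3g}")  # roughly 1e9
    ```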

  • "A Battle Over Software Licensing"
    New York Times (09/16/02) P. C3; Flynn, Laurie J.

    The Uniform Computer Information Transactions Act (UCITA) is still being hotly debated across the country as a variety of opposition groups have joined together to fight the software licensing law. UCITA would standardize software licensing in the same way other industries have done for their products, but opponents say that UCITA has been too heavily influenced by major software firms. Most recently, the National Conference of Commissioners on Uniform State Laws made 38 changes to UCITA based on recommendations from the American Bar Association, which refused to sanction the proposal as it was. Still, critics say the amendments do not prevent software firms from doing things like disabling software in case of a contractual disagreement. That has been one of the most controversial points for the insurance industry, where tough regulations and dependence on electronic operations make such a possibility a nonstarter. So far, proponents say they have not given up their three-year-long legal fight to get the measure passed in state legislatures. Only Maryland and Virginia have adopted UCITA since it was first introduced in 1999; both sought to attract Internet and software firms with an IT-friendly regulatory environment. A UCITA opposition group called Americans for Fair Electronic Commerce Transactions (AFFECT) is meeting this week to plan its strategy to prevent other states from adopting UCITA.
    (Access to this site is free; however, first-time visitors must register.)

    For information about ACM's UCITA activities, visit http://www.acm.org/usacm/IPM.

  • "Bush Has Plan To Protect U.S. From Cyberattacks"
    Wall Street Journal (09/16/02) P. A6; Kulish, Nicolas; Dreazen, Yochi J.

    The Bush administration this week is expected to unveil a comprehensive plan to protect the United States against cyberattacks that will call for an array of public and private initiatives. However, the plan will not contain enforcement provisions, and this leads Counterpane Internet Security official Bruce Schneier to predict that many firms will continue to sideline security concerns while prioritizing profits. A Sept. 5 draft of the plan calls for firewalls in all small business and personal computers, and industry-wide cooperation in sharing security-related best practices. The draft also calls for an Internet fund to support areas that "fall outside the purview of both industry and government." The White House's Office of Cyberspace is creating a government-only secure Internet, and plans to announce its blueprint on Sept. 18 during a news conference in Palo Alto, Calif. Entrust Chairman Bill Conner believes "we are at war and we're not moving at war speed" with regard to Internet security. The Bush administration's plan will protect against hackers as well as military-related cyberattacks; it is the first comprehensive plan to guard the nation's infrastructure against cyber threats and to seek a balance between individual liberties and security on the Internet.

  • "Speed of Light Broken with Basic Lab Kit"
    New Scientist Online (09/16/02); Choi, Charles

    Two researchers at Middle Tennessee State University report that they have built a system that transmits light signals at least four times faster than the speed of light using off-the-shelf components that cost just $500. Jeremy Munday and Bill Robertson's system, which consists of a hybrid coaxial cable attached to a pair of signal generators, is capable of sending signals as far as almost 120 meters at a peak speed of over 4 billion kilometers per hour. In contrast, other faster-than-light optical systems require very costly gear and can only manage a range of a few meters. One signal generator produces a fast wave, while the other produces a slow one; the interference between these two waves generates electric pulses, which are monitored by an oscilloscope. No energy in the pulses actually travels faster than light, so Einstein's relativity is not violated. Theoretically, no useful data can be sent faster than light, because faster speeds create weaker, more distorted signals. Alain Hache of Canada's University of Moncton thinks that the method could lead to a more than 50 percent increase in electrical signal speeds in computers and telecommunications grids.
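    The mechanism described above, two superposed waves producing a pulse envelope, can be sketched numerically. In the superposition of two waves with slightly different frequencies, the beat envelope travels at the group velocity (the ratio of the frequency difference to the wave-number difference), which can differ sharply from the phase velocity. All numbers below are illustrative, not the researchers' actual parameters:

    ```python
    import numpy as np

    # Two waves with slightly different wave numbers and frequencies
    # (arbitrary units, chosen only to make the effect visible).
    k1, w1 = 1.00, 1.00   # the "fast" wave
    k2, w2 = 1.10, 1.50   # the "slow" wave

    phase_velocity = (w1 + w2) / (k1 + k2)   # speed of the carrier ripples
    group_velocity = (w2 - w1) / (k2 - k1)   # speed of the pulse envelope

    x = np.linspace(0, 200, 2000)
    t = 0.0
    # The raw interference signal an oscilloscope would see...
    signal = np.cos(k1 * x - w1 * t) + np.cos(k2 * x - w2 * t)
    # ...equals 2 * cos(mean phase) * cos(half the phase difference); the
    # second factor is the slowly varying envelope that forms the "pulse".
    envelope = 2 * np.abs(np.cos(((k2 - k1) * x - (w2 - w1) * t) / 2))

    print(f"phase velocity = {phase_velocity:.2f}")   # ~1.19
    print(f"group velocity = {group_velocity:.2f}")   # 5.00, far faster
    ```

    The envelope can outrun the carrier without any energy doing so, which is why relativity survives the experiment.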

  • "On Top When It Comes to the Crunch"
    Financial Times (09/16/02) P. 5; Nakamoto, Michiyo

    Japan overtook the United States in supercomputer development earlier this year when it unveiled the Earth Simulator, which is now the record-holder for the most powerful supercomputer in the world. The previous record-holder, IBM's ASCI White, is five times slower than the Earth Simulator, which collates data from satellites, buoys, and other global observation stations to assemble a "virtual planet" that is used to predict weather. This development fortifies confidence in a sector that has been losing ground in other technology industries, and American supercomputer researchers are worried that Japan has gained a technological edge over the United States that could be maintained for years. Key to Japan's triumph with the Earth Simulator was the $430 million the government invested in its development, as well as NEC's dedication to building expertise in parallel vector supercomputing, a technology jettisoned by most supercomputer manufacturers. The push for the Earth Simulator came partly out of the need to facilitate more accurate analysis of global phenomena, especially in the wake of natural disasters such as the Great Hanshin Earthquake seven years ago, while the Japanese government and NEC also saw it as an opportunity to re-dominate the supercomputing sector. Parallel vector supercomputers require the presence of tailor-made processors that boast 8 gigaflops of processing capacity, while parallel scalable machines such as the ASCI White are composed of off-the-shelf, 1.5-gigaflop processors. The United States and Japan have been locked in a supercomputing race egged on by trade disputes, which led to the decline of the U.S. parallel vector supercomputing effort. On the other hand, the United States has a lead on Japan in terms of global phenomena simulation research.
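    The per-processor figures above make the architectural trade-off concrete: reaching the same aggregate peak takes far fewer custom vector processors than commodity scalar ones. A minimal sketch (the 40-teraflop target is hypothetical, chosen only to make the ratio concrete):

    ```python
    import math

    # Per-processor peaks from the article: custom vector parts vs.
    # off-the-shelf scalar parts like those in ASCI White.
    vector_gflops, scalar_gflops = 8.0, 1.5
    target_teraflops = 40.0  # illustrative aggregate peak

    vector_needed = math.ceil(target_teraflops * 1000 / vector_gflops)
    scalar_needed = math.ceil(target_teraflops * 1000 / scalar_gflops)

    print(f"vector processors needed: {vector_needed}")   # 5000
    print(f"scalar processors needed: {scalar_needed}")   # 26667
    ```

    The scalar design trades processor count for commodity pricing, which is why most U.S. vendors abandoned the vector approach, while NEC's bet on it paid off in raw capability.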

  • "Nimble Nanoswitch May Win Info Relay Race"
    NewsFactor Network (09/13/02); Martin, Mike

    Scientists from the Chalmers University of Technology and Goteborg University in Sweden report in a new paper that they have devised a blueprint for a nanoelectromechanical switch fashioned from carbon nanotubes that could potentially be used for "logic devices, memory elements, and current or voltage amplifiers." Nanoelectronics Planet columnist Sandra Kay Helsel says the researchers' nanotube-silicon integration work is very important. "The evolutionary path for nanotech commercialization looks to be in this kind of integration," she explains. The switch exploits a variation in one circuit's electrical current to influence the current in another circuit; this connection-free control represents a major boon to the operation of high-voltage or power-sensitive devices such as nanoscale robots and tools that are highly susceptible to small changes in magnetic and electrical currents. Practically the entire spectrum of digital devices, ranging from gaming gadgets to the information superhighway, could benefit from the properties of small, lightweight carbon nanotube circuits. However, Jeff Byers of the Naval Research Laboratory Institute for Nanoscience maintains that the nanorelay has not been successfully fabricated yet, and technical challenges may prevent commercialization of the technology for many years. Still, if the switch can be built, he acknowledges that it could supplant the Field Programmable Gate Array (FPGA) logic circuit that Xilinx introduced 17 years ago.

  • "Computer Whiz Toils to Save Internet's Soul"
    Wall Street Journal (09/16/02) P. B1; Dreazen, Yochi J.

    Ben Edelman, at 22 years old, is a successful computer entrepreneur as well as one of the world's leading advocates of digital rights, and has dedicated himself to academic and legal causes that aim to influence Internet use. He has been an expert witness who has served both sides of the debate over Internet freedom. For instance, he helped overturn a law requiring libraries to install antipornography Internet filters on the grounds that they sometimes block access to innocent sites, and therefore constitute a violation of free speech; two years ago, his testimony helped halt the operations of a Canadian Web site whose users were downloading American TV content without authorization. Edelman is also fervently opposed to governments that use filters to block citizens' access to certain Web sites, as well as the penalties they impose on people who try to circumvent such blocks. He says such policies infuriate him because "It upsets the precarious balance the Internet rests on." Edelman will join the ACLU, whom he testified for in the antipornography case, to challenge a four-year-old mandate that prohibits the proliferation of technology that could be used to bypass copyright protection schemes. This fall, he also intends to publish a paper on the use of firewalls to block Web access overseas. In the meantime, however, his long-standing employment with Harvard Law School's Berkman Center for Internet and Society is in doubt, because he feels he deserves a higher salary for his services. His work at the school includes tech support for the Berkman facility and research with Professor Jonathan Zittrain.

  • "Tiny Fire Marshals"
    ABCNews.com (09/13/02); Eng, Paul

    Accenture Technology Labs scientists are working with "Smart Dust" technology developed by University of California, Berkeley researchers to develop an early-warning system for forest fires. "Smart dust" is a network of mote-sized sensors that measure environmental factors such as temperature, light, and humidity and can communicate with each other wirelessly. Accenture senior researcher Chad Burkey says the sensors would be distributed throughout a forest, and algorithms developed by his team would be keyed to detect signs of a possible fire, such as elevations in temperature and light and declines in humidity; upon identifying such anomalies, each mote would connect with nearby motes and study these multiple readings to determine if there is a likely threat of a fire in their location. The sensors would then wirelessly communicate a warning to forestry workers. Burkey predicts that it could take three to five years to develop a commercial Smart Dust system, since challenges abound. One such challenge is shrinking the devices down to mote size; UC Berkeley researchers have designed 100 cubic-millimeter-sized sensors, but have yet to develop functioning prototypes. Burkey and Gartner analyst Michael Palma say the system could be applied to other environmental monitoring situations.
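    The detection logic Burkey describes, local anomaly thresholds plus a cross-check with neighboring motes, can be sketched as follows. The thresholds, function names, and quorum rule are all hypothetical illustrations of the approach, not Accenture's actual algorithms:

    ```python
    # Illustrative thresholds for fire-like conditions (assumed values).
    FIRE_TEMP_C = 60.0
    FIRE_LIGHT_LUX = 20000.0
    FIRE_HUMIDITY_PCT = 15.0

    def reading_is_anomalous(temp_c, light_lux, humidity_pct):
        """A single mote's local test: temperature and light elevated
        while humidity has dropped."""
        return (temp_c > FIRE_TEMP_C
                and light_lux > FIRE_LIGHT_LUX
                and humidity_pct < FIRE_HUMIDITY_PCT)

    def confirm_with_neighbors(own_reading, neighbor_readings, quorum=0.5):
        """Raise an alarm only when a quorum of nearby motes sees the same
        signs, filtering out single-sensor glitches before alerting
        forestry workers."""
        if not reading_is_anomalous(*own_reading):
            return False
        votes = sum(reading_is_anomalous(*r) for r in neighbor_readings)
        return votes >= quorum * len(neighbor_readings)

    # Example: a hot, bright, dry reading confirmed by 2 of 3 neighbors.
    own = (72.0, 30000.0, 9.0)
    neighbors = [(70.0, 28000.0, 10.0),
                 (68.0, 25000.0, 12.0),
                 (24.0, 800.0, 55.0)]   # one mote sees normal conditions
    print(confirm_with_neighbors(own, neighbors))  # True
    ```

    The neighbor-consensus step is the interesting design choice: a lone mote baking in direct sunlight should not trigger an evacuation, but several adjacent motes reporting the same pattern is strong evidence of a real fire.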

  • "Scientists Discuss the Little Things in Life"
    Moscow Times (09/11/02) P. 8; Prince, Todd

    Some 300 Russian and foreign microelectronics experts have gathered in Moscow the week of Sept. 9 to discuss research in the field and Russia's potential contributions. Dubbed Nano and Giga Challenges in Microelectronics Research and Opportunities in Russia, the conference, which ends Sept. 13, will showcase presentations from scientists on such topics as quantum computing, next-generation lithography, atom removal, and nanotechnology devices. Some of the sponsors of the event include Motorola, Ohio Supercomputer Center, and the Russian Federal Nuclear Center. The event could become biannual if the results are tangible, says Anatoly Korkin of Motorola's DigitalDNA Labs in Arizona and one of the architects of the event. The conference hopes in part to combine the strengths of Russian and Western scientists to facilitate problem solving. In addition, the symposium hopes to encourage commercial initiatives in Russia. Motorola's DigitalDNA Labs is currently involved in a number of projects in Russia, as is Samsung. Russia is considered one of the top five countries in micro- and nanotechnology research, along with the U.S., China, Japan, and Switzerland.

  • "EU Looks to Match US Investment in R&D"
    Electric News (09/12/02); Buckley, Ciaran

    The European Commission has unveiled a scheme to boost R&D spending to 3 percent of GDP, up from today's 1.9 percent average. EU Research Commissioner Philippe Busquin said in a report that investment in science and technology is crucial for Europe's future, especially to fund innovations for stimulating growth and employment. The scheme is designed to provide training for researchers, strengthen connections between public and private research sectors, implement intellectual property rights, and foster entrepreneurship. Busquin said the United States spent 288 billion euros on R&D in 2000, compared to the EU's 164 billion euros, and the gap could still be widening. Meanwhile, Finland and Sweden's R&D spending totaled 3.7 percent and 3.8 percent of GDP respectively, exceeding requirements. The EC also wants to increase private R&D funding to 66 percent of total R&D funding, up from the current 55 percent. The United States and Japan have already reached that figure while Belgium, Germany, Finland, and Sweden have surpassed it, according to the report.

  • "IT to Fight Terror"
    Computerworld (09/09/02) Vol. 36, No. 37, P. 30; Brewin, Bob

    Researchers at Los Alamos National Laboratory are concentrating on multiple projects designed to help the homeland defense initiative and the war on terror. Some of the work involves systems that were developed for a different purpose--for example, the Transportation Analysis Simulation System (TranSims), which was created to study the movements and social interactions of a metropolitan population, has been reconfigured and spun off into EpiSims, a tool that models the spread of epidemics. Los Alamos' Deborah Leishman says EpiSims will enable public health agencies to combine information from multiple sources--emergency rooms, for instance--into one database. Meanwhile, Terry Hawkins of Los Alamos' nonproliferation and internal security division notes that the lab is integrating a computer chip with a biological antigen to create a device that can detect the presence of dangerous biological agents; the antigen resides in a lipid membrane that could transmit electrical current to the chip whenever it detects an agent, providing users with instant readouts. Hawkins thinks that such research could eventually lead to a portable, programmable device that could be keyed to identify viruses. Los Alamos is also working on quantum encryption, which could provide data security in which intrusion attempts can be easily detected. The Los Alamos computer and computational sciences division is developing a number of projects, one of which involves improved data extraction and modeling methods for the simulation of terror threats or nuclear explosions. All of this homeland security research is supported by massive supercomputers.

  • "One Year Later: Lessons Learned"
    InfoWorld (09/09/02) Vol. 24, No. 36, P. 47; Schwartz, Ephraim; Shafer, Scott Tyler; Moore, Cathleen

    Communications and continuity have become the chief areas of concentration for companies in the year since the Sept. 11 attacks, and this has caused technologies showcased during the emergency to change and become standard business components. The crippling effect of the disaster on the wired infrastructures of companies close to ground zero raised the value and credibility of wireless LANs (WLANs), and spurred corporate and government investment in technologies such as the BlackBerry, voice over IP, and IEEE 802.11; however, security worries have inhibited businesses' deployment of wireless, according to Giga's Rob Enderle. Sept. 11 forced many companies to reassess and modify their disaster recovery and business continuity plans so that telecom disruptions do not severely impact access to services. Many carriers observed that interest in conferencing climbed after the tragedy as companies suspended travel plans, while enterprises such as Fieldglass are altering their phone systems, becoming multihomed, and employing other strategies to add redundancy. Honeywell's Gary Bird says that Sept. 11 accelerated initiatives to boost information sharing and knowledge management; companies realized that they must be agile enough to respond to sudden changes in business conditions, and that it would benefit their personnel to be connected so that collaboration without traveling was possible. The Yankee Group's Rob Perry attributes the emphasis on knowledge management to corporate needs to preserve information and effect decentralization. The attacks highlighted the interdependence of data security and physical security, and this triggered a surge of interest in real-time backups and server mirroring, both of which are easier to deploy thanks to storage networking advances. Finally, companies can no longer feign ignorance of their responsibility to implement security, and this has led to the extension of security to sectors such as storage, scalability, and back-end infrastructure.

  • "Data Security Oath"
    eWeek (09/09/02) Vol. 19, No. 36, P. 31; Hicks, Matt

    As personal data stored in computer databases becomes a more integral part of daily life, privacy policies need to be built into databases, say IBM researchers, who presented their concept at last month's Very Large Data Bases conference in Hong Kong. IBM fellow Rakesh Agrawal says databases can be designed to give people more control over the data they submit, and allow for rules to be established for the use of the data, much as the Hippocratic oath constrains doctors. Microsoft Research senior researcher Phil Bernstein says that concept can be extended from databases to other technologies, such as XML standards, applications, and systems. The upcoming Microsoft SQL Server database will contain finer security controls over data, allowing different levels of user access and ways of filtering data. Meanwhile, IBM researchers have already tested their Hippocratic database technology with the Platform for Privacy Preferences (P3P) standard, which helps protect Web users from improper data collection. Each type of data collected into the Hippocratic database and its elements would have its own assigned metadata table, IBM officials note. Bernstein believes that database vendors should be able to embed privacy awareness better by progressively adding features such as encryption, access control, logging access, and managing and filtering queries in the coming years.
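    The core idea, metadata tables that record the purposes each piece of data may be used for, with queries filtered against them, can be sketched in a few lines. The schema, purpose names, and helper functions below are illustrative assumptions, not IBM's actual design:

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customers (name TEXT, email TEXT, age INTEGER)")
    # Metadata table pairing each data column with the purposes the
    # submitter consented to (hypothetical schema).
    conn.execute("""CREATE TABLE privacy_metadata (
                        column_name TEXT, allowed_purpose TEXT)""")
    conn.executemany(
        "INSERT INTO privacy_metadata VALUES (?, ?)",
        [("name", "shipping"), ("email", "shipping"),
         ("email", "marketing"), ("age", "analytics")])

    def allowed_columns(purpose):
        """Return only the columns whose metadata permits this purpose."""
        rows = conn.execute(
            "SELECT column_name FROM privacy_metadata "
            "WHERE allowed_purpose = ?", (purpose,)).fetchall()
        return [r[0] for r in rows]

    def purpose_limited_query(purpose):
        """Build a query restricted to the columns this purpose may see."""
        cols = allowed_columns(purpose)
        if not cols:
            raise PermissionError(f"no columns permitted for {purpose!r}")
        return f"SELECT {', '.join(cols)} FROM customers"

    print(purpose_limited_query("marketing"))  # SELECT email FROM customers
    print(purpose_limited_query("shipping"))   # SELECT name, email FROM customers
    ```

    The point of the design is that consent travels with the data: a marketing query physically cannot touch the age column because the metadata layer strips it before the query is ever built.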

  • "Industry Not Jumping Off InfiniBand-Wagon"
    Network World (09/09/02) Vol. 19, No. 36, P. 1; Connor, Deni

    InfiniBand technology, which promises to greatly increase the data-transfer rates between I/O devices, has had a rocky start, but the cutting-edge technology is gradually being adopted by technology vendors despite lukewarm acceptance from some industry leaders. Several firms are readying actual InfiniBand products, such as Hitachi, which said it would integrate InfiniBand technology from Voltaire into its storage products. IBM is also widely expected to announce the general availability of its 10 Gbps InfiniBand silicon soon, with a few startup firms saying they have plans to make use of it. Yankee Group analyst Jamie Gruener says the coming one or two quarters will be critical in determining whether InfiniBand will take off among mainline technology providers or not. If more vendors announce support similar to what Hitachi has done, then it will force others to join with their own offerings, he explains. InfiniBand allows devices directly linked to one another to communicate much faster than possible with traditional bus technologies, and the technology is expected to take off first in enterprise computing. Microsoft recently said it would not write the drivers for InfiniBand on its Windows 2000 and .Net servers, but would leave that work for third parties, while Intel abandoned its work on 2.5 Gbps silicon for InfiniBand. Some analysts say economic factors have limited InfiniBand's success. Analyst Arun Taneja says, "InfiniBand has been caught in this no man's land...the economy has been really bad, and with 9/11 happening, the IT community has been totally gun-shy about looking at any new technologies."

  • "One Face in 6 Billion"
    Discover (09/02) Vol. 23, No. 9, P. 17; Garfinkel, Simson

    Face-recognition technology is receiving new focus as a security solution in the wake of the Sept. 11 terrorist attacks, but high-profile deployments of such systems in airports have produced more false positives than successful terrorist identifications. Charles Wilson of the National Institute of Standards and Technology remarks that asking a computer to find a terrorist in a crowded airport with only a few grainy photos for reference is like attempting "to recognize somebody you have glimpsed for only two seconds." In the late 1980s, MIT's Media Lab pioneered some of the face-recognition technology being used today. Viisage sells software that designers built by studying hundreds of thousands of faces to extract a full range of facial physiognomy; the product captures a person's facial image and compares it to this physiognomic database in order to build a template that can be compared to stored images. Identix takes a slightly different approach to coding faces: The technique involves local feature analysis, in which up to 80 differences between facial features are measured. In the field, however, a number of factors can throw the systems off and make database comparison useless, including head movement, lighting, makeup, and the effects of age. Meanwhile, the tendency for such systems to incorrectly identify people as suspects is inconvenient and embarrassing for innocent parties.
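    The template-matching step common to both products described above, reduce a face to a vector of feature measurements, then find the closest stored template, can be sketched as follows. The feature values, distance metric, and threshold are illustrative assumptions, not either vendor's actual algorithm (the article notes real systems measure up to 80 features):

    ```python
    import math

    def template_distance(a, b):
        """Euclidean distance between two feature-measurement templates."""
        if len(a) != len(b):
            raise ValueError("templates must measure the same features")
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def best_match(probe, database, threshold=0.5):
        """Return the closest stored identity, or None if nothing is near
        enough. A loose threshold is what produces the false positives
        the article describes; a strict one misses disguised faces."""
        name, dist = min(((n, template_distance(probe, t))
                          for n, t in database.items()),
                         key=lambda pair: pair[1])
        return name if dist <= threshold else None

    # Toy database of 3-feature templates (real systems use far more).
    db = {"alice": [0.42, 0.88, 0.15], "bob": [0.61, 0.33, 0.72]}
    print(best_match([0.40, 0.90, 0.17], db))  # alice (close match)
    print(best_match([0.10, 0.10, 0.10], db))  # None (no one near enough)
    ```

    The field problems the article lists (head movement, lighting, makeup, aging) all act by perturbing the probe vector, pushing genuine matches past the threshold or strangers inside it.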

  • "Ultrawideband Squeezes In"
    Technology Review (09/02) Vol. 105, No. 7, P. 70; Jonietz, Erika

    Real-time tracking and collision avoidance systems for vehicles and people, personal-area networks of electronic devices, faster data transmission, and super-sensitive security tools are just some of the applications promised by ultrawideband radio technology. But advocates are waging a heated battle with skeptics who worry that ultrawideband transmissions will disrupt radio frequencies vital to airlines, satellite companies, cell phone manufacturers, and others; these groups strenuously objected to an FCC decision that approved the technology, albeit on a limited basis. Ultrawideband radios are accurate to the centimeter, are cheaper to build and consume less power than conventional transmitters, and are relatively unaffected by multipath interference. One condition of the FCC's approval is that consumer ultrawideband radios can only operate in specific frequency ranges and transmit signals at very low power levels in order to prevent interference with practically all existing radio frequency service. Tracking systems stemming from ultrawideband could significantly improve inventory management and theft prevention, and the rapid location of personnel and equipment during emergencies. Companies are hoping to set up markets for ultrawideband-based tracking systems and collision avoidance radars soon. Opponents still argue that the technology generates interference that could lead to more dropped cell phone calls, less reliable aircraft guidance and air traffic control systems, and other problems, even when it complies with current FCC regulations. The FCC support is likely to spur commercialization, which will help explore and potentially resolve the technology's technical and political ramifications. The proliferation of ultrawideband will depend greatly on standardization, and further support from boosters such as IBM, Intel, and Sony.
