HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.

ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 4, Issue 428: Wednesday, November 27, 2002

  • "Loss of Major Hub Cities Could Cripple Internet, Study Suggests"
    ScienceDaily (11/26/02)

    The centralized "hub-and-spoke" model of the Internet's infrastructure makes it especially fragile in the event of a terrorist attack or other catastrophe that threatens to knock out major Internet nodes, according to an Ohio State University study that will appear in the February 2003 issue of Telematics and Informatics. These nodes, which are typically located in large metropolitan areas, supply network access to smaller and medium-sized cities. In the event of a disruption, these cities would be cut off from the Internet. For instance, if Los Angeles' Internet connections were destroyed, network accessibility would be disrupted or severely limited in other California cities, as well as Las Vegas, Houston, Dallas, Denver, Tucson, and Phoenix. The node cities, however, would still be connected to the network thanks to multiple internal and external links, although functionality would be more limited. Study co-author Tony Grubesic says a worst-case scenario would involve the destruction of telecommunications equipment in the six largest U.S. Internet hubs--Atlanta, Chicago, Dallas, Los Angeles, New York, and Washington, D.C. He adds that switching to a decentralized Internet infrastructure will reduce the threat. "The ability for networks to re-route, re-connect and have redundancy is clearly important for the survival of the Internet in the face of disasters," Grubesic argues.
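
    The study's point about hub dependence can be illustrated with a toy reachability check. The topology below is invented for illustration only (it is not the study's actual data): spoke cities connect to the network solely through one hub, while hubs are richly interconnected.

```python
from collections import deque

def reachable(graph, start):
    """Return the set of nodes reachable from start via breadth-first search."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nbr in graph.get(node, ()):
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return seen

def remove_node(graph, node):
    """Return a copy of the graph with node (and all links to it) deleted."""
    return {n: [m for m in nbrs if m != node]
            for n, nbrs in graph.items() if n != node}

# Hypothetical hub-and-spoke topology: Phoenix, Tucson, and Las Vegas
# reach the rest of the network only through the Los Angeles hub.
links = {
    "NewYork":    ["Chicago", "LosAngeles", "Washington"],
    "Chicago":    ["NewYork", "LosAngeles"],
    "Washington": ["NewYork"],
    "LosAngeles": ["NewYork", "Chicago", "Phoenix", "Tucson", "LasVegas"],
    "Phoenix":    ["LosAngeles"],
    "Tucson":     ["LosAngeles"],
    "LasVegas":   ["LosAngeles"],
}

crippled = remove_node(links, "LosAngeles")
print(sorted(reachable(crippled, "Phoenix")))  # the spoke city is stranded
```

    With the hub intact, Phoenix can reach every other city; with Los Angeles removed, Phoenix is cut off entirely, while the remaining hub cities stay connected to one another through their redundant links.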

  • "Court Finds Limits to California Jurisdiction in Cyberspace"
    SiliconValley.com (11/26/02); Mintz, Howard

    In a ruling issued Monday, the California Supreme Court determined that merely posting content online does not, by itself, subject out-of-state defendants to copyright suits in California courts. The 4-3 decision is considered a triumph for civil liberties organizations and a setback for the DVD industry, which has waged a long-term campaign to prohibit the online distribution of DeCSS, a technique used to circumvent DVD copy protections. The justices ruled in favor of Texas webmaster Matthew Pavlovich, who was among dozens of defendants sued by the DVD industry in Santa Clara County Superior Court in 1999 for posting DVD-decryption code on the Internet. Pavlovich challenged the industry's argument that such actions would hurt film and DVD companies, many of which are located in California. However, the court did not rule on the legality of posting the software online, and it noted that Pavlovich and the other defendants could still be sued in their own states. "This decision is good for individuals and good for small and medium-sized businesses," declared Pavlovich's attorney Allonn Levy. "It will be worse for multinational conglomerates who use the judicial system in a way it's not meant to be used." DVD Copy Control Association attorney Jeffrey Kessler said the industry is weighing its options, including filing an appeal with the U.S. Supreme Court.

  • "China Tries to Woo Its Tech Talent Back Home"
    Los Angeles Times (11/25/02) P. B1; Tempest, Rone

    The Chinese government is ramping up efforts to win back its engineering and technology students who had gone to the United States to study but stayed to do business. Chinese consul general Wang Yunxiang in San Francisco says jobs for returning Chinese technology workers offer comparable benefits to ones in the current American economy, given the differences in the cost of living in the two countries. In addition, some experienced returnees receive a car and driver, a home, and a well-regarded social status. Chinese-born technology workers in the United States can travel to China with all expenses paid, so that local officials have a chance to woo them personally. In addition, several local Chinese governments joined together this month to sponsor a three-day "China Meets Silicon Valley" job fair in San Jose, which attracted more than 4,000 Chinese-born workers. University of California, Berkeley professor AnnaLee Saxenian, who studies the immigrant communities in Silicon Valley, says that in today's globalized world, the idea of "brain drain" should be replaced by "brain circulation" because of the interaction that occurs between peers on both sides of the Pacific. Many Chinese who return to their native country maintain close ties with business or academic contacts in the U.S., and some executives fly back and forth over the ocean so frequently that they are jokingly dubbed "astronauts." Additionally, Consul Wang says the Chinese government still encourages its brightest students to pursue graduate studies in America.
    (Access to this site is free; however, first-time visitors must register.)

  • "TeraGrid Supercomputing Project Expands"
    CNet (11/25/02); Shankland, Stephen

    The National Science Foundation, which apportioned $53 million last year to fund the construction of the Distributed Terascale Facility (TeraGrid), has authorized an additional grant of $35 million to extend the facility to include different kinds of supercomputers at five separate sites and change a critical component of the system. The 2001 TeraGrid design consisted of four computers incorporating 3,300 Itanium 2 McKinley chips, but the new plan will instead use the upcoming Madison chips, which are also part of the Itanium 2 line. In addition to the four machines located at the National Center for Supercomputing Applications (NCSA), the California Institute of Technology, the San Diego Supercomputer Center, and Argonne National Laboratory, the TeraGrid will encompass a Hewlett-Packard computer with over 2,700 Alpha processors and a new HP server that uses next-generation EV7 Alpha chips at the Pittsburgh Supercomputing Center. The TeraGrid may later come to include an NCSA-based cluster of a dozen IBM p690 servers capable of carrying out 1 trillion calculations per second. "[The TeraGrid is] intended so that additional sites with sufficient compute, storage and network bandwidth will be able to join," comments NCSA's Rob Pennington. TeraGrid's applications demonstrated at last week's SC2002 supercomputing event included climate change simulation, 3D molecular structure animation, and high-speed access to information at remote sites. Once complete, the TeraGrid computers will be able to collectively perform 20 trillion calculations per second.

  • "Volunteers Wanted for IT National Guard"
    InternetNews.com (11/25/02); Joyce, Erin

    As part of the Department of Homeland Security initiative, the federal government will put out a call for volunteers to serve in the National Emergency Technology (NET) Guard, a taskforce of science and technology experts that will quickly mobilize to repair disruptions to the nation's communications and technology infrastructure caused by terrorist attacks or other emergencies. The NET Guard proposal grew out of the communications outages and bottlenecks that followed the Sept. 11 attacks, and the poor coordination of the private-sector and scientific experts who tried to aid the rescue and recovery efforts. The NET Guard legislation calls for establishing interoperability between the communications systems of emergency response personnel, a recommendation that has helped garner the support of federal emergency agencies that have had to contend with obsolete, incompatible communications technology. The bill outlines a pilot program designed to help deploy this communications interoperability at the state level, as well as the establishment of a national clearinghouse of civilian emergency prevention and response technologies. Also under the bill, a center for testing antiterrorism and disaster response technology would be set up within the National Institute of Standards and Technology, while a technology reliability advisory board would also be established. Within 12 months of the bill's signing, the president is expected to choose a department, office, or agency to aggregate and manage a database of volunteer nongovernmental technology and science specialists who would assist federal counterterrorism efforts. The head of this body will then be expected to allocate $5 million in grants to fund pilot programs for the IT National Guard.

  • "Students Learning to Evade Moves to Protect Media Files"
    New York Times (11/27/02) P. C3; Harmon, Amy

    In response to warnings from entertainment companies about students downloading copyrighted material without authorization, as well as the bandwidth costs of such activity, U.S. colleges are attempting to deter the behavior with a variety of strategies: hosting educational sessions, severing access to portals used by file-trading services, and installing software that rations bandwidth for each student. However, such measures are prompting some students to find ways to avoid detection and blocks. For instance, administrators can set download limits by tying a student ID to a dorm room computer, but students can circumvent this by downloading files on computers registered to others. Others spread their activities across multiple computers to confuse detection efforts. Students such as Lehani Potgieter say these skills and techniques are easily learned from other people. Copyright holders are urging universities to monitor their networks heavily for signs of file swapping, but this creates a dilemma for educational institutions: how to teach students about copyright law and ethical behavior without curtailing their rights to privacy and free speech. "The biggest problem that universities are having is they have not openly decided whether their primary responsibility in this regard is law enforcement or education," observes Virginia Rezmierski of the University of Michigan's School of Information. "Right now they're doing more monitoring than education."
    (Access to this site is free; however, first-time visitors must register.)

  • "Bush Signs Homeland Security Bill"
    CNet (11/25/02); McCullagh, Declan

    President Bush signed the Department of Homeland Security bill into law on Monday, authorizing the consolidation of 22 federal agencies into a single body tasked with protecting the nation's critical infrastructure. The law has civil liberties groups worried about last-minute provisions that expand the authority of law enforcement to eavesdrop on citizens' Internet activity or telephone conversations without court orders, allow Internet providers to reveal information about subscribers to police in times of emergency, and impose stiffer penalties on people convicted of malicious cybercrimes, including life imprisonment. Another late provision decrees that critical infrastructure information companies disclose to the department will not be subject to the Freedom of Information Act. Also generating concern is a huge database funded by the Defense Advanced Research Projects Agency (DARPA) designed to profile almost every American's behavior and spending habits. The Total Information Awareness (TIA) program is headed by former admiral John Poindexter, whom Electronic Privacy Information Center director Marc Rotenberg deems an inappropriate choice. The homeland security law also apportions $500 million for technology research; calls for the establishment of an office that will concentrate on law enforcement technology and finance tools to help state and local police fight cybercrimes; and sets up a Directorate for Information Analysis and Infrastructure Protection. White House advisor Tom Ridge was nominated by the president to run the Department of Homeland Security.

  • "MIT Cooks Up Wired Kitchen Tools"
    Wall Street Journal (11/27/02) P. B3A; Byrt, Frank

    Since 1998, MIT's Media Lab has been the center of an effort to develop sophisticated electronic kitchen products, the foremost being the Minerva interactive countertop, which combines cameras, scales, and computers to assist chefs in food preparation. Using a camera, the Minerva system can read the radio frequency identification (RFID) tags of each ingredient container, while a scale can weigh the amount of each ingredient; this interaction enables the system to instruct the cook in how much of each ingredient he should add, as well as what kind of foods can be created from the components on display. Other Media Lab kitchen innovations include a countertop whose height can be adjusted, a toaster that can burn greetings, horoscopes, or other messages pulled off the Internet into the bread, an oven mitt that can alert the wearer if an item is too hot, and a "chameleon mug" that can change color in response to the heat of its contents. Technologies pioneered at MIT have also found their way into commercial products: In 1999, Sweden's Electrolux Group debuted a prototype fridge that reads RFID tags to manage inventory and replace foods that are about to spoil, and can create shopping lists on a computer screen and order new food from a grocery-delivery service via email. More recently, LG Electronics started working on a fridge equipped with an Internet communications center, a database of recipes, television, radio, a movie camera, address book, and calendar. Ken Wacks of the Wrap consulting firm explains that networked appliances are a relatively new concept, but are part of the overarching goal of integrating technology into household systems control.

  • "Tough Microbes Offer Clues to Self-Assembling Nano-Structures"
    SiliconValley.com (11/26/02); Chui, Glennda

    NASA biologist Jonathan Trent has proposed that a hardy extremophile organism's resistance to high temperatures could be exploited to produce self-assembling arrays of tiny structures, a central goal of nanotechnology. The microbe in question, Sulfolobus, survives in high-temperature, acidic environments such as hot springs because it produces a specific protein, and Trent's research team determined in 1991 that human beings produce a similar protein. These proteins spontaneously aggregate into fat, barrel-shaped structures called chaperonins, and research has shown that they can be assembled into different configurations. Trent and his colleagues investigated whether they could devise a stable, reproducible process for creating such shapes, as well as integrate them with other materials; his team set out to tweak the protein's structure and arrange the chaperonins into an array. They have learned through experimentation that gold or semiconducting material, when added to the chaperonins, sticks within the molecules' small openings, and the chaperonins themselves self-align into orderly rows. Trent and his team detail their research in a report published this week in Nature Materials. Ranganathan Shashidhar of the Naval Research Laboratory says that at the scale the NASA scientists work with, electrons could leap wirelessly between particles, for instance. Chaperonins show promise because they resist the high temperatures of many manufacturing processes, but scientists are also researching polymers, viruses, DNA, and other molecules as possible nanotechnology building blocks.

  • "Free Software vs. Goliaths"
    Boston Globe (11/25/02) P. F3; Bray, Hiawatha

    The nonprofit Free Software Foundation faces some formidable adversaries in its struggle to support the open-source software movement, including politicians, copyright holders, and commercial software companies. These organizations and individuals have vast resources and clout, while the movement, in Columbia University law professor Eben Moglen's opinion, is seriously underfunded. The foundation has been waging a campaign against the Digital Millennium Copyright Act (DMCA), which prohibits consumers from bypassing digital music and movie encryption. Other legislation that has raised the ire of open-source advocates includes Sen. Ernest Hollings' (D-S.C.) proposal to force computer makers to install anti-piracy chips in their hardware. Meanwhile, Microsoft announced it would freely license its patented file-swapping methods to all manufacturers except those that produce free software, thus stymieing interoperability between Windows and free software products. But although organizations such as the Free Software Foundation face an uphill battle, they have important officials on their side. White House computer network security director Richard Clarke has recommended a loosening of the DMCA's antipiracy rules, while Rep. Rick Boucher (D-Va.) has drafted a bill calling for DMCA revisions. Moglen worries that the Hollywood movie industry could push the government to enact legislation that would cripple the free software movement, and he is urging his peers to contribute to organizations such as the Free Software Foundation.

  • "Free-software Gadfly Takes on Net Group"
    CNet (11/25/02); Festa, Paul

    Open-source advocate and Linux pioneer Bruce Perens is working to change the Internet Engineering Task Force's (IETF) policy of allowing proprietary, royalty-bearing technology in its specifications. Because the IETF's current membership opposes such a change, Perens is rallying free-software proponents and asking them to join the IETF so as to push their agenda more effectively. Free software cannot implement IETF specifications that require royalty payments. IT vendors have tried to win exemptions for their intellectual property, but have encountered criticism from customers who already make large royalty payments. Perens says the free-software community successfully fought against the inclusion of intellectual property in World Wide Web Consortium (W3C) standards earlier this year, and that his drive to sign up more like-minded people as IETF members is intended to influence not just intellectual property issues, but other areas of interest within the IETF as well. Perens' plan follows last week's IETF meeting in Atlanta, which included a session of a working group recently formed to amend and better define the organization's stance on incorporating royalty-burdened technology into its protocols. An informal survey the group conducted at the meeting found that the open-source movement does not hold much sway with current IETF members. "Not only was there no consensus to request a recharter, there was a consensus against making such a request," noted group chair Steve Bellovin.

  • "San Diego Supercomputer Center Hits Data-Transfer Speed Milestone"
    Newswise (11/22/02)

    The San Diego Supercomputer Center (SDSC) at the University of California, San Diego, achieved a data transfer speed of 828 Mbps when moving data from its tape drives to disk drives, demonstrating the type of massive storage capabilities needed for new supercomputing applications. High-End Computing program director Phil Andrews says the data sets produced by today's experiments in anatomy, physics, and astronomy are growing by multiples of 1,000 and will soon eclipse one petabyte on a regular basis. To deal with this data growth, supercomputer centers need a mix of cheaper tape storage and disk drives, which cost much more but perform better by a factor of at least 10. The SDSC uses StorageTek tape drives and hardware from other vendors to speed what used to be days' worth of data to the SDSC's IBM Blue Horizon supercomputer in just hours. In addition, the SDSC is updating its storage area network to handle 500 terabytes by next year, up from 50 terabytes. This is in anticipation of the National Science Foundation's TeraGrid project, which will go live in 2003 and make available 20 teraflops of computing power over a 40 Gbps network. The TeraGrid will link five research sites around the country. The SDSC tallied another data transfer first in a demonstration at the Supercomputing 2002 conference in Baltimore, where SDSC computer scientist Bryan Banister moved data to that city from La Jolla, Calif., at 721 Mbps. Such speeds will be routine once the TeraGrid is operational.
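
    The significance of an 828 Mbps sustained rate is easy to see with back-of-envelope arithmetic (this is an illustrative calculation, assuming decimal units, as storage capacities are typically counted):

```python
def transfer_days(num_bytes, rate_mbps):
    """Days needed to move num_bytes at a sustained rate of rate_mbps megabits/s."""
    bits = num_bytes * 8
    seconds = bits / (rate_mbps * 1e6)
    return seconds / 86400

PETABYTE = 10**15  # decimal petabyte

# Even at the record tape-to-disk rate, a one-petabyte data set
# takes on the order of months to move.
print(round(transfer_days(PETABYTE, 828), 1))  # ~111.8 days
```

    At these rates, the petabyte-scale data sets Andrews anticipates would take months to stage from tape, which is why balancing cheap tape against fast disk is central to the center's storage design.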

  • "Way Back When"
    New Scientist Online (11/23/02); Marks, Paul

    Brewster Kahle is the inventor of the Wayback Machine, an access point for an online archive of roughly 2 billion Web pages that currently occupies more than 100 terabytes (TB). The archive is funded by Alexa Internet, his commercial Web-site cataloging business, along with donations from private organizations and a four-year National Science Foundation grant of approximately $1 million. Kahle explains that the Wayback Machine is composed of around 150 standard PC cases containing four drives each, and that the archive is physically located in the San Francisco Bay Area and at Egypt's Library of Alexandria. He hopes the Wayback Machine will become part of a network of interoperable online databases. Data portability is achieved through a simple file format that uses a minimum of meta-tags, while Web pages are stored as HTML. Kahle adds that the project's storage technology keeps pace with innovation--the archive is currently switching to 160 gigabyte (GB) drives, and 200 GB drives are just around the corner. The Wayback Machine has its limits, however: pay sites and password-protected sites are not recorded into the archive, and authors' requests not to post content are respected. "The public library system in the U.S. gets $25 billion a year," Kahle comments. "We could use a little of that money to do a lot better job of trying to put the classics--the best works of humankind--within reach of every child at home via my archive or something like it." Kahle reports that 10 TB of material is added to the archive each month. He also says, "Our archive is the people's medium, the wired way, and you can use it wherever you are."
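
    The figures in the article are internally consistent, as a quick sanity check shows (an illustrative calculation using decimal units; the exact drive mix is not stated in the article):

```python
# Capacity of the cluster as described: ~150 cases x 4 drives x 160 GB each.
machines, drives_per_machine, drive_gb = 150, 4, 160
capacity_tb = machines * drives_per_machine * drive_gb / 1000
print(capacity_tb)  # 96.0 TB, in line with the "more than 100 TB" archive size

# At 10 TB of new material per month, a 100 TB archive doubles in ~10 months,
# which is why the steady march to larger drives matters.
archive_tb, growth_tb_per_month = 100, 10
months_to_double = archive_tb / growth_tb_per_month
print(months_to_double)  # 10.0
```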

  • "Experts Mull 'Next Big Thing' in Computing"
    CRN Online (11/19/02); Montalbano, Elizabeth

    Leaders of the IT industry discussed future computing trends in a Comdex panel entitled "The Next Big Thing." The panel focused on wireless, security, and display technology while presenting a vision of a world full of networks. In the future, content would instinctively know what medium it would be displayed on, and all households would have a Web-linked television. General Motors' Tony Scott said the automaker is working on a network-based car that would provide a host of new services to drivers, such as Tivo-style television-on-demand. ViewSonic's Joseph Marc McConnaughey said future display technology would allow people to access so-called smart content on demand "anytime, anywhere" through high-resolution devices, while Corel CEO Derek Burney said Web services will benefit from content that automatically formats itself according to the device being used. The industry executives said Web services would have the biggest impact on the highly interconnected networks of the future. But Jim Hunt of Cap Gemini Technologies said solution providers and their customers need to resolve security problems before extending new technologies to multiple wireless devices.

  • "Software Innovation Without End"
    Financial Times (11/26/02) P. 11; London, Simon

    Computer visionary and inventor Alan Kay, a Palo Alto Research Center principal whose many credits include the standard PC interface, the highly influential Smalltalk programming language, network client/servers, and Ethernet, joins Hewlett-Packard Labs as a Senior Fellow today. His reasons for joining HP include the company's interest in the open-source software movement, reflected in its marketing of Unix- and Linux-enabled servers, and his development of Croquet, a next-generation operating system. Croquet is built on peer-to-peer computing, which Kay sees as a major future trend; it is designed to let people collaborate or play without having to go through central servers, which have limited scalability. Among the challenges in creating such a tool is establishing accord among all network devices as to what constitutes "reality." Other major industry figures devoted to peer-to-peer development include Sun Microsystems' Bill Joy, who is working on programming languages and standards for peer-to-peer networks. "Computers are everywhere but the way they are used is based on simple markets and simple notions," muses Kay, who says the computer revolution has not really happened yet. He adds, "The real revolution is going to happen over the next five, 10, 15 years."

  • "Throttled at Birth"
    Economist (11/23/02) Vol. 365, No. 8300, P. 74

    Matthew Williamson, a researcher at Hewlett-Packard's laboratories in Bristol, England, has devised a way to slow the spread of a computer virus, and it appears to work. Williamson's "throttle" method limits the rate at which a computer can connect to new computers, thereby containing the spread of viruses during the time most crucial to their survival. The method is based on the observation that an infected computer will try to connect to as many other computers as quickly as possible; uninfected machines are much slower to connect to other machines and typically seek out familiar or popular connections, such as a favorite Web site or mail server. Williamson's throttle mechanism slows the rate of connection to one per second, and applies this limit only to connections to computers not on the machine's recent-history list. In a recent test on a group of 16 computers, the throttle mechanism limited the spread of the Nimda virus dramatically: without the throttle installed, Nimda spread to 15 of the machines in just 12 minutes, whereas with the throttle in place the virus took 13 minutes to reach a second computer and half an hour to reach a third. A key feature of Williamson's technique is that it alerts people to a virus attack as a huge backlog of connection requests develops within seconds, while not interfering with normal operations such as Web browsing. Researchers are just starting to examine systems for making computers more resilient, and Williamson believes that architecture plays a critical role in fighting computer viruses.
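
    The throttle idea as described can be sketched as follows. This is an assumed reconstruction from the article, not HP's actual code: connections to recently contacted hosts pass immediately, connections to new hosts are released from a delay queue at one per second, and a rapidly growing queue is itself the infection alarm.

```python
import time
from collections import deque

class ConnectionThrottle:
    """Sketch of a virus throttle: rate-limit connections to *new* hosts only."""

    def __init__(self, history_size=5, rate_per_sec=1.0):
        self.history = deque(maxlen=history_size)  # recently contacted hosts
        self.pending = deque()                     # new hosts awaiting release
        self.rate = rate_per_sec
        self.next_release = 0.0

    def request(self, host, now=None):
        """Return 'pass' if the host was recently contacted, else queue it."""
        now = time.monotonic() if now is None else now
        if host in self.history:
            return "pass"
        self.pending.append(host)
        return "delayed"

    def tick(self, now):
        """Release at most one pending connection per 1/rate seconds."""
        released = []
        while self.pending and now >= self.next_release:
            host = self.pending.popleft()
            self.history.append(host)
            released.append(host)
            self.next_release = now + 1.0 / self.rate
        return released

    def backlog(self):
        """A fast-growing backlog signals worm-like scanning behavior."""
        return len(self.pending)
```

    Normal browsing, which revisits a handful of familiar hosts, rarely queues anything, while a worm trying to scan dozens of new hosts per second piles up a backlog within moments, which both slows it and flags the infection.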

  • "Global Positioning System: A High-Tech Success"
    Industrial Physicist (11/02) Vol. 8, No. 5, P. 24; Ashby, Neil

    The Global Positioning System (GPS), which started out as a U.S. military project to improve navigational accuracy, has evolved into a tool that civilians can use as well, and it has many potential applications. The system consists of 24 satellites positioned in six orbital planes inclined 55 degrees from Earth's equatorial plane. The satellites are equipped with atomic clocks and broadcast synchronized timing signals, from which Earth-based receivers derive precise measurements of position, velocity, and time. Receiving stations also constantly monitor the satellite data and route it to a master control station, where orbit and clock performance are checked for accuracy and stability; once updated, the information is sent back to the satellites, which retransmit the corrected signals to users. Designers of GPS receivers use the publicly available ICD-200 Interface Control Document as a jumping-off point--as a result, there is a diverse array of receivers with different prices and applications, and their cost has fallen thanks to mass production and increasing competition. GPS is already being used for vehicle navigation and meteorological analysis, and could also be used to study tectonic activity, maintain archeological sites, and map areas for land and resource management. Some companies are developing GPS-based products that aim to satisfy a recent FCC mandate to enable wireless phones to automatically supply location data to 911 emergency-service providers. GPS is not yet reliable enough for safety-critical applications, however, because its signals are weak and can be obstructed or reflected by objects.
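
    The role of the atomic clocks is easy to quantify: a receiver measures its distance to each satellite from the signal's travel time at the speed of light, so tiny timing errors translate directly into range errors. The numbers below are an illustrative calculation (the ~20,200 km orbital altitude is a standard figure for GPS, not stated in the article):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_delay(delay_s):
    """Distance to a satellite implied by the signal's travel time."""
    return C * delay_s

# A signal from GPS orbital altitude (~20,200 km) takes roughly 67 ms
# to reach a receiver directly below the satellite.
print(round(range_from_delay(0.0674) / 1000))  # ~20,206 km

# Why atomic clocks matter: a mere 1-microsecond timing error
# shifts the measured range by about 300 m.
print(round(range_from_delay(1e-6)))  # ~300 m of range error
```

    A receiver combines such range measurements from at least four satellites to solve for its three position coordinates plus its own clock offset, which is why clock stability on the satellites is the linchpin of the whole system.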
