Compaq is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, Compaq offers performance-packed products and comprehensive services.

ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Compaq or ACM.

To send comments, please write to [email protected].

Volume 4, Issue 349: Wednesday, May 15, 2002

  • "Why Hackers Are a Step Ahead of the Law"
    CNet (05/14/02); Sandoval, Greg

    Sophisticated criminal hackers are nearly impossible to catch online because they hold tremendous advantages over law enforcement: they can conceal evidence, they are technically adept, and investigators are slowed by legal bureaucracy. The number of online crimes such as identity theft and fraud is rising, as are high-profile hacks. Gartner says e-tailers lost $700 million in sales last year due to fraud, 5.2 percent of online shoppers have had their credit card numbers stolen, and 1.9 percent have been victims of identity theft. About 10 percent of hackers know the ropes well enough to be very hard to catch, routing their attacks through several servers and erasing log files as they retrace their steps. Law enforcement officers have little chance of catching hackers who are diligent and speedy, because investigators often must obtain and serve multiple subpoenas to examine server files for evidence, which is likely gone by the time they gain access. The government is stepping up its efforts, adding a new cybercrime unit to the FBI and 50 federal prosecutors to pursue cybercrimes. For businesses, simple steps such as enabling log tracking on Web sites and keeping firewall and antivirus software updated can prevent intrusions and help catch criminals if they do break in.

  • "The Trans-Pacific Valley"
    Financial Times (05/15/02) P. 9; Kehoe, Louise

    The Asian technology industry is benefiting from a two-way flow of IT workers between Silicon Valley and overseas technology hubs. In a study of chiefly Chinese and Indian Bay Area immigrant professionals, AnnaLee Saxenian of the University of California, Berkeley, found that 52 percent "have successfully adopted the unique business culture that distinguishes Silicon Valley in terms of start-ups, networking and information exchange while also maintaining extensive ties to their native countries." The study also found that 50 percent of the valley's enterprising foreigners have established business operations in their homelands, while 76 percent of Indian immigrants and 73 percent of Chinese immigrants said they may do the same in the future. However, Silicon Valley's economy is coming to depend more and more on foreign tech centers, and the loss of the best workers to better opportunities in their homelands, given the cyclical nature of the Bay Area economy, could one day end the valley's tech leadership. European immigrants have not exported valley wisdom into their own tech enclaves because they do not set up tightly woven ethnic networks, as Saxenian reports. Asian immigrants, on the other hand, maintain close ethnic ties: 55 percent of Indian startups in the valley involved at least four Indian founders; that finding also holds for 51 percent of Taiwanese entrepreneurs and 51 percent of Chinese entrepreneurs.

  • "Group Targets Digital TV Piracy"
    Los Angeles Times (05/14/02) P. C1; Healey, Jon

    In an effort to stamp out digital TV piracy, the Broadcast Protection Discussion Group, which includes various technology companies, consumer electronics manufacturers, and Hollywood studios, released a draft proposal on Saturday calling for devices such as computers and TV sets that receive digital TV broadcasts to be equipped with approved anti-piracy technology. Once that proposal is finalized, likely sometime this month, another group will work on a way to ensure that manufacturers install the technology in their products. Critics claim such a move will slow innovation and violate consumers' rights. For example, one approved anti-piracy technology would cause DVD recorders to scramble digital TV programs, making them unplayable on the tens of millions of DVD players consumers already own. Joe Kraus, founder of DigitalConsumer.org and Excite.com, argues that the proposal rests on a problematic assumption: that all online retransmissions of digital TV programs must be halted, rather than just those that infringe copyrights. His group wants a fair-use exemption, but distinguishing fair use from piracy is a virtually impossible task for protection technologies.

  • "The Supreme Court and the Wild, Wild Web"
    Newsbytes (05/15/02); Isenberg, Doug

    In a move that fits the trend toward greater Internet regulation, the Supreme Court on Monday declined to strike down the Child Online Protection Act (COPA). With this ruling, the court is saying that the controversial law may not necessarily be incompatible with the First Amendment, despite plaintiffs' protests that it could ban more online speech than necessary. The court raised the possibility that "community standards" could be used to deem material "harmful to minors," though the lower appeals court had held this unworkable because communities have borders and the Internet does not. The court essentially skipped over many significant ramifications of the law, including whether it is the least restrictive child-protection measure available and whether it is unconstitutionally vague. The decision should make Web publishers--newspapers with Web sites, adult sites, online health discussion boards, and others--more wary that the information they post could violate COPA, even though the law has never been enforced. It also marks a contrast with an opinion issued five years ago, in which the court clearly recognized that a law such as COPA "threatens to torch a large segment of the Internet community."

  • "Supercomputer Lets Researchers Study Material Failures, Atom by Atom"
    SiliconValley.com (05/14/02); Chui, Glennda

    Scientists from Lawrence Livermore National Laboratory, IBM, and the Max Planck Institute for Metal Research in Germany studied materials failures at the atomic level by running supercomputer simulations. Such simulation studies have aided the development of improved computer chip semiconductors and stronger, lighter materials for aircraft. Led by Farid F. Abraham of IBM's Almaden Research Center, the team used Lawrence Livermore's ASCI White supercomputer to study the problem for five days in late 2000. The simulations involved approximately 50 percent of the machine's 8,192 processors. One of the more alarming conclusions the study arrived at was that cracks can spread faster than the speed of sound, although such a phenomenon has not yet been observed in the real world. The number of atoms that can be simulated has increased exponentially, from about 100 atoms in 1965 to 1 billion today. Huajian Gao of the Max Planck Institute says that supercomputers will be important tools used to understand and assess nanotechnology.
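
    The simulation-size growth quoted above (from about 100 atoms in 1965 to 1 billion today) implies a remarkably steep compounding rate. As a rough back-of-the-envelope check, taking "today" to be 2002 (an assumption):

```python
import math

# Simulated system sizes quoted in the article; "today" assumed to be 2002.
atoms_1965 = 100
atoms_2002 = 1_000_000_000
years = 2002 - 1965  # 37 years

# Implied annual growth factor: (1e9 / 1e2) ** (1/37)
annual_factor = (atoms_2002 / atoms_1965) ** (1 / years)

# How often the simulated atom count doubles, in years
doubling_time = math.log(2) / math.log(annual_factor)

print(f"annual growth: ~{(annual_factor - 1) * 100:.0f}%")  # ~55% per year
print(f"doubling time: ~{doubling_time:.1f} years")         # ~1.6 years
```

    In other words, simulated system sizes have doubled roughly every year and a half, noticeably faster than the Moore's Law doubling of transistor counts.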

  • "Report Roasts Linux on Mainframes"
    ZDNet UK (05/10/02); Broersma, Matthew

    The Meta Group released a report this week arguing that Linux on the mainframe is only a short-term solution as Intel-based servers continue to drop in price and gain management capabilities. The report states that "Longer term (2005-07), as Unix/Win2000-based systems flesh out increasingly robust, mainframe-like management capabilities, the justification for paying the mainframe's significant cost premium will fade." Intel-based hardware will handle almost all of the Linux-based data center workload by 2007, Meta Group projects. The company also says that Linux is still immature for critical business applications, as demonstrated by the myriad errors and patches listed on Linux providers' sites, and it took Linux vendors to task for forcing users to update their software continuously to remedy flaws. Commodity hardware's price/performance improves 35 percent a year, a rate IBM cannot hope to match, according to Meta Group. Intel and AMD are working to make high-end computing power less expensive by introducing the Itanium 2 and Opteron 64-bit processors, respectively.

  • "Losing Their Grip"
    Boston Globe (05/13/02) P. C1; Bray, Hiawatha

    Handheld computer companies are crowding a marketplace that may not be big enough to accommodate them all. Many firms have already had the wind taken out of their sails by the economic slump, price wars, and marketing blunders, and Gartner predicts a mere 18 percent increase in handheld sales this year. Companies are now competing with new hybrid handhelds such as Sony's Clie, Compaq's iPaq, and Research In Motion's BlackBerry. Palm is the No. 1 handheld provider, thanks largely to its presence among government agencies and companies seeking to equip personnel with handhelds; Compaq has overtaken Handspring to capture the No. 2 spot thanks to the popular iPaq. Both Palm and Handspring have formulated plans to do battle with one another: Palm will offer a spectrum of handhelds that vary in sophistication, while Handspring will seek to break out of the low end of the market with advanced offerings such as the Treo, a cell phone/handheld combo. In the meantime, Compaq will hunker down and fortify its position in the corporate market. These players must also face newcomers such as Sharp Electronics, which is offering a Linux-based PDA. Palm especially appears to be doing well--in addition to holding the top spot, it posted a small profit for the quarter ending in March, mainly from sales of handhelds.

  • "Robotics, Medicine Merger Poses Quandary"
    United Press International (05/13/02); Burnell, Scott R.

    Biorobots and cyborgs were an important topic of discussion at the 2002 International Conference on Robotics and Automation this week. "More and more, biological models are used for the design of biomimetic robots [and] robots are increasingly used by neuroscientists as clinical platforms for validating biological models," noted Paolo Dario of the Advanced Robotics Technology and Systems Lab at Italy's Scuola Superiore Sant'Anna. He added that microelectromechanical systems have become small enough to interface with individual nerves. The two-way conversion between computer signals and nerve impulses would enable artificial muscles or actuators to be controlled by such signals, and MIT is conducting experiments in this regard. Its effort involves a two-dimensional grid that connects an amplifier to actuators, and MIT's Harry Asada said such research could be applied to an exoskeleton for soldiers or a means for paralysis victims to regain mobility. Carnegie Mellon University's Takeo Kanade explained that human medicine could benefit from the convergence of robotics and biology in products such as a surgical simulator designed to mimic a human body's responses to an operation. However, Penelope Boston of the University of New Mexico at Albuquerque said such efforts open an ethical can of worms, and advised researchers to carefully consider the ramifications of melding technology with biology.

  • "Two Virginia Universities to Join Forces Against Cybercrime"
    Newsbytes (05/13/02); Krebs, Brian

    George Mason University's National Center for Technology and Law and James Madison University, both based in Virginia, will collaborate on a $6.5 million initiative that aims to smooth out the tangled legal, technical, and policy aspects of the nation's effort to protect its essential computer systems against online attacks. The Commerce Department's National Institute of Standards and Technology is funding the program, known as the Critical Infrastructure Protection Project. Heading the project will be John A. McCarthy, who was once part of a Clinton administration team that coordinated collaboration between the government and the private sector in readying computer systems for Y2K. "We hope that by putting our third-party hat on we'll be able to bring together the right constituencies to broker lasting and useful solutions to long-term problems," he says. One such problem involves information sharing between the government and private industry, which is being held up by tech companies' worries that sensitive data will be leaked to the public through the Freedom of Information Act. Both House and Senate legislators have introduced bills that would protect such information, but consumer and privacy advocates claim the legislation really protects tech companies from legal liability. The new center will provide congressional testimony and act as the core clearinghouse for research and information related to cybersecurity.

  • "Because Little Things Mean a Lot"
    Siliconvalley.internet.com (05/13/02); Singer, Michael

    The Nanotech Planet Spring 2002 Conference & Expo taking place this week in San Jose consolidated the many players converging on this hot field, which promises great advancements in materials sciences, manufacturing, biomedicine, and computing. IT stalwarts such as IBM, Intel, and Hewlett-Packard are competing alongside smaller nanotech firms such as Altair, Nanogen, and Nanophase Technologies. Some of the hoped-for applications include computer architectures a billion times faster and nanobots that can enter the bloodstream to cure disease. Besides the potential for scientific and commercial breakthroughs--or perhaps because of it--there is a huge amount of money flowing into nanotechnology, especially from the federal government. President Bush has upped the amount of federal funds going toward nanotechnology research to $710 million, which includes $679 million in the main budget and other allotments made for NASA and the U.S. Department of Agriculture. All federal departments and agencies are concentrating on three new R&D areas: Nanoscale assembly processes, nanoscale instrumentation and metrology, and ways nanotechnology can detect and safeguard against chemical, biological, radioactive, and explosive hazards.

  • "Scientists Get Atoms Ready for a Close-Up"
    New York Times (05/14/02) P. D3; Chang, Kenneth

    In the April 25 issue of Nature, researchers at Lucent Technologies' Bell Labs reported that they have devised a new microscopy technique that can pinpoint individual atoms within a sheet of silicon. The method involves focusing a narrow beam of high-energy electrons through a microscope onto the silicon sheet, whose thickness is about one-thousandth that of a human hair. The deflection of the electrons is then measured to determine the positions of atoms. Previous techniques could only image atoms sticking out on the surface of the silicon. As transistors shrink, a more precise way of determining whether dopant atoms land in their proper places will be needed. Bell Labs research team leader Dr. David Muller explains that if an eight-inch silicon wafer were the size of the United States, then transistors would be the size of cars and atoms the size of pinheads. With the new method, "We are able to locate the equivalent of a few pins, hidden in a few cars, somewhere in the United States," he says.
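
    Muller's analogy can be sanity-checked with rough numbers. The eight-inch wafer is from the article; the U.S. width, transistor feature size, and atomic spacing below are order-of-magnitude assumptions added for illustration:

```python
# Scale an 8-inch wafer up to the width of the continental United States
# and see how large transistors and atoms become at that magnification.
wafer_m = 8 * 0.0254          # 8-inch wafer: ~0.20 m across (from the article)
us_width_m = 4.3e6            # continental U.S.: ~4,300 km wide (assumed)
scale = us_width_m / wafer_m  # magnification factor, ~2.1e7

transistor_m = 130e-9         # ~130 nm feature size, circa 2002 (assumed)
atom_m = 0.25e-9              # ~0.25 nm silicon atomic spacing (assumed)

print(f"transistor becomes ~{transistor_m * scale:.1f} m")  # a few meters: car-sized
print(f"atom becomes ~{atom_m * scale * 1000:.0f} mm")      # a few mm: pinhead-sized
```

    The magnified sizes come out to a few meters and a few millimeters, consistent with the car-and-pinhead picture.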

  • "Nanostructure Techniques Come Closer to Quantum Dot"
    UniSci (05/14/02)

    A team of scientists from the Universitat Autonoma de Barcelona, the Institute of the Science of Materials in Barcelona, the Institute of Microstructure Physics in Nizhny Novgorod, and the Institute of Semiconductor Physics in Kiev has hit upon a new method for precisely controlling the growth of semiconductor material islets, or nanoilles. The researchers observed that the distribution, composition, and shape of silicon-germanium nanoilles can be affected by several factors. Raising the temperature of the silicon substrate, for instance, increases the nanoilles' silicon content. Meanwhile, varying the thickness of the germanium layers as well as the temperature can yield large densities of small pyramid-shaped islets, small densities of large round islets, or a blend of both types. This breakthrough could significantly benefit nanoelectronics and optoelectronics: The materials formed from the process could expand the color spectrum of semiconductor lasers, for example, and data transmission through fiber optics and electronic circuits could also be improved. The research brings the formation of quantum dots one step closer, and the scientists are currently working to create semiconductor nanolagoons from other materials. The discovery was reported in the journal Nanotechnology.

  • "Why the Network Will Be the Computer"
    Enterprise (05/10/02); Farber, Dan

    This year's NetWorld+Interop conference shows significant momentum building for IP-based networks, which many say are the inevitable future of enterprise computing. Several factors are currently holding back migration of corporate IT infrastructure to IP, including the lack of a clear migration plan, ambiguity over next-generation IPv6 (Internet Protocol version 6), slow broadband take-up, and businesses' self-interest holding back Web services. But as Internet technology emerges from the dot-com hype, it is being seen as an even more robust platform for enterprise applications. IBM's Irving Wladawsky-Berger, for example, predicts that grid computing over the Internet will prove the network's mettle as an application platform. Also, John Chambers and Serge Tchuruk, CEOs of Cisco and Alcatel, respectively, spoke at NetWorld+Interop about the new IP-based LAN and WLAN technologies that will enable corporate IP-based computing in the future. Chambers said, "Almost no CIO I talk to today disagrees that within five years we will have a single infrastructure for data, voice, and video." Now that the hype for "cool" Internet applications is over, the next phase of the Internet, in which it becomes a network infrastructure and applications platform backbone, can begin, writes Dan Farber.

  • "INS to Launch Online Foreign-Student Tracking System"
    IDG News Service (05/13/02); Garretson, Cara

    The Department of Justice's Immigration and Naturalization Service (INS) will roll out an online system to track foreign students attending U.S. learning institutions and make sure they are enrolled in the schools they came to study at. Officials say the system will also be used to report whether students have been convicted of crimes, changed their names, or dropped out of their courses. Attorney General John Ashcroft last week announced the Internet-based system, named the Student Exchange Visitor Information System (SEVIS). Schools are required to begin using SEVIS by Jan. 30, 2003, but participation can start as early as July. SEVIS replaces a paper-based tracking system widely criticized as inefficient and faulty, and has been in the works since before the Sept. 11 attacks; several of the terrorists involved entered the United States on student visas. Ashcroft says about one million foreign students are currently enrolled at U.S. institutions.

  • "Web Pioneer Looks at Ground Covered, Future"
    SiliconValley.com (05/11/02); Gillmor, Dan

    At last week's 11th annual World Wide Web Conference, Internet visionary and World Wide Web Consortium director Tim Berners-Lee said the Web has come a long way, although problems remain. He noted that the Internet community has developed tools that could increase the collaborative nature of the Web, although they are incomplete; the community itself, despite the high degree of Internet penetration in the United States, is still in its adolescence. Berners-Lee called the digital divide a deeply troubling trend, but said the issue is much bigger than merely adding Internet access to poor and remote areas. "The bottom line is that the developed world needs to spend a lot more taxpayer money on helping the developing world," he advised. In his keynote speech, Berners-Lee warned that imposing royalties on Web technology patents would stifle innovation. He also argued that the Web's past incarnations should be preserved for posterity, and that laws protecting copy-prevention technologies will ultimately fail, because people will find ways to circumvent those technologies. The future Net will be machine-readable as well as human-readable with the advent of the Semantic Web, according to Berners-Lee, who predicts that its usability and flexibility will expand and that it will boast omnipresent connections.

  • "PCs: For Whom the Decibels Toll"
    Wired News (05/13/02); Glasner, Joanna

    The noise computers produce is becoming a significant issue for more and more PC buyers, and designers are taking note. Because newer machines operate faster and produce more heat, the apparatus needed to cool them down makes more noise. Solutions to this problem include reducing processors' heat output and installing less noisy fans, according to PC noise expert Tomas Risberg. "Low noise" products are being promoted and deployed by NEC and Transmeta, among others. NEC's MATE PC is based on Transmeta's Crusoe chip; Transmeta claims that the same technology that allows Crusoe to operate at low power also makes it ideal for reducing noise. Microsoft, Intel, and Apple are also pushing noise reduction and its pluses. In fact, Microsoft issued Windows XP design guidelines that recommend that computers produce no more than 37 decibels adjusted for the human ear (dBA) when idle and 55 dBA when active. Meanwhile, PC industry experts are not completely convinced that noise is a deciding issue for PC buyers: First of all, PCs are far less noisy than ringing phones, traffic, and other people. In the second place, noise-sensitive buyers may opt for quieter laptops instead of desktops.
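
    Because decibels are logarithmic, the gap between those two guideline figures is larger than it looks. A quick calculation using the standard relation (every 10 dB is a tenfold change in sound power):

```python
# Windows XP noise guideline levels cited in the article
idle_dba = 37
active_dba = 55

# Every 10 dB corresponds to a 10x change in sound power,
# so an 18 dB gap works out to 10 ** 1.8, roughly a 63-fold difference.
power_ratio = 10 ** ((active_dba - idle_dba) / 10)

print(f"55 dBA is ~{power_ratio:.0f}x the sound power of 37 dBA")  # ~63x
```

    So a machine at the active limit is emitting roughly 63 times the sound power of one at the idle limit, even though the numbers differ by less than 50 percent.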

  • "The Future Is Here"
    National Journal (05/11/02) Vol. 34, No. 19, P. 1372; Cannon, Carl M.

    The gulf between agencies that possess and are implementing cutting-edge technology and those that still rely on outdated systems is illustrated by Sept. 11: The tragedy was precipitated by terrorists who used the Internet to coordinate their attacks and were able to enter and move freely throughout the country because the Immigration and Naturalization Service lacked the electronic means to quickly identify and track them. Furthermore, the government is often unable to optimally deploy the latest technologies because free-market, individual-oriented democracy does not provide a leadership structure for dispensing technological innovations. Paul Saffo of the Institute for the Future notes that technology no longer trickles down from the government to the private sector, but flows the other way. Barriers to federal adoption of new technologies include unwieldy procurement rules, limited funds for technology initiatives, the scale of the worldwide consumer market, and the high price government often pays for failure. "Not only do government systems typically have to work tolerably well, they have to serve everyone equitably and they have to be semitransparent in terms of their development, budgets, and accountability," explains Gary Chapman of the University of Texas. By dispensing technology wisely, the government can make a positive impact on much besides security, including cost savings and quality of life. For example, forest fires could be controlled better if the Forest Service and the Bureau of Land Management had access to the fleet of missile-launch-detection satellites the Defense Department plans to put up--but such technology sharing can only take place with congressional support, says Satellite Industry Association executive director Richard Dal Bello.

  • "The Next (Not So) Big Thing"
    InformationWeek (05/13/02) No. 888; Ewalt, David M.

    Nanotechnology promises to usher in world-shaking changes in many industries, although the field is still in its early stages. Thus far, the nanotech products already on the market are mostly passive: They include "pixie dust" woven into fabric to make clothing stain-resistant, nanopowder incorporated into sunscreen, and nanoparticles injected into plastic to keep beverages fresh. Although a method to mass-produce molecular machines has yet to be developed, NanoBusiness Alliance executive director Mark Modzelewski says that "[Nanotech] is starting to show up in the profit column instead of the R&D column." His observation is borne out by a survey estimate that the nanotech industry produces annual sales revenues of $45.5 billion, a figure that could skyrocket to $700 billion by 2008; another study, from the National Science Foundation, projects a $1 trillion nanotech market by 2015. The private sector is less eager to invest in nanotech than the federal sector, so most research will probably be led by the military, academia, and a few corporate labs. Carbon nanotubes are a heavy area of focus, since their strength, flexibility, and electrical conductivity could be used to produce ultra-small, super-fast circuits and nanowires that could shrink electronic and optical hardware. This will be especially important as manufacturers approach the limits of silicon's ability to uphold Moore's Law. Perhaps the most ambitious long-term goal of nanotech is the creation of minuscule supercomputers. Ways must be found to smoothly integrate disruptive technologies such as nanotech into business, and to allay the fears of skeptics and of those alarmed by the science's potential downsides, including the claim that nanoscale robots could destroy the Earth.

  • "Breaking Down the Language Barrier"
    Wireless Review (04/02) Vol. 19, No. 4, P. 28; Wickham, Rhonda

    The SyncML initiative conceived by Douglas Heintzman of IBM aims to synchronize different computing devices with each other, and the technology is making rapid gains with leading wireless companies. The first SyncML protocol facilitated end-to-end linkage between devices and back-end servers, while the latest version offers users mobile device management and software downloads; Heintzman expects digital rights management to be featured in a future version. Heintzman says, "SyncML is like what HTTP and IP are to the wired world." Nokia, Ericsson, Openwave Systems, Motorola, Starfish, Symbian, and IBM are working together to offer multiple perspectives and ensure the appropriateness of design decisions, according to Heintzman. SyncML addresses users' need to perform tasks whether or not they are linked to back-end applications by enabling an "intermittently connected" computing model: By defining a standard protocol and object representation, it lets virtually unlimited numbers of back-end servers, applications, and front-end devices connect to process changes and updates. A large stable of handset, server, and application companies has adopted SyncML, but Microsoft has not, probably because "Industry specifications aren't necessarily in their best interest," explains Heintzman. He adds that this could work to Microsoft's disadvantage, since its rivals' devices will be able to talk to each other through SyncML. SyncML is designed to run on platforms such as the Pocket PC and over Bluetooth, wide-area, local direct-connect, and infrared networks.
