
       HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 4, Issue 424: Monday, November 18, 2002

  • "Affordable Technology, Not Gee-Whiz Stuff, Is Focus at Comdex"
    SiliconValley.com (11/17/02); Takahashi, Dean

    Fancy, impractical innovations may give way to more affordable products at this year's Comdex trade show, although state-of-the-art technology will still be on display. National Semiconductor CEO Brian Halla says the tech industry's recovery is likely to be driven by technology that supports Internet connectivity, a view shared by Comdex keynote speakers such as Microsoft Chairman Bill Gates, Sun Microsystems CEO Scott McNealy, and Hewlett-Packard CEO Carly Fiorina. Wi-Fi wireless Internet technology seems poised to dominate the conference; one such product is Microsoft's Mira display, which is portable and stays linked to PCs via Wi-Fi. Gates predicts that wireless Internet connectivity will eventually be embedded into practically any type of device. Also to be showcased at Comdex will be Intel's 3 GHz Pentium 4 microprocessor and Advanced Micro Devices' Hammer chip line. Other technologies that will be promoted at the conference include National Semiconductor's Geode Extended Office, a handheld computer that comes with a complete Windows XP operating system. Also on hand at Comdex will be Nvidia's long-awaited GeForce FX graphics chip, which boasts 125 million transistors, can support more lifelike animation, and will run twice as fast as rival chips from ATI Technologies, according to sales executive Dan Vivoli.
    http://www.siliconvalley.com/mld/siliconvalley/4545327.htm

  • "Businesses, Big and Small, Bet on Wireless Internet Access"
    New York Times (11/18/02) P. C1; Markoff, John

    Both large and small companies are investing in wireless Internet technologies such as Wi-Fi, which are being touted as the next major communications advance, promising anytime, anywhere high-speed Internet access. San Francisco-based Vivato, for example, has a Wi-Fi antenna that more than a thousand people can access from as far away as four miles; Surf and Sip and Boingo Wireless are pursuing the establishment of Wi-Fi "hot spot" networks; and Intel, IBM, and AT&T's Project Rainbow is dedicated to setting up even larger Wi-Fi networks in U.S. metropolitan areas. The development of Wi-Fi has been nurtured by the federal government's decision to reserve a small area of unlicensed radio spectrum that anyone complying with a small set of rules could share. Wi-Fi proponents expect the next major development will be "location-aware" communications services. Meanwhile, Wi-Fi is a hot area of development and entrepreneurship in an otherwise muted Silicon Valley. Wireless Internet technology could flourish even more if the FCC institutionalizes its recent recommendation for a more flexible, consumer-centered policy favoring systems that boost radio-spectrum efficiency and simplify the creation of new digital wireless services. Some economists believe that the pace of innovation would be ratcheted up by such a development, but Gregory L. Rosston of the Stanford Institute for Economic Policy Research warns that Wi-Fi could cause airwave congestion. Furthermore, Silicon Valley-based digital wireless entrepreneurs caution that the major wired Internet players are likely to resist such a huge paradigm shift.
    http://www.nytimes.com/2002/11/18/technology/18WIFI.html
    (Access to this site is free; however, first-time visitors must register.)

  • "New Light Shed on Unbreakable Encryption"
    ZDNet (11/15/02); Junnarkar, Sandeep

    A Northwestern University research team led by Prem Kumar and Horace Yuen has developed a quantum encryption technique in which bundles of photons are transmitted over a fiber-optic line at 250 Mbps, which the scientists estimate is more than 1,000 times faster than current quantum technology. The method uses "secret key" cryptography, in which both sender and receiver possess the same key (the shared-key idea is illustrated in the sketch after this item); the sender manipulates light with the key to build a more intricate pattern than the conventional technique of representing data as zeros and ones. Light granularity, or quantum noise, is also used by the method--by randomly polarizing the light, for example, the researchers change the granularity and thus render the message unreadable to eavesdroppers; the message is destroyed whenever an eavesdropper disturbs the photons. Still, quantum-encrypted messages can only be transmitted over dedicated fiber-optic lines, and only over distances of roughly 90 kilometers or less. IBM, NEC, and Verizon Communications are just a few of the firms working to solve this problem, and advocates believe that the military, intelligence, and finance sectors are likely to replace their current algorithm-based encryption technologies with quantum encryption once these breakthroughs are achieved. Although Alexei Trifonov of Magiq Technologies says that quantum cryptography will end the "vicious circle" in which the latest encryption algorithms are continuously challenged by code breakers, critics worry that quantum computing could allow malicious parties who have stolen sensitive data encrypted with supposedly impenetrable mathematical algorithms to finally crack the code. The Defense Advanced Research Projects Agency (DARPA) is funding the Northwestern research, and Kumar says that his team is collaborating with BBN Technologies and Telcordia Technologies to commercialize the breakthrough.
    http://zdnet.com.com/2100-1104-965957.html
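
    The following toy Python sketch illustrates only the classical "shared secret key" idea referred to above--both parties hold the same random key and the sender combines it with the message--not the photonic, quantum-noise scheme itself; the function and variable names are illustrative assumptions, not anything from the Northwestern work.

      # Classical one-time-pad-style XOR with a pre-shared key; purely an
      # analogy for the "both sender and receiver possess the same key" idea.
      import secrets

      def xor_with_key(data: bytes, key: bytes) -> bytes:
          """XOR each message byte with the corresponding key byte."""
          if len(key) < len(data):
              raise ValueError("key must be at least as long as the message")
          return bytes(b ^ k for b, k in zip(data, key))

      shared_key = secrets.token_bytes(32)              # pre-shared random key
      message = b"meet at 250 Mbps"
      ciphertext = xor_with_key(message, shared_key)    # sender encrypts
      recovered = xor_with_key(ciphertext, shared_key)  # receiver decrypts
      assert recovered == message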

  • "Study: Linux's Security Problems Outstrip Microsoft's"
    NewsFactor Network (11/15/02); Maguire, James

    Open-source software has more security holes than Microsoft software, according to an Aberdeen Group report. The report backs up its conclusions with findings from the Computer Emergency Response Team (CERT) showing that 16 of the 29 advisories issued in the first 10 months of 2002 concerned open-source and Linux software, while Microsoft software accounted for only seven reported problems. CERT also reports that the number of Trojan horse and virus advisories involving Microsoft applications fell from six in 2001 to zero in the first 10 months of 2002. Aberdeen Group research director Eric Hemmendinger, who co-authored the report, notes that the greater number of open-source vulnerabilities runs counter to the assumption that Microsoft software has the weakest security. He attributes the rise in open-source security flaws to the lack of a quality-assurance testing entity. Meanwhile, Hemmendinger believes that the shrinking number of Microsoft security problems could demonstrate that the company's increased emphasis on security is having a noticeable effect. "[T]here have been a number of things that have gone on [in Microsoft] over the last couple years reflecting that they know security matters, and that they had to pay attention to it," he declares. Hemmendinger anticipates that Microsoft security problems will either continue to fall or plateau, while open-source security advisories will continue to rise.
    http://www.newsfactor.com/perl/story/19996.html

  • "Wi-Fi Encryption Fix Not Perfect"
    Wired News (11/15/02); Batista, Elisa

    Most of the security headaches associated with Wi-Fi stem from users failing to activate the Wired Equivalent Privacy (WEP) encryption program, but the program itself is not immune to hacker attacks. That is why cryptography consultant Niels Ferguson developed Wi-Fi Protected Access (WPA), which uses mathematical algorithms to authenticate legitimate users and block access to unauthorized ones. However, Ferguson admits that WPA is susceptible to denial-of-service (DoS) attacks, including a unique form of assault in which a hacker sends two packets of unauthorized data within a one-second period, convincing the system it is under siege and triggering self-deactivation (a rough sketch of this shutdown logic follows this item). Computer consultant Arnold Reinhold adds that several attacks could cut wireless users off from their networks for a minute at a time, while tracking down the culprits would be difficult. Furthermore, Ferguson and Wi-Fi Alliance members note that end users may not even notice such an attack, given its speed and how easy it could be to launch. Reinhold posits a scenario in which a retailer resorts to such an attack in order to sabotage a rival whose business operations rely on wireless communication--one of several worries that have made many businesses reluctant to adopt Wi-Fi networks. Nevertheless, analysts say that Wi-Fi will continue to penetrate markets such as college campuses and households despite its security drawbacks.
    http://www.wired.com/news/business/0,1367,56350,00.html
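
    A minimal Python sketch of the kind of shutdown countermeasure described above, assuming the article's figures (two forged packets within one second trigger a roughly one-minute shutdown); the class, thresholds, and method names are illustrative and do not reflect the actual WPA specification.

      import time

      class IntegrityFailureMonitor:
          """Tracks failed-integrity packets and shuts the link down when the
          failure rate suggests an active forgery attack."""

          def __init__(self, window_s=1.0, max_failures=2, shutdown_s=60.0):
              self.window_s = window_s          # detection window
              self.max_failures = max_failures  # forged packets tolerated
              self.shutdown_s = shutdown_s      # how long to stay offline
              self.failures = []                # timestamps of recent failures
              self.offline_until = 0.0

          def record_failure(self, now=None):
              """Returns True if the link should deactivate itself."""
              now = time.monotonic() if now is None else now
              self.failures = [t for t in self.failures
                               if now - t <= self.window_s]
              self.failures.append(now)
              if len(self.failures) >= self.max_failures:
                  self.offline_until = now + self.shutdown_s
                  self.failures.clear()
                  return True
              return False

      monitor = IntegrityFailureMonitor()
      monitor.record_failure(now=0.0)         # first forged packet
      print(monitor.record_failure(now=0.5))  # second within 1 s -> True

    This also makes the DoS concern concrete: an attacker would only need to inject a couple of bogus packets per minute to keep such a network offline.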

  • "New Life for Old PCs"
    San Francisco Chronicle (11/16/02) P. B1; Norr, Henry

    Users can donate old PCs that they no longer want or use to the nonprofit World Computer Exchange (WCE), which gives them to students in developing nations. Donors receive tax deductions and do not have to worry about paying recycling fees, while Richard Gingras, who heads the WCE's Bay Area branch, says the organization collaborates with recipient nations to "make sure the educational implementations within each country are appropriately planned and appropriately funded." WCE has shipped 4,000 units to 585 schools in Asia and Africa since its inception in March 2000, and hardware donated at a weekend event hosted by the Bay Area branch will go to the Republic of Georgia; other countries that may soon receive old computers include Bolivia, Afghanistan, Vietnam, and Zimbabwe. WCE has also worked out deals in which it receives volume donations from companies that are replacing their older PCs with newer systems. Computers are usually shipped with their operating systems intact, but if donors remove the hard drive for security reasons, WCE installs a copy of the free Linux OS. Gingras estimates that it costs an average of $75 to collect, process, ship, and install a computer, money that is provided by sponsors in the recipient countries, such as universities, foundations, or government grants. Goa Schools Computers Project representative Daryl Martyris says that, with the help of the Exchange, his organization has placed 380 computers in community Internet centers in 100 schools. WCE recommends that 10 percent of the donated computers be used for spare parts.

  • "AbiWord Up"
    Salon.com (11/15/02); Leonard, Andrew

    Open-source software continues to make headway despite the boom and subsequent bust in the technology sector, primarily because open-source developers are motivated by things other than IPOs or marketability. AbiWord, donated to the free software community after its parent company gave up trying to make money with it, is one example of the sustainability of open-source software. The word processing program is improved continually as contributing developers from around the world submit new features and add-ons. AbiWord is also available for free download from the AbiSource Web site. Developers communicate on the IRC channel #abiword, encouraging one another and sharing technical details about the ever-improving program. AbiWord and other desktop open-source applications do not have the same potential for influencing the corporate IT market as open-source system software, but the project endures because of the dedication of its contributors. University of Melbourne particle physicist Martin Sevior says he decided to join the AbiWord effort after finding the application more useful than others in opening different file formats. He says the people he found working on AbiWord also drew him to the project, and he is now in charge of making tables work in the program. Erik Sink, founder of AbiSource, the company that spawned AbiWord, says open-source work is good for unemployed coders because they can hone their skills while enhancing their resumes.
    http://www.salon.com/tech/col/leon/2002/11/15/abiword/index.html

  • "Supercomputer to Use Optical Fibers"
    New York Times (11/18/02) P. C5; Markoff, John

    The California Institute for Telecommunications and Information Technology today will unveil a new design for a campus-wide supercomputer that places an optical router at the center of the system, allowing some information sharing to take place at the speed of light. Using a grid computing design, the new "optiputer" will initially feature about 500 Intel microprocessors running the Linux operating system and linked to an optical switching system. The California Institute, led by Dr. Larry Smarr, is a joint venture of the University of California at Irvine and the University of California at San Diego, which will house the optiputer. Unlike in traditional computing systems, the optiputer's optical communications lines make the processors the slowest link in the supercomputer system. Smarr says, "We're moving to an optical-centric world in which the computers are slow things and you reluctantly add them in."
    http://www.nytimes.com/2002/11/18/technology/18OPTI.html
    (Access to this site is free; however, first-time visitors must register.)

  • "Talking Innovation: Marc Andreessen"
    Boston Globe (11/11/02) P. C1; Weisman, Robert

    Marc Andreessen, Netscape pioneer and co-founder of automation software firm Opsware, says technology is becoming cheaper and more useful for companies as it moves toward commoditization. Linux, the open-source operating system that can run UNIX applications, and Intel-based servers are examples of what Andreessen predicts will be the future of IT. He says that companies whose business models leverage commoditization will do well, while other companies will have to adapt by creating highly specialized and intelligent software or developing technologies to manage growing IT infrastructure. Andreessen says the Internet has exceeded his expectations in terms of its breadth and use. He admits that not even those responsible for developing pioneering Internet technologies anticipated correctly how the Internet would take off, and nearly everyone wrongly expected interactive television to be the next disruptive technology. Over the coming years, Andreessen predicts that ideas for new technologies will continue to come from skilled teenagers with innovative ideas, such as the file-trading network Napster--except that those teenagers will not be based in the United States, but in China, Russia, Indonesia, and other places where the Internet has given them the knowledge and tools necessary to innovate. He forecasts that a decade from now, "there's going to be a lot more software in the world than there is today."
    http://digitalmass.boston.com/news/tech_innovation/news/1111_qa.html

  • "Shortage of IT Workers Reaches a Critical Stage"
    Government Computer News (11/18/02) Vol. 21, No. 33, P. 35; Walker, Richard W.

    A year ago, a report from the National Academy of Public Administration (NAPA), commissioned by the federal CIO Council, concluded that there is a dearth of IT personnel across the United States, a situation that will be exacerbated by the retirement of about 50 percent of the current federal IT workforce over the next five years. Adding to the difficulty are inadequate government funding of IT training and continuing education, a prolonged hiring process, and lower salaries than those offered by private industry. Ira Hobbs, co-chair of the CIO Council's IT workforce and human capital for IT committee, observes that the dot-com shakeout was not a windfall for the government because many professionals from failed startups were hired for the same or similar positions they held before they left. In order to reverse the federal IT shortage, the NAPA panel recommended key changes to the employment system, including competitive nonpay benefits, the development of a learning culture, market-driven pay, a simpler recruiting and hiring process, and managerial flexibility. The proposed Homeland Security Department, whose establishment is ever more likely thanks to Republican victories in the midterm elections, will have 170,000 employees and will need an information framework requiring a huge IT contribution. Experts say that leadership and teamwork across all management levels is critical if the ranks of federal IT employees are to grow. Hobbs declares that the NAPA report lifted the federal IT workforce issue out of the realm of anecdotal conversation and subjected it to "qualitative analysis by a well-respected organization."

  • "Good Morning, Dave..."
    Computerworld (11/11/02) Vol. 36, No. 46, P. 36; Melymuka, Kathleen

    The Defense Advanced Research Projects Agency (DARPA) is soliciting research proposals for cognitive computer systems, and Ronald J. Brachman of the agency's Information Processing Technology Office expects to have a viable concept ready within three to five years. Such a system would be self-aware and capable of self-maintenance, could learn from experience and adapt to changing conditions, and could anticipate and plan responses to multiple future scenarios. Brachman adds that such capabilities could be used to predict actions by terrorists or business rivals in order to develop effective countermeasures. He also notes that soldiers on battlefields or civilians involved in disaster response could be helped or even replaced by such systems. Another advantage of a cognitive system is that it would require no training. "DARPA research tends to be visionary, and [although it] provides building blocks for future weapons systems, there's also applicability throughout society," explains artificial intelligence expert and Kurzweil Technologies CEO Raymond Kurzweil. Brachman says that self-testing software, speech recognition, and machine learning could provide the building blocks of self-aware systems. DARPA is courting neurologists, psychologists, philosophers, and others to help conceive of safeguards that would prevent cognitive systems from going out of control.

  • "Faster Wireless LAN Technology on Tap"
    Network World (11/11/02) Vol. 19, No. 45, P. 9; Cox, John; Hochmuth, Phil

    A faster wireless LAN specification that will boost data transmission speeds from 11 Mbps to 54 Mbps is expected to gain much attention at Comdex Fall next week. Analysts say the still-unratified specification, IEEE 802.11g, should boost the use of a variety of data-intensive, multimedia applications. Chip manufacturers Broadcom and Intersil, along with their device partners, plan to unveil new 802.11g technologies at next week's Comdex Fall 2002 show. Unlike 802.11a, the new standard does not require companies to replace their existing wireless LAN infrastructure, only the radio card in access points. Users who have 802.11b interface cards can still use the updated access points because both standards work on the 2.4 GHz frequency, but they will not be able to reach the 54 Mbps data rate afforded by 802.11g, which translates to about 19 Mbps under normal conditions (a rough comparison of what those rates mean in practice follows this item). The new standard allows for higher speeds while still offering the longer range of 802.11b, unlike the fairly recent 802.11a standard, which operates at 5 GHz. The 2.4 GHz band is crowded, however, with interference from cordless phones, Bluetooth chips, and other wireless household devices. IEEE is expected to give final approval to 802.11g in March, and the Wi-Fi Alliance plans to begin testing device compatibility soon after the standard passes. Gartner analyst Ken Dulaney suggests that administrators keep their existing 802.11b infrastructure in place and include 802.11g in their vendor requests for proposals only after the interoperability issues are defined by the Wi-Fi Alliance.
    http://www.nwfusion.com/news/2002/1111comdex.html
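
    As a back-of-the-envelope illustration of what those data rates mean, the Python snippet below compares transfer times for a 100 MB file; the ~19 Mbps effective figure for 802.11g comes from the article, while the ~5.5 Mbps effective figure for 802.11b is a common rule-of-thumb assumption, not something the article states.

      FILE_MB = 100                              # megabytes to transfer
      rates_mbps = {
          "802.11b raw (11 Mbps)": 11,
          "802.11b effective (~5.5 Mbps, assumed)": 5.5,
          "802.11g raw (54 Mbps)": 54,
          "802.11g effective (~19 Mbps)": 19,
      }
      for label, mbps in rates_mbps.items():
          seconds = FILE_MB * 8 / mbps           # MB -> megabits -> seconds
          print(f"{label}: about {seconds:.0f} s for a {FILE_MB} MB file")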

  • "Cleaning Up Clean Rooms"
    Wired (11/02) Vol. 10, No. 11, P. 32; Moran, Susan

    The microchip manufacturing process that takes place in supposedly ultra-sterile clean rooms involves massive consumption of water and hazardous, toxic chemicals, prompting environmental groups and employees to allege that working in clean rooms carries considerable health risks that could even lead to death in some cases. Los Alamos National Laboratory is developing an environmentally safer chip-manufacturing process that could save money and help shrink chips even further. The traditional photolithographic technique of chip production requires the application of a photoresist layer on the chip's surface prior to circuit etching; excess resist is later scoured away by toxic compounds and rinsed off with water, while isopropyl alcohol is used in the drying process to prevent defect-causing residue buildup. In Los Alamos' supercritical carbon dioxide resist remover (Scorr) process, the toxic chemicals are replaced with supercritical carbon dioxide, which removes the resist much faster and leaves behind almost no residue. Unlike water, supercritical carbon dioxide has no surface tension, so it is able to remove residue from smaller spaces, which would be a significant advantage in the manufacturing of smaller chips--in fact, IBM has embarked on a pilot program at its East Fishkill, N.Y., semiconductor factory to prove the effectiveness of the process. Furthermore, machines that support the Scorr process are considerably less expensive than conventional rinse-and-clean equipment. The process could be especially beneficial for Intel, whose chip plant in drought-plagued Rio Rancho, N.M., consumes 4.9 million gallons of water each day and produces noxious vapors that have allegedly caused serious medical problems for people living in the vicinity.
    http://www.wired.com/wired/archive/10.11/start.html?pg=7

  • "Tiny Tips Probe Nanotechnology"
    Industrial Physicist (11/02) Vol. 8, No. 5, P. 16; Malsch, Ineke

    Scanning probe microscopes (SPMs), used in scientific research to measure and manipulate nanoparticles, are being adopted by industry for production and quality control. Classes of SPMs include scanning tunneling microscopes (STMs), which can image the atomic structure of conducting and semiconducting surfaces and are used by the semiconductor industry to study layer growth; scanning near-field optical microscopes (SNOMs), which are used to characterize and analyze optoelectronic and telecommunications components and can measure very delicate structures; and atomic force microscopes (AFMs), which can produce three-dimensional images of surface structures with atomic-scale resolution. Variants of the AFM with industrial uses include the magnetic force microscope (MFM), which can determine the location of surface defects and is used for quality control in the data-storage industry, and the friction force microscope (FFM), which analyzes chemical properties and is used to study machine-part interactions and surfaces. The research and industrial SPM market is dominated by Veeco, which sells AFMs to semiconductor manufacturers who use them to measure circuit features during production. Meanwhile, conductive atomic force microscopy is an advantageous method for measuring transistors and other minuscule electronic components, giving developers nanometer-scale insights into production processes. Speed, not price, is the main obstacle to industrial deployment of SPMs. A lack of standards may also be hindering SPM adoption, but American, European, and Japanese standards experts recently convened at England's National Physical Laboratory to discuss guidelines for SPM operation, tip calibration, and more interlaboratory comparisons of SPM measurements.
    http://www.aip.org/tip/INPHFA/vol-8/iss-5/p16.pdf

  • "The Wisdom of the Anthill"
    Business 2.0 (11/02) Vol. 3, No. 11, P. 59; Mucha, Thomas

    French scientist Eric Bonabeau has devised "ant algorithms" based on his study of social insects, and consultancies are using them in a variety of applications. In his book "Swarm Intelligence," he posits that the social organization of ants could be applied to businesses to make them more efficient. His algorithms simulate the movements and interactions of an organization's employees, products, and clients, a technique known as "agent-based modeling." French gas company Air Liquide, with the help of Santa Fe's BiosGroup, employed such algorithms to improve its supply chain: By programming trucks to find and mark the shortest delivery routes so that subsequent trucks can retrace them--much as foraging ants leave pheromone trails--Air Liquide was able to save time and manpower in organizing its delivery schedules (a minimal sketch of this pheromone-trail idea follows this item). Meanwhile, Bonabeau's Icosystem company has been engaged by the U.S. Office of Naval Research to simulate the operation of unmanned aerial vehicles in an effort to improve communications and pave the way for smarter vehicle designs. Icosystem is also working to speed up Eli Lilly's drug development time by up to 80 percent by building a model of the firm's clinical development processes. Bonabeau acknowledges that adopting and implementing his methods requires managers to cede a certain amount of control and make considerable investments in information and expertise. Meanwhile, IBM and an EU-funded project at the Santa Fe Institute are also pursuing agent-based modeling.
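
    A minimal Python sketch of the pheromone-trail idea described above: simulated ants repeatedly pick a route from a depot to a customer, shorter routes accumulate more pheromone, and later ants favor them. The toy road network, parameters, and update rule are illustrative assumptions, not Air Liquide's or BiosGroup's actual model.

      import random

      # toy road network: node -> {neighbor: distance}
      graph = {
          "depot": {"A": 4.0, "B": 1.0},
          "A": {"customer": 1.0},
          "B": {"C": 1.0},
          "C": {"customer": 1.0},
          "customer": {},
      }
      pheromone = {(u, v): 1.0 for u in graph for v in graph[u]}

      def walk(start, goal):
          """One ant's pheromone-biased walk; returns (path, length) or None."""
          path, length, node = [start], 0.0, start
          while node != goal:
              options = [v for v in graph[node] if v not in path]
              if not options:
                  return None                    # dead end, ant gives up
              weights = [pheromone[(node, v)] / graph[node][v] for v in options]
              nxt = random.choices(options, weights=weights)[0]
              length += graph[node][nxt]
              path.append(nxt)
              node = nxt
          return path, length

      best = None
      for _ in range(200):                       # send out 200 ants
          result = walk("depot", "customer")
          if result is None:
              continue
          path, length = result
          for u, v in zip(path, path[1:]):       # evaporate, then deposit
              pheromone[(u, v)] = 0.9 * pheromone[(u, v)] + 1.0 / length
          if best is None or length < best[1]:
              best = (path, length)

      print("best route found:", best)           # depot -> B -> C -> customer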

  • "Hyper Rapid Express"
    CommVerge (11/02) P. 34; Sartori, Gabriele; Neshati, Ramin; Fuller, Sam

    Three technologies--HyperTransport, PCI Express, and RapidIO--are competing in the race to provide a standard chip-to-chip interconnect that improves on PCI, whose performance degrades as data speeds increase. Gabriele Sartori of the HyperTransport Technology Consortium declares that HyperTransport I/O link technology--which has significantly penetrated the networking and communications markets--is fast, flexible, scalable, and cost-effective, and can support existing infrastructure while laying the groundwork for future high-bandwidth systems; over 30 HyperTransport products have been approved or are shipping, and many industry players support the standard. Other advantages of the scheme include support for streaming digital video and real-time voice, a lower data-transmission voltage requirement than conventional low-power I/O technologies, and availability. PCI-SIG's Ramin Neshati writes that the differential-signaling-based PCI Express is "a standard, serial I/O technology that allows for modular, scalable platforms that enhance system reliability, design flexibility, and ease-of-use" and offers a degree of performance scalability that should last through the next decade. The architecture should provide ample bandwidth to meet the varied requirements of multiple market segments spanning the computing and communications sectors; the technology also boasts easy deployment, low cost, and sophisticated performance, and is already available. Sam Fuller of the RapidIO Trade Association explains that no volume semiconductor components currently exist that can serve both the desktop computing market and the high-performance embedded market, and RapidIO was developed to address the interconnect needs of the latter. "RapidIO is an open standard, available for adoption by all vendors of embedded products, and it is well supported and enabled by the leaders of the embedded industry," he writes. The scheme's developers collaborated with embedded-systems designers to make the architecture capable of supporting, among other things, direct peer-to-peer communication between devices; standard electrical I/O technology; distributed globally shared memory; and error detection and recovery protocols.

 
                                                                             

 