HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.

ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 4, Issue 429: Monday, December 2, 2002

  • "Black Market For Software Is Sidestepping Export Controls"
    New York Times (12/02/02) P. C1; Schwartz, John

    Export restrictions are in place to bar the sale of scientific and engineering software to rogue nations such as Iraq or North Korea, but such countries can acquire these technologies easily through a black market. Worse, clamping down on this market is extremely difficult, given the near impossibility of controlling digital piracy and the United States' reluctance to impose restrictions on China and other emerging global trade partners that support it. For instance, Chinese entrepreneurs are selling bootleg copies of engineering software from Intelligent Light and 120 other firms at a dramatic discount, and Intelligent Light CEO Jeanne L. Mara reports that her complaints to the Small Business Administration and the Commerce, State, and Justice Departments were greeted with sympathy, but little was done to stop such activity given the delicate political and economic situation between the United States and China. Although Justice Department official John G. Malcolm insists that his department "will review any matter, and consider taking appropriate prosecutorial measures" against software theft, organizing international probes is a difficult proposition. Meanwhile, assistant United States attorney Scott S. Christie notes that enforcing U.S. copyright and export controls in other nations is an unresolved issue. Tom Kurke of Business Software Alliance member Bentley Systems, another victim of black market piracy, warns that this is a matter of national security. "There is a legitimate concern for these technologies being used in countries where you wouldn't want them used," he says.
    (Access to this site is free; however, first-time visitors must register.)

  • "Hollywood, Tech Become Wary Partners Against Piracy"
    Seattle Times Online (12/02/02); Peterson, Kim

    Executives of entertainment companies hope that partnerships with technology companies will better their chances of curtailing the digital piracy of music and movies, and are making efforts to overcome long-standing animosity between the two sectors. Fox Group CEO Peter Chernin called for such cooperation at last month's Comdex trade show. At the same event, Microsoft chairman Bill Gates highlighted the company's alliance with Disney, Fox, and NBC. Although media companies do not like the fact that Microsoft technology has empowered consumers to copy and distribute their content, they realize the tech firm will soon be capable of monitoring and controlling how consumers use that content. At the same time, analysts note that copyright holders do not want to become dependent on Microsoft, and are considering other ways to control their content. For example, Sony made a $453 million acquisition of InterTrust Technologies last month, while its PlayStation 2 game console runs on software from Microsoft rival RealNetworks. Meanwhile, companies such as Apple Computer, with its "Rip. Mix. Burn." slogan, and Gateway, with a commercial campaign emphasizing consumers' digital rights, are trying to capture a piece of the lucrative consumer entertainment market in the face of flagging PC sales, but copyright holders claim such policies advocate piracy. Although film industry executives acknowledge that it will be years before a mutually beneficial solution is agreed upon by the two sectors, time is of the essence. Some argue that a major reason why digital piracy thrives is the lack of an easy and convenient way for consumers to acquire online content through legal channels.
    Click Here to View Full Article

  • "Listening to the Internet Reveals Best Connections"
    New Scientist Online (11/27/02); Ananthaswamy, Anil

    Chris Chafe of Stanford University's Center for Computer Research in Music and Acoustics has developed a way to check the quality of Internet connections by converting latency variations into a musical format. The conventional method of assessing an Internet connection is to "ping" a data packet to a remote computer and measure its latency, but this technique cannot gauge the sub-second behavior of its variation, or jitter, over time. Chafe, who is also a cellist, modeled the Internet connections as guitar strings, and used the transmission of short sound pulses to simulate string pluckings. Using this technique, lower-pitched notes signal that it is taking longer to transmit data, while a loss of sound indicates a missing data packet or an interrupted connection. Chafe thinks that his method could be used to check the status of the network connections of the academic supercomputers that form the Grid, although he believes that the simulation of a drum skin or other stretched membrane, rather than guitar strings, would be more apt for such a two-dimensional network. He also thinks that the field of telemedicine, which relies on constant monitoring of Internet connections to carry out telesurgery and other operations, could benefit from his work.
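    Chafe's actual system "plucks" the network with transmitted sound pulses, but the underlying mapping can be suggested with a much simpler sketch. The toy code below is an illustration, not Chafe's implementation; the function names and reference values are invented. It maps round-trip-time samples to pitches so that higher latency yields lower notes and a lost packet yields silence, as the article describes:

    ```python
    # Hypothetical sketch (not Chafe's system): map a stream of round-trip-time
    # samples to musical pitches so that jitter becomes audible over time.
    # Longer latencies map to lower pitches; a lost packet (None) maps to silence.

    def latency_to_pitch(rtt_ms, base_freq=440.0, ref_rtt=50.0):
        """Return a frequency in Hz: higher latency -> lower pitch.
        A None sample (lost packet or dropped connection) returns 0.0 (silence)."""
        if rtt_ms is None:
            return 0.0
        # Scale pitch inversely with latency, relative to a reference RTT.
        return base_freq * (ref_rtt / rtt_ms)

    def sonify(rtt_samples):
        """Convert a list of RTT samples (ms, or None for loss) to frequencies."""
        return [latency_to_pitch(r) for r in rtt_samples]
    ```

    For example, a sample sequence of 50 ms, 100 ms, a lost packet, and 25 ms would come out as 440 Hz, 220 Hz, silence, and 880 Hz: a doubling of latency drops the note an octave, which is what makes jitter easy to hear.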

  • "Total Info System Totally Touchy"
    Wired News (12/02/02); Singel, Ryan

    The Total Information Awareness System proposed by the Pentagon's Office of Information Awareness will require new database mining technologies that leave many in the industry unsettled. The project, which has received $137 million in funding for 2003, aims to consolidate various consumer databases and mine them for information that could signal a planned terrorist act. Terrorists have to make preparations before they can mount a strike, and the new system would highlight patterns indicating such preparations, according to Jan Walker of the Defense Department. Research administrators at the Office of Information Awareness, part of the Defense Advanced Research Projects Agency, are looking for private-sector partners to help supply the new technologies needed to make the project work. Before the system goes live, it will be tested by inserting fake transactions into the data to see whether they can be accurately and reliably ferreted out. Herb Edelstein of data-mining firm Two Crows says the project is not an efficient use of government resources because it is not expected to produce any tangible results in the near future. He also points out that incorrect data, non-standardized database fields, and the lack of unique identifiers would result in too many errors. Paul Hawken, chairman of information-mapping company Groxis, adds that because of the nature of pattern-recognition systems, "there would be an incalculable expense to monitor a thousand wrong hits for one correct inference." Still, others such as Search Engine Watch associate editor Chris Sherman say the project is feasible but challenging, pointing to software the Treasury Department uses to uncover data patterns for tracking financial crimes.
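    Hawken's "thousand wrong hits for one correct inference" is an instance of the base-rate problem: when the behavior being hunted is rare, even an accurate detector returns mostly false positives. A back-of-the-envelope calculation (the numbers below are hypothetical, not from the article) makes the point:

    ```python
    # Illustrative base-rate arithmetic (all figures invented): even a very
    # accurate pattern detector produces mostly false positives when the
    # target behavior is rare in the scanned population.

    def false_positives_per_true_hit(population, true_targets,
                                     sensitivity, false_positive_rate):
        true_hits = true_targets * sensitivity                      # targets actually flagged
        false_hits = (population - true_targets) * false_positive_rate  # innocents flagged
        return false_hits / true_hits

    # 300 million records, 1,000 real targets, 99% detection, 0.1% false alarms:
    ratio = false_positives_per_true_hit(300_000_000, 1_000, 0.99, 0.001)
    # roughly 300 false leads for every genuine hit
    ```

    Even with a detector far better than any real pattern-recognition system, every genuine hit would arrive buried under hundreds of wrong ones, which is the cost Hawken and Edelstein are pointing at.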

  • "Fishing for Data"
    Christian Science Monitor (11/27/02) P. 18; Spotts, Peter N.

    As the amount of digitized information created continues to increase, ways of extracting needed data are struggling to keep up. Meanwhile, researchers are also working on ways to accommodate wireless computing technologies, such as a project at the University of Maryland that draws on streamed data to provide stock alerts to people's PDAs. Dr. Hillol Kargupta is leading the wireless data-streaming project, and says it poses new challenges because of the limited capabilities of wireless devices, including processing power and bandwidth. Besides financial services updates, he says the University of Maryland team is also developing a system to monitor truck shipments via wireless-connected PDAs. Streamed data would include the condition of the vehicle and its cargo, as well as its location as calculated by satellites. At Washington University in St. Louis, another project is focusing on improving hardware so that database searches do not take as long. Electrical engineering professor Ronald Indeck says placing special processing capabilities directly on the computer hard drive eliminates the need for sections of the database to pass from the hard drive to the computer's main memory and processor. He says such localized database processing could speed searches by as much as 200 times. New database mining technologies are expected to be pioneered with the Defense Department's Total Information Awareness project, meant to scan multiple consumer databases for possible terrorist activity. Kargupta and Indeck say such a system should have built-in privacy safeguards that mask an individual's identity through aggregation.
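    The kind of lightweight, constant-memory analysis the Maryland project calls for can be suggested with a toy sketch. This is not the team's code; the class name, window size, and threshold are invented for illustration. It shows a stock-alert monitor that keeps only a small window of recent ticks, as a bandwidth- and memory-limited PDA would require:

    ```python
    from collections import deque

    # Hypothetical sketch (not the University of Maryland system): a stock-alert
    # monitor suited to a low-memory wireless device. It keeps only a fixed-size
    # window of recent prices and flags any tick that deviates sharply from the
    # window's moving average.

    class StreamAlert:
        def __init__(self, window=5, threshold=0.05):
            self.prices = deque(maxlen=window)   # constant memory: old ticks fall off
            self.threshold = threshold           # fractional deviation that triggers

        def on_tick(self, price):
            """Return True if this tick deviates more than threshold from the
            moving average of the retained window."""
            alert = False
            if self.prices:
                avg = sum(self.prices) / len(self.prices)
                alert = abs(price - avg) / avg > self.threshold
            self.prices.append(price)
            return alert
    ```

    Because the deque discards old ticks automatically, memory use stays fixed no matter how long the stream runs, which is the design constraint Kargupta describes for PDA-class hardware.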

  • "From Darwin to Internet at the Speed of Light"
    ScienceDaily (11/29/02)

    The European Space Agency (ESA) is pursuing integrated optics technology as a way to detect Earth-like planets through the Darwin project and the ESA/ESO Ground-based European Nulling Interferometer Experiment (GENIE); such research could also be applied to computer networks to boost data transmission speeds. Rather than relying on the conversion of optical signals to electricity, which slows down data transmission, integrated optics keeps signals in optical form and channels them through the chip. Such technology could boost Internet speeds between 100 and 1,000 times. The ESA is financing research from Astrium and Alcatel to test the viability of a traditional optics or integrated optics solution for GENIE; Astrium will work on the former and Alcatel will focus on the latter. "We shall take the decision on whether GENIE will use integrated optics in just over one year," declares GENIE and Darwin Project Scientist Malcolm Fridlund. GENIE itself is slated to be online by 2006. Integrated optics may also be incorporated into Darwin later; the technology is much less advanced than integrated circuit technology, but has great potential. Fridlund says he is "highly optimistic" about the technology's prospects, but admits that "I don't yet know whether mid-infrared integrated optics will have any commercial applications, but until we develop them, we'll never know." Fridlund says he is considering proposals from industrial firms offering to develop the integrated optics technology.

  • "Don't Write Off Existing IT Skills"
    vnunet.com (11/28/02); Fielding, Rachel

    New research published in vnunet.com's sister publication Computing concludes that demand for basic IT skills still exists, but many professionals are in danger of devaluing those skills and casting them aside too quickly in favor of newer technologies. There appears to be greater demand for XML than Java, and IT staff with specialist training tend to be paid more, but analysts caution that workers could be misled by hype into pursuing skills that employers are not really looking for. "Organizations...should look at the skills gaps they have and look to retain existing employees with transferable skills within their organization [in preference] to individuals who have yesterday's skills," urges Richard Chappell of Learning Tree. In a poll of over 1,200 Computing technical readers, almost 50 percent were worried about redundancy, particularly those in the technology, financial services, and manufacturing segments. Last year, 58 percent agreed that IT professionals could be picky about job opportunities, a figure that fell to 30 percent in this year's survey. Meanwhile, only 22 percent said that employers were eager to train IT workers for the purpose of retention, an 8 percent decrease from last year. Nearly 50 percent reported that they had not participated in any technical training courses in the past year, while 25 percent said they had been on just one IT training course. Forty-two percent of IT staff declared that the amount of training their employers provided was insufficient for them to be effective employees.

  • "Biology May Help Shrink Electronic Components at NASA"
    Nanotech Planet (11/26/02); Pastore, Michael

    Researchers at NASA's Ames Research Center are working to construct electronics that are many times smaller than current components using proteins that are genetically manipulated to self-assemble into nanoscale structures. Principal project investigator Jonathan Trent reports that NASA plans to use this technique to build equipment that will aid the search for extraterrestrial life. "Our innovation takes advantage of the innate ability of proteins to form into ordered structures and for us to use genetic engineering to change nature's plans, transforming these structures into something useful," he explains. Ames co-investigator Andrew McMillan says the experiment involved a protein-producing gene from Sulfolobus shibatae, an extremophile microbe that can survive in hot acid mud. He says the gene was modified to produce proteins that coalesce into a two-dimensional lattice that adheres to metal and semiconductor particles, thus forming quantum dots that are 1 to 10nm wide. "After further development, an array of nanoparticles could serve as computer memory, a sensor or as a logic device that could calculate," McMillan predicts. Project co-investigator Chad Paavola notes that the altered gene segment was inserted into E. coli bacteria, which replicate rapidly, allowing the protein to be produced in large volumes; the protein was then crystallized into nano templates composed of 20-nm-wide rings.
    Click Here to View Full Article

  • "DARPA Looks to Quantum Future"
    InternetNews.com (11/22/02); Mark, Roy

    The Defense Advanced Research Projects Agency (DARPA) is using its High Performance Computing Systems (HPCS) program as a vehicle for making progress on quantum computing. DARPA has asked five vendors (Cray, Hewlett-Packard, IBM, SGI, and Sun Microsystems) to study architectures for high-performance computers that would serve the needs of the defense and intelligence communities in the years to come. Quantum computing would fundamentally change the way information is processed, gaining speed by carrying out many computations simultaneously. DARPA is focusing on the kinds of high-performance computers that would be in use by the end of the decade, as opposed to existing powerful computers whose design roots date to the late 1980s. As part of HPCS, DARPA will eventually ask as many as three vendors to present more detailed plans, and then ask one or two vendors for a detailed engineering plan.
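    The "many computations simultaneously" refers to superposition: an n-qubit register holds amplitudes for all 2^n basis states at once. A toy classical simulation (unrelated to any DARPA design; the function below is invented for illustration) shows the exponential state space that makes quantum machines both hard to simulate and potentially powerful:

    ```python
    import math

    # Toy illustration (not DARPA's HPCS work): a classical simulation of an
    # n-qubit register. Applying a Hadamard gate to every qubit, starting from
    # |00...0>, puts the register into an equal superposition of all 2^n basis
    # states -- the source of quantum computing's parallelism.

    def hadamard_all(n):
        """Return the state vector after applying H to each of n qubits."""
        amp = 1.0 / math.sqrt(2 ** n)    # every basis state gets equal amplitude
        return [amp] * (2 ** n)

    state = hadamard_all(3)              # 3 qubits -> 8 equally weighted outcomes
    # the squared amplitudes (probabilities) sum to 1
    ```

    Note the cost of doing this classically: each added qubit doubles the state vector, so a few dozen qubits already exceed any conventional memory, which is one reason quantum hardware interests the high-performance computing community.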

  • "The Rogue DNS Phenomenon"
    NewsFactor Network (11/25/02); Brockmeier, Joe

    ICANN's cancellation of elections for ICANN board seats was unpopular among Internet users, most of whom do not realize that there are alternatives to dealing with ICANN. ICANN governs the root servers that translate domain names, such as .com names, into IP addresses, while an organization called OpenNIC operates an alternative domain name system that also works on the Web. To access OpenNIC's system, one adds an OpenNIC server to one's operating system's resolver settings. Pacific Root alternative domains can be accessed through OpenNIC servers, and six years ago Pacific Root launched its own .biz address, according to OpenNIC founder Robin Bandy. Because ICANN now has a .biz domain of its own, only one .biz can work in the OpenNIC system, and OpenNIC recognizes only Pacific Root .biz addresses. In contrast to ICANN reform, OpenNIC has a voting process empowering anyone who owns a domain name within its system to vote. Mooneer Salem, administrator of the .oss domain for open-source projects, says that if more people used OpenNIC, it would have more clout with ICANN. Registering Web sites with OpenNIC is free of charge, Bandy says.
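    The .biz collision described above stems from a basic property of any domain name system: a resolver maps each name to exactly one answer, so a given root can honor only one version of a TLD. A toy model (invented names and documentation-range IP addresses, not real DNS code) illustrates why OpenNIC must pick a single .biz, and why OpenNIC-only TLDs are invisible to users on the default root:

    ```python
    # Toy model of two competing DNS roots (all names and addresses invented,
    # using the 192.0.2.0/24 documentation range). A resolver returns exactly
    # one record per name, so each root must choose whose ".biz" it answers for.

    ICANN_ROOT = {"example.com": "192.0.2.10",
                  "shop.biz": "192.0.2.20"}       # ICANN's .biz registrant

    OPENNIC_ROOT = {"example.com": "192.0.2.10",  # conventional zones still resolve
                    "shop.biz": "192.0.2.99",     # Pacific Root's .biz wins here
                    "project.oss": "192.0.2.30"}  # an OpenNIC-only TLD

    def resolve(name, root):
        """Look up a name against one root; None means the name does not exist."""
        return root.get(name)
    ```

    A user pointed at the OpenNIC root can reach .oss sites and gets Pacific Root's answer for .biz, while a default user cannot resolve .oss at all, which is the interoperability trade-off the article describes.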

    To read more about ICANN, visit http://www.acm.org/usacm.

  • "Purdue Panel Maps Safer Wireless World for United Nations"
    Newswise (11/25/02)

    Addressing the security of wireless networks is critical, especially for developing countries hoping to enter the Information Age without taxing their limited financial resources, according to a report that Purdue University's 2002 Wireless Security Forum presented at the United Nations. "Wireless technology holds the promise of providing...enterprises with economical access to global data networks, but the risk is that their most sensitive information could be stolen right out of the air if their wireless systems are not adequately protected," declared forum organizer Eugene Spafford. Forum members recommended several strategies to help businesses ensure secure wireless network deployment, including giving vendors more feedback to instill security in new products, and establishing security and employee awareness initiatives. Technical suggestions on subjects such as encryption levels and wireless access points were also furnished in the report. Accenture global security technologies expert David Black, who presented the report to the U.N., said the market-driven global deployment of wireless technology is inevitable. "It is our goal to help this happen as securely as possible," he added. The report is the end product of discussions between 18 federal, industrial, and academic computer specialists who convened in Washington, D.C., at the invitation of Accenture and Purdue's Center for Education and Research in Information Assurance and Security (CERIAS). Black commented that the report was apparently "well-received."

    Eugene Spafford is co-chair of ACM's U.S. Public Policy Committee, http://www.acm.org/usacm.

  • "Future Security"
    InformationWeek (11/25/02) No. 916; Hulme, George V.

    Software vendors know that the market is overcrowded with information security products, while enterprise customers are clamoring for applications that are designed for security up front and that offer real-time system monitoring and rapid response to changing conditions. Microsoft and others claim that they are channeling more resources to software design and quality control in order to satisfy market demand for security, and are collaborating with hardware vendors to make applications with known security holes less exploitable. In the meantime, security vendors are organizing a strategy to dramatically improve several areas in the next few years: Security event-management tools that centralize and coordinate attack data across applications and networks; intrusion detection and prevention systems; and multipurpose network-security devices. Firms such as Symantec and e-Security are focusing on the first area with the development of products and services such as e-Security Advisor and DeepSight Alert Services. More companies will widen the scope of firewalls and intrusion detection software so that they can compile data from IT infrastructure components such as network-access logs, smart cards, user application login and access data, and biometric devices. Meanwhile, other security vendors are working on devices that can integrate multiple security applications to facilitate faster operation and better manageability. Gartner security analyst John Pescatore forecasts that such tools will become popular in two years, since they will be able to boost security and cut costs; he further predicts that vendors will enhance these platforms' application integration by the second half of 2004. Of all the security problems vendors need to solve, the most pressing is the tendency for intrusion detection tools to generate false positives.

  • "Coding with Life's Code"
    Scientist (11/25/02) Vol. 16, No. 23, P. 36; Constans, Aileen

    Various projects in the emergent field of DNA computing--which postulates that biological processes are defined by computational algorithms--are testing whether DNA molecules can carry out computations and can be applied to the design of nanotechnology, among other things. The University of Southern California's Leonard Adleman used DNA computing to solve an instance of the Hamiltonian path problem, which seeks a flight itinerary from one city to another that passes through each city in a given group exactly once. The researcher represented each city and flight path with oligonucleotides that were allowed to randomly combine into a set of possible solutions, then selected for routes that fulfilled the problem's requirements. Meanwhile, Harvey Rubin of the University of Pennsylvania theorizes that cells could be engineered to react to certain environmental conditions through the introduction of DNA. Such cells could be integrated with recording devices to form sophisticated sensors with tunable sensitivity. New York University's Ned Seeman has induced DNA-based molecules to spontaneously assemble into nanoscale structures, and conducted an experiment with Duke University's John Reif in which branched DNA molecules were synthesized and combined in a reaction tube to perform a cumulative XOR operation. DNA nanofabrication holds particular promise as a lithography substitute and an assembly method for molecule-sized components used in the construction of nanorobots. Regardless of whether these projects yield viable tools, the theories stemming from DNA computing research have the potential to transform the study of biology and computation.
    (Access to this site is free; however, first-time visitors must register.)
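    Adleman's experiment worked by letting DNA strands encoding cities and flights ligate randomly into candidate routes, then chemically filtering out the invalid ones. A classical toy analogue of that generate-and-filter strategy (a sketch of the idea, not his laboratory protocol; the city codes and flight list are invented) looks like this:

    ```python
    import itertools

    # Toy classical analogue of Adleman's generate-and-filter DNA computation:
    # enumerate candidate orderings of cities (the role played by randomly
    # ligating strands), then keep only those whose consecutive pairs are real
    # flights -- i.e., Hamiltonian paths from a fixed start to a fixed end.

    def hamiltonian_paths(cities, flights, start, end):
        valid = []
        for perm in itertools.permutations(cities):
            if perm[0] != start or perm[-1] != end:
                continue                          # wrong endpoints: discard
            if all((a, b) in flights for a, b in zip(perm, perm[1:])):
                valid.append(perm)                # every hop is a real flight
        return valid

    cities = {"ATL", "BOS", "CHI", "DET"}
    flights = {("ATL", "BOS"), ("BOS", "CHI"), ("CHI", "DET"), ("ATL", "CHI")}
    paths = hamiltonian_paths(cities, flights, "ATL", "DET")
    ```

    The appeal of the DNA version is that the "enumerate" step happens massively in parallel in a test tube, whereas this classical loop grows factorially with the number of cities.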

  • "Seething over Spam"
    CIO (11/15/02) Vol. 16, No. 4, P. 90; Levinson, Meridith

    There is a diverse array of tools and services on the market designed to block junk email, or spam, thus saving companies money and improving productivity. Joyce Graff of Gartner recommends that small- and medium-sized businesses outsource spam filtering to third-party application service providers (ASPs), since the cost and processing power required to deploy local filters can be prohibitive; all companies really need to do is alter a few network routing and domain name system records. However, outsourcers' infrastructures may not have the scalability to handle large companies' email, which is why it may pay to implement local filters, the best of which integrate multiple methods such as content filtering, heuristics, keyword matching, and real-time black hole lists. The pluses of this option include more CIO control of spam filtering, while minuses include hefty maintenance and processing requirements. Meanwhile, industry experts argue that ISPs need to be more proactive in blocking spam from corporate networks, and a few are taking steps to do so. However, as with the ASP option, ISPs are tasked with the unenviable duty of differentiating spam from legitimate email, and this situation carries the risk of generating false positives and alienating clients. Furthermore, some U.S. states have banned spam, but industry observers note that such measures will hardly deter spam sent from overseas. Graff estimates that spam accounts for 25 percent to 35 percent of a company's total email volume, which means email bandwidth and storage capacity is proportionally increased.
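    Two of the filtering methods the article names, keyword matching and real-time black hole lists, can be suggested in a few lines. The sketch below is purely illustrative: the keyword list and blacklisted relay address are invented, and production filters combine many more signals (heuristics, content scoring, sender reputation):

    ```python
    # Hypothetical minimal spam filter combining two techniques from the
    # article: keyword matching on the message body and a blacklist lookup on
    # the sending relay. All list contents here are invented for illustration.

    SPAM_KEYWORDS = {"free money", "act now", "no obligation"}
    BLACKLISTED_RELAYS = {"198.51.100.7"}   # stand-in for a real-time black hole list

    def is_spam(sender_ip, body):
        """Flag a message if its relay is blacklisted or its body matches
        any known spam phrase (case-insensitive)."""
        if sender_ip in BLACKLISTED_RELAYS:
            return True
        text = body.lower()
        return any(kw in text for kw in SPAM_KEYWORDS)
    ```

    Even this toy shows where false positives come from: a legitimate newsletter that happens to contain "act now," or mail relayed through a shared blacklisted server, gets blocked, which is exactly the risk the article says ASPs and ISPs must weigh against their clients' patience.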

  • "The Making of a Policy Gadfly"
    Chronicle of Higher Education (11/29/02) Vol. 49, No. 14, P. A27; Foster, Andrea L.

    Princeton University computer scientist Edward W. Felten is part of a growing number of academic researchers who are opposed to legislation that seeks to regulate digital technology, a move that reportedly threatens important scientific research and development. Politicians have argued that laws such as the Digital Millennium Copyright Act (DMCA) will protect the intellectual property of copyright holders and promote technological advancement, but Felten and others counter that such mandates would stifle innovation by blocking researchers' right to reverse-engineer technology in order to improve it and make new discoveries. Felten is writing a book illustrating the importance of such tinkering, and is posting blogs such as Fritz's Hit List, a collection of technology whose use could be restricted by Sen. Fritz Hollings' (D-S.C.) Consumer Broadband and Digital Television Promotion Act, which calls for electronics manufacturers to include copy-control safeguards in all their products. Last March, Felten told the Senate Judiciary Committee that the bill would inevitably fail, because general-purpose computers cannot be enabled to distinguish between authorized and unauthorized data usage. Six months later, he criticized legislation from Rep. Howard L. Berman (D-Calif.) designed to block peer-to-peer file sharing; Felten's argument was that the Internet itself is a peer-to-peer network, and such a maneuver would spark "an arms race" between copyright holders and network creators. This is not the first time Felten has clashed with digital copyright legislation: Last year he filed suit against industry groups and the Justice Department to overturn the DMCA on the grounds that it is unconstitutional, and even though the suit was dismissed, it prompted officials to propose revisions to the law that allow scientists more freedom to develop tools designed to circumvent copyright protections. Cindy Cohn, Felten's legal counsel, adds that the suit encouraged other researchers to continue their computer security initiatives, and made technology companies more reluctant to sue them for DMCA violations.

    To learn more about DMCA and ACM's arguments against it, visit http://www.acm.org/usacm.

  • "Software Doesn't Work. Customers Are in Revolt. Here's the Plan."
    Fortune (11/25/02) Vol. 146, No. 11, P. 147; Aley, James

    Software developers are striving to make software interoperable as customers refuse to invest in more products before realizing returns on existing systems, which often do not work together and thus inflate implementation costs--for instance, every dollar used to purchase IT entails another $4 to $10 spent to make it work properly. To reverse this trend, tech vendors have started designing software around business processes, building in interoperability by enabling Web services. Microsoft, IBM, Sun Microsystems, SAP, Oracle, and others are all building Web services, with Microsoft pushing its .NET platform and others using the Java-based J2EE. The advantages of Web services include inexpensive, modular installation and the potential to significantly lower switching costs; unfortunately, the latter is so important to the software industry that the concept is likely to encounter resistance. Other pluses of compatible platforms include the automation of routine business procedures and system interchangeability. Web services will never be widely adopted unless the major developers can agree on standards for security and other areas besides interoperability. Nevertheless, the transition to Web services could take a decade, or perhaps even longer. American Airlines CIO Monte Ford says, "I think there is a compelling business case for this technology...but it's going to be incremental steps with a couple of leaps and bounds."
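    The interoperability claim at the heart of Web services is that two systems agreeing on a standard XML message format can exchange data without sharing any implementation. A toy illustration using Python's standard library (the element names are invented, not a real WSDL contract or any vendor's API):

    ```python
    import xml.etree.ElementTree as ET

    # Toy illustration of the Web-services interoperability idea: one system
    # serializes an order to an agreed-upon XML shape, and any other system
    # with an XML parser can consume it. The element names are invented.

    def build_order_message(sku, quantity):
        """Producer side: serialize an order to the agreed XML format."""
        order = ET.Element("Order")
        ET.SubElement(order, "Sku").text = sku
        ET.SubElement(order, "Quantity").text = str(quantity)
        return ET.tostring(order, encoding="unicode")

    def parse_order_message(xml_text):
        """Consumer side: any XML-capable system can recover the fields."""
        root = ET.fromstring(xml_text)
        return root.findtext("Sku"), int(root.findtext("Quantity"))

    msg = build_order_message("A-1001", 3)
    ```

    The producer and consumer here could be .NET and J2EE systems in practice; neither needs to know how the other is built, only the message format, which is why agreed standards matter more than any single vendor's platform.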
