HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.

ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 4, Issue 427: Monday, November 25, 2002

  • "IT Warns Against Slippery Slope to Regulation"
    eWeek Online (11/22/02); Carlson, Caron

    The IT industry responded last week to the White House's draft National Strategy to Secure Cyberspace, with hardware and software vendors commenting on recommendations they found both promising and troubling. The Business Roundtable praised the suggestion that cyber-security be implemented through voluntary action in the private sector, but advised that the cost of deploying solutions should be incorporated into the strategy. Meanwhile, the Business Software Alliance (BSA) urged that the contributions of small businesses and individual citizens remain "a key and growing component" of the plan. However, the Computer and Communications Industry Association (CCIA) doubts that average computer users will bother to upgrade their systems, and suggested that deploying a wide spectrum of products and services and implementing open source software is a more effective plan. Less welcome is a recommendation that calls for federal evaluation of private sector security service providers--the BSA wants assurance that the measure would apply to individuals, not certain products or systems. The alliance also does not want a new standards-setting body to be established by the National Security Telecommunications Advisory Committee and the National Infrastructure Assurance Council: The BSA commented that it "can foresee only duplication of existing efforts--or, of more concern, government-guided efforts at regulation from such a body, either directly or through the migration of procurement specifications." The BSA also opposed the establishment of a public/private fund to find and handle the Internet's technology requirements, on the grounds that such needs are already well known, while the fund itself "could effectively become a hidden tax on industry and a mechanism for aggressive regulation of the information technology sector."

  • "Agency Weighed, but Discarded, Plan Reconfiguring the Internet"
    New York Times Online (11/22/02); Markoff, John

    The Defense Advanced Research Projects Agency (DARPA) looked into the possibility earlier this year of creating a new Internet environment where every user would necessarily leave uniquely identifiable imprints, just as criminals leave DNA evidence at a real-world crime scene. The proposed project, dubbed eDNA, was ultimately scrapped after the panel convened to discuss it uniformly condemned the idea. DARPA's director, Dr. Tony Tether, was said to have instigated the eDNA project as a way to track down the source of Internet crimes. Under the eDNA scheme, portions of the Internet would be cordoned off and require users to have an electronic signature backed by a unique personal identifier, such as their fingerprint or voice. Every data packet sent or Web site visit would be tagged with the signature. Government and law enforcement were seen as the first adopters of the system, with security-minded organizations such as financial institutions joining later. Those invited to weigh in on eDNA included Matt Blaze of AT&T Labs, Roger Needham of Microsoft Research, and Marc Rotenberg of the Electronic Privacy Information Center, among others. Dr. Victoria Stavridou of SRI International, the research firm hired to review the concept, co-headed the group and was accused of not accurately representing the consensus report to DARPA. Among those interested in eDNA was DARPA's Information Awareness Office, which was created after the Sept. 11 attacks to pursue terrorist communications. The Information Awareness Office is also the group behind the Total Information Awareness database proposal, which would create technology allowing the government to cull existing electronic transaction records for possible terrorist activity.
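    The article gives no protocol details for eDNA, but the core idea--every packet tagged with a signature bound to a unique personal identifier--can be sketched in a few lines. The following Python sketch is purely illustrative; every function name is invented here, and an HMAC keyed by a stand-in for a biometric template substitutes for whatever signature mechanism DARPA contemplated.

    ```python
    import hashlib
    import hmac

    def derive_user_key(biometric_template: bytes) -> bytes:
        """Stand-in for binding a secret key to a personal identifier."""
        return hashlib.sha256(b"edna-key:" + biometric_template).digest()

    def tag_packet(user_key: bytes, payload: bytes) -> bytes:
        """Prepend an HMAC tag so the packet's origin can be attributed."""
        tag = hmac.new(user_key, payload, hashlib.sha256).digest()
        return tag + payload

    def verify_packet(user_key: bytes, packet: bytes) -> bool:
        """Check whether the packet was tagged by the holder of user_key."""
        tag, payload = packet[:32], packet[32:]
        expected = hmac.new(user_key, payload, hashlib.sha256).digest()
        return hmac.compare_digest(tag, expected)

    key = derive_user_key(b"alice-voiceprint")
    pkt = tag_packet(key, b"GET /index.html")
    assert verify_packet(key, pkt)
    assert not verify_packet(derive_user_key(b"bob-voiceprint"), pkt)
    ```

    Even this toy version shows why the review panel balked: attribution only works if every user's key is escrowed with whoever verifies the tags, which is precisely the privacy objection the panelists raised.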
    (Access to this site is free; however, first-time visitors must register.)

  • "New Gizmos May Spark Deregulation"
    Associated Press (11/24/02); Bergstein, Brian

    Technology experts argue that a series of breakthroughs should lead to a rethinking of how people use the airwaves: Among them are wireless technologies being developed by the Defense Advanced Research Projects Agency (DARPA), Intel, Bell Laboratories, SIGFX, and Vanu that could find use as multipurpose cell phones, "smart" radio receivers that can switch to idle spectrum to avoid interference, or techniques to boost data transfer rates by exploiting network interference. The government's current policy is to divide the spectrum and license the use of each band while setting very strict regulations; the military owns the lion's share of the spectrum, while space for new technologies in unlicensed bands is limited. However, an FCC-appointed task force recently presented a plan to re-tool the federal spectrum, and this overhaul is expected to commence in 2003. The group recommended that wireless carriers' spectrum licenses should be more flexible so that unused parts of the airwaves may be leased, and supported an open spectrum policy so that innovative technologies will have room to operate in regulated as well as unlicensed bands, provided they do not disrupt radio broadcasts or cell phone conversations. This policy overhaul is likely to spark conflict in Washington--Bryan Tramont, senior legal advisor to FCC Chairman Michael Powell, predicts that "certain ossified licensees will inherently be resistant to change."

  • "Researchers: Pull Plug on Battery Attacks"
    ZDNet (11/22/02); Junnarkar, Sandeep

    Virginia Tech scientists Tom Martin, Dong Ha, and Michael Hsiao are conducting research on ways to counter cyberattacks on mobile computers that aim to incapacitate their targets by draining their batteries. The researchers are concentrating on three potential attack scenarios: An attack in which networks are flooded with repeated service requests, forcing victims to waste power deciding whether or not to follow the request; a power virus assault, in which victims are forced to carry out authentic but power-hungry tasks over and over; and the use of malignant power viruses that alter programs to devour more energy than they were originally designed to. "[T]he point of this project is to stop these attacks before they become common, by showing their potential effects and solutions for mitigating those effects," Martin explains. His team is focusing on the creation of a mobile computer authentication scheme that keeps battery drain to a minimum, as well as a technique to detect security holes related to power. The authentication architecture would make all unconfirmed requests less power-hungry, while monitoring energy levels would enable systems to identify commands that trigger programs involving a high degree of energy consumption. The project has received over $400,000 in funding from the National Science Foundation.
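    The key property of the proposed authentication architecture--unconfirmed requests must cost far less energy than serviced ones--can be illustrated with a toy model. This is not the Virginia Tech design (the article gives no implementation details); the class, the token check, and the energy costs below are all invented for illustration, with costs in arbitrary units.

    ```python
    CHEAP_CHECK_COST = 1    # energy spent rejecting an unconfirmed request
    FULL_SERVICE_COST = 50  # energy spent actually servicing a request

    class PowerAwareServer:
        """Toy mobile device that gates expensive work behind a cheap check."""

        def __init__(self, battery: int, trusted_tokens: set):
            self.battery = battery
            self.trusted_tokens = trusted_tokens

        def handle(self, token: str) -> bool:
            # The cheap authentication gate always runs first, so a
            # sleep-deprivation flood pays only the small rejection cost.
            self.battery -= CHEAP_CHECK_COST
            if token not in self.trusted_tokens:
                return False        # rejected at minimal energy cost
            self.battery -= FULL_SERVICE_COST
            return True             # full, power-hungry service

    server = PowerAwareServer(battery=1000, trusted_tokens={"alice"})
    for _ in range(100):            # a flood of 100 bogus requests...
        server.handle("mallory")
    print(server.battery)           # ...costs 100 units, not 5,000
    server.handle("alice")
    print(server.battery)           # one real request costs 51 more
    ```

    Without the gate, the same flood would cost 5,000 units and exhaust the battery; with it, the attacker's leverage drops by a factor of roughly FULL_SERVICE_COST / CHEAP_CHECK_COST.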

  • "'Here's Looking At You' Has New Meaning: Eye Contact Shown to Affect Conversation Patterns, Group Problem-Solving Ability"
    ScienceDaily (11/22/02)

    Dr. Roel Vertegaal of Queen's University has discovered a correlation between the amount of eye contact people receive and their involvement in conversations, and believes this research could have valuable ramifications for the development of future communication interfaces, such as videoconferencing and Collaborative Virtual Environments. Eye contact, both synchronized and random, was studied in an experiment in which subjects watched computer-generated images of actors displaying different levels of attention in a mock three-way videoconferencing session. The researchers found that people in group conversations will participate more if they receive more eye contact from other group members. The timing of the eye contact did not appear to have any significant effect. "Now that we are attempting to build more sophisticated conversational interfaces that mirror the communicative capabilities of their users, it has become clear we need to learn more about communicative functions of gaze behaviors," Dr. Vertegaal explains. He presented the results of the experiment this week at the ACM's Conference on Computer Supported Cooperative Work.

  • "Nano Research Should Study Consequences"
    United Press International (11/20/02); Burnell, Scott R.

    A study authored by Glenn Harlan Reynolds of the University of Tennessee College of Law and released by the Pacific Research Institute (PRI) calls for nanotechnology researchers to openly disclose the nature of their research to the public while also accepting modest government regulation. Reynolds says such an approach will help ensure that nanotech applications are "primarily civilian and beneficial, rather than military and destructive." He adds that the government must devote more money and resources to studying and disclosing the implications of nanotech. Woodrow Wilson Center public policy scholar Julia A. Moore says a lack of public information could severely inhibit nanotech progress. Meanwhile, Vicki Colvin of Rice University's Center for Biological and Environmental Nanotechnology estimates that only one-half of 1 percent of the National Nanotechnology Initiative's budget is devoted to analyzing nanotech's impact on the environment and people's health. In the PRI report, Reynolds postulated three possible scenarios for nanotech development: A complete ban on research, which he says is impossible; research that is exclusively military or classified; and self-regulated civilian research coordinated by the government, which he calls the best option. The purpose of issuing the study is to help lawmakers and nanotech enthusiasts find common ground, according to Sonia Arrison, director of PRI's Center for Technology Studies.

  • "Radical Physicist Flatters Computer Fans"
    CNet (11/20/02); Shankland, Stephen

    This fall's Comdex trade show was unusual in that a key speaker presented radical ideas that could be applied in the distant future, as opposed to the present. Physicist Stephen Wolfram, the author of "A New Kind of Science," explained his view that the universe and all physical phenomena are determined by simple programs, or algorithms. "There's a broader range of [computing] things, and they don't all have to be based on CMOS and gates," he declared. "Systems out there in nature are already doing computations as complex as the ones that correspond to human intelligence." Wolfram later said that technologists will one day be able to apply such programs practically. He foresees applications such as the development of computer vision using algorithms derived from human vision; improved transplants and medicine based on computational modeling of cells; and economic research. Physicists and mathematicians have been very critical of Wolfram's views, but he predicted at Comdex that his theories will one day have great value. He said that his algorithms are at least as important as mathematical equations, if not more so, as a way to explain phenomena. Wolfram also told attendees that they should look a century ahead for the day when his ideas will take root.

  • "Planning for the Day When Silicon Rules No More"
    Nanotech Planet (11/19/02); Pastore, Michael

    In their respective keynote speeches at the Nanoelectronics Planet Conference & Expo on Thursday, Dr. Thomas Theis of IBM's Research Division and Dr. Yong Chen of Hewlett-Packard Laboratories' Quantum Science Research agreed that silicon still has long-term prospects in microelectronics, but that has not stopped industry from pursuing alternative technologies for the inevitable day when silicon and lithography reach their thresholds. Theis believes that nanoscale components are likely to eventually supplant today's logic and memory devices, and discussed IBM's work with nanotubes and other materials. He said that nanotube applications will remain limited until a consistent mass production technique is worked out, but noted that IBM has made significant strides in this direction, and patented a process for unadulterated single-wall nanotube fabrication. In the memory department, IBM is focusing on magnetic tunnel junction (MTJ) MRAM as well as molecular memory. The company has high hopes that MTJ MRAM can be combined with silicon. However, Theis warned that the intense rivalry for coming up with cheap, low-power memory that is also nonvolatile could have a detrimental effect, especially for researchers concentrating on molecular memory and polymer resistive memory. With lithography becoming increasingly costly, IBM is investigating self-assembly as the key to atomic-scale manufacture of wires and logic devices. Theis reasoned that silicon has at least 10 more years of life ahead of it as the number-one material used in microelectronics. Chen says his company's molecular electronics research is still years away from challenging silicon, and expects the two technologies to coexist.

  • "Straining Digital Copyright Law, Junior Paper Exposes Protection Flaws in CDs"
    Daily Princetonian Online (11/21/02); Tauberer, Joshua

    Alex Halderman, a senior computer science major at Princeton University, has acknowledged the possibility that he could be sued by the music industry for allegedly violating the Digital Millennium Copyright Act (DMCA) if he presents a junior paper at the ACM Conference on Computer & Communications Security in Washington in the spring. His paper focuses on weaknesses in the copy-protection systems of certain CDs, which exploit certain software security holes. However, Halderman does not include any methodology for bypassing such safeguards, and says he would do so only if the DMCA did not exist. The student says the university has promised to provide him with legal defense if a DMCA lawsuit is filed against him. Princeton has previously supplied indemnification for students and faculty when they were carrying out certain duties for the university, but General Counsel Peter McDonough notes that defending research for research's sake is without precedent. He adds that his office risks being accused of censorship if it refuses to recommend legal defense for researchers; nevertheless, there are strong arguments that the DMCA represents a serious threat to legitimate academic research. Last year, the music industry sent a letter of warning to Princeton professor Edward Felten, claiming that it would sue him for breaking the DMCA if he published certain research. The university responded by forming a committee that assesses threats to academic freedom by judicial strong-arming, according to committee chair Edward Groth.

    To read more about ACM's activities in regard to DMCA, visit http://www.acm.org/usacm.

  • "A Visionary Pays a Visit"
    Toronto Globe & Mail Online (11/15/02); Kapica, Jack

    Vinton Cerf spoke about his vision for the Internet last week at the well-known Toronto Empire Club, which counts many of Canada's leading business figures as members. Cerf believes that issues such as the assignment of domain names, taxing e-commerce, online censorship, and other unresolved problems must be addressed before his utopian vision of the Internet can ever be realized. Cerf noted that American Internet users basically face uniform ISP offerings, a lack of competitive choices in some regions, and also the lack of DSL access in some regions. Cerf advocates that Internet service should be sold at a fixed price, and he notes that access links and cell phones already are offered at fixed prices, with cell phones charging more for minutes that exceed a certain threshold. In Canada, ISP service is cheaper, and some Internet users face a "bit cap" that limits Internet use under certain ISP plans. Cerf says that de-monopolizing the cable infrastructure that is often used to carry Internet service will create a more competitive environment, and that increasingly all types of media broadcasts are being sent via TCP/IP. One day all telephone, radio, and television signals will use this method, says Cerf, and therefore these services should all become flat-rate services. Cerf also notes that under today's ISP model, providers define a company by the traffic it generates and assume that broadband customers want to receive information more than send information--an erroneous assumption. Cerf foresees a consumer model predicated upon consumers wanting to send and receive equally, a model that may necessitate Internet users having their own Internet servers.

  • "The Next Chapter"
    Computerworld (11/18/02) Vol. 36, No. 47, P. 54; Laube, Sheldon; Tickle, Pat; Nitschke, Markus

    CenterBeam Chairman Sheldon Laube sees disposable PCs as the future of information technology, considering personal computers can be purchased for less than $200 today. Because it would not make economic sense to ship a PC back for repairs, within five years Laube believes a consumer will be able to throw away a computer if it fails, and computer makers will use a central service to send a new machine--with "reincarnated" data, preferences, and applications--to the consumer. Attachmate's Markus Nitschke says mainframes will continue to fill a key role in IT, residing at the center of Web-based technologies such as Web services, and will be used for integrating existing applications. And Gilbert Houtekamer, chief technology officer of Consul Risk Management, adds that improving mainframe technology will result in a significant shift of business information and intellectual property from small servers to traditional mainframes by 2005. Scott Testa, chief operating officer of Mindbridge, believes floppies, CD-ROMs, DVD players, and other moving parts will disappear from computers in 10 years as servers take on the role of master docking stations. John Peters and Vivek Kapur, consultants with IBM Business Consulting, see hardware for computing, storage, and networking becoming entirely commoditized by 2005 so that vendors must compete on service and pricing only.

  • "Retooling the Programmers"
    InformationWeek (11/18/02) No. 915; Ricadela, Aaron

    Aspect-oriented programming seeks to relieve companies of many headaches, such as the intense difficulty programmers face in converting the needs and ideas of non-technical personnel into usable code, as well as organizing and updating vast numbers of scattered code fragments dedicated to computing-critical policies. The result of over 10 years of research by IBM, the Palo Alto Research Center (PARC), Northeastern University, and the University of Twente, aspect-oriented programming aims to reduce the complexity and size of software programs by sharing and reusing more code across their components. The proliferation of the methodology throughout the mass market could lead to better software quality and reduced IT and maintenance costs, since it would automate more of the discourse between developers and businesspeople. "Aspect-oriented tools force you to think at a higher and more concrete level," declares JPM Design's Juri Memmert. "If you don't do that, then you're basically hosed, no matter what you're using." PARC issued an update this month to version 1 of its Java-based AspectJ software, the development of which was financed by the Defense Advanced Research Projects Agency (DARPA). About 12 organizations, including the U.S. Air Force, Siemens, and Sirius Software, use the application in a commercial capacity. Other aspect-oriented programming tools currently available or under development include IBM Research's HyperJ and Cosmos.
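    AspectJ itself is a Java extension, but the central idea--writing a cross-cutting concern such as logging once and "weaving" it into many join points, instead of scattering code fragments through every component--can be shown by analogy with a Python decorator. This sketch is only an analogy for readers unfamiliar with the technique; the function names are invented here and nothing below is AspectJ syntax.

    ```python
    import functools

    calls = []  # record of advice firing, for demonstration

    def logged(fn):
        """A minimal 'aspect': advice that runs around every join point."""
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            calls.append(f"enter {fn.__name__}")
            result = fn(*args, **kwargs)
            calls.append(f"exit {fn.__name__}")
            return result
        return wrapper

    # The logging concern is declared once and applied to each function,
    # rather than duplicated inside every function body.
    @logged
    def transfer(amount):
        return amount * 2

    @logged
    def audit():
        return "ok"

    transfer(10)
    audit()
    print(calls)
    # ['enter transfer', 'exit transfer', 'enter audit', 'exit audit']
    ```

    In real AspectJ the weaving is done by the compiler against declarative pointcut patterns, so the target functions need no decorator at all; that is what lets the methodology shrink programs by sharing one implementation of a policy across all components.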

  • "The FBI's Cybercrime Crackdown"
    Technology Review (11/02) Vol. 105, No. 9, P. 66; Garfinkel, Simson

    Cybercrime is a growing concern in Washington, especially with experts warning that an online assault on the nation's critical infrastructure coupled with a physical terrorist attack could trigger chaos, confusion, and loss of life, to say nothing of the financial damages. Partly in response to these fears, the FBI has organized a Cyber Division that features 16 Computer Crime Squads tasked with investigating a wide range of cybercrimes, including network intrusions, Web site defacements, etc. The initiative could be improved--the squads suffer from a lack of manpower, and agents must struggle to keep pace with technological advancements. For agents to investigate a reported cybercrime, the crime must cause at least $5,000 in damages or involve a federal interest computer; other barriers to effective cybercrime deterrents include a lack of evidence and corporations' reluctance to report security breaches. Complicating the situation is the Internet's anonymity, which is making cybercriminals brazen, and technologies that allow hackers to commandeer innocent systems to launch attacks or switch from system to system to evade traces. Hacker identity is also changing: FBI Special Agent Nenette Day notes that as recently as five years ago most hackers were stereotypical teenagers who committed crimes for the bragging rights, but now practically all hackers are adults who are much more secretive and use more sophisticated methods. In her opinion, a key consideration to developing an effective cybercrime defense is for society to be able to distinguish between petty online vandalism and cyberterrorism. The cybercrime squad's problems are likely to mount as wireless technology proliferates and enables hackers to launch attacks from more remote locations.

  • "The Ghosts of Computers Past"
    IEEE Spectrum (11/02); Wallich, Paul

    The Computer History Museum in Mountain View, Calif., contains an archive of precedent-setting hardware and software, and museum board member John Mashey says the need to historically document the evolution of today's computers is critical, especially since computer technology pioneers are still available to offer expertise and equipment. Noteworthy items in the collection include the Hollerith tabulating machine, which set the standard for punch-card reading mechanisms for three quarters of a century; Rand's JOHNNIAC, which was designed by pioneer John von Neumann; Gene Amdahl's Wisconsin Integrally Synchronized Computer (WISC); LINC, the first laboratory minicomputer, featuring a 2,048-word memory capacity and analog-to-digital and digital-to-analog converters; the Cray 2 and Cray Y-MP, whose respective breakthroughs were a torus configuration to equalize the distance for electrical transmissions, and modular plumbing; and an assortment of groundbreaking Apple models. Failed products such as the KSR-1 and the ETA-10 are also accepted by the museum, because curator Mike Williams believes that failures are just as important a historical component as successes. He says that rapid innovation in high-performance machines has reached its conclusion thanks to flagging demand from industry and the military. Innovations involving garden-variety CPUs are the new focus, Williams adds. Documentation--oral, filmed, and written--is also preserved in the collection, and some machines are being restored while others that are beyond restoration are being simulated. IEEE Fellow C. Gordon Bell says the museum could serve even more visitors if its entire collection were disseminated over cyberspace.
