HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.

ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 4, Issue 419: Monday, November 4, 2002

  • "Judge Accepts Settlement in Microsoft Case"
    Washington Post (11/02/02) P. A1; Krim, Jonathan

    U.S. District Court Judge Colleen Kollar-Kotelly has allowed Microsoft's settlement with the federal government to go through largely unscathed, despite harsh criticism from nine dissenting states and competitors in the computer industry. The settlement bars Microsoft from forcing its digital media and Internet Explorer software onto the desktop and requires the company to share some of its source code so that competitors can better create compatible products. The five-year agreement was also expanded to include more provisions governing Microsoft's computer server business and to force the company to allow rival software applications to launch automatically when Windows starts up. Theoretically, computer makers will be free to configure Windows to their own liking on machines they resell by hiding Microsoft applications in favor of other software, without threat of retaliation by Microsoft. Microsoft Chairman Bill Gates said Kollar-Kotelly's approval would provide the company with freedom to continue to innovate, adding that Microsoft would faithfully adhere to the terms of the settlement, which critics say is difficult to enforce. Those who say the settlement is too lenient hold out hope that a continuing antitrust suit brought by European Union regulators will address failings in the U.S. agreement. EU regulators are specifically pursuing allegations that Microsoft illegally quashed competitors in the network server and digital media player markets. Representatives from the U.S. Department of Justice, however, are urging the EU not to go further than the recently approved settlement.
    Click Here to View Full Article

  • "Tech Firms Push for Less Regulation of Broadband"
    SiliconValley.com (11/03/02); Phillips, Heather Fleming

    A coalition of 15,000 technology firms has ramped up efforts to lift federal regulations that discourage local phone companies from investing in broadband infrastructure. Local phone monopolies are required to lease their infrastructure, including new equipment used to offer DSL services, to smaller competitors and long-distance phone companies at a discount rate. This, according to Baby Bells such as SBC Communications, leaves little economic incentive to invest in broadband infrastructure. As it is, broadband subscriptions are not growing fast enough to restart the economy, and cable TV firms are not pressured to reduce the price of their high-speed Internet offerings. Of the 15.2 million U.S. homes with high-speed Internet access, just one-third are connected via DSL modems. Technology firms such as Intel, Cisco Systems, and Microsoft are lobbying the FCC to move faster on broadband deregulation because they say that more infrastructure and greater competition between DSL and cable high-speed access will lower prices and spur greater consumer adoption. Cisco's Jeff Campbell says, "Everyone understands that getting some serious movement in deploying broadband is critical to restoring growth in our industry." Tech companies say that more broadband users will lead to more demand for new PCs, digital media devices, software, and online content. FCC officials say the technology industry's involvement has broadened the focus of the debate beyond the boundaries of the telecommunications industry, and FCC Chairman Michael Powell has already made broadband one of his top priorities.

  • "Trying to Shift Shape of PC Screens"
    New York Times (11/04/02) P. C4; Tedeschi, Bob

    E-commerce companies and online media firms are hoping to get a big boost in business with the advent of thin and highly flexible computer screens, which are currently under development. Such displays, offering portability and speed comparable to printed catalogs, would increase consumer convenience and spark increased online shopping; the technology would also offer wider access to information and a higher level of detail, according to International Data Corp. (IDC) analyst Bob O'Donnell. He adds that, for instance, consumers could use such products to view movies via the Internet rather than renting DVDs, or watch a baseball game Webcast from virtually anywhere. Analysts think that organic light-emitting diode (OLED)-based technology holds the most promise. OLED displays, which are based on luminescent molecules, would be lighter and more energy efficient than liquid-crystal displays, and require no backlighting. Experts also believe that roll-up displays could be fabricated if the molecules can be patterned onto a plastic substrate. Despite the possibilities, Les Polgar of Kodak's display products unit does not expect OLEDs to supplant LCD technology until 2010. Before pursuing OLED technology, display manufacturers are waiting to draw substantial profits from their LCD technology investments, while a method for mass-producing OLED displays has yet to be perfected. However, Cambridge Display Technology has made progress in the development of a method to make OLED screens using a process based on ink-jet printers.
    (Access to this site is free; however, first-time visitors must register.)

  • "New PCs Likely to Cede Some Control"
    Associated Press (11/03/02); Fordahl, Matthew

    Whether it goes by the name "Palladium," "LaGrande," or "trusted computing," the initiative to make digital copyrighted content secure is seen by some as a threat to consumers' freedom. The goal of these efforts is to create an environment in which all communications are facilitated by "trusted agents" that would dictate how content can be used or distributed, in accordance with policies set by copyright holders, senders, or recipients. Advocates claim that such a development will encourage users who have been dissuaded by the Internet's security holes to engage in online transactions, thus spurring e-commerce. On the other hand, end-user computers that do not comply with the rules imposed by content creators could be labeled "untrusted" systems and subsequently blocked from accessing or copying content; some of the enforcement methods used could be draconian, warns Cambridge University security researcher Ross Anderson. Responding to allegations that sensitive documents and email could be deleted by copyright owners as one method of enforcement, Microsoft Palladium manager Peter Biddle asserts that such material would be protected by laws that prohibit the shredding of vital documents. Another way to restrict the use of copyrighted content would be to make software or hardware incompatible, notes Seth Schoen of the Electronic Frontier Foundation.
    Click Here to View Full Article

  • "Future Gear: Tiny Chips, Everywhere"
    PCWorld.com (10/30/02); Captain, Sean

    Radio-networked smart chips will be embedded in everything in the future, in what Accenture Technology Labs researcher Paul Mackinaw calls Reality Online. In that scenario, people could identify products they see on the street, even clothing worn by other people, using an Internet-enabled PDA. Small and inexpensive smart chips, which today cost just 17 cents, would give every product and tool an identity and make many systems self-managing. Sensors could be tiny and cheap enough to be distributed throughout a crop field, for example, where they would be powered by sunlight and heat. Each node would send a signal to the nearest other one, so that together they would constitute a huge peer-to-peer sensory network that could turn on the sprinklers when soil conditions showed the plants need watering. According to estimates from the MIT Auto-ID Center, a 96-bit key system would be able to provide individual digital IDs for every single product and person in the world. All these numbers would be stored in Internet-accessible databases, though people would retain control over who tracks their personal belongings. Questions remain as to the willingness of people to participate in such a system, especially when productivity technology already exists that people do not utilize, as evidenced by PDA owners who only learn a few essential functions on their device. Mackinaw predicts that chips the size of sand grains could be ready in about five years, while the University of California at Berkeley has developed 12-cubic-millimeter sensors that could hit the market in about two years.
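    The Auto-ID Center's 96-bit claim is easy to sanity-check with back-of-envelope arithmetic; the sketch below is illustrative only, and the world-population figure is a rough assumption, not from the article.

    ```python
    # Rough check of the claim that a 96-bit code can uniquely tag
    # every product and person in the world.
    ADDRESS_SPACE = 2 ** 96           # distinct IDs a 96-bit code can express
    WORLD_POPULATION = 6_200_000_000  # rough circa-2002 figure (an assumption)

    ids_per_person = ADDRESS_SPACE // WORLD_POPULATION

    print(f"total IDs:      {ADDRESS_SPACE:.3e}")   # about 7.9e28
    print(f"IDs per person: {ids_per_person:.3e}")
    ```

    Even divided among billions of people, the space leaves on the order of 10^19 identifiers each, which is why a fixed-width 96-bit key suffices for the scenario described.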

  • "Is a Larger Net Attack on the Way?"
    MSNBC.com (10/28/02); Sullivan, Bob

    Last week's denial of service attack against the Internet's core domain name servers, despite its failure, has security experts and government officials worried that it was a preview of stronger, more sophisticated attacks that aim to take the Net out of commission. Moreover, although the incident demonstrated that the Internet's defenses and redundancy system are effective, it could generate a false sense of security. There is speculation that the assault, which took down nine of the 13 core servers with no noticeable impact on users, could have been a hacker prank, or a trial run for a group wishing to test the Net's security so that a larger, more coordinated attack could be organized later. The short duration of the attack and the abruptness with which it ended got Predictive Systems' Ed Skoudis thinking that it was a carefully planned probe. Many experts agree that there are much weaker Internet components than core servers for hackers to exploit, such as core routers, which Skoudis believes could be the next target. This particular scenario was threatening enough to prompt Howard Schmidt of President Bush's Critical Infrastructure Protection Board to make the president aware of the problem. However, Matrix NetSystems CEO Bill Palumbo feels that these dire cyberattack forecasts are "overly exaggerated," given the many routers and diverse routes in the network. Skoudis expects an attack capable of knocking out the entire Internet within two to five years, though such an attack would more likely cause inconvenience than death and destruction.

  • "'Middleware' Advances Collaborative Research and Education"
    Newswise (11/02/02)

    The National Science Foundation (NSF) released a new suite of software this week via its NSF Middleware Initiative (NMI), which aims to serve a variety of educational and corporate needs, including the management of physics datasets, earthquake simulation, and secure Web-based collaboration. NMI teams supported by the NSF include the Enterprise and Desktop Integration Technologies (EDIT) Consortium and the Grid Research Integration Deployment and Support (GRIDS) Center; NMI's GRIDS Center Software Suite is the foundation of the Virtual Data Toolkit developed by the NSF-funded International Virtual Data Grid Laboratory (iVDGL) and the Grid Physics Network (GriPhyN). The product is used to identify galactic clusters and model complex particle collisions. Eight universities, including Georgia State University, the University of Florida, and the University of Texas at Austin, have been chosen to evaluate NMI software and services. NMI applications and goals include the use of Internet2's Shibboleth by the National Science Digital Library, an initiative that caused an 80 percent to 85 percent decline in help desk calls, according to Pennsylvania State University's John Hopkins; the convergence of grid research environments with the campus enterprise; and secure single sign-on and authorization services for various computer systems at the University of Alabama at Birmingham. "It is a good sign that large NSF projects like GriPhyN, the Network for Earthquake Engineering Simulation [NEES] and the multi-site TeraGrid supercomputing system are coming to rely on NMI for stable software releases," says NMI program director Alan Blatecky. "The initiative fills a need the community has identified for persistent middleware infrastructure to benefit research and enterprise computing."

  • "A New Cryptography Uses Photon Streams"
    New York Times (11/04/02) P. C6; Markoff, John

    MagiQ Technologies will announce a quantum cryptography scheme on Monday in which code keys are transmitted as a photonic stream via fiber-optic cable. Quantum physics dictates that any attempt to observe such encrypted transmissions would change the photons and distort the information. Such a technique would be able to effectively shield electronic conversations from eavesdroppers, according to outside researchers. "MagiQ seems to be ahead of the research community in terms of making this affordable and practical," notes RSA Laboratories chief scientist Dr. Burton S. Kaliski Jr. Industry experts expect the military to be the primary market for quantum cryptography. Meanwhile, MagiQ also intends to research quantum computing with an eye toward commercialization. A commercial version of its quantum key distribution system will be available in the second half of 2003, according to the company. MagiQ and its Swiss rival ID Quantique are the first companies to pursue commercial applications of quantum cryptography technology.
    (Access to this site is free; however, first-time visitors must register.)
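    The article does not describe MagiQ's protocol, but quantum key distribution of this kind is commonly illustrated with the BB84 scheme: the sender encodes random bits in randomly chosen polarization bases, and any eavesdropper who measures in the wrong basis disturbs the photons, leaving detectable errors. The toy simulation below is a sketch of that idea under those assumptions, not MagiQ's actual system.

    ```python
    import random

    def bb84_sift(n_bits, eavesdrop=False, rng=None):
        """Toy BB84 run: returns (sender_key, receiver_key) after basis sifting."""
        rng = rng or random.Random(0)
        alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
        alice_bases = [rng.randint(0, 1) for _ in range(n_bits)]  # 0=rectilinear, 1=diagonal
        bob_bases   = [rng.randint(0, 1) for _ in range(n_bits)]

        bob_bits = []
        for bit, a_base, b_base in zip(alice_bits, alice_bases, bob_bases):
            if eavesdrop:
                e_base = rng.randint(0, 1)
                # Eve measures; a wrong basis randomizes the bit she forwards,
                # and the photon travels on in Eve's basis.
                bit = bit if e_base == a_base else rng.randint(0, 1)
                a_base = e_base
            # Bob's measurement: a matching basis reads the bit; a mismatch is random.
            bob_bits.append(bit if b_base == a_base else rng.randint(0, 1))

        # The bases (not the bits) are compared publicly; keep matching positions.
        keep = [i for i in range(n_bits) if alice_bases[i] == bob_bases[i]]
        return [alice_bits[i] for i in keep], [bob_bits[i] for i in keep]
    ```

    Without an eavesdropper the sifted keys match exactly; with one, roughly a quarter of the sifted bits disagree, which is the signal that the channel was observed.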

  • "Professor's Fame Draws Minority Students to Science, Tech"
    EE Times Online (10/30/02); Quan, Margaret

    Arizona State University electrical engineering professor Armando A. Rodriguez has drawn upon his experience as an underprivileged youth and the support he received from mentors during his education to organize a mentoring program of his own, Mosart Fame (Modeling, Simulation, Animation and Real-time control of Flexible Autonomous Machines Operating in an Uncertain Environment). An alumnus of New York Polytechnic Institute and MIT, Rodriguez has earned a Presidential Award for Excellence in Science, Math, and Engineering for Mosart Fame, which receives financial support from the National Science Foundation (NSF), Intel, Microsoft, and Lockheed Martin, among others. His program offers scholarships and mentors to minority and female graduates who opt to pursue multidisciplinary electromechanical research in his laboratory, and thus far it has awarded 130 grants of $1,000 each. The NSF reports that underrepresented minorities accounted for only 8.8 percent of U.S. science and engineering master's degree recipients in 2000. Rodriguez says that corporate scholarships designed to increase the ranks of minority science and tech professionals are rare, even though they are critical to the security of the United States. The AT&T and Lucent Technologies Cooperative Research Fellowship Programs are two initiatives that have reportedly granted almost 500 fellowships to minority and female grad students in science and engineering since 1972. Eight annual fellowship recipients are selected in each program to receive money for tuition, books, fees, summer studies, living expenses, and conference attendance support for six years, while other students get $2,000 stipends.

  • "Commerce Department Unveils Security Guidelines for U.S. Agencies"
    Computerworld Online (10/29/02); Thibodeau, Patrick

    National and economic security could be at risk because of the flawed certification and accreditation processes used to secure federal IT systems, according to a new National Institute of Standards and Technology (NIST) report. Released by the Commerce Department, the report includes guidelines for improving the way in which the federal government assesses the technology that agencies use. NIST describes the current certification and accreditation process, which involves a number of competing security certification procedures, as too complex, outdated, and expensive. "We would like to move toward the adoption of a standardized process because it allows federal agencies to better understand how their partners are dealing with the security issues," says NIST computer researcher Ron Ross, who co-authored the guidelines. NIST will accept public comment on the guidelines through the end of January. The first section of the guidelines addresses certification and accreditation, system controls, and verification procedures and techniques; the remaining two sections will be released next spring.
    Click Here to View Full Article

    To learn more about ACM's activities in regard to security issues, visit http://www.acm.org/usacm.

  • "Nanoscale LED Debuts"
    Technology Research News (11/06/02); Patch, Kimberly

    Government-funded research in Switzerland has yielded a nanoscale light-emitting diode (LED) that could be used in high-speed fiber-optic communications, as well as to produce the single photons needed for quantum cryptography. Andrea Fiore of the Swiss Federal Institute of Technology at Lausanne says the difficulty in creating such a small device lay in aligning the electrically conductive components and the flow of electric current. The tiny device was created using modified laser-making equipment, but would have been useless without the additional application of a wet oxidation technique. By quickly rusting a layer of aluminum-gallium arsenide outside the semiconductor material, which was etched using lithography, the researchers were able to isolate a smaller area of conductive material. The oxidation proceeds from the outside in, and was timed so that only the intended portion of the semiconductor material remained conductive. Fiore says high-resolution manipulation tools would have accomplished the same job, but they are expensive and cannot be used in mass-produced devices. By channeling the electricity to quantum dots in this way, the researchers produced light with a 1.3-micron wavelength, which is more efficient than current light sources used in high-speed fiber. Additionally, existing hardware that produces single photons for quantum cryptographic devices must be super-cooled and works irregularly. This device can operate at room temperature, and would be smaller and much less expensive.
    Click Here to View Full Article

  • "Paper View TV"
    Nature Online (10/31/02); Gerstner, Ed

    Peter Andersson and colleagues write in Advanced Materials that an active matrix display based on organic semiconductors can be printed on paper. These semiconductors, unlike the liquid crystal pixel components used by most current flat-screen displays, can be processed at much lower temperatures and cheaply patterned via ink-jet printing or spin-coating methods. The paper display pixel elements are a combination of poly(3,4-ethylenedioxythiophene) (PEDOT) and poly(styrene sulphonate) (PSS). Applying a voltage to the pixel induces a change in the PEDOT's chemical oxidation state, making the polymer transparent, while reversing the current renders it opaque. Such a pixel can be switched back and forth between light and dark. The voltage and current requirements of the paper display are much lower than those of conventional displays, while the bistable switching of the PEDOT's chemical state restricts power use to the updating of display content. Such a breakthrough could enable dynamic printed displays on textbooks, cereal boxes, and other paper-based products. The authors write that the technology could usher in a new form of information dissemination, although they doubt that it will rival conventional display technology.
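    The power argument above hinges on bistability: because the pixel's oxidation state persists with no applied voltage, energy is spent only when a pixel actually changes. The toy model below illustrates that accounting; the class, its names, and the per-switch energy figure are hypothetical, not from the paper.

    ```python
    class ElectrochromicPixel:
        """Toy model of a bistable electrochromic pixel: the oxidation state
        persists with no applied voltage, so energy is spent only on updates."""

        SWITCH_ENERGY_UJ = 1.0  # hypothetical energy cost per state change

        def __init__(self):
            self.opaque = True       # reduced (dark) state
            self.energy_used = 0.0   # accumulated switching energy, in microjoules

        def set_opaque(self, opaque):
            if opaque != self.opaque:  # only an actual transition costs energy
                self.opaque = opaque
                self.energy_used += self.SWITCH_ENERGY_UJ

    pixel = ElectrochromicPixel()
    pixel.set_opaque(False)  # oxidize: switch to transparent (costs energy)
    pixel.set_opaque(False)  # holding the displayed state is free
    pixel.set_opaque(True)   # reduce: switch back to opaque (costs energy)
    ```

    A static page therefore draws essentially no power, unlike an LCD, which must be continuously driven to hold an image.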

  • "Q&A: Kevin Mitnick"
    San Francisco Chronicle (10/28/02) P. E1; Kopytoff, Verne

    Former computer hacker Kevin Mitnick, who is now a computer security consultant and co-author of the new book, "The Art of Deception," attributes many instances of hacking to social engineers who trick people into revealing sensitive passwords, source code, and other information that allows them to exploit corporate networks, often in the guise of co-workers or suppliers. He says that to protect themselves, companies should be aware of the threat so that workers can recognize such intrusions and report them to other departments. Mitnick contends that companies may have the hardware and software to defend themselves against cyberattacks, but remain vulnerable because workers have not received adequate training to use such tools. In his opinion, the cyberterrorism threat is overstated, since terrorist groups prefer the more symbolic impact of physical attacks to the financial losses and headaches associated with cyberattacks. Mitnick notes that most antivirus companies do not offer spyware detection services, which is unfortunate because many people use spyware to monitor others. He thinks that the goal of the White House's cybersecurity plan is to garner more funding and generate fear so that the government's surveillance powers can be broadened, but he says the government should let individual organizations build security without regulation. Mitnick admits that security companies are sensationalizing the hacker threat in order to boost sales, but does not deny the existence of the threat.
    Click Here to View Full Article

  • "Daniel Burrus: On 50 Years From Now"
    Electronic Design (10/21/02) Vol. 50, No. 22, P. 138; Schneiderman, Ron

    Futurist Daniel Burrus has authored several books about the use of technology and helps companies such as Microsoft, IBM, Toshiba, and AT&T forecast future tech trends. When asked how technology will be used in the next 50 years, he says that many existing technologies, such as the keyboard, will not be eliminated but will become one of many options users have. Information will grow as database technology and connectivity advance, but humans will be able to keep up with technology and information through the use of better learning methods, such as just-in-time learning and virtual reality. Electrical engineering, however, will be specialized in the same way that medicine is today, since the underpinnings of technology will become so complex. However, for end users, some applications and devices will actually become simpler and easier to use, such as cell phones that fit in a person's ear. These phones, for instance, would be used only for voice calls and voice mail, but would perform those functions very well using discrete technology. Burrus says the current hype surrounding wireless computing is justified to the extent that the technology satisfies a social need. Additionally, although the current generation is repelled by the thought of implanted technology, future generations will not see an implanted GPS device, for example, as unappealing. Burrus sees a huge opportunity opening up in the convergence of biology and IT, especially as scientists are able to create custom molecules that serve specific electronic purposes.
    Click Here to View Full Article

  • "Feeling Flat"
    Economist (10/26/02) Vol. 365, No. 8296, P. 75

    The integration of both photons and electrons in computer chips would allow for better ways of handling data. However, engineers are having a difficult time building "optoelectronic" chips for general use because photons are much more wayward and bigger than electrons. Nonetheless, a breakthrough appears to have been made in Graz, Austria, as researchers work to bridge the gap by using packets of energy known as surface plasmon polaritons (SPPs), which are not photons or electrons but rely on them for their existence. While the Austrian researchers have been able to transfer SPP-borne data to electrons for processing and back into photons, more conventional components would make the breakthrough even better. One physicist at the University of Exeter in England, Bill Barnes, is already working to find materials that would allow for the passage of polaritons. Materials that would change state from a conductor to a dielectric when an SPP passes by would turn on corresponding switches in electronic circuitry. This would complete the link between optical and electronic data.
    Click Here to View Full Article

  • "A Spatial Web Is Being Spun"
    GeoWorld (10/02) Vol. 15, No. 10, P. 28; Schell, David

    OpenGIS Consortium President David Schell writes that the growth of spatial content, services, and applications on the Internet is forming a "Spatial Web" that should be looked upon "as one of humanity's critical infrastructure elements and cultural resources." He posits that a Spatial Web will facilitate faster decision-making and workflow by providing information in real time, and foster a more creative, intuitive mentality that should significantly impact all geoprocessing application domains, including urban planning, research, transportation, and environmental management. Schell writes that it must be established that the Spatial Web's self-organization is based on a series of basic principles, precedents, and human intentions. A Spatial Web is built upon early geoprocessing systems that helped make the geoprocessing field invaluable to many activities; it is organized around interoperability, decentralization, systems that can be trusted, and democratic, open, and universal access; and it must be run in a manner that maintains its use as a community resource that is not unduly controlled or influenced by any government or company. Schell notes that the geospatial community is composed of companies, agencies, and academic bodies that are becoming increasingly involved in the setting of Spatial Web standards. He writes that the consensus standards processes they participate in will protect the public interest and encourage healthy business. Some spatial content, services, and applications will be offered for free, while others will be available for purchase. Likewise, some will rely on decentralized delivery, others on centralized delivery.

  • "New Links for Learning in a Changing Profession"
    Aerospace America (10/02) P. 28; Noor, Ahmed K.; Lobeck, William E.

    NASA is supporting a consortium of universities to create an advanced learning network for future engineers, using the latest in networking, e-learning, and collaborative technologies. The Hierarchical Learning Network (HLN) will also enlist assistance from industry, IT vendors, and other government entities, each of which will contribute its expertise. Separate portal-based networks will be created for e-learning, virtual classrooms, test simulators, and telescience laboratories. The e-learning component includes synchronous and asynchronous expert-led learning, as well as individual learning augmented with software agents, and collaborative learning environments enhanced with the latest in human-computer sensory interfaces. Haptics, natural speech communication, and other advanced interface technologies being developed at MIT Media Lab will be included in the learning modules. Also unique to the HLN will be the availability of NASA's wind tunnel and structural testing facility simulators, which are currently under development. Special equipment such as the University of Illinois' scanning probe microscopes will also be available over the HLN. These telescience capabilities are expected to work in conjunction with physical experiments, thus allowing the most effective use of both methods of experimentation. Organizers of the HLN expect that it will produce a generation of engineers who are able to make use of advanced learning tools to learn throughout their careers, use technology to do their jobs, and adapt to a rapidly changing and multidisciplinary environment.

  • "The Big Picture"
    CIO Insight (10/02) No. 29, P. 48; Baker, Edward H.

    Yale University professor and Mirror Worlds Technologies co-founder David Gelernter has developed a software program that aims to arrange all information into a chronological stream of files that can be instantly rearranged by topic; such a product serves his view that knowledge management should mirror the narrative structure of life and human memory, which has a past, present, and future. He insists that "People...don't want to be boxed in by an operating system or any particular machine," and describes information as isolated pieces of data that can be organized into knowledge, which he terms "the big picture." Gelernter observes that individuals, businesses, and the country itself are poorly informed, and attributes this to the overwhelming amount of data people receive without an overarching context in which to understand it. Reducing this data does not mitigate the problem, he explains, because the problem stems from flawed information and knowledge structures. Gelernter subscribes to the belief that software must directly support corporate or communal memory as an organization's most basic resource and most critical asset. He complains that we are still stuck in the old, flawed ways of knowledge management, and emphasizes the poor relationship between the public and technology. Gelernter argues that the industry's refusal to consider actual user needs while focusing on adding bells and whistles to its products has curtailed the public's enthusiasm toward technology, while the public seems reluctant to admit its timidity. "What I see in the technology world is more continuation of the habit of going back to old solutions and making them more complicated and adding new features and squeezing them harder rather than considering the possibility of what can we do differently now that computers are so much more powerful than they used to be," he says.
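    The stream-of-files idea described above can be sketched in a few lines: every item lives in one time-ordered stream, and rearranging "by topic" is a filtered view rather than a new storage hierarchy. The names and structure below are a hypothetical illustration of that concept, not Mirror Worlds' actual software.

    ```python
    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Item:
        timestamp: datetime
        topic: str
        text: str

    class Lifestream:
        """Toy sketch of a chronological stream: one past-to-future ordering,
        with topic "substreams" computed as filtered views of it."""

        def __init__(self):
            self._items = []

        def add(self, item):
            self._items.append(item)
            self._items.sort(key=lambda i: i.timestamp)  # keep chronological order

        def substream(self, topic):
            # Rearranging by topic is a filter over the one stream,
            # not a separate folder hierarchy.
            return [i for i in self._items if i.topic == topic]
    ```

    The design choice the sketch highlights is Gelernter's: time, not folders, is the primary organizing axis, so context (what came before and after) is always preserved.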
