
       HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 5, Issue 540:  Wednesday, September 3, 2003

  • "As Digital Vandals Disrupt the Internet, a Call for Oversight"
    New York Times (09/01/03) P. A1; Harmon, Amy

    The growing sophistication and frequency of computer virus attacks, such as those that afflicted systems in recent weeks, are making government oversight of cybersecurity a more palatable concept for many people. Michael A. Vatis, former head of the FBI's National Infrastructure Protection Center, says that voluntary, private-sector initiatives to produce more secure software and more robust systems are inadequate. An Aug. 31 survey from the Pew Internet and American Life Project estimates that almost 60% of Internet users want the government to force U.S. companies to disclose more information about their security flaws, and 50% of respondents expressed concern about cyberterrorism. Policymakers and computer security experts are considering several federal regulatory proposals, such as promising tax incentives for businesses that beef up their security and requiring public companies to reveal possible computer security risks in filings to the Securities and Exchange Commission. Proponents of more cybersecurity regulation think that a recently passed California law could serve as a template for national policy: The mandate requires companies that do business in California to notify residents of intrusions that may have compromised their personal data, and allows customers to file civil suits against businesses that fail to comply. Critics complain that the White House has made cybersecurity a low priority--former White House cybersecurity adviser Richard A. Clarke laments, "I kind of despair of the government doing anything." The position Clarke once held was filled by Howard Schmidt, who resigned from the post in April. Some Internet users also caution that overly stringent federal cybersecurity legislation could hurt the openness and usability of the Internet.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "European Patent Law Draws Fire"
    Wired News (09/01/03); Scheeres, Julia

    The European Parliament is scheduled to vote in September on a proposed patent law that is supposed to reconcile software patent legislation among European Union member states, but critics contend the measure endangers both Europe's open-source movement and software innovation. "Computer-implemented inventions are increasingly important, yet many of the 20,000 patents for software-related inventions already granted in Europe are in non-European hands," wrote European Parliament member and proposal author Arlene McCarthy in June. "Indeed, we would do small and medium-sized European software developers a disservice if we were either to leave matters as they stand, or if we were to attempt to ban all patents for such inventions, thus potentially putting our software developers at a disadvantage when they seek to compete in the U.S." Opponents such as the Foundation for a Free Information Infrastructure (FFII) counter that the measure would only benefit large software companies, allowing them to drive smaller competitors out of business by charging excessive licensing fees: Because most software is built atop existing code, programmers would have to spend heavily on lengthy patent searches to make sure their code does not violate other patents. The proposal is especially galling to open-source proponents, because the open-source movement relies on the free distribution of software code. Leading economists from European and American universities recently sent an open letter to the Parliament criticizing the proposal, while opponents held a protest demonstration in Brussels and organized the temporary shutdown of Web sites last week. Over 170,000 people have signed a petition calling for the proposal's rejection. The European Parliament now says it will vote on the proposed law at a plenary session starting September 22.
    Click Here to View Full Article

  • "Scientists Plan New Supercomputer Linking System"
    Associated Press (09/02/03)

    Researchers worldwide are collaborating to create an international data grid that would make supercomputing resources available to ordinary users. Fermi National Accelerator Laboratory (Fermilab) scientist Lothar Bauerdick, whose laboratory is one of the project participants, calls the effort "a democratization of science" and expects the international data grid to transform society. Argonne National Laboratory computer scientist Ian Foster is also working on the global grid, which he says will allow researchers worldwide to manage the dramatic increase in scientific data. Foster, widely considered the father of the data grid, says this type of grid challenges scientists to find questions that make use of the available resources. Fermilab and other U.S. and European research centers have already set up grid computing links to test initial feasibility, but the final vision is much larger, according to Fermilab computing division head Vicky White. She says the international data grid would allow users in poorer regions to run simulations for any purpose, whether it be betting on horses, analyzing the stock market, or conducting high-energy physics experiments.
    Click Here to View Full Article

  • "Heavyweights Sending R&D Overseas"
    Silicon Valley Biz Ink (08/29/03); Tanner, Steve

    Offshore IT outsourcing is no longer limited to low-level programming, as evidenced by the research and development work being exported to inexpensive overseas labor markets by major Silicon Valley companies. Washington Alliance of Technology Workers President Marcus Courtney says this trend will have an impact on almost every U.S. tech worker, and thinks Silicon Valley and other traditional centers of tech innovation could become less attractive as a result. Intel recently announced plans for an $80 million R&D facility in Malaysia; company representative Chuck Mulloy did not say how many employees would be hired, insisting that Intel's long-term goal "is to keep the U.S. work force stable and to expand outside the United States." Yahoo! disputed claims of a new Indian R&D center reported by Reuters and its own Indian news site, though Yahoo! India CEO Venkat Panchapakesan was quoted on Rediff.com India as saying that an already established R&D facility will boost its staff soon. Meanwhile, an Aug. 19 Indiatimes.com story quoted Silicon Graphics COO Warren Pratt as saying that his firm was open to offshoring both low-end and high-end jobs. Adobe outsources R&D work to several countries, and company representative Katie Juran says such a strategy complements the global reach of Adobe's products and services. But Courtney argues that companies are using such claims as a rationale for outsourcing jobs to minimize operational costs. "They do not want the public and their employees to know about their true plans," he claims, adding that he believes high-level tech professionals will mobilize to fight outsourcing.
    Click Here to View Full Article

  • "MIT to Uncork Futuristic Bar Code"
    CNet (08/29/03); Gilbert, Alorie

    Researchers and large corporate interests are uniting behind a new product-tracking technology called the electronic product code (EPC). The 96-bit format would assign a unique number to each individual item, unlike the group identifiers that today's bar code standard allows for (a rough sketch of how such a code might be partitioned appears below). Combined with radio-frequency identification (RFID) technology, EPC would allow retailers to manage their supply chains and inventory with unprecedented accuracy and efficiency. MIT's Auto-ID Center is set to unveil its EPC Network system on Sept. 15 at the EPC Symposium in Chicago, a meeting that will host companies such as Colgate-Palmolive, General Mills, GlaxoSmithKline, Heinz, Kraft Foods, and Nestle. Although ePC Group analyst Pete Abell calls the symposium a milestone in the development of EPC technology, he says a worldwide system will not be deployed for at least another decade. The microchips that would be used in EPC systems today cost just pennies, but the price would need to drop still further for them to be placed on every single item; also, the Auto-ID Center's efforts have been geared more toward research than toward actual deployment, says Abell. Another EPC group, called AutoID, is sponsored by the Uniform Code Council, the worldwide standards body in charge of the Universal Product Code; that group will work on the actual standards retailers need in order to use EPC technology and should release draft specifications soon. Wal-Mart, meanwhile, is spurring RFID adoption by requiring its largest suppliers to attach RFID tags to pallets and cases beginning in 2005.
    Click Here to View Full Article
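
    For illustration only, here is a minimal Java sketch of how a 96-bit code of this kind might be packed and unpacked. The field widths follow one early published Auto-ID Center partitioning (8-bit header, 28-bit EPC manager, 24-bit object class, 36-bit serial number); treat the layout, class names, and sample values as assumptions for the example, not as the final EPC standard.

      import java.math.BigInteger;

      // Illustrative packing/unpacking of a 96-bit EPC into four fields.
      // Field widths (8 + 28 + 24 + 36 = 96 bits) are an assumed layout for
      // demonstration, not a normative specification.
      public class Epc96Sketch {

          static BigInteger pack(long header, long manager, long objectClass, long serial) {
              return BigInteger.valueOf(header).shiftLeft(88)
                      .or(BigInteger.valueOf(manager).shiftLeft(60))
                      .or(BigInteger.valueOf(objectClass).shiftLeft(36))
                      .or(BigInteger.valueOf(serial));
          }

          static long field(BigInteger epc, int shift, int bits) {
              return epc.shiftRight(shift)
                        .and(BigInteger.ONE.shiftLeft(bits).subtract(BigInteger.ONE))
                        .longValue();
          }

          public static void main(String[] args) {
              // Hypothetical manufacturer (manager), product line (object class),
              // and per-item serial number: every individual item gets its own code.
              BigInteger epc = pack(0x01, 0x0000F39, 0x000A89, 123456789L);
              System.out.println("EPC (hex): " + epc.toString(16));
              System.out.println("manager:   " + field(epc, 60, 28));
              System.out.println("class:     " + field(epc, 36, 24));
              System.out.println("serial:    " + field(epc, 0, 36));
          }
      }

    Unlike a UPC bar code, which identifies only the product type, the per-item serial field here is what lets two otherwise identical items carry distinct codes.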

  • "Processor Adapts to Shifting Loads"
    EE Times (09/02/03); Lammers, David

    Computer architects at the University of Texas at Austin have been awarded $11 million by the Defense Advanced Research Projects Agency (DARPA) to co-develop, with IBM's Austin Research Lab, a prototype polymorphic processor that exploits instruction-level parallelism. The devices will incorporate a quartet of Tera-op Reliable Intelligently Adaptive Processing System (Trips) processors, each consisting of 16 execution units arrayed in a 4 x 4 grid (a toy illustration of issuing instructions to such a grid appears below). "Our thinking is that the hardware really ought to adapt to the software running on it, so Trips morphs to the characteristics of the software running on it," explains Chuck Moore of UT-Austin. Jeff Burns, the Trips project leader at IBM Research, says the company will devise targeted design tools to control power and employ them to organize a grid that shuts off its power supply whenever the opportunity presents itself. The UT-Austin architects behind Trips, Steve Keckler and Doug Burger, were convinced that increasing wiring delays would impede the ability to scale the performance of conventional processor architectures. Trips adjusts to different applications by exploiting parallelism at the instruction, thread, and data levels. The Trips compiler splits conventional programs into hyperblocks, which are loaded so that they travel down trees of interconnected execution units. Trips developers expect to have working prototypes ready before 2006.
    Click Here to View Full Article
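
    As a toy illustration of the instruction-level parallelism idea (and only that; this is not the actual Trips dataflow or mapping algorithm), the Java sketch below greedily issues the ready instructions of one randomly generated "hyperblock" onto at most 16 execution slots per cycle, mimicking a 4 x 4 grid of execution units.

      import java.util.ArrayList;
      import java.util.List;
      import java.util.Random;

      // Toy list scheduler: each "cycle" issues up to 16 ready instructions
      // (a 4 x 4 grid of execution slots). Dependencies are random and only
      // point to earlier instructions, so the schedule always completes.
      public class GridIlpSketch {
          public static void main(String[] args) {
              int n = 40;                              // hypothetical 40-instruction hyperblock
              Random rng = new Random(7);
              List<List<Integer>> deps = new ArrayList<>();
              for (int i = 0; i < n; i++) {
                  List<Integer> d = new ArrayList<>();
                  if (i > 0 && rng.nextInt(3) > 0) d.add(rng.nextInt(i));
                  deps.add(d);
              }

              boolean[] done = new boolean[n];
              int completed = 0, cycle = 0;
              while (completed < n) {
                  List<Integer> issued = new ArrayList<>();
                  for (int i = 0; i < n && issued.size() < 16; i++) {
                      if (done[i]) continue;
                      boolean ready = true;
                      for (int d : deps.get(i)) if (!done[d]) ready = false;
                      if (ready) issued.add(i);
                  }
                  for (int i : issued) done[i] = true; // all issued slots "execute" in parallel
                  completed += issued.size();
                  System.out.println("cycle " + cycle++ + ": issued " + issued.size()
                                     + " instructions " + issued);
              }
              System.out.println(n + " instructions in " + cycle + " cycles");
          }
      }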

  • "At CMU, Scientists Are Building Sense Into Cell Phones"
    Pittsburgh Post-Gazette (09/02/03); Spice, Byron

    Carnegie Mellon University graduate and undergraduate students have developed SenSay (Sensing and Saying), cell phone technology that is "context-aware" of the user's situation and activities, which Human-Computer Interaction Institute director Dan Siewiorek calls "a natural evolution" of the mobile communications medium. SenSay's sensory element consists of a microphone that picks up the user's voice to determine whether he or she is engaged in conversation; another microphone that picks up background noise; a light sensor that ascertains whether the device is enclosed in a pocket; and an accelerometer that establishes whether the user is in motion or stationary. Input from these sensors can help the phone decide, for instance, whether it should hold a call so it will not interrupt the user's conversation, ring louder if its surroundings are especially noisy, or vibrate if it is inside the user's pocket (a simplified sketch of such decision logic appears below). If the user needs to return a call, a SenSay phone could "ping" the other person's number to determine whether that person is unoccupied, and then prompt its user to return the call. The CMU researchers want to make SenSay capable of recognizing four individual states--busy and not to be interrupted, idle, physically active, and "normal." The phone is not yet able to distinguish between a user who is idle and one who is engaged in a passive activity, such as watching a movie or a play. Siewiorek says such capabilities could be added if the device can access the location-sensing technology of the cell phone network. Several dozen self-contained SenSay phones are expected to be built for testing in the Defense Advanced Research Projects Agency's Reflective Agents with Distributed Adaptive Reasoning project, which aims to produce a "personalized cognitive assistant" that can act as an electronic secretary, according to Siewiorek.
    Click Here to View Full Article
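
    A simplified Java sketch of the kind of decision logic the article describes follows; the sensor inputs, thresholds, and state names are illustrative assumptions, not CMU's actual SenSay implementation.

      // Simplified sketch of context-aware ringer logic. Sensor names,
      // thresholds, and the four states are illustrative assumptions only.
      public class SenSaySketch {

          enum State { BUSY, IDLE, ACTIVE, NORMAL }
          enum Action { HOLD_CALL, RING_LOUD, VIBRATE, RING_NORMAL }

          static State classify(boolean userTalking, double ambientNoiseDb,
                                boolean inPocket, boolean userMoving) {
              if (userTalking) return State.BUSY;      // voice mic: conversation in progress
              if (userMoving)  return State.ACTIVE;    // accelerometer: walking, running, etc.
              if (!inPocket && ambientNoiseDb < 40) return State.IDLE;  // quiet, phone in view
              return State.NORMAL;
          }

          static Action decide(State s, double ambientNoiseDb, boolean inPocket) {
              switch (s) {
                  case BUSY:   return Action.HOLD_CALL;            // do not interrupt a conversation
                  case ACTIVE: return inPocket ? Action.VIBRATE : Action.RING_LOUD;
                  default:     return ambientNoiseDb > 70 ? Action.RING_LOUD
                                      : inPocket ? Action.VIBRATE : Action.RING_NORMAL;
              }
          }

          public static void main(String[] args) {
              // Noisy street, phone in pocket, user walking: expect VIBRATE.
              State s = classify(false, 75.0, true, true);
              System.out.println(s + " -> " + decide(s, 75.0, true));
          }
      }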

  • "Spamming Sleazebags Ruining E-Mail"
    SiliconValley.com (08/31/03); Gillmor, Dan

    Dan Gillmor places the blame for email's declining appeal mainly on the shoulders of unscrupulous virus writers and spammers who exploit poor software and oblivious users, ISPs, and systems administrators. He notes that spammers would be undone if enough people simply stopped purchasing things in response to their unsolicited entreaties, but because that seems unlikely, he argues that more stringent legislation is needed. Even so, Gillmor estimates the chances of resolving the spam dilemma relatively quickly are "next to zero." He adds that virus and worm writers, who cover their tracks in much the same way spammers do, are exacerbating the situation, and they have found unwitting help in companies such as Microsoft, whose software architecture is notoriously exploitable and homogeneous. Worse, Gillmor severely doubts that Microsoft will voluntarily move beyond the business model that fosters the continued support of consistently lousy software. He lays better odds on systems administrators and users overcoming their own laziness and updating their systems, but does not think enough are doing so yet. Meanwhile, chances are higher that ISPs will improve their security measures by providing firewalls and email virus protection to customers as standard service features. Gillmor acknowledges the potential of a radical new email architecture that would track miscreants through authentication and other safeguards, but warns that deploying it would be very difficult and would cost users their anonymity. His own strategy is to never open email attachments unless he knows the message contents in advance or is very sure they do not contain malware; to relegate critical communications to private email addresses he hands out to a small group of people; and to use instant messaging or other alternative Internet communication options.
    Click Here to View Full Article

  • "Chilly Future May Await Tomorrow's Computers"
    NewsFactor Network (09/02/03); Martin, Mike

    David Reid of the Institute of Physics notes that quantum computers could compute 1,000 times faster than classical machines, but Laszlo Kish of Texas A&M observes that they may also produce at least 100 times more heat than conventional computers. Reid explains that quantum computers perform calculations so rapidly that the buildup of errors, coupled with the heat output, could cause the hardware to burn out. However, he adds that Slovenian researchers Tomaz Prosen and Marko Znidaric from the University of Ljubljana have devised a "quantum freeze of fidelity" that can eliminate most of these errors and keep quantum computers operating at high speed. The quantum freeze is a mathematical model that enables programmers to concentrate on, impede, and dispose of the biggest errors before they can spread. "The mechanism of the quantum freeze of fidelity, which drastically enhances the stability of quantum computation, is simple and yet has no analogue in classical physics," notes Prosen. "It is a kind of quantum demon at work." It may be more of a challenge to cool down overheated quantum hard drives, Kish observes.
    Click Here to View Full Article

  • "More Power to Portable Electronics"
    San Francisco Chronicle (09/01/03) P. E1; Evangelista, Benny

    The growing power consumption of increasingly sophisticated portable devices is boosting the appeal of micro fuel cells, which promise to extend conventional battery life by a factor of 10. Allied Business Intelligence analyst Atakan Ozbek forecasts that the first micro fuel cell-powered portable devices will debut in 2004, and he expects worldwide micro fuel cell sales to reach approximately $2 billion to $3 billion by 2011. Startup companies such as PolyFuel and major firms such as Motorola and NEC are developing micro fuel cells, which generate electricity through a chemical reaction between oxygen and a fuel such as methanol or hydrogen; NEC plans to roll out a methanol-powered laptop by 2005, and PolyFuel has produced a thin polymer membrane designed to help channel the chemically produced electrons that drive the device. PolyFuel CEO Jim Balcom says the industry's long-term goal is to sell off-the-shelf fuel cell cartridges so that used-up cells can be thrown away like conventional AA batteries. He estimates that micro fuel cells and their replacement cartridges will cost between $1 and $5 a day, but developers face the double challenge of lowering the cost and shrinking the size of the cells so they can be plugged into standard cell phones or laptops. Hydrogen & Fuel Cell Investor editor David Redstone doubts that commercial fuel cells will hit the market next year, given that the technology has been under development for over two decades yet still fails to match the energy capacity of current lithium-ion batteries. Meanwhile, methanol-powered micro fuel cells will be barred from use in certain situations--such as on aircraft--without government approval. Balcom insists that "The wireless revolution can't happen without portable fuel cells."
    Click Here to View Full Article

  • "Voting By Net Proxy?"
    ABCNews.com (09/02/03); Jordan, Andy

    Futurist Jason Tester foresees a chilling corruption of democracy if the development of electronic voting systems continues along its current path: software agents that vote for candidates on their users' behalf, even when the users are unaware of, or uninterested in, the candidates' backgrounds and their stances on critical issues. "I think the service, if it existed, would try to play upon the guilt people feel for not voting," muses Tester. The futurist has gone so far as to develop an agent, dubbed Constituty, that combs through users' computer habits--frequently visited Web sites, keywords and emoticons in instant messages, online bank accounts, etc.--to construct a political ideology and make voting recommendations. On his AcceleratedDemocracy.net Web site, Tester outlines four possible future voting scenarios: In the first, Constituty is commercially available and able both to recommend candidates and to offer to vote for the user, regardless of whether the user asks how the agent reached its recommendation; the second involves location-based voting, in which a voter must spend time in the woods in order to vote on a park-preservation ballot measure; the third, "Exercise Your Vote," would grant voting power only to people who are knowledgeable about candidates and issues, with an incentive system set up to reward such voters with political payback; and the fourth is a tracking system that lets voters see how well candidates deliver on or break their campaign promises. Stanford University computer science professor David Dill does not think voting-agent technology will emerge in the near future. "To have a computer program try to guess how you're going to vote when who knows what kind of logic it's using, and who programmed that agent, is well beyond anything I would consider acceptable," he explains.
    Click Here to View Full Article

  • "Cellophane Turns LCDs 3D"
    Technology Research News (09/03/03); Smalley, Eric

    University of Toronto researcher Keigo Iizuka has discovered a cheap method of creating 3D displays by covering half of a laptop screen with a sheet of cellophane. The screen displays two copies of an image with different polarizations, which appear three-dimensional when viewed through glasses with crossed polarizers. Iizuka observes that there is no need even to wear such glasses if a crossed-polarizer sheet is suspended between the screen and the viewer. The top layers of the screen itself act as polarizer sheets, and Iizuka confirmed through experimentation that a sheet of ordinary, 25-micron-thick cellophane can rotate the direction of white-light polarization by 90 degrees, behaving like a half-wave plate (the standard relation is sketched below). "The advantage of such a 3D display is that it is easy to fabricate with readily available components at minimum cost," he notes, adding that the cellophane rotates the polarization direction of white light better than commercial half-waveplates that cost 3,500 times more. The material can also be used to make very large 3D displays because it is available in large sheets, says Iizuka. Cellophane gains its polarization-rotating ability when it is extruded during manufacture; Iizuka explains that the stress the material undergoes gives it anisotropic (birefringent) properties. The researcher is currently attempting to use the method to devise 3D displays that can accommodate sign language.
    Click Here to View Full Article
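
    For readers who want the underlying optics, the standard Jones-calculus relation below (a textbook result, not taken from Iizuka's paper; the overall phase factor is omitted) shows how a half-wave retarder with its fast axis at 45 degrees turns horizontal polarization into vertical, which is the 90-degree rotation the cellophane sheet supplies over half of the screen.

      \[
      H(\theta) =
      \begin{pmatrix} \cos 2\theta & \sin 2\theta \\ \sin 2\theta & -\cos 2\theta \end{pmatrix},
      \qquad
      H(45^\circ)\begin{pmatrix} 1 \\ 0 \end{pmatrix}
      = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix} 1 \\ 0 \end{pmatrix}
      = \begin{pmatrix} 0 \\ 1 \end{pmatrix}
      \]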

  • "Platform Internet: The Promise of Grid Computing"
    TechNewsWorld (08/29/03); Chui, Willy

    The Internet continues to evolve into a platform for ever more services and applications: Since its inception as a Defense Department project more than 30 years ago, the Internet and the subsequent World Wide Web have benefited from open standards and early involvement by technical and academic groups; now grid computing technology is following the same path, with universities and pharmaceutical research labs pushing the concept and its underlying standards. When mature, grid technology will turn the Internet into a ubiquitous computing resource that unlocks the unused processing power of connected server machines, many of which currently operate at just a fraction of their full capacity. Linking them together through special software and standards that go beyond basic interoperability would create a global-scale computing resource and allow the creation of "virtual organizations," or loose collaborations that exploit grid computing. Eventually, the maturation of grid computing technology will mean a computing utility as reliable and omnipresent as today's electrical utilities. Research groups are already exploring uses of grid computing: The Mayo Clinic has connected its own medical databases to outside resources for more effective cancer treatment, and the University of Pennsylvania is constructing a computing grid that makes powerful breast cancer diagnostic tools available to a wide range of institutional users. In the commercial realm, grid computing is being used by manufacturing firms for engineering, by financial services companies for analysis, and by the energy industry for seismic exploration. IBM's Blue Gene supercomputer represents the development of another important grid computing component, autonomic computing--the supercomputer features 8 million autonomous elements that work together and self-correct.
    Click Here to View Full Article

  • "'Smart Dust' Could Lead to Tiny Robots"
    Photonics.com (08/28/03)

    University of California at San Diego (UCSD) chemists have created sensors approximately the width of a human hair that can identify oily liquids in water. The researchers say the small silicon chips are a first step toward dust-sized robots that could operate in tiny environments to destroy cancer cells or identify toxic substances. UCSD professor Michael Sailor says, "The vision is to build miniature devices that can move with ease through a tiny environment...to specific targets, then locate and detect chemical or biological compounds and report this information to the outside world." The researchers used chemicals to etch pores into one side of a silicon chip, a treatment that also made that side hydrophobic, or water-repellent. The other side of the chip was treated differently so that it took on a different color and became hydrophilic, or attracted to water. The chip was then broken into tiny pieces using vibrations and added to water. When an oily substance was put in the water, the hydrophobic side attached to it and absorbed some of the substance into its pores, and with certain substances the change in the outward-facing, chemically doped side produced a color shift visible to the naked eye. The technique is significant because the sensor particles effectively work together and do not rely on a single reflective sensor, says graduate student Jamie Link, the first author of the research paper. The research was funded by the National Science Foundation and the Air Force Office of Scientific Research.
    Click Here to View Full Article

  • "Corporate Data in Hand"
    InfoWorld (08/25/03) Vol. 25, No. 33, P. 42; Thompson, Tom

    The J2ME platform--a streamlined version of Java--supports the widest spectrum of embedded and mobile devices and is highly secure, but there are tradeoffs. J2ME applications can run on any embedded device with a Java runtime with few if any alterations, but differences among Java runtime implementations can lead to compatibility difficulties. Developers can generate custom applications with a minimum of effort by taking advantage of the platform's application programming interfaces (APIs), but they cannot use features that lack an available API, leaving them little choice but to turn to vendor-specific APIs. J2ME APIs offer readily accessible network support over wired or wireless links; however, J2ME's Mobile Information Device Profile (MIDP) does not support data transfers via any protocol other than HTTP (a minimal HTTP-fetch sketch appears below), again forcing developers to rely on vendor-specific APIs. Programmers can test code on a PC before trying it out on resource-constrained hardware, but compatibility or UI design problems may stem from discontinuities between the PC simulator and the device's J2ME implementation. J2ME levels the playing field for developers and vendors and delivers best-of-breed software frameworks, but the standards approval process is slow. Such issues are being addressed: Last November, the Java Community Process (JCP) revised and enhanced the MIDP standard with various critical APIs, including support for secure HTTP-based links; the amended specification also allows trusted code to be verified with digital signatures. The JCP has additionally proposed a Java Technology for the Wireless Industry specification that converts currently optional J2ME APIs into standard services.
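
    As a minimal sketch of the HTTP-only transport noted above, the MIDlet below fetches data over MIDP's Generic Connection Framework; the URL is a placeholder, error handling is abbreviated, and the code assumes a generic CLDC/MIDP runtime rather than any particular vendor stack.

      import java.io.InputStream;
      import javax.microedition.io.Connector;
      import javax.microedition.io.HttpConnection;
      import javax.microedition.midlet.MIDlet;

      // Minimal MIDP sketch: fetch data over HTTP, the one transport the MIDP
      // profile guarantees. Real code would report results through the LCDUI
      // rather than System.out and close resources in finally blocks.
      public class HttpFetchMIDlet extends MIDlet {

          protected void startApp() {
              try {
                  HttpConnection conn =
                      (HttpConnection) Connector.open("http://example.com/report.csv");
                  conn.setRequestMethod(HttpConnection.GET);
                  if (conn.getResponseCode() == HttpConnection.HTTP_OK) {
                      InputStream in = conn.openInputStream();
                      byte[] buf = new byte[256];
                      int n;
                      StringBuffer body = new StringBuffer();
                      while ((n = in.read(buf)) != -1) {
                          body.append(new String(buf, 0, n));
                      }
                      System.out.println(body.toString());   // placeholder for real UI output
                      in.close();
                  }
                  conn.close();
              } catch (Exception e) {
                  e.printStackTrace();
              }
              notifyDestroyed();
          }

          protected void pauseApp() { }

          protected void destroyApp(boolean unconditional) { }
      }

    Because MIDP guarantees only HTTP, anything beyond it falls back on the optional or vendor-specific APIs the article mentions.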

  • "Genoa II: Man and Machine Thinking as One"
    Computerworld (09/01/03) Vol. 31, No. 41, P. 22; Verton, Dan

    The Defense Advanced Research Projects Agency's (DARPA) $54 million Genoa II project focuses on information technology that could enable humans and computers to think in unison in real time so they can predict and forestall terrorist threats, as stated in official program documents. Though Genoa II's future is in question because of the part it plays in DARPA's Terrorism Information Awareness program, private-sector researchers point to other significant developments in the field of cognitive computing. Research into cognitive machine intelligence, Bayesian inference networks, associative memory, and biologically inspired algorithms shares the overarching objective of making computers capable of reducing complex input in the same way the human brain does. Melanie Mitchell of Oregon Health & Science University's School of Science and Engineering explains that mimicking biological systems is one way to boost the intelligence of computers, and notes that genetic algorithms that allow programs to evolve answers to problems (a minimal generic illustration appears below) and security applications based on the human immune system are areas of continuing study. Inference networks, such as an Autonomy program that combines Bayesian statistics and Claude Shannon's information theory, are an example of breakthroughs that could be turned into practical applications and products within the next year or two. "We're able to produce an algorithm that says here are the patterns that exist, here are the important patterns that exist, here are the patterns that contextually surround the data, and as new data enters the stream, we're able to build associative relationships to learn more as more data is digested by the system," explains Autonomy technology director Ron Kolb. He adds that computers of the future will be able to ascertain when several intelligence analysts are working on the same problem and connect them and their data automatically. Meanwhile, A4Vision CEO Grant Evans foresees cognitive computers that can monitor people via 3D avatars.
    Click Here to View Full Article
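
    As a generic illustration of the genetic-algorithm idea Mitchell mentions (it has no connection to Genoa II or to any DARPA code), the Java sketch below evolves 32-bit strings toward the all-ones string; the population size, mutation rate, and toy fitness function are arbitrary choices.

      import java.util.Arrays;
      import java.util.Random;

      // Generic, minimal genetic algorithm: evolve 32-bit strings toward all
      // ones ("one-max"). Purely illustrative of the technique.
      public class OneMaxGa {
          static final int LEN = 32, POP = 40, GENERATIONS = 60;
          static final Random RNG = new Random(42);

          static int fitness(boolean[] genome) {
              int f = 0;
              for (boolean bit : genome) if (bit) f++;
              return f;                                  // number of ones (higher is better)
          }

          static boolean[] tournament(boolean[][] pop) {
              boolean[] a = pop[RNG.nextInt(POP)], b = pop[RNG.nextInt(POP)];
              return fitness(a) >= fitness(b) ? a : b;   // binary tournament selection
          }

          public static void main(String[] args) {
              boolean[][] pop = new boolean[POP][LEN];
              for (boolean[] g : pop)
                  for (int i = 0; i < LEN; i++) g[i] = RNG.nextBoolean();

              for (int gen = 0; gen < GENERATIONS; gen++) {
                  boolean[][] next = new boolean[POP][LEN];
                  for (int k = 0; k < POP; k++) {
                      boolean[] p1 = tournament(pop), p2 = tournament(pop);
                      int cut = RNG.nextInt(LEN);                          // one-point crossover
                      for (int i = 0; i < LEN; i++) {
                          next[k][i] = i < cut ? p1[i] : p2[i];
                          if (RNG.nextInt(100) < 2) next[k][i] = !next[k][i];  // 2% mutation
                      }
                  }
                  pop = next;
              }
              int best = Arrays.stream(pop).mapToInt(OneMaxGa::fitness).max().getAsInt();
              System.out.println("best fitness after " + GENERATIONS + " generations: "
                                 + best + " / " + LEN);
          }
      }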

  • "'Conversational'" Isn't Always What You Think It Is"
    Speech Technology (08/03) Vol. 8, No. 4, P. 16; Byrne, Dr. Bill

    Dr. Bill Byrne of Stanford University argues that, while "conversational" speech interfaces should boost usability overall, their true range of applications is limited by designers' tendency to equate "conversational" with the type of exchange a person expects to have with a customer-service call center agent who has no previous relationship with the caller. This hurdle will have to be overcome if the technology is to penetrate the enterprise; the interface must support a conversational style suited to both the task at hand and the expected relationship between the caller and the virtual agent. Byrne observes that the most efficient conversational speech interfaces may seem brusque and even rude when taken out of context. Imbuing conversational interfaces with "personality" can be another barrier to usability: Byrne cites Stanford University's Byron Reeves, who recently stated that "Personality [at least on the street] usually means 'a lot' of personality. That often results in over-the-top interfaces that can overdo what real people [even those with great personality] would do in similar face-to-face encounters." Byrne reports that designers often make the mistake of adding bells and whistles--witty turns of phrase, for example--that may impress first-time callers but become irritating with frequent use. The challenge lies in balancing the conversational interface's dynamic quality against its usability. Byrne suggests that history trackers be embedded into the code or the application design tool so prompts can be adjusted to the level of user familiarity (a simple sketch of this idea appears below), and adds that designers must realize that basic design credos do not apply to every call situation.
    Click Here to View Full Article
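
    A simple Java sketch of the history-tracker suggestion follows; the class name, thresholds, and prompt wording are invented for illustration and are not drawn from any particular speech platform.

      import java.util.HashMap;
      import java.util.Map;

      // Illustrative "history tracker" for tapered prompts: verbose wording for
      // first-time callers, terser wording as familiarity grows.
      public class PromptHistorySketch {
          private final Map<String, Integer> callCounts = new HashMap<>();

          String prompt(String callerId) {
              int calls = callCounts.merge(callerId, 1, Integer::sum);
              if (calls <= 2) {
                  return "Welcome! You can say 'balance', 'transfer', or 'agent'. "
                       + "For example, say 'balance' to hear your account balance.";
              } else if (calls <= 10) {
                  return "Say 'balance', 'transfer', or 'agent'.";
              }
              return "Balance, transfer, or agent?";     // terse prompt for frequent callers
          }

          public static void main(String[] args) {
              PromptHistorySketch ivr = new PromptHistorySketch();
              for (int i = 0; i < 3; i++) {
                  System.out.println(ivr.prompt("caller-555-0100"));
              }
          }
      }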
