ACM TechNews sponsored by Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 7, Issue 745:  Monday, January 24, 2005

  • "New Group Will Promote Grid Computing for Business"
    New York Times (01/24/05) P. C7; Lohr, Steve

    Technology companies including Hewlett-Packard, IBM, Intel, and Sun Microsystems intend to announce today the formation of the Globus Consortium, a group whose mission is to accelerate corporate adoption of grid computing by developing appropriate software tools and coordinating educational efforts. IBM's Ken King says grid computing is following the same trajectory as the Internet and Linux open-source software, in that it is migrating from government and academic labs into the mainstream commercial sector. "The consortium is a corporate vote of confidence for and commitment to Globus software," comments University of Chicago professor and Argonne National Laboratory researcher Ian Foster, who co-created the basic standards for the Globus software toolkit. Fellow Argonne scientist Greg Nawrocki, who will serve as president of the Globus Consortium, says he wants to boost the group's membership. Microsoft's past financial contributions to the Globus project make it a potential member, although the Globus software's open-source nature and operating-system agnosticism may be at odds with Microsoft's business strategy. Analysts expect that a major challenge for the consortium will be effectively educating corporations about grid computing and its applications. At the heart of grid computing is virtualization, in which intelligent software pools the computing resources of many machines to execute a single task.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Reaching CONSENSUS on 3G Applications"
    IST Results (01/24/05)

    The aim of the Consensus project is to find a suitable common markup language for third-generation (3G) mobile devices without sacrificing usability. Usability guidelines were developed out of rigorous user analyses and combined with the definition of targeted user groups in order to compare existing markup languages and find the most appropriate Consensus candidate. Reinhard Sefelin with the Center for Usability Research & Engineering says the project's first challenge involved the creation of an authoring language that developers could accept while retaining enough functionality to support content adaptation; the resulting Renderer Independent Markup Language (RIML), which was based on the XHTML 2.0 and XForms general-purpose markup languages, combined familiarity for Web authors with additional single-authoring capabilities. Consensus project coordinator Dr. Thomas Ziegert of SAP Research explains that "RIML allows an author to write content only once in a standardized format based on the Extensible Hypertext Mark-up Language [XHTML] and XFORMS. The content is then automatically adapted for different end-user terminals including voice." A reverse proxy prototype that allowed RIML to be used without user installation was then deployed, and the final step was a field trial with real users and a real corporate application to validate RIML and the prototype architecture. A prototype application platform supporting an array of mobile devices has been implemented by Consensus and released as open source under the Common Public License; it is available for download on the project Web site. Ziegert expects the Consensus findings to be incorporated into worldwide standards.
    Click Here to View Full Article

  • "Blind Engineering Student 'Reads' Color-Scaled Weather Maps Using Cornell Software That Converts Color Into Sound"
    Cornell News (01/21/05); Oberst, Thomas

    Software developed by a team of Cornell researchers enables engineering student Victor Wong, who has been blind since age seven, to perceive color-scaled weather maps by translating color into sound. Wong co-developed the software with undergraduate student Ankur Moitra and research associate James Ferwerda so that he could read maps of the Earth's upper atmosphere as part of his doctoral work for professor Mike Kelley in Cornell's Department of Electrical and Computer Engineering. Ferwerda says the project was accomplished on a shoestring budget, and he is readying a proposal to the National Science Foundation to fund further research. Critical to Wong's work is his ability to read tiny changes in an image and ascertain the numerical values of the pixels in order to develop mathematical models corresponding to the image. Last summer, Moitra developed a Java program that converts images into sound, then built a program that translates pixels of different colors into corresponding piano notes; blue is represented by the lowest notes, while the highest notes represent red. The software can also read aloud the numerical values of the x and y coordinates, as well as the value corresponding to a color at any given point on the image. Wong tested the software using a rectangular Wacom tablet and stylus to explore images. A major challenge is determining boundary lines within images, a problem that the researchers first attempted to solve by printing the lines in Braille and laying the sheet over the tablet; they are also developing software that can detect major boundaries for printing, as well as eliminate the time delay between notes.
    Click Here to View Full Article
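    The pixel-to-note mapping described above (blue at the bottom of the piano's range, red at the top) can be sketched in a few lines. This is an illustrative reconstruction, not the Cornell team's actual code, which was written in Java; the `pixel_to_note` function and the hue-based mapping are assumptions for the example.

```python
import colorsys

# Hypothetical sketch of the color-to-pitch idea: blue hues map to the
# lowest piano notes and red hues to the highest. Note numbers follow
# the MIDI convention (21 = A0, 108 = C8, the full 88-key range).
LOW_NOTE, HIGH_NOTE = 21, 108

def pixel_to_note(r, g, b):
    """Map an RGB pixel (0-255 channels) to a MIDI note number."""
    h, _, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    # Hue runs red (0.0) -> green -> blue (~0.67); invert so blue is low.
    blue_hue = 2.0 / 3.0
    t = max(0.0, min(1.0, 1.0 - h / blue_hue))
    return LOW_NOTE + round(t * (HIGH_NOTE - LOW_NOTE))

print(pixel_to_note(0, 0, 255))  # pure blue -> lowest note (21)
print(pixel_to_note(255, 0, 0))  # pure red  -> highest note (108)
```

    Because every hue lands somewhere on the 88-key range, sweeping a stylus across a color gradient would produce a continuous rise or fall in pitch.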

  • "Machine Learns Games 'Like a Human'"
    New Scientist (01/24/05); Knight, Will

    Researchers at Britain's University of Leeds have developed a computer system that uses observation and mimicry, as humans do, to teach itself to play the children's game "scissors, paper, stone." The system, dubbed CogVis, constructs its own "hypotheses" about the rules of the game by studying video and audio input of human players for specific patterns. The system watched people playing the game with cards marked with scissors, a piece of paper, or a stone; the players were instructed to announce when they won or when the game ended in a draw. After several hours of observation, CogVis was able to successfully call the outcome of each game. CogVis team member Chris Needham explains that the system's visual processor deconstructs action into periods of movement and inaction, and then distills color- and texture-based features; the addition of audio allows the system to formulate theories about the game's rules via inductive logic programming. CogVis was demonstrated in December at an event sponsored by the British Computer Society, and won the prize for Progress Towards Machine Intelligence. "A system that can observe events in an unknown scenario, learn and participate just as a child would is almost the Holy Grail of AI," notes the University of Leeds' Derek Magee. Portsmouth University researcher Max Bramer thinks machines could one day use CogVis technology to learn to control maintenance robots or spot intruders on video, while Imperial College London AI expert Stephen Muggleton says enabling the system to learn more complicated games such as tic-tac-toe will be a major challenge.
    Click Here to View Full Article
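    The learning step described above, observing rounds and their announced outcomes and then inferring the game's rules, can be illustrated with a toy sketch. This is not the CogVis system itself, which learns from raw video and audio via inductive logic programming; the sketch assumes the cards and outcomes have already been recognized, and the function names are invented for the example.

```python
def induce_rules(observations):
    """Collect a 'beats' relation from (card_a, card_b, outcome) tuples,
    where outcome is the announced result for player A."""
    beats = set()
    for a, b, outcome in observations:
        if outcome == "win":
            beats.add((a, b))
        elif outcome == "lose":
            beats.add((b, a))
        # draws carry no ordering information
    return beats

def predict(beats, a, b):
    """Call the outcome of a new round using the induced relation."""
    if a == b:
        return "draw"
    if (a, b) in beats:
        return "win"
    if (b, a) in beats:
        return "lose"
    return "unknown"  # not enough observations yet

observed = [
    ("scissors", "paper", "win"),
    ("paper", "stone", "win"),
    ("stone", "scissors", "win"),
    ("stone", "stone", "draw"),
]
rules = induce_rules(observed)
print(predict(rules, "paper", "scissors"))  # -> lose
```

    As in the demonstration, the system never needs the rules stated in advance; a few observed rounds are enough to call the outcome of every later game.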

  • "Softbots Stride Forward"
    Computerworld Australia (01/24/05); McBride, Siobhan

    The rapid progress of intelligent agent, or "softbot," technology has scientists convinced that people who do not have time to participate in conferences or perform business activities will be able to employ computer-generated avatars that act on their behalf within a decade. Swinburne University researchers are working toward this vision through a $1.5 million project funded by private industry and the Department of Education, Science, and Training; the project is a component of an Australian-European Union Consortium's initiative to develop future service-oriented computing systems. The 21-partner consortium, of which Swinburne is the only non-EU member, convenes leading researchers in the fields of enterprise systems, telecommunications, and telematics services. The end goal of the two-year project is to create agents that automate the Internet-based interchange and formation of software and services, including programs that manage supply, distribution, and sales. These systems promise to be interoperable, secure, and high-quality. Swinburne professor Jun Han is leading an initiative to gauge the security characteristics of software services and service-based systems by constructing a security architecture to deal with the increased potential for break-ins that is an inevitable consequence of the service-oriented computing revolution. Meanwhile, Swinburne professor Ryszard Kowalczyk's team is focused on making softbot agents capable of negotiating and "orchestrating" complex services with their counterparts over the Internet by designing them to adaptively respond to environmental changes, evaluate risks, and learn via experience.
    Click Here to View Full Article

  • "Hitting the Gone Button"
    Los Angeles Times (01/21/05) P. A1; Pham, Alex

    Consumer electronics firms are adopting recycle-friendly design in order to ease compliance with government regulations and adjust to changing realities in the electronics market. In many cases, falling prices and fast-changing technology make it easier and cheaper to replace electronic devices than to repair them. Panasonic plans to eliminate all hazardous materials from its electronics, including mercury and brominated flame retardants, says Matsushita environmental affairs director David Thompson: For example, the $400 Panasonic SD Video Camera uses none of those materials, uses recyclable lithium batteries, and has an aluminum casing that is easier to recycle than plastic. The amount of electronics Americans accumulate is staggering, with an estimated three-quarters of home PCs unused and stored away because owners do not know what to do with them. Los Angeles has rules against dumping CRT monitors or old TVs, while the European Union plans a ban on hazardous materials in electronics. In the United States, electronics make up 70 percent of hazardous waste, but less than 4 percent of solid waste. PC makers are keen to reduce the recycling burden by standardizing plastic materials, eliminating paint, reducing the number of parts, and making systems easy to dismantle. Recycling companies such as Silicon Salvage in Anaheim, Calif., tear apart old computers for recycling, sorting metal components and packing CD-ROMs and hard drives for reuse in Pakistan, Sri Lanka, and India; Dell is moving quickly to LCD displays partly because they are easier to ship to recycling centers than bulky CRT monitors. Recycling proponent and author William McDonough says electronics companies should be pushed not only to encourage recycling, but also to reuse materials themselves to make new products.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "A Virus Writer Tests the Limits in Cellphones"
    New York Times (01/24/05) P. C1; Zeller Jr., Tom

    Brazilian software developer Marcos Velasco has launched the cell phone virus age with a worm that passes itself from device to device via Bluetooth connections and other uploads. Security researchers disapprove of Velasco's efforts because he offers the virus code for free download from his Web site, along with code updates to inoculate cell phones against infection; though Velasco says his work is sharing knowledge, people in the security software and cell phone industry worry that anonymous individuals could release the Velasco virus into the small but growing population of Bluetooth-enabled smart devices. The Velasco virus improves on a previous release from virus-writing group 29A, allowing multiple uploads per boot-up and building in the ability to infect system files so that the worm is also spread through memory cards and wireline links. It carries no malicious payload, targets only Symbian devices, and still requires users to approve its download; security software firms rate the virus as a low threat for these reasons. The software platform market for smart devices is still very fragmented, but is quickly moving toward uniformity even as more devices are shipped. Symbian software is consolidating market share with half of all shipments in the fourth quarter of 2004, while PalmSource and Symbian both recently announced their participation in the Open Mobile Terminal Platform group for increased interoperability. Eventually, a more uniform software environment and larger user base will create a ripe target for virus writers and other cybercriminals who could potentially leverage the cellular broadcasting system to download malware en masse. Organizations need to begin developing strategies to protect themselves and their users against cell phone viruses in 2005, even though such viruses have yet to become widespread, says Gartner security analyst John Pescatore.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Whither the ITU?"
    Red Herring (01/24/05)

    Shifting oversight of the Internet to the Geneva-based International Telecommunication Union (ITU) or another organization under the United Nations is expected to be the subject of a working group decision sometime before December 2005. The U.N. agency made its intentions known during the Geneva U.N. meeting in December 2003, and the matter became more contentious when the ITU proposed that a portion of the address space of the next-generation Internet addressing protocol, Internet Protocol Version 6 (IPv6), be allocated to the ITU, which wants a central role in next-generation networks. ICANN is currently responsible for IPv6 addressing. The ITU, which gives each of its 188 member countries a single vote, says it would offer a more democratic alternative to ICANN. Developing countries appear to favor the ITU, and according to Robert Shaw, the union's Internet strategy and policy advisor, "the Ciscos, the Junipers, the big telcos and Internet Service providers--they all want to do this work at the ITU." However, opposition to the plan has come from the Internet Governance Task Force of Japan and regional Internet registries, among other interests.
    Click Here to View Full Article

  • "CCT Officially Launches Their Newest Cluster Supercomputer"
    HPC Wire (01/20/05)

    LSU's Center for Computation & Technology (CCT) has a new high performance computer in Nemeaux, which was officially unveiled this week. Co-sponsored by Apple and CCT, Nemeaux is a cluster of 24 Apple Xserve G5 machines currently serving as a test-bed for the research center's Cactus Framework and Apple's Xgrid, both grid computing applications; the cluster is helping researchers develop tools for distributed audio rendering so they can analyze, process, and generate sound. CCT purchased Nemeaux with a $114,000 grant from the Louisiana Board of Regents, and will use the supercomputer to conduct computational arts research. The computer cluster will aid in video composition and animated rendering, and will assist in other computational areas involving music, film, video, and art. In addition to computational arts research, Nemeaux will be used for scientific computing involving numerical relativity, fluid dynamics, and scientific visualization. Nemeaux will also be combined with CCT's two other high performance computers to create a supercomputing cluster that can address other tasks and disciplines.
    Click Here to View Full Article

  • "Linux Server Attacks Declining"
    Techworld (01/20/05); Broersma, Matthew

    The Honeynet Project, which consisted of 12 honeynets in eight countries, determined that unpatched Linux systems last about three months on average on the Internet before being compromised, with one system lasting as long as nine months. By comparison, an Internet Storm Center project focusing on Windows-based computers measures survival time in minutes, with most systems lasting an average of 55 minutes in 2003 and just under 20 minutes in late 2004. The life expectancy of Linux systems has improved significantly since 2001-2002, when an unpatched Linux system lasted an average of 72 hours before being compromised. Honeynet researchers believe attackers prefer targeting Windows systems due to their prevalence and ease of attack. In fact, attacking end users, such as personal computers and small-business computers, is more lucrative than attacking highly secured banks. Most of the compromised Honeynet Project computers were used by attackers for Internet Relay Chat bouncing and hosting phishing schemes, with one attacker attempting to establish a fake banking Web site in order to collect data from unsuspecting victims. Honeynet Project President Lance Spitzner notes that high-value Linux systems such as company Web servers are still targets for attackers, because the potential payoffs are significant. He says such systems "are prime targets and are attacked every day, if not every hour. If vulnerable, they would be hacked very soon." The Honeynet Project also noted that default Linux installations are increasingly secure, while older Linux systems are easier to hack.
    Click Here to View Full Article

  • "Gadget Growth Fuels Eco Concerns"
    BBC News (01/20/05); Twist, Jo

    Though national and international efforts to recycle old electronics are making strides in terms of corporate and consumer awareness and responsibility, the EPA in the United States says much more needs to be done--not just in terms of recycling, but in increasing the energy efficiency of products, particularly power adapters. Adapters use out-of-date technology that consumes more energy than is needed to run a gadget, and the EPA estimates that adapters will account for more than 40 percent of electricity used in American households if current trends continue. To counter this, the EPA introduced new Energy Star guidelines for developing adapters that are about 35 percent more efficient at the recent Consumer Electronics Show (CES). CES also set the stage for eBay's unveiling of its Rethink initiative, a collaborative PC recycling promotion venture between leading tech firms, environmental groups, government agencies, and eBay customers. Mobile devices are another source of electronic waste, and CES served as a launch pad for RIPMobile, a project to make recycling "learned behavior" among young people, according to RIPMobile's Seth Heine. "This system allows for the transformation of a drawer full of unused mobile phones into anything from music to clothes to electronics or games," he notes. Electrical manufacturers in Europe will become responsible for recycling returned products when the European Union's Waste Electronic and Electrical Equipment directive goes into effect in August. Meanwhile, hazardous materials such as lead, chromium, and cadmium will be disallowed in all products in the EU by next year.
    Click Here to View Full Article

  • "Federal Role in Ensuring Cybersecurity Isn't Clear"
    InformationWeek (01/20/05); Greenemeier, Larry

    When President Bush submits his 2006 budget to Congress next month, it will incorporate funding for all federal agencies to implement necessary cybersecurity technology. The budget will not specify exactly what technology is required because such requirements could quickly become obsolete, according to the White House Office of Management and Budget. The Cyber Security Industry Alliance and others are calling for the creation of a position within the Department of Homeland Security to oversee cybersecurity in both the public and private sectors. However, Congress did not agree and recently removed such a position from consideration. Bush has provided much leadership in the development of cybersecurity for government agencies, but he needs to focus more on private industry in order to ensure overall U.S. infrastructure protection, says Cyber Security Industry Alliance executive director and former White House Homeland Security Council critical infrastructure protection senior director Paul Kurtz.
    Click Here to View Full Article

  • "Preventing a New World Internet Order"
    CircleID (01/18/05); Levinson, Bruce

    The United Nations Educational, Scientific and Cultural Organization (UNESCO) will hold a conference on "Freedom of Expression in Cyberspace" in Paris from February 3-4, 2005. This conference should give plenty of fuel to the argument that the U.N. has no business being put in charge of the Internet, writes Bruce Levinson of ICANNfocus. The UNESCO conference will address "whether universal free expression standards should be applied to the Internet and how free expression can be protected while respecting individual privacy, national laws, and cultural differences." The fact that UNESCO is even debating whether free expression is applicable to the Internet sheds light on how UNESCO views the subject. The conference is being held in support of the second part of the U.N.'s World Summit on the Information Society. For years UNESCO has been dogged by suspicions that it endorses censorship. For example, UNESCO once gave its support to a New World Information and Communications Order that called for governments to exert more control over the media, and a human rights group claimed last year that it was being censored by UNESCO. The United States temporarily severed its ties with UNESCO in 1984, due in part to the organization's political stances.
    Click Here to View Full Article

  • "W3C Specifies Architecture for Whole Web"
    SD Times (01/15/05) No. 118, P. 1; Lee, Yvonne L.

    The World Wide Web Consortium's (W3C) Technical Architecture Group has issued a definitive document on how the Web should operate, especially how to identify resources, convey information about those resources, and interact with online elements. Called the Architecture of the World Wide Web Volume One, the document was needed to resolve high-level disagreements between W3C working groups that shared responsibility for some aspects of Web architecture, says W3C communications head and document co-author Ian Jacobs. The Technical Architecture Group comprises eight senior Web technologists, including Tim Berners-Lee, XML co-creator Tim Bray, HTML 2.0 creator Dan Connolly, Uniform Resource Identifier (URI) co-author Roy Fielding, SOAP 1.1 co-author Noah Mendelsohn, Microsoft's Paul Cotton, BEA's David Orchard, and Sun Microsystems' Norm Walsh. Much of the document focuses on URIs, which are more important than other foundational Web technologies, says Jacobs; URIs identify every Web object, such as files, images, or applications, and the inclusion of as many objects as possible increases the value of the Web. HTTP and HTML can be replaced by other technologies, but Jacobs says URIs are more fundamental, while the document's section on interaction also specifically states how to restrict access and handle broken links. Separately, the W3C is also investigating whether to establish a binary XML format, which would prove especially useful for mobile applications where bandwidth, processing capabilities, and power use are of major concern. The W3C working group studying binary XML will have to decide in March whether it is possible to create a binary XML format that can be used in all situations and whether to pursue such a standard.
    Click Here to View Full Article

  • "Small Wonders"
    CIO (01/15/05) Vol. 18, No. 7, P. 74; Dragoon, Alice

    There are a number of relatively small technologies whose returns can potentially exceed their size. Hilton has deployed Xybernaut tablet PCs at seven of its 350-plus hotels: The wearable, easy-to-use PCs communicate wirelessly with local access networks, and can function in tandem with small wearable printers and magnetic stripe readers to enable guest service agents (GSAs) to streamline guest check-in, swipe customers' credit cards, encode room keys, print arrival, deposit, and room number confirmations, and produce itemized folios at checkout. Hilton's Robert Machen reports that equipping GSAs costs less than $5,000 per agent, whereas the potential to build customer loyalty and enhance the company's reputation as a technology innovator is invaluable. Another promising small technology is wireless mesh networks composed of low-power communications nodes, one of the most notable examples being Dust Networks' SmartMesh. SmartMesh marries low power consumption (each node only activates to capture and transmit data on an as-needed basis) with redundancy (information flows are automatically rerouted along alternate paths if a node malfunctions or is impeded) and self-configuration, and can be used to relay commands as well as collect data. Dan Bertocchini, Supervalu's corporate director of energy management, is testing a 19-node SmartMesh network to monitor power usage in a Minneapolis-based grocery store, and he thinks the deployment will be 80 percent less costly than the installation of wired sensors and help improve the efficiency of Supervalu's energy management effort. Other small technologies are being used to assist emergency first-response services: Tachyon Networks' Auto Deploy, which starts at around $18,000, eliminates the need for setting up a redundant T1 line in favor of Internet access via satellite. Tachyon's Quick Deploy product, meanwhile, offers rapid setup of wide-ranging satellite-based data connectivity.
    Click Here to View Full Article

  • "Enemy at the Gates: The Evolution of Network Security"
    Business Communications Review (12/04) Vol. 34, No. 12, P. 14; Wilson, Jeff

    Enterprise network security may be cheaper, simpler, and more readily available, but the threat of security breaches continues to keep pace with new defensive technologies and products as hackers prove more crafty, intelligent, and aggressive than originally assumed. Furthermore, the danger of internal attacks goes largely ignored and network protection becomes exponentially harder as businesses extend access to others in order to reduce costs, boost productivity, and increase revenue. Network security solutions fall into two camps: Standalone security products and network-integrated security products. The former category defines tools purpose-built for security with few networking functions (if any), while the latter category features routers and switches with integrated security capabilities such as firewalls and intrusion detection. One strategy many standalone security product companies have adopted to contend with the internal security threat involves refining existing products or inventing new ones--for example, enterprise deployments of large wireless LANs were accompanied by the emergence of vendors selling WLAN-specific security gateways that offered a combination of authentication, firewalls, and VPN/encryption. The rationale for implementing network-integrated solutions can differ according to the size of the company. Large companies tend to follow a "defense in depth" strategy which reasons that the likelihood of attack prevention and security cost reductions increases as security technology is more widely deployed; small and mid-sized companies, meanwhile, see networked-integrated products as a tool for lowering both capital and operational costs. A recent Infonetics Research study of 240 organizations concludes that most security professionals mix standalone and network-integrated solutions together.
    Click Here to View Full Article

  • "R&D Budget Brings Modest Increases to Most Civilian R&D"
    Physics Today (01/05) Vol. 58, No. 1, P. 24; Dawson, Jim

    Military and homeland security R&D received the largest boosts in funding in the recent congressional budget, continuing a three-year streak. The National Science Foundation (NSF) was shunned after what Office of Science Director Raymond Orbach called a six-year "golden era" for the foundation. The NSF cuts came in spite of President Bush's authorization to double the NSF budget over five years just 19 months earlier. Rep. Vern Ehlers (R-Mich.), a physicist, decried the congressional paring of the NSF budget, pointing out that basic scientific research is the foundation of future innovation and national competitiveness. During the presidential election run-up, former congressman John Porter (R-Ill.) told an American Association for the Advancement of Science meeting that national R&D would inevitably have to bear its share of budgetary pressures. Instead of advocating spending increases, he said, science advocates should concentrate on not taking on more of the burden than is fair. House Committee on Science Democratic staff director Bob Palmer noted that traditional Democratic bastions such as the EPA and NSF received budget cuts, while NASA and other "less liberal" sectors fared better. Nobel laureate Burton Richter, who joined a petition supporting the Kerry campaign, said the scientific community would carry out its duties responsibly despite having supported the opposition during the election run-up. The 2005 budget resulted in a 1.9 percent budget drop for the NSF, a 19.9 percent increase for Department of Homeland Security R&D, a 4.3 percent increase at the Department of Energy's Office of Science, and a 4.5 percent increase for NASA. The Department of Defense saw its budget rise 7.1 percent to a record $70.3 billion, with a 23 percent increase for DARPA's basic research program.
The National Institute of Standards and Technology's budget dropped 0.5 percent, but the figure is misleading because of administration and congressional disagreement about the future of the Advanced Technology Program, which ended up receiving a 24 percent cut.
    Click Here to View Full Article
    (Access to full article is available to paid subscribers only.)

  • "What We Can Learn From Robots"
    Technology Review (01/05) Vol. 108, No. 1, P. 54; Huang, Gregory T.

    Director of Japan's ATR Computational Neuroscience Laboratories Mitsuo Kawato values robots for the insight they can provide into human brain functions, a concept that deviates from the usual economic or assisted-living motivations behind robot development, according to Carnegie Mellon robotics expert Christopher Atkeson. Kawato is convinced that experiments with humanoid robots can yield streamlined simulations of brain-cell behavior that can be compared to the actual workings of neurons in human and primate brains using sophisticated imaging methods. The data from such research could be applied to the creation of therapies for brain damage as well as neurological, cognitive, and behavioral disorders. ATR researchers are employing humanoid robots such as Dynamic Brain (DB) to test neuroscience theories: Kawato and fellow ATR scientist Gordon Cheng believe people use "internal models" to measure connections between neural signals and subsequent body movements, and have extended that hypothesis to DB by having the robot use software to compute what commands will generate the proper series of motions needed to fulfill a certain objective. Determining how big a role a brain's operations play in a robot's execution of tasks is the focus of a project whereby human subjects learning to use an unfamiliar tool are analyzed via magnetic resonance imaging in the hopes that the acquired knowledge will lead to better robots. One of the hoped-for goals of such research is a remote brain-machine interface that will enable the user to participate in geographically distant events. Another objective is to make robots more autonomous, and an $8 million upgrade to bring DB's anatomy, neural architecture, power requirements, and strength to a more human level will be employed to study gait disorders and falls among the elderly. 
Kawato is also urging Japan's government to help fund a global initiative to build a robot that matches a five-year-old child in terms of cognitive and physical ability.

  • "Considerate Computing"
    Scientific American (01/05) Vol. 292, No. 1, P. 54; Gibbs, W. Wayt

    Digital gadgetry's propensity to interrupt users with alerts is not only a source of social embarrassment, but of reduced productivity: Studies support the idea that interruptions in normal routines slow people down and make them more likely to commit errors. "If we could just give our computers and phones some understanding of the limits of human attention and memory, it would make them seem a lot more thoughtful and courteous," notes Microsoft's Eric Horvitz; he is part of a small but expanding group of researchers trying to develop "attentive" systems capable of inferring their owners' whereabouts and activities, weighing the value of the messages they wish to send against the consequences of interruption, and selecting the best time and manner of interjection. A study of human "interruptibility" conducted by Carnegie Mellon University and IBM Research found that truly useful attentive systems must be over 65 percent accurate in detecting when users are close to their cognitive thresholds. The researchers determined that adding microphones to pick up conversations within earshot increased that accuracy to 76 percent, while the detection of mouse movement, keyboard activity, and computer application status raised accuracy to 87 percent. Carnegie Mellon's Scott Hudson recommends that attentive systems analyze but not record input data streams in order to address privacy concerns. Roel Vertegaal of Queen's University in Ontario has used a combination of speech recognition and infrared scanning to make everyday appliances respond to users' vocal commands and shut themselves off when the user's gaze is no longer fixed on them. Another approach to attentive system design is Bayesian networks, which are employed in spam filters and network firewalls to statistically learn a user's preferences in terms of wanted and unwanted messages. 
However, the University of Maryland's Ben Shneiderman reports that more attentive systems are less predictable, and notes that the scientific community has a history of building "smart" technologies that go unused because their operational principles are not easily understood.
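
    The Bayesian filtering approach mentioned above can be illustrated with a tiny naive Bayes classifier over sensor-style features. The feature names and training samples below are invented for the example; real attentive systems learn from far richer signals, such as speech detected nearby, keyboard and mouse activity, and application status.

```python
from collections import defaultdict

# Illustrative sketch only: a minimal naive Bayes classifier of the kind
# attentive systems use to weigh sensor cues against interruptibility.

def train(samples):
    """samples: list of (feature_dict, label). Returns count tables."""
    label_counts = defaultdict(int)
    feature_counts = defaultdict(int)
    for features, label in samples:
        label_counts[label] += 1
        for name, value in features.items():
            feature_counts[(label, name, value)] += 1
    return label_counts, feature_counts

def classify(model, features):
    """Pick the label with the highest naive Bayes score."""
    label_counts, feature_counts = model
    total = sum(label_counts.values())
    best_label, best_score = None, 0.0
    for label, count in label_counts.items():
        score = count / total  # prior probability of the label
        for name, value in features.items():
            # Laplace smoothing keeps unseen combinations from zeroing out
            score *= (feature_counts[(label, name, value)] + 1) / (count + 2)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

training = [
    ({"speech": True, "typing": True}, "busy"),
    ({"speech": True, "typing": False}, "busy"),
    ({"speech": False, "typing": True}, "busy"),
    ({"speech": False, "typing": False}, "free"),
    ({"speech": False, "typing": False}, "free"),
]
model = train(training)
print(classify(model, {"speech": True, "typing": True}))    # -> busy
print(classify(model, {"speech": False, "typing": False}))  # -> free
```

    The same machinery scales to the spam-filter setting the article mentions: the labels become "wanted" and "unwanted," and the features become properties of the incoming message.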

    [ Archives ]  [ Home ]