Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published every week on Monday, Wednesday, and Friday.

ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either HP or ACM. To send comments, please write to technews@hq.acm.org.
Volume 5, Issue 583:  Monday, December 15, 2003

  • "Criticism of Electronic Voting Machines' Security Is Mounting"
    IDG News Service (12/12/03); Heichler, Elizabeth

    Controversy continues to simmer over the security of direct recording electronic (DRE) voting machines being deployed or planned for deployment in various U.S. states. Spurring these installations is the Help America Vote Act of 2002, under which the government is offering up to $3.8 billion in funding to states that transition from punch-card and lever election equipment to e-voting systems by January 2006. Doubts about the security and reliability of such systems are being fueled by evidence of malfunctions as well as evaluations of DRE machines manufactured by Diebold, Election Systems & Software, Sequoia Voting Systems, and Hart InterCivic that uncovered high-risk vulnerabilities. Compuware, for example, found security issues with both Hart and Sequoia e-voting systems in Ohio; Hart Chairman David Hart reported that his company will be able to easily accommodate the changes the Compuware report calls for, while Sequoia's Alfie Charles declared that many of the changes Compuware suggested have already been made. Six DRE machine vendors have joined forces to found the Election Technology Council, an organization that Hart said wants to counteract some of the "misinformation" being spread about the voting machine industry, as well as deal with ethical and security-related issues. One of the biggest critics of DRE security is Seattle software developer Erik Nilsson, who is convinced that a printed audit trail is still necessary to guarantee safe elections. Kennesaw State University professor Brit Williams, on the other hand, believes a paper audit trail is unnecessary, and insists that electronic voting machines are trustworthy enough when combined with physical, legal, and procedural safeguards. Ted Selker of MIT's Media Lab claims to have developed a secure voting scheme that employs multiple redundant software elements, allowing voter verification without a paper trail; he further suggests that many lost votes can be traced to voter-registration problems, and recommends that IT professionals devote more research to that area.
    Click Here to View Full Article

    For more information on ACM's activities related to e-voting, visit http://www.acm.org/usacm/Issues/EVoting.htm.

  • "In India, a High-Tech Outpost for U.S. Patents"
    New York Times (12/15/03) P. C4; Rai, Saritha

    U.S. technology companies are increasingly investing in Indian research and development operations, and personnel experts expect the number of engineers in India employed by multinational firms to double in the next 18 months. As thousands of Indian engineering graduates join the local operations of U.S. companies each year, they will contribute to a rapidly rising portfolio of U.S. intellectual property originating in India. Intel wireless broadband technology researcher Ajith Prasad says he and his team are doing the same type of work they would if they were located in Santa Clara. Intel is also designing a new 32-bit processor with 1 billion transistors wholly in Bangalore. Intel Technology India President Ketan Sampat says the cost of setting up comparable research groups in the United States would easily double the amount spent in India. Texas Instruments, the first company to invest heavily in Indian research operations nearly two decades ago, has already begun reaping the fruits in the form of 225 U.S. patents awarded to date. A recent design coup from Texas Instruments' Indian engineering group is the fastest-ever analog-to-digital converter. Although much of the new growth comes from new Indian workers, experts say about 5,000 veteran Indian-born technology workers have returned to the country from stints in the United States; many of these experienced workers head up Indian operations for their U.S. employers and help bridge culture gaps. Intel's Sampat acknowledges that the Indian scene is still at a nascent stage, however: "The ecosystem of design tools, silicon design, systems design is not completely formed yet," he says.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "NASA Takes the Internet Into Space"
    TechNewsWorld (12/13/03); Halperin, David

    One of the experiments carried out on the ill-fated Columbia mission in early 2003 could pave the way for Internet communications in space: The Communications and Navigation Demonstration on Shuttle (CANDOS) project employed a low-power transceiver (LPT) on the Columbia to communicate with bases on Earth, while the LPT's accompanying antennas were also put through their paces. The network used for shuttle-to-Earth communications was NASA's IONet, which uses the same protocols as the public Internet while remaining separate from it. CANDOS chief investigator David Israel says the system offers clear cost, simplicity, and reliability benefits--it promises to eliminate the need for custom-made hardware, easing testing and integration. The CANDOS experiment involved some 134 tests and demonstrations using regular IP formats, covering the categories of commanding, file delivery, and telemetry. The User Datagram Protocol was used to control the shuttle from the ground with a telemetry-free "blind command"; the Multicast Dissemination Protocol (MDP) was employed to upload stored file commands; and the signals were secured with the Secure Copy and Secure Shell protocols. Uploading and downloading of files was carried out through MDP and FTP, while artwork and class photos from schools were also uploaded for a time. The groundwork for CANDOS goes back five years, beginning with the deployment of NASA's TDRS satellites and continuing with the launch of the UoSAT-12 satellite; the CHIPSat launch in January 2003 paved the way for the CANDOS project. Israel believes the success of the experiment, despite the tragedy, will enable the more powerful and flexible deployment of tools in space. In the foreseeable future is communication between both manned and unmanned space vehicles and ground-control centers via IP.
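
    The "blind command" idea--firing a command over UDP and expecting no telemetry in reply--is easy to sketch with ordinary sockets. The port number and command payload below are illustrative only, not details from the CANDOS experiment:

```python
import socket

# A telemetry-free "blind command": send a UDP datagram and move on.
# UDP has no handshake and no acknowledgment -- delivery is never
# confirmed by the protocol, which is what makes the command "blind."
# The port and payload here are hypothetical, not CANDOS values.

def send_blind_command(command: bytes, host: str = "127.0.0.1",
                       port: int = 9999) -> None:
    """Ground side: fire-and-forget command uplink."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(command, (host, port))

def receive_command(port: int = 9999, timeout: float = 2.0) -> bytes:
    """Vehicle side: wait for a single command datagram."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("127.0.0.1", port))
        sock.settimeout(timeout)
        data, _sender = sock.recvfrom(1024)
        return data
```

    Because UDP provides no return channel, confirming that the vehicle acted on the command requires a separate telemetry path--which is precisely what a "blind" command does without.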
    Click Here to View Full Article

  • "Renovating E-Mail With Identity in Mind"
    EarthWeb (12/11/03); Parker, Pamela

    Internet firms are beginning to coalesce around a technical solution to spam that would involve setting up an identity verification scheme, possibly tied to a reputation rating. Yahoo! recently proposed a system called DomainKeys that would attach digital signatures generated by private keys to every email, allowing recipients to verify the signatures with public keys stored on a public server; Yahoo! is in discussions with major ISPs and email providers, and is expected to provide more details about DomainKeys soon. Another proposal comes from email infrastructure firm IronPort Systems, which is in partnership with the Network Advertising Initiative's Email Service Provider Coalition (ESPC). IronPort proposed SMTPi (Simple Mail Transfer Protocol with identity features added), a scheme that matches the last sending server's IP address against a registry whitelist; spammers can forge every part of the email header except the IP address of the last server used to route the message. Similar to DomainKeys, SMTPi would also insert digital identity certificates, generated with a private key, into the email header, which would then be used to validate the message sender. SMTPi is the first practical step in a more ambitious ESPC proposal called Project Lumos, first fielded in September; under that system, IronPort would become a "federated registry" used to record the identity and reputation of email senders. ESPC technology committee co-chair Margaret Olson says Yahoo!'s DomainKeys system is "completely consistent" with her group's plan and that the entire industry is quickly converging on this type of technical solution. She says the road forward will see several more proposals and perhaps some balkanization among standards, but that eventually common interests will converge around the same protocols.
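
    The sign-with-a-private-key, verify-with-a-public-key flow that DomainKeys relies on can be illustrated with a deliberately tiny textbook RSA keypair. Real systems use full-strength keys and standardized formats; the numbers and helper names here are illustrative only, not the DomainKeys API:

```python
import hashlib

# Toy textbook RSA keypair (hopelessly small -- illustration only).
# p = 61, q = 53  =>  N = 3233, phi = 3120, public e = 17, private d = 2753.
N, E, D = 3233, 17, 2753

def digest(message: bytes) -> int:
    """Hash the message, reduced into the toy key's range."""
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % N

def sign(message: bytes) -> int:
    """Sender: sign the digest with the PRIVATE exponent d."""
    return pow(digest(message), D, N)

def verify(message: bytes, signature: int) -> bool:
    """Recipient: recover the digest with the PUBLIC exponent e."""
    return pow(signature, E, N) == digest(message)
```

    A recipient holding only the public pair (N, E) can check that the signature was produced by whoever holds D--the same check a DomainKeys verifier would perform with a public key fetched from the sending domain's server.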
    Click Here to View Full Article

  • "High Tech's Government Dilemma"
    CNet (12/12/03); Kanellos, Michael

    Michael Kanellos writes that a dichotomy about the government pervades the high-tech community: Some technologists and CEOs view federal legislators as bunglers who ruin innovation and promising new technologies because they fail to understand the high-tech industry, while others see the government as a nurturer of innovative technologies for public use. The dilemma is that the same officials accused of hurting the high-tech industry are also receiving kudos for helping it. Kanellos notes that Sen. Joe Lieberman (D-Conn.), who recommended that American companies be granted research funds and tax cuts, has also suggested limiting such companies' authority to transfer operations overseas, which many firms view as a key stratagem for remaining competitive. Resolving this dichotomy will be important for the U.S. tech industry, as other countries aim to extend their economic influence through government-mandated tax incentives and university funding; during a press conference at a Semiconductor Industry Association meeting last month, Intel CEO Craig Barrett declared, "Competition for jobs is going to be the key battle in the next several decades. Ultimately, we are all in this together." Kanellos points out that the public is not exactly greeting the prospect of tax breaks and other government-approved perks for corporations warmly, given its difficulty in seeing how such benefits will trickle down into communities, many of whose residents are currently unemployed. He adds that many in the tech industry have yet to face the fact that there has been a profound shift over the last 40 to 50 years away from the Cold War-era race to beat the Soviets' technology efforts. Barrett noted at the press conference that federal and state governments in the United States will have to make infrastructure investments and boost funding for elementary and secondary education.
    Click Here to View Full Article

  • "China Tries to Establish Homegrown Tech Rules"
    SiliconValley.com (12/14/03); Gillmor, Dan

    China is attempting to break away from its reliance on Western technology standards--and the royalties Chinese companies must pay--by establishing domestic tech specifications in diverse areas, much to the chagrin of American and other Western companies that rail at its protectionist motivations. Dan Gillmor writes, "With the largest domestic market on the planet, at least potentially, plus an increasingly creative and well-educated workforce, China is creating its own competitive set of standards for its own market, although the global potential is obvious." In early December, the Chinese government called for the establishment of a wireless communications encryption standard to eliminate the nation's dependence on Wi-Fi, whose security specs are less than sterling--and though Gillmor expects global trade rules to scale back China's success in this area, he admits the move could be a boon to local industry. The Chinese government, in partnership with software firms, academic institutions, and even other East Asian countries, is making major investments in the adoption of the Linux open-source operating system and the rollout of Linux-related applications. Meanwhile, last month China declared its plans to compete with the DVD standard through the rollout of Enhanced Versatile Disk (EVD) technology, which promises to save Chinese businesses huge royalty payments to Western standard owners and give Chinese video-machine manufacturers a shot in the arm. Other Chinese efforts to build a homegrown technology base include processors such as the Dragon Chip, which runs Linux, and the next-generation TD-SCDMA mobile phone standard. Gillmor comments that, ironically enough, "the U.S. hasn't been shy about blatantly illegal protectionist measures, either, when the political circumstances deemed to warrant them."
    Click Here to View Full Article

  • "Will Open Source Be Forced to Go Proprietary?"
    Enterprise Linux IT (12/11/03); Maguire, James

    Open source's mainstream adoption requires interoperability with proprietary applications, and such a development is taking place; as this trend continues, there are worries that open source could be forced to abandon its openness in response to competitive pressures. Forrester analyst Ted Schadler notes that the wide availability of open-source code allows interoperability to be created by building new components atop existing elements, while Meta Group analyst Thomas Murphy points out that email, version control, and other standard interfaces are used to integrate open-source and proprietary technologies. Aberdeen Group analyst Bill Claybrook says open-source applications can certainly be tweaked to run on Windows, Unix, and other proprietary environments, provided there is enough incentive; however, he notes a lack of interest in such a development. International Data analyst Dan Kusnetzky says that certain applications have a greater likelihood than others of becoming compatible: For instance, applications for specialized uses such as real-time process control are likely to remain proprietary, given the limited number of people knowledgeable in such areas. He adds that "where the knowledge of things is broad-based, those are potentially open-source projects" and thus more likely to boast interoperability. Schadler is doubtful that open source will move to a more proprietary model--he is confident that the General Public License is a strong obstacle to such a development. Murphy agrees that "in general, the pressure will be on Linux vendors to maintain the 'purity'" of the open-source model.
    Click Here to View Full Article

  • "A New Standard for Fabric Intelligence"
    EarthWeb (12/10/03); Clark, Tom

    The multi-vendor fabric application interface standard (FAIS) promises to allow enhanced storage services that are simple to manage from the user's standpoint. Virtualization and other services using FAIS hide complexity from the user and embed high-value functions in the network fabric itself, moving SAN technology from simple connectivity to intelligence. Storage network vendors understand the value of FAIS standardization, which will open up new market opportunities for them and save their traditionally high-margin business from commoditization. With FAIS, vendors will be able to craft more comprehensive solutions of greater value to the customer. Technically, FAIS involves separating the control and data paths, where data paths appear as an abstraction apart from physical storage elements. For normal SCSI transactions, fabric switches will be able to represent multiple storage arrays as a single storage pool. Back-end discovery and configuration will require virtualization software and a method for resolving errors and exceptions between data and control paths. In terms of actual products, some early offerings involve bladed PCs fitted into a switch chassis and silicon solutions that are smaller in footprint. Benefits of FAIS-enabled systems are tremendous, and include storage virtualization, data replication between heterogeneous systems, snapshots, data journaling, and the ability to focus on applications instead of infrastructure; complications, however, include increased back-end complexity, while fabric vendors will need to differentiate their products based on reliability and smooth operation on the back-end.
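
    The pooling idea--the fabric presenting several physical arrays as one logical address space--comes down to an address map between logical blocks and (array, local block) pairs. The toy class below is a hypothetical illustration of that mapping, not the FAIS interface itself:

```python
# Sketch of fabric-based storage virtualization: one logical address
# space mapped across several backing arrays. Purely illustrative --
# the real FAIS work happens in switch firmware, not a Python class.

class VirtualPool:
    def __init__(self, array_sizes):
        """array_sizes: number of blocks on each physical array."""
        self.arrays = [bytearray(size) for size in array_sizes]
        # Logical->physical map: logical block i -> (array index, local block).
        self.mapping = []
        for idx, size in enumerate(array_sizes):
            self.mapping.extend((idx, blk) for blk in range(size))

    @property
    def capacity(self):
        # The host sees one pool whose size is the sum of all arrays.
        return len(self.mapping)

    def write(self, logical_block, value):
        arr, blk = self.mapping[logical_block]
        self.arrays[arr][blk] = value

    def read(self, logical_block):
        arr, blk = self.mapping[logical_block]
        return self.arrays[arr][blk]
```

    A host addressing logical block 120 of a pool built from a 100-block and a 50-block array is transparently routed to block 20 of the second array; resolving errors on those hidden back-end paths is exactly what the separated control path must handle.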
    Click Here to View Full Article

  • "Electronic Voting No Magic Bullet"
    CNN (12/12/03); Walton, Marsha

    Election officials are encountering flaws in high-tech electronic voting equipment, even though many have supported e-voting as a way to get past the voting debacle of the 2000 presidential election. With problems occurring this year in California, Virginia, and Indiana, some observers now say technology should not be expected to solve all problems. In fact, Harvard University computer scientist Rebecca Mercuri questions the lack of action taken in response to the well-publicized problems with e-voting systems. "Officials are not removed from their posts, fired or sent to trial; vendors are not banned from participation; equipment is not recalled; standards are not rewritten; and elections are not re-held," says Mercuri. The National Institute of Standards and Technology is charged with setting the standards for e-voting equipment, but the agency, part of the U.S. Commerce Department, does not have the power to enforce the guidelines. A NIST-sponsored meeting this week titled "Building Trust and Confidence in Voting Systems" brought together computer scientists, voting machine vendors, and election officials to discuss the issue. Vendors maintain their equipment is secure, while election officials say the human side of running an election is equally important. Meanwhile, new legislation in Congress calls for a mandatory paper trail for e-voting systems. Nonetheless, updated standards and equipment are not likely to be in place before the 2004 presidential election.
    Click Here to View Full Article

    See http://www.acm.org/usacm/Issues/EVoting.htm for more information on e-voting.

  • "World ICT Rankings: US Stays at Top, UK Falls to 15th"
    Out-Law.com (12/11/03)

    The World Economic Forum's Global Information Technology Report assesses the impact of information and communications technologies on economies around the world, and its Networked Readiness Index measures how prepared each economy is to benefit from such developments. The United States sits at the top of the index due mainly to its usage of technology in the public and private sectors, and it is also the most innovative, according to the report. Singapore took second place because of its public-private partnerships to promote technology penetration and usage, followed by Finland, Sweden, and Denmark, which have high technology penetration rates. "The use and application of [information and communications technologies] remain one of the most powerful engines for economic growth," says World Economic Forum founder and executive chairman Professor Klaus Schwab. "More than ever, we must all intensify our efforts to enable individuals, businesses and governments to benefit more fully from the use and application of [information and communications technologies]."
    Click Here to View Full Article

  • "Cluster Crunchers Take the Biscuit"
    New Scientist (12/06/03) Vol. 180, No. 2424, P. 28

    According to the latest TOP500 supercomputing speed chart, NEC's Earth Simulator in Japan--a dedicated machine used to simulate climate in unmatched detail--is the most powerful supercomputer on the planet. But just two places below the Earth Simulator is Virginia Tech's X System, a cluster supercomputer composed of 1,100 dual-processor Power Mac G5 desktops built for just a fraction of the cost of comparable dedicated supercomputers. For the first time, the TOP500 list has more cluster machines built from cheap off-the-shelf computers than dedicated machines, and all but three of the 10 leading supercomputers on the list are clusters. Though this demonstrates a clear shift from dedicated supercomputing to cluster supercomputing, the former type of machine is by no means out of the race, as the Earth Simulator's No. 1 status attests. Journal of Supercomputing editor-in-chief Hamid Arabnia notes that dedicated architectures are easier to maintain and less likely to fail than clusters, while their ability to maximize inter-processor communication speeds and memory access makes them better suited to problems that cannot easily be "parallelized" by splitting them into smaller subsets. IBM's Blue Gene supercomputer, which promises to be 10 times faster than the Earth Simulator, is designed as a hybrid of the dedicated and clustered supercomputing approaches. The 131,000-processor, 360-teraflop machine will consist of hundreds of units that deliver 1.4 teraflops each, lashed together into a cluster configuration.
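
    The "parallelized" distinction can be made concrete: a sum of squares splits into independent chunks that need no communication with each other, which is exactly the kind of workload that maps well onto a cluster of cheap machines. In this sketch, worker threads merely stand in for cluster nodes:

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    """One 'node' sums its own slice -- no data shared with other nodes."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def clustered_sum_of_squares(n, nodes=4):
    """Split [0, n) into independent chunks, farm them out, combine."""
    step = n // nodes
    chunks = [(k * step, n if k == nodes - 1 else (k + 1) * step)
              for k in range(nodes)]
    with ThreadPoolExecutor(max_workers=nodes) as pool:
        return sum(pool.map(partial_sum, chunks))
```

    Workloads whose chunks must constantly exchange intermediate results pay a steep communication penalty on a cluster, which is why tightly coupled problems such as climate simulation still favor dedicated machines like the Earth Simulator.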

  • "Hard Disks Go Home"
    Economist Technology Quarterly (12/06/03) Vol. 369, No. 8353, P. 33

    Consumer electronic devices that accommodate hard disks are experiencing rapid growth: InStat/MDR estimates that about 9 million such units were sold last year, with almost 90 million expected to be sold by 2007. The increasing suitability of hard drives for consumer electronics is being driven by their growing robustness and shrinking cost and size, and this trend has the potential to pull hard disk manufacturers' sales revenues out of the doldrums. Hard disks are making a significant impact in portable music players, thanks to their enormous storage capacity and lower cost per byte than memory cards. However, solid-state memory offers greater durability and lower power consumption than hard drives, which effectively locks hard disk vendors out of the music player market's lower end. Hard disks are also making a splash in the emerging market for digital video recorders (DVRs), which promise consumers features not available on traditional video recorders, such as the ability to record multiple shows simultaneously, pause or rewind a live TV broadcast, and tape programs according to user preferences. DVRs are still unsuitable for long-term storage and suffer from relatively low levels of user awareness, but emerging hard-drive-equipped handheld video recorders should help address the second problem. Game consoles are another consumer electronics category where hard drives are showing up, but many people question their inclusion, given that most games are distributed on DVD-like disks; Seagate Technology's Rob Pait says hard disks could allow gamers to download new game levels and characters off the Internet. The appeal of hard disks for consumer electronic devices is limited by the size of the drives, which usually measure 2.5 inches to 3.5 inches, though new markets could be penetrated with the emergence of 1-inch hard drives.
    Click Here to View Full Article

  • "Beyond Wi-Fi: A New Wireless Age"
    Business Week (12/15/03) No. 3862, P. 84; Yang, Catherine

    Intelligent network technologies under development at academic, corporate, and military research facilities aim to improve the range, efficiency, and services of Wi-Fi and to give wireless innovators access to the mostly vacant radio spectrum they need to make their inventions flourish. Smart antennas are designed to extend the range of Wi-Fi, currently restricted to a roughly 300-foot radius, by transmitting energy selectively: Vivato, for example, has been able to achieve a transmission range of up to 2.5 miles by clustering over 100 small antennas that compress about 100 milliwatts into thin seven-to-eight-degree beams. Individual antennas send out their own signals on conventional radio waves that collectively cohere into rays configured to reach a specific target, and software allows the antennas to change the rays' shape and direction as the targets change position. Mesh networks consist of wireless devices--cell phones or laptops, for example--that can function as individual base stations or hubs, relaying signals to their nearest neighbors so that transmissions can be efficiently routed to their targets regardless of obstructions. Nokia, Intel, and other companies are experimenting with ad hoc mesh networks as a way to boost the range of Wi-Fi. The military, meanwhile, is testing software-defined radio as the first step toward the creation of agile radios, which would be able to scan the airwaves for unused space in order to avoid signal disruption caused by traffic congestion. A radio from General Dynamics that can communicate in 10 different frequency bands is being put through its paces by sailors on a Navy flagship. The software the radio employs allows it to transmit and receive on multiple frequencies, thus overcoming communication barriers between the different radios used by individual branches of the Armed Forces.
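
    The beam-steering arithmetic behind smart antennas shows up in a uniform linear array's "array factor": each element's phase is tuned so that contributions add coherently in the target direction and largely cancel elsewhere. This is a textbook sketch, not Vivato's design:

```python
import cmath
import math

def array_gain(n_elements, spacing_wl, steer_deg, look_deg):
    """Magnitude of the summed field from a uniform linear array.

    spacing_wl: element spacing in wavelengths.
    steer_deg:  direction the per-element phase shifts are tuned for.
    look_deg:   direction in which the field is evaluated.
    """
    steer, look = math.radians(steer_deg), math.radians(look_deg)
    total = 0j
    for k in range(n_elements):
        # Residual phase of element k after steering compensation.
        phase = 2 * math.pi * spacing_wl * k * (math.sin(look) - math.sin(steer))
        total += cmath.exp(1j * phase)
    return abs(total)
```

    With 100 half-wavelength-spaced elements steered to broadside, the summed field at the target is 100 times a single element's, while a few degrees off-axis it collapses--the physics behind beams only a few degrees wide.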
    Click Here to View Full Article

  • "How Will Web Services Ultimately Change Business?"
    CIO (12/01/03) Vol. 17, No. 5, P. 108; Jahnke, Art

    Web services will not only allow computer systems to share a common language, but will also enable them to create solutions independent of humans: This ability promises to change how companies operate in ways unforeseen today, almost certainly affecting the role of the human technology worker. When first developed, radio was mostly seen as a way for ship captains to communicate with on-shore colleagues, notes University of California at Berkeley economics professor Hal Varian, one of the speakers discussing Web services at the Symposium on the Coevolution of Technology-Business Innovations held at IBM's Almaden Research Center this fall. Today, radio is seen not as a niche application but as a technology with wide-ranging social and technological effects. Web services technology promises similar change: In businesses, Web services are certain to take over some decision-making tasks from humans, but artificial intelligence will not be able to understand how to create new markets or customer desire until computers learn to think like humans, says Eastern Municipal Water District CIO Daniel C. Ashley. Web services are similarly unsuitable for the financial world, where in-house online transaction systems have proven far more responsive, says WorldGroup Consulting's Rodney Griffin. Though Web services may provide peripheral application functions, they will not replace companies' core applications, where performance is key. Eighty percent of application maintenance labor involves specification, problem determination, and human interaction--areas in which computer-driven Web services cannot compete, adds College of the Canyons associate professor Dean Cashley. Those functions will remain the domain of human IT workers, while Web services will take over many purely technical computer programming tasks and be incorporated into commercial enterprise products.
    Click Here to View Full Article

  • "Seeing the Road Ahead"
    GPS World (11/03) Vol. 14, No. 11, P. 22; Scott-Young, Stephen

    Road safety could be dramatically improved with in-vehicle augmented reality (AR) systems that keep motorists aware of road conditions and other vehicles' positions despite poor visibility, and an Australian-designed prototype system promises such benefits through the integration of real-time road video footage and Global Positioning System/inertial readings. The University of Melbourne's Department of Geomatics constructed the Intelligent Navigation Aid (INA) from off-the-shelf equipment: GPS receivers, a fiber-optic gyro, and the vehicle odometer serve as the AR system's tracking component, while its retrieve-and-inform functionality is provided by a processor and display unit and a digital video camera. Three fixed GPS receivers are used to calculate the vehicle's heading, pitch, and roll, while a fourth receiver supplies real-time kinematic (RTK) corrections. The three GPS antennas are kept as far apart as possible to maximize the accuracy of the GPS attitude determination. Sensor data is integrated by a Kalman filter that employs geometric constraints to boost its effectiveness; the filter also uses a three-sigma rule so the system can respond quickly to sudden changes in heading. Properly aligning digital camera imagery and augmented objects involves calibration through photogrammetry and real-time alignment via navigation instrumentation, so that augmented objects can be overlaid on the video footage in proper perspective. Accurate tracking of other vehicles on the road requires those vehicles to be outfitted with GPS receivers and transmitters, since optical recognition techniques are unsuitable in poor visibility. The INA's effective function requires accurate road-boundary data and the capacity to operate during RTK and GPS blackouts. It is hoped that advancements will eventually lead to an AR system in which augmented objects are projected directly onto the driver's windshield.
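
    The fusion step can be sketched as a one-dimensional Kalman update with a three-sigma test: a prediction (say, from the odometer and gyro) is blended with a measurement (say, a GPS fix) in proportion to their variances. The article does not detail how the INA applies its three-sigma rule, so the outlier-rejection policy below is an assumption for illustration:

```python
def kalman_update(est, est_var, meas, meas_var):
    """Blend prediction and measurement by their variances (1-D Kalman)."""
    gain = est_var / (est_var + meas_var)   # Kalman gain in [0, 1]
    new_est = est + gain * (meas - est)
    new_var = (1 - gain) * est_var          # fused variance always shrinks
    return new_est, new_var

def gated_update(est, est_var, meas, meas_var):
    """Three-sigma test on the innovation (assumed policy: reject outliers)."""
    innovation_sd = (est_var + meas_var) ** 0.5
    if abs(meas - est) > 3 * innovation_sd:
        return est, est_var                 # implausible fix: keep prediction
    return kalman_update(est, est_var, meas, meas_var)
```

    The gain weights whichever source is currently more trustworthy: a noisy GPS fix barely nudges a confident dead-reckoning estimate, while a precise RTK fix dominates a drifting one.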
    Click Here to View Full Article

  • "Contagious Media"
    CIO Insight (11/03) No. 33, P. 38; Stepanek, Marcia

    John Patrick claims that blogs can reinvigorate corporate knowledge management with their potential to "deliver the grassroots discussions and knowledge-sharing that top-down, corporate-sponsored efforts never could." The Attitude president and former IBM VP defines blogging as a new and widely underestimated communications medium that is accessible to virtually anyone; its importance to business is apparent in its power to allow any company member to share their knowledge across hierarchical, financial, or accessibility barriers. "[Blogging is] a way to energize the expertise from the bottom--in other words, to allow people who want to share, who are good at sharing, who know who the experts are, who talk to the experts or who may, in fact, be one of those experts, to participate more fully," Patrick comments. Many CIOs are unaware of blogging because of security, skills, and budgetary issues, but Patrick contends that blogging, like instant messaging, will lead to productivity gains across the IT department as well as the entire enterprise. He expects corporate blogs to emerge as a resource for current perspectives on topics important to employees, and as a tool through which companies can share their knowledge with suppliers and clients. Patrick predicts that staffers will turn to blogs written by trusted and respected sources, rather than the corporate intranet, to seek out information. He suggests that a corporation set up a central Web page listing the blogs of experts or their representatives according to relevant topics, and says it is essential to blogs' credibility that they originate from the organization's grassroots. Patrick believes that the next three to five years will see a vast augmentation of knowledge through blogging, which he admits can lead to disruption and destabilization, but in a positive way.
    Click Here to View Full Article

  • "More Than Videoconferencing"
    Network Magazine (11/03) Vol. 18, No. 11, P. 36; Greenfield, David

    Hewlett-Packard Labs research fellow Norm Jouppi is developing the Surrogate, a remote-controlled robotic technology designed as an alternative to videoconferencing that could have further applications outside a conference environment. The six-foot-tall, 300-pound Surrogate is equipped with screens that display the remote user's face, left and right profiles, and back of the head; speakers and microphones to project the user's voice and capture conversations, respectively; and four wide-angle cameras to give the user a visual perspective of the local office. The Surrogate can adjust its height to that of the remote user so that local parties can maintain proper eye lines. The device uses a Proxim 802.11a card to transmit four MPEG-2 video streams to the user, who sits in a 15-by-15-foot enclosure surrounded by screens displaying images of the conferencing environment captured by the Surrogate's cameras. The user can also employ a joystick to direct the Surrogate's microphone to capture specific conversations while filtering out the others, which Jouppi terms the "cocktail effect." The current Surrogate model operates in a fixed position, but the next model, scheduled to be ready by fall 2005, will be mobile. Jouppi says the robot could also allow users to experience remote sites such as the Smithsonian without having to travel great distances, and even envisions a use for it on the International Space Station. The HP research fellow reports that the Surrogate at one point was outfitted with a remotely controlled robotic arm and hand, but the concept was abandoned because the limb was too cumbersome to operate, and it also unnerved people. Jouppi plans to modify the Surrogate even more dramatically: "By model 6, we could have an anthropomorphic robot with a holographic face of the remote user," he boasts.
    Click Here to View Full Article