ACM TechNews is sponsored by Thunderstone. Learn more about Texis, the text-oriented database providing high-performance search engine features combined with SQL operations and a development toolkit, which powers many diverse applications, including Webinator and the Thunderstone Search Appliance.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Thunderstone or ACM. To send comments, please write to [email protected].
Volume 7, Issue 770: Friday, March 25, 2005

  • "A CAPPS by Any Other Name"
    Wired News (03/25/05); Zetter, Kim

The Government Accountability Office (GAO) will soon release an assessment of the Secure Flight passenger pre-screening system, but Rep. Loretta Sanchez (D-Calif.), computer security expert Bruce Schneier, and ACLU members held a March 24 press call in which they expressed concern that the Transportation Security Administration (TSA) is attempting to deploy the system on two national airlines this summer without GAO certification or congressional approval. They also want the GAO to reject the system for failing to satisfy several congressionally mandated deployment criteria. Whereas current passenger screening is supervised by the airlines themselves, Secure Flight would effectively put the TSA in control and require that airlines supply the agency with passenger records. The TSA would also screen passengers against a unified watch list using third-party commercial databases, a measure that has courted controversy because of the frequent inaccuracy of the data within such repositories. The TSA's Amy Von Walter says the use of commercial databases is not a foregone conclusion, as the agency is in the middle of testing their effectiveness in passenger ID verification. ACLU legislative counsel Tim Sparapani said Secure Flight lacks a transparent redress process in which passengers incorrectly identified as suspects could clear their names, while Schneier said he expects the system to produce two kinds of false positives: one in which a person is flagged simply because a name identical or similar to his appears on a watch list, and another in which his name is on the list for no apparent reason. Schneier also warned that terrorists could take advantage of security flaws in Secure Flight, for instance by traveling under an assumed name. Von Walter says the TSA is dedicated to providing a redress procedure for falsely targeted passengers.
    Click Here to View Full Article

  • "War of Words over Operating Systems' Safety"
    New Scientist (03/23/05); Biever, Celeste

Recent reports raise doubts about the security of Linux-based Web servers, the open-source Firefox Web browser, and Apple's Mac OS X operating system, though experts contend they are still more secure than their Microsoft equivalents. Symantec's biannual Internet Security Threat Report, issued on March 21, indicates that 21 new programming errors were uncovered in Firefox between July and December 2004, compared to 13 in Internet Explorer. ScanIT released a conflicting report on Monday concluding that low patching rates left 98% of IE users exploitable in 2004, while just 15% of Linux users were vulnerable; ScanIT founder David Michaux also notes that Symantec found fewer severe errors in Firefox than in IE. The Symantec report lists 37 vulnerabilities in Mac OS X, and takes the Renepo worm discovered last October as a sign that the Mac operating system is increasingly being targeted for the kinds of hacks usually associated with Microsoft and numerous Unix-based OSes. Independent security consultant Richard Forno counters that the Symantec report inflates the significance of the Mac OS X vulnerabilities, arguing that hackers "want to go after the low-hanging fruit and the Mac OSX is still not as bug-ridden as Windows." A March 22 report commissioned by Microsoft and released by Florida Institute of Technology computer scientist Richard Ford notes 174 vulnerabilities in an open-source Linux server, compared to 52 in a Microsoft counterpart. In addition, the interim between reporting a flaw and patching it was substantially shorter for the Microsoft server than for the Linux server. Sophos security consultant Graham Cluley calls these findings immaterial, since Linux users are far fewer in number and more likely to patch their systems than Windows users, which makes them less attractive to hackers.
    Click Here to View Full Article

  • "Study: Grid Growth Requires Friendlier Software Licensing"
    Computerworld (03/23/05); Weiss, Todd R.

The 451 Group's recently released "Grid Computing--The Impact of Software Licensing" study concludes that traditional per-processor licensing models are hindering the growth of grid computing, and that friendlier licensing models should be implemented. Primary study author William Fellows contends that current licensing models fail to consider the large number of CPUs available in grid systems, which puts a significant financial strain on users; he suggests that charging a flat-rate licensing premium for grid use is a more affordable alternative to per-processor licensing. The study also indicates that other issues in grid software licensing, such as the processors' physical location, need to be resolved. The report finds that some grid customers are resistant to change, but all software vendors concur that any changes to their license models will be driven by customer demand. "What's clear is that as grid activity increases, so too will demands for enhanced license models--plus the instrumentation and management to support them," the study affirms. "Users also require more flexibility in the way software is bought and used, since the grid resource pool is shared and constantly shifting." The report expects a diverse assortment of licensing models to emerge, some featuring measured usage based on transaction volume, time used, or an application's level of WAN-enablement. Fellows believes users of engineering design automation software will be particularly receptive to more flexible grid software licenses, because grid computing is a key component of their research.
    Click Here to View Full Article

  • "Common Sense Boosts Speech Software"
    Technology Research News (03/30/05); Smalley, Eric

    MIT Media Lab researchers are applying common-sense reasoning to the improvement of speech recognition software's accuracy through the Open Mind Common Sense Project. The software's inability to understand word meaning makes distinguishing between words that sound similar or identical difficult, and the researchers tapped a database of over 700,000 common-sense phrases to reorder close matches. The Media Lab's Henry Lieberman estimates that users who dictated text containing topics covered in the database committed 17% fewer errors and worked 7.5% faster. He also says common-sense filtering improves error correction by increasing the probability that the right word will appear at the top of the menu of alternatives the software presents when a user indicates a mistake while dictating. "One surprising thing about testing interfaces like this is that sometimes, even if they don't get the absolutely correct answer, users like them a lot better," Lieberman says. "This is because they make plausible mistakes, for example 'tennis clay court' for 'tennis player', rather than completely arbitrary mistakes that a statistical recognizer might make, for example 'tennis slayer.'" There are other ways of improving the choices made by speech recognition software, but Lieberman points out that none of them can ascertain whether a word makes sense in a given context. He says the software, which was detailed at the Intelligent User Interfaces Conference in January, is ready for use with existing commercial speech recognition technology.
    Click Here to View Full Article
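    The reranking step described above can be sketched in a few lines. This toy version simply counts how many entries from a common-sense phrase list occur in each recognizer hypothesis; the real Open Mind database and the Media Lab's scoring are far richer, so the function and the sample phrases here are purely illustrative:

    ```python
    def rerank(hypotheses, commonsense_phrases):
        """Reorder recognizer hypotheses so candidates containing
        known common-sense phrases rise to the top of the menu."""
        def score(text):
            t = text.lower()
            # Count the common-sense phrases found in this hypothesis.
            return sum(1 for phrase in commonsense_phrases if phrase in t)
        # sorted() is stable, so ties keep the recognizer's own order.
        return sorted(hypotheses, key=score, reverse=True)

    # Lieberman's example: "tennis player" is plausible, "tennis clay
    # court" is not, so the common-sense match wins the top slot.
    ranked = rerank(
        ["the tennis clay court won", "the tennis player won"],
        {"tennis player", "win a match"},
    )
    ```

    In a full system the candidate list would come from the recognizer's acoustic scores, with the common-sense score used only to break near-ties among close matches.
    
    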

  • "Tech Versus the World"
    Michigan Tech Lode (03/23/2005); LeVeque, Shaun

The "Battle of the Brains" among the world's top collegiate programmers is scheduled for April 3-7, 2005, in Shanghai, China. Michigan Technological University's representative at the ACM International Collegiate Programming Contest will be "MTU Blue," a team consisting of Jonathan Kaus, Joe Nievelt, and Kyle Rokos. Team coach David Poplawski, an associate professor in the Department of Computer Science, is optimistic about MTU Blue's chances at the event because Nievelt and Rokos competed at last year's finals in Prague, Czech Republic. The 78 teams competing at the finals will be presented with at least eight real-world programming problems and a five-hour deadline; the three-student team that correctly solves the most problems in the least amount of time will be declared international champion and will receive scholarships, IBM prizes, and the "world's smartest" trophy. A separate POWER challenge will have the teams use IBM's POWER-based eServer Blue Gene supercomputer to build applications. This year's contest attracted over 10,000 participants from 71 countries. Dr. Gabby Silberman, program director at IBM Centers for Advanced Studies and the contest's executive sponsor, says, "This event will give young programmers exposure to advanced programming environments, an experience that will help launch their careers in information technology."
    Click Here to View Full Article

  • "Small Banks Can Compete Through Niche Marketing Online"
    Penn State Live (03/22/05)

    A Penn State researcher believes small banks will be able to compete with their technology-laden counterparts if they concentrate on their niche markets. Although community banks will not be able to invest as much in wireless communications and mobile banking, adding links to their Web sites about their community and local businesses could go a long way toward facilitating cohesiveness between customers, the bank, and the local community, according to Dr. Peter B. Southard, assistant professor of management at the university. Along with Keng Siau, an associate professor of management at the University of Nebraska, Southard has authored "A Survey of Online E-Banking Retail Initiatives," an article that appeared in the October 2004 issue of Communications of the ACM. The researchers compared the Web sites of the five largest banks in the country with those of five randomly selected community banks, and found that the wealth of information and services offered on the sites of the big banks make them seem more crowded and confusing. "The current generations of customers still require, in human terms, the personal contact provided by face-to-face contact so, while e-banking will continue to grow, customers still need to know there is a human face behind the scene," says Southard.
    Click Here to View Full Article

  • "Biometric Passports Set to Take Flight"
    Medill News Service (03/21/05); Biba, Erin

The State Department's Office of Passport Policy, Planning, and Advisory Service recently announced its readiness to start issuing radio frequency identification (RFID) chip-equipped biometric passports, but critics warn that the technology does not adequately protect the user's personal privacy. The agency intends to distribute the first such passport by mid-2005, and all passports issued in the U.S. will be biometric within a year. The passport's RFID chip will contain all personal data found on the information page of current passports, as well as a digital image of the bearer's face. The chip will feature an ID number and a digital signature, both of which will be archived in a central government database along with the personal data from the information page; the data on the chip cannot be modified once the chip is written, which means that travelers must obtain new passports every time their information changes. Passport holders will have a year from when their information changes to apply for a new passport free of charge, a major consideration since the government will pay for the new technology by raising the cost of passports. The State Department says there is no need to encrypt the data on the RFID chip since it is identical to the data listed on the information page, and unencrypted data can be read faster using relatively simple technology. Critics are concerned that passport data can be surreptitiously read by properly equipped people in close proximity, though the State Department insists that safeguards will be deployed to thwart identity thieves. R.P. Ruiz with the Electronic Privacy Information Center says RFID technology is a bad choice for biometric passports, and thinks a contact card that the holder can slide through a slot like a credit card would more effectively deter identity theft.
    Click Here to View Full Article

  • "SHA-1 Flaw Seen as No Risk to One-Time Password Proposal"
    Computerworld (03/22/05); Willoughby, Mark

A vulnerability in the SHA-1 one-way hash function discovered by Chinese cryptographers in February does not affect most SHA-1-based applications, including the Hashed Message Authentication Code (HMAC) from the Initiative for Open Authentication (Oath). The proposed HMAC standard is a one-time password (OTP) proposal that's considered a key technology for broadening authentication's appeal. VeriSign chief scientist Phillip Hallam-Baker says the SHA-1 vulnerability does not have a significant effect on the HMAC algorithm because the one-time password discards most of the bits from the 160-bit hash value provided by SHA-1, leaving only enough bits to form a six-digit password; with so much less information exposed, the hash is much harder to break. Shandong University researchers last month published research detailing how to find two inputs that produce the same SHA-1 hash value (a collision) roughly 2,000 times faster than a brute-force search, but Counterpane Internet Security founder Bruce Schneier says the remote chance of that vulnerability being exploited would disappear once a new hash standard is released, since HMAC will likely migrate to the new technology. The HMAC one-time password proposal is expected to become an Internet Engineering Task Force (IETF) request for comments soon, after which it could become a standard in as little as four weeks.
    Click Here to View Full Article
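    The bit-discarding step Hallam-Baker describes can be illustrated with the HMAC-based one-time-password construction the Oath proposal led to (later standardized as RFC 4226). This sketch assumes a shared secret and a message counter; it keeps only 31 of SHA-1's 160 output bits before reducing them to six decimal digits:

    ```python
    import hashlib
    import hmac
    import struct

    def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
        """Derive a short one-time password from an HMAC-SHA-1 value."""
        # HMAC-SHA-1 over the 8-byte big-endian counter: 160-bit output.
        mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
        # Dynamic truncation: the low nibble of the last byte picks a
        # 4-byte window; masking the sign bit leaves just 31 bits.
        offset = mac[-1] & 0x0F
        code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
        # Reduce to the requested number of decimal digits.
        return str(code % 10 ** digits).zfill(digits)
    ```

    Because an attacker sees only six digits of each 160-bit hash, even a collision-finding shortcut against SHA-1 reveals far too little structure to recover the secret, which is the point Hallam-Baker makes.
    
    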

  • "Getting Smart Is All About Using Your Intelligence"
    Financial Times-IT Review (03/23/05) P. 1; Baxter, Andrew

Despite ongoing investment in business intelligence (BI) solutions, companies have not realized the goal of information democracy: delivering the right amount of information to the masses to yield key insights, solve problems faster, and open up new opportunities. Gartner's recent BI summit in London was well attended, demonstrating the high level of interest in the subject, but Gartner research vice president Frank Buytendijk says most companies are making decisions no better than they did five years ago. Competitive pressures, government regulations, and increasing data loads, with growth estimated at between 30% and 50% per year for many companies, make BI a topic of continued interest despite the challenges of achieving and measuring success. Most firms measure BI success by looking at adoption rates, while the systems themselves are becoming faster and more detailed; one major European bank at the Gartner summit said it provided summarized report updates every 15-25 minutes, down from four-hour intervals. Corporate dashboards are replacing the executive information systems that became popular in the 1980s, though experts warned that low-quality data will render any BI system useless, no matter how sophisticated. Corporate performance management (CPM) is a new term for automated BI processes, but it will not succeed at a corporate level because of the difficulty of comparing dissimilar departmental metrics, says La Suisse chief analytic officer Gabriel Fuchs. Gartner says BI systems must balance simplicity and sophistication in order to deliver meaningful tools that casual users will adopt. Currently, BI tools are too complex for most users to exploit fully, and only 15% to 20% of users take advantage of available features, says Fuchs.

  • "Motion Filter Eases Troubles With Mouse"
    New York Times (03/24/05) P. E8; Eisenberg, Anne

An adapter invented by IBM researcher James Levine can filter out shaky mouse movements caused by tremors in the hands of users suffering from motor skills disorders. The Assistive Mouse Adapter is about the size of a handheld calculator and plugs in between the mouse and the computer; the device is equipped with a microprocessor that intercepts the motion data sent from the mouse and uses an algorithm to screen out the high-frequency motion attributable to the tremor before passing it on to the computer. The adapter's controls can be set to reject the extra mouse clicks of trembling fingers and make it less difficult for users with motor impairments to double-click. The device was tested with assistance from Dr. Cathy Bodine of the University of Colorado School of Medicine, who helped set up trials involving disabled people who could identify what kinds of properties should be incorporated into the mouse filter. "The results of our tests showed that [the device] helped people, minimizing the impact of tremor on the use of the computer," Bodine notes. Envisioneering Group research director Richard Doherty says the Assistive Mouse Adapter offers considerably more adaptability and ruggedness than other computer navigation options, such as eye tracking. The mouse filter is manufactured by Montrose Secam in Britain, which is selling the product online for about $100. Montrose Secam director James Cosgrave became interested in the device and elected to produce it himself because he is afflicted with a hereditary tremor.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)
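    Screening out high-frequency motion of the kind the adapter targets can be approximated in software with a single-pole low-pass (exponential smoothing) filter. This is an illustrative sketch under that assumption, not IBM's actual algorithm, and the `alpha` smoothing constant is a made-up parameter:

    ```python
    class TremorFilter:
        """Smooth cursor motion by attenuating high-frequency jitter.

        A single-pole IIR low-pass filter: small alpha means heavy
        smoothing (more tremor removed, more lag); alpha = 1 passes
        motion through unchanged.
        """

        def __init__(self, alpha: float = 0.5):
            self.alpha = alpha
            self.x = self.y = None

        def update(self, x: float, y: float) -> tuple:
            if self.x is None:
                # Seed the filter with the first sample.
                self.x, self.y = float(x), float(y)
            else:
                # Move a fraction alpha toward each new reading, so
                # rapid back-and-forth motion largely cancels out.
                self.x += self.alpha * (x - self.x)
                self.y += self.alpha * (y - self.y)
            return self.x, self.y
    ```

    Each incoming position sample would be passed through `update()` before reaching the operating system; a real device would also debounce clicks, as the article notes the adapter does.
    
    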

  • "Does IM Stand for Insecure Messaging?"
    CNet (03/23/05); Hines, Matt

The threat of instant messaging (IM) worms is growing, and a key factor in their spread is the obliviousness of users and IT administrators. "A person unaware of the IM threat is the biggest risk that exists for these viruses to have some success," warns McAfee research fellow Jimmy Kuo. Most IM worms are disguised as attachments to messages that appear to originate from trusted sources, so the recipient opens them without ever realizing that he or she has downloaded malware that rapidly spreads to all the names on their IM buddy list. Aladdin Knowledge Systems technology VP Shimon Gruper reports that IM's scant built-in security has made it unnecessary for hackers to target the IM code itself, but some experts think such attacks are inevitable. Furthermore, IM's popularity as a communications medium between computers and smart phones could make mobile devices vulnerable to viruses sent from PCs. The workplace penetration of public IM applications is increasing corporate networks' susceptibility to IM-borne threats, although businesses are usually better fortified against malware than consumers. There is also evidence to suggest that recent IM worms are being employed as a way for hackers to communicate with one another. VeriSign principal scientist Phillip Hallam-Baker says that although there have been few IM attacks so far, that could change: "As email systems are being secured, there's a displacement effect and people are moving their efforts over to IM." America Online's Andrew Weinstein feels that user awareness of the IM threat is the best defense, and recommends that users regard every IM they receive with caution, even if it appears to come from a familiar sender.
    Click Here to View Full Article

  • "PC Forum: Where in the World Is Search Heading?"
    ZDNet (03/23/05); Farber, Dan

Dan Farber reports on the final PC Forum panel, where scientists and vendor representatives offered their perspectives on search technology. The panelists expect more personalized search to emerge in the next five years, although no definite predictions were made. The participants agreed that search must get better at understanding the context of the requested information, as well as delivering more precise and personalized answers. Medstory CEO Alain Rappaport detailed search software from his company that provides more accurate results through the application of domain knowledge. Google's Marissa Mayer said Orkut social networking software, which could yield more relevant and precise search results by tapping the search preferences of the user's friends, showed promise. Former computer science professor Udi Manber, president of Amazon's A9 search engine, discussed incorporating new information sources such as book content, video, geolocation data, and an archive of visited Web sites to augment query results; he also cited the need to strike a balance between ease of use and the enablement of powerful but more complex features. Mayer said that Google is committed to maintaining search interfaces' simplicity so that users do not have to think about how the technology works. Farber concludes that both Mayer and Manber raise valid points: Search should deliver ease of use and accurate results, but also help power users by making its operations understandable. He writes, "As the underlying technology improves (more personalized, natural language, more sources and relevance, guided navigation), users get more of what they want with less effort."
    Click Here to View Full Article

  • "Cyberterrorism Isn't a Threat Yet, One Expert Says"
    Fort Worth Star-Telegram (03/23/05); Batheja, Aman

    Cyberterrorism is a concept that has been overblown by the media and poses no threat, though someday it will evolve into a threat worth worrying about, according to longtime computer security expert Marcus Ranum, the inventor of the proxy firewall. Ranum made his comments at Texas Christian University on Tuesday during a lecture on computer hacking and terrorism. Cyberterrorism is an impractical means for terrorists to carry out their objective of striking fear into the hearts of their enemies, Ranum said. "Is it more cost effective to train yourself a cadre of cyber-ninjas or is it more effective to find idiots who will believe in your cause and wrap themselves in plastic explosives?" asked Ranum. Hackers have the capability of disrupting large parts of the Internet, but the Internet would be up and running again within 10 minutes, Ranum says. Despite his contention that cyberterrorism is not worth worrying about, Ranum does allow that the U.S. is vulnerable to cyberterrorism, pointing out that the vulnerability that produced the East Coast blackout of 2003 went undetected. Also, there is little security protecting the infrastructure that controls the nation's sewage systems, he says.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Tech That Makes a Difference"
    EE Times (03/21/05) No. 1363, P. 1; Goering, Richard

University of California, Berkeley computer science professor and Inktomi founder Eric Brewer sees three major problems with attempts to bring technology to developing regions: reliance on existing off-the-shelf technology, which may not suit the region; dependence on donations, which makes projects unsustainable; and a lack of infrastructure in developing nations. Brewer is involved with Information and Communications Technology for Billions (ICT4B), a National Science Foundation-funded initiative to engineer a high-tech foundation for sustainable, charity-independent businesses in the Third World. He reasons that such deployments are possible because the aggregate wealth of people in developing regions is considerable, while personal devices can be kept out of the equation. The most pressing engineering challenge Brewer and colleagues are trying to meet is establishing rural connectivity, with particular emphasis on Wi-Fi modified for extended transmission range. Other major challenges include developing a "store and forward" style network that applications can run on even though links function intermittently, and user interfaces that can recognize speech in unusual dialects. Brewer details several ICT4B projects underway in India, one of which involves the setup of an e-government wireless network in the state of Kerala. Brewer's group is working to deploy kiosks in Tamil Nadu so that residents are no more than 20 kilometers away from a rural health center. A second project based in Tamil Nadu involves enabling kiosks for speech recognition so that villagers can wirelessly access weather reports and other useful information.
    Click Here to View Full Article

  • "Taming the River of Data"
    Defense News (03/14/05) Vol. 20, No. 11, P. 14; Goodman Jr., Glenn W.

    The Information Dominance Center (IDC) of the Army's Intelligence and Security Command is testing new software tools designed to fuse data from numerous sources and present it in a comprehensible manner. Command director Maj. Gen. John Kimmons says too much of his analysts' time is taken up with data compilation and organization, adding that the majority work with pieces of data that have already been captured and reported. Deputy chief of staff of the Army for intelligence Lt. Gen. Keith Alexander says the IDC is trailblazing the employment of intelligence exploitation tools to help analysts "rapidly establish threat association and linkages, recognize threshold events, activity patterns and anomalies, understanding the significance of information 'buried' within large volumes of collected material." Some software generates signature graphs that set a baseline of normal behavior and flag any instances where thresholds are overstepped, and Kimmons likens this to anti-fraud software used by credit card companies that diagrams a similar baseline by scanning cardholders' transactions and purchases. To be practical, Kimmons says the software must support real-time access to all of the collected data; organize the data to enable rapid search and visualization of interrelated components on a computer screen; and automatically filter classified information so that lower-level commanders can access the information without putting the source at risk. Kimmons reports that database access is currently inadequate. "What holds us back is...the policy spider webs and reluctance to share intelligence at increasingly lower levels in near real time, where it is tactically relevant," he says.
    Click Here to View Full Article
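    The baseline-and-threshold idea Kimmons compares to credit-card fraud screening can be sketched as a simple standard-deviation test over historical activity. This is a generic illustration of the technique, not the IDC's software, and the three-sigma threshold is an assumption:

    ```python
    import statistics

    def flag_anomalies(history, new_values, threshold=3.0):
        """Return values that deviate from the historical baseline
        by more than `threshold` standard deviations."""
        # Establish the baseline of "normal" from past observations.
        mean = statistics.fmean(history)
        sd = statistics.pstdev(history) or 1.0  # avoid divide-by-zero
        # Flag anything whose deviation oversteps the threshold.
        return [v for v in new_values if abs(v - mean) / sd > threshold]
    ```

    Real intelligence (and anti-fraud) systems build such baselines per entity and per activity type, and score patterns of linked events rather than single scalar readings, but the overstepped-threshold trigger is the same.
    
    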

  • "IETF Eyes 'Net Emergency Communications"
    Network World (03/21/05) Vol. 22, No. 11, P. 63; Marsan, Carolyn Duffy

    The IETF recently launched the Emergency Context Resolutions with Internet Technologies (ECRIT) working group to develop a way to stream emergency communications over the Internet in much the same way that 911 calls are sent across the public switched telephone network (PSTN). ECRIT is attempting to outline the needs of Net-based emergency calling and to choose technologies that can effectively describe the call originator's location and manage how the call is routed to the appropriate emergency call center. Oki Electric Industry engineer Hideki Arai says the system "must be equivalent to traditional emergency calling...and it must be easy to migrate from the PSTN to the Internet." Emergicom President Brian Rosen notes that Internet emergency calls must also be traceable in the event of error, while a call-back mechanism must be in place should the caller hang up. Other ECRIT system properties listed as essential by working group participants include reliability, redundancy, and backward compatibility. The group's first task is to come up with a document that defines Internet emergency communications terminology and outlines the system's requirements, while subsequent documents will detail security considerations, the establishment of communications sessions between callers and emergency response centers, the association of sessions with physical locations, and routing emergency calls based on location data. The success of such a venture would force service providers to bring their network hardware and software up to date. Prioritization or pre-emption of emergency calls will be handled by the IETF's Internet Preparedness working group as part of its plan to devise a general framework for disaster recovery communications systems.
    Click Here to View Full Article

  • "The 'dotCommunist'"
    Chronicle of Higher Education (03/25/05) Vol. 51, No. 29, P. A31; Foster, Andrea L.

    Columbia University law professor and Free Software Foundation general counsel Eben Moglen is a fervent believer in free software as part of his struggle to promote freedom of speech and advance knowledge. He defines software as a "public utility," and argues that software patents and other attempts to restrict its use or sharing are immoral. Creators of open-source software, which Moglen views as vital to sustained innovation, license it under usage terms designed to prevent companies from commandeering and commoditizing the software, but commercial software makers are targeting such licenses by arguing that free software is hurting their bottom line. The recently established Software Freedom Law Center, which Moglen heads, will help producers of open-source software surmount these challenges by providing free legal advice, and also by refining and enforcing open licenses. The companies supporting the new center share Moglen's vision of open-source systems eventually trouncing Microsoft's software market monopoly and helping to bring down the current system of information ownership. Moglen sees digital information-exchange media as important tools for improving society, and notes that the Internet has enabled economically disadvantaged people to access the same information that the economically advantaged have. On the other hand, Boston intellectual property lawyer Steven Henry believes proprietary and open-source software will exist alongside each other for quite a while, and says open-source software will not thrive unless licenses that fortify businesses' ability to integrate code from both free and commercial software are crafted.

  • "The Science Deficit"
    Government Executive (03/15/05) Vol. 37, No. 4, P. 54; Dickey, Beth

    General increases in federal funding for research and development belie the diversion of funding from key civilian disciplines such as physics and biology to defense and homeland security, and this trend is sparking fears that U.S. technological innovation will lag behind that of international rivals. Although President Bush's proposed total R&D portfolio for fiscal 2006 includes a $733 million increase in funding above the fiscal 2005 level, the federal budget for basic and applied research would lose $733 million. Nearly 81% of the expected $6.2 billion boost in total R&D funding for 2005 is reserved for defense projects and new weapons development; this year will see an all-time high of $74.9 billion in defense-related R&D, with the lion's share of a $4.8 billion increase being split between the Departments of Defense, Homeland Security, and Energy. Nondefense R&D budgets will expand by $1.2 billion this year, and even less next year. About 60% of all academic R&D funding is provided by the government, and American Association for the Advancement of Science budget and policy director Kei Koizumi sees a correlation between the amount of federal funding universities receive and the number of science and engineering graduates they churn out. The U.S. Commission on National Security for the 21st Century reported four years ago that major increases in privately funded R&D are usually channeled into development instead of research, making a case for additional public spending. Meanwhile, the National Research Council of the National Academies prepared an assessment for Congress in which it criticized the Defense Department for maintaining a stationary level of basic research funding and overemphasizing near-term results.
    Click Here to View Full Article

  • "Implanting Hope"
    Technology Review (03/05) Vol. 108, No. 3, P. 48; Duncan, David Ewing

    Excitement is brewing over the potential use of implantable brain-computer interfaces (BCIs) to increase the mobility and independence of paralytics and other movement-impaired patients, although the technology is in a very early stage of development. One of the most notable inventions in this area is Cyberkinetics Neurotechnology Systems' BrainGate Neural Interface System, a chip that is planted under the skull so that its electrodes pick up neuronal impulses; the chip is wired to a computer that reads and translates the impulses into commands for moving an onscreen cursor or a prosthetic hand. Other experiments in this field have focused on BCIs implanted in primates so they can remotely control artificial limbs and cursors by thought, but BrainGate has moved on to human trials. Brown University neuroscience professor and Cyberkinetics co-founder John Donoghue reasons that neural prosthetic research may ultimately enable the disabled to walk, and fellow neuroscientists say the time is right for human BCI research; Richard Andersen of Caltech is particularly impressed by Donoghue's work, because it proves that human motor neurons still function normally even after long-term paralysis. Duke University neuroscientist Miguel Nicolelis has reservations about BrainGate, claiming that Cyberkinetics appears to be more interested in commercializing and promoting the product than in maximizing its benefits to patients. Cyberkinetics CEO Timothy Surgenor says BrainGate's marketability hinges on making the device dramatically less bulky, wireless, and automated. Electrical engineer, physicist, and BrainGate team member Arto Nurmikko expects the next-generation BrainGate to support two-way communication between the brain and the device. Scientists such as University of Chicago assistant neuroanatomy professor Nicholas Hatsopoulos believe advancements in BCI technology will yield new insights into higher-brain functions.
    Click Here to View Full Article


 