Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published every week on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 6, Issue 607:  Wednesday, February 18, 2004

  • "Sparks Fly in E-Voting Debate"
    MSNBC (02/16/04); Boyle, Alan

    With Election Data Services predicting that 50 million voters--28 percent of the projected U.S. voting populace--will use paperless electronic voting systems this year, researchers at the annual conference of the American Association for the Advancement of Science argued intensely over the advantages and disadvantages of e-voting; however, there was little disagreement that the insecurity of e-voting systems has the potential to make this year's presidential election even more riddled with errors than the last one. E-voting advocates admitted that the technology is not perfect, but argued that paper-based systems are far more problematic: The Caltech-MIT Voting Technology Project, for instance, estimated that poor ballot designs led to as many as 6 million lost votes in the 2000 election, up to 50 percent attributable to obsolete registration rolls. Meanwhile, MIT computer scientist Ted Selker said that absentee voting has more potential for abuse than e-voting's security flaws. David Dill of Stanford University and Peter Neumann of SRI International made the case against paperless voting with the argument that current e-voting software is not protected against external or internal tampering, and is set up so that sabotage could be undetectable. Both researchers agreed that reliance on paper ballots to verify votes was the most secure solution, at least until e-voting machines become trustworthy. Many election officials are equipping e-voting machines with printers to produce a paper trail, a practice that Selker criticized for not having been sufficiently tested; he contended that such systems are prone to paper jams and other technical glitches, as well as "paper-hacking." Both e-voting critics and supporters agreed that giving voters a paper ATM-style receipt that could be removed from the polling place is unacceptable, given the danger of large-scale vote-buying and coercion. Researchers and vendors are working on various projects to add security to e-voting, such as Selker's Secure Architecture for Voting Electronically and cryptographic checksums.
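The cryptographic-checksum idea mentioned above can be sketched in a few lines: a keyed digest over a stored ballot record lets an auditor detect any after-the-fact alteration. This is an illustrative example only--the record format, the audit key, and the `checksum` helper are hypothetical, not any vendor's actual scheme.

```python
import hashlib
import hmac

# Hypothetical audit key; in practice, key management is the hard part.
AUDIT_KEY = b"election-audit-key"

def checksum(ballot_record: str) -> str:
    """Keyed checksum (HMAC-SHA256) over a stored ballot record."""
    return hmac.new(AUDIT_KEY, ballot_record.encode(), hashlib.sha256).hexdigest()

record = "precinct=12;machine=7;contest=president;choice=A"
tag = checksum(record)  # stored alongside the record at election time

# Later verification: the original record checks out...
assert hmac.compare_digest(tag, checksum(record))
# ...but any tampering with the stored record is detected.
assert not hmac.compare_digest(tag, checksum(record.replace("choice=A", "choice=B")))
```

Because the digest is keyed, an attacker who alters a stored record cannot recompute a matching checksum without the audit key.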
    Click Here to View Full Article

    To read more about ACM's activities involving e-voting, visit http://www.acm.org/usacm/.

  • "U.S. Firms Lament Cutback in Visas for Foreign Talent"
    Los Angeles Times (02/16/04) P. 1C; Iritani, Evelyn

    An improving U.S. economy has companies such as Rockwell Scientific turning to the H-1B visa program again to hire skilled foreigners to fill their openings. However, Rockwell Scientific CEO Derek Cheung and other executives believe there are not enough H-1B visas available now that Congress has reduced the program's annual cap from 195,000 last year to 65,000 this year. Some 43,000 H-1B visas had been approved through December, and immigration attorneys believe the ceiling will be reached within the next few weeks. The Information Technology Association of America (ITAA) is among the groups that have been calling on Congress to remove the cap on the H-1B visa program, but they may have trouble finding support on Capitol Hill these days. Washington may be too concerned about the lack of job growth in the United States to embrace the idea of outsourcing jobs to foreign workers. "The anti-immigration mood and the anti-globalization mood inside Washington is as negative as I've seen in my 25 years working in this field," says ITAA President Harris Miller. Meanwhile, critics of the H-1B visa program say U.S. companies are abusing it to lower their operating costs by hiring foreign workers for low pay. "There are plenty of qualified Americans who are dying to take these jobs," says Pete Bennett, founder of www.nomoreh1b.com.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Passwords to Guard Entry Aren't Enough to Protect Complex Data"
    ScienceDaily (02/16/04)

    Shielding complex data from unauthorized users with passwords and other access controls is only part of the equation; outgoing data must also be protected through filters, argues Stanford computer science professor Gio Wiederhold, who will discuss trusted information databases at the annual meeting of the American Association for the Advancement of Science. The access-driven security model cannot function unless data is well organized and contained in tidy boxes for use by people with authorized roles, while complex, unstructured, multipurpose data generally has poor protection. Furthermore, even the most secure access controls are useless if trusted users turn rogue, such as when a malcontented employee with access to the database decides to hurt the company by exploiting or damaging its information assets. The biggest shortcoming of the access control model is its failure to take collaboration into account, which can hinder research that requires multiple types of users to access data, such as patient medical records. Wiederhold contends that complementing access control with release control, in which the content of documents being sent to the requestor is monitored, will ensure that the requestor only receives material that is appropriate for a specific project. The Stanford professor adds that diverse systems with data output such as email, file systems, Web sites, and databases are prime candidates for document release protection. However, Wiederhold cautions that though privacy may be better shielded with access controls working in parallel with release control, complicated security parameters could come into conflict or even make data less secure. "The scope of potential use of data is so large that no approach that relies on any specific data organization will be adequate for all future needs," he comments.
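Wiederhold's release-control idea--filtering what goes out rather than only gating who gets in--can be illustrated with a toy filter that strips a record down to the fields appropriate for the requestor's project. The field names and the `RELEASE_POLICY` table below are hypothetical, invented purely for illustration.

```python
# Hypothetical release policy: which fields each project may receive.
RELEASE_POLICY = {
    "billing":  {"patient_id", "insurer", "charges"},
    "research": {"age", "diagnosis", "outcome"},
}

def release(record: dict, project: str) -> dict:
    """Filter an outgoing record down to the fields allowed for a project."""
    allowed = RELEASE_POLICY.get(project, set())
    return {field: value for field, value in record.items() if field in allowed}

record = {"patient_id": "P-001", "age": 54, "diagnosis": "flu",
          "outcome": "recovered", "insurer": "Acme", "charges": 120.0}

# A researcher sees clinical fields but no identifying or billing data.
print(release(record, "research"))  # → {'age': 54, 'diagnosis': 'flu', 'outcome': 'recovered'}
```

Access control decides whether the researcher may query at all; the release filter decides what the answer may contain.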
    Click Here to View Full Article

  • "Innovation Alive and Well at Demo"
    CNet (02/17/04); Farber, Dan

    The Demo 2004 conference in Scottsdale, Ariz., showed no shortage of innovation in the area of business software. Notable products spotlighted at the gathering include the USB-based Xkey from KeyComputing, a device that can be plugged into a home computer or a branch office system to provide a secure, money-saving duplicate of the user's Microsoft Outlook application; the Xkey will feature 256 MB of flash storage and should become available in May. Among the enterprise software applications premiering at Demo 2004 was mValent's Infrastructure Automation Suite, a product that handles the construction and continuous maintenance of complicated infrastructure configurations for Java applications. The product apparently centralizes all configuration data, and can display interdependencies and how changes ripple throughout the infrastructure, as well as harmonize those changes within a distributed environment. Other enterprise apps featured at Demo 2004 include a risk management server from Proofpoint that employs technology used for genome sequencing to scan a company's outbound message traffic for policy and regulatory infractions so that risk and liability can be lowered; and Quindi's Meeting Companion, which records all conference elements for playback, both for archival purposes and to help people who miss meetings get up to speed. Microsoft co-founder and investor Paul Allen was on hand to plug his FlipStart PC, whose features include a 30 GB drive, Wi-Fi and Bluetooth wireless connectivity, a 1 GHz processor, a thumb wheel, a thumb keyboard, a touch pad, an optional touch screen, mouse buttons, and a 5.6-inch, HTML-compliant display. The FlipStart weighs less than one pound, and Allen described the device as "the Swiss Army knife of the PC." He acknowledged that its internal battery life is approximately two hours, but added that the final version of the FlipStart will include a wireless modem, a full-sized keyboard, and a peripheral docking station.
    Click Here to View Full Article

  • "New Anti-Spam Initiative Gaining Traction"
    eWeek (02/12/04); Callaghan, Dennis

    Spammers would no longer be able to send junk email anonymously if the SMTP protocol were changed so that sending servers could be authenticated; the SMTP+SPF working group is developing the Sender Policy Framework (SPF) in the hopes that the Internet Engineering Task Force (IETF) will approve it as an anti-spoofing standard. SPF only works if domain owners publish sender IP addresses, which would then be matched against client IP addresses provided by mail transfer agents; email would be rejected if the client IP address and the published domain IP address fail to match. Pobox.com CTO Meng Weng Wong plans to argue his case for the IETF to establish a working group to study SPF at the 59th IETF Meeting in late February, although he really wants the task force to adopt the framework directly, without going through a working-group phase. He says the SMTP+SPF working group has already done most of the legwork, adding, "It may take a year from now [before SPF goes through the regular IETF process], and no one wants another 12 months of spam." Wong says that existing spam filters can be tweaked to support SPF, and anti-spam technology providers such as CipherTrust and InboxCop are backing the framework. In addition, almost 7,000 domain holders have posted their sender IP addresses at the SMTP+SPF Web site, and Wong reports that SPF will be available for free and on a voluntary basis. Mark Wegman at IBM's T.J. Watson Research Center cautions that SPF, though a good starting point, cannot halt all spam, and notes that the framework can be supported by a new spam filter his company is working on. The filter assesses email according to numerous factors, such as delivery patterns and account content.
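The matching step SPF relies on can be sketched as follows. This is a toy illustration of the idea only--published sender addresses checked against the connecting client's address--not the real SPF record syntax or DNS lookup, and the domain and addresses are made up.

```python
# Stand-in for DNS-published SPF records: domain -> authorized sender IPs.
PUBLISHED_SENDERS = {
    "example.com": {"192.0.2.10", "192.0.2.11"},
}

def spf_check(mail_from_domain: str, client_ip: str) -> bool:
    """Accept mail only if the connecting client is an authorized sender."""
    allowed = PUBLISHED_SENDERS.get(mail_from_domain)
    if allowed is None:
        return True  # domain publishes no policy; accept by default
    return client_ip in allowed

assert spf_check("example.com", "192.0.2.10")       # legitimate server
assert not spf_check("example.com", "203.0.113.5")  # forged sender: reject
```

The check stops sender forgery, not spam as such: a spammer can still pass it by sending from a domain whose records he legitimately controls, which is why Wegman calls SPF a starting point rather than a complete solution.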
    Click Here to View Full Article

  • "Pittsburgh Scientists Measure Productivity in Petascale Supercomputing"
    AScribe Newswire (02/17/04)

    The Defense Advanced Research Projects Agency has made it a priority to boost supercomputing power a thousand-fold to the petaflop scale by the end of the decade through its High-Productivity Computing Systems (HPCS) initiative. Funded under the aegis of the HPCS effort is the Productive, Easy-to-use, Reliable Computing System (PERCS) program, and one of the PERCS grant recipients is a project to define and measure supercomputer productivity being jointly carried out by the Pittsburgh Supercomputing Center (PSC) and the University of Pittsburgh. Rami Melhem, chair of the University of Pittsburgh Computer Science Department, is leading a team of researchers who are developing tools to measure the human time involved in software development against a supercomputer's ultimate performance. This will enable computer scientists to make more balanced decisions about the optimal design strategies for new hardware architectures. Melhem explains that system productivity considers not only "the productivity of the machine, but also productivity of the people--the scientific research team--who use the machine to solve problems." PSC researchers have declared their intentions to create the Standardized User Monitoring Suite (SUMS), a software tool designed to run in the background and record data on the full "code development" cycle, according to PSC scientists John Urbanic and Nick Nystrom. SUMS will not only collect data, but will be able to correlate and visualize the data for examination. "This will provide a holistic picture and provide the productivity analyst with a quantifiable metric," notes Nystrom.
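The trade-off Melhem describes--machine performance weighed against the human time needed to program the machine--can be illustrated with a toy metric. The formula and the numbers below are hypothetical, not the PSC/Pitt team's actual measure.

```python
def productivity(results_per_run_hour: float, dev_hours: float, run_hours: float) -> float:
    """Toy metric: results delivered per combined hour of human and machine time."""
    total_results = results_per_run_hour * run_hours
    return total_results / (dev_hours + run_hours)

# A machine that is 2.5x faster but demands 10x the programming effort
# can be less productive overall than a slower, easier-to-program system.
fast_but_hard = productivity(results_per_run_hour=1000, dev_hours=2000, run_hours=100)
slow_but_easy = productivity(results_per_run_hour=400, dev_hours=200, run_hours=100)
assert slow_but_easy > fast_but_hard
```

Data like SUMS would collect--how development hours actually break down across the code-development cycle--is what turns a formula like this from a guess into a measurement.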
    Click Here to View Full Article

  • "Biology Stirs Software 'Monoculture' Debate"
    Associated Press (02/16/04); Pope, Justin

    University of New Mexico biologist Stephanie Forrest and Mike Reiter of Carnegie Mellon University have received a $750,000 National Science Foundation grant to explore methods to automatically diversify software code. The work stems from the belief that in computer networks, just as in nature, species with little variation, or "monocultures," are most vulnerable to epidemics. Computer security specialist Dan Geer lost his job when he published a paper last fall suggesting that monoculture programs such as Microsoft's software are so prevalent that one virus could cause a great amount of damage. Microsoft counters that computers and living organisms are only similar in some ways. Even another major operating system, such as Linux, would not keep out sophisticated hackers, says Microsoft chief security strategist Scott Charney. Last fall, Homeland Security Department CIO Steven Cooper, responding to questioning at a Congressional committee hearing, said the federal government is concerned about its vulnerability to monoculture and as a result would increase its use of Linux and Unix. Geer says, "When in doubt, I think of, `How does nature work?' Which leads you...to think about monoculture, which leads you to think about epidemic. Because the idea of an epidemic is not radically different from what we're talking about with the Internet." Researchers R. Sekar and Daniel DuVarney of the State University of New York at Stony Brook are working to diversify software by targeting the non-functional parts of code, using "benign mutations" to keep software one step ahead of viruses.
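The diversification idea can be illustrated with a toy "benign mutation": each copy of a program randomizes a non-functional detail (here, the size of an internal padding buffer) so that copies differ internally while behaving identically to users. This is a sketch of the general concept only, not the Stony Brook researchers' actual technique.

```python
import random

def make_instance(seed: int):
    """Build one 'copy' of a toy program with a randomized internal layout."""
    rng = random.Random(seed)
    pad = [0] * rng.randint(1, 64)  # non-functional: size varies per copy
    buf = []

    def append(item):
        buf.append(item)
        return len(pad) + len(buf)  # internal offsets differ across copies

    return append, buf

# Two independently "mutated" copies behave identically to their users,
# while their internal pad sizes will generally differ, frustrating an
# exploit tuned to one particular copy's layout.
append_a, buf_a = make_instance(seed=1)
append_b, buf_b = make_instance(seed=2)
append_a("hello")
append_b("hello")
assert buf_a == buf_b == ["hello"]
```

Real-world analogues of this idea randomize memory layout, instruction encoding, or system-call numbering rather than a padding list, but the principle is the same: vary what attacks depend on, preserve what users depend on.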
    Click Here to View Full Article

  • "Search for Tomorrow"
    Washington Post (02/15/04) P. D1; Achenbach, Joel

    Google has established itself as the first Internet search engine to achieve utility-like status, with the service handling more than 200 million queries daily; however, next-generation search engines are likely to make Google seem medieval in comparison. Google's worth as a navigation tool has expanded along with the amount of valuable data on the Internet, and e-book author Seth Godin speculates that 2000 was the year that valuable online content reached a critical mass. Google's emphasis on practicality--getting quick, accurate search results through parallel computing--over bells and whistles is a key factor in its success. Google resembles other search engines in that it employs "crawler" programs that automatically troll the Web, clicking on all possible links; but it also notes how many other Web pages link to any given page, which determines how high specific pages are ranked. Its success in this area points to the next logical evolutionary step for search engines: the emergence of "intelligent agents" that personalize searches by studying individual users' search patterns and habits. IBM's Dave Gruhl calls this process user modeling, in which computers deduce users' interests and preferences by analyzing interactions between people. Godin contends that what people want in an intelligent agent is a "find engine" rather than a search engine, a digital secretary or helper that anticipates the information a person wants. Such a tool will need to better understand information on the Web, rather than base searches on the presence of keywords--and for this to happen, Web sites will need to be tagged with metadata, which is the goal of the Semantic Web project. Furthermore, future search engines must be able to carry out searches on all kinds of digital content--films, music, etc.--not just text documents.
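The link-counting idea described above can be sketched in a few lines: rank pages by how many other pages point to them. Google's actual PageRank goes further--it weighs each inbound link by the rank of the page it comes from--and the link graph here is made up for illustration.

```python
# Made-up link graph: page -> pages it links to.
links = {
    "a": ["c"],
    "b": ["c", "d"],
    "c": ["d"],
    "d": [],
}

# Count inbound links for each page.
inbound = {page: 0 for page in links}
for page, outgoing in links.items():
    for target in outgoing:
        inbound[target] += 1

# Pages that many others link to rank highest; pages nothing links to sink.
ranking = sorted(links, key=lambda page: inbound[page], reverse=True)
assert ranking == ["c", "d", "a", "b"]
```

Treating each link as a vote of confidence is what lets a crawler-built index rank pages without understanding their content--and it is exactly the signal that keyword-only engines lacked.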
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Where to Start to Launch the 'Butterfly Effect'"
    Newswise (02/11/04)

    At the annual meeting of the American Association for the Advancement of Science, Cornell University computer science professor Jon Kleinberg discussed how computer algorithms can be applied to the problem of determining whether a few influential people can induce major changes in the thinking of large populations, in much the same way that the flapping of a butterfly's wings can theoretically trigger a storm halfway around the world. Sociologists working with computer scientists can use programs to crawl the Web and diagram a group's communications links; influential people in such a network could be identified as those who possess the most links to other people, or those who can reach the largest number of others with the fewest "hops" through other people, an approach that Kleinberg said adds redundancy. "After targeting the first few people you discount others, then you look for people who are still influential but in diverse parts of the network," he noted. Kleinberg's research team tried out their algorithm on the network of co-authorship in scientific papers, using Cornell's arXiv database of physics and mathematics publications as their information resource. The study considered links between paper co-authors, and discounted real-world data such as whether two people might be with the same institution. Kleinberg has also been collaborating with MIT Ph.D. student David Liben-Nowell on an investigation of how networks grow over time, focusing on such problems as predicting where new links will form in a network. Again using the arXiv network, the researchers postulated that two people with no connection to each other would probably form a link if they are close to one another in linkage terms; however, they found that the number of hops between people was not the best way to measure nearness. "It's better to look for people who have many different short paths connecting them," Kleinberg explained. Kleinberg's work could be applied to many pursuits, such as marketing new products, predicting disease epidemics, or identifying terrorist leaders.
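Kleinberg's "many different short paths" heuristic can be illustrated on a toy co-authorship graph: score each unlinked pair of authors by the number of distinct two-hop paths (shared co-authors) between them, and predict new links where the score is highest. The graph and names below are invented for illustration.

```python
from itertools import combinations

# Invented co-authorship graph: author -> set of co-authors (symmetric).
coauthors = {
    "alice": {"bob", "carol"},
    "bob":   {"alice", "carol", "dave"},
    "carol": {"alice", "bob", "dave"},
    "dave":  {"bob", "carol", "erin"},
    "erin":  {"dave"},
}

def two_hop_paths(a: str, b: str) -> int:
    """Number of distinct length-2 paths (shared co-authors) between a and b."""
    return len(coauthors[a] & coauthors[b])

# Score every pair that is not yet linked; many short paths -> likely link.
scores = {(a, b): two_hop_paths(a, b)
          for a, b in combinations(coauthors, 2)
          if b not in coauthors[a]}

best = max(scores, key=scores.get)
assert best == ("alice", "dave") and scores[best] == 2
```

Note that alice and erin are also just three hops apart, yet share no co-authors and score zero--exactly the distinction the researchers found between raw hop count and the number of independent short paths.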
    Click Here to View Full Article

  • "Security Still Reigns as Wireless 'Weakest Link'"
    E-Commerce Times (02/17/04); Gallagher, Helen

    Though Amry Junaideen of the Deloitte & Touche Security Services division reports that wireless devices such as laptops and personal digital assistants have made workers more productive, that productivity is offset by the devices' lack of security, which means that information could be compromised if they are stolen, used, or tapped by unauthorized users. He recommends that corporations institute a top-down wireless security framework that covers why the corporation is using wireless, what its business goals are, and what policy governs the entire enterprise in this area. "A policy should require strict adherence to standards and contain specific information on what people should do to protect their devices once wireless has been deployed," Junaideen explains. Devices used to store the most sensitive data should get the highest priority, while critical data files should be encrypted in the event the portable device is lost, even though encryption is an expensive option. Junaideen says protective measures for wireless devices include not just file encryption, but firewalls, virtual private networks, quarantining tools, and data wipe technology. He suggests that users cut wireless connections immediately if a sniffer detects that a device has been compromised, while data wipe software can erase all data from a lost device if someone attempts to exploit it. Network Associates' Sydney Fisher says the security risks of wireless are related to its advantages: "It's important to have appropriate security so data is stored properly, travels properly and is protected from people who shouldn't get it, but [is] accessible to those who do need it." Fisher notes that sniffer products are well suited for wireless environments such as WANs, LANs, or ATM networks.
    Click Here to View Full Article

  • "Spammers Exploit High-Speed Connections"
    Associated Press (02/16/04); Jesdanun, Anick

    Spammers are hijacking home computers with high-speed Internet connections to use as proxy spam relays, and email security companies estimate that between one-third and two-thirds of junk email is sent by "spam zombies" whose owners misconfigure their software or fail to implement or update their PCs' security. Proxy relays could be run from any Internet-connected machine, but most of the malware that installs these proxies targets PCs that run Windows. The shift in spamming tactics spurred the Federal Trade Commission to issue a consumer advisory in January, recommending that consumers employ firewall and antivirus programs as well as check "sent mail" folders for suspicious content. Others advise visiting windowsupdate.microsoft.com regularly to download the latest updates to the Windows operating system. EarthLink's Mary Youngblood explains that ISPs have a lot of trouble detecting and blocking proxy spam relays; some remain open for a short while and vanish by the time ISPs are aware of the problem, while newer, more versatile proxies constantly reconfigure themselves and are harder to lock down. "Fighting Spam for Dummies" co-author John Levine speculates that as proxies spread, ISPs could be forced to restrict the number of messages a subscriber is allowed to send in a given time period. Brightmail chief technology officer Ken Schneider predicts that the situation will only get worse, now that virus writers have an economic incentive to create malevolent code.
    Click Here to View Full Article

  • "Hackers for Hire"
    TechNewsWorld (02/13/04); Germain, Jack M.

    It has been a common practice for companies to hire "White Hat" hackers to test their network security, but some experts are questioning the wisdom of such an approach, especially as new, stronger, and more potentially damaging cybersecurity threats emerge and government regulations about data security and customer privacy increase. Former regional partner for Deloitte & Touche Security Services Group Thomas Patterson compares hiring one-time hackers to putting a fox in a henhouse, and advocates several fundamental rules for cutting risks. "We believe we can achieve the same level of success without sacrificing the trust of our own clients," notes Patterson. "We may go to the hacking conferences and stay up on what's the latest in the hacking community, but it's a fine line. We hire the good guys." Invisus co-founder James Harrison draws a very fine boundary between White Hat and Black Hat hackers, and argues that software security products and certified computer experts offer far more safety, since they engender reliability and trust. On the other hand, security consultant Gary Morse claims there are big differences between good and bad hackers: White Hat hackers, he insists, are veteran programmers with no criminal histories, and they devote more time to writing lengthy documentation on a company's security flaws than actually penetrating networks. He also downplays the threat of email worms and viruses, arguing that hacker threats are far more dangerous.
    Click Here to View Full Article

  • "Congress and Cybersecurity"
    TechNews.com (02/12/04); Krebs, Brian

    In an online discussion of cybersecurity issues, Rep. Adam Putnam (R-Fla.) raised such points as the need for increased awareness of such issues, and the progress both the public and private sectors have made. He acknowledged that there is still a lot to be done in many areas, such as improving awareness, instituting more oversight, and encouraging safe computing practices. In response to a question posed by discussion moderator Brian Krebs, Putnam said that he decided to postpone introducing a bill that would require public companies to confirm their compliance with cybersecurity standards after receiving a great deal of feedback from the private sector indicating that it would give IT security serious consideration, adding that he would vouch for an industry-promoted plan that sets up sound cybersecurity practices, even without a direct federal mandate. He also praised the National Cyber Alert System launched last month, arguing that more than 250,000 visits to the system's official Web site in its first week of operation were a clear sign that "public interest and awareness are high." Responding to an inquirer from Jacksonville, Fla., the congressman acknowledged that 85 percent of U.S. critical infrastructure is controlled by private industry, and explained that his subcommittee is conducting hearings that seek to make Congress more proactive about cybersecurity without hurting innovation. Putnam told an inquirer from Portland, Maine, that he has established a working group to make software companies more responsible for improving defenses against cyberattacks, pointing out that Congress has investigated the possibility of expanding common criteria standards for sensitive defense and intelligence purchases to the software industry. Putnam maintained that making home users aware of safe computing practices is important to both the government and industry, noting that manufacturers and educators have their roles to play. Putnam contended that the White House Office of Management and Budget has improved its IT spending oversight efforts significantly under President Bush's Management Agenda, while more cybersecurity-minded issues are being folded into the National Cyber Security Division of the Homeland Security Department.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Privacy Is in the House"
    Wired News (02/11/04); Singel, Ryan

    The Defense of Privacy Act would require all government agencies to study the privacy impact of new rules before putting them into effect, and would complement the E-Government Act of 2002. The bipartisan congressional coalition that backs it has been trying for three years to get such legislation passed, but two previous bills were never taken up by the Senate despite passing the House of Representatives. Supporters believe that government agencies sometimes try to implement regulations that let the federal government invade citizen privacy, and the bill is intended to make agencies consider privacy when drafting rules instead of afterwards. "One of the best ways to protect privacy is to raise privacy concerns early in the development so those concerns can be addressed and mitigated in advance," says Center for Democracy & Technology executive director James Dempsey. The bill's author, Rep. Steve Chabot (R-Ohio), says he "introduced the bill because a reasonable expectation of privacy is too often a regulatory afterthought." A recent hearing on the legislation also served as an oversight hearing on the activities of Department of Homeland Security chief privacy officer Nuala O'Connor Kelly. The committee and witnesses praised O'Connor Kelly for her work on the privacy impact assessment of the foreign visitor biometric database system, and said that she demonstrates why other agencies should have privacy officers. O'Connor Kelly has worked on the CAPPS II passenger screening system as well as the US-VISIT biometric database system, for which she detailed the system's database structure and outlined potential security risks that could expose personal data.
    Click Here to View Full Article

  • "For Those Who Can't Wait for the Future to Arrive"
    New York Times (02/12/04) P. E5; Vance, Ashlee

    Intel has created a trio of concept PCs to be exhibited at the upcoming Intel Developer Forum: The machines marry imaginative designs to state-of-the-art technology in an effort to "predict, inspire and direct the industry," according to Intel mobile chief technology evangelist Mike Trainor. "With the concept computers, we can get live people to touch these things and see what they really like," notes Anand Chandrasekher of Intel's mobile platforms group. All three concept laptops, which are collectively known as the Florence line, support peripherals with Bluetooth wireless technology and use smart-card and fingerprint readers. Two of the Florence systems are designed for "data fiend" business users, and boast Extended Mobile Access technology. Both 15-inch laptops are equipped with a small liquid crystal display (LCD) on their casings that posts emails and notes whenever a wireless network is within range, allowing users to stay apprised of valuable data without consuming a lot of time or battery power; the secondary LCD consumes 50 percent less power than a fully operational notebook, and Intel expects to reduce that consumption by 40 percent. "The belief is that people can close their laptops during meetings and just sort of glance down at the secondary display to see if urgent notes come across," Trainor explains. Intel expects the secondary LCD to support MP3 players and instant-messaging software in a few years, as well as connections to Wi-Fi and wireless carriers' data networks. The third Florence concept PC is designed for home use and more garden-variety consumers; its features include a 17-inch display, a video camera, a microphone, and a wireless keyboard outfitted with a removable phone and a detachable remote control. Users can employ the handset or the microphone/camera to make calls via VoIP technology.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Pandora's Box for Open Source"
    CNet (02/12/04); LaMonica, Martin

    Powerful tech companies are undergoing a shift in sales strategies as open-source software increasingly impinges on traditional markets, and many software makers are adopting open source just to keep abreast of industry trends. "This is a complex dynamic, because on the one hand, you need commercial support for [open-source products], but on the other hand, you have this phenomenon of wanting to resist, if you're a commercial provider," notes Forrester Research analyst Ted Schadler. Novell, Sun Microsystems, IBM, and the other leading software firms have embraced open source to some degree, often selling proprietary software and promoting open-source services at the same time; IBM, for instance, says that its open-source initiative helps advance industry standards that help harmonize the company's various products. They further argue that open source not only serves as a valuable tool against Microsoft's monopolization, but spells out where software providers should concentrate their efforts. However, as open-source products proliferate and extend their reach into other services, IBM and other open-source proponents could find their proprietary business eroding. Meanwhile, several smaller companies, such as JBoss and MySQL, sell commercial licenses around open-source software and provide support and other services. MySQL CEO Marten Mickos reports that open-source software generates more demand, which carries benefits for all tech companies; by taking the place of commodity goods, open source gives companies more room to focus on high-end offerings. "Ten years from now, we will look back and say, 'What did we do before open source?'" Mickos declares.
    Click Here to View Full Article

  • "Quantum Cryptography: Security Through Uncertainty"
    Network Magazine (02/04) Vol. 19, No. 2, P. 41; Dornan, Andy

    Start-ups MagiQ Technologies and ID Quantique announced quantum cryptography hardware late last year, but most enterprise networks will not be able to take advantage of the technology. However, the continued development of quantum cryptography over the next few years is expected to make the advancement more beneficial to enterprise networks. Quantum cryptography exploits entangled photons--quantum objects that behave as if they are in two places at once--to create the same random numbers in two locations, enabling the two identical sets of random numbers to be used as symmetric encryption keys or one-time pads. The problem of creating and distributing encryption keys would be solved because the keys would never be reused. Nonetheless, dedicated fiber cable is needed for quantum key distribution through a network, and fully optical switches for multiplexing entangled photons with ordinary data remain a few years away. Moreover, repeaters cannot be used, prompting MagiQ to experiment with Free Space Optics lasers to send photons through a wireless link. Existing key distribution systems are unable to distribute a one-time pad, which makes them susceptible to outright mathematical attacks. A quantum computer could break encryption that reuses keys, but a working quantum computer is still decades away.
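The one-time-pad step that quantum key distribution would feed is itself ordinary classical code: XOR the message with a shared random key of at least the same length, and never reuse the key. The sketch below assumes the key has already been shared; `secrets.token_bytes` merely stands in for a quantum-distributed key.

```python
import secrets

def otp(data: bytes, key: bytes) -> bytes:
    """XOR data with a key of at least equal length (a one-time pad)."""
    assert len(key) >= len(data), "pad must be at least message length"
    return bytes(d ^ k for d, k in zip(data, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))  # stands in for a quantum-shared key

ciphertext = otp(message, key)           # applying the pad encrypts...
assert otp(ciphertext, key) == message   # ...and applying it again decrypts
# Security rests entirely on the key being truly random and never reused --
# which is exactly the distribution problem quantum cryptography addresses.
```

With a fresh random pad per message the ciphertext is information-theoretically secure; reuse the pad, and XORing two ciphertexts together cancels the key and leaks the messages, which is why key distribution, not the cipher, is the hard part.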
    Click Here to View Full Article

  • "'Smart Dust' Is Way Cool"
    U.S. News & World Report (02/16/04) Vol. 136, No. 6, P. 56; Schmidt, Karen F.

    "Smart dust" has enabled field biologist John Anderson of the College of the Atlantic in Bar Harbor to collect information on thousands of Leach's storm petrels hunkered down in burrows on Great Duck Island, 12 miles off the coast of Maine. Anderson and his team of researchers use a network of small, wireless sensors to monitor the seabirds, which emerge only at night, as well as temperature, humidity, and barometric pressure. The sensors are battery-powered devices about the size of matchboxes that transmit data over a radio connection to a solar-powered base station on the island, and from there to the Internet. Tiny, intelligent sensing devices, or motes, are poised to take off as ubiquitous information collectors that can be used in hard-to-access places and moved around at will, and some experts even envision such networks acting as a kind of Internet merged with the physical world, in which users could query buildings, roads, and rivers for information online. Nonetheless, the technology must become smaller, cheaper to make, more dependable, and more energy-efficient. There are also concerns about the technology having a dark side. "It's a very intrusive technology and could be abused," says John Cozzens, the National Science Foundation's technical coordinator for the Center for Embedded Networked Sensing at the University of California-Los Angeles. Moreover, observers say dealing with the amount of information intelligent sensing devices can collect is the foremost challenge of the technology. Feng Zhao, manager of the Embedded Collaborative Computing Area at the Palo Alto Research Center, is working to design motes that can home in on the most relevant data, while others are working on mote security and accuracy.
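    Zhao's goal of motes that report only the most relevant data is often pursued with simple in-network filtering, since radio transmission is the scarcest use of a mote's battery. The sketch below is a hypothetical illustration of one common approach--threshold-based reporting--and is not PARC's actual design; the sensor readings and threshold value are invented for the example:

```python
import random

def mote_readings(n):
    """Simulated temperature samples from a mote's sensor (invented data)."""
    temp = 12.0
    for _ in range(n):
        temp += random.uniform(-0.2, 0.2)  # small drift between samples
        yield temp

def filtered_transmissions(readings, threshold=0.5):
    """Yield a reading for transmission to the base station only when it
    differs from the last transmitted value by more than the threshold,
    so the radio stays quiet while conditions are stable."""
    last_sent = None
    for r in readings:
        if last_sent is None or abs(r - last_sent) > threshold:
            last_sent = r
            yield r

# Most samples are suppressed; only significant changes are radioed out.
sent = list(filtered_transmissions(mote_readings(1000)))
```

    The trade-off is between radio energy and fidelity: a larger threshold extends battery life but coarsens the record the base station receives.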
    Click Here to View Full Article

  • "Mind Over Machine"
    Popular Science (02/04); Zimmer, Carl

    The Defense Advanced Research Projects Agency (DARPA) has invested heavily in brain-machine interface research, whose promised benefits include thought-controlled robots for military operations and mentally directed artificial limbs for paralysis victims. Such projects build upon research that established that the electrical impulses of neurons are similar to the on-off digital code of computers, which led to the theory that cracking the code could yield machines operated by mental command. Miguel Nicolelis of Duke University and John Chapin of the State University of New York Downstate Health Science Center conducted pioneering research showing that analyzing the signals from a relatively small group of neurons via implanted electrodes produces enough data to recognize many different mental commands, removing a major impediment to brain-machine interface development. At Duke, Nicolelis implanted electrodes into primates and trained them to operate a robot arm by watching a cursor on a computer screen and making it move with a joystick; through the electrode link, the computer learned to decode the animals' brain patterns and translate them into cursor movements, so the monkeys could be weaned off the joystick and taught to control the prosthetic limb by thought alone. The Duke researchers must increase the system's portability and unobtrusiveness to make it practical for human quadriplegics. They believe this can be achieved with implanted neural electrodes attached to a processor that wirelessly transmits thoughts to a portable computer, which in turn relays those commands to the prosthesis; Chapin is working on a force-feedback system to help users control limbs more precisely. DARPA, which funds the Duke research, is also supporting a University of Michigan project on mind-controlled robots in the hope that such devices can be used for reconnaissance missions in dangerous environments. Former director of DARPA's Brain-Machine Interface Program Alan Rudolph notes that a viable interface for soldiers will require a noninvasive method for reading brain inputs.
    Click Here to View Full Article