Welcome to the October 5, 2009 edition of ACM TechNews, providing timely information for IT professionals three times a week.
HEADLINES AT A GLANCE
Bird-Watching Turns to Technology
BBC News (10/05/09) Ward, Mark
University of Lincoln computer scientist Patrick Dickinson is working on a 14-month research project to track the actions of guillemots and other seabirds through computer surveillance. As a species, guillemots are good indicators of any problems with their natural habitats. He says the project is challenging because the scenery is so changeable. Dickinson and fellow researchers will first work with a model of the environment surrounding a guillemot nest before venturing to Skomer Island where the birds reside. A closed-circuit TV system will track the time guillemots spend with mates, with their young, and looking for food. Computational ecologist Robin Freeman, who works for Oxford University and Microsoft Research, also is contributing to the project. Freeman and colleagues hid tubes with identification tag readers and weighing scales inside the homes of nesting pairs of Manx shearwaters. The devices, which weigh the fish the birds consume and gather other information, are linked to a wireless network that records all the data. "Computer monitoring has a huge potential in terms of streamlining data collection," says Sheffield professor Tim Birkhead. "Not just counting birds, but providing other kinds of biological information too. They could provide us with an early warning system."
Nanotech Could Make Humans Immortal by 2040, Futurist Says
Computerworld (10/01/09) Gaudin, Sharon
Ray Kurzweil says that humans who are alive in 2040 or 2050 could achieve near-immortality thanks to advances in nanotechnology that bring humans and machines closer together. Kurzweil says research is well underway on using nanobots to repair organs and cells as well as attack cancerous tumors. By inhabiting and eventually replacing blood, nanobots could heal wounds almost immediately. He also says that people will be able to regrow missing limbs and, after a head injury, regain memories and personalities by accessing a backup file on a computer. Nanotechnology also could make conditions such as Alzheimer's disease, cancer, obesity, and diabetes cease to exist. "The full realization of nanobots will basically eliminate biological disease and aging," Kurzweil says. "The nanobots will scout out organs and cells that need repairs and simply fix them. It will lead to profound extensions of our health and longevity." Massachusetts Institute of Technology researchers, for example, found that nanoparticles eliminated ovarian cancer in mice. University of London researchers used nanoparticles to create cancer-fighting genes that wiped out tumors while leaving normal cells alone. Kurzweil predicts that from 2024 onwards humans will add a year to their lives annually. But he stresses that nanotechnology carries risks as well as benefits, such as a self-replicating nanobot that becomes a non-biological plague. "Technology is not a utopia," he says. "It's a double-edged sword and always has been since we first had fire."
Aid Agencies Turn to Open-Source Software
New Scientist (10/03/09) Campbell, MacGregor
Wesleyan University and Trinity College students have developed Collabbit, software that acts as a virtual emergency response center. Collabbit serves as a central repository for information, using RSS or text messages to send project updates to response workers. The students built a prototype system in three weeks as part of the Humanitarian Free and Open Source Software (HFOSS) project, which developed applications in response to the Asian tsunami. The American Red Cross, the Salvation Army, and Catholic Charities USA, which saw potential in the project, also contributed to the design. The relief agencies recently tested Collabbit in a drill simulating their response to a hypothetical hurricane strike on a major city, and were pleased with its performance. "Not only did it work, but it demonstrated to those who participated the value of the tool," says John Berglund, a coordinator with the New York City Salvation Army. HFOSS project director Trishan de Lanerolle says developing the program provided the computer science students with a meaningful outlet for their work. "The work they are doing is something that has an impact," he says. "It's not just a classroom exercise where you write your program and then delete it the next day."
New Digital Security Program Doesn't Protect as Promised
University of Texas at Austin (09/29/09)
The Vanish security system has been broken by a team of researchers from the University of Texas at Austin, Princeton University, and the University of Michigan. Developed by scientists at the University of Washington, Vanish is designed to protect a computer user's data by restricting the availability of the encryption key used to access it after a certain amount of time, such as eight hours. Vanish splits the key into many small pieces and stores them at many different places on the network; once enough pieces expire, the encrypted data becomes unreadable digital gibberish. However, the team has developed Unvanish, a program that collects and stores anything on the network that looks like a fragment of a Vanish key, then searches its archive of fragments for the pieces needed to decrypt a message. The researchers say Unvanish can make messages reappear long after they should have disappeared, close to 100 percent of the time. "A true self-destruction feature continues to be challenging to provide," says Texas professor Brent Waters. Texas professor Emmett Witchel says that "our goal with Unvanish is to discourage people from relying on the privacy of a system that is not actually private."
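The abstract describes Vanish only at a high level. Vanish itself uses Shamir threshold secret sharing over a distributed hash table, so only a subset of the pieces is needed for recovery; the core idea of splitting a key so that no single location holds it can be illustrated with a simpler all-or-nothing XOR split. The function names and the share count below are illustrative, not Vanish's actual code:

```python
import os

def split_key(key: bytes, n: int) -> list:
    """Split a key into n random-looking shares; all n are
    required to reconstruct it (a simplified stand-in for the
    threshold sharing Vanish actually uses)."""
    shares = [os.urandom(len(key)) for _ in range(n - 1)]
    final = key
    for s in shares:                     # final = key XOR r1 XOR ... XOR r(n-1)
        final = bytes(a ^ b for a, b in zip(final, s))
    shares.append(final)
    return shares

def recombine(shares: list) -> bytes:
    """XOR all shares together; the random shares cancel out,
    leaving the original key."""
    key = shares[0]
    for s in shares[1:]:
        key = bytes(a ^ b for a, b in zip(key, s))
    return key

key = os.urandom(16)
shares = split_key(key, 8)
assert recombine(shares) == key
```

In this all-or-nothing variant, losing any single share destroys the key, which mirrors the "pieces expire and the data becomes unrecoverable" behavior; Unvanish's attack amounts to archiving the shares before they expire.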
Researchers Hijack a Drive-By Botnet
Technology Review (10/02/09) Lemos, Robert
SDSC Part of $15 Million Project to Create 'FutureGrid' Computer Network
UCSD News (09/29/09) Zverina, Jan
The U.S. National Science Foundation has selected the San Diego Supercomputing Center (SDSC) to be part of a team that will construct and operate an experimental high-performance testbed on which researchers can collaboratively devise and test new parallel, grid, and cloud computing approaches. The four-year FutureGrid project received a $10.1 million NSF grant to connect nine computational resources at a half-dozen partner sites across the United States and permit transatlantic collaboration through an alliance with the Grid'5000 computer infrastructure. FutureGrid project partners will contribute an additional $5 million in funding. FutureGrid will comprise approximately 1,400 central processing units, and projects that use the cluster will have vast data processing capabilities. "Researchers will be able to test new approaches to data analysis and computation on a wide range of customizable FutureGrid environments made possible by leveraging cloud computing technologies," says SDSC principal investigator Shava Smallen. SDSC scientists will contribute benchmarking expertise and will implement and enhance the Inca monitoring software, which is designed to spot problems with grid infrastructure by periodically running user-level tests of grid software and services. Linkage of computing and data resources will be facilitated via advanced research and education networks such as Internet2 and the National Lambda Rail. FutureGrid project leader and Indiana University professor Geoffrey Fox says future grids and clouds are visualized as many connected systems rather than one system.
Unix at 40: Hanging on Despite Strong Linux, Windows Challenges
InfoWorld (09/29/09) Krill, Paul
After four decades, the Unix platform is still very important to enterprise information technology and has many years of usefulness left to it, even though Linux and Windows Server have outpaced Unix in terms of sales volume. Hewlett-Packard's (HP's) Satya Scharma lauds Unix's deep integration and higher quality of service. "Unix really has been for what we call the much more demanding kinds of workloads, where you're looking at needing to have data warehouses which go to tens of terabytes," says HP's Brian Cox. He also notes that Unix is preferred by banks, manufacturers, and telecoms running millions of transactions per minute, because Linux and Windows Server do not have the uptime levels such tasks require. "[Unix is] really at the core of every one of our enterprise systems," says the University of Pittsburgh Medical Center's Paul Sikora. He says Unix delivers mature redundancy and clustering capabilities, and software vendors are amenable to the idea of their software operating on Oracle and Unix. Scharma estimates that the full Unix market for all vendors is still worth $18 billion annually for business worldwide. Still, that has not stopped HP and IBM from deeply embracing Linux and Windows, and Gartner analyst George Weiss anticipates that Linux will leverage hardware features such as better internal multitasking and multiprocessing.
Mobile Microbloggers Struggle to Make Their Postings Interesting, Study Finds
Helsinki University of Technology (09/28/09)
Mobile microbloggers are struggling to attract followers, according to researchers from the Helsinki Institute for Information Technology (HIIT), Google, and Elisa, who analyzed 400,000 messages from the popular microblogging service Jaiku. Postings about everyday experiences need to be interesting, but microbloggers tend to avoid revealing intimate emotions or experiences due to the public nature of the microblogging service, according to researchers Antti Oulasvirta, Esko Lehtonen, Esko Kurvinen, and Mika Raento. Comments are rare, and posts are usually about "work," "home," "lunch," and "sleeping." On Jaiku, a small number of users receive more than 50 percent of all comments. Some microbloggers use teasers for their postings and discuss the more interesting aspects of their daily experiences. However, a large number of new users stopped posting soon after registering because they were unable to build an audience.
After Losing Users in Catalogs, Libraries Find Better Search Software
Chronicle of Higher Education (09/28/09) Parry, Marc
People are worried that library catalogs are in danger of marginalization because they present information online in a clunky, non-intuitive way that often frustrates searchers. To combat this trend, a growing number of universities are investing in software designed to improve collection exploration through faceted searching, or Web-scale index searching. A number of other colleges are developing open source software designed to meet the same challenges without any licensing fees. These products rank search results by relevance and support prompts such as "did you mean...". Vanderbilt University's Marshall Breeding points to resistance in the library community to the concept of Web-scale index searching, noting that some consider such interfaces to constitute a diminishing of library catalogs' intelligence. He says that traditional search tools bolster the concept that library users must clearly understand the various research materials. The new interfaces share the ability to derive content from catalogs and blend it with other data in a modern interface. Library community members such as the University of Virginia's Bess Sadler see the migration from commercial library search interfaces to open source interfaces as a "shift of power" in which libraries control the technology their patrons use to access their collections, which is particularly important in terms of customization.
Google Works on a Different Web
Science News (09/26/09) Vol. 176, No. 7, P. 10; Milius, Susan
The Google search engine has inspired a new algorithm that can predict which species losses will spark the fastest implosion of a food web, according to the University of Chicago's Stefano Allesina. "The problem of how ecosystems are likely to respond to the loss of species is quite important, particularly in light of how many different ways human activities are resulting in the local extinctions of populations," says the Santa Fe Institute's Jennifer Dunne. The algorithm's functionality is similar to Google's Web-page-ranking tool, PageRank. PageRank quantifies a page's value to searchers depending on the importance of the pages that link to it. However, the Google ranking system assumes that any page might lead to any other page--a concept that is not applicable to food webs. Allesina and Mercedes Pascual of the University of Michigan in Ann Arbor drew connections only between predator and prey. The researchers compared the new algorithm to others by using information from real-world food webs. The algorithm matched the results of the standard-bearing genetic algorithm without being as computationally intensive.
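The abstract does not give the food-web algorithm's details, but the PageRank idea it adapts can be sketched with a minimal power-iteration implementation. The miniature graph, damping factor, and iteration count below are illustrative assumptions; the Allesina-Pascual variant restricts edges to predator-prey links and modifies the model accordingly:

```python
def pagerank(links, damping=0.85, iters=50):
    """Minimal PageRank by power iteration.

    links: dict mapping each node to the list of nodes it links to.
    Returns a dict of node -> rank score (scores sum to ~1).
    """
    nodes = list(links)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        # Every node gets a baseline share (the "random jump").
        new = {v: (1.0 - damping) / n for v in nodes}
        for v, outs in links.items():
            if outs:
                share = damping * rank[v] / len(outs)
                for w in outs:          # a node passes rank to its targets
                    new[w] += share
            else:                       # dangling node: spread rank evenly
                for w in nodes:
                    new[w] += damping * rank[v] / n
        rank = new
    return rank

# Hypothetical miniature web: pages linking to one another.
web = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
ranks = pagerank(web)
```

Here "c" ends up ranked highest because three pages link to it, which is the "importance flows along links" notion the food-web version reuses, with prey supporting the predators that depend on them.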
Virginia Tech College of Engineering's Improved Robotic Hand Captures Top Award
Virginia Tech News (09/24/09) Mackay, Steven
A team of five undergraduates from the Virginia Tech College of Engineering has placed first in a competition run by the American Society of Mechanical Engineers for RAPHaEL 2, a robotic hand that runs on compressed air. Rather than using a clunky motor, the hand relies on air pressure to grip objects. Varying levels of pressure can be applied depending on the item's shape and weight. The hand also uses a flexible tube actuator. Supported by the Robotics and Mechanisms Laboratory, the students improved upon the original RAPHaEL by adding a closed-loop control mechanism to its design as well as sensors for force feedback and automatic positioning of the fingers. Students also replaced the first hand's acrylic material with stronger polycarbonate. "This gives us a lot more control over the kinds of things we can do with the hand," says student Kyle Cothern. "Eventually, we might be able to tell how soft an object you're grabbing is just by touching it." In the future, students hope to design the hand to pick up small objects in motion and to use lighter materials to make it look and feel more human. Cothern says that RAPHaEL 2 would be an ideal prosthetic hand because it is simple to use and its fingers are replaceable. Eventually, Virginia Tech researchers hope that RAPHaEL 2 will form part of the Cognitive Humanoid Robot with Learning Intelligence robot.
The Reality of Robot Surrogates
IEEE Spectrum (09/09) Corley, Anne-Marie
Telepresence technology has a long way to go before people can perform everyday duties with robotic avatars. Present-day telepresence interfaces consist of joysticks, wireless Internet links, video cameras, and audio through which people control robots. A NextGen Research study anticipates a booming market for domestic telepresence systems in the next five years. Telepresence is the most sensible choice for security and surveillance robots, according to study project manager Larry Fisher. Some contemporary telepresence machines are designed for functions outside of personal use, such as a robot that is controlled by a remote doctor via a joystick and secure Internet link to treat patients with stroke symptoms in 31 Michigan hospitals. Science fiction movies often portray robotic control through a direct connection to the human brain, and there is research underway to create and perfect brain-machine interfaces so that people can control robots by thought. Neuroscientists learned in the late 1990s that implanting electrodes in animals' brains and connecting them to computers made it possible to decode their intent. University of Florida researchers Jose Principe and Justin Sanchez are working on an interface that uses the motor cortex to trigger actions not related to movement.
Abstract News © Copyright 2009 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: [email protected]
Change your Email Address for TechNews (log into myACM)