Association for Computing Machinery
Welcome to the July 22, 2009 edition of ACM TechNews, providing timely information for IT professionals three times a week.

HEADLINES AT A GLANCE


Seeking Advice on Women in Science
Inside Higher Ed (07/22/09) Eisen, Ben

A hearing of the U.S. House Committee on Science and Technology's Subcommittee on Research and Science Education concluded that while the scientific and engineering community is far more diverse than it was 20 years ago, there are still major gender gaps in the field. The under-representation of women in science, technology, engineering, and math (STEM) fields can be seen in the proportions of degrees granted to each gender. In 2006, women received 58 percent of all bachelor's degrees, but only 20 percent of computer science bachelor's degrees, 21 percent of physics degrees, and 20 percent of engineering degrees, the National Science Foundation (NSF) reports. Nevertheless, the NSF also reports that women hold more than half of science and technology degrees, with women earning 77 percent of psychology degrees, 62 percent of biological sciences degrees, and 54 percent of social sciences degrees. Catholic University professor Sandra Hanson says the culture of science is often associated with white men. She says that when children are asked to draw pictures of scientists, they usually draw white males, and when they do draw women they often look "severe and unhappy." Nearly 70 percent of all fourth graders report liking science, but by eighth grade male students report liking STEM fields twice as much as female students. American Association for the Advancement of Science CEO Alan Leshner says that role models may already be an effective method for eliminating gender gaps, as can be seen in biological sciences, where the number of female role models is far greater than in other STEM fields.


Researchers Train Minds to Move Matter
New York Times (07/21/09) P. D6; Blakeslee, Sandra

New research into brain-machine interaction indicates that a basic rethinking of brain-machine experiments may be in order. Previous studies gave the computer interfaces that translate thought impulses into movements fresh instructions on a daily basis, but work by University of California, Berkeley scientists shows that monkeys learned to move a cursor by thought using just one set of instructions and an atypically small group of brain cells that delivered the instructions for executing movements the same way every day. Electrodes implanted directly into the animals' brains record activity from a group of 75 to 100 neurons that help guide movement when the monkeys move a hand or arm. The limb is later rendered immobile, and researchers predict the monkey's intent by studying the neurons' activity, sending the pattern to a decoding algorithm that converts the brain signals into commands a machine can carry out. The variability caused by shifts in the electrodes and changes in the neurons had led researchers to assume that a new group of neurons would govern the movements every day, prompting daily recalibration of the decoder and forcing the animal to relearn the task in each session. Berkeley professor Jose M. Carmena postulated that an initially random population of 10 to 15 neurons could be coaxed into forming a stable motor memory. His team trained a pair of monkeys to move a cursor to targets with a joystick, and then taught them to move the cursor by thought during several weeks of practice sessions. The decoder was kept constant as it measured the monkeys' neuron activity; when Carmena then changed the decoder, the animals used the same small group of neurons to learn the new task in just a few days, and they had no difficulty switching back and forth between the two tasks. "This is the first demonstration that the brain can form a motor memory to control a disembodied device in a way that mirrors how it controls its own body," Carmena says.
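The architectural point is that the decoder stays fixed while the brain adapts to it, rather than the decoder being refit every day. Below is a minimal sketch of that arrangement, assuming a simple linear mapping from firing rates to cursor velocity; the article does not describe the Berkeley decoder itself, so the weights, dimensions, and simulated activity are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

N_NEURONS = 15   # small, fixed population, like the 10-to-15-neuron groups described
DT = 0.1         # seconds per decoding step

# Fixed decoder weights: once set, they are NOT recalibrated from day to day.
W = rng.normal(scale=0.05, size=(2, N_NEURONS))   # maps firing rates -> (vx, vy)

def decode_velocity(firing_rates):
    """Convert a vector of firing rates (spikes/s) into a 2-D cursor velocity."""
    return W @ firing_rates

# Simulated session: the same weights are reused step after step, so any
# improvement in cursor control must come from the brain adapting to the decoder.
cursor = np.zeros(2)
for _ in range(50):
    rates = rng.poisson(lam=20, size=N_NEURONS)   # stand-in for recorded activity
    cursor += decode_velocity(rates) * DT
print("final cursor position:", cursor)
```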


Federal CTO Says U.S. Lagging in Innovation
InformationWeek (07/21/09) Hoover, J. Nicholas

The United States must capitalize on emerging opportunities in order to regain the top spot in innovation, federal chief technology officer Aneesh Chopra said in a keynote address to the recent Open Government and Innovations Conference in Washington, D.C. The Obama administration is investing in technologies such as healthcare information technology and the smart grid, and these investments offer the United States an opportunity to take the lead in these areas, he said. "Driving game-changing innovation" will be a focus for Chopra, who added that open data standards, investment in research and development, and preparing the workforce for the jobs of the future will help spur innovation. Chopra cited a report from the Information Technology and Innovation Foundation that ranks the United States last among 36 countries on technology metrics such as broadband activity, research and development tax credits, and immigration policy. "We have failed to translate the power and potential in our nation's capacity to compete in a more globally competitive marketplace," he said.


Japan's Next-Generation Supercomputer Configuration Is Decided
HPC Wire (07/20/09)

Japan's Institute of Physical and Chemical Research (RIKEN) and Fujitsu will adopt a new system configuration with a scalar processing architecture for the country's next-generation supercomputer. The goal of the supercomputer project, backed by Japan's Ministry of Education, Culture, Sports, Science, and Technology, is to develop a supercomputer capable of reaching 10 petaflops of performance by 2012. The initial plan was to create a hybrid system featuring both scalar and vector units, but NEC, which was responsible for developing the vector units, recently told RIKEN that it would be unable to participate in the project's production stage, essentially eliminating the possibility of a hybrid system. RIKEN has decided to pursue a scalar configuration, maintaining its original goal of achieving a LINPACK performance of 10 petaflops. The supercomputer will use 128-gigaflop CPUs developed and manufactured by Fujitsu using 45nm process technology. The system's configuration will provide both energy efficiency and massive parallel computing capabilities. RIKEN also plans to work with organizations responsible for promoting supercomputer utilization to provide users of vector units with sufficient support.
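As a rough back-of-the-envelope check (not a figure from the article), the stated per-CPU and system-level performance numbers imply a processor count on the order of tens of thousands:

```python
# Hypothetical sanity check: how many 128-gigaflop CPUs would a 10-petaflops
# system need at 100 percent efficiency? Real configurations would differ once
# interconnect, memory, and LINPACK efficiency are taken into account.
target_flops = 10e15        # 10 petaflops
cpu_flops = 128e9           # 128 gigaflops per Fujitsu CPU
print(int(target_flops / cpu_flops))   # -> 78125 CPUs as a lower bound
```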


Yale Researchers Create Database-Hadoop Hybrid
Computerworld (07/21/09) Lai, Eric

Yale University professor Daniel J. Abadi has led the development of HadoopDB, an open source parallel database management system (DBMS) that combines the data-processing capabilities of a relational database with the scalability of newer technologies such as Hadoop and MapReduce. HadoopDB was developed using components from PostgreSQL, the Apache Hadoop data-sorting technology, and Hive, the internal Hadoop project launched by Facebook. Queries can be submitted to HadoopDB either as MapReduce jobs or in SQL. Abadi says data processing is done partly in Hadoop and partly in "different PostgreSQL instances" spread out over several nodes in a shared-nothing cluster of machines. He says that unlike previously developed DBMS projects, HadoopDB is a hybrid not only at the language/interface level, but also at the systems implementation level. Abadi says HadoopDB combines the best of both approaches: it achieves the fault tolerance of massively parallel data infrastructures such as MapReduce, in which a single server failure has little effect on the overall grid, yet it can perform complex analyses almost as quickly as existing commercial parallel databases. He says that as databases continue to grow, systems such as HadoopDB will "scale much better than parallel databases."
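A toy sketch of the split described above: the same aggregate query is pushed down to each node's local database engine, and a MapReduce-style reduce step merges the partial results. This is not HadoopDB's actual code; SQLite in-memory databases stand in for the per-node PostgreSQL instances purely so the example is self-contained.

```python
import sqlite3
from collections import defaultdict

def make_partition(rows):
    """Create one node's local database holding a shard of the table."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE visits (page TEXT, hits INTEGER)")
    conn.executemany("INSERT INTO visits VALUES (?, ?)", rows)
    return conn

# Three shared-nothing "nodes", each with its own shard of the data.
nodes = [
    make_partition([("home", 10), ("docs", 3)]),
    make_partition([("home", 7), ("blog", 5)]),
    make_partition([("docs", 2), ("blog", 1)]),
]

# Map phase: push the SQL down to each node's local database engine.
partials = [
    node.execute("SELECT page, SUM(hits) FROM visits GROUP BY page").fetchall()
    for node in nodes
]

# Reduce phase: merge the per-node aggregates into a global answer.
totals = defaultdict(int)
for partial in partials:
    for page, hits in partial:
        totals[page] += hits

print(dict(totals))   # {'home': 17, 'docs': 5, 'blog': 6}
```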


Smart Clothes for Better Healthcare
ICT Results (07/20/09)

The European HealthWear project has embedded sensing devices directly into textiles, creating clothing capable of monitoring the wearer's heart, breathing, and body temperature. "Remote monitoring is ideally suited to patients suffering from chronic diseases or recovering from an incident, such as a heart attack, who would otherwise have to spend longer [periods of time] in a hospital or visit their doctor more frequently for checkups," says project supervisor Theodore Vontetsianos. "By embedding the sensors in a vest that patients feel comfortable wearing, and requiring only a mobile phone-sized device to gather and transmit the information, the system empowers patients to be more active and independent while letting caregivers check on them at any time or in any place as necessary." The HealthWear system collects information from numerous sensors using a single device called the Portable Patient Unit (PPU). The embedded sensors include a six-lead electrocardiograph; respiration-movement, pulse-rate, and skin-temperature monitors; and an external oximeter to measure blood oxygen saturation. A three-dimensional accelerometer inside the PPU measures body position. The collected information is stored in the wearer's electronic health record and can be accessed over a secure Internet connection. The HealthWear developers say the PPU could eventually be replaced with a normal mobile phone.
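A purely hypothetical sketch of what one PPU snapshot might look like as a structured record; the HealthWear project's actual data formats and field names are not described in the article, so everything below is an illustrative assumption.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class PPUReading:
    """One snapshot the Portable Patient Unit gathers from the vest sensors."""
    timestamp: datetime
    ecg_leads_mv: list[float]                   # six-lead ECG samples, millivolts
    respiration_rate_bpm: float                 # breaths per minute
    pulse_bpm: float                            # beats per minute
    skin_temp_c: float                          # degrees Celsius
    spo2_percent: float                         # blood oxygen saturation (oximeter)
    body_position: tuple[float, float, float]   # 3-D accelerometer axes

reading = PPUReading(
    timestamp=datetime.now(timezone.utc),
    ecg_leads_mv=[0.12, 0.10, 0.09, 0.11, 0.08, 0.10],
    respiration_rate_bpm=16.0,
    pulse_bpm=72.0,
    skin_temp_c=36.6,
    spo2_percent=97.5,
    body_position=(0.0, 0.0, 1.0),
)
print(reading)
```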


Moore's Law Hits Economic Limits
Financial Times (07/21/09) P. 15; Nuttall, Chris

Much attention has been given to the approaching scientific limit to chip miniaturization and the continuation of Moore's Law, but an economic limit is nearing faster than the predicted scientific limit, according to some experts. "The high cost of semiconductor manufacturing equipment is making continued chipmaking advancements too expensive to use for volume production, relegating Moore's Law to the laboratory and altering the fundamental economics of the industry," says iSuppli's Len Jelinek. He predicts that Moore's Law will no longer drive volume chip production after 2014 because circuitry widths will dip to 20 nanometers or below by that date, and the tools to make those circuits will be too expensive for companies to recover their costs over the lifetime of production. The costs and risks associated with building new fabrication systems have already forced many producers of logic chips toward a fabless chip model, in which they outsource much of their production to chip foundries in Asia. At the 90nm level there were 14 chipmakers involved in fabrication, but at the 45nm level that number has been reduced to nine, and only two of them, Intel and Samsung, have plans to create 22nm factories. However, Intel's Andy Bryant says that as long as demand is maintained by consumers and businesses looking for the most advanced technology, Moore's Law, and the major investments it requires, will continue to make economic sense.


Metro Discovers Problems in Additional Track Circuits
Washington Post (07/22/09) P. A1; Sun, Lena H.; Layton, Lyndsey

Washington, D.C., Metro officials and documents suggest that the technological malfunction behind the fatal train collision in June may be widespread, as indicated by subsequent malfunctions in at least six other track circuits discovered by the transit agency. The Metro system depends on track circuits to maintain a safe distance between trains: the circuits use audio frequencies transmitted between the trains and the rails to detect trains and automatically send signals to the next train down the line. If the following train gets too close, the system transmits a signal that forces it to halt. Records show that the malfunctioning track circuits have failed to properly detect the presence of trains in recent weeks, and the Metro system cannot operate safely if the location of trains is not known at all times. Metro rail chief Dave Kubicek says officials do not yet know the cause of the problems, and he suggests that track maintenance might be playing a role in some of the malfunctions. Some of the track circuits were deactivated last week after the agency stepped up reviews of recorded track circuit data, performing them after each rush period. Since the crash, the agency has increased its computerized checks of track circuitry from once a month to twice a day, and Metro calls the reports "an added layer of inspection" beyond the physical inspections carried out earlier. On July 13, the National Transportation Safety Board categorized Metro's train protection system as inadequate and exhorted the transit agency to deploy a real-time backup.
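The safety principle at stake can be illustrated with a simplified, hypothetical model of the block logic the article describes; this is not Metro's actual train-control code, only a sketch of why a circuit that loses detection removes the protection.

```python
def command_for_following_train(blocks_ahead_occupied):
    """Return 'STOP' if any monitored block ahead reports a train."""
    return "STOP" if any(blocks_ahead_occupied) else "PROCEED"

# Healthy circuits: the train ahead is detected, so the follower is halted.
print(command_for_following_train([False, True]))    # -> STOP

# Failed circuit: the same train goes undetected and no stop command is sent.
print(command_for_following_train([False, False]))   # -> PROCEED (unsafe)
```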


New Technology to Make Digital Data Self-Destruct
New York Times (07/20/09) Markoff, John

University of Washington computer scientists have developed software that enables electronic documents to automatically destroy themselves after a certain period of time. The researchers say the software, dubbed Vanish, will be needed as an increasing number of personal and business documents move from personal computers to centralized machines or servers as the cloud computing trend grows. The concept of having digital information disappear after a period of time is not new, but the researchers say they have developed a unique approach that relies on "shattering" an encryption key and distributing the pieces widely across a peer-to-peer file-sharing system. Vanish uses key-based encryption in a new way, allowing a decrypted message to be automatically re-encrypted at a specific point in the future without fear that a third party will be able to access the key needed to read it. The researchers say pieces of the key "erode" over time as they are gradually used less and less. To make the keys erode, Vanish relies on the structure of peer-to-peer file systems, which are built on millions of personal computers that join and leave the network at will and whose Internet addresses change frequently. Because the key is never held in a single location, it is extremely difficult for an eavesdropper to reassemble it. A major advantage of Vanish is that it does not rely on the integrity of third parties, as other systems do.
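A heavily simplified sketch of the key-shattering idea, assuming an n-of-n XOR split held in an in-memory dictionary in place of the real peer-to-peer network; the actual Vanish prototype distributes key shares across a distributed hash table and is not reproduced here. The point is only that losing pieces of the key leaves the encrypted data unrecoverable.

```python
import os
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n_pieces: int) -> list[bytes]:
    """Split a key into n pieces; ALL pieces are needed to rebuild it."""
    pieces = [os.urandom(len(key)) for _ in range(n_pieces - 1)]
    pieces.append(reduce(xor_bytes, pieces, key))
    return pieces

def recover_key(pieces: list[bytes]) -> bytes:
    """XOR every surviving piece together to (try to) recover the key."""
    return reduce(xor_bytes, pieces)

key = os.urandom(16)                      # encryption key protecting some document
dht = dict(enumerate(split_key(key, 8)))  # pieces scattered across "nodes"

assert recover_key(list(dht.values())) == key   # all pieces present: still readable

del dht[3]   # one node churns out of the network, and its piece erodes away
if recover_key(list(dht.values())) != key:
    print("key eroded: the document can no longer be decrypted")
```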


Virtual Clouds Could Prevent Data Centers Destroying the Planet
Technology Review (07/21/09)

A virtual cloud has the potential to prevent failures from taking down an entire system, according to Alexandros Marinos at the University of Surrey in the United Kingdom and Gerard Briscoe at the London School of Economics. The concept involves community clouds made up of users from around the world who donate their computing resources. Such a virtual cloud would still offer services and applications over the Internet, although the approach would be more demanding. Marinos and Briscoe view Wikipedia, which relies on donated computing resources, as the ideal test bed. The researchers believe the threat that the rapidly expanding carbon footprint of data centers poses to the capacity of power grids could lead to the development of virtual clouds.


Robo-Ethicists Want to Revamp Asimov's 3 Laws
Wired News (07/22/09) Ganapati, Priya

Researchers continue to work on making robots safer to be around humans, but some robot experts say the key is to stop making robots that lack ethics. "If you build artificial intelligence but don't think about its moral sense or create a conscious sense that feels regret for doing something wrong, then technically it is a psychopath," says Josh Hall, author of "Beyond AI: Creating the Conscience of a Machine." For years, science fiction author Isaac Asimov's Three Laws of Robotics have served as behavior guidelines for robots: a robot may not injure a human being or allow one to come to harm; a robot must obey orders given by human beings; and a robot must protect its own existence. However, as robots are increasingly incorporated into the real world, some believe that Asimov's laws are too simplistic. Robo-ethicists want to develop a set of guidelines that outline how to punish a robot, decide who regulates robots, and even create a "legal machine language" to help police the next generation of intelligent automated devices. Willow Garage research scientist Leila Takayama says that even if robots are not completely autonomous, there needs to be a clear set of rules governing responsibility for their actions. The next generation of robots will be able to make independent decisions and work relatively unsupervised, which means rules must be established covering how humans should interact with robots and how robots should behave, robo-ethicists say.


UK Researchers Trial Real-Time Pollution Monitoring System
Electronics Weekly (UK) (07/16/09) Bush, Steve

British universities are testing a high-resolution pollution monitoring system called Mobile Environmental Sensing System Across Grid Environments (MESSAGE) in cities throughout the United Kingdom. The objective is to provide real-time pollutant concentration maps, which will be created using fixed, human-carried, and vehicle-carried sensors. The collected information will be fed into two data infrastructures. "There are different communications and different databases, but the system allows us to run one Web site to display data regardless of how it was collected," says Imperial College London (ICL) principal researcher Robin North. MESSAGE will measure air quality, and the collected data will be used to create three-dimensional pollution cloud models. "There is a lot that we do not know about air quality in our cities and towns because the current generation of large stationary sensors don't provide enough information," says ICL professor John Polak. "We envisage a future where thousands of mobile sensors are deployed across the country to improve the way we monitor, measure, and manage pollution in our urban areas." The University of Newcastle has developed a sensor box that can be attached to lamp posts in urban areas, and the University of Cambridge is working on a sensor that can be carried by people, whether walking or cycling. Both sensors can detect NO2 and CO, and the mobile sensor can also detect NO and CO2. The mobile sensor uses a Bluetooth link to connect to the carrier's mobile device, which relays the information back to the project database. The vehicle-mounted sensor, being developed by ICL, needs a faster response time than the mobile or stationary sensors.
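A hypothetical sketch of the data path described above, from a person-carried sensor through the carrier's phone to the project database; the field names, units, and the phone-side relay step are assumptions, not the MESSAGE project's actual formats.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class MobileSensorReading:
    sensor_id: str
    timestamp: str
    latitude: float
    longitude: float
    no2_ppb: float       # nitrogen dioxide, parts per billion
    co_ppm: float        # carbon monoxide, parts per million
    no_ppb: float        # nitric oxide, parts per billion (mobile sensor only)
    co2_ppm: float       # carbon dioxide, parts per million (mobile sensor only)

def relay_via_phone(reading: MobileSensorReading) -> str:
    """Stand-in for the phone-side step that forwards a reading upstream.

    In a real deployment this would be transmitted to the project database;
    here the serialized record is simply returned.
    """
    return json.dumps(asdict(reading))

reading = MobileSensorReading(
    sensor_id="cam-cyclist-042",
    timestamp=datetime.now(timezone.utc).isoformat(),
    latitude=52.2053, longitude=0.1218,   # central Cambridge, illustrative only
    no2_ppb=21.0, co_ppm=0.4, no_ppb=9.3, co2_ppm=412.0,
)
print(relay_via_phone(reading))
```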


Abstract News © Copyright 2009 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]