Welcome to the May 10, 2010 edition of ACM TechNews, providing timely information for IT professionals three times a week.
HEADLINES AT A GLANCE
States Move to Allow Overseas and Military Voters to Cast Ballots by Internet
New York Times (05/07/10) Urbina, Ian
The U.S. Election Assistance Commission (EAC) has released guidelines that would allow nearly 3 million overseas and military voters to cast ballots over the Internet in November. The EAC plan worries cybersecurity experts, election officials, and voting integrity advocates, who note that email messages and voting Web sites are vulnerable to interception or hacking. Congress mandated in 2009 that the EAC develop guidelines for pilot programs to aid overseas voting, including online voting. Most states seek EAC certification of voting technology, and the commission's Jeannie Layson says "the EAC hopes that the work we do in 2010 will assist states already running pilot programs to improve services for military and overseas voters." Most of the 33 states that have developed pilot programs for Internet voting will let voters send completed ballots as email attachments; faxing, another approved method for sending votes, increasingly travels over the Web due to the growing use of voice-over-Internet phone service. Critics say the EAC is circumventing the technical board that is supposed to review new regulations and may also be violating federal law by not allowing enough time for public comment on the guidelines.
Joining the Dots to Put Pollution on the Map
ICT Results (05/10/10)
European researchers working on the INTAMAP project have developed a statistical tool that can turn a set of point measurements into a contour map published on the Web in real time. The project, led by the University of Munster's Edzer Pebesma, uses a process called interpolation to estimate the value of an environmental variable at a point on a map where there is no monitoring device. The system creates a contour map that shows what is happening between the measurement points and describes how accurate those estimates are. The open source interpolation software accepts raw data published on the Web using standards developed by the Open Geospatial Consortium (OGC). INTAMAP analyzes the data and follows OGC standards to create maps automatically, display them on the Web, and update them as needed. Pebesma says the INTAMAP tools could help researchers study weather patterns, groundwater pollution, agriculture, medical imaging, and other areas where a two-dimensional picture must be created from a series of point readings.
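The article does not specify which interpolation method INTAMAP automates (geostatistical techniques such as kriging are typical), but the core idea of estimating a value at an unmonitored point from surrounding measurements can be sketched with simple inverse-distance weighting; the station coordinates and readings below are hypothetical:

```python
import math

def idw_interpolate(stations, query, power=2.0):
    """Estimate a value at `query` (x, y) from point measurements by
    inverse-distance weighting: nearer stations count for more.
    A simple stand-in for the geostatistical interpolation INTAMAP runs."""
    num, den = 0.0, 0.0
    for x, y, value in stations:
        d = math.hypot(query[0] - x, query[1] - y)
        if d == 0:
            return value  # query coincides with a monitoring station
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

# Three hypothetical pollution readings; estimate the level between them.
readings = [(0, 0, 10.0), (10, 0, 20.0), (0, 10, 30.0)]
print(idw_interpolate(readings, (5, 5)))
```

Repeating this for every grid cell of a map yields the kind of contour surface the project publishes; INTAMAP additionally quantifies the uncertainty of each estimate, which this sketch omits.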
IBM and NTU Announce Collaborative Effort to Converge Cloud Computing and High Performance Computing
Nanyang Technological University (Singapore) (05/07/10) Lai, Alvin; Hambari, Hisham
IBM and Singapore's Nanyang Technological University (NTU) announced a joint effort to research and develop a platform for the convergence of cloud computing and high performance computing (HPC). "This collaboration between IBM and NTU pushes the envelope of technology to test leading-edge applications that will benefit faculty, students, and also business and government organizations that seek to leverage the power of cloud computing and high performance computing," says IBM Singapore chief technologist Foong Sew Bun. NTU professor Soh Yeng Chai says the joint research effort "will place NTU at the forefront of high performance computing, and hopefully encourage [small and medium-sized businesses] as well as large organizations to leverage HPC-cloud computing." The initiative will enable NTU faculty to pursue HPC and cloud-computing projects relevant to different industries, including engineering, mathematical sciences, finance and business, and medical research. The collaboration initially will focus on interactive digital media and business analytics.
Seeing the Forest for the Trees
MIT News (05/07/10) Hardesty, Larry
Researchers at the Massachusetts Institute of Technology (MIT) and the University of California, Los Angeles (UCLA) have developed new methods underlying object recognition systems that deconstruct images into ever smaller elements, an approach that should be much more efficient and may yield insights into brain behavior. Their system learns to recognize new objects by being "trained" with digital images of labeled objects. For each labeled item, the system first identifies the smallest elements, and then seeks instances in which these elements are interconnected into slightly more complex configurations. The system continues to search for instances in which shapes of ever increasing sophistication are linked together until it has put together a hierarchical catalog of increasingly complex components whose top layer is a model of the entire object. The system then sifts through its catalog from the top down, weeding out all redundancies. Memory is saved because different objects can have shapes in common, requiring only one instance of memory storage.
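The memory-saving idea can be illustrated with a toy sketch (the names and structure below are hypothetical, not MIT's implementation): a shape that appears in several object models is stored once and shared by reference from each model's hierarchy.

```python
def part(name, *children):
    """A node in a compositional hierarchy: a named shape built
    from references to simpler sub-shapes."""
    return {"name": name, "children": list(children)}

# Low-level shapes are stored once...
circle = part("circle")
wheel = part("wheel", circle, part("spokes"))

# ...and shared by reference between higher-level object models.
bicycle = part("bicycle", wheel, wheel, part("frame"))
car = part("car", wheel, wheel, part("body"))

# Both catalogs point at the very same wheel instance, so the shared
# sub-structure occupies memory only once.
print(bicycle["children"][0] is car["children"][0])  # → True
```

Deduplicating shared sub-shapes this way is what lets the hierarchical catalog grow with the number of distinct parts rather than the number of objects.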
A 3D Environment Model Enhances Collaboration During Learning
Universidad Politecnica de Madrid (Spain) (05/10/10) Martinez, Eduardo
Universidad Politecnica de Madrid (UPM) researchers have developed a model for three-dimensional (3D) virtual learning environments using an autonomous virtual tutor that detects collaboration. The model analyzes nonverbal communication relating to collaborative interaction that takes place while a task is completed. An avatar personifies the tutor within the virtual environment during the learning process. The model proposes a schema that identifies which nonverbal communication signals are likely to be useful and how to measure them and relate them to particular indicators of effective collaborative learning. The tutor uses text messages to give advice to students as they complete a task; the messages are triggered when students have not satisfactorily attained the indicators of effective collaborative learning. Part of the research focused on developing guidelines for relating collaborative learning indicators to certain nonverbal communication signals. Additionally, the model can be adapted to monitor other types of activities based in virtual environments, such as training sessions or meetings.
Robot-Inflicted Injuries Studied
BBC News (05/07/10)
German researchers have developed a prototype safety system to reduce injuries to humans working alongside robots that wield household tools. The collision detection system uses torque sensors to determine when a kitchen knife, screwdriver, or scissors strikes a different substance, and halts the movement of the sharp tool. The team from the Institute of Robotics and Mechatronics used a robot arm to conduct strike tests on a silicone block, a leg from a dead pig, and the arm of a human volunteer. With the safety system turned off, the robot produced deep cuts that could prove lethal to a living subject.
Microsoft Designs Chip That Scales From Datacentre to Mobile Handset
Computer Weekly (05/06/10) Grant, Ian
Microsoft's joint venture with the Barcelona Supercomputing Center aims to develop a processor that can scale from a data center server to a smartphone, saving energy and space. The researchers hope to apply vector processing technology to commercial applications such as making data centers and mobile handsets run more efficiently. The goal of the energy-efficient, composable vector processor project is to build a device that uses grid computing techniques to analyze multiple streams of data in parallel, and for the device to reconfigure itself on the fly in response to the workload it receives, say Microsoft researchers Timothy Hayes and Oscar Palomar. The technique uses some of the concepts of reduced instruction set computing, as well as new programming so that a single instruction can initiate an array of complex processes. The researchers also are working on scheduling algorithms to allocate work efficiently and to accurately recombine results from processes.
Yale Scientists Explain Why Computers Crash But We Don't
Yale University (05/03/10) Hathaway, Bill
Yale University researchers have described why computers tend to malfunction more than living organisms by analyzing the control networks in both an E. coli bacterium and the Linux operating system. Both systems are arranged in hierarchies, but with some key differences in how they achieve operational efficiencies. The molecular networks in the bacteria are arranged in a pyramid, with a limited number of master regulator genes at the top that control a wide base of specialized functions. The Linux operating system is set up more like an inverted pyramid, with many different top-level routines controlling a few generic functions at the bottom. This organization arises because software engineers tend to save money and time by building on existing routines rather than starting systems from scratch, says Yale professor Mark Gerstein. "But it also means the operating system is more vulnerable to breakdowns because even simple updates to a generic routine can be very disruptive," Gerstein says.
Stimulus Funds Bring Supercomputer to Pittsburgh Area
Pittsburgh Post-Gazette (05/05/10) Hamil, Sean D.
D.E. Shaw Research will house its new Anton supercomputer at the Pittsburgh Supercomputing Center beginning next fall. Anton is a massively parallel, 512-node supercomputer that reportedly offers ground-breaking performance capabilities. "This computer does work that really wasn't even possible until now," says Pittsburgh Supercomputing Center biomedical scientist Markus Dittrich. Anton features a series of algorithms that can project how all the thousands of parts of a protein interact. "This computer has the potential to be a great accelerator in the development of drugs, how drugs work, and how systems work," says Jeremy Berg, director of the National Institute of General Medical Sciences, which provided a $2.7 million grant to pay for Anton's use at the supercomputing center. Anton took more than 10 years to create at D.E. Shaw Research, a private laboratory founded by David E. Shaw. "It's a pretty amazing machine [and] now people would like to get their hands on a machine to see if it can do what he says," notes University of Utah professor Thomas Cheatham.
Is Water the Key to Cheaper Nanoelectronics?
New Scientist (05/06/10) Barras, Colin
Kavli Institute of Nanoscience researchers have developed a way to use water to quickly transfer layers from one surface to another by exploiting the fact that different materials have different hydrophilicity, a discovery that could lead to lower manufacturing costs for nanoelectronics. The researchers, led by Kavli's Gregory Schneider and Cees Dekker, formed a solid hydrophobic layer on top of a silicon wafer by dipping it in a solution containing a hydrophobic polymer. They then submerged the wafer in water, which wedged the layers off the silicon base. The team demonstrated the technique with graphene: intermolecular forces between the graphene and silicon provide a stable attachment and eliminate the need for glue, Dekker says. Repeating the technique several times would allow graphene layers to be built up into a complex nanoelectronic structure. "A [three-dimensional] microelectrode can be designed, layer by layer, using our 'wedging' transfer technique," Schneider says.
New Data Analysis System Could Do Double Duty
UT Dallas News (05/04/10) Moore, David
A new system for identifying potential Internet threats has been developed by researchers at the University of Texas (UT) at Dallas. Designed to analyze behavioral data, the system monitors network traffic and provides an alert when it notices worrisome deviations from normal activity. "We proposed a novel platform that thoroughly analyzes network traffic behavior to identify potential Internet threats," says UT Dallas professor Mehrdad Nourani. The technology uses two subsystems that work in parallel to reach high speed and use memory efficiently, which allows for faster results and optimal use of resources. The system builds a bell-shaped curve of normal activity and achieves nearly zero false positives and false negatives when identifying abnormalities that fall outside the curve. Nourani says the technology also could be used to analyze health data and detect abnormalities such as heart arrhythmia, sleep apnea, or epileptic seizures.
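The article describes flagging traffic that falls outside a bell-shaped curve of normal activity; a minimal sketch of that idea, assuming a normal model of traffic volume and a standard-deviation threshold (the specific data and cutoff below are hypothetical, not the UT Dallas system's):

```python
from statistics import mean, stdev

def fit_baseline(samples):
    """Fit a normal ('bell-shaped') model of ordinary traffic volume."""
    return mean(samples), stdev(samples)

def is_anomalous(value, mu, sigma, threshold=3.0):
    """Flag values more than `threshold` standard deviations from normal."""
    return abs(value - mu) > threshold * sigma

# Hypothetical per-second packet counts observed during normal operation.
normal_traffic = [100, 102, 98, 101, 99, 103, 97, 100, 101, 99]
mu, sigma = fit_baseline(normal_traffic)
print(is_anomalous(100, mu, sigma))  # → False (typical load)
print(is_anomalous(500, mu, sigma))  # → True  (sudden spike)
```

The same baseline-plus-deviation pattern transfers directly to the health-monitoring uses Nourani mentions, such as spotting an arrhythmic heart-rate reading against a patient's normal range.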
Microsoft Science Fair Hits Silicon Valley
CNet (05/06/10) Fried, Ina
Microsoft's annual TechFair visited Silicon Valley so that Microsoft product teams and local research labs could see the latest developments. One spotlighted TechFair project seeks to supply more data to researchers studying strategies for protecting fish in San Francisco Bay Area waterways by using computers to mash up various data sources and present the results to scientists in an Excel spreadsheet. Microsoft tools are making it possible to more deeply analyze the numerous variables that are weighed in deciding the optimum underlying conditions for migratory salmon. Microsoft research lab director Rick Rashid says the same core system is being used to monitor weather data in Southeast Asia and seismic data in the Swiss Alps. Ensuring that researchers maintain individual privacy when they offer aggregate data was the focus of work by TechFair presenter Cynthia Dwork, who demonstrated a mathematical formula for determining whether a particular data set is likely private or not. Also on display at TechFair was a project investigating the possibility of a smartphone choosing on the fly whether to run parts of a program on the phone itself or transmit the computational work to a remote server or PC.
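The article does not give Dwork's formula, but her work underlies differential privacy, in which aggregate statistics are released with calibrated random noise so that no individual's record can be inferred. A minimal sketch of the classic Laplace mechanism (the dataset, query, and epsilon value below are hypothetical):

```python
import math
import random

def private_count(records, predicate, epsilon=0.1):
    """Release a count with Laplace noise of scale 1/epsilon, the standard
    mechanism for a counting query whose sensitivity is 1 (adding or
    removing one person changes the count by at most 1)."""
    true_count = sum(1 for r in records if predicate(r))
    # Sample Laplace(0, 1/epsilon) by inverting its CDF.
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return true_count + noise

# Hypothetical ages in a survey; ask how many respondents are over 40.
ages = [23, 35, 41, 29, 52, 44, 38]
print(private_count(ages, lambda a: a > 40, epsilon=0.5))
```

Smaller epsilon means more noise and stronger privacy; averaged over many releases the noisy answers center on the true count, which is what makes the aggregate still useful to researchers.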
Too Much Data, Too Few Drugs
Fortune (04/29/10) Duncan, David Ewing
Two hundred biologists and computer scientists who gathered at the first-ever Sage Congress have proposed creating an open source model that standardizes and links together thousands of scientific databases worldwide. In the future, such a system could give researchers and scientists access to thousands of raw genetic data samples that could then be connected and used to explain how a disease functions. Sage Bionetworks, a new nonprofit that organized the conference, also plans to build systems that can mimic the human body and help researchers analyze complex interactions among networks of genes. Merck has agreed to pass on some technology from the disbanded Rosetta project, which resulted in one of the fastest supercomputers in the drug industry. Open source technology will likely be used, and projects such as Science Commons are working to overcome the legal, financial, and infrastructural barriers to sharing studies and data.
Abstract News © Copyright 2010 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.