Welcome to the June 13, 2012 edition of ACM TechNews, providing timely information for IT professionals three times a week.
ACM TechNews mobile apps are available for Android phones and tablets, and for iPhones and iPads.
HEADLINES AT A GLANCE
Tech World Preps to Honor 'Father of Computer Science' Alan Turing, as Centenary Nears
Network World (06/11/12) Jon Gold
Worldwide celebrations will mark the 100th anniversary of Alan Turing's birth on June 23. Turing is considered the "father of computer science," and Google vice president and ACM President-elect Vint Cerf says Turing's importance cannot be overstated. "The man challenged everyone's thinking," Cerf says. "He was so early in the history of computing, and yet so incredibly visionary about it." Cerf and Rice University professor Moshe Vardi are helping organize ACM's celebration of Turing's birth. "The theory of computing really started in the 20th century, and Turing is one of the foremost--if not the foremost--parents of the theory of computing," Vardi says. Turing studied at King's College, Cambridge, before devising one of his most important conceptual works, the "a-machine," now known as the Turing machine. Turing is probably best known for his central role in cracking German military codes during World War II. In addition to contributing to the war effort, Turing's work at Britain's Government Code and Cypher School laid the groundwork for the computers he would design after the war. Most notably, Turing designed the Automatic Computing Engine, which was influential in the development of the modern computer. The ACM celebration will have about 32 A.M. Turing Award winners on hand, in addition to more than 1,000 attendees.
Robotic Assistants May Adapt to Humans in the Factory
MIT News (06/12/12) Jennifer Chu
Massachusetts Institute of Technology (MIT) researchers have developed an algorithm that enables a robot to quickly learn an individual's preferences for a certain task and adapt accordingly to help complete the task. The researchers are using the algorithm to train robots and humans to work together. "Using this algorithm, we can significantly improve the robot's understanding of what the person's next likely actions are," says MIT professor Julie Shah. During testing, the researchers examined spar assembly, a process for building the basic structural element of an aircraft's wing. The researchers tested the algorithm on FRIDA, a flexible robot with two arms capable of a wide range of motion, which can be configured either to fasten bolts or to apply sealant into holes, according to Shah. The researchers first developed a computational model in the form of a decision tree, with each branch representing a choice the mechanic may make. They then used the decision tree to train a robot to observe an individual's chain of preferences. Shah says that once the robot learned a person's preferred order of tasks, it quickly adapted, either applying sealant or fastening a bolt according to that person's particular style of work.
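The article does not give the algorithm's details, but the decision-tree idea can be illustrated with a minimal Python sketch: each node is the set of subtasks completed so far, each branch a task the mechanic might choose next, and observed work sequences weight the branches. The task names and the simple counting scheme are assumptions of this sketch, not MIT's actual model.

```python
from collections import defaultdict

class PreferenceModel:
    """Toy decision-tree stand-in: keys are sets of completed tasks,
    branches are counts of which task this worker chose next."""
    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, sequence):
        """Update branch counts from one observed chain of choices."""
        for i, task in enumerate(sequence):
            self.counts[frozenset(sequence[:i])][task] += 1

    def predict_next(self, completed):
        """Most likely next task given what has been done so far."""
        branches = self.counts[frozenset(completed)]
        return max(branches, key=branches.get) if branches else None

model = PreferenceModel()
for _ in range(3):  # this mechanic alternates bolting and sealing
    model.observe(["bolt_1", "seal_1", "bolt_2", "seal_2"])
print(model.predict_next(["bolt_1"]))  # -> seal_1: the robot preps sealant
```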
NIH Seeking Advances in Spatial Uncertainty
CCC Blog (06/11/12) Erwin Gianchandani
The U.S. National Institutes of Health (NIH) recently issued a solicitation for innovative research that identifies sources of spatial uncertainty in public health data, incorporates that uncertainty into statistical methods, and develops tools to visualize its nature and consequences. NIH says spatial uncertainty takes one of several forms, or a combination of them. For example, geocoding is a procedure that converts information about the locations of people, homes, and other entities into geographic coordinates. In population-based public health data, geocoding is commonly performed by an automated procedure, and the results are well known to contain positional errors. To protect people's privacy, geographic information in disease data is usually not released, which leaves gaps and incompleteness in the data. Spatial uncertainty also arises when data come from a variety of collection schemes at varying spatial scales. Boundaries of spatial units can evolve over time, adding another layer of mismatch at the spatio-temporal level, according to NIH. In addition, NIH is seeking statistical methods to model spatial uncertainty, as well as new geographic information system methods for addressing and visualizing it.
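As a concrete illustration of why positional error matters, the Python sketch below jitters geocoded case locations by an assumed error radius and shows how a simple downstream statistic (cases within reach of a clinic) varies. The error magnitude, the flat-earth distance approximation, and the synthetic data are assumptions of this sketch, not part of the NIH solicitation.

```python
import math
import random

def km_between(p, q):
    """Rough flat-earth distance in km (adequate at city scale)."""
    lat_km = (p[0] - q[0]) * 111.0
    lon_km = (p[1] - q[1]) * 111.0 * math.cos(math.radians(p[0]))
    return math.hypot(lat_km, lon_km)

def cases_near(points, clinic, radius_km):
    return sum(1 for p in points if km_between(p, clinic) <= radius_km)

def uncertainty_range(points, clinic, radius_km, error_km=0.5, trials=1000):
    """Monte Carlo propagation of geocoding error: jitter every case by a
    typical positional error and report the spread of the statistic."""
    results = []
    for _ in range(trials):
        jittered = [(lat + random.gauss(0, error_km / 111.0),
                     lon + random.gauss(0, error_km / 111.0))
                    for lat, lon in points]
        results.append(cases_near(jittered, clinic, radius_km))
    return min(results), max(results)

# Synthetic cases scattered around a clinic; the count within 1 km is not
# a single number once positional error is taken into account.
cases = [(30.27 + random.uniform(-0.02, 0.02),
          -97.74 + random.uniform(-0.02, 0.02)) for _ in range(200)]
print(uncertainty_range(cases, (30.27, -97.74), radius_km=1.0))
```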
Safety Measure Lets Cars Talk to Each Other to Avoid Crashes
Associated Press (06/10/12) Joan Lowy
The future of automotive safety will involve cars equipped with vehicle-to-vehicle (V2V) communications technology, which enables them to talk to each other and warn drivers of impending collisions. The U.S. government plans to launch a yearlong, real-world test of V2V technology involving about 3,000 cars, trucks, and buses with volunteer drivers. The vehicles will be equipped to continuously communicate over wireless networks, exchanging information on their location, direction, and speed 10 times per second with other vehicles within about 1,000 feet. An on-board computer analyzes the information and issues danger warnings to drivers. More advanced versions of V2V systems will be able to take control of the vehicle to prevent an accident if the driver is slow to react. In addition, connected cars will alert drivers if they do not have enough time to make a left turn because of oncoming traffic. Connected cars also will be able to exchange information with similarly equipped transportation infrastructure, such as traffic lights, signs, and roadways. The stream of vehicle-to-infrastructure information could give traffic managers a better idea of traffic flows, resulting in better timing of traffic signals. U.S. National Highway Traffic Safety Administration researchers have been working on the technology with eight automakers.
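A rough sense of how such a system works can be given in a few lines of Python. The message fields follow the article's description (position, direction, and speed broadcast about 10 times per second); the local coordinates, danger radius, and straight-line projection are simplifying assumptions of this sketch, not the actual V2V message format.

```python
import math
from dataclasses import dataclass

@dataclass
class SafetyMessage:
    """Illustrative V2V payload: what each car broadcasts ~10 times/second."""
    vehicle_id: str
    x_m: float          # east of a local reference point, meters
    y_m: float          # north of the reference point, meters
    heading_deg: float  # 0 = due north
    speed_mps: float

def collision_warning(me, other, horizon_s=3.0, danger_m=5.0):
    """Crude on-board check: project both cars along straight paths in
    0.1 s steps (the 10 Hz update rate) and warn if they get too close."""
    def pos(msg, t):
        dx = msg.speed_mps * t * math.sin(math.radians(msg.heading_deg))
        dy = msg.speed_mps * t * math.cos(math.radians(msg.heading_deg))
        return msg.x_m + dx, msg.y_m + dy
    for step in range(int(horizon_s * 10)):
        t = step / 10.0
        (mx, my), (ox, oy) = pos(me, t), pos(other, t)
        if math.hypot(mx - ox, my - oy) < danger_m:
            return f"brake: {other.vehicle_id} within {danger_m} m in {t:.1f} s"
    return None

me = SafetyMessage("car_a", 0.0, 0.0, 0.0, 20.0)       # northbound at 20 m/s
them = SafetyMessage("car_b", 0.0, 80.0, 180.0, 20.0)  # southbound, 80 m ahead
print(collision_warning(me, them))
```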
Research at the University of Twente: Wi-Fi Will Soon Reach Its Limits
University of Twente (Netherlands) (06/08/12) Joost Bruysters
University of Twente researchers have found that Wi-Fi's efficiency can drop below 20 percent in areas where many different networks and wireless Internet devices are operating. The increasing demand for bandwidth means that Wi-Fi's efficiency is likely to fall still further in the future, leading the Twente researchers to argue for a new Wi-Fi standard. "Increasing use of the Wi-Fi spectrum probably means that we can expect ever more problems in the future," says Twente researcher Roel Schiphorst. "It is important that manufacturers explore ways of improving the Wi-Fi standard in busy scenarios," says Radiocommunications Agency Netherlands researcher Taco Kluwer. During testing, the researchers found that wherever numerous wireless networks or large numbers of wireless-enabled devices are active, each individual network loses efficiency.
University of Glasgow Developing New Type of Internet Search Engine
University of Glasgow (United Kingdom) (06/08/12) Ross Barker
A system that enables Internet users to search and analyze data from sensors will be developed as part of a joint research initiative in Europe. The system’s search engine will match queries with information from sensors and cross-reference data from social networks such as Twitter. As a result, users will be able to receive detailed answers to natural language search questions. The Search engine for MultimediA Environment geneRated contenT (SMART) project will be built on an open source search engine technology, known as Terrier, which researchers at the University of Glasgow have been developing since 2004. "The SMART engine will be able to answer high-level queries by automatically identifying cameras, microphones, and other sensors that can contribute to the query, then synthesizing results stemming from distributed sources in an intelligent way," says Glasgow's Iadh Ounis. "SMART builds upon the existing concept of 'smart cities,' physical spaces which are covered in an array of intelligent sensors which communicate with each other and can be searched for information." The researchers say SMART could be field-tested by 2014.
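The article gives only the high-level design, but the sensor-selection step Ounis describes can be sketched in Python. The sensor records, metadata fields, and scoring rule below are hypothetical illustrations, not the SMART project's or Terrier's actual API.

```python
# Hypothetical sensor registry; fields are illustrative, not SMART's schema.
SENSORS = [
    {"id": "cam_03", "kind": "camera",     "place": "george_square", "tags": {"crowd", "video"}},
    {"id": "mic_07", "kind": "microphone", "place": "george_square", "tags": {"noise"}},
    {"id": "cam_11", "kind": "camera",     "place": "station",       "tags": {"traffic", "video"}},
]

def contributing_sensors(query_terms, sensors):
    """Rank sensors by overlap between query terms and their metadata,
    keeping only those that can contribute something to the query."""
    def score(s):
        return len(query_terms & (s["tags"] | {s["kind"], s["place"]}))
    return sorted((s for s in sensors if score(s) > 0), key=score, reverse=True)

# "How busy is George Square right now?" reduced to terms by an NLP front end.
terms = {"crowd", "george_square"}
print([s["id"] for s in contributing_sensors(terms, SENSORS)])
# -> ['cam_03', 'mic_07']; their streams would then be fused with social data.
```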
MEXT and NSF Statement on Big Data and Disaster Research Collaboration From NSF Director Dr. Subra Suresh and MEXT Minister Mr. Hirofumi Hirano
National Science Foundation (06/08/12) Dana Topousis
U.S. National Science Foundation director Subra Suresh and Japan's MEXT minister Hirofumi Hirano recently reached an agreement for U.S.-Japan collaboration in disaster research. They agreed to support collaboration among computer scientists, engineers, social scientists, biologists, geoscientists, physical scientists, and mathematicians that could help improve disaster response through big data. The nations are interested in using big data generated from disasters to advance analytic, modeling, and computational capabilities, with applications such as probabilistic hazard models. Research could help make information technology more resilient and responsive by enabling real-time data sensing, visualization, analysis, experimentation, and prediction, which is critical for making time-sensitive decisions. NSF and MEXT also want to advance sustainable civil infrastructure and distributed infrastructure networks, acquire big data, and improve knowledge of preparedness and response. Moreover, they want to integrate expertise from multiple disciplines, input from end users, and big data from all relevant sources. The agencies will develop a plan of action, and a more detailed agreement could come before the end of the year.
Robotics Helps Us Become More Competitive
Carlos III University of Madrid (Spain) (06/11/12)
Carlos III University of Madrid (UC3M) researchers recently coordinated the RoboCity12 consortium, which concluded that robotics can help humans innovate and become more competitive in the international marketplace. RoboCity12 gathered together nearly 40 robots with the goal of presenting the most recent advances in robotics developed in Madrid, as well as the main innovations that have appeared on the national and international scenes. "In short, this is about involving society in the themes related to robotics," says UC3M professor Carlos Balaguer. UC3M professor Paolo Dario gave a talk on the robotics of the future, discussing the potential benefits that robotics could offer during emergencies such as forest fires or natural disasters. Dario also said robotics offers many tools for helping societies face the demographic challenges posed by aging populations. In addition, he described the possibilities presented by "neuro-robotics," which he considers a growing area of research created by the merging of robotics and neuroscience.
Researchers Predict Advent of 'Resource-as-a-Service' Clouds
IDG News Service (06/05/12) Chris Kanaracus
The cloud model of rented virtualized servers and storage on demand could be sold in a much more efficient and granular manner, with specific resources rented for just a few seconds at a time, according to Technion-Israel Institute of Technology researchers, who dubbed their model resource-as-a-service (RaaS). "With the advent of Web hosting, clients could rent a server on a monthly basis," the Technion researchers say. However, the hourly rental model will not be sufficient as the cloud market matures. "If you only pay for a full second or any part of it, then you will only waste half a second over the lifetime of every virtual machine," the researchers note. Some cloud providers have already begun selling sub-hour increments, as well as other infrastructure-as-a-service offerings such as fixed bundles of computing power, memory, and networking resources. The researchers say that in a RaaS cloud, customers would purchase "seed virtual machines" containing a baseline of resources as well as an "economic agent" for acquiring additional capacity. Cloud service providers' software also would incorporate economic agents to represent their own interests. "In the RaaS cloud, virtual machines never know the precise amount of resources that will be available to them at any given second," the researchers note.
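The researchers' half-second remark is rounding arithmetic: with one-second billing, the expected overcharge per VM is half a billing unit. A small Python comparison makes the difference against hourly billing concrete (the price and helper function are illustrative, not from the paper).

```python
import math

def billed_cost(runtime_s, unit_s, price_per_hour):
    """Round runtime up to the billing unit, then price it.
    unit_s = 3600 models today's hourly clouds; unit_s = 1 models RaaS."""
    return math.ceil(runtime_s / unit_s) * unit_s * price_per_hour / 3600.0

price = 0.10   # assumed $/hour, for illustration only
work = 90      # a VM that does 90 seconds of useful work

print(billed_cost(work, 3600, price))  # hourly billing -> $0.10 for a full hour
print(billed_cost(work, 1, price))     # per-second billing -> $0.0025
```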
SFU Helps Quantum Computers Move Closer
Simon Fraser University (06/07/12) Don MacLachlan
Researchers at Simon Fraser University, Oxford University, and institutes in Germany have developed 28Silicon, a highly enriched and purified material that enables quantum processes once thought to require a near-perfect vacuum to take place, and be observed, in a solid. The researchers, led by Simon Fraser's Mike Thewalt, have extended the time in which scientists can manipulate, observe, and measure these quantum states from a matter of seconds to about three minutes. The breakthrough "opens new ways of using solid-state semiconductors such as silicon as a base for quantum computing," Thewalt says. "A classical 1/0 bit can be thought of as a person being either at the North or South Pole, whereas a qubit can be anywhere on the surface of the globe--its actual state is described by two parameters similar to latitude and longitude." He says a quantum computing system with enough qubits could complete calculations in minutes that today's most powerful supercomputers would take years to complete.
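Thewalt's globe analogy is the standard Bloch-sphere picture of a qubit, which can be written out explicitly (a textbook identity, not specific to the silicon work):

```latex
\lvert\psi\rangle \;=\; \cos\frac{\theta}{2}\,\lvert 0\rangle
\;+\; e^{i\varphi}\sin\frac{\theta}{2}\,\lvert 1\rangle,
\qquad 0 \le \theta \le \pi,\quad 0 \le \varphi < 2\pi
```

Here the polar angle θ plays the role of latitude (θ = 0 is the North Pole, the classical 0; θ = π the South Pole, the classical 1) and φ the longitude.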
Rice, UCLA Slash Energy Needs for Next-Generation Memory
Rice University (06/07/12) David Ruth; Jade Boyd
Researchers at Rice University and the University of California, Los Angeles (UCLA) have developed a data-encoding system that cuts more than 30 percent of the energy needed to write data onto new memory cards that use phase-change memory (PCM). The researchers note that PCM performs the same function as flash memory, but in a faster, less expensive, and more energy-efficient way. "We developed an optimization framework that exploits asymmetries in PCM read/write to minimize the number of bit transitions, which in turn yields energy and endurance efficiency," says Rice graduate student Azalia Mirhoseini. The researchers say their encoding method is the first to take advantage of PCM's asymmetric physical properties. "One part of the method is based on dynamic programming, which starts from small codes that we show to be optimal, and then builds upon these small codes to rapidly search for improved, longer codes that minimize the bit transitions," says Rice professor Farinaz Koushanfar. The second part of the method is based on integer-linear programming (ILP), an exact method that finds provably optimal codes. "The overhead for ILP is practical because the codes are found only once, during the design phase," says UCLA professor Miodrag Potkonjak.
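The article does not publish the Rice/UCLA codes themselves, but the underlying idea of spending a little extra storage to avoid cell flips can be shown with a simpler, well-known relative of the technique (a flip-bit scheme, not the paper's DP- or ILP-derived codes); the one-bit flag overhead is an assumption of this sketch.

```python
def transitions(old, new, width=8):
    """PCM cells that must change state when overwriting old with new."""
    return bin((old ^ new) & ((1 << width) - 1)).count("1")

def flip_write(old, new, width=8):
    """Write either the new word or its complement plus a 1-bit flag,
    whichever flips fewer cells -- the same transition-minimizing goal
    as the Rice/UCLA encoder, in its simplest possible form."""
    mask = (1 << width) - 1
    direct = transitions(old, new, width)
    complemented = transitions(old, ~new & mask, width) + 1  # +1 flag cell
    if complemented < direct:
        return ~new & mask, 1, complemented
    return new, 0, direct

stored, flag, flips = flip_write(0b00000000, 0b11111110)
print(f"stored={stored:08b} flag={flag} flips={flips}")  # 2 flips instead of 7
```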
Dept. of Homeland Security to Focus on Cyber Workforce Development
Network World (06/06/12) Michael Cooney
The U.S. Department of Homeland Security (DHS) is working to build a strong pipeline of cybersecurity talent, including computer engineers, scientists, analysts, and information technology specialists. DHS secretary Janet Napolitano says the agency will form a task group to focus on the development of the cyber workforce. "To accomplish this critical task, we have created a number of very competitive scholarship, fellowship, and internship programs to attract top talent," she says. The cybersecurity workforce task group will consider expanding DHS support of cyber competitions and university programs. The new task force also will consider strategies such as enhancing public-private security partnerships and working with other government agencies to develop a more agile cyber workforce across the federal government. Moreover, the task force will help in the effort to develop strong cybersecurity career paths within DHS and other agencies. Hacking expert Jeff Moss, who now works for the Homeland Security Advisory Council, and Alan Paller, director of research at the SANS Institute, will co-chair the task group.
Pandemic Preparedness
Texas Advanced Computing Center (06/06/12) Aaron Dubrow
University of Texas at Austin researcher Lauren Ancel Meyers is working with the Texas Advanced Computing Center (TACC) to enhance data-driven science. Meyers worked with TACC staff and used the center's systems to forecast H1N1 flu virus infections as the pandemic progressed, and developed visualizations that depicted the spread of the disease. Meyers led the development of the Texas Pandemic Flu Toolkit, a Web-based service that simulates the spread of pandemic flu through the state, forecasts the number of flu hospitalizations, and determines where and when to place ventilators to minimize fatalities. The toolkit also can be used in emergency situations to guide real-time decision-making. "While the forecasts will not be exact, they give a rough idea of how many people will be hospitalized around the state and when an epidemic may peak," Meyers says. The toolkit can also be used proactively, to develop scenarios of probable pandemics and see how they may affect different locations, age groups, and demographics. TACC's supercomputers enable the data to be processed and distributed to a large number of users simultaneously.
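The toolkit's models are spatial and age-structured, but the kind of forecast Meyers describes (rough peak timing and size) can be illustrated with a minimal SIR model in Python; the contact and recovery rates and the seed values below are assumed for illustration only.

```python
def sir_step(S, I, R, beta=0.3, gamma=0.1):
    """One day of a minimal SIR epidemic model -- a far simpler stand-in
    for the toolkit's spatial, age-structured simulations."""
    N = S + I + R
    new_inf = beta * S * I / N   # new infections today
    new_rec = gamma * I          # recoveries today
    return S - new_inf, I + new_inf - new_rec, R + new_rec

S, I, R = 25_000_000, 100.0, 0.0   # rough Texas-scale population, seeded
peak_day, peak_I = 0, 0.0
for day in range(1, 366):
    S, I, R = sir_step(S, I, R)
    if I > peak_I:
        peak_day, peak_I = day, I
print(f"epidemic peaks around day {peak_day} with ~{peak_I:,.0f} infectious")
```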
Abstract News © Copyright 2012 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe