Association for Computing Machinery
Welcome to the November 26, 2012 edition of ACM TechNews, providing timely information for IT professionals three times a week.

ACM TechNews mobile apps are available for Android phones and tablets, and for iPhones and iPads.

HEADLINES AT A GLANCE


Scientists See Promise in Deep-Learning Programs
New York Times (11/23/12) John Markoff

Deep-learning technology, an artificial intelligence technique inspired by theories about how the brain recognizes patterns, has recently grown in speed and accuracy. "There has been a number of stunning new results with deep-learning methods," says New York University computer scientist Yann LeCun. "The kind of jump we are seeing in the accuracy of these systems is very rare indeed." For example, in October a group of University of Toronto researchers developed software to help find molecules that could lead to new medical drugs. The technique uses deep-learning software to determine which molecule is the most likely to be an effective drug agent. The software won the top prize in a contest sponsored by Merck. "This is a really breathtaking result because it is the first time that deep learning won, and more significantly it won on a data set that it wouldn’t have been expected to win at," says Kaggle CEO Anthony Goldbloom. Advanced pattern-recognition technologies also can be used in marketing and law enforcement applications. Deep-learning systems recently have outperformed humans in certain limited recognition tests, and have been trained to recognize images in a database of German traffic signs.


Brain-Like Chip Outstrips Normal Computers
New Scientist (11/22/12) Michael Marshall

University of Heidelberg researchers have developed the Spikey chip, which features a neuromorphic design that tries to recreate the brain's hardware using analog circuitry. "On our system, you can physically point to the neuron," says Heidelberg researcher Karlheinz Meier. The Spikey chip contains 400 "neurons," or printed circuits. Similar to a real neuron, when the applied voltage reaches a certain level, the capacitor becomes conductive, firing a "nerve signal." The analog components have variable levels of resistance to simulate the way connections between neurons become stronger or weaker depending on how much they are used. The researchers connected the neurons in different ways to mimic various brain circuits. So far the researchers have modeled six neural networks. "This is as good as you can get in simulating neural architecture," says Boston University professor Massimiliano Versace. The researchers are currently scaling up Spikey as part of the BrainScaleS project. "Instead of 400 neurons we have 200,000," says Heidelberg researcher Thomas Pfeil. The researchers have printed all of the circuits onto a 20-centimeter silicon wafer, which enables them to incorporate many more connections.
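The firing behavior described, integrating inputs until a threshold voltage is reached and then discharging, is the classic leaky integrate-and-fire model. A minimal software sketch follows; the parameters are illustrative, not the chip's actual analog constants:

```python
# Minimal leaky integrate-and-fire neuron, a software sketch of the
# dynamics the Spikey circuits emulate in analog hardware.
# All constants below are illustrative, not the chip's real values.

def simulate_lif(inputs, v_rest=0.0, v_thresh=1.0, leak=0.9, weight=0.3):
    """Integrate weighted input spikes; fire when voltage crosses threshold."""
    v = v_rest
    spikes = []
    for t, spike_in in enumerate(inputs):
        # leak toward resting potential, then add the weighted input
        v = v_rest + leak * (v - v_rest) + weight * spike_in
        if v >= v_thresh:          # the capacitor "becomes conductive": fire
            spikes.append(t)
            v = v_rest             # reset after emitting the "nerve signal"
    return spikes

print(simulate_lif([1, 1, 1, 1, 0, 0, 1, 1, 1, 1]))  # → [3, 9]
```

With a stronger input `weight`, the same spike train would cross threshold sooner, which is the software analogue of the variable-resistance connections the article describes.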


Supercomputers Face Growing Resilience Problems
IDG News Service (11/21/12) Joab Jackson

North Carolina State University (NCSU) researchers have developed RedMPI, software that runs in conjunction with the Message Passing Interface (MPI), a library for splitting applications across multiple servers so the different parts of the program can be executed in parallel. The researchers say RedMPI could be a solution to the growing vulnerability of high-performance computing systems. As supercomputers grow more powerful, they also grow more vulnerable to failure due to the increased amount of built-in componentry. NCSU's David Fiala says the problem will only get worse as the industry moves toward exascale systems. He says that to account for the additional hardware required for exascale computing, system reliability will need to be improved by 100 times in order to keep the same mean time between failures provided by today's supercomputers. RedMPI addresses the problem of silent data corruption by simultaneously running multiple copies of a program and then comparing the answers. RedMPI intercepts and copies every MPI message that an application sends, and distributes copies of the message to the clone of the program. If different clones calculate different answers, the numbers can be recalculated on the fly.
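The replicate-and-compare idea behind RedMPI can be sketched in a few lines. This is a simplified illustration of the redundancy principle, not RedMPI's actual MPI-level implementation; the function names and majority-vote logic are invented for the example:

```python
from collections import Counter

def detect_silent_corruption(compute, data, replicas=3):
    """Run redundant copies of a computation and compare the answers,
    mimicking the idea of comparing messages from cloned ranks
    (a simplified sketch, not RedMPI's actual protocol)."""
    results = [compute(data) for _ in range(replicas)]
    answer, votes = Counter(results).most_common(1)[0]
    corrupted = votes < replicas   # any disagreement flags silent corruption
    return answer, corrupted

# Simulate one silent error: the second "replica" returns a wrong value.
calls = {"n": 0}
def flaky_sum(xs):
    calls["n"] += 1
    total = sum(xs)
    return total + 1 if calls["n"] == 2 else total

answer, corrupted = detect_silent_corruption(flaky_sum, (1, 2, 3))
print(answer, corrupted)  # 6 True — the majority vote recovers the value
```

With three replicas, one corrupted result can be both detected and outvoted; with only two, disagreement can be detected but not resolved, which mirrors the recalculate-on-the-fly step the article mentions.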


China to Unveil Homebrewed Chip in 2013
ZDNet (11/21/12) Jamie Yap

Loongson Technology, which is partly funded by the Chinese Academy of Sciences, will share details about its eight-core Godson processor at the International Solid-State Circuits Conference in February. The new Godson processor has a clock speed of 1.35 gigahertz and provides 172.8 gigaflops of performance while drawing 40 watts of power. However, the Godson cores are based on the MIPS64 central-processing unit (CPU) instruction set; they do not support the Windows operating system and instead run variants of Linux. "Just like a country's industry cannot always depend on foreign steel and oil, China's information industry needs its own CPU," says National People's Congress deputy Hu Weiwu. However, Intel's Rajeeb Hazra notes that although the effort to develop domestic chips may facilitate China's academic community, it might not be the best strategy if China wants to be a world technology leader. "Our goal is to demonstrate that for countries that may be contemplating that path, it's in their best interest, the best economic interest, to actually work with us and help us understand what they need rather than having to do something that is purely driven by a nationalistic boundary as opposed to more pure technology goals," Hazra says.
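The quoted figures are mutually consistent if each core is assumed to complete 16 floating-point operations per cycle; that per-core throughput is an assumption for illustration, since the article does not state it:

```python
# Peak-throughput arithmetic for the reported Godson figures.
# flops_per_cycle = 16 is an assumed per-core throughput, not a spec
# quoted in the article.
cores, clock_ghz, flops_per_cycle = 8, 1.35, 16
peak_gflops = cores * clock_ghz * flops_per_cycle
print(peak_gflops)  # 172.8, matching the reported performance figure
```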


A More Sensitive Technique for Determining User Position Could Lead to Improved Location-Based Mobile Services
A*STAR Research (11/21/12)

A technique developed by an international research team could potentially improve the accuracy of determining the position of mobile devices. The protocol simultaneously predicts the location of the mobile device user and the data access points, or hotspots. Sinno Jialin Pan at the A*STAR Institute for Infocomm Research and colleagues trained a learning-based system with the signal-strength values received from access points at selected places in the area of interest, then used the information to calibrate a probabilistic location-estimation system. They approximated the location from the learned model using signal strength samples received in real time from the access points. During testing, the approach required less calibration and was more accurate than state-of-the-art systems. The technique could lead to the development of a new class of mobile apps that react to small changes in position. "We ... want to find ways to make use of the estimated locations to provide more useful information, such as location-based advertising," Pan says. He notes robots also could use the approach to navigate on their own.
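The offline-calibration/online-estimation split described above is the classic signal-strength fingerprinting scheme. Here is a minimal sketch using simple inverse-distance weighting rather than the team's actual learned probabilistic model; all coordinates and dBm values are invented:

```python
import math

# Offline phase: signal-strength fingerprints (dBm per access point)
# recorded at known calibration positions. Layout and values invented.
fingerprints = {
    (0.0, 0.0): [-40, -70, -80],
    (5.0, 0.0): [-60, -50, -75],
    (0.0, 5.0): [-65, -72, -45],
}

def estimate_position(reading):
    """Online phase: weight each calibration point by the similarity
    between its stored fingerprint and the live reading."""
    weights = {}
    for pos, fp in fingerprints.items():
        dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(fp, reading)))
        weights[pos] = 1.0 / (dist + 1e-6)   # closer fingerprints dominate
    total = sum(weights.values())
    x = sum(pos[0] * w for pos, w in weights.items()) / total
    y = sum(pos[1] * w for pos, w in weights.items()) / total
    return x, y

print(estimate_position([-42, -68, -79]))  # nearest calibration point is (0.0, 0.0)
```

The team's contribution, per the article, is reducing how much of this calibration is needed while keeping accuracy; the sketch only shows the basic fingerprinting idea the calibration feeds.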


IT Building Blocks for the Man in the Street
SINTEF (11/21/12) Ase Dragland

SINTEF ICT researchers are developing computing tools composed of different building blocks so that people can select and combine the information technology (IT) services they need. "Since most people aren’t qualified programmers or software developers, we have to provide them with a new user interface and a tool that they can understand," says SINTEF ICT's Jacqueline Floch. The project, which has been underway for the past four years, includes developing software systems for the mobile market, software for telecom operators and service providers, and a network on which new IT services can be operated experimentally. The researchers say the services will lead to improved tracking of mobile units, especially with the project's SmartTrack interface. The project's Easy Designer framework enables users to modify existing services and quickly create new solutions. The SINTEF researchers also have developed the City Explorer application, which enables users to create or edit places and itineraries in a city. "We are interested in adding to existing functions, so that the user can create their own 'menu list,'" Floch says.


Filtering Spam
Concordia University (11/20/12) Cléa Desjardins

Concordia University researchers have developed a method for removing unwanted spam emails from user inboxes. The method is a statistical framework for spam filtering that the researchers say quickly and efficiently blocks unwanted messages. "By considering patterns from text and images simultaneously, we’ve been able to propose a new method for filtering out spam," says Concordia's Ola Amayri. Spam messages often use sophisticated tricks, such as deliberately obscuring text, combining words with symbols, and using groups of the same images with different backgrounds and colors that might contain random text from the Web. When these tricks are used together, conventional spam filters are unable to stop the messages because they focus on either text or images but rarely both. "Our new method for spam filtering is able to adapt to the dynamic nature of spam emails and accurately handle spammers' tricks by carefully identifying informative patterns, which are automatically extracted from both text and images content of spam emails," Amayri says.
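The text-plus-image idea can be illustrated with a toy scoring function. This is a sketch of combining both feature families into one decision, not Amayri's actual statistical framework; the word list, features, and weights are all invented:

```python
import re

SPAM_WORDS = {"winner", "free", "viagra", "prize"}  # illustrative list

def text_features(body):
    """Count obfuscated tokens (letters mixed with symbols/digits)
    and tokens that reduce to known spam words once symbols are stripped."""
    tokens = body.lower().split()
    obfuscated = sum(1 for t in tokens if re.search(r"[a-z][^a-z\s][a-z]", t))
    spammy = sum(1 for t in tokens if re.sub(r"[^a-z]", "", t) in SPAM_WORDS)
    return [obfuscated, spammy]

def image_features(images):
    # images: list of (width, height, n_colors) tuples, a stand-in for
    # real pixel analysis, which the article does not detail
    return [len(images), sum(1 for w, h, c in images if c > 256)]

def spam_score(body, images, weights=(0.5, 1.0, 0.3, 0.4)):
    """Combine text and image features into a single weighted score."""
    feats = text_features(body) + image_features(images)
    return sum(w * f for w, f in zip(weights, feats))

msg = "You are a w1nner claim your fr-ee prize now"
print(spam_score(msg, [(600, 400, 1000)]) > 1.0)  # True: flagged as spam
```

The point of the sketch is the combination step: a filter looking only at `text_features` or only at `image_features` sees a weaker signal than one scoring both together.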


Robotic Fish Research Swims Into New Ethorobotics Waters
NYU-Poly (11/20/12)

A study from researchers at the Polytechnic Institute of New York University (NYU-Poly) found that zebrafish were most attracted to the presence of a robotic fish when it modulated its tail motion in a similar manner to live fish. NYU-Poly professor Maurizio Porfiri and colleagues used image-based tracking software to analyze the movement of the live zebrafish and provide real-time feedback to the robot. The live zebrafish were most likely to spend time near the robot when it increased its tail beat frequency as they approached. The robotic fish was able to produce a tail beat motion that replicated the behavior of "informed fish" attempting to lead "naive fish." The findings suggest that real-time visual feedback could be an effective way to use robots to influence the behavior of live animals. In particular, wildlife conservation could benefit from the research.
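The feedback rule described, raising the tail-beat frequency as live fish approach, can be sketched as a simple function of the tracked distance. All parameters are illustrative, not the study's values:

```python
def tail_beat_frequency(distance_cm, base_hz=1.0, max_hz=3.0, range_cm=20.0):
    """Closed-loop rule sketched from the article: modulate the robot's
    tail-beat frequency as tracked fish approach. Parameters are
    illustrative, not the study's actual values."""
    proximity = max(0.0, 1.0 - distance_cm / range_cm)  # 1.0 at contact
    return base_hz + (max_hz - base_hz) * proximity

# The vision-tracking loop would feed live distances into this rule:
# closer fish -> faster tail beat.
for d in (30.0, 15.0, 5.0):
    print(d, tail_beat_frequency(d))
```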


Supercomputer Assists in a World First for Space Simulation
CORDIS News (11/20/12)

The Partnership for Advanced Computing in Europe (PRACE) has provided Hermit, a Cray XE6 supercomputer, to the Finnish Meteorological Institute (FMI) to develop the Vlasiator simulator, the world's first large-scale space simulator. FMI specializes in large-scale computer simulations modeling the behavior of particles and electromagnetic fields in the vicinity of Earth and other bodies in the solar system. PRACE has awarded 30 million core hours of computing time to FMI for the Vlasiator space simulator. Vlasiator is designed for modeling near-Earth space, a task that demands enormous amounts of computing time. The supercomputer features 113,664 cores and executes more than 1 million billion computations per second. "Using these resources, we'll be the first in the world to run a large-scale space simulation where even small-scale phenomena can be seen accurately for the first time ever," says FMI researcher Minna Palmroth. Vlasiator is the first simulation in the world based on the Vlasov equation that can generate a model of the Earth's complete magnetic field in three dimensions, while concurrently producing particle distributions in six dimensions.
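The equation behind the simulator's name is the collisionless Vlasov equation, shown here in its standard textbook form (the article itself does not write it out). Its phase-space density f depends on three position and three velocity coordinates, which is why the particle distributions are six-dimensional:

```latex
% Collisionless Vlasov equation for the phase-space density f(x, v, t)
% of a plasma species with charge q and mass m (standard form):
\frac{\partial f}{\partial t}
  + \mathbf{v} \cdot \nabla_{\mathbf{x}} f
  + \frac{q}{m}\left(\mathbf{E} + \mathbf{v} \times \mathbf{B}\right)
    \cdot \nabla_{\mathbf{v}} f = 0
```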


Exascale Unlikely Before 2020 Due to Budget Woes
Computerworld (11/19/12) Patrick Thibodeau

The U.S. Department of Energy (DOE) is targeting 2020 or 2022 as the date for developing the world's first exascale supercomputer systems. DOE is working on a report for Congress that will detail its Exascale Computing Initiative, which is expected to spell out a plan and cost for building an exascale system. China, Europe, and Japan also are developing exascale systems, so there is no guarantee that the United States will reach the 1,000 petaflop threshold first. Making the project even more challenging, the U.S. has established strict criteria for its exascale effort. The system needs to be relatively low power, serve as a platform for a wide range of applications, and lead to marketable technologies that can help the information technology industry. Exascale systems also will require a new programming model. "The real change is programming in the node and the parallelism to hide latency, to hide the communication to the other nodes, so that requires lots of parallelism and concurrency," says Argonne National Laboratory researcher Pete Beckman. He notes that the new systems will require adaptive programming models, and that until an approach is settled on, the next few years will be disruptive in terms of programming models.


Software Enables Avatar to Reproduce Our Emotions in Real Time
Ecole Polytechnique Federale de Lausanne (11/19/12) Cécilia Carron

Ecole Polytechnique Federale de Lausanne (EPFL) researchers have developed Faceshift, an avatar-based animation program that could save time for animation and video game designers. The researchers say the system also could enhance the future of video chats. The software requires a camera that has motion and depth sensors, such as the Microsoft Kinect. A user reproduces several basic expressions requested by the program, such as a smile or raised eyebrows, and the captured expressions enable the software to recognize the user's face. "The more movement is incorporated into the program's 50 positions, the more realistic are the results," says EPFL's Thibaut Weise. The researchers used an algorithm to demonstrate that three-dimensional facial movements could be reconstructed in real time without using facial markers or complex scanning hardware. The Faceshift software creates an avatar that mimics the emotions of the actor in real time, making rendering work much faster. "This new tool can reduce the time to make a film by up to 30 percent," Weise says.
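The program's "50 positions" suggest a blendshape-style model, in which a captured face is expressed as a weighted sum of basis expressions. Below is a minimal sketch of that mixing step with invented shapes and weights; Faceshift's actual rig and camera-driven fitting are far more elaborate:

```python
# Blendshape mixing: the avatar's face = neutral pose + weighted sum of
# per-expression vertex offsets. Two vertices and two shapes here stand
# in for a full rig; in a real system the weights come from fitting the
# depth-camera data each frame.

NEUTRAL = [(0.0, 0.0), (1.0, 0.0)]
DELTAS = {                          # per-expression vertex offsets (invented)
    "smile":        [(0.1, 0.2), (-0.1, 0.2)],
    "raised_brows": [(0.0, 0.5), (0.0, 0.5)],
}

def blend(weights):
    """Return avatar vertices: neutral + sum_i w_i * delta_i."""
    verts = [list(v) for v in NEUTRAL]
    for name, w in weights.items():
        for i, (dx, dy) in enumerate(DELTAS[name]):
            verts[i][0] += w * dx
            verts[i][1] += w * dy
    return [tuple(v) for v in verts]

print(blend({"smile": 0.5, "raised_brows": 1.0}))
```

Because each frame reduces to one small weighted sum per vertex, the mixing itself is cheap enough to run in real time; the hard part, which the EPFL work addresses, is recovering good weights from the camera without markers.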


Abstract News © Copyright 2012 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe