Association for Computing Machinery
Welcome to the December 21, 2012 edition of ACM TechNews, providing timely information for IT professionals three times a week.

ACM TechNews mobile apps are available for Android phones and tablets, and for iPhones and iPads.

HEADLINES AT A GLANCE


Keep It Simple: Bring Software Complexity Under Control
CORDIS News (12/19/12)

Researchers on the European Union-funded DEPLOY project (industrial deployment of advanced system engineering methods for high productivity and dependability) have developed an approach for building software systems that is safer, less expensive, and more robust. The research aims to improve traditional software engineering processes, which are not equipped to handle the complexity and diversity of modern software systems. "As more and more elements and more functionality are packed into systems, engineers say they are losing control of complexity and worry they won't be able to provide the quality assurances required," says University of Newcastle professor Alexander Romanovsky. He says the DEPLOY approach follows formal engineering methods, which are starting to gain acceptance among industry experts as a more efficient, practical way to develop complex software systems. Formal engineering methods are based on mathematical modeling and analysis, supporting reasoning at multiple levels of abstraction to enable a systematic engineering flow. Romanovsky says the advantage of formal engineering methods is that errors are caught early in development and complexity is kept within predefined limits, all of which reduces the need for testing at the final stages. "In the long run, software systems are only going to keep getting more complex--formal engineering, as we have shown, is one way to address that problem," he says.
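The core idea of formal methods (DEPLOY worked with the Event-B formalism and its Rodin toolset) can be illustrated with a toy sketch: state a safety invariant once, guard every event so it can only fire when the invariant is preserved, and catch violations at modeling time rather than in final-stage testing. The Python below is only an analogy to that workflow, not DEPLOY's actual tooling, and the tank model is invented for illustration.

```python
# Toy analogue of invariant-driven development: the safety property is
# stated once and checked on every state transition, so a design error
# surfaces immediately instead of in late-stage testing.

class TankModel:
    """Abstract model of a tank whose level must stay within bounds."""
    CAPACITY = 100

    def __init__(self):
        self.level = 0
        self._check_invariant()

    def _check_invariant(self):
        # The safety property, enforced after every event.
        assert 0 <= self.level <= self.CAPACITY, f"invariant violated: {self.level}"

    def fill(self, amount):
        # Guard: the event is enabled only when it preserves the invariant.
        if self.level + amount > self.CAPACITY:
            raise ValueError("fill would overflow; event not enabled")
        self.level += amount
        self._check_invariant()

tank = TankModel()
tank.fill(60)
try:
    tank.fill(60)   # rejected before the invariant can be broken
except ValueError as e:
    print("caught at model level:", e)
```

In a real formal-methods flow the analogue of the guard and invariant is discharged by proof over all states, not by runtime checks on one execution, which is what removes the reliance on final-stage testing.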


Cellphone, GPS Data Suggest New Strategy for Alleviating Traffic Tie-Ups
UC Berkeley NewsCenter (12/20/12) Robert Sanders

Researchers at the University of California, Berkeley and the Massachusetts Institute of Technology found that asking specific groups of drivers to stay off the road can significantly reduce rush-hour traffic. The researchers monitored traffic through drivers' cell phone and global positioning system (GPS) data and found that canceling or delaying the trips of 1 percent of all drivers across a road network would reduce delays caused by traffic by only about 3 percent. However, canceling the trips of 1 percent of drivers from carefully selected neighborhoods would reduce the extra travel time for all other drivers in a metropolitan area by as much as 18 percent. "Reaching out to everybody to change their time or mode of commute is thus not necessarily as efficient as reaching out to those in a particular geographic area who contribute most to bottlenecks," says Berkeley professor Alexandre Bayen. The researchers used three weeks of cell phone data to gather information about anonymous drivers’ routes and the estimated traffic volume and speed on those routes both in Boston and the San Francisco Bay Area. The researchers also used data taken from GPS sensors in taxis in the San Francisco area to compute the speed of the taxis based on travel time from one location to another.
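The intuition behind targeting specific neighborhoods can be sketched with a toy model in which some neighborhoods feed a bottleneck link far more heavily than others. All numbers below are invented for illustration, not the study's data; the point is only that removing the same 1% of drivers yields a much larger reduction when the cuts are concentrated where they matter.

```python
# Toy comparison: remove 1% of drivers uniformly vs. the same number of
# drivers from the neighborhoods that contribute most to a bottleneck.

# Trips per neighborhood, and the share of each neighborhood's trips
# that pass through the congested bottleneck link (invented values).
trips = {"A": 40_000, "B": 30_000, "C": 20_000, "D": 10_000}
bottleneck_share = {"A": 0.05, "B": 0.10, "C": 0.60, "D": 0.70}

def bottleneck_load(t):
    return sum(n * bottleneck_share[k] for k, n in t.items())

total = sum(trips.values())
budget = int(0.01 * total)  # 1% of all drivers

# Strategy 1: uniform removal across the whole network.
uniform = {k: n - n / total * budget for k, n in trips.items()}

# Strategy 2: greedy removal from the neighborhoods whose trips are
# most likely to hit the bottleneck.
targeted = dict(trips)
remaining = budget
for k in sorted(trips, key=lambda k: bottleneck_share[k], reverse=True):
    cut = min(remaining, targeted[k])
    targeted[k] -= cut
    remaining -= cut

base = bottleneck_load(trips)
print(f"uniform:  {1 - bottleneck_load(uniform) / base:.1%} load reduction")
print(f"targeted: {1 - bottleneck_load(targeted) / base:.1%} load reduction")
```

With these invented numbers the uniform cut reduces bottleneck load by 1%, while the targeted cut of the same size reduces it by roughly 3%, mirroring the disproportionate effect the study reports.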


Supercomputing on the XPRESS Track
Sandia National Laboratories (12/20/12) Neal Singer

Sandia National Laboratory researchers are working on the eXascale Programming Environment and System Software (XPRESS) project, an operating system that can handle the million trillion operations per second of future exascale computers and then create prototypes of several programming components. Exascale computing speeds will more accurately simulate the most complex reactions in such fields as nuclear weapons, atmospheric science, chemistry and biology. "The XPRESS project aims to provide a system software foundation designed to maximize the performance and scalability of future large-scale parallel computers, as well as enable a new approach to the science and engineering applications that run on them," says Sandia's Ron Brightwell. The XPRESS effort will address factors known to degrade fast supercomputer performance, such as the insufficiency of concurrent partial problem-solving at particular processing locations, which hinders efficiency and scalability because it can require more parallelism. Information delays and unwanted overhead also can slow supercomputer performance. Brightwell says the XPRESS project brings together researchers with expertise in operating systems as well as other system software capabilities, such as performance analysis and dynamic resource management, which are crucial to supporting the features needed to effectively manage the complexities of future exascale systems.


Fail-Safe Software Could Stop Flash Crashes
New Scientist (12/20/12) Paul Marks

University of Bristol researchers have proposed using circuit-breaker algorithms that are triggered by erratic trading to avoid stock market flash crashes like the one that occurred on May 6, 2010. The researchers note that similar software systems already monitor medical robots and nuclear reactors for potentially dangerous behavior. High-frequency trading (HFT) "circuit breakers need to be considered as very high reliability software engineered to work under stressed conditions and with multiple backups," says Bristol professor Philip Bond. However, a decision on whether or not to use such a system should be made quickly. Although the proportion of high-frequency trades recently has declined in the U.S., they still account for a majority of trades. Meanwhile, the hardware required to run the software is becoming more affordable. Oxera economist Fod Barnes notes that efforts to prevent erratic trading must be able to distinguish between technical breakdowns and simple bad decision-making. "The HFT system is a bit more complex than just engineering," Barnes says. "Why should those who manage to write an algorithm that makes a series of very bad trades be protected from their own folly?"
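One simple form such an algorithm can take is a price-band breaker: halt trading in an instrument when its price moves more than a threshold within a short rolling window. The sketch below is one of many possible designs, not the Bristol group's actual algorithm, and the window and threshold values are illustrative.

```python
# Minimal price-band circuit breaker: trip when the price range within a
# rolling window of recent ticks exceeds a fixed fraction of the low.

from collections import deque

class CircuitBreaker:
    def __init__(self, window=5, max_move=0.05):
        self.window = deque(maxlen=window)  # recent prices
        self.max_move = max_move            # e.g. a 5% band
        self.halted = False

    def on_price(self, price):
        self.window.append(price)
        lo, hi = min(self.window), max(self.window)
        if lo > 0 and (hi - lo) / lo > self.max_move:
            self.halted = True              # trip: pause the matching engine
        return self.halted

cb = CircuitBreaker()
for p in [100.0, 100.2, 99.9, 100.1]:
    cb.on_price(p)
print("halted after normal ticks:", cb.halted)   # False
cb.on_price(93.0)                                # flash-crash-style drop
print("halted after sharp drop:", cb.halted)     # True
```

A production version would need the "very high reliability" engineering Bond describes: redundant feeds, tested failover, and careful tuning so that, as Barnes warns, it trips on breakdowns rather than on merely bad trading decisions.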


MIT Research Shows New Magnetic State that Could Aid Quantum Computing
IDG News Service (12/19/12) Stephen Lawson

Massachusetts Institute of Technology (MIT) researchers have demonstrated quantum spin liquid (QSL), a new type of magnetism they say could be used in future communications, computing, and data storage technologies. MIT professor Young Lee says QSL involves a tiny crystal of a rare mineral that took 10 months to create. While studying how QSL works, the researchers saw a phenomenon known as long-range entanglement, which they say could be applied to new types of storage, computing, and networking. In the new material, each particle constantly changes its magnetic moment. "If the magnetic moments do not order, and they're constantly fluctuating with respect to each other, then we call that a liquid," Lee says. As part of the QSL research, which was carried out using a technique called neutron scattering, the crystal had to be cooled to near absolute zero, according to the researchers. The long-range entanglement phenomenon could aid in the development of quantum computing, which uses a qubit based on the quantum state of an atomic particle to represent each bit of information. However, "there are issues that need to be improved in these qubits so you can have a quantum state that lasts a very long time without, essentially, decaying," Lee says.


Research Outlays to Decline Next Year
Wall Street Journal (12/18/12) Gautam Naik

Research and development spending by governments and corporations in the U.S. and Europe could decline in 2013 as a result of weak economies and large national debts, according to a Battelle Memorial Institute forecast. U.S. inflation-adjusted R&D spending is expected to fall by 0.7 percent in 2013 on the assumption that lawmakers resolve the fiscal cliff issue, and could decline further in the absence of a resolution. U.S. inflation-adjusted R&D spending growth averaged about 4 percent annually from 2004 to 2007, but since 2009 growth has failed to outpace inflation. Globally, the U.S. remains the leader in R&D, spending $418.6 billion in 2012 in current-dollar terms while China spent $197.3 billion, but Battelle predicts that China could achieve R&D spending parity with the U.S. by 2022, and possibly as early as 2019 if the fiscal cliff remains unresolved. Worldwide R&D spending in 2013 is projected to increase by 3.7 percent, or $53.7 billion, to nearly $1.5 trillion, of which the largest increase, $22.9 billion, is expected to come from China. The U.S. is predicted to spend $423.7 billion on R&D in 2013, with academic research comprising more than 60 percent of the basic research conducted nationwide.


Open Software Jumpstarts Academic Communities
Center for Digital Education (12/19/12) Tanya Roscorla

City University of New York (CUNY) researchers have developed Commons in a Box, an open source software platform that enables universities and other organizations to install and manage community sites for faculty, staff, administrators, and students. The software runs on WordPress and a plug-in for WordPress called BuddyPress. "When you build your own network, it's meant to be very customizable and very flexible," says CUNY Academic Commons' Boone B. Gorges. "But with that kind of customizability comes a certain amount of complexity that was a barrier for a lot of people." Commons is designed to make learning communities accessible for organizations that do not have the staff or funding to design their own communities. An installation manager guides users through the steps to get their site up and running, and the system will not let anyone deactivate one thing if that deactivation will break something else that is activated. "What we're offering is a way for communities to really own their own data and their own spaces and to be in control of them, which we think is a pretty important move for higher education," says CUNY Academic Commons director Matthew K. Gold.


Breaking Through the Wall of Chinese Language
SBS World News (Australia) (12/19/12) Rhiannon Elston

Researchers from the University of Sydney's School of Information Technologies have created an improved technique for mining information from Chinese-language texts, using what professor Josiah Poon calls a "linguistic approach" to study patterns in the text. "We look at all the grammar structures between the two languages, compare them, and find unique patterns there," he says. The patterns are subsequently mapped to patterns in another language, and student Cathy Xiao Yu says the method they have developed will ease and expedite translation. "Our method is much more powerful because it can be used in different domains," she says. Poon notes the model can be adapted to specific user needs. "You will actually have the power to decide what you want to find out from the document," he says. "That's good for flexibility and also usability." The researchers hope their translation model can be applied to the generation of a smartphone application so the technology can be made available to more users.


Online Translation Breaks Language Barriers
Deutsche Welle (Germany) (12/20/12) Diana Fong

Free online translation services are improving thanks to better data and more sophisticated algorithms. Google's Franz Josef Och reports that Google Translate currently accommodates most of the world's translation, with more than 200 million users worldwide accessing the service each month. Google Translate and Microsoft's Bing translator compute the most likely match between two parallel texts in a given language pair using statistical algorithms. However, Google Translate's output quality suffers in subject areas where translated text for parallel match-ups is not widely available. Although Google Translate cannot make a distinction between human translations and those produced by its own system, Och's team is working on programs for screening out bad machine translations. Also aiding Google's translation improvement efforts are users who translate text from their mother tongue, as they can use a new feature asking them to select the better translation among multiple options. German Research Center for Artificial Intelligence's Aljoscha Burkhardt says a key advantage of linguistic rule-based machine translation is that it gives users more control over the outcomes. "You know what the rule-based system is doing, but you need experts for going into a new subject domain, which requires a new lexicon and a new way of constructing sentences," he notes.
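The "most likely match between parallel texts" principle can be sketched at toy scale: from a sentence-aligned corpus, score each source/target word pair by how strongly they co-occur, then translate by picking the best-scoring match. Production systems use far richer phrase tables and language models; the four-sentence corpus and Dice-coefficient scoring below are illustrative stand-ins.

```python
# Toy word-level statistical translation: learn associations from a tiny
# sentence-aligned German-English corpus, score pairs with the Dice
# coefficient, and translate by most likely match.

from collections import Counter

parallel = [
    ("ein haus", "a house"),
    ("ein buch", "a book"),
    ("das haus", "the house"),
    ("das auto", "the car"),
]

src_count, tgt_count, cooc = Counter(), Counter(), Counter()
for src, tgt in parallel:
    s_words, t_words = src.split(), tgt.split()
    src_count.update(s_words)
    tgt_count.update(t_words)
    for s in s_words:
        for t in t_words:
            cooc[(s, t)] += 1

def translate_word(s):
    # Dice coefficient as a crude association score between word pairs.
    return max(tgt_count,
               key=lambda t: 2 * cooc[(s, t)] / (src_count[s] + tgt_count[t]))

print(" ".join(translate_word(w) for w in "das buch".split()))  # "the book"
```

The article's point about sparse domains falls out directly: when a subject area has little parallel text, the co-occurrence counts are too thin for these scores to be reliable.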


Team Announces Major Breakthrough in Indoor Positioning Research
KAIST (12/17/12)

KAIST researchers have developed a method to build a Wi-Fi radio map that does not require global positioning system signals. The method uses Wi-Fi fingerprints, which are a set of Wi-Fi signals captured by a mobile device and the measurements of received Wi-Fi signal strengths from surrounding access points. The KAIST researchers collected fingerprints from users' smartphones every 30 minutes through the modules embedded in mobile platforms, utilities, or applications, and analyzed their characteristics. "We discovered that mobile devices such as cell phones are not necessarily on the move all the time, meaning that they have locations where they stay for a certain period of time on a regular basis," says KAIST professor Dong-Soo Han. The researchers were able to classify the fingerprints as being collected from either the home or office. Then they converted each home and office address into geographic coordinates to determine the location of the collected fingerprints. "Although there seems to be many issues like privacy protection that have to be cleared away before commercializing this technology, it is no doubt that we will face a greater demand for indoor positioning system in the near future," Han says.
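Once a radio map exists, the localization step itself is conceptually simple: find the stored fingerprint whose access-point signal strengths are closest to what the device currently observes. The sketch below shows that matching step with invented RSSI values; KAIST's actual contribution is the crowdsourced construction and home/office labeling of the map, which this sketch takes as given.

```python
# Nearest-fingerprint matching: locate a device by comparing its observed
# Wi-Fi signal strengths against a radio map of known places.

import math

# Radio map: place -> {access point: mean RSSI in dBm} (invented values).
radio_map = {
    "home":   {"ap1": -40, "ap2": -70, "ap3": -90},
    "office": {"ap1": -85, "ap2": -50, "ap3": -60},
    "cafe":   {"ap1": -75, "ap2": -80, "ap3": -45},
}

MISSING = -100  # RSSI assumed for an access point one side did not hear

def distance(obs, ref):
    aps = set(obs) | set(ref)
    return math.sqrt(sum((obs.get(a, MISSING) - ref.get(a, MISSING)) ** 2
                         for a in aps))

def locate(observation):
    return min(radio_map, key=lambda place: distance(observation, radio_map[place]))

print(locate({"ap1": -42, "ap2": -68}))  # closest to the "home" fingerprint
```

Real deployments must also cope with RSSI noise, moved access points, and device-specific radio differences, which is part of why periodic re-collection of fingerprints (as in the KAIST scheme) matters.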


SANS NetWars Tests Cybersecurity Pros Against Peers
IDG News Service (12/18/12) Grant Gross

About 200 cybersecurity professionals and 30 high school students recently gathered for the SANS Institute's NetWars Tournament of Champions, a realistic cybersecurity competition that offers prizes to the victors as well as bragging rights. The NetWars competition simulates several skills needed by cybersecurity professionals, including system administration, computer forensics, and defending one's own network against live attackers. The U.S. Air Force has been using a modified version of NetWars for about two years to train its cybersecurity professionals. SANS also is developing CyberCity, a simulation that enables cybersecurity professionals to test their skills against simulated attacks on physical infrastructure. The scale-model city will have a hospital, railroad, military base, electric utility, Internet service provider, and other buildings. SANS instructor Ed Skoudis says the institute is building CyberCity so that people training with the institute have a more realistic picture of the consequences of cyberattacks.


Artificial Intelligence Helps Sort Used Batteries
University of Gothenburg (Sweden) (12/17/12) Thomas Melin

Researchers at the University of Gothenburg and Chalmers University of Technology have developed an artificial intelligence (AI) system that sorts used batteries. The system, which uses computerized optical recognition to sort up to 10 batteries per second, is equipped with a trained neural network that can recognize about 2,000 different types of batteries by taking pictures of them from all possible angles. The neural network identifies the batteries by comparing the pictures taken against a database of previously taken pictures. After the batteries have been identified, compressed air separates them into different containers according to chemical content. The AI system was incorporated into a recycling machine from Optisort. "For each single battery, the system stores and spits out information about, for example, brand, model and type," says Optisort CEO Hans-Eric Melin. "This is sparking further research and development so that we will eventually use artificial intelligence to sort all types of waste," Melin says.
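The classify-then-route step can be sketched compactly. The real system runs a trained neural network on camera images; in the illustration below a nearest-neighbor lookup on invented feature vectors stands in for that classifier, and the battery models, features, and container numbers are all hypothetical.

```python
# Sketch of sort-and-route: classify a battery by nearest match in a
# reference database, then pick the container for its chemistry.

import math

# Reference database: battery model -> (feature vector, chemistry).
database = {
    "BrandA AA":       ((0.9, 0.1, 0.3), "alkaline"),
    "BrandB 9V":       ((0.2, 0.8, 0.5), "alkaline"),
    "BrandC coin":     ((0.4, 0.4, 0.9), "lithium"),
    "BrandD cordless": ((0.1, 0.9, 0.1), "nicd"),
}

container = {"alkaline": 1, "lithium": 2, "nicd": 3}

def classify(features):
    # Nearest neighbour by Euclidean distance over the feature vectors.
    model = min(database, key=lambda m: math.dist(features, database[m][0]))
    return model, database[model][1]

model, chemistry = classify((0.38, 0.45, 0.85))  # features from the camera
print(model, "-> container", container[chemistry])
```

This also shows why the system can log brand, model, and type per battery, as Melin describes: identification happens before the compressed-air routing, so the metadata is available for free.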


Computer Engineering: Feeling the Heat
Nature (12/12/12) Philip Ball

Heat has become the biggest obstacle to computing's continued advancement, and researchers are experimenting with new ways of cooling chips. For example, IBM researchers believe that flowing water could be used to extract heat, as well as provide power for the chip's circuitry by carrying dissolved ions that engage in electrochemical reactions at energy-harvesting electrodes. The researchers are developing microfluidic cells for powering microprocessors, using a reduction-oxidation process based on vanadium ions. The researchers believe that future processors will have significantly lower power requirements. Transistors currently are packed into 2D chips, but if they were stacked in 3D arrays, the energy lost in data transport could be cut drastically. "If you reduce the linear dimension by a factor of 10, you save that much in wire-related energy, and your information arrives almost 10 times faster," says IBM researcher Bruno Michel. He argues that if computers are to be packaged three-dimensionally, they should mimic the human brain's hierarchical architecture. Michel says 3D packaging could reduce computer volume by a factor of 1,000, and power consumption by a factor of 100, compared to current 2D architectures.
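Michel's geometric figures can be sanity-checked with back-of-envelope arithmetic, since wire-related energy and signal propagation delay scale roughly with wire length (hence with linear dimension), and volume with its cube. This is illustrative arithmetic only; the quoted factor-of-100 power saving combines several effects and does not follow from geometry alone.

```python
# Back-of-envelope scaling check for 3D chip packaging.

shrink = 10                     # factor by which the linear dimension shrinks
wire_energy_saving = shrink     # energy  ~ wire length ~ linear dimension
latency_speedup = shrink        # latency ~ wire length ~ linear dimension
volume_reduction = shrink ** 3  # volume  ~ (linear dimension)^3

print(f"~{wire_energy_saving}x less wire energy, "
      f"~{latency_speedup}x faster signals, "
      f"~{volume_reduction}x smaller volume")
```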


Abstract News © Copyright 2012 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe