Association for Computing Machinery
Welcome to the August 29, 2016 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets and for iPhones and iPads.

HEADLINES AT A GLANCE


Government Sees Potential 'Quantum Ecosystem' in Australia
Computerworld Australia (08/29/16) George Nott

Australian industry, innovation, and science minister Greg Hunt says government backing of quantum computing research will help cultivate a local "quantum ecosystem" and industry, in addition to sustaining Australia's competitive edge in the race to build a scalable silicon-based quantum computer. "This ecosystem would be defensible in the medium term, due to the unique nature of the technology, but we need to look beyond that," says Commonwealth Bank CIO David Whiteing. "We're now thinking about software and services that are relevant to our customers, and now's the time to invest in those opportunities and begin to create something truly differentiated." This year also saw the opening of a new facility at the Center for Quantum Computation and Communication Technology (CQC2T), which received $25 million in government funding as part of Australia's National Innovation and Science Agenda. The CQC2T's ambitions will be complemented by the Australian Research Council Center of Excellence for Engineered Quantum Systems' mission to "initiate the Quantum Era in the 21st century by engineering designer quantum systems." Among CQC2T director Michelle Simmons' concerns is a scarcity of young people learning the basics of computer science. Whiteing agrees the quantum ecosystem cannot be realized without aggressive support for education.


New Approach to Computing Boosts Energy Efficiency
Chalmers University of Technology (08/26/16)

The Chalmers University of Technology in Gothenburg, Sweden, is leading a European project to upgrade the energy efficiency of computer systems. The three-year Excess project, which will be completed this week, sought to address "a clear lack of tools and mathematical models to help the software engineers to program in an energy-efficient way, and also to reason abstractly about the power and energy behavior of her software," says Chalmers professor and project leader Philippas Tsigas. He notes the project espouses a holistic, integrated approach in which software and hardware work together to enable coders to make power-aware architectural decisions early. "This allows for larger energy savings than previous approaches, where software power optimization was often applied as a secondary step, after the initial application was written," Tsigas says. Excess' toolkit features fundamentally new energy-saving hardware components, such as the Movidius Myriad vision-processor platform, as well as efficient libraries and algorithms. The results of tests run on large data-streaming aggregations have been impressive: a coder can deliver a solution 54 times more energy efficient on the Excess framework than on a standard high-end PC processor deployment. The holistic strategy first presents the hardware's benefits and then shows how best to divide computations within the processor for additional performance gains.
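
The kind of reasoning such tools support can be as simple as trading clock speed against power draw. The following sketch illustrates that idea with invented figures; none of the numbers are Excess measurements.

    # Toy energy model of the kind power-aware tooling enables: energy is
    # power integrated over runtime, so the fastest configuration is not
    # always the most efficient. All figures below are invented.
    configs = {
        # name: (average power in watts, runtime in seconds for the same job)
        "high-end-pc-core": (95.0, 10.0),
        "embedded-accelerator": (1.2, 140.0),
    }

    for name, (watts, seconds) in configs.items():
        joules = watts * seconds  # E = P * t
        print(f"{name}: {joules:.0f} J")
    # high-end-pc-core: 950 J; embedded-accelerator: 168 J. The slower
    # configuration wins on energy, the trade-off such tools surface early.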


Secure Networks for the Internet of the Future
University of Wurzburg (08/25/16) Robert Emmerich

Researchers at the University of Wurzburg in Bavaria, Germany, working as part of the SEcure Networks for a DATa center cloud in Europe (SENDATE) consortium, are developing secure and efficient networks for the Internet of the future. "If we want data centers to continue operating in a secure, flexible, reliable, and instantaneous manner, telecommunication networks and [information technology] will have to be consolidated," says Wurzburg professor Phuoc Tran-Gia. In addition, Tran-Gia says Internet researchers must decentralize computing and storage capacities and bring them closer to end-users. The SENDATE consortium aims to develop a network architecture and technologies for secure and flexible distributed computing centers. "Innovative technologies and approaches such as the virtualization of network functions [NFV] combined with software-defined networking [SDN] establish the basis for this," Tran-Gia says. The project, which is scheduled to run until February 2019, has a research budget of about $82 million. The researchers also are investigating the development, operation, and optimization of virtual network functions and their positioning in distributed data centers.
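
The consortium's goal of moving computing closer to end users can be pictured with a toy placement heuristic for a virtual network function. The sites, latencies, and capacities below are hypothetical and are not drawn from SENDATE's design.

    # Toy latency-aware placement of a virtual network function (VNF):
    # choose the closest data center that still has free capacity.
    # All sites and numbers are hypothetical.
    data_centers = [
        {"name": "frankfurt-core", "latency_ms": 38, "free_cpus": 12},
        {"name": "munich-edge", "latency_ms": 9, "free_cpus": 2},
        {"name": "wurzburg-edge", "latency_ms": 4, "free_cpus": 0},
    ]

    def place_vnf(required_cpus, sites):
        """Return the lowest-latency site that can host the VNF."""
        candidates = [s for s in sites if s["free_cpus"] >= required_cpus]
        if not candidates:
            raise RuntimeError("no site has enough free capacity")
        return min(candidates, key=lambda s: s["latency_ms"])

    site = place_vnf(required_cpus=2, sites=data_centers)
    print(f"placing VNF at {site['name']} ({site['latency_ms']} ms from users)")
    # -> munich-edge: the nearest edge site with capacity, not the distant core.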


Extending Battery Life for Mobile Devices
University of Massachusetts Amherst (08/26/16)

Mobile devices will be able to leverage battery power in larger nearby devices for communication using new radio technology developed at the University of Massachusetts Amherst (UMass Amherst). A paper on the new technology was presented last week at the ACM Special Interest Group on Data Communication (SIGCOMM 2016) conference in Brazil. "We take for granted the ability to offload storage and computation from our relatively limited personal computers to the resource-rich cloud," says UMass Amherst professor Deepak Ganesan. "In the same vein, it makes sense that devices should also be able to offload how much power they consume for communication to devices that have more energy." The team enabled Bluetooth to operate asymmetrically like radio-frequency identification (RFID). Dubbed Braidio for "braid of radios," the technology operates like a standard Bluetooth radio when a device has sufficient energy, but runs like RFID when energy is low, offloading energy use to a device with a larger battery when needed. The team developed a prototype radio that could help extend the life of batteries in small, mass-market mobile devices such as fitness trackers and smartwatches. Ganesan says battery life could be extended hundreds of times in some cases, and "energy offload" techniques could lead to smaller and lighter devices in the future.
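
One rough way to picture Braidio's behavior is a radio that picks its operating mode from the local energy budget. The thresholds, mode names, and battery figures in this sketch are illustrative assumptions, not the published protocol.

    # Sketch of energy-proportional radio mode selection in the spirit of
    # Braidio: an energy-rich device transmits actively (Bluetooth-like);
    # a depleted one backscatters (RFID-tag-like), shifting most of the
    # energy cost to the better-powered peer. Threshold values are invented.
    ACTIVE = "active-bluetooth-like"    # device powers its own transmissions
    BACKSCATTER = "passive-rfid-like"   # peer supplies the carrier; device reflects

    def choose_mode(own_battery_mwh, peer_battery_mwh, threshold_mwh=50.0):
        if own_battery_mwh >= threshold_mwh:
            return ACTIVE
        if peer_battery_mwh > own_battery_mwh:
            return BACKSCATTER          # offload energy cost to the richer peer
        return ACTIVE                   # no richer peer; spend our own energy

    # A fitness tracker at 5 mWh talking to a phone at 4,000 mWh:
    print(choose_mode(own_battery_mwh=5.0, peer_battery_mwh=4000.0))
    # -> passive-rfid-like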


Electrons at the Speed Limit
ETH Zurich (08/26/16) Oliver Morsch

Researchers at the Swiss Federal Institute of Technology in Zurich (ETH Zurich) led by professor Ursula Keller have explored the ultimate speed at which electrons can be controlled by electric fields, which has implications for future petahertz electronics. Keller and colleagues exposed a 50-nanometer-thick diamond fragment to an infrared laser pulse lasting a few femtoseconds, during which the light's electric field oscillated back and forth five times and excited the electrons. The researchers saw that the absorption varied characteristically, following the rhythm of the oscillating electric field of the infrared pulse. A joint project between ETH and Japan's University of Tsukuba modeled the electrons' reaction to the laser pulse on a supercomputer and determined the simulated absorption was consistent with the Zurich measurements. "The advantage of the simulations compared to the experiment...is that several of the effects that occur in real diamond can be switched on or off, so that eventually we were able to ascribe the characteristic absorption behavior of diamond to just two such energy bands," says ETH postdoctoral researcher Matteo Lucchini. The results offered proof that the dynamical Franz-Keldysh effect was responsible for the absorption in diamond under the influence of the laser pulse.
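
For scale, "petahertz electronics" would switch at optical frequencies, a million times faster than the gigahertz clocks of today's processors. A back-of-the-envelope conversion (the 800-nanometer wavelength is an assumed illustrative value, not taken from the experiment) shows near-infrared light already oscillates at a sizable fraction of a petahertz:

    $f = \frac{c}{\lambda} = \frac{3 \times 10^{8}\,\mathrm{m/s}}{800 \times 10^{-9}\,\mathrm{m}} \approx 3.75 \times 10^{14}\,\mathrm{Hz} \approx 0.375\,\mathrm{PHz}$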


UTA Physicists to Upgrade Titan Supercomputer Software for Extreme Scale Applications Such as Biology and Materials Science Simulations
UTA News Center (08/25/16) Louisa Kellie

University of Texas at Arlington (UTA) researchers have been awarded a grant from the U.S. Department of Energy to improve operational efficiencies of the Titan supercomputer to support data-heavy applications. Although the Titan upgrades are primarily meant to handle the massive datasets produced by the particle and nuclear experiments in the Large Hadron Collider (LHC), UTA professor Kaushik De also is interested in providing computing support for advanced biology and materials science simulations. De originally designed the Production and Distributed Analysis (PanDA) system to handle jobs for the LHC's ATLAS particle physics experiment, making ATLAS data available to thousands of scientists using a global grid of networked computing resources. De's latest system, Big PanDA, is a workload management system that schedules jobs on Titan in a way that does not conflict with Titan's traditional computing jobs. De notes even when the supercomputer is fully scheduled, Big PanDA can harvest Titan's idle time for use in other projects. "Using Big PanDA to further integrate supercomputers with the grid and cloud computing would also enable a much more efficient use of computing resources for a wider range of scientific applications," he says.
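
The idle-time harvesting De describes is, in spirit, backfill scheduling: opportunistic jobs fill nodes the primary schedule leaves empty and never displace primary work. The sketch below illustrates the idea with invented job names and sizes; it does not reflect Big PanDA's actual interfaces.

    # Minimal backfill sketch in the spirit of Big PanDA: opportunistic jobs
    # run only on nodes the primary schedule has left idle and never displace
    # primary work. Job names and sizes are hypothetical.
    TOTAL_NODES = 18688  # Titan's node count

    def backfill(primary_nodes_in_use, opportunistic_jobs):
        """Assign opportunistic jobs to idle nodes; return the scheduled names."""
        idle = TOTAL_NODES - primary_nodes_in_use
        scheduled = []
        # Smaller jobs first: they fit into scheduling gaps most easily.
        for job in sorted(opportunistic_jobs, key=lambda j: j["nodes"]):
            if job["nodes"] <= idle:
                idle -= job["nodes"]
                scheduled.append(job["name"])
        return scheduled

    jobs = [{"name": "atlas-sim", "nodes": 300},
            {"name": "bio-md", "nodes": 5000},
            {"name": "materials", "nodes": 16000}]
    print(backfill(primary_nodes_in_use=14000, opportunistic_jobs=jobs))
    # -> ['atlas-sim'] while only 4,688 nodes sit idle.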


The First Autonomous, Entirely Soft Robot
Harvard Gazette (08/24/16) Leah Burrows

Harvard University researchers have demonstrated an autonomous, untethered, entirely soft robot they say could lead to a new generation of completely soft, autonomous machines. Taking a hybrid approach, the team used three-dimensional (3D) printers to quickly produce each of the functional components required for the soft robot body, including fuel storage, power, and actuation. Nicknamed octobot, the soft robot is powered by gas under pressure. A reaction inside the bot transforms a small amount of hydrogen peroxide into a large amount of gas, which flows into the octobot's arms and inflates them like a balloon. The researchers say the simple reaction between hydrogen peroxide and platinum enabled them to replace rigid power sources. The team used a microfluidic logic circuit, a soft analog of a simple electronic oscillator, to control when hydrogen peroxide decomposes to gas in the octobot. "The entire system is simple to fabricate; by combining three fabrication methods--soft lithography, molding, and 3D printing--we can quickly manufacture these devices," says Harvard graduate student Ryan Truby.
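
The "simple reaction" in question is the platinum-catalyzed decomposition of hydrogen peroxide, in which a small volume of liquid fuel yields a much larger volume of oxygen gas to inflate the arms:

    $2\,\mathrm{H_2O_2(l)} \;\xrightarrow{\mathrm{Pt}}\; 2\,\mathrm{H_2O(l)} + \mathrm{O_2(g)}$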


Flood Forecasting Gets Major Upgrade
National Science Foundation (08/24/16) Aaron Dubrow

Researchers at the University of Texas at Austin and the University of Illinois at Urbana-Champaign have created the National Water Model, which provides forecast information, data, decision-support services, and guidance to essential emergency services staff and water management personnel. The National Water Model combines geographic-information system (GIS) data with real-time weather forecasts, and uses sensor data from more than 8,000 U.S. Geological Survey gauges and other sources of information to predict where dangerous flood situations will arise at all 2.7 million stream reaches in the U.S. However, this process requires powerful computers and software that can stitch diverse pieces of information together and produce flow forecasts that can be quickly updated. The researchers used the Texas Advanced Computing Center's Stampede and Lonestar supercomputers to manage the massive surge of data and to perform the necessary calculations to forecast events across the U.S. As part of the U.S. National Science Foundation (NSF)-funded National Flood Interoperability Experiment, the researchers demonstrated the feasibility of simultaneously calculating the river flow for all 2.7 million stream reaches. "This project is a great example of how integrated cyberinfrastructure--hardware, software, data, networks, and people--can be used to solve some of the world's most challenging problems and to positively impact people's lives," says former NSF program director Daniel Katz.
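
At its core, forecasting flow for millions of reaches means walking a river network and accumulating local runoff into downstream flow. The sketch below is a drastically simplified mass-balance illustration with an invented four-reach network; the National Water Model's hydrology is far richer.

    from functools import lru_cache

    # Toy streamflow forecast: each reach's flow is its local runoff (from
    # the weather forecast) plus the flow arriving from upstream reaches.
    # Real models add routing lags, losses, and gauge-based data assimilation.
    upstream = {"outlet": ["trib-a", "trib-b"], "trib-a": [],
                "trib-b": ["headwater"], "headwater": []}
    runoff_cms = {"outlet": 2.0, "trib-a": 5.0,
                  "trib-b": 1.5, "headwater": 4.0}  # cubic meters per second

    @lru_cache(maxsize=None)
    def forecast_flow(reach):
        """Accumulate runoff down the network (mass balance, no routing lag)."""
        return runoff_cms[reach] + sum(forecast_flow(u) for u in upstream[reach])

    print(f"forecast at outlet: {forecast_flow('outlet'):.1f} m^3/s")  # -> 12.5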


Post-Disaster Optimization Technique Capable of Analyzing Entire Cities
Lehigh University (08/24/16) Lori Friedman

Lehigh University researchers have developed the Algorithm with Multiple-Input Genetic Operators (AMIGO), a method they say represents a major improvement over existing computational models and optimization methodologies. AMIGO, which was designed to consider very complex objectives while keeping computational costs down, makes the search process more efficient and expedites the convergence rate by leveraging the additional data in the genetic operators that are used to guide the algorithm toward a solution. Lehigh professor Paolo Bocchini says AMIGO "can be used to solve a variety of scheduling optimization problems common in different fields, including construction management, the manufacturing industry, and emergency planning." The researchers demonstrated the effectiveness of the algorithm by conducting a large-scale numerical analysis using a hypothetical earthquake scenario in San Diego. AMIGO uses heuristic techniques and evolutionary algorithms that are particularly useful for solving multi-objective optimization problems. During testing, AMIGO found a set of near-optimal solutions in a small number of trials. The research is part of the Probabilistic Resilience Assessment of Interdependent Systems project, a collaboration between Lehigh, Florida Atlantic, and Georgia State universities. The team was awarded a $2.2-million grant by the U.S. National Science Foundation last year as part of the Obama administration's "Smart Cities" initiative.
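
The summary does not spell out AMIGO's multiple-input operators, but the evolutionary-algorithm skeleton it builds on looks roughly like this. The repair tasks, weights, and simple swap mutation are invented for illustration.

    import random

    # Skeleton of an evolutionary scheduler: a chromosome is an ordering of
    # repair tasks; the fitness penalizes keeping important components
    # offline for long. Task data are invented; AMIGO's multiple-input
    # genetic operators are more sophisticated than this swap mutation.
    tasks = {"bridge": (10, 9.0), "hospital-line": (4, 10.0),
             "substation": (6, 7.0), "water-main": (3, 6.0)}  # (days, weight)

    def fitness(order):  # lower is better
        elapsed, cost = 0, 0.0
        for name in order:
            days, weight = tasks[name]
            elapsed += days
            cost += weight * elapsed  # importance times time-to-restoration
        return cost

    def evolve(pop_size=30, generations=200):
        pop = [random.sample(list(tasks), len(tasks)) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness)
            survivors = pop[: pop_size // 2]
            children = []
            while len(children) < pop_size - len(survivors):
                child = random.choice(survivors)[:]
                i, j = random.sample(range(len(child)), 2)
                child[i], child[j] = child[j], child[i]  # swap mutation
                children.append(child)
            pop = survivors + children
        return min(pop, key=fitness)

    print(evolve())  # -> ['hospital-line', 'water-main', 'substation', 'bridge']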


Analog DNA Circuit Does Math in a Test Tube
Duke Today (08/23/16) Robin A. Smith

Duke University professor John Reif and colleagues are using DNA to make nanoscale computers. The team has created strands of synthetic DNA that, when mixed together in a test tube in the right concentrations, form an analog circuit that can add, subtract, and multiply as they form and break bonds. Reif says the device performs calculations in an analog fashion by directly measuring the varying concentrations of specific DNA molecules, without requiring special circuitry to convert them to zeroes and ones first. The researchers used software to simulate reactions over a range of input concentrations and tested the circuit experimentally in the lab. Reif says DNA circuits can be far smaller than silicon-based circuits, and work in wet environments, which might make them useful for computing inside the bloodstream or in the soupy, cramped quarters of the cell. Test tube calculations are slow, so commercial applications are a long way off. However, Reif notes the devices potentially could be programmed to sense whether particular blood chemicals lie inside or outside the range of values considered normal, and respond by releasing a specific DNA or RNA that has a drug-like effect.
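
The circuit's analog character, computing directly on concentrations, can be mimicked with idealized mass-action kinetics. The species, rate constant, and concentrations below are invented and do not represent Reif's actual strand-displacement design.

    # Idealized analog addition with DNA-like kinetics: inputs A and B each
    # displace an output strand O from an excess gate G (A+G->O, B+G->O),
    # so the final output concentration approaches [A]0 + [B]0. All rate
    # constants and concentrations are invented.
    a, b, g, o = 3.0, 2.0, 100.0, 0.0   # initial concentrations (nM)
    k, dt = 0.01, 0.01                  # rate constant (1/nM/s), Euler step (s)

    for _ in range(20_000):             # integrate the mass-action rates
        ra, rb = k * a * g, k * b * g
        a -= ra * dt
        b -= rb * dt
        g -= (ra + rb) * dt
        o += (ra + rb) * dt

    print(f"[O] ~= {o:.2f} nM (expected [A]0 + [B]0 = 5.00 nM)")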


System Might Detect Doctored Images and Videos for the Military
Purdue University News (08/23/16) Emil Venere

Purdue University and an international coalition of researchers are developing technologies for the automated assessment of digital images through an end-to-end platform. If successful, the platform could help the U.S. military detect manipulations of open source images. Visual media can be doctored by a wide range of sophisticated image- and video-editing applications, and authenticating the integrity of a potentially valuable image can be difficult. "Many tools currently available cannot be used for the tens of millions of images that are out there on the Net," says Purdue professor Edward Delp. "They take too long to run and just don't scale up to this huge volume." As part of a four-year project funded by the U.S. Defense Advanced Research Projects Agency, researchers are working to create an end-to-end system capable of gleaning useful data from the massive volume of media uploaded regularly to the Internet. Specialized machine-learning computers will be designed to automatically verify the authenticity of uploaded images and videos. Delp says once the technique is fully developed, the system will be available to anyone in the field of media forensics, not just the intelligence community.


The Hype--and Hope--of Artificial Intelligence
The New Yorker (08/26/16) Om Malik

GigaOm founder Om Malik says filtering through the media hype about artificial intelligence (AI) shows its boosters are really referring to data analytics. In contrast, experts generally concur AI "is a set of technologies that try to imitate or augment human intelligence," Malik writes. "To me, the emphasis is on augmentation, in which intelligent software helps us interact and deal with the increasingly digital world we live in." Malik cites a three-stage breakdown of AI evolution outlined by Juji co-founder and human-computer interaction expert (and ACM Distinguished Scientist) Michelle Zhou. The first stage is recognition intelligence, in which algorithms running on increasingly powerful computers recognize patterns and extract topics from a small sample of text or sentences. The second stage is cognitive intelligence, in which machines start inferring meaning from data, and stage three is the generation of virtual beings capable of thinking, acting, and behaving on a par with humans. "We are only in the 'recognition intelligence' phase--today's computers use deep learning to discover patterns faster and better," Malik says. He notes the third stage is a long way off despite the hype, as human intelligence enhancement is currently AI's most valuable function--a capability that requires people to train algorithms to imitate humans.


Abstract News © Copyright 2016 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]

Unsubscribe