Welcome to the September 28, 2016 edition of ACM TechNews, providing timely information for IT professionals three times a week.
HEADLINES AT A GLANCE
Why Human Brains Hold the Key to Smarter Artificial Intelligence
CORDIS News (09/27/16)
The European Union-funded INSIGHT project has gained a new understanding of how people solve problems by examining how ideas evolve in the brain over a lifetime, work that could one day lead to smarter robots. Project coordinator Eors Szathmary, a professor at Hungary's Eotvos University, notes robots currently "lack adequate algorithms for insight problem-solving in various contexts, which is vital in human understanding." INSIGHT used computer simulations, robots, examinations of cell cultures, and human psychology experiments and neuroimaging to support the hypothesis that cognitive adaptations operate in real time within the human brain's neural networks over its life. For example, rat neurons were activated to learn temporal activity patterns, which were recorded and played back to a naive network to determine if the acquired information could be duplicated. Robots were given natural selection algorithms designed to foster open-ended autonomous exploration, and were tested to see if they could establish their own goals. "Ultimately, robots could be able to generate their own values and desires, and in a sense have minds of their own," Szathmary says. INSIGHT also has created an evolutionary robotics toolkit that enables anyone with a computer to evolve robot bodies and brains in physics-based simulations, three-dimensionally print these elements, and assemble the robot and observe its real-world behavior.
Quantum Computing a Step Closer to Reality
Australian National University (09/27/16) Will Wright
Australian National University (ANU) researchers say they have taken another step toward creating quantum computers by stopping light in a new experiment. Controlling the movement of light is critical to developing future quantum computers, says ANU postdoctoral researcher Jesse Everett. "Optical quantum computing is still a long way off, but our successful experiment to stop light gets us further along the road," he says. Everett notes quantum computers based on light could connect with communication technology such as optical fibers and have potential applications in fields such as medicine, defense, telecommunications, and financial services. The ANU experiment involved creating a light trap by shining infrared lasers into ultra-cold atomic vapor. "The atoms absorbed some of the trapped light, but a substantial proportion of the photons were frozen inside the atomic cloud," Everett says. ANU professor Ben Buchler says the light-trap experiment demonstrated incredible control of a very complex system. The new method lets the researchers manipulate the interaction of light and atoms with great precision, and ANU postdoctoral fellow Geoff Campbell notes the team's work could potentially lead to a quantum logic gate, which is the building block of a quantum computer.
'Missing Link' Found in the Development of Bioelectronic Medicines
University of Southampton (United Kingdom) (09/27/16)
Researchers from the University of Southampton in the U.K. have demonstrated that memristors could be the missing link in the development of implants that use electrical signals from the brain to help treat medical conditions. The researchers showed memristors could provide real-time processing of neurological signals leading to efficient data compression and the potential to develop more precise and affordable neuroprosthetics and bioelectronic medicines. The researchers developed a nanoscale Memristive Integrating Sensor (MIS) into which they fed a series of voltage-time samples, which replicated neurological electrical activity. The metal-oxide MIS was able to encode and compress neurological spiking activity recorded with multi-electrode arrays. The researchers say this approach addressed previous bandwidth constraints and also was very power-efficient, as the energy needed for each recording channel was up to 100 times lower when compared to current best practices. "We are thrilled that we succeeded in demonstrating that these emerging nanoscale devices, despite being rather simple in architecture, possess ultra-rich dynamics that can be harnessed beyond the obvious memory applications to address the fundamental constraints in bandwidth and power that currently prohibit scaling neural interfaces beyond 1,000 recording channels," says Southampton professor Themis Prodromakis.
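The compression principle at work here can be reduced to a hypothetical sketch: transmit only the samples that cross a spike threshold and discard the quiet baseline. The threshold and trace below are illustrative toy values, not figures from the MIS device.

```python
def compress_spikes(samples, threshold):
    """Return (index, value) pairs for samples whose magnitude crosses
    the threshold; the quiet baseline between spikes is discarded."""
    return [(i, v) for i, v in enumerate(samples) if abs(v) >= threshold]

# A mostly-quiet trace with two spike-like excursions.
trace = [0.01, -0.02, 0.9, 0.05, 0.0, -0.85, 0.03]
events = compress_spikes(trace, threshold=0.5)
# Only 2 of the 7 samples survive, so far less data has to leave the implant.
```

Event-driven schemes like this are attractive for neural interfaces because spiking activity is sparse: most of the raw signal carries no events at all.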
IBM's Brain-Inspired Chip Tested for Deep Learning
IEEE Spectrum (09/27/16) Jeremy Hsu
IBM's TrueNorth computer chip is designed to emulate the structure and functions of a human brain and support deep learning. Deep-learning algorithms typically rely on convolutional neural networks consisting of layers of nodes to filter data, and IBM has demonstrated such algorithms can run on neuromorphic hardware that supports spiking neural networks, which more closely mimic the way real neurons work. IBM researchers developed a new algorithm that enables the company's brain-inspired hardware to run convolutional neural networks with 65-percent to 97-percent accuracy on datasets involving vision and speech challenges. TrueNorth can process between 1,200 and 2,600 video frames each second, detecting patterns from as many as 100 cameras simultaneously. "The new milestone provides a palpable proof-of-concept that the efficiency of brain-inspired computing can be merged with the effectiveness of deep learning, paving the path towards a new generation of chips and algorithms with even greater efficiency and effectiveness," says IBM's Dharmendra Modha, a recipient of the ACM Gordon Bell Prize in 2009. He notes the chip's design will enable it to run multiple types of artificial intelligence networks on the same chip, unlike traditional deep-learning hardware specialized to run only convolutional neural networks. IBM researchers say they will continue to test TrueNorth's capabilities for deep learning by gradually introducing hardware constraints.
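One common way to bridge conventional and spiking networks is rate coding, in which a real-valued activation becomes the fraction of time steps on which a neuron fires. TrueNorth's actual mapping is more elaborate, so the sketch below only illustrates the general idea.

```python
def to_spike_train(activation, steps=10):
    """Deterministic rate code: a neuron with activation a in [0, 1]
    fires on round(a * steps) of the available time steps."""
    fires = round(activation * steps)
    return [1] * fires + [0] * (steps - fires)

def from_spike_train(train):
    """Recover the activation as the firing rate over the window."""
    return sum(train) / len(train)

train = to_spike_train(0.7)   # fires on 7 of 10 steps
# from_spike_train(train) recovers 0.7: the rate preserves the activation.
```

The trade-off is temporal resolution: finer activation values need longer spike windows, which is one reason mapping trained networks onto spiking hardware is nontrivial.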
Automating Encryption for Software Developers
Technical University of Darmstadt (Germany) (09/23/16) Boris Hanssler
Researchers at Germany's Technical University of Darmstadt (TU Darmstadt) want to automate encryption so software developers can correctly integrate cryptographic protocols into applications that communicate over the Internet. Between 2013 and 2015, 1,769 security vulnerabilities registered in the U.S. National Vulnerability Database stemmed from mistakes made by software developers, and issues involving the integration of encryption protocols in applications were the fourth most frequent source of such vulnerabilities. In addition, recent studies demonstrated software integration is an important point of weakness because the use of components of cryptographic libraries requires knowledge of too many details that app programmers often do not possess. Software developers should ensure the individual steps of an encryption protocol are executed in a specific order, for which concrete recommendations are available depending on the data that is to be protected. However, the researchers note developers frequently lack the time to read the appropriate manuals, and another source of errors is digital certificates that verify the validity of a given key. Developers sometimes disable the verification process for their software certificates in order to speed up testing, but then forget to re-enable it for the production system. "These are both common errors, even among serious software providers; and those are just two examples among many," says TU Darmstadt researcher Mira Mezini.
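Both error patterns can be made concrete in a few lines of Python's standard ssl module. The "insecure" context below reproduces the disabled-verification mistake the researchers describe, shown here strictly as an anti-pattern, not something to ship.

```python
import ssl

# Correct: the default context verifies certificates and hostnames.
ctx = ssl.create_default_context()
assert ctx.verify_mode == ssl.CERT_REQUIRED
assert ctx.check_hostname is True

# The error pattern: verification disabled "temporarily" for testing and
# then forgotten, leaving production connections unauthenticated.
# Note the ordering constraint: check_hostname must be disabled before
# verify_mode can be set to CERT_NONE -- a small example of the
# step-ordering pitfalls the researchers describe.
insecure = ssl.create_default_context()
insecure.check_hostname = False
insecure.verify_mode = ssl.CERT_NONE   # never ship this to production
```

Tooling that flags such contexts automatically, rather than relying on developers to re-read manuals, is exactly the kind of support the TU Darmstadt project aims to provide.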
An Algorithm for Taxi Sharing
Researchers from the University of the Republic in Uruguay have developed an evolutionary algorithm to enable a smart city to facilitate efficient taxi sharing to reduce an individual's transportation costs, as well as reduce traffic congestion and pollution. Tests on global-positioning system data for taxi journeys found the new algorithm can reduce delays and travel costs for a group of passengers. The researchers note smart cities can use information and communications technologies to improve the quality and performance of urban services. "In this way, it is possible to reduce costs, increase efficiency in the use of resources, and allow a more active participation of citizens," the researchers say. They note the practice of sharing vehicles has both economic and environmental benefits, at both the individual and population levels. "In parallel with carpooling and private drivers transporting private citizens, sharing conventional taxi journeys offers similar benefits," the researchers note. They say the next step is developing an online application for end-users.
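A toy cost model makes the claimed savings concrete. The figures and the cost function below are invented for illustration; the Uruguayan algorithm evolves such passenger groupings with mutation and recombination against a real route-cost objective rather than scoring hand-picked plans.

```python
def shared_cost(groups):
    """Cost of a taxi-sharing plan: each taxi drives as far as its
    farthest passenger, plus a small detour penalty per extra pickup.
    (An invented toy model, not the researchers' objective function.)"""
    return sum(max(g) + 0.5 * (len(g) - 1) for g in groups)

# Four passengers at these trip distances, two candidate plans:
solo   = shared_cost([[4.0], [4.2], [9.0], [9.1]])  # four separate taxis
shared = shared_cost([[4.0, 4.2], [9.0, 9.1]])      # nearby riders pair up
# Sharing cuts the total from 26.3 to 14.3; an evolutionary algorithm
# searches the space of such groupings, keeping the cheaper plans.
```

The search space of groupings grows combinatorially with the number of passengers, which is why a heuristic such as an evolutionary algorithm is a natural fit.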
Efforts to Bring Computer Science to All Students Make Progress
eSchool News (09/26/16) Laura Devaney
The CSforAll Consortium, a new initiative announced at the White House's Summit on Computer Science Education, aims to give every U.S. student from kindergarten through high school computer science (CS) skills. In addition to the national CSforAll initiative, there are several regional initiatives. For example, Codesters is a key partner in the CodeBrooklyn campaign to advance CS in Brooklyn's 500 public schools. "Codesters has been essential to expanding the reach of computer science into underserved schools in Brooklyn," says CodeBrooklyn co-founder Rob Underwood. The CSforAll Consortium website will serve as a hub for families, schools, and districts seeking resources that match their needs, and it also will help connect members of the national CS education community, provide a process for disseminating their work, and track collective progress toward providing every student with the opportunity to learn CS. The consortium currently consists of more than 180 organizations, including nonprofits, government, industry, and education organizations. The consortium steering committee will be chaired by CSNYC and includes ACM, Code.org, The College Board, the Computer Science Teachers Association, and the National Center for Women & Information Technology.
Microsoft Bets Its Future on a Reprogrammable Computer Chip
Wired (09/25/16) Cade Metz
Microsoft's Project Catapult is designing and implementing reprogrammable computer processors for a new operational architecture that Internet companies are expected to use to run future Internet services on a massive scale. A major step forward has been the advent of field programmable gate arrays (FPGAs), which will drive new deep neural network-based search algorithms in the coming weeks. Catapult team member Doug Burger expects every new Microsoft server to eventually have an FPGA, noting it "gives us massive capacity and enormous flexibility." The fast, energy-efficient, and customizable FPGAs address the problem that designing specialized, purpose-built chips for each new task is generally too costly as technologies and business models keep changing. The FPGA team says it went through several chip redesigns before Catapult was ready to go live. The final prototype consisted of chips sitting at the edge of each server and interfacing directly with the network while still supporting a pool of FPGAs to which any machine could connect. Microsoft's Peter Lee says Catapult will enable the company to maintain the expansion of its global supercomputing capabilities through 2030, and then start moving toward quantum computing.
Single Photon Carries 10 Bits of Information
Technology Review (09/23/16)
Tristan Tentrup and colleagues at the University of Twente in the Netherlands have packed more than 10 bits into a single photon for the first time. The team created an alphabet with 9,072 symbols, with each symbol in principle encoding more than 13 bits of information. The researchers defined a 112 by 81 grid of pixels--9,072 of them--and each pixel represented a different symbol of the alphabet. To encode a photon with one of the symbols, Tentrup says the team used a spatial light modulator to point the photon toward that part of the grid, so the pixel that registers the photon's arrival identifies the symbol. The team significantly improved on the previous record of just seven bits per photon. Tentrup says physicists use information encoded in single photons with the binary code of 0s and 1s for applications such as the distribution of keys in quantum cryptography. He notes the new technique immediately enables each photon to carry an order of magnitude more. Tentrup says the work could enable the implementation of a large-spatial-alphabet encoding for quantum key distribution.
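The arithmetic behind the capacity claim is straightforward: a photon landing on one of 9,072 distinguishable pixels carries log2(9072) bits.

```python
import math

pixels = 112 * 81                    # the pixel grid from the experiment
bits_per_photon = math.log2(pixels)
# pixels == 9072 and log2(9072) is about 13.15, so each detected photon
# can in principle carry more than 13 bits, versus 1 bit for a photon
# encoding only a binary 0 or 1.
```

Detection noise and losses reduce the usable rate below this ideal, which is consistent with the more conservative "more than 10 bits" figure achieved in practice.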
Teaching Computers to Identify Odors
Harvard Gazette (09/21/16) Peter Reuell
Harvard University researchers have developed a machine-learning algorithm to train a computer to recognize the neural patterns associated with various scents, and to identify whether specific odors are present in a mix of smells. The algorithm works like other pattern-recognition systems, only the patterns are the neural activation patterns of mice reacting to particular odors. "Essentially, one odor causes a particular neural activation pattern, and another odor causes a different pattern," says Harvard professor Venkatesh Murthy. The researchers trained the algorithm to recognize those patterns by gathering data on the neural activation patterns associated with various odors by imaging the brains of mice. They used 80 percent of the data to train the system to recognize patterns of activation for particular odors. The system examines the patterns and randomly selects pixels, and if those pixels reach a certain level, the system concludes the target odor is present. Over thousands of trials, the algorithm eventually became as successful as mice at identifying whether a specific odor was present in a mix of scents. The researchers say their study suggests computer-learning algorithms could be potentially powerful tools to examine the sense of smell, and a way to design and conduct experiments in a virtual space before performing them in the real world.
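A hypothetical miniature of the decision rule described above: choose a random subset of pixels from the activation pattern and call the odor present when a majority of them are active. The patterns and threshold below are toy values, not mouse imaging data.

```python
import random

def odor_present(pattern, pixel_ids, threshold):
    """Vote over a fixed set of pixels: call the target odor present
    when a majority of the chosen pixels exceed the activation threshold."""
    active = sum(1 for i in pixel_ids if pattern[i] >= threshold)
    return active > len(pixel_ids) // 2

rng = random.Random(1)
pixel_ids = rng.sample(range(8), 4)   # randomly selected pixels

odor_a = [0.9, 0.8, 0.7, 0.9, 0.7, 0.6, 0.8, 0.9]  # odor present: strong
blank  = [0.1, 0.0, 0.2, 0.1, 0.0, 0.1, 0.2, 0.1]  # odor absent: weak
# odor_present(odor_a, pixel_ids, 0.5) is True; on blank it is False.
```

In the real study the pixel selection and thresholds were learned from the 80-percent training split of the imaging data, rather than fixed by hand as here.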
Army Researchers Teach Translator Devices to Grasp Dialects
Army News Service (09/21/16) David Vergun
A team led by Steve LaRocca at the U.S. Army Research Laboratory's Multilingual Computing and Analysis Branch has developed algorithms that can enable speech-translator devices to comprehend French dialects and convert them into English. LaRocca says the algorithms were mainly designed for translation tasks in Africa, where French dialects are spoken in 21 nations. His team was permitted by a speech-translator device vendor to unlock the gadget so they could add an algorithm overlay onto the standard French model that would process the dialect. Some team members first went to Africa to collect speech samples from natives, and then the recorded speech was parsed into frequency bands to find the energy peaks typical of the vowels and consonants of the accent. LaRocca says the next step involved generating a "fuzzy match" of those peaks and valleys of energy and applying that to a model that anticipates what the person has just spoken based on those patterns. He notes recent milestones in deep neural learning have made such algorithms possible. LaRocca's team expects field-testing of the enhanced translator to begin next spring, with a 20-percent reduction in word-error rate over the standard French device.
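An illustrative "fuzzy match" over band-energy profiles: compare a spoken sample's energy distribution to stored accent templates and pick the closest. The profiles and template names are made-up numbers for illustration, not Army Research Laboratory data.

```python
def fuzzy_match(profile, templates):
    """Return the name of the template whose band-energy profile is
    nearest to the sample, by L1 (sum of absolute differences) distance."""
    def dist(a, b):
        return sum(abs(x - y) for x, y in zip(a, b))
    return min(templates, key=lambda name: dist(profile, templates[name]))

templates = {
    "standard_french": [0.9, 0.4, 0.2, 0.1],  # energy per frequency band
    "african_dialect": [0.7, 0.6, 0.4, 0.1],
}
sample = [0.72, 0.58, 0.38, 0.12]   # energies from a recorded utterance
# fuzzy_match(sample, templates) picks "african_dialect" as the closer profile.
```

A real recognizer matches whole sequences of such profiles over time with a statistical or neural model, which is where the deep-learning advances LaRocca mentions come in.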
Students' Web App Seeks to Simplify, Modernize Voter Registration
MIT News (09/22/16) Michael Patrick Rutter
Votemate, a new Web application developed by five Massachusetts Institute of Technology (MIT) students, Microsoft's Scott Su, and marketing specialist Brandon Uveges could make it easier for people to confirm they are registered to vote. Votemate is designed to integrate with many state-based voter registration websites, according to its developers. They note the app will deliver a short sample registration form to unregistered voters, and it will display only the information required to register and the registration deadline for the next election. Users will be able to find their polling place and receive email reminders for the next election. The team conceived the app as a solution to simplifying and modernizing voter registration for millennials. "With Votemate, we wanted to ensure that our peers and anyone else new to the political process could engage in a familiar way," says MIT graduate student Keertan Kini. "Our Web app lowers the barriers to register and, we hope, vote." The developers note a study of the 2008 election found 21 percent of those between 18 and 24 years of age reported they did not know how or where to register to vote.
Acoustic Resonator Device Paves the Way for Better Communication
Yale News (09/20/16) William Weir
Researchers at Yale University say they have developed a piezo-optomechanical device that achieves "a strong coupling" between a superconducting microwave cavity and a bulk acoustic resonator system. The researchers say the device achieves an exchange of energy and information between the microwave and mechanical resonator systems in a way that exceeds the dissipation of energy of each of the individual systems, so information does not get lost. Postdoctoral researcher Xu Han says the system operates at a very high frequency of 10 GHz, allowing for high-speed signal processing while also making it easier to observe quantum phenomena in experiments. Han says information storage is a potential application, and the device's compatibility with superconducting quantum bits could mean an important step toward hybrid quantum systems. He is developing a device that uses the mechanical system to convert information from the microwave domain to the optical domain. "If you want to transmit the information signal, you have to use optics, because optical fiber has very low loss over a long distance," Han notes.
Abstract News © Copyright 2016 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: [email protected]