Welcome to the February 7, 2018 edition of ACM TechNews, providing timely information for IT professionals three times a week.
|
Worm Uploaded to a Computer and Trained to Balance a Pole
Technische Universität Wien February 6, 2018
Researchers at TU Wien (Vienna University of Technology) in Austria have translated the neural system of a C. elegans nematode into computer code, and taught the virtual worm to balance a pole on the tip of its tail. C. elegans' neural system can be drawn as a circuit diagram or reproduced by computer software, so the worm's neural activity can be simulated by a computer program. C. elegans exhibits a reflexive response to external stimuli, which can be explained and reproduced by analyzing the worm's nerve cells and the strength of their connections. When this simple reflex network is recreated on a computer, the simulated worm reacts to virtual stimulation in exactly the same way, because that behavior is hard-wired in its neural network. "With the help of reinforcement learning, a method also known as 'learning based on experiment and reward,' the artificial reflex network was trained and optimized on the computer," says TU Wien's Mathias Lechner.
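The "experiment and reward" loop can be illustrated with a minimal pole-balancing sketch. The code below is an assumption-laden toy (generic cart-pole physics, a linear sign policy, and random-search training), not the TU Wien worm circuit: it tries candidate policies, scores each by how long the pole stays upright, and keeps the best.

```python
import math
import random

# Illustrative "experiment and reward" loop; NOT the TU Wien worm model.
# Assumptions: generic cart-pole dynamics, a linear sign policy, and
# random-search training with reward = steps the pole stays balanced.

GRAVITY, CART_M, POLE_M, POLE_L = 9.8, 1.0, 0.1, 0.5

def step(state, force, dt=0.02):
    """Advance simplified cart-pole dynamics by one Euler step."""
    x, x_dot, theta, theta_dot = state
    total = CART_M + POLE_M
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    tmp = (force + POLE_M * POLE_L * theta_dot ** 2 * sin_t) / total
    theta_acc = (GRAVITY * sin_t - cos_t * tmp) / (
        POLE_L * (4.0 / 3.0 - POLE_M * cos_t ** 2 / total))
    x_acc = tmp - POLE_M * POLE_L * theta_acc * cos_t / total
    return (x + dt * x_dot, x_dot + dt * x_acc,
            theta + dt * theta_dot, theta_dot + dt * theta_acc)

def episode(w, max_steps=500):
    """Reward: number of steps the pole stays upright under linear policy w."""
    s = (0.0, 0.0, 0.05, 0.0)
    for t in range(max_steps):
        force = 10.0 if sum(wi * si for wi, si in zip(w, s)) > 0 else -10.0
        s = step(s, force)
        if abs(s[0]) > 2.4 or abs(s[2]) > 0.21:  # cart or pole out of bounds
            return t
    return max_steps

def train(n_candidates=200, seed=0):
    """'Experiment and reward': sample policies, keep the best-scoring one."""
    rng = random.Random(seed)
    best_w, best_r = [0.0] * 4, -1
    for _ in range(n_candidates):
        w = [rng.uniform(-1, 1) for _ in range(4)]
        r = episode(w)
        if r > best_r:
            best_w, best_r = w, r
    return best_w, best_r
```

Here the reward is simply episode length; the TU Wien work instead optimized the connection strengths of the worm's simulated reflex network.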
|
Autonomous 3D Scanner Supports Individual Manufacturing Processes
Fraunhofer-Gesellschaft February 6, 2018
Researchers at the Fraunhofer Institute for Computer Graphics Research (IGD) in Germany have developed the first autonomous three-dimensional (3D) scanner that works in real time, which promises to be a boon to manufacturers. A component on a turntable is mapped by the scanner, which is mounted on a robot arm, while algorithms create a 3D image in the background. A simulation of the image checks whether a 3D print would meet the relevant stability requirements, and then the component is printed. During an initial scan, algorithms calculate what further scans are required so the object can be recorded with the fewest scans; this enables the system to rapidly and independently measure objects that are unknown to it. "Our scan system is able to measure any component, irrespective of its design--and you don't have to teach it," says IGD's Pedro Santos. "Also, you don't need information about [computer-aided design] models or templates."
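The "fewest scans" planning step can be viewed as a set-cover problem: each candidate scanner pose sees part of the surface, and a planner greedily picks the pose that reveals the most unseen points. The sketch below is an illustrative greedy planner with made-up pose and point identifiers, not Fraunhofer IGD's algorithm.

```python
def plan_scans(surface_points, views):
    """Greedy next-best-view planning: repeatedly choose the candidate scan
    pose that covers the most not-yet-seen surface points. `views` maps a
    pose identifier to the set of points visible from that pose."""
    remaining = set(surface_points)
    order = []
    while remaining:
        best = max(views, key=lambda v: len(views[v] & remaining))
        gained = views[best] & remaining
        if not gained:
            break  # leftover points are invisible from every candidate pose
        order.append(best)
        remaining -= gained
    return order, remaining
```

Greedy set cover is not guaranteed to be optimal, but it is a standard, fast heuristic for minimizing the number of scans.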
|
AI Just Learned How to Boost the Brain's Memory
Wired Robbie Gonzalez February 6, 2018
Researchers at the University of Pennsylvania have demonstrated how machine-learning algorithms can decode and augment human memory by triggering the delivery of precisely timed pulses of electricity to the brain. "We're using it to build...something that can look at electrical activity and say whether the brain is in a state that's conducive to learning," says University of Pennsylvania professor Michael Kahana. The project involved 25 epilepsy patients with electrodes implanted in their brains, which were used to record high-resolution brain activity during memory tasks. From the training data, the team built algorithms that predicted which words in a list each patient would likely recall, based solely on their electrode activity. "A closed-loop system lets us record the state of the subject's brain, analyze it, and decide whether to trigger a stimulation, all in a few hundred milliseconds," Kahana says. He notes the system improved patients' ability to remember words by an average of 15 percent.
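The record-analyze-decide pattern Kahana describes can be sketched as a fast classifier feeding a stimulation decision. The code below is a toy stand-in (a tiny logistic model over made-up "electrode features"), not the published UPenn decoder: stimulation fires only when the predicted recall probability is low.

```python
import math

# Toy closed-loop sketch: a minimal logistic classifier predicts recall from
# assumed "electrode features"; stimulation triggers on poor predictions.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.5, epochs=500):
    """Stochastic gradient descent for logistic regression (last weight = bias)."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + w[-1])
            err = yi - p
            for j in range(len(xi)):
                w[j] += lr * err * xi[j]
            w[-1] += lr * err
    return w

def should_stimulate(w, features, threshold=0.5):
    """Stimulate only when predicted recall probability falls below threshold."""
    p = sigmoid(sum(wj * xj for wj, xj in zip(w, features)) + w[-1])
    return p < threshold
```

In the real system the whole loop must complete within a few hundred milliseconds, which is why a lightweight pretrained classifier is attractive.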
|
IBM's New AI Can Predict Psychosis in Your Speech
Futurism Dom Galeon February 5, 2018
Researchers in IBM Research's Computational Psychiatry and Neuroimaging groups, working with universities worldwide, have developed artificial intelligence (AI) for predicting the onset of psychosis in patients by analyzing their speech patterns. Their work builds on an IBM study demonstrating the use of AI to model the differences in speech patterns between high-risk patients who later became psychotic and those who did not. The team evaluated a patient cohort instructed to talk about a story they had just read, and built a retrospective model of their speech patterns, says IBM's Guillermo Cecchi. The study determined the AI could have predicted the eventual onset of psychosis in those patients with 83-percent accuracy. Cecchi thinks the AI could lead to improved diagnosis at the onset of psychosis, and from there to better therapy. "Patients considered at-risk could be quickly and reliably triaged so that the [always-limited] resources can be devoted to those deemed very likely to suffer a first episode of psychosis," he says.
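One family of speech features used in this line of work is semantic coherence between consecutive sentences. As a crude, hedged proxy (the IBM work used much richer NLP features, such as embedding-based coherence and syntactic measures), coherence can be approximated by word overlap:

```python
def coherence_score(sentences):
    """Mean Jaccard word overlap between consecutive sentences: a toy proxy
    for the semantic-coherence features used in speech-based prediction."""
    def words(s):
        return set(s.lower().split())
    pairs = list(zip(sentences, sentences[1:]))
    if not pairs:
        return 0.0
    overlaps = []
    for a, b in pairs:
        wa, wb = words(a), words(b)
        union = wa | wb
        overlaps.append(len(wa & wb) / len(union) if union else 0.0)
    return sum(overlaps) / len(overlaps)
```

Low scores flag speech whose consecutive sentences share little content, one of the signals such models can weigh.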
|
Cornell Tech Professor Explains Hype and Pitfalls of Cryptocurrencies
The Cornell Daily Sun Amol Rajesh February 5, 2018
Cornell University professor Ari Juels sees cryptocurrencies such as bitcoin exacting a heavy energy toll, and argues that less wasteful alternatives are needed. Juels notes the boom in bitcoin "mining" has greatly elevated overall electricity consumption. "The bitcoin network consumes as much electricity at this point as a small nation," he says. "This is unconscionable and unsustainable." Also of concern to Juels is cryptocurrencies' security vulnerabilities, which he views as identical to those of the Internet. "The difference is that cryptocurrencies are a particularly rich and juicy target and their partial anonymity helps black-hat hackers," he notes. Juels says he wants to see positive regulation in the future to ensure stability and foster innovation in the cryptocurrency market. "I'd like to see regulation that effectively brings the best of the old and the new, removing corrupt players in the traditional financial industry, but preventing new and worse unscrupulousness," he stresses.
|
Brain-Implanted Devices Could Lead to UW Medical Breakthroughs
The Seattle Times Katherine Long February 6, 2018
Researchers at the University of Washington's (UW) Center for Sensorimotor Neural Engineering (CSNE) are linking computers to human brains with the ultimate goal of restoring limb movement to paralysis victims. One experimental device developed by the CSNE is a deep brain stimulation system to help control essential tremor. The device only triggers stimulation when a patient moves his or her arms, and UW's Andrew Ko says research suggests intermittent stimulation may be better at treating symptoms than constant stimulation. The device relies on machine-learning algorithms that "decode" certain neural signals picked up by sensors implanted in the brain and correlate them with essential tremor symptoms. "In some ways, it is like a prosthetic device for brain pathways," Ko notes. CSNE researchers also are investigating the possibility of patients being able to activate and adjust the system by thought instead of turning the device on and off manually.
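The intermittent-stimulation idea, triggering only while movement-related activity is detected, can be sketched with a simple smoothed-threshold trigger. Window size and threshold below are assumptions; the actual UW device decodes neural signals with machine-learning algorithms rather than a raw threshold.

```python
from collections import deque

class IntermittentStimulator:
    """Toy trigger: stimulation is on only while the smoothed movement
    signal exceeds a threshold (parameters here are illustrative)."""
    def __init__(self, window=5, threshold=0.5):
        self.buf = deque(maxlen=window)
        self.threshold = threshold

    def update(self, sample):
        """Feed one signal sample; return True if stimulation should fire."""
        self.buf.append(abs(sample))
        return sum(self.buf) / len(self.buf) > self.threshold
```

The moving average suppresses single-sample noise, so brief spikes do not switch stimulation on.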
|
A New CMU Algorithm Can Help Thwart Antibiotic Resistance
Pittsburgh Post-Gazette David Templeton February 6, 2018
Researchers at Carnegie Mellon University (CMU), the University of California San Diego, and St. Petersburg State University in Russia have developed the VarQuest algorithm to aid the discovery of new antibiotics to which bacteria have not developed resistance. VarQuest has mined and structured large scientific databases to identify 10 times more variants of peptidic natural products (PNPs) than previously discovered. CMU professor Hosein Mohimani says VarQuest enabled the team to pre-process databases to facilitate the search that discovered 1,000 variants of the 100 known PNP compounds. "Five or six years ago, researchers in the community of antibiotic discovery realized they have to coordinate their efforts to make sure that they are not doing redundant work, and at the same time, sharing valuable data collected from different parts of the earth," Mohimani notes. Consequently, the new social network of researchers shares databases that include "fingerprints of molecules" of variants that VarQuest found, which would have taken hundreds of years of computation to find using previous techniques.
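The core matching idea, comparing observed molecular masses against a database of known compounds while allowing a bounded mass offset for modifications, can be sketched crudely. Names, masses, and tolerances below are made up for illustration; VarQuest itself performs a far more sophisticated spectra-to-database search.

```python
def find_variants(observed_masses, known_pnps, max_delta=200.0, tol=0.01):
    """Compare observed molecular masses against known PNP masses, treating a
    bounded mass difference as a putative modification. A toy stand-in for
    VarQuest's database-wide variant search."""
    hits = []
    for mass in observed_masses:
        for name, base in known_pnps.items():
            delta = mass - base
            if abs(delta) <= max_delta:
                # near-zero delta means the known compound itself
                kind = "exact" if abs(delta) <= tol else "variant"
                hits.append((mass, name, round(delta, 3), kind))
    return hits
```

Pre-indexing the database by mass (as the article's "pre-process databases" step suggests) turns this quadratic scan into fast lookups at scale.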
|
Researchers Solve a Materials Mystery Key to Next-Generation Electronic Devices
University of Wisconsin-Madison Renee Meiller February 5, 2018
Researchers at the University of Wisconsin-Madison (UW Madison) have observed a two-dimensional hole gas, the missing second half of the combination necessary to advance oxide electronics materials. The team provided evidence of a hole gas coexisting with the electron gas by designing an ultrathin material, known as a thin film structure. The researchers achieved this breakthrough thanks to a new layer design in which UW Madison professor Chang-Beom Eom alternated layers of strontium oxide and titanium dioxide on the bottom, then layers of lanthanum oxide and aluminum oxide, and finally additional layers of strontium oxide and titanium dioxide on the top. Eom says the design resulted in the hole gas forming at the interface of the layers at the top, while the electron gas formed at the interface of the layers on the bottom. He thinks this advance could lead to a platform that can enable new concepts and applications that cannot be imagined today.
|
How DNA Could Store All the World's Data in a Semi-Trailer
Financial Times Chloe Cornish February 5, 2018
Researchers from the University of Washington, Microsoft, and Twist Bioscience in October successfully stored two musical recordings in a DNA sequence that could be decoded and played back with perfect quality, demonstrating that DNA-based data storage is a serious possibility. "We need about 10 tons of DNA to store all the world's data," says Columbia University's Yaniv Erlich. "That's something you could fit on a semi-trailer." University of California, Los Angeles professor Sriram Kosuri notes rendering digital data's binary code as the chemical constituents of DNA strands is less complicated than might be expected, as a paper he co-authored details that the 0s and 1s of binary code can be translated into the DNA bases of A, T, C, and G. Kosuri says a DNA strand can then be constructed matching the sequence of digital code, with the data read and decoded by running the DNA through a sequencer.
|
The Quest for a Quantum Computer
The Battalion (TX) Naren Venkatesh February 5, 2018
Researchers at Texas A&M University have received a $1.3-million U.S. National Science Foundation grant to fund a project to advance the field of quantum computing. Texas A&M professor Eric Rowell, who is one of four researchers working on the project, will help the team develop a model for storing information and ensuring resistance toward errors. The researchers' current goal is to put the global properties of quantum states to good use, making quantum memories resistant to corruption. Texas A&M professor Andreas Klappenecker is motivated by the fundamental theoretical results, and wants to follow up with experimental results, focusing on making error correction feasible. A combination of classical and quantum computing algorithms has the most potential to advance this research, but Klappenecker contends it is worth the time to study new methods to ensure efficient transmission and exchange of information between the two types of computing.
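The simplest way to see why redundancy makes a memory resistant to corruption is the three-bit repetition code: store each logical bit three times and decode by majority vote, so any single flip is corrected. The classical simulation below is only an illustration; the project's quantum error correction, including approaches that exploit global properties of quantum states, is far richer.

```python
import random

# Classical simulation of the three-bit repetition code, the simplest
# error-correcting memory. A toy illustration, not a quantum code.

def encode_bit(bit):
    return [bit, bit, bit]

def apply_noise(code, p, rng):
    # flip each stored copy independently with probability p
    return [b ^ (rng.random() < p) for b in code]

def decode_majority(code):
    return 1 if sum(code) >= 2 else 0

def logical_error_rate(p, trials=10_000, seed=1):
    """Estimate how often majority decoding still fails at flip rate p."""
    rng = random.Random(seed)
    bad = sum(decode_majority(apply_noise(encode_bit(0), p, rng)) != 0
              for _ in range(trials))
    return bad / trials
```

For flip probability p, the code fails only when two or more copies flip, giving a logical error rate of about 3p² - 2p³, well below p for small p.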
|
Researchers Are Putting a Multidisciplinary Spin on Teaching Physics
UT Dallas News Center Chase Carter February 5, 2018
Researchers at the University of Texas at Dallas have launched Scaffolded Training Environment for Physics Programming (STEPP), a three-year project funded by the U.S. National Science Foundation that uses specialized computer programs to teach physics to high school students. STEPP will test a computer-based learning program that requires no prior coding experience. The researchers are developing a programming platform that enables a student with little or no knowledge of computer coding to build a system of rules and procedures from scratch. The team wants STEPP to provide students with the tools to successfully use computer science as an application, as well as a thorough understanding of how those tools could be used to solve a wide variety of problems. The researchers will conduct initial testing of the software in the fall of 2018, and a summer institute will be held in 2019 for the first full implementation with high school teachers.
|
Army Researchers Develop New Algorithms to Train Robots
U.S. Army February 2, 2018
Researchers at the U.S. Army Research Laboratory (ARL) and the University of Texas at Austin (UT Austin) have developed new methods for robots or computer programs to learn how to perform tasks by interacting with a human instructor. The researchers focused on a case in which a human provides real-time feedback in the form of a critique. The Deep TAMER system employs deep learning to give an agent the ability to learn tasks in a short amount of time by viewing video streams alongside a human trainer. The ARL/UT Austin team considered situations in which a human teaches an agent how to behave by observing it and providing feedback, such as "good job" or "bad job." They demonstrated Deep TAMER's success by using just 15 minutes of human-provided feedback to train an agent to perform better than humans on the Atari bowling game.
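The critique-driven learning pattern can be sketched in tabular form: maintain a running estimate of the human's feedback for each state-action pair and act greedily on it. This is a minimal TAMER-style sketch with made-up state and action names; Deep TAMER replaces the table with a deep neural network so it can learn from raw video.

```python
from collections import defaultdict

class TamerAgent:
    """Tabular sketch of TAMER-style learning from human critique:
    keep a running estimate H(state, action) of feedback and act greedily."""
    def __init__(self, actions, lr=0.5):
        self.H = defaultdict(float)  # predicted human feedback per (s, a)
        self.actions = actions
        self.lr = lr

    def act(self, state):
        """Choose the action with the highest predicted human approval."""
        return max(self.actions, key=lambda a: self.H[(state, a)])

    def give_feedback(self, state, action, signal):
        """signal: +1 for 'good job', -1 for 'bad job'."""
        key = (state, action)
        self.H[key] += self.lr * (signal - self.H[key])
```

Unlike reward-from-environment reinforcement learning, the only training signal here is the human's real-time critique, which is why a few minutes of feedback can suffice.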
|
Teaching Machines to Learn by Themselves
Princeton University Doug Hulette February 1, 2018
Princeton University professor Elad Hazan discusses his work to create self-learning machines, such as the AdaGrad optimization algorithm used to train deep neural networks. "An optimization algorithm can be thought of as a computer program that takes input data, and, based upon the data, sets the weights/connections of the circuit to achieve the desired functionality," Hazan says. He notes his In8 startup will concentrate on a time-series prediction method and efficient optimization techniques to help accelerate training of deep neural networks. As part of a deal with Google, Hazan says In8 will "continue to work on our focus areas: optimization of non-convex models required for deep learning, that is, those for which optimization is far more challenging because we don't have provably efficient methods; and predictive control for robotics systems such as, for instance, using signals such as velocity, momentum, acceleration, and position to tell a vehicle's steering wheel, gas pedal, and brakes what to do."
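AdaGrad, which Hazan co-developed, adapts a separate learning rate for each parameter by accumulating that parameter's squared gradients. A minimal sketch (toy quadratic objective and hyperparameters are assumptions, not from the article):

```python
import math

def adagrad(grad, x0, lr=0.1, steps=100, eps=1e-8):
    """Minimal AdaGrad: each coordinate's step is scaled by the inverse
    square root of its accumulated squared gradients."""
    x = list(x0)
    g2 = [0.0] * len(x)  # running sum of squared gradients per coordinate
    for _ in range(steps):
        g = grad(x)
        for i in range(len(x)):
            g2[i] += g[i] * g[i]
            x[i] -= lr * g[i] / (math.sqrt(g2[i]) + eps)
    return x
```

Coordinates with persistently large gradients get their steps shrunk, while rarely updated coordinates keep larger steps, which is why AdaGrad suits the sparse, badly scaled problems that arise in training deep networks.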