Welcome to the December 20, 2017 edition of ACM TechNews, providing timely information for IT professionals three times a week.

ACM TechNews mobile apps are available for Android phones and tablets, iPhones, and iPads.


Mont-Blanc 2020 Project Sets Sights on Exascale Processor
Michael Feldman
December 18, 2017

The European Commission's Mont-Blanc 2020 project centers on the production of an ARM-based system-on-chip (SoC) that can power exascale supercomputers. The new effort, which launched on Dec. 11, brings together some of Europe's leading supercomputing organizations and companies, including Arm Limited, Atos/Bull, the Barcelona Supercomputing Center (BSC), Forschungszentrum Julich, the French Alternative Energies and Atomic Energy Commission (CEA), Kalray Corporation, and SemiDynamics. The project aims to design a low-power exascale SoC and deliver an initial ARM chip in a proof-of-concept demonstration. The work will include the development of critical building blocks and provide a blueprint for a first-generation implementation, focusing on a design that balances vector length, network-on-chip bandwidth, and memory bandwidth within the power limitations of an exascale supercomputer. The new chip will be developed in a modular manner so its functional blocks can be repurposed for other applications that require low-power ARM technology.

Full Article
Electromagnetic Emissions From Smartphones Analyzed for Security Vulnerability
Charles III University of Madrid (Spain)
December 19, 2017

Researchers at the Charles III University of Madrid (UC3M) in Spain are developing a tool that analyzes cellphones to determine whether they are vulnerable to a cyberattack that steals encryption keys via their electromagnetic emanations. The research focuses on side-channel attacks, in which a hacker exploits an incidental physical phenomenon, such as the electromagnetic field produced by an electric current, for illicit purposes such as extracting a private encryption key. When electronic devices are on, they consume energy and generate electromagnetic fields. The researchers attempted to capture these traces to obtain the encryption key and decipher the data, according to UC3M researcher Lorena Gonzalez. The goal is to detect and describe the vulnerabilities of electronic devices and their chips so software and hardware developers can implement appropriate defenses to protect user security.
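This class of attack can be illustrated with a toy model: a device's electromagnetic or power emission often correlates with the data it is processing, letting an attacker recover a secret by matching predicted leakage against measurements. The noiseless Hamming-weight leakage model below is a standard textbook simplification for illustration only; it is not the UC3M tool or an attack on a real device.

```python
# Toy side-channel sketch: emission is modeled as the Hamming weight
# (number of set bits) of the value the device processes. A real trace
# would be noisy analog measurements, not exact integers.

SECRET_KEY = 0xA7  # the byte the attacker wants to recover

def leak(key, plaintext):
    """Simulated emission: Hamming weight of key XOR plaintext."""
    return bin(key ^ plaintext).count("1")

# The attacker observes the device processing every possible input byte.
plaintexts = list(range(256))
trace = [leak(SECRET_KEY, p) for p in plaintexts]

def recover(trace, plaintexts):
    """Keep the key guess whose predicted leakage matches the trace."""
    for guess in range(256):
        if all(leak(guess, p) == t for p, t in zip(plaintexts, trace)):
            return guess

print(hex(recover(trace, plaintexts)))  # → 0xa7
```

Defenses of the kind the UC3M work aims to enable typically try to break exactly this correlation between emissions and secret data.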

Full Article

Researchers Compute Their Way to the Center of the Earth
Gauss Center for Supercomputing
Eric Gedenk
December 18, 2017

Researchers at the University of Cologne in Germany are using computing resources at Germany's Julich Supercomputing Center (JSC) to better understand how materials behave in the extreme conditions below the Earth's surface. The team has been using JSC's JUQUEEN supercomputer to simulate the structure of melts by studying silicate glasses as a model system for melts under ultra-high pressures. "Understanding properties of silicate melts and glasses at ultra-high pressure is crucial to understanding how the Earth has formed in its infancy, where impacts of large asteroids led to a completely molten Earth," says University of Cologne professor Clemens Prescher. The team utilized ab initio calculations of atoms' electronic structures and put those calculations in motion using molecular dynamics simulations. The researchers focused on silicon dioxide, a common and well-characterized material, which made it easier to expand the range of pressures they could simulate and to validate the model against experimental data.

Full Article
Israeli Software Turns Standard Cameras Hyperspectral
Abigail Klein Leichman
December 19, 2017

Researchers at Ben-Gurion University of the Negev (BGU) in Israel say they have developed new software that enables standard cameras to capture hyperspectral images and video, providing a faster and more cost-efficient approach than conventional technology. Current hyperspectral technology, which is expensive, cumbersome, and slow, can detect specific materials and identify their qualities. The new approach exploits the fact that the light spectra found in nature cover only a small portion of all possible spectra, since the spectrum of the sun is relatively stable and the number of substances in the world is finite, according to BGU professor Ohad Ben-Shahar. The researchers used computational methods to reconstruct hyperspectral images from the regular RGB color model used in traditional cameras. The new software captures the spectral signature of every pixel in a single image, while current spectrometric technology can measure just one point or line at a time. The software also can create hyperspectral videos, instantly collecting hyperspectral information on moving objects.

Full Article

The Top Seven Programming Languages of 2018
SD Times
Jenna Sargent
December 14, 2017

Coding Dojo recently released a list of the seven most in-demand programming languages coders should be aware of over the next year. "New and in-demand programming languages, frameworks, and technologies can emerge, rise to fame, and then fade away in the course of a few years," which means developers should be constantly learning new skills, says Coding Dojo's Speros Misirlakis. The company analyzed data from Indeed on 25 programming languages, stacks, and frameworks to determine which seven were most in demand by employers. The analysis found that Java is the most in-demand language, with 68,000 jobs in 2017, but that figure is expected to decrease to 62,000 in 2018. Meanwhile, Python is expected to grow from 41,000 jobs in 2017 to 46,000 jobs in 2018. JavaScript ranked third, with 40,000 jobs in 2017, but is projected to dip to 38,000 in 2018. C++ and C# were fourth and fifth, respectively, followed by PHP and Perl.

Full Article
University of Sydney Develops Quantum Trick to Block Background Sensor 'Chatter'
Asha McLean
December 19, 2017

Researchers at the University of Sydney (USyd) in Australia, Dartmouth College, and Johns Hopkins Applied Physics Laboratory have developed a method to block background "chatter," which they say solves a common problem associated with quantum sensing devices. The new quantum control techniques could lead to the development of ultra-sensitive sensors that can identify tiny signals while rejecting background noise down to theoretical limits. "By applying the right quantum controls to a qubit [quantum bit]-based sensor, we can adjust its response in a way that guarantees the best possible exclusion of the background clutter--that is, the other voices in the room," says USyd professor Michael J. Biercuk. The team's experiments used trapped atomic ions to reduce spectral leakage by "many orders of magnitude" over conventional methods. In addition, the new approach is relevant to nearly any quantum sensing application and can be applied to quantum computing.

Full Article

Improving Traffic--By Tailgating Less
Adam Conner-Simons
December 14, 2017

Researchers at the Massachusetts Institute of Technology's Computer Science and Artificial Intelligence Laboratory have shown there would be fewer traffic jams if drivers stopped tailgating. The researchers found that maintaining an equal distance to the vehicles ahead and behind, a strategy known as "bilateral control," could enable drivers to reach their destinations twice as quickly. They predict traffic would improve noticeably if even a small percentage of cars were outfitted with front- and rear-bumper sensor systems that help drivers maintain that equal spacing. The researchers plan to run simulations to test whether the method is also safer for drivers. Another proposed approach involves electronically connecting vehicles to coordinate their spacing; however, this so-called "platooning" method requires detailed coordination and a massive network of connected vehicles. The researchers' approach would be simpler, requiring only new software and some inexpensive hardware updates.
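The core of bilateral control can be sketched in a few lines: each car nudges itself toward the midpoint of its neighbors, so the gap ahead stays equal to the gap behind and perturbations are damped instead of amplified down the lane. The gains, spacing, and fixed end cars below are illustrative assumptions, not details from the MIT work.

```python
# Toy 1-D sketch of "bilateral control": each interior car adjusts its
# position toward equalizing the gap ahead and the gap behind.
# All constants are made up for illustration.

def step(positions, dt=0.1, k=0.5):
    """One update: interior cars move toward the midpoint of their neighbors."""
    new = positions[:]
    for i in range(1, len(positions) - 1):
        gap_ahead = positions[i + 1] - positions[i]
        gap_behind = positions[i] - positions[i - 1]
        new[i] = positions[i] + dt * k * (gap_ahead - gap_behind)
    return new

# Five cars; car 2 is perturbed out of its even spacing (tailgating car 1).
cars = [0.0, 10.0, 14.0, 30.0, 40.0]
for _ in range(200):
    cars = step(cars)

gaps = [round(b - a, 2) for a, b in zip(cars, cars[1:])]
print(gaps)  # gaps converge toward equal spacing (about 10 each)
```

A tailgating policy, by contrast, reacts only to the car ahead, which is what lets small braking events cascade backward into phantom jams.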

Full Article
AI Insights Could Help Reduce Injuries in Construction Industry
University of Waterloo News
Matthew Grant
December 19, 2017

Researchers at the University of Waterloo in Canada are using artificial intelligence (AI) to gain new insights into ways to help reduce wear-and-tear injuries and boost the productivity of construction workers. Previous studies involving motion sensors and AI software have found that experienced bricklayers use previously unidentified techniques to limit the loads on their joints. In their first study, the researchers analyzed data from bricklayers of various experience levels wearing sensor suits while building a wall with concrete blocks. The study found the experienced workers put less stress on their bodies, but were able to do more work than their inexperienced counterparts. A follow-up study involved using sensors to record the workers' movements and AI software to identify patterns in body positions. The researchers also are developing a system that uses sensor suits to give trainees immediate feedback, helping them to modify their movements to reduce stress on their bodies.

Full Article

Artificial Intelligence Is Helping Astronomers Discover New Planets
Abigail Beall
December 17, 2017

Researchers at the University of Texas at Austin and Google AI trained a computer to identify exoplanets in the light readings recorded by the U.S. National Aeronautics and Space Administration (NASA) Kepler mission. The majority of exoplanets are discovered by analyzing large amounts of data on a star's brightness over time: when a planet passes in front of its star, the measured light dims slightly. When a solar system has more than one planet, the reductions in brightness occur in complicated patterns, from which the planets' sizes and distances from their stars can be determined. The researchers used 15,000 signals to train a neural network and found it can identify planets with 96-percent accuracy. Overall, the project found 35,000 possible planetary signals in the data Kepler gathered over its four-year mission. "This finding shows that our data will be a treasure trove available to innovative researchers for years to come," says NASA's Paul Hertz.
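The transit signal the network is trained to find can be illustrated with a toy light curve: a planet crossing its star causes a small, repeating dip in brightness. The dip depth, duration, and naive threshold detector below are invented for illustration; Kepler's real pipeline and the neural network described above are far more sophisticated.

```python
# Toy transit detection: build a flat light curve, carve in two 1% dips
# (two transits of the same hypothetical planet), then flag the dips.

brightness = [1.0] * 100
for start in (20, 60):           # transit start times (illustrative)
    for t in range(start, start + 5):
        brightness[t] = 0.99     # a 1% dip while the planet crosses

def find_dips(flux, threshold=0.995):
    """Return the start indices of runs where flux drops below threshold."""
    dips, in_dip = [], False
    for i, f in enumerate(flux):
        if f < threshold and not in_dip:
            dips.append(i)
            in_dip = True
        elif f >= threshold:
            in_dip = False
    return dips

print(find_dips(brightness))  # → [20, 60]
```

In real data the dips sit in noise and overlap with signals from multiple planets, which is why a trained neural network outperforms simple thresholding.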

Full Article
How Do You Spot a Russian Bot? Answer Goes Beyond Kremlin Watching
New York University
James Devitt
December 18, 2017

Researchers at New York University (NYU) have isolated the characteristics of bots on Twitter by studying bot activity related to Russian political discussions. Their research identifies how Russian accounts influence online exchanges using bots and trolls, with the goal of provoking unrest and disrupting civil discourse. The findings reveal notable differences between human and bot-created posts, in addition to several similarities that could help hide bots from detection. For example, the researchers found that bots are much more likely to post from web-based platforms, while humans more often post from mobile devices. The team focused on two time periods, both notably consequential in Russian politics, and analyzed about 15 million tweets sent from about 230,000 Russian Twitter accounts, including 93,000 that were active during both periods. Of the accounts active in both periods, about 67 percent were bots.

Full Article
Quantum Computer With 46 Qubits Simulated
Julich Supercomputing Center
Regine Panknin
December 15, 2017

Researchers at the Julich Supercomputing Center (JSC) in Germany, Wuhan University in China, and the University of Groningen in the Netherlands say they have set a new world record by successfully simulating a quantum computer with 46 qubits for the first time. The researchers conducted the simulation on the Julich JUQUEEN supercomputer, as well as on the Sunway TaihuLight supercomputer (which currently leads the Top500 ranking of the fastest supercomputers in the world) at China's National Supercomputing Center in Wuxi. It is possible to simulate the functioning of relatively large quantum computers on classical digital computers, but the amount of memory required doubles with each simulated qubit. "There are only a few supercomputers in the world that currently have such a large amount of memory, an adequate number of compute nodes, and sufficiently fast network connections to even simulate a system of 45 qubits--that was the former world record," notes JSC professor Kristel Michielsen. The record-breaking simulation was achieved following an adjustment of the simulation code.
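The memory wall described here is easy to quantify: an n-qubit state vector holds 2**n complex amplitudes, so every added qubit doubles the storage. The sketch below assumes the conventional 16 bytes per double-precision complex amplitude; the code adjustment that enabled the record run presumably shrank this per-amplitude footprint, so treat the byte counts as illustrative.

```python
# Memory needed to hold a full n-qubit quantum state vector.
# 2**n amplitudes; 16 bytes each assumes double-precision complex numbers.

def state_vector_bytes(n_qubits, bytes_per_amplitude=16):
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (40, 45, 46):
    tib = state_vector_bytes(n) / 2**40
    print(f"{n} qubits: {tib:,.0f} TiB")
# 40 qubits: 16 TiB
# 45 qubits: 512 TiB    (the former record)
# 46 qubits: 1,024 TiB  (double again)
```

This exponential growth is why each additional simulated qubit requires either twice the memory or a more compact amplitude encoding.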

Full Article
'Negative Capacitance' Could Bring More Efficient Transistors
Purdue University News
Emil Venere
December 18, 2017

Researchers at Purdue University have experimentally demonstrated how to harness negative capacitance for a new type of transistor that could reduce power consumption in electronic devices. The researchers used a two-dimensional layer of molybdenum disulfide to make a channel adjacent to the gate. The team then used hafnium zirconium oxide, a ferroelectric material, to create a negative capacitor in the newly designed gate. "The overarching goal is to make more efficient transistors that consume less power, especially for power-constrained applications such as mobile phones, distributed sensors, and emerging components for the Internet of Things," says Purdue professor Peide Ye. The researchers found that the ferroelectric material and negative capacitance in the gate result in good switching in both the on and off states. The new design also avoids hysteresis, an undesirable memory effect, enabling the transistors to switch on and off properly.

Full Article
UTA Researcher Publishes Article on Multi-Channel, Nonlinear-Optical Processing Devices
UTA News Center
Herb Booth
December 14, 2017

Researchers at the University of Texas at Arlington (UTA) and the University of Vermont say they have made a breakthrough that could facilitate a dramatic reduction in the cost and energy consumption of high-speed Internet connections. They found nonlinear-optical effects can be used to process data thousands of times faster than can be achieved electronically. The researchers demonstrated an optical medium in which multiple beams of light can autocorrect their own shapes without affecting one another, which enables simultaneous nonlinear-optical processing of multiple light beams by a single device without converting them to electrical form. The researchers say the approach could help the technology achieve its full multi-terabit potential, resulting in less-expensive and more energy-efficient high-speed Internet communications. "Our new nonlinear medium has allowed us to demonstrate simultaneous all-optical regeneration of 16 WDM [wavelength-division multiplexing] channels by a single device, and this number has only been limited by the logistical constraints of our laboratory," says UTA professor Michael Vasilyev.

Full Article
ACM Queue Mobile
ACM Distinguished Speakers Program

Association for Computing Machinery

2 Penn Plaza, Suite 701
New York, NY 10121-0701

ACM Media Sales

If you are interested in advertising in ACM TechNews or other ACM publications, please contact ACM Media Sales at (212) 626-0686, or visit ACM Media for more information.

To submit feedback about ACM TechNews, contact: [email protected]