Welcome to the March 21, 2018 edition of ACM TechNews, providing timely information for IT professionals three times a week.

ACM TechNews mobile apps are available for Android phones and tablets, and for iPhones and iPads.

John L. Hennessy and David A. Patterson, Pioneers of Modern Computer Architecture, Receive ACM A.M. Turing Award
Jim Ormond
March 21, 2018

ACM has named John L. Hennessy, former Stanford University president, and David A. Patterson, retired University of California, Berkeley professor, recipients of the 2017 ACM A.M. Turing Award. The two researchers are being recognized for their work developing a systematic, quantitative approach to the design and evaluation of computer architectures with enduring impact on the microprocessor industry. Hennessy and Patterson created a method for designing faster, lower-power, reduced instruction set computer (RISC) microprocessors. Their approach resulted in lasting and repeatable principles that generations of architects have used for a range of projects in academia and industry. Today, 99 percent of the more than 16 billion microprocessors produced annually are RISC processors. Hennessy and Patterson will formally receive the 2017 ACM A.M. Turing Award, which also carries a $1 million prize, at ACM's annual awards banquet on June 23 in San Francisco, CA.

Full Article
China Trails U.S. in Every Area of AI Development Except Big Data, Oxford University Report Finds
South China Morning Post
Meng Jing
March 19, 2018

China may struggle to achieve its goal of leading the world in artificial intelligence (AI), according to an Oxford University study. The study gives China a score of 17 for its overall capacity to develop technologies in the field, compared with 33 for the United States. Except for data access, China lags the U.S. in every driver of AI development. China's greatest hurdle is likely to be production of hardware such as microprocessors and chips, due to high initial costs and long creation cycles, the study found. A shortage of AI researchers and a lack of innovation in algorithm development are also major impediments for China. Hoping to leverage the data gathered from its 700-million-plus Internet users, China last July announced a three-step plan to lead the world in AI that culminates in becoming an "innovation center for AI" by 2030.

Full Article
K Computer Accurately Models Aerosol Effects
Asian Scientist
March 16, 2018

Researchers at the RIKEN Advanced Institute for Computational Science (AICS) in Japan have used the K supercomputer to accurately calculate the effects of aerosols on clouds in a climate model. The team combined a model that simulates the entire global weather over the course of a year, at a horizontal resolution of only 14 kilometers, with a simulation of how the aerosols behave within clouds. The researchers note the new high-resolution model accounts for the vertical processes inside clouds and accurately depicts how large areas experience a drop in cloud cover. "It was very gratifying to see that we could use a powerful supercomputer to accurately model the microphysics of clouds, giving a more accurate picture of how clouds and aerosols behave in the real world," says RIKEN AICS professor Yousuke Sato. "In the future, we hope to use even more powerful computers to allow climate models to have more certainty in climate prediction."

Full Article

Mindfulness Can Improve Problem-Solving Skills
The University Network
Hyeyeun Jeon
March 16, 2018

Researchers from the University of Seville in Spain have conducted a study showing software engineering students can improve problem-solving skills via meditation. The team performed three experiments, dividing students into experimental and control cohorts and having the experimental group participate in mindfulness sessions. After meditating, participants performed conceptual modeling exercises, with effectiveness and efficiency measured after each session. The team found the students who practiced meditation solved the exercises significantly faster than those who did not in all three experiments. In the first and second experiments, the meditators solved problems about 10 percent more effectively, while their efficiency rose by about 37 percent in the first experiment and about 46 percent in the second. The researchers want to replicate their experiment at other universities so that their findings can be generalized, and they are in discussions with certain firms with the aim of starting empirical studies in software development companies.

Full Article
Computer Science Grads Can Earn More Than MBAs
U.S. News & World Report
Farran Powell
March 20, 2018

Salaries among computer-related jobs, especially for those who hold an advanced degree in the field, are on the rise due to a shortage of technological workers across many economic sectors. Recent graduates with a master's degree in computer science can often earn more than a recent MBA grad because computer science-related jobs are predicted to grow even more in the next few years, according to industry experts. "The demand for technological workers continues to increase while facing a limited supply--both are good news for salary," says PayScale.com's Katie Bardaro. "When looking at early-career pay--five years or less of experience--a master's in computer science beats an MBA in national median earnings." In addition, computer science is one of the few disciplines in which more jobs than degrees are being produced in the U.S. Job titles with computer science skills include software developer, data analyst, and information technology architect, among others.

Full Article

New Method Manages and Stores Data From Millions of Nerve Cells – in Real Time
Lund University
March 19, 2018

Researchers at Lund University in Sweden have discovered a way to recode neural signals into a format that computer processors can use instantly. Advanced data management is critical when implantable brain-machine interfaces are used to establish electronic communication between the brain's nerve cells and computers. Using simulated recordings from nerve cells to evaluate the method, the team was able to simultaneously collect data from over 1 million nerve cells, analyze the information, and provide feedback within a few milliseconds. The method enables simultaneous communication in real time with millions of nerve cells. "Recoding the nerve cell signals directly into bitcode dramatically increases the storage capacity," says Jens Schouenborg, Professor of Neurophysiology at Lund University and one of the researchers behind the study. "However, the biggest gain is that the method enables us to store the information in a way that makes it immediately available to the computers' processors."
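The "bitcode" idea can be illustrated with a minimal sketch, assuming a simple one-bit-per-cell encoding (the article does not describe Lund's actual format): each cell's firing state in a time bin becomes a single bit, so one million cells fit in 125,000 bytes per bin.

```python
def pack_spikes(fired_cells, n_cells):
    """Pack a set of firing cell indices into a bytes bitmap,
    one bit per cell. Illustrative only, not Lund's encoding."""
    bitmap = bytearray((n_cells + 7) // 8)
    for cell in fired_cells:
        bitmap[cell // 8] |= 1 << (cell % 8)
    return bytes(bitmap)

n = 1_000_000
packed = pack_spikes({0, 9, 999_999}, n)
print(len(packed))  # 125000 bytes for one million cells
```

A fixed-width bitmap like this is what makes the data "immediately available" to a processor: the activity of cell *i* is always at bit *i*, with no parsing step.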

Full Article
Pirate Site Visits Lead to More Malware, Research Finds
March 18, 2018

Carnegie Mellon University research shows spending time on pirate sites increases the risk of malware, while the same is not true for other site categories, such as social networks, shopping, and gambling sites. Carnegie Mellon University professor Rahul Telang spent a year observing the computer habits of 253 people who participated in the Security Behavior Observatory. Doubling the amount of time spent on infringing sites causes a 20 percent increase in malware count, according to Telang. All files on the respondents' computers were scanned and checked against reports from Virustotal.com. The research also shows no evidence that users who visit infringing sites are more likely than other users to install antivirus software. In absolute terms, doubling the time spent on pirate sites adds an extra 0.05 pieces of malware per month, against an average of 0.24, according to the study.
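The reported numbers are mutually consistent, as a bit of arithmetic shows (this is an illustration of the figures quoted above, not the study's actual regression model): in a log-log model where malware count scales as time**beta, a 20 percent increase per doubling of time implies beta = log2(1.2).

```python
import math

# Elasticity implied by "+20% malware per doubling of time"
# under an assumed log-log model malware = k * time**beta.
beta = math.log(1.2, 2)
print(round(beta, 3))  # 0.263

# The absolute effect quoted in the study: +0.05 pieces of malware
# per month against an average of 0.24, i.e. roughly +21%.
relative = 0.05 / 0.24
print(round(relative, 2))  # 0.21
```

The ~21 percent relative increase from the absolute figures matches the 20 percent headline effect.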

Full Article
IBM Tool Seeks to Bridge AI Skills Gap
CIO Journal
Angus Loten
March 20, 2018

IBM has recently launched Deep-Learning-as-a-Service, a new tool designed to bridge the skills gap for creating custom artificial intelligence (AI) systems to draw value from data. IBM notes the new tool seeks to lower barriers to deploying AI and deep-learning tools, which can be a complex and painstakingly repetitive process that requires large amounts of computing power. Deep-Learning-as-a-Service enables users to upload data in Watson Studio, IBM's cloud-native platform for data scientists, developers, and business analysts. The users can then create a neural network using a drag-and-drop interface to select, configure, design, and code the system. In addition, IBM says it has automated the repetitive process of fine-tuning deep-learning algorithms, with successive training runs started, monitored, and stopped automatically. The company notes for many organizations, the complexity of creating smart algorithms from scratch has prevented them from exploiting AI to parse massive stores of data for business value.

Full Article
*May Require Paid Subscription

Wyoming Passes Forward-Thinking Computer Science Education Bill
Ryan Johnston
March 16, 2018

Wyoming Gov. Matt Mead has signed legislation finalizing a three-year process for transforming the state into one of the biggest promoters of K-12 computer science education in the country. The law adds computer science to the state's Common Core education platform, which means each K-12 school will be required to teach computer science no later than the 2022-2023 school year. State Superintendent Jillian Balow intends to aggressively pursue this goal, even as the State Board of Education considers what content standards to implement, and plans to have 500 teachers trained in computer science in the next five years. Some Wyoming schools already offer computer science, and Balow will gather stakeholder, student, teacher, and industry opinions from schools already making progress to inform final decisions. "Some states have standards for computer science education, some require computer science education in school, some states that have it count towards a math or science credit, and we have all of that," Balow says.

Full Article
Augmenting Scientific Inquiry With Augmented Reality
UT News
Aaron Dubrow
March 15, 2018

Researchers at the Texas Advanced Computing Center (TACC) have presented a proof-of-concept demonstration that enables scientists to watch a plasma model evolve over time in a virtual three-dimensional space. "Augmented and virtual reality tools give us a new way to really see what is taking place in these evolving complex nonlinear vortex plasma structures," says University of Texas professor Wendell Horton. The researchers created the augmented reality visualization by converting the plasma datasets into a form that could be ingested into the Unity platform, a leading framework for augmented reality content creation. The team then overlaid text and audio, making a sharable augmented reality experience that other researchers can view and interpret. "[Horton's] team is now considering how to best take advantage of the technology," says TACC's Greg Foss. Horton's team also created augmented reality representations of physics-based models of clouds and developed a tool that enables air traffic controllers to perceive planes in the sky.

Full Article
AI Tackles the Vatican's Secrets
MIT Technology Review
March 15, 2018

Researchers at Roma Tre University in Italy have created a machine vision system that aims to automatically transcribe part of the Vatican Secret Archives known as the Vatican Registers. "The Code System" project will tackle more than 18,000 pages of official 13th-century correspondence to and from the Catholic Church. Creating machine learning data sets for the project was uniquely difficult because manuscripts are written in varying styles with different ligatures and abbreviations. The team developed a way to train an optical character recognition system that divides each word into a series of strokes. The system then tries to fit the strokes together to form known letters and analyzes all potential letter permutations, ruling out those that are non-grammatical. The system showed impressive results, generating the exact transcription for 65 percent of the data set's word images.
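The grammatical filtering step described above can be sketched in a few lines, assuming a toy lexicon and toy per-position letter candidates (the project's actual stroke segmentation and Latin lexicon are far more involved): candidate letters from each stroke group are combined, and only combinations forming known words survive.

```python
from itertools import product

# Toy Latin lexicon standing in for the project's dictionary.
LEXICON = {"pax", "pox", "lex"}

def transcribe(candidates_per_position):
    """Enumerate letter combinations from per-position OCR candidates,
    keeping only those that form words in the lexicon."""
    return [w for w in ("".join(p) for p in product(*candidates_per_position))
            if w in LEXICON]

# Strokes for a 3-letter word, with OCR ambiguity at two positions:
print(transcribe([["p", "l"], ["a", "o"], ["x"]]))  # ['pax', 'pox']
```

Ruling out non-grammatical permutations this way shrinks the search space dramatically before any human review.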

Full Article
New Quantum Spin Liquid Predicted by Nobel Laureate Prepared for the First Time
Aalto University
March 14, 2018

An international team of researchers led by Aalto University in Finland has prepared a new superconductor-like quantum spin liquid originally proposed by Nobel Prize-winning physicist Philip W. Anderson in 1987. The researchers say they achieved this by customizing the properties of magnetic materials via techniques developed by Aalto chemists. High-temperature superconductors are copper oxides in which the copper ions arrange into a square lattice so the adjacent magnetic moments are oriented in opposite directions. When changing the copper's oxidation state disturbs this structure, the material becomes superconducting. The researchers were able to alter the magnetic interactions of this square structure with ions that had a d10 and d0 electronic structure, converting the material into a quantum spin liquid. "In the future, this new d10/d0 method can be utilized in many other magnetic materials, including various quantum materials," says Aalto's Otto Mustonen.

Full Article

ORNL Researchers Design Novel Method for Energy-Efficient Deep Neural Networks
Oak Ridge National Laboratory
Scott Jones
March 14, 2018

Researchers at Oak Ridge National Laboratory (ORNL) have developed a novel method for the creation of energy-efficient deep neural networks for solving complex science problems. The team demonstrated that by converting deep-learning neural networks (DNNs) to deep-spiking neural networks (DSNNs), they can improve the efficiency of network design and realization. DSNNs mimic neurons in the human brain via pulses or "spikes" in place of actual signals, with the individual spikes indicating where to perform the computations. The DSNN-based approach achieved nearly the same accuracy as the original DNN and performed better than a state-of-the-art spiking neural network. The stochastic DSNN, which distributes spikes uniformly over time, consumed 38 times less energy than the original DNN and nearly two times less energy than a conventional DSNN while delivering significantly better task performance. "Spiking the network lowers energy consumption because we disregard the unnecessary computations and we look only for the relevant nodes of the network," says ORNL's Hong-Jun Yoon.
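The conversion from continuous activations to spikes can be illustrated with a generic rate-coding sketch (this is a standard DNN-to-SNN conversion idea, not ORNL's specific method): a neuron's activation in [0, 1] becomes the probability of emitting a spike at each timestep, so the average firing rate approximates the original value.

```python
import random

def to_spike_train(activation, timesteps=100, seed=0):
    """Rate-code an activation in [0, 1] as a binary spike train:
    each timestep fires with probability equal to the activation."""
    rng = random.Random(seed)
    return [1 if rng.random() < activation else 0 for _ in range(timesteps)]

spikes = to_spike_train(0.7, timesteps=2000)
rate = sum(spikes) / len(spikes)
# rate is close to the original activation of 0.7
```

Because downstream computation only happens where spikes occur, zero-valued timesteps cost nothing, which is the energy-saving intuition Yoon describes.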

Full Article
ACM Queue Mobile App
ACM Ambassador Program

Association for Computing Machinery

2 Penn Plaza, Suite 701
New York, NY 10121-0701

ACM Media Sales

If you are interested in advertising in ACM TechNews or other ACM publications, please contact ACM Media Sales at (212) 626-0686, or visit ACM Media for more information.

To submit feedback about ACM TechNews, contact: [email protected]