Welcome to the April 6, 2020 edition of ACM TechNews, providing timely information for IT professionals three times a week.
AI Hiring Expected to Show Resilience Amid Coronavirus Slowdown
The Wall Street Journal Angus Loten March 30, 2020
The number of jobs related to artificial intelligence (AI) globally could increase by as much as 16% this year, reaching 969,000, according to technology research firm International Data Corp. (IDC) estimates, driven by stronger demand for AI workers as organizations deal with the impact of the coronavirus pandemic. IDC's forecast includes a variety of AI-related jobs, such as data engineers, data scientists, and machine learning developers. While millions of U.S. businesses have announced layoffs or furloughs, the projected resilience of the AI job market mirrors forecasts of continued spending on AI tools and technology. Global AI spending this year could reach $50.7 billion, up 32% over last year, according to IDC.
*May Require Paid Registration
A 3D-Printed Brain Could Make It Easier to Find Cancer Treatments
News@Northeastern Laura Castanon April 1, 2020
Researchers at Northeastern University, Rensselaer Polytechnic Institute, and the Icahn School of Medicine at Mount Sinai have developed a technique to study glioblastoma brain tumors using a three-dimensionally (3D)-printed framework composed of human brain cells and biomaterials. Guohao Dai's laboratory at Northeastern built the model from human brain blood vessel cells connected with various brain cells, 3D-printing these components in stacks to serve as brain tissue for tumor cells to invade. The researchers inserted glioblastoma tumor stem cells in the middle of the framework, and used a laser to scan the sample and generate a 3D image of its cellular structure. The team employed this methodology to assess the effectiveness of the chemotherapy drug temozolomide. Said Dai, "With our 3D glioblastoma model and imaging platform, you can see how the cells respond to radiation or chemotherapy very quickly."
That Chatbot May Be Chatty, but Is It Engaging?
USC Viterbi School of Engineering Rishbha Bhagi April 3, 2020
University of Southern California (USC) researchers have developed a technique for evaluating chatbots' conversational skills, grading their responses on an engagement scale of 0 to 1 based on the concept that open-domain dialogue systems must be genuinely interesting to the user, not just relevant. Sarik Ghazarian of the Information Sciences Institute at USC's Viterbi School of Engineering said understanding this assessment will help improve chatbots and other open-domain dialogue systems. Said USC Viterbi's Nanyun Peng, "We can use [this work] as a development tool to easily automatically evaluate our systems with low cost. Also, we can explore integrating this evaluation score as feedback into the generation process via reinforcement learning to improve the dialogue system."
Attackers Can Use Zoom to Steal Users' Windows Credentials with No Warning
Ars Technica Dan Goodin April 1, 2020
Zoom for Windows contains a bug that could allow attackers to steal users' operating system credentials without any warning, according to researchers. The exploit leverages the Zoom chat window to send targets a string of text that represents the network location on the Windows device being used, and the Zoom app for Windows automatically renders these universal naming convention (UNC) strings as clickable links. Should targets click on those links on networks that are not fully locked down, Zoom will send Windows usernames and corresponding Net-NTLM-v2 hashes to the address in the link; attackers can then use the credentials to access shared network resources, including Outlook servers and storage devices. Zoom officials said the UNC bug, as well as a separate pair of bugs for macOS, have been fixed. The company said it was enacting a feature freeze for 90 days to focus on securing features already in place.
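The core of the attack is that a chat client turns a raw UNC string (a Windows network path of the form \\host\share) into a clickable link. As a hypothetical sketch, not Zoom's actual code, the mitigation amounts to detecting UNC paths in incoming chat text and neutralizing them before rendering:

```python
import re

# Matches UNC paths of the form \\host\share\..., the kind of string the
# report says the Zoom chat window rendered as clickable links.
UNC_PATTERN = re.compile(r"\\\\[\w.-]+(?:\\[\w$.-]+)+")

def defang_unc(chat_text: str) -> str:
    """Replace any UNC path with a non-clickable placeholder."""
    return UNC_PATTERN.sub("[UNC path removed]", chat_text)

message = r"Slides are at \\evil.example.com\share\deck.pptx"
print(defang_unc(message))
# -> "Slides are at [UNC path removed]"
```

The host name `evil.example.com` is illustrative; in the described attack, clicking such a link causes Windows to attempt SMB authentication against the attacker's server, leaking the username and Net-NTLM-v2 hash.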
Capturing 3D Microstructures in Real Time
Argonne National Laboratory Joseph E. Harmon April 2, 2020
Researchers at the U.S. Department of Energy's Center for Nanoscale Materials (CNM) have created an algorithm for quantitatively evaluating three-dimensional (3D) materials with features as tiny as nanometers, in real time. CNM's Subramanian Sankaranarayanan said the algorithm can extrapolate the exact 3D microstructure of a material within seconds, even if nothing is known about it beforehand. CNM's Henry Chan added that the algorithm features an unsupervised program that maps out boundaries among the grains in a material sample, rapidly generating 3D samples and precise microstructural data that is robust and resilient to noise. Sankaranarayanan said, "The [tool's] main advantage is ... [researchers] can even quantitatively and visually track the evolution of a microstructure as it changes in real time."
D-Wave Systems Offers Free Access to Hybrid Quantum Computing for Coronavirus Researchers
GeekWire Alan Boyle March 31, 2020
D-Wave Systems said anyone working on responses to the coronavirus outbreak can freely access its Leap hybrid quantum cloud service, and the company's partners and customers are supplying expertise to help researchers use its quantum tools to study the pathogen. Partnering with D-Wave to share their expertise in using its tools are Volkswagen, Kyocera, NEC Solution Innovators, Denso, Cineca, Germany's Forschungszentrum Julich, MDR/Cliffhanger, Menten AI, OTI Lumionics, Ludwig Maximilian University of Munich's Quantum Applications and Research Laboratory, Sigma-i, and Japan's Tohoku University. D-Wave CEO Alan Baratz said, "We want to expand the computational capabilities to experts across disciplines, verticals, and geographies, and bring the community's deep quantum knowledge to bear on the complex and dynamic COVID-19 situation."
Study Uses AI to Estimate Unexploded Bombs From Vietnam War
Ohio State News Jeff Grabmeier March 24, 2020
Researchers at The Ohio State University (OSU) used artificial intelligence to find unexploded Vietnam War-era bombs in Cambodia. They used machine learning to analyze a commercial satellite image of a 100-square-kilometer area near Kampong Trabaek in Cambodia for evidence of bomb craters; the method increased true bomb crater detection by more than 160% over standard methods. The study suggests that about 44% to 50% of the bombs dropped there remain unexploded. OSU's Erin Lin said, "The process of demining is expensive and time-intensive, but our model can help identify the most vulnerable areas that should be demined first."
OLCF Staff Develops Job Step Viewer Tool to Visualize Applications on Supercomputer
HPCwire March 30, 2020
Oak Ridge Leadership Computing Facility (OLCF) and Oak Ridge National Laboratory researchers have developed Job Step Viewer, a tool to visualize in real time where computing jobs are being executed as scientific applications run on the IBM AC922 Summit supercomputer at OLCF. IBM built Summit's job launcher, called jsrun (job step run), for the Summit and Sierra systems at the Oak Ridge and Lawrence Livermore national laboratories, respectively. By employing user-defined groupings of central processing unit cores, graphics processing units, and memory, Summit's large compute nodes can be subdivided to optimally meet the needs of a specific app. Job Step Viewer helps users ascertain the appropriate resource set layout and process binding options for their application.
How a Real Dog Taught a Robot Dog to Walk
Wired Matt Simon April 3, 2020
Researchers at Google have developed a robotic dog and taught it to walk by showing it motion-capture videos of real dogs walking on treadmills. The robot, Laikago, has a body very different from a biological dog's body; a digital version of Laikago used computer simulations to determine how to imitate the motion of the digital version of the biological dog without directly copying its mechanics. The researchers used a reinforcement learning algorithm to help the robot learn to move as similarly to the original reference motion as possible. The algorithm attempts random movements, and gets a digital "reward" if it gets closer to the dog's reference motion. Over many iterations, this reward system teaches the simulated robot dog to move like a real dog. Said Google's Jason Peng, "The drawback with the kind of manual approach is that it's not really scalable for every skill that we want a robot to perform."
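The reward described above can be sketched as a function that pays more the closer the simulated robot's pose is to the reference pose from the real dog. This is an illustrative sketch, not Google's actual implementation; the decay constant and the use of joint angles as the pose representation are assumptions:

```python
import numpy as np

def imitation_reward(robot_pose: np.ndarray, reference_pose: np.ndarray) -> float:
    """Exponentially decaying imitation reward: 1.0 for a perfect match with the
    reference motion, falling toward 0 as the pose error grows."""
    error = np.sum((robot_pose - reference_pose) ** 2)
    return float(np.exp(-2.0 * error))

ref = np.array([0.1, -0.4, 0.9])      # reference joint angles from the real dog (radians)
close = np.array([0.12, -0.38, 0.9])  # a trial pose near the reference
far = np.array([0.8, 0.3, -0.5])      # a trial pose far from the reference

assert imitation_reward(ref, ref) == 1.0
assert imitation_reward(close, ref) > imitation_reward(far, ref)
```

A reinforcement learning algorithm maximizing this signal over many simulated trials gradually pulls the robot's motion toward the dog's, without hand-coding the gait.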
Neural Networks Facilitate Optimization in the Search for Materials
MIT News David L. Chandler March 26, 2020
Massachusetts Institute of Technology researchers have created a machine learning system to streamline the search for new materials in applications where millions of potential materials must be considered and multiple criteria met and optimized. The system was used to consider nearly 3 million candidates for an energy storage system, producing a set of the eight most-promising materials. The process was accomplished in five weeks, but researchers say it would have taken 50 years using conventional methods. The study involved teaching an advanced machine-learning neural network about the relationship between the materials' chemical compositions and their physical properties, with that knowledge used to produce suggestions for possible materials to be used for the next round of training. This iterative optimization system generated reliable results with only a few hundred samples. Said Northwestern University's George Schatz, "This is a beautiful combination of concepts in statistics, applied math, and physical science that is going to be extremely useful in engineering applications."
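The iterative loop described above can be sketched in a few lines: fit a surrogate model on the candidates evaluated so far, use its predictions to pick the most promising candidates to evaluate next, and repeat. In this hypothetical sketch a linear least-squares model stands in for the neural network, and `true_property` stands in for the expensive simulation or experiment; the pool size and round counts are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)

def true_property(x):
    """Stand-in for an expensive evaluation of a material's target property."""
    return x @ np.array([2.0, -1.0, 0.5])

candidates = rng.normal(size=(3000, 3))   # pool of candidate material descriptors
labeled_idx = list(rng.choice(len(candidates), 20, replace=False))

for _ in range(5):
    X = candidates[labeled_idx]
    y = true_property(X)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)   # fit the surrogate model
    preds = candidates @ w                      # predict the whole pool
    # evaluate the most promising not-yet-labeled candidates next
    ranked = [i for i in np.argsort(-preds) if i not in labeled_idx]
    labeled_idx.extend(ranked[:20])

best_value = true_property(candidates[labeled_idx]).max()
```

The point of the design is that only a small fraction of the 3,000-candidate pool is ever evaluated directly, mirroring the article's claim that a few hundred samples sufficed for reliable results.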
How Are You Feeling? Surveys Aim to Detect COVID-19 Hot Spots Early
The New York Times David M. Halbfinger April 1, 2020
Epidemiologists and computer scientists have persuaded more than 2 million Britons and 150,000 Israelis to fill out Web- or app-based health surveys to detect incipient COVID-19 outbreaks and send resources to where they can do the most good. Artificial intelligence experts at Israel's Weizmann Institute of Science led the Israeli initiative, feeding computer models survey results; the models accurately forecast surges in COVID-19 cases in cities like Bnei Brak, Jerusalem, and Beersheva five days in advance. The U.K. initiative uses survey results to help pinpoint where ventilators and mobile intensive-care units should be dispatched. Massachusetts General Hospital, the Massachusetts Institute of Technology (MIT), and Weill Cornell Medicine in New York City are leading a U.S. effort. Massachusetts General's Andrew Chan and colleagues are adapting the U.K. app, while MIT will promote a survey app co-developed with engineers at image-sharing website Pinterest.
*May Require Paid Registration
4G Networks Vulnerable to Denial of Service Attacks, Subscriber Tracking
ZDNet Charlie Osborne March 26, 2020
A report from enterprise security solutions provider Positive Technologies indicates that every 4G network is susceptible to a form of denial-of-service (DoS) attack. The report centers on the industry-standard Diameter signaling protocol, which in 4G is used to authenticate and authorize messages. The report analyzed the networks of 28 telecommunications operators across Europe, Asia, Africa, and South America between 2018 and 2019, finding that every attempted attack against these networks succeeded in some form. The researchers found that DoS was the easiest form of cyberattack to attempt, due to architectural flaws in the Diameter protocol. They said these security weaknesses will continue to exist as 5G networks build out on existing architecture and the Diameter protocol.
IISc Team Proposes Efficient Design for Quantum Circuits
Indian Institute of Science Aniket Majumdar March 27, 2020
Indian Institute of Science (IISc) researchers have developed an algorithm to count the computing resources needed to design a theoretical quantum circuit, and optimized it for maximum efficiency. The researchers considered a previous study that found counting gates to realize maximum efficiency is equivalent to finding the path with the shortest distance between two points in some mathematical space with a specific volume. IISc's Aninda Sinha said his group's calculations confirmed that the minimum number of gates varies directly with the volume of that space. Their results also appear to connect the efficiency optimization problem with string theory, and the researchers think this connection can be critical in helping scientists interpret theories involving gravity.