Welcome to the March 22, 2021 edition of ACM TechNews, providing timely information for IT professionals three times a week.
|
Lip-Reading Software Helps Users of All Abilities to Send Secure Messages
University of California News Lorena Anderson March 18, 2021
Lip-reading software developed by researchers in the Human-Computer Interaction Group at the University of California, Merced (UC Merced) can continuously learn from its users. LipType allows users to send texts or emails on their computers and mobile devices, and to engage contactlessly with public devices, all without speaking aloud. The UC Merced team incorporated filters for different lighting conditions and a mistake corrector based on language models into LipType, making it faster and less error-prone than other lip-readers. UC Merced's Laxmi Pandey said, "LipType performed 58% faster and 53% more accurately than other state-of-the-art models in various real-world settings, including poor lighting and busy market situations."
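For a sense of how a language model-based mistake corrector can work, here is a minimal, hypothetical Python sketch: it rescores candidate decodings from a visual model with a toy unigram prior. The vocabulary, scores, and interface are illustrative and not LipType's actual design.

```python
# Hypothetical sketch of a "mistake corrector": rescore candidate decodings
# from a lip-reading model with a simple language-model prior. All data and
# names here are illustrative, not LipType's real components.

from math import log

# Toy unigram frequencies standing in for a real language model.
UNIGRAM = {"send": 0.04, "sand": 0.002, "the": 0.06, "email": 0.01, "now": 0.02}

def lm_score(sentence: str, floor: float = 1e-6) -> float:
    """Sum of log-probabilities under the toy unigram model."""
    return sum(log(UNIGRAM.get(w, floor)) for w in sentence.lower().split())

def correct(candidates, lm_weight: float = 1.0) -> str:
    """Pick the candidate maximizing visual score plus weighted LM score."""
    return max(candidates, key=lambda c: c[1] + lm_weight * lm_score(c[0]))[0]

# The visual decoder slightly prefers "sand", but the LM prior fixes it.
n_best = [("sand the email now", -1.0), ("send the email now", -1.2)]
print(correct(n_best))  # -> "send the email now"
```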
|
System Detects Errors When Medication is Self-Administered
MIT News Daniel Ackerman March 18, 2021
Massachusetts Institute of Technology (MIT) researchers have developed a system to detect and reduce errors when patients administer their own medication. The system combines wireless sensing with artificial intelligence (AI) to determine when a patient is using an insulin pen or inhaler, and flags potential errors in self-administration. A sensor tracks the patient's movements within a 10-meter (32-foot) radius of the medical device via radio waves, then AI interprets the reflected waves for signs of the patient self-administering an inhaler or insulin pen; the final step is to alert the patient or healthcare provider when an error has been detected. Said MIT's Mingmin Zhao, "We can not only see how frequently the patient is using their device, but also assess their administration technique to see how well they're doing."
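The summary describes a three-stage pipeline (sense, classify, alert). The toy Python sketch below mirrors that flow with synthetic signals, a placeholder threshold classifier, and a made-up hold-time rule; the MIT system's actual RF processing and model are not detailed here, so every specific below is an assumption.

```python
# A toy end-to-end sketch of the sense -> classify -> alert pipeline.
# The feature extraction, threshold "model," and error rule are all
# placeholders standing in for the MIT system's actual RF processing
# and neural network, which this summary does not detail.

from typing import Optional

import numpy as np

RNG = np.random.default_rng(0)

def sense_reflections(n_samples: int = 256) -> np.ndarray:
    """Stand-in for the radio sensor: a window of reflected-signal
    amplitudes (synthetic noise plus a burst simulating arm motion)."""
    signal = RNG.normal(0.0, 0.1, n_samples)
    signal[100:140] += 0.8  # simulated motion toward the insulin pen
    return signal

def classify_motion(window: np.ndarray) -> str:
    """Placeholder for the learned model: thresholds burst energy."""
    return "insulin_pen_use" if float(np.mean(window ** 2)) > 0.05 else "no_event"

def check_technique(event: str, hold_seconds: float) -> Optional[str]:
    """Toy error rule: flag injections where the pen was held too briefly."""
    if event == "insulin_pen_use" and hold_seconds < 5.0:
        return "ALERT: possible administration error (hold time too short)"
    return None

alert = check_technique(classify_motion(sense_reflections()), hold_seconds=2.0)
print(alert or "No issues detected")
```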
|
Engineers Combine AI, Wearable Cameras in Self-Walking Robotic Exoskeletons
University of Waterloo News (Canada) March 15, 2021
Researchers at Canada's University of Waterloo have combined computer vision and deep-learning artificial intelligence (AI) technology in an effort to develop robotic exoskeleton legs that can make decisions. Current exoskeleton legs must be controlled manually via smartphone applications or joysticks. The researchers used wearable cameras fitted to exoskeleton users and AI software to process the video feed to recognize stairs, doors, and other aspects of the surrounding environment. Waterloo’s Brokoslaw Laschowski said, "Our control approach wouldn't necessarily require human thought. Similar to autonomous cars that drive themselves, we're designing autonomous exoskeletons that walk for themselves."
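As a rough illustration of the vision stage, the sketch below classifies a camera frame into a handful of environment classes with an off-the-shelf convolutional network; the class list, architecture choice, and untrained weights are assumptions rather than the Waterloo team's actual design.

```python
# Illustrative sketch only: classify each wearable-camera frame into
# environment classes so a downstream controller could select a gait.
# ResNet-18 and the class labels are assumptions, not the published system.

import torch
from torchvision.models import resnet18

CLASSES = ["level_ground", "stairs_up", "stairs_down", "door"]

model = resnet18(num_classes=len(CLASSES))  # would be trained on labeled frames
model.eval()

def classify_frame(frame: torch.Tensor) -> str:
    """frame: a (3, 224, 224) RGB tensor from the wearable camera."""
    with torch.no_grad():
        logits = model(frame.unsqueeze(0))
    return CLASSES[int(logits.argmax(dim=1))]

# Dummy frame in place of a real camera capture.
print(classify_frame(torch.rand(3, 224, 224)))
```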
|
UC Chemists Use Supercomputers to Understand Solvents
University of Cincinnati News Michael Miller March 19, 2021
University of Cincinnati (UC) chemists Thomas Beck and Andrew Eisenhart used a supercomputer to understand the basic characteristics of an industrial solvent via quantum simulation. The researchers employed the university’s Advanced Research Computing Center and the Ohio Supercomputer Center to investigate glycerol carbonate. Said Eisenhart, "Quantum simulations have been around for quite a while. But the hardware that's been evolving recently—things like graphics processing units and their acceleration when applied to these problems—creates the ability to study larger systems than we could in the past." Eisenhart said the analysis provided insights into how small modifications to molecular structure can have larger effects on the solvent overall, "and how these small changes make its interactions with very important things like ions and can have an effect on things like battery performance."
|
Technology Will Create Millions of Jobs. The Problem Will Be to Find Workers to Fill Them
ZDNet Daphne Leprince-Ringuet March 19, 2021
Economic analysis by Boston Consulting Group (BCG) indicates that new technologies will create tens of millions of jobs by 2030, but that a mismatch between the skills of workers displaced by automation and those the new roles require means many could go unfilled. Models of prospective changes in labor supply and demand in Germany, Australia, and the U.S. forecast that the next decade's job losses will be matched by even greater job creation. BCG's Miguel Carrasco said, "It is unrealistic to expect perfect exchangeability—not all of the surplus capacity in the workforce can be redeployed to meet new or growing demand." Occupations facing the biggest shortages include computer-related professions and jobs in science, technology, engineering, and math. BCG recommends aggressively upskilling and retraining the workforce to ensure demand for talent is met in time.
|
Quantum Algorithm Surpasses QPE Norm
Osaka City University (Japan) March 17, 2021
Researchers at Japan's Osaka City University (OCU) have developed a quantum algorithm that requires a tenth the computational cost of the Quantum Phase Estimation (QPE) benchmark. Rather than computing the total energies of electronic states separately, the Bayesian eXchange coupling parameter calculator with Broken-symmetry wave functions (BxB) calculates the energy differences between states directly. The OCU team used BxB to calculate the vertical ionization energies of small molecules such as carbon monoxide, oxygen, cyanide, fluorine, water, and ammonia to within 0.1 electron volts, using half the number of quantum bits. The researchers said the computational cost of reading out the ionization energy remained constant, regardless of the number of atoms or molecular size.
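To make the quantity concrete, the toy classical snippet below computes a vertical ionization energy as the difference of two ground-state energies (here, of arbitrary 2x2 matrices); BxB's advantage is evaluating such differences directly on quantum hardware rather than subtracting separately computed total energies.

```python
# Toy classical illustration of the quantity being computed: a vertical
# ionization energy as a difference of two ground-state energies. The 2x2
# "Hamiltonians" are arbitrary numbers chosen for the example; they are not
# real molecular data, and this is not the BxB algorithm itself.

import numpy as np

H_neutral = np.array([[-14.0, 0.5], [0.5, -12.0]])  # toy neutral molecule
H_cation = np.array([[-13.2, 0.4], [0.4, -11.5]])   # toy ionized molecule

E_neutral = np.linalg.eigvalsh(H_neutral)[0]  # ground-state energy (toy eV)
E_cation = np.linalg.eigvalsh(H_cation)[0]

print(f"Vertical ionization energy: {E_cation - E_neutral:.2f} eV")
```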
|
'Expert' Hackers Used 11 Zerodays to Infect Windows, iOS, Android Users
Ars Technica Dan Goodin March 18, 2021
Google's Project Zero security researchers warned that a team of hackers used no fewer than 11 zeroday vulnerabilities over nine months, exploiting compromised websites to infect fully patched devices running the Windows, iOS, and Android operating systems. The group leveraged four zerodays in February 2020, and its ability to chain multiple zerodays together to compromise patched devices prompted Project Zero and Threat Analysis Group analysts to deem the attackers "highly sophisticated." Project Zero's Maddie Stone said that over the ensuing eight months the hackers exploited seven more previously unknown iOS zerodays via watering-hole attacks. Blogged Stone, "Overall each of the exploits themselves showed an expert understanding of exploit development and the vulnerability being exploited."
|
France's Competition Authority Declines to Block Apple's Opt-in Consent for iOS App Tracking
TechCrunch Natasha Lomas March 17, 2021
France's competition authority (FCA) has rejected calls by French advertisers to block looming pro-privacy changes requiring third-party applications to obtain consumers’ consent before tracking them on Apple iOS. FCA said it does not currently deem Apple's introduction of the App Tracking Transparency (ATT) feature as abuse of its dominant position. However, the regulator is still probing Apple "on the merits," and aims to ensure the company is not applying preferential rules for its own apps compared to those of third-party developers. An Apple spokesperson said, "ATT will provide a powerful user privacy benefit by requiring developers to ask users' permission before sharing their data with other companies for the purposes of advertising, or with data brokers. We firmly believe that users' data belongs to them, and that they should control when that data is shared, and with whom."
|
Novel Deep Learning Framework for Symbolic Regression
Lawrence Livermore National Laboratory March 18, 2021
Computer scientists at Lawrence Livermore National Laboratory (LLNL) have created a new framework and visualization tool that applies deep reinforcement learning to symbolic regression problems. Symbolic regression, a type of discrete optimization that seeks to determine the underlying equations or dynamics of a physical process, generally is approached in machine learning and artificial intelligence with evolutionary algorithms, which LLNL's Brenden Petersen said do not scale well. LLNL's Mikel Landajuela explained, "At the core of our approach is a neural network that is learning the landscape of discrete objects; it holds a memory of the process and builds an understanding of how these objects are distributed in this massive space to determine a good direction to follow." The team's algorithm outperformed several common benchmarks when tested on a set of symbolic regression problems.
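The sketch below shows symbolic regression as discrete search in its simplest form: sample candidate expressions, score them against data, keep the best. In the LLNL framework, a reinforcement learning-trained neural network replaces the random sampler here; everything else in this toy example is an illustrative assumption.

```python
# Stripped-down sketch of symbolic regression as discrete optimization:
# sample candidate expressions, score the fit, keep the best. LLNL's
# framework steers the sampling with an RL-trained neural network; this
# toy version samples uniformly at random.

import math
import random

random.seed(0)
X = [x / 10 for x in range(-20, 21)]
Y = [x * x + x for x in X]  # hidden target: f(x) = x^2 + x

OPS = ["x", "x*x", "x*x*x", "math.sin(x)"]

def sample_expression() -> str:
    """Random two-term expression; a learned policy would go here."""
    return " + ".join(random.sample(OPS, 2))

def fitness(expr: str) -> float:
    """Negative mean squared error of the candidate expression."""
    f = eval(f"lambda x: {expr}")  # safe here: expr is built from OPS only
    return -sum((f(x) - y) ** 2 for x, y in zip(X, Y)) / len(X)

best = max((sample_expression() for _ in range(50)), key=fitness)
print("Best expression found:", best)
```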
|
Standard Digital Camera, AI to Monitor Soil Moisture for Affordable Smart Irrigation
University of South Australia March 15, 2021
Researchers at the University of South Australia (UniSA) and Middle Technical University in Iraq developed a smart irrigation system that uses a standard RGB digital camera and machine learning technology to monitor soil moisture. The new method aims to make precision soil monitoring easier and more cost-effective by eliminating the need for specialized hardware and expensive thermal imaging cameras that can encounter issues in certain climatic conditions. UniSA's Ali Al-Naji said the system was found to accurately determine moisture content at different distances, times, and illumination levels. The camera was connected to an artificial neural network, which could allow the system to be trained to recognize the specific soil conditions of any location. UniSA's Javaan Chahl said, "Once the network has been trained it should be possible to achieve controlled irrigation by maintaining the appearance of the soil at the desired state."
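As an illustration of the general approach, the hypothetical sketch below maps simple color statistics of a soil photo to a moisture estimate with a small neural network trained on synthetic data; the actual UniSA features, network, and training data may differ.

```python
# Illustrative sketch: reduce each RGB soil photo to simple color statistics
# (wet soil photographs darker), then train a small neural network to map
# those statistics to moisture content. The synthetic data and feature choice
# are assumptions, not the published system.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def image_features(mean_brightness: float) -> np.ndarray:
    """Stand-in for features computed from a real photo (mean R, G, B)."""
    return mean_brightness + rng.normal(0, 2, size=3)

# Synthetic training set: darker images correspond to wetter soil.
moisture = rng.uniform(5, 40, size=200)   # percent water content (toy)
brightness = 180 - 2.5 * moisture         # toy darkening relation
features = np.stack([image_features(b) for b in brightness])

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(features, moisture)

sample = image_features(120.0).reshape(1, -1)  # features of a "new photo"
print(f"Estimated moisture: {model.predict(sample)[0]:.1f}%")
```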
|
Learning Apps Have Boomed in the Pandemic. Now Comes the Real Test.
The New York Times Natasha Singer March 22, 2021
The pandemic fueled demand for education technology, with CB Insights reporting a surge in venture and equity financing for education technology startups from $4.81 billion in 2019 to $12.58 billion in 2020. Futuresource Consulting reports a jump in the number of laptops and tablets shipped to U.S. primary and secondary schools from 14 million to 26.7 million over the same period. The pandemic prompted schools to use videoconferencing and other digital tools to replicate the school day for remote students, rather than implement artificial intelligence-powered apps that could tailor lessons to a child's abilities. However, apps that facilitate online interactions between students and teachers have grabbed investors' attention. The reading lesson app Newsela, for instance, is now valued at $1 billion. Whether these apps will remain popular will depend on how useful they prove to be in the classroom amid the shift back to in-person learning.
*May Require Paid Registration
|
Are Quantum Computers Good at Picking Stocks? This Project Tried to Find Out
ZDNet Daphne Leprince-Ringuet March 15, 2021
Researchers from the Technical University of Denmark (DTU), consultancy firm KPMG, and an unnamed European bank are piloting the use of quantum computing for portfolio optimization. The researchers turned economist Harry Markowitz's classical model for portfolio selection into a quadratic unconstrained binary optimization problem based on budget, expected return, and other criteria. Using D-Wave's 2,000-qubit quantum annealing processor, the researchers could embed and run the problem for up to 65 assets, versus 25 for the brute force method. D-Wave's processor outperformed brute force for 15 or more assets, and the simulated annealing method for more than 25 assets. DTU's Ulrich Busk Hoff said, "As the portfolio size was increased, a degradation in the quality of the solutions found by quantum annealing was indeed observed. But after optimization, the solutions were still competitive and were more often than not able to beat simulated annealing."
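The sketch below shows what such a QUBO formulation looks like at toy scale: a binary energy combining risk, expected return, and a budget penalty, minimized here by brute force where an annealer would sample; all numbers are made up.

```python
# Small sketch of a Markowitz-style QUBO: choose assets to maximize expected
# return minus risk, with a penalty enforcing the budget. Returns, covariances,
# and weights are invented; on a real annealer the energy function would be
# encoded as a QUBO matrix and submitted to the device instead of brute-forced.

import itertools

import numpy as np

mu = np.array([0.10, 0.07, 0.12, 0.05])        # expected returns (toy)
sigma = np.array([[0.08, 0.02, 0.01, 0.00],    # covariance matrix (toy)
                  [0.02, 0.06, 0.01, 0.01],
                  [0.01, 0.01, 0.09, 0.02],
                  [0.00, 0.01, 0.02, 0.04]])
budget, risk_aversion, penalty = 2, 1.0, 1.0   # pick exactly 2 assets

def qubo_energy(x: np.ndarray) -> float:
    """Markowitz objective as an unconstrained binary energy to minimize."""
    return (risk_aversion * x @ sigma @ x          # portfolio risk
            - mu @ x                               # minus expected return
            + penalty * (x.sum() - budget) ** 2)   # budget constraint penalty

# Brute force over all 2^4 selections; an annealer samples this minimum.
best = min((np.array(bits) for bits in itertools.product([0, 1], repeat=4)),
           key=qubo_energy)
print("Selected assets:", np.flatnonzero(best))
```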
|
Continuous Upgrades to Datacenter Virtualization Setups Key to Curbing Carbon Emissions
ComputerWeekly.com Caroline Donnelly March 15, 2021
Carbon emissions tied to European datacenters could be lowered 55% by 2040 through increased use of virtualization technologies, according to a report by Aurora Energy Research commissioned by VMware. However, the report also found that maintaining current datacenter deployment levels would see emissions rise by more than 250% over the same period. The researchers said that even as the pandemic significantly reduced carbon emissions from commuting, energy consumption by datacenters increased due to the swift adoption of remote working and online learning. In addition to increased deployment of virtualization technologies, the researchers emphasized the need for more renewably powered datacenters.