Welcome to the September 22, 2021 edition of ACM TechNews, providing timely information for IT professionals three times a week.

ACM TechNews mobile apps are available for Android phones and tablets, and for iPhones and iPads.

To view "Headlines At A Glance," hit the link labeled "Click here to view this online" found at the top of the page in the html version. The online version now has a button at the top labeled "Show Headlines."

Researchers are turning their artificial intelligence tools to the study of fundamental questions about treatments for COVID.
Preparing for a Future Pandemic with AI
Pacific Northwest National Laboratory
Melissae Fellet
September 21, 2021


Artificial intelligence (AI)-driven investigations into COVID treatments by scientists at the U.S. Department of Energy's Pacific Northwest National Laboratory (PNNL) are yielding insights applicable to future pandemic responses. PNNL's Jeremy Zucker and colleagues used a counterfactual AI framework to analyze simulated biochemical data from hypothetical patients with severe COVID-19 infections; each fictitious patient had a different viral load, received a different dose of a drug, and either recovered or died. The analysis delivered more precise information about the drug's potential benefit to individuals, compared with algorithms that just predicted average patient post-treatment outcomes. Researchers at PNNL and the University of Washington also combined high-throughput biochemical measurements and AI-based screening to single out one molecule with promising antiviral activity against SARS-CoV-2 from a pool of 13,000.
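A minimal sketch of the counterfactual idea at work here, assuming a toy logistic outcome model: instead of reporting one average treatment effect, the analysis predicts what would happen to the same patient under different doses. Every variable, coefficient, and effect size below is invented for illustration and is not PNNL's actual framework.

```python
# Illustrative sketch only: a toy counterfactual analysis in the spirit of the
# PNNL study. The data, model, and effect sizes are all synthetic assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

viral_load = rng.uniform(0, 10, n)          # hypothetical severity measure
dose = rng.uniform(0, 5, n)                 # hypothetical drug dose

# Assumed ground truth: the drug helps more at high viral loads.
logit = -1.0 + 0.6 * viral_load - 0.4 * dose - 0.05 * viral_load * dose
p_death = 1 / (1 + np.exp(-logit))
died = rng.random(n) < p_death

# "Average outcome" view: one effect estimate for everyone.
avg_effect = died[dose > 2.5].mean() - died[dose <= 2.5].mean()

# Counterfactual view: for one patient, compare predicted outcomes under two
# doses while holding viral load fixed.
def predicted_risk(v, d):
    return 1 / (1 + np.exp(1.0 - 0.6 * v + 0.4 * d + 0.05 * v * d))

patient_v = 8.0
individual_effect = predicted_risk(patient_v, 5.0) - predicted_risk(patient_v, 0.0)
print(f"average effect of high dose: {avg_effect:+.3f}")
print(f"counterfactual effect at viral load {patient_v}: {individual_effect:+.3f}")
```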

Full Article
How Quickly Do Algorithms Improve?
MIT News
Rachel Gordon
September 20, 2021


Massachusetts Institute of Technology (MIT) researchers reviewed 57 textbooks and over 1,110 research papers to chart algorithms' historical evolution. They covered 113 algorithm families (sets of algorithms solving the same problem) that textbooks highlighted as most important; the team traced each family's history, tracking each time a new algorithm was proposed and noting those that were more efficient. Each family averaged eight algorithms, of which a few improved efficiency. Forty-three percent of families dealing with large computing problems saw year-on-year gains equal to or larger than Moore's Law-dictated improvements, while in 14% of problems, algorithmic performance improvements overtook hardware-related upgrades. MIT's Neil Thompson said, "Through our analysis, we were able to say how many more tasks could be done using the same amount of computing power after an algorithm improved."
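As a back-of-the-envelope illustration of how algorithmic gains can outpace hardware, consider an assumed improvement from an O(n^2) to an O(n log n) algorithm versus a decade of Moore's Law; the problem size and time span are arbitrary choices for the example.

```python
# Illustrative arithmetic: compare the speedup from an algorithmic improvement
# (O(n^2) -> O(n log n)) with Moore's Law hardware gains over the same decade.
import math

n = 1_000_000                                # assumed problem size
algo_speedup = n**2 / (n * math.log2(n))     # ratio of operation counts

years = 10
moore_speedup = 2 ** (years / 2)             # doubling roughly every two years

print(f"algorithmic speedup at n={n:,}: {algo_speedup:,.0f}x")
print(f"Moore's Law speedup over {years} years: {moore_speedup:.0f}x")
```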

Full Article
A Way to Solve the 'Hardest of the Hard' Computer Problems
Ohio State News
Jeff Grabmeier
September 21, 2021


Scientists have vastly accelerated the speed and efficiency of reservoir computing to tackle some of the most difficult information processing challenges. An Ohio State University (OSU) team ran next-generation reservoir computing on a Lorenz weather prediction task, where it was 33 to 163 times faster than the current-generation model; when the goal was matching forecasting accuracy, the new model was about a million times faster. OSU's Daniel Gauthier said it achieved this improvement using the equivalent of just 28 neurons, versus 4,000 for the current-generation model. Gauthier credited the speedup to the new model's reduced warmup and training time, adding, "What's exciting is that this next generation of reservoir computing takes what was already very good and makes it significantly more efficient."
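For readers curious how a reservoir computer can get by with so few "neurons," here is a minimal sketch in the next-generation style (time-delayed states plus quadratic features feeding a ridge-regression readout) applied to Lorenz forecasting. The step size, delay count, and regularization are illustrative choices, not the OSU team's exact settings; with two delay taps and quadratic features, the feature vector happens to have 28 entries.

```python
# Minimal sketch of "next generation" reservoir computing: a nonlinear vector
# autoregression with polynomial features. Parameters are illustrative.
import numpy as np

def lorenz_step(s, dt=0.025):
    # One RK4 step of the Lorenz-63 system.
    def f(s):
        x, y, z = s
        return np.array([10*(y - x), x*(28 - z) - y, x*y - 8/3*z])
    k1 = f(s); k2 = f(s + dt/2*k1); k3 = f(s + dt/2*k2); k4 = f(s + dt*k3)
    return s + dt/6*(k1 + 2*k2 + 2*k3 + k4)

# Generate training data.
T = 3000
traj = np.empty((T, 3))
traj[0] = [1.0, 1.0, 1.0]
for t in range(T - 1):
    traj[t+1] = lorenz_step(traj[t])

k = 2              # number of time-delay taps
def features(window):
    lin = np.concatenate(window)                          # current + delayed states
    quad = np.outer(lin, lin)[np.triu_indices(len(lin))]  # unique pairwise products
    return np.concatenate(([1.0], lin, quad))             # 1 + 6 + 21 = 28 features

X = np.array([features(traj[t-k+1:t+1]) for t in range(k-1, T-1)])
Y = traj[k:] - traj[k-1:-1]            # learn the one-step increment

# Ridge-regression readout -- the only "training" this approach needs.
ridge = 1e-6
W = np.linalg.solve(X.T @ X + ridge*np.eye(X.shape[1]), X.T @ Y)

# Autonomous forecast from the end of the training data.
window = list(traj[T-k:T])
for _ in range(5):
    nxt = window[-1] + features(window[-k:]) @ W
    window.append(nxt)
    print(nxt)
```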

Full Article
Quantum Computer Helps Design Better Quantum Computer
New Scientist
Matthew Sparkes
September 20, 2021


Researchers at the University of Science and Technology of China in Shanghai used a quantum computer to design a quantum bit (qubit) that could power better quantum systems. The resulting plasmonium qubit is smaller, less noisy, and retains its state longer than the team's current qubit model. The researchers used a variational quantum eigensolver algorithm to simulate particle behavior in quantum circuits, smoothing out negative properties while developing advantageous features without needing to build many physical prototypes. Each plasmonium qubit is only 240 micrometers long, or 40% the size of a transmon qubit, which will allow processor miniaturization. The plasmonium qubit's strong anharmonicity also means its additional energy states are well separated and less likely to be reached accidentally, reducing the potential for computational errors.
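To make the method concrete, here is a toy, classically simulated illustration of the variational principle behind a VQE: a parameterized state is tuned until a Hamiltonian's expected energy is minimized. The single-qubit Hamiltonian and ansatz are invented for the example and have nothing to do with the actual plasmonium design problem.

```python
# Toy, classically simulated VQE: tune a one-parameter ansatz to minimize the
# energy of an example single-qubit Hamiltonian H = 0.5*Z + 0.8*X.
import numpy as np
from scipy.optimize import minimize_scalar

Z = np.array([[1, 0], [0, -1]], dtype=float)
X = np.array([[0, 1], [1, 0]], dtype=float)
H = 0.5 * Z + 0.8 * X

def energy(theta):
    # Ansatz: |psi> = Ry(theta)|0> = [cos(theta/2), sin(theta/2)]
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return psi @ H @ psi            # expectation value <psi|H|psi>

result = minimize_scalar(energy, bounds=(0, 2 * np.pi), method="bounded")
exact = np.linalg.eigvalsh(H)[0]    # true ground-state energy for comparison
print(f"VQE estimate: {result.fun:.4f}, exact: {exact:.4f}")
```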

Full Article

Concept rendering of a digital cooking appliance that boasts dozens of ingredients and a precise cooking laser to assemble and cook meals using digital recipes.
Now We're Cooking with Lasers
Columbia University
Holly Evarts
September 17, 2021


Columbia University researchers have been creating laser-cooked and three-dimensionally (3D) printed foods since 2007. Recent experiments explored different cooking modalities by exposing 3D-printed chicken samples to blue and infrared light, then evaluating cooking depth, color development, moisture retention, and flavor differences between laser-cooked and stove-cooked meat. The team learned laser-cooked meat shrinks less, retains more moisture, and exhibits similar flavor development to stove-cooked meat. Columbia's Hod Lipson said, "What we still don't have is what we call 'Food CAD [computer-assisted design],' sort of the Photoshop of food. We need a high-level software that enables people who are not programmers or software developers to design the foods they want. And then we need a place where people can share digital recipes, like we share music."

Full Article

Researchers at Cornell University say their new recommendation algorithm could improve Spotify by incorporating both likes and dislikes.
'Dislike' Button Would Improve Spotify Recommendations
Cornell Chronicle
Adam Conner-Simons
September 15, 2021


A research team at Cornell University has developed a recommendation algorithm that could improve audio streaming and media services provider Spotify by incorporating both likes and dislikes. The group showed that a listener is about 20% more likely to like a song if the algorithm recommending it is trained on 400,000 likes and dislikes, as opposed to an algorithm trained solely on likes. Cornell's Sasha Stoikov said an algorithm that only keys on likes has a greater chance of recommending songs the listener dislikes. The new Piki system picks music from a database of approximately 5 million songs and gives users $1 for every 25 songs they rate, which Stoikov said "incentivizes the user to vote truthfully."
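A rough sketch of why explicit dislikes help, using generic logistic matrix factorization (not the Cornell team's actual Piki model): training on 0-labeled dislikes actively pushes disliked songs down the ranking, which likes-only training cannot do. All data below is synthetic.

```python
# Illustrative logistic matrix factorization on (user, song, liked) triples,
# where liked is 1 (like) or 0 (dislike). Synthetic data; not Piki's model.
import numpy as np

rng = np.random.default_rng(1)
n_users, n_songs, dim = 100, 500, 8

ratings = [(rng.integers(n_users), rng.integers(n_songs), rng.integers(2))
           for _ in range(20_000)]

U = rng.normal(0, 0.1, (n_users, dim))   # user embeddings
S = rng.normal(0, 0.1, (n_songs, dim))   # song embeddings

lr = 0.05
for epoch in range(5):
    for u, s, liked in ratings:
        p = 1 / (1 + np.exp(-U[u] @ S[s]))    # predicted like probability
        g = p - liked                          # gradient of the log loss
        U[u], S[s] = U[u] - lr * g * S[s], S[s] - lr * g * U[u]

# Recommend: rank songs for user 0 by predicted like probability.
scores = U[0] @ S.T
print("top songs for user 0:", np.argsort(-scores)[:5])
```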

Full Article
Digital Sky Survey Receives SIGMOD Award
Fermilab
Lisa Roberts
September 20, 2021


The ACM Special Interest Group on Management of Data (SIGMOD) has given its 2021 ACM SIGMOD Award to the Sloan Digital Sky Survey (SDSS) astronomy project for its "early and influential demonstration of the power of data science to transform a scientific domain." Launched in 1998, the project continues to collect huge amounts of data on objects in the night sky. The award recognized contributions by more than a dozen people, including the Fermi National Accelerator Laboratory's Bill Boroski, Steve Kent, and Brian Yanny for the development of database systems to distribute SDSS data.

Full Article

A microscope image of one of the organic electrochemical transistor networks that the researchers created, made from polymer-based fibers.
Brain-Inspired AI Enables Future Medical Implants
IEEE Spectrum
Rebecca Sohn
September 10, 2021


An international team of researchers developed a biocompatible artificial intelligence system from networks of polymer fibers that can detect and categorize abnormal electrical signals in the human body. The researchers fabricated the networks from a carbon-based material called PEDOT and immersed them in an electrolyte solution that replicates the inside of the human body, where they operate as organic electrochemical transistors. These devices transform electrical inputs from the body into nonlinear output signals, much as digital computers operate on binary values, making them suitable for computation. In tests, the system successfully distinguished irregular from regular heartbeats.
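As a loose software analogy (not the device physics), the sketch below shows how a single nonlinear transformation can make irregular and regular heartbeats separable by a trivial readout; the synthetic inter-beat intervals and tanh nonlinearity are assumptions for illustration.

```python
# Toy analogy only: a nonlinear feature plus a one-weight logistic readout
# separates irregular from regular synthetic heartbeats.
import numpy as np

rng = np.random.default_rng(2)

def beats(irregular, n=32):
    jitter = 0.25 if irregular else 0.02      # assumed variability levels
    return 0.8 + jitter * rng.standard_normal(n)

X = np.array([beats(i % 2 == 1) for i in range(400)])
y = np.array([i % 2 for i in range(400)])

# Feature: variability of the intervals, passed through a nonlinearity.
feature = np.tanh(10 * X.std(axis=1))

# One-weight logistic readout trained by gradient descent.
w, b = 0.0, 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(w * feature + b)))
    w -= 0.1 * ((p - y) * feature).mean()
    b -= 0.1 * (p - y).mean()

pred = (1 / (1 + np.exp(-(w * feature + b)))) > 0.5
print(f"accuracy: {(pred == y).mean():.2%}")
```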

Full Article
Upgrades to Old Wireless Tech Could Enable Real-Time 3D Motion Capture
UC San Diego News Center
September 21, 2021


Electrical engineers at the University of California, San Diego (UCSD) have upgraded ultra-wideband (UWB) wireless technology in ways that eventually could support real-time three-dimensional (3D) motion capture. UWB promises to help locate and track the movement of devices with greater accuracy than Wi-Fi and Bluetooth. UCSD's Dinesh Bharadia said faster performance, extremely low-power operation, and highly accurate 3D localization also must be realized to achieve real-time 3D motion capture. His team updated the system's software so the attachable tracking device, or tag, emits just one radio signal to all anchor devices. The prototype communicates data with 1-millisecond latency, can run continuously for over two years on a small battery, and can pinpoint the 3D location of stationary objects within three centimeters and of moving objects within eight centimeters.
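The localization step itself can be illustrated with standard multilateration: given a tag's measured distances to several anchors at known positions, its 3D coordinates follow from a linearized least-squares solve. The anchor layout and noise level below are invented for the example.

```python
# Sketch of the multilateration math behind UWB localization: recover a tag's
# 3D position from noisy distances to known anchors. Layout is illustrative.
import numpy as np

anchors = np.array([[0, 0, 0], [4, 0, 0], [0, 4, 0], [0, 0, 3], [4, 4, 3]], float)
true_pos = np.array([1.2, 2.5, 1.1])

rng = np.random.default_rng(3)
d = np.linalg.norm(anchors - true_pos, axis=1) + 0.01 * rng.standard_normal(5)

# Subtracting the first anchor's range equation from the others linearizes
# the problem: 2(a_i - a_0) . p = |a_i|^2 - |a_0|^2 - d_i^2 + d_0^2
A = 2 * (anchors[1:] - anchors[0])
b = (np.sum(anchors[1:]**2, axis=1) - np.sum(anchors[0]**2)
     - d[1:]**2 + d[0]**2)
est, *_ = np.linalg.lstsq(A, b, rcond=None)
print("estimated position:", est.round(3),
      "error:", np.linalg.norm(est - true_pos).round(3))
```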

Full Article

The components of a supercomputer, visualized with the Unity 3D game engine.
Researchers Turn to Unity 3D Game Engine for Supercomputer Diagnostics
Tom's Hardware
Aleksandar Kostovic
September 20, 2021


Massachusetts Institute of Technology (MIT) scientists tapped the Unity 3D game engine to enhance supercomputer diagnostics through hardware visualization. The resulting MM3D tool is part of the Data Center Infrastructure Management tool suite developed by the MIT SuperCloud unit. According to the researchers, MM3D "fully utilizes the capabilities of modern 3D [three-dimensional] gaming environments to create novel representations of computing hardware which intuitively represent the physical attributes of the supercomputer while displaying real-time alerts and component utilization."

Full Article
Two-Thirds of Cloud Attacks Could Be Stopped by Checking Configurations
ZDNet
Charlie Osborne
September 15, 2021


IBM Security X-Force's latest Cloud Security Threat Landscape report concluded that two-thirds of cloud attacks occurring during the year-long study period "would likely have been prevented by more robust hardening of systems, such as properly implementing security policies and patching systems." Researchers uncovered issues with credentials or policies in every penetration test performed by IBM's X-Force Red security team, and in the cloud environments it scanned. They cited improperly configured assets, password spraying, and pivoting from on-premises infrastructure as the most frequently observed initial breach vectors. Application programming interface configuration and security issues, remote exploitation, and accessing confidential data were other common vectors. The team recommended businesses "manage their distributed infrastructure as one single environment to eliminate complexity and achieve better network visibility from cloud to edge and back."
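In the spirit of the report's hardening advice, a configuration audit can be as simple as checking settings against a policy baseline. The sketch below uses an invented config schema and rule set; real audits would query cloud-provider APIs and apply far richer policies.

```python
# Illustrative configuration hardening check. The schema and rules here are
# invented for the example and are not IBM X-Force tooling.
MIN_PASSWORD_LENGTH = 14

def audit(config: dict) -> list[str]:
    findings = []
    if config.get("storage_public_access", False):
        findings.append("storage bucket allows public access")
    if config.get("password_min_length", 0) < MIN_PASSWORD_LENGTH:
        findings.append("password policy below minimum length")
    if not config.get("mfa_required", False):
        findings.append("multi-factor authentication not enforced")
    if config.get("pending_security_patches", 0) > 0:
        findings.append("systems have unapplied security patches")
    return findings

example = {"storage_public_access": True, "password_min_length": 8,
           "mfa_required": False, "pending_security_patches": 3}
for finding in audit(example):
    print("FAIL:", finding)
```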

Full Article

Association for Computing Machinery

1601 Broadway, 10th Floor
New York, NY 10019-7434
1-800-342-6626
(U.S./Canada)



ACM Media Sales

If you are interested in advertising in ACM TechNews or other ACM publications, please contact ACM Media Sales at (212) 626-0686, or visit ACM Media for more information.

To submit feedback about ACM TechNews, contact: [email protected]