Welcome to the October 23, 2017 edition of ACM TechNews, providing timely information for IT professionals three times a week.

ACM TechNews mobile apps are available for Android phones and tablets and for iPhones and iPads.

Tech Giants Are Paying Huge Salaries for Scarce AI Talent
The New York Times
Cade Metz
October 22, 2017


With artificial intelligence (AI) expertise in short supply, AI talent is commanding huge salaries from the largest technology companies, with sources estimating a typical specialist can make $500,000 or more annually. Element AI in Canada calculates that fewer than 10,000 people worldwide are sufficiently skilled to meet leading AI research challenges. Tech firms' overwhelming need for AI talent stems from their conviction that AI will revolutionize everything from driverless vehicles to digital assistants to medicine to robotics to stock trading. Top earners are executives with experience managing AI projects; although such hiring behavior is "rational" for companies, Carnegie Mellon University's Andrew Moore says the trend is not necessarily beneficial for society. The scarcity of AI specialists is spurring tech giants to hire top academic talent, limiting the number of professors available to teach AI skills. Some faculty members are compromising by splitting their time between industrial and academic pursuits.

Full Article
*May Require Free Registration
IBM Has Just Achieved an 'Impossible' Step in Quantum Computing
Alphr
Abigail Beall
October 23, 2017


IBM has pulled ahead of Google in the race toward practical quantum computing by simulating a 56-quantum-bit (qubit) quantum system on a classical machine. Simulating more than 49 qubits on a classical computer was previously deemed unworkable because the memory required grows exponentially with the number of qubits. IBM's simulation also significantly reduces the memory consumed by earlier efforts, using 4.5 TB compared with a Swiss project's 500 TB. IBM's Edwin Pednault says the breakthrough came from conceiving qubit simulation along a grid-circuit framework, in which "the gates form a bristle-brush pattern where the bristles are the entangling gates that are being applied to that qubit." Pednault notes his team used tensors to represent qubits and squeeze in more information. With IBM's achievement, "it's going to be much harder for quantum-device people to exhibit [quantum] supremacy," says the University of Southern California's Itay Hen.
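The tensor representation Pednault mentions can be illustrated with a toy simulator. The sketch below is illustrative only (it is not IBM's code and omits the grid-partitioning tricks that made 56 qubits feasible): it stores an n-qubit state as a rank-n tensor of shape (2, ..., 2) and applies gates by contracting only the target axes, rather than multiplying a full 2^n-dimensional state vector.

```python
import numpy as np

def apply_gate(state, gate, qubits):
    """Contract a k-qubit gate (2**k x 2**k matrix) into the state tensor."""
    k = len(qubits)
    gate = gate.reshape((2,) * (2 * k))  # split into output/input axes
    # Contract the gate's input axes against the targeted qubit axes
    state = np.tensordot(gate, state, axes=(range(k, 2 * k), qubits))
    # tensordot moves the result axes to the front; restore qubit ordering
    return np.moveaxis(state, range(k), qubits)

n = 3
state = np.zeros((2,) * n, dtype=complex)
state[(0,) * n] = 1.0                      # start in |000>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.eye(4)[:, [0, 1, 3, 2]]          # flips target when control is 1

state = apply_gate(state, H, [0])          # superpose qubit 0
state = apply_gate(state, CNOT, [0, 1])    # entangle qubits 0 and 1
# state is now (|000> + |110>)/sqrt(2), a Bell pair on qubits 0 and 1
```

Because only the contracted axes are touched per gate, a simulator can keep the state in pieces and defer work, which is the kind of bookkeeping that lets memory use fall far below the naive 2^n accounting.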

Full Article

Rendering the Invisible Visible
Ruhr-University Bochum (Germany)
Julia Weiler
October 17, 2017


Researchers at Ruhr University Bochum and the University of Duisburg-Essen in Germany are working with other institutions on the Mobile Material Characterization and Localization by Electromagnetic Sensing project, whose long-term goal is developing novel signal-processing methods for imaging and material characterization using radar. The researchers say they want to use these techniques in combination with radar-based localization of objects in order to develop a flying platform capable of generating a three-dimensional representation of its surroundings. The team has developed algorithms for converting a radar signal into an informative image with better focusing and fewer systemic measurement errors. The researchers note under controlled laboratory conditions, the new algorithms are able to determine the position of an object and whether it is composed of a different material than the surface on which it lies. The team says the next step is to enable the system to recognize what the object actually is.

Full Article

Selective Memory
MIT News
Larry Hardesty
October 22, 2017


Researchers at the Massachusetts Institute of Technology, Intel, and the Swiss Federal Institute of Technology in Zurich have unveiled Banshee, a new cache-management scheme that upgrades the data rate of in-package dynamic random-access memory (DRAM) caches by 33 percent to 50 percent. The team says Banshee adds three bits of data to each entry in the table mapping virtual addresses used by individual programs to the actual addresses of main-memory-stored data. The researchers note one bit signifies whether the data at that virtual address can be found in the DRAM cache, and the other two indicate its whereabouts relative to any other data items with the same hash index. Banshee also introduces a 5-kilobyte circuit, known as a tag buffer, in which any given core can record the new location of a data item it caches. The researchers presented Banshee last week at the IEEE/ACM Symposium on Microarchitecture (MICRO 50) in Boston.
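The bookkeeping described above can be sketched in Python. Everything below is an illustrative assumption (names, set count, and buffer capacity are not from the paper): each page-table entry carries one bit saying whether the page is in the DRAM cache plus two bits picking among the pages that share a hash index, while a small tag buffer holds remappings not yet written back to the page tables.

```python
NUM_SETS = 1024          # number of DRAM-cache sets (illustrative size)
WAYS = 4                 # 2 way bits -> up to 4 pages per set

class PageEntry:
    """A page-table entry extended with Banshee's three extra bits."""
    def __init__(self, phys_page):
        self.phys_page = phys_page
        self.cached = 0      # 1 bit: is this page in the DRAM cache?
        self.way = 0         # 2 bits: which of the set's slots it occupies

def cache_set(phys_page):
    """The hash index shared by pages competing for the same set."""
    return phys_page % NUM_SETS

class TagBuffer:
    """Small on-chip buffer recording remappings not yet propagated to
    the page tables (the article cites a 5-kilobyte structure)."""
    def __init__(self, capacity=512):
        self.capacity = capacity
        self.pending = {}    # phys_page -> (cached, way)

    def record(self, entry, way):
        entry.cached, entry.way = 1, way
        self.pending[entry.phys_page] = (1, way)
        return len(self.pending) >= self.capacity  # signal a flush

def lookup(entry):
    """Resolve a page to its DRAM-cache slot or to main memory."""
    if entry.cached:
        return ("dram-cache", (cache_set(entry.phys_page), entry.way))
    return ("main-memory", entry.phys_page)
```

With this layout most accesses resolve from the extended page-table bits alone; only when the tag buffer fills must the page tables be rewritten, which keeps the scheme's update traffic low.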

Full Article
Welcoming Our New Robot Overlords
The New Yorker
Sheelah Kolhatkar
October 23, 2017


Over the past decade, industrial robots and human laborers have switched roles, with robots now performing tasks while humans assist them. Roboticists such as Brown University professor Stefanie Tellex say they now are focused on human-robot interaction in the performance of complex tasks. "We're trying to make robots that can robustly perceive and manipulate the objects in their environment," Tellex says. Industrial robots that can operate in a constantly fluid environment represent a multi-billion-dollar business opportunity, but Tellex says her motivation for such projects is to help make society better. Although politically fraught, the issue of workplace automation is considered by some to have positive ramifications, including less physical stress for workers, less noise and dirt in production plants, and greater productivity. However, as automation raises efficiency in manufacturing and other sectors, the likelihood of machines eventually replacing the bulk of workforces escalates.

Full Article

Imagine a Human-Robot Friendship
Northeastern University News
Molly Callahan
October 19, 2017


Researchers at Northeastern University are studying how humans and robots can cooperate to complete space missions. The team is working with the U.S. National Aeronautics and Space Administration's Valkyrie robot, which was designed for missions in which robots would land on Mars and set up camp before humans' arrival. Designing robots to address challenges that would be dangerous for humans could potentially save human lives. "For a spacewalk, instead of sending two humans into this high-risk situation, why not send a human and a robot?" asks Northeastern University researcher Murphy Wonsick. Wonsick finds inspiration for her human-robot research in everyday human interactions. For example, a tall person and a short person working out how to solve a problem together face challenges not all that different from those in Northeastern's human-robot interaction research. "We have differences among ourselves and we figured it out," Wonsick says. "If we can do it, we can figure out how to teach robots to do it."

Full Article

Back to the Canyon
Sandia Labs News
Michael Padilla
October 19, 2017


A team of researchers from Sandia National Laboratories, the University of New Mexico, and the U.S. National Park Service is conducting the Rim-to-Rim Wearables at the Canyon for Health project, which aims to determine if fatigue can be predicted and whether life-threatening fatigue can be differentiated from recoverable fatigue. "With this study, we hope to identify predictive signatures for fatigue and quantify the type of fatigue," says Sandia researcher Glory Emmanuel Avina. The three-year study of rim-to-rim hikers at the Grand Canyon aims to predict the early onset of declines in performance and health. The team wants to identify which physiological and cognitive markers are most important for predicting performance and fatigue, with the goal of creating a single wearable device that can monitor fatigue in real time. "If a real-time analysis showed evidence of physiological, cognitive, or genetic predictors of fatigue, individuals could receive early warnings of potential health concerns," Emmanuel Avina says.

Full Article
Making Big Data a Little Smaller
Harvard University
Leah Burrows
October 19, 2017


Harvard University professor Jelani Nelson and Kasper Green Larsen of Aarhus University in Denmark have proven the optimality of the Johnson-Lindenstrauss lemma (JL lemma) for reducing data dimensionality. "We have proven that there are 'hard' datasets for which dimensionality reduction beyond what's provided by the JL lemma is impossible," Nelson says. The JL lemma demonstrates that for any finite collection of points in high dimension, there is a collection of points in a lower dimension that approximately preserves all distances between the points. Scientists determined the theorem can act as a preprocessing step, reducing data dimensionality before running algorithms while retaining the geometry of the data and the angles between data points. Professor Noga Alon of Tel Aviv University in Israel says Nelson and Larsen's work addresses "a logarithmic gap...between the upper and lower bounds for the minimum possible dimension required as a function of the number of points and the distortion allowed."
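One standard construction achieving the JL guarantee is a Gaussian random projection. The sketch below is illustrative (not Nelson and Larsen's code): it maps n points from 10,000 dimensions down to k on the order of log(n)/eps^2 dimensions, and checks that a pairwise distance survives to within the allowed distortion.

```python
import numpy as np

def jl_project(points, eps=0.5, seed=0):
    """Project points (n x d) to k = O(log n / eps^2) dimensions.

    A Gaussian random matrix is one classic construction meeting the
    JL lemma's bound: with high probability, every pairwise distance
    is preserved up to a (1 +/- eps) factor.
    """
    n, d = points.shape
    k = int(np.ceil(4 * np.log(n) / eps**2))  # common textbook constant
    rng = np.random.default_rng(seed)
    proj = rng.normal(0.0, 1.0 / np.sqrt(k), size=(d, k))
    return points @ proj

# 100 points in 10,000 dimensions collapse to just 74 dimensions
x = np.random.default_rng(1).normal(size=(100, 10_000))
y = jl_project(x, eps=0.5)

# Distortion of one pairwise distance: the ratio is typically near 1
orig = np.linalg.norm(x[0] - x[1])
new = np.linalg.norm(y[0] - y[1])
print(new / orig)
```

Nelson and Larsen's result says that for some point sets no mapping, random or otherwise, can do better than this k dependence on n and eps, so the projection above is essentially as small as dimensionality reduction can get in the worst case.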

Full Article
Andrew Ng Has a Chatbot That Can Help With Depression
Technology Review
Will Knight
October 18, 2017


Stanford University professor Andrew Ng is supporting the development of a Facebook chatbot called Woebot, which is designed to offer interactive cognitive behavior therapy to people suffering from depression. Woebot inventor Alison Darcy says it is possible to automate such therapy since it follows a series of steps for detecting and addressing unhelpful ways of thinking, while natural-language processing technologies have boosted the utility of chatbots within limited domains. The system provides a guided conversational interface, and Woebot checks in with patients daily, directing them through therapeutic steps. Tests of Woebot prototypes found the chatbot reduced the symptoms of depression in volunteers over two weeks. "If we can take a little bit of the insight and empathy [of a real therapist] and deliver that, at scale, in a chatbot, we could help millions of people," Ng says. He expects Woebot to become a more effective tool once better methods for parsing the meaning of language are developed.

Full Article
*May Require Free Registration
Physics Boosts Artificial Intelligence Methods
Caltech News
Mark H. Kim
October 18, 2017


Researchers at the California Institute of Technology (Caltech) and the University of Southern California (USC) say they have developed the first-ever application of quantum computing to a physics problem. The team says they used quantum-compatible machine-learning methods to develop a way to extract a Higgs boson signal from noisy data. The researchers say they programmed a quantum annealer to filter error-ridden particle-measurement data. The quantum program looks for patterns within a dataset to distinguish meaningful data from noise, and USC's Joshua Job describes it as "a simple machine-learning model that achieves a result comparable to more complicated models without losing robustness or interpretability." Job also says most high-energy physics research entails scientists studying small packets of data to find out which are interesting. Caltech professor Maria Spiropulu notes the new quantum program "is simpler, takes very little training data, and could even be faster. We obtained that by including the excited states."
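The general recipe is to cast classifier selection as an optimization problem an annealer can minimize. The toy below is a hedged illustration, not the team's actual formulation (the QUBO construction, data, and brute-force solver are all stand-ins): binary weights choose which weak classifiers join an ensemble so as to maximize correlation with the true signal labels while penalizing redundant classifiers.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n_events, n_weak = 200, 6
labels = rng.choice([-1, 1], size=n_events)            # signal vs. background
# Toy weak classifiers: each agrees with the truth 70% of the time
weak = np.array([labels * rng.choice([1, -1], size=n_events, p=[0.7, 0.3])
                 for _ in range(n_weak)])

Ci = weak @ labels / n_events        # each classifier's correlation with truth
Cij = weak @ weak.T / n_events       # pairwise classifier correlations
np.fill_diagonal(Cij, 0.0)

def energy(w):
    """QUBO objective: reward accuracy, penalize redundant classifiers."""
    w = np.asarray(w)
    return 0.5 * w @ Cij @ w - Ci @ w

# A quantum annealer would search for the minimum; for 6 bits, brute force works
best = np.array(min(itertools.product([0, 1], repeat=n_weak), key=energy))
strong = np.sign(weak.T @ best)      # vote of the selected classifiers
accuracy = np.mean(strong == labels)
```

The strong ensemble beats any individual 70%-accurate classifier on this toy data, which mirrors the article's point: a simple, interpretable model selected by annealing can rival more complicated ones.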

Full Article

USU Professor Hopes 'BeePi' Hive Sensors Will Help Honeybees
The Herald Journal (UT)
John Zsiray
October 17, 2017


Researchers at Utah State University (USU) have developed BeePi, a computerized system based on the Raspberry Pi platform that uses sensors, microphones, and other hardware to monitor the activity inside beehives. "If you can think about the beehive as an intelligent, immobile robot, essentially monitoring the bee colony inside and letting all interested parties know of deviations from the norm, then it'll cut down on hive inspections and transportation costs immensely," says USU professor Vladimir Kulyukin. BeePi is powered by a USB battery system supporting nearly 48 hours of continuous runtime. Kulyukin is using the system to monitor four hives, each of which he checks every two days to change the batteries and download 20 MB of accumulated data. He says BeePi's still-photo monitoring system can approximate the accuracy of human bee counting to about 90 percent, depending on the weather. Microphones also record bees' audio patterns, and a computer converts that data into information about the colony's health.

Full Article
University Labs Put Cybersecurity Under the Microscope
Government Technology
Eyragon Eidam
November 1, 2017


In an interview, academic cybersecurity experts discuss the direction of their research, with the University of Nebraska at Omaha's (UNO) Deepak Khazanchi citing the Internet of Things (IoT), whose scale and complexity "act as a challenge for security in the future." He says UNO aims to embed security assurance within hardware and software via "assurance-based software engineering," or designing systems aware of compliance regulations and that ensure accountability for new software before deployment. IoT security also is a focus area for Syracuse University professor Shiu-Kai Chin, who supports the inclusion of cybersecurity in both the design process and the organizational culture. Meanwhile, Florida State University professor Xiuwen Liu says securing the Internet and the IoT is possible, but only if openness is sacrificed. Liu thinks many people's lack of awareness of broad online accessibility "is probably a bigger problem in terms of securing the Internet," which he says only a culture of vigilance and awareness can address.

Full Article
ACM Job Alert
 
ACM Online Books and Courses
 

Association for Computing Machinery

2 Penn Plaza, Suite 701
New York, NY 10121-0701
1-800-342-6626
(U.S./Canada)



ACM Media Sales

If you are interested in advertising in ACM TechNews or other ACM publications, please contact ACM Media Sales at (212) 626-0686, or visit ACM Media for more information.

To submit feedback about ACM TechNews, contact: [email protected]