Welcome to the March 17, 2017 edition of ACM TechNews, providing timely information for IT professionals three times a week.

ACM TechNews mobile apps are available for Android phones and tablets, as well as for iPhones and iPads.

Robot Eavesdrops on Men and Women to See How Much They Talk
New Scientist
Timothy Revell
March 16, 2017


Researchers at the KTH Royal Institute of Technology in Sweden have developed Furhat, a robotic head that analyzes how people interact with each other, with the goal of uncovering inequalities in participation when people work on a shared activity. The researchers studied 540 people over nine days and found pairs of women spoke for 45 percent of the time on average, compared with only 26 percent of the time for pairs of men. When a woman was paired with a man, the pair spoke for 28 percent of the time, with the two sharing it about equally. The pattern only holds for adults; gender was not found to make much difference in how much children interact. The researchers also found that when Furhat directly addressed the less-dominant speaker, that person was more likely to speak up. The research was presented this month at the 2017 Conference on Human-Robot Interaction (HRI 2017) in Vienna, Austria.

Full Article

AncestryAI Algorithm Traces Your Family Tree Back More Than 300 Years
Aalto University
March 16, 2017


Researchers at Aalto University in Finland have developed AncestryAI, an artificial intelligence (AI) algorithm that looks for family connections among 5 million baptism records dating from the end of the 17th century to the beginning of the 20th century. AncestryAI, which is part of the HisKi project, collects parish data on baptisms, marriages, and relocations, then automatically searches for a child's most probable parents and builds family trees from this information. The algorithm offers several candidate matches, ranked by the parents' date and place of birth and by name similarity. In addition, users can leave comments on the system's accuracy, which the algorithm uses to improve its analysis. "It would be really interesting to have a family tree covering the whole of Finland, because it could also be used to study wars, epidemics, the influence of and changes in the class society," says Aalto University's Eric Malmi.
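
The release does not describe the matching model in detail, but the ranking step can be sketched. Below is a minimal, hypothetical Python scoring function; the record fields, weights, and age-gap rule are invented for illustration and are not the project's actual model.

    # Toy sketch of parent-candidate scoring in the spirit of AncestryAI.
    # All fields and weights here are illustrative assumptions.
    from difflib import SequenceMatcher

    def name_similarity(a, b):
        """Crude string similarity in [0, 1] between two surnames."""
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    def parent_score(child, candidate):
        """Score how plausible `candidate` is as a parent of `child`."""
        score = name_similarity(child["surname"], candidate["surname"])
        # Parents are typically 15-50 years older than the child.
        age_gap = child["birth_year"] - candidate["birth_year"]
        score += 1.0 if 15 <= age_gap <= 50 else -1.0
        # Same parish is weak evidence of a family link.
        if child["parish"] == candidate["parish"]:
            score += 0.5
        return score

    child = {"surname": "Mäkinen", "birth_year": 1822, "parish": "Turku"}
    candidates = [
        {"surname": "Makinen", "birth_year": 1795, "parish": "Turku"},
        {"surname": "Virtanen", "birth_year": 1801, "parish": "Oulu"},
    ]
    best = max(candidates, key=lambda c: parent_score(child, c))
    print(best["surname"])  # most probable parent under this toy model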

Full Article
Why We Need Exascale Computing
The Huffington Post
Paul Messina
March 15, 2017


Potentially everyone on Earth could benefit from exascale computing, writes Paul Messina, director of the Argonne National Laboratory's Exascale Project. "Many planned uses of exascale computing address fundamental scientific questions in fields such as high-energy physics and chemistry," he says. Among the purported benefits are greater combustion engine and gas turbine efficiency, which would mean lower greenhouse gas emissions. Other benefits Messina cites include accelerating cancer research by automating data analysis and drug-response modeling, improving the management of the electric power grid, and using exascale computing to boost urban science and quality of life. In addition, Messina envisions faster advances in materials science via complex behavioral simulations and the use of vast databases, as well as more accurate severe weather forecasting. "In addition to benefits to society, key technological advances also hold the promise of maintaining economic security," he notes.

Full Article

How to Counterfeit Quantum Money
CORDIS News
March 16, 2017


Researchers in Poland and the Czech Republic have theoretically shown that ultrasecure currency designed using quantum mechanics can be forged by exploiting a serious security flaw. The quantum money was minted photonically, as a series of photons transmitted to a bank with information encoded in their polarizations. Criminals intercepting the photons would find perfect counterfeiting impossible, because quantum states cannot be copied exactly (the no-cloning theorem). However, because individual photons can be lost or distorted in transmission, banks must accept partial quantum bills, which gives crooks an opening to make imperfect forgeries that are still similar enough for banks to verify. Using an optimal cloner, the researchers demonstrated a bank would accept forged quantum currency if its standard for accuracy was not sufficiently high. They say an effective acceptance standard would require the received photons' polarizations to be more than approximately 84 percent identical to the original.
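
For scale, a hedged back-of-the-envelope drawn from the standard cloning bounds (our numbers, not necessarily the paper's exact ones): the optimal universal 1-to-2 qubit cloner is known to achieve fidelity

    % Buzek-Hillery bound for universal qubit cloning:
    \[ F_{\mathrm{clone}} = \frac{5}{6} \approx 0.833 \]

so a bank that accepts a bill only when the measured polarizations match the mint's records at a rate

    \[ F_{\min} \gtrsim 0.84 > \frac{5}{6} \]

keeps even optimally cloned forgeries just below the acceptance bar, consistent with the roughly 84 percent figure above.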

Full Article

How Darwin Evolved: 25,540 Paper Fragments Tell the Story
The New York Times
Constance Gustke
March 13, 2017


Researchers participating in the American Museum of Natural History's Darwin Manuscripts Project are using superfast computers to stitch together 25,540 digitized and mostly transcribed fragments of notes handwritten by Charles Darwin. The computers match up the fragments' ragged edges, with the goal of reassembling the raw notes behind Darwin's famous treatise on evolution. The researchers sort through the material using custom code and image processing to distinguish pages with rough, fuzzy, or sharp edges, then match the edge curves via shape analysis. The project should be finished by the end of this month, accomplishing in months what would take years to do by hand. Participating eBay software engineer Jin Chung says she analyzed the manuscripts using tools such as the Fourier transform, which decomposes a signal into its constituent frequencies. The project's data are being published in a publicly available open source repository on GitHub.
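
As a rough illustration of the edge-matching idea (not the project's actual pipeline), the following Python sketch compares synthetic tear profiles via low-frequency Fourier descriptors; the contour data and the matching rule are assumptions.

    # Sketch: compare two fragment edges via Fourier descriptors.
    # The contours are synthetic; the real project works on scans.
    import numpy as np

    def fourier_descriptor(edge, n_coeffs=8):
        """Low-frequency Fourier coefficients of an edge height profile.

        Keeping only low frequencies makes the signature robust to
        scanning noise while preserving the edge's overall shape."""
        spectrum = np.fft.rfft(edge - edge.mean())
        return spectrum[:n_coeffs]

    def edge_distance(a, b):
        """Smaller distance = more likely the two pieces once joined."""
        # A torn edge should mirror its counterpart, so flip one profile.
        return np.linalg.norm(fourier_descriptor(a) - fourier_descriptor(b[::-1]))

    rng = np.random.default_rng(0)
    tear = rng.normal(size=200).cumsum()                  # one ragged tear line
    top_piece = tear + rng.normal(0, 0.05, 200)           # noisy scan, one side
    bottom_piece = tear[::-1] + rng.normal(0, 0.05, 200)  # the matching side
    stranger = rng.normal(size=200).cumsum()              # edge from another page

    print(edge_distance(top_piece, bottom_piece))  # small: likely match
    print(edge_distance(top_piece, stranger))      # large: unrelated edge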

Full Article
DARPA Plan Would Reinvent Not-So-Clever Machine Learning Systems
Network World
Michael Cooney
March 16, 2017


The U.S. Defense Advanced Research Projects Agency's (DARPA) new Lifelong Learning Machines (L2M) program seeks to develop next-generation machine learning systems that learn from new situations and apply that knowledge to surpass the performance and reliability of current systems. DARPA says existing artificial intelligence can only function in orchestrated, specific settings, and requires an extensive training dataset that painstakingly describes the conditions the system will encounter at execution time. One focus area of the L2M program will explore algorithms, theoretical simulation, analysis, software, and architectures that deploy new continuous-learning strategies and instill robustness and safety by restricting system behaviors, enabling users to monitor system behavior and evolution and intervene when necessary. Another focus area will investigate how living systems learn and adapt, and how those principles and methods can be applied to machine learning algorithms; integration with biology and other natural domains will likely be a necessary component of the research.

Full Article
Researchers Present Early Warning System for Mass Cyberattacks
Saarland University
March 15, 2017


Researchers at the Competence Center for IT Security (CISPA) of Saarland University in Germany, along with colleagues from Japan, have developed an early warning system for massive distributed denial-of-service (DDoS) attacks. The researchers built honeypot traps and deployed 21 of them in obscure corners of the Internet, which enabled them to document more than 1.5 million attacks. "What makes this so insidious is that the attackers achieve maximum damage with very little effort," notes Saarland professor Christian Rossow. The data helped the researchers identify the distinct phases of an attack, which in turn enabled them to build the early warning system. In addition, the researchers say they attached secret digital markers to the attack code in order to trace the attacks back to their source. "This is quite impressive, because these address counterfeiters usually remain hidden by default," Rossow says.
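
The release gives no implementation details, but the basic idea of a honeypot for amplification-style attacks can be sketched. The Python below is a deliberately minimal, hypothetical listener; the port choice and log format are invented, and the real CISPA honeypots are far more elaborate.

    # Minimal UDP honeypot sketch: log every request aimed at a port
    # that amplification attacks commonly abuse. Purely illustrative.
    import socket
    from datetime import datetime, timezone

    PORT = 19  # chargen, a classic amplification vector (choice is illustrative;
               # binding to ports below 1024 typically needs administrator rights)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", PORT))
    print(f"listening on udp/{PORT}")

    while True:
        data, (src_ip, src_port) = sock.recvfrom(4096)
        # In an amplification attack the source address is spoofed to point
        # at the victim, so src_ip is the likely target, not the attacker.
        stamp = datetime.now(timezone.utc).isoformat()
        print(f"{stamp} {src_ip}:{src_port} sent {len(data)} bytes")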

Full Article
Hackers Can See What YouTube Videos You Watch
Ben-Gurion University of the Negev (Israel)
Andrew Lavin
March 15, 2017


Researchers at Ben-Gurion University of the Negev (BGU) in Israel say they have developed a machine learning algorithm that can identify which YouTube videos users watched--within a predetermined set of videos--with a high degree of accuracy. The algorithm is based on a study of how video services work, how video content is encoded, and how a video player requests information to play it. The researchers found the algorithm can ascertain whether a user watched a specific video from a set of suspicious, terror-related videos. They say intelligence agencies could use the technology to track terrorists or other suspicious individuals. In addition, Internet marketing companies could use it to track the number and demographics of viewers watching an ad. Although this information could be helpful, BGU researcher Ran Dubin notes YouTube users should be aware their viewing histories can be monitored.
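
As a rough sketch of traffic fingerprinting in general (not BGU's actual features or model), the following Python matches an observed encrypted stream against a small library of known videos using per-second downstream byte counts; all traces here are synthetic.

    # Sketch: which known video does an encrypted stream most resemble?
    import numpy as np

    def fingerprint(trace):
        """Normalize a per-second byte-count trace so the pattern,
        not the link speed, drives the comparison."""
        t = np.asarray(trace, dtype=float)
        return t / (t.sum() or 1.0)

    # Reference traces for a predetermined set of videos (synthetic).
    rng = np.random.default_rng(1)
    library = {f"video_{i}": rng.integers(1_000, 90_000, 60) for i in range(5)}

    # An observed stream: video_3's pattern plus network jitter.
    observed = library["video_3"] + rng.integers(0, 3_000, 60)

    best = min(
        library,
        key=lambda v: np.linalg.norm(fingerprint(library[v]) - fingerprint(observed)),
    )
    print(best)  # -> video_3 under this toy model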

Full Article
Security for Multirobot Systems
MIT News
Larry Hardesty
March 16, 2017


Researchers at the Massachusetts Institute of Technology's (MIT) Computer Science and Artificial Intelligence Laboratory and their colleagues have developed a new method to prevent the hijacking of robot teams' communication networks. The technique analyzes the different ways in which robots' wireless signals interact with their surroundings and assigns each transmitter a unique radio "fingerprint"; multiple votes arriving from the same transmitter are then flagged as probably bogus. The researchers also developed a theoretical analysis comparing the results of a common coverage algorithm under routine conditions with those produced while the new system is actively foiling a spoofing attack. Using this method, the robots' positions stay within three centimeters of where they should be even when 75 percent of the robots have been compromised. The researchers tested their theory by deploying an array of distributed Wi-Fi antennas and an autonomous helicopter.
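
MIT's fingerprints come from physical measurements of the wireless channel; the voting logic they protect can be sketched abstractly. In the hypothetical Python below, identities sharing one radio fingerprint split a single unit of voting influence, so a spoofer gains nothing by claiming extra robots.

    # Sketch of fingerprint-weighted voting. The fingerprints and votes
    # are invented; MIT derives fingerprints from the physical channel.
    from collections import Counter, defaultdict

    votes = [  # (claimed_robot_id, radio_fingerprint, vote)
        ("r1", "fp_a", "north"),
        ("r2", "fp_b", "north"),
        ("r3", "fp_c", "south"),  # attacker...
        ("r4", "fp_c", "south"),  # ...spoofing extra identities
        ("r5", "fp_c", "south"),
    ]

    by_fingerprint = defaultdict(list)
    for _, fp, vote in votes:
        by_fingerprint[fp].append(vote)

    tally = Counter()
    for fp, fp_votes in by_fingerprint.items():
        # Each distinct transmitter gets one unit of influence, split
        # across however many identities it claims.
        for v in fp_votes:
            tally[v] += 1.0 / len(fp_votes)

    print(tally.most_common(1)[0][0])  # "north": spoofed majority defeated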

Full Article
What If Quantum Computers Used Hard Drives Made of DNA?
Wired
Sophia Chen
March 15, 2017


A key challenge of creating a working quantum computer is the inability to save or copy data, so physicists are investigating DNA molecules as super-compact quantum hard drives. A group of researchers has demonstrated a technique for storing 215 petabytes, or 215 million gigabytes, in a single gram of DNA, a density made possible by the molecule's three-dimensional structure. In addition, "DNA can store information for a very long time," notes Columbia University's Yaniv Erlich, who received the ACM-IEEE CS George Michael Memorial HPC Award for 2008. However, the method's drawbacks include the high cost of DNA synthesis and the long time it takes to read out DNA-coded information. Another method, unveiled last week by IBM researchers, encodes single bits of data in individual atoms, which can be successfully read back. Other challenges to DNA storage include developing algorithms to compress and convert quantum data into binary code, as well as designing the hardware needed to execute those algorithms.
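
The arithmetic behind such densities rests on the textbook two-bits-per-base mapping, sketched below in Python; real schemes (such as Erlich's DNA Fountain) add redundancy and avoid error-prone base runs, so this shows only the raw encoding idea.

    # Sketch: map binary data onto the four DNA bases and back.
    BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
    BASE_TO_BITS = {b: bits for bits, b in BITS_TO_BASE.items()}

    def encode(data: bytes) -> str:
        bits = "".join(f"{byte:08b}" for byte in data)
        return "".join(BITS_TO_BASE[bits[i:i+2]] for i in range(0, len(bits), 2))

    def decode(strand: str) -> bytes:
        bits = "".join(BASE_TO_BITS[base] for base in strand)
        return bytes(int(bits[i:i+8], 2) for i in range(0, len(bits), 8))

    strand = encode(b"Darwin")
    print(strand)                   # four bases per input byte
    assert decode(strand) == b"Darwin"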

Full Article
As Moore's Law Nears Its Physical Limits, a New Generation of Brain-Like Computers Comes of Age in a Stanford Lab
Stanford News
Nathan Collins
March 13, 2017


Stanford University professor Kwabena Boahen sees next-generation neuromorphic computers as a solution to the approaching physical limits of Moore's Law. Boahen and colleagues have drafted a prospectus for building these brain-emulating computers, outlining a mixed digital-analog framework in which transistors behave like neurons. The framework enables more energy-efficient operation and greater robustness when individual transistors fail. From there, the prospectus builds on neurons' hierarchical structure, distributed computation, and feedback loops to propose a vastly more energy-efficient, powerful, and robust neuromorphic system. Over the last three decades, Boahen's lab has deployed these concepts in actual devices, and Boahen expects computers implementing all of the framework's components to be designed and built within several years. He stresses neuromorphic computers will complement current computers, not replace them, and expects they will be used in embedded systems that require high power efficiency.
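
The mixed digital-analog point is commonly illustrated with the leaky integrate-and-fire neuron: membrane voltage accumulates input continuously (analog) while the spike itself is all-or-nothing (digital). A generic textbook sketch in Python follows; it is not Boahen's circuit, and the leak and threshold values are arbitrary.

    # Leaky integrate-and-fire neuron: analog accumulation, digital spikes.
    def lif_neuron(inputs, leak=0.9, threshold=1.0):
        v, spikes = 0.0, []
        for current in inputs:
            v = leak * v + current      # leaky analog accumulation
            if v >= threshold:          # digital, all-or-nothing event
                spikes.append(1)
                v = 0.0                 # reset after spiking
            else:
                spikes.append(0)
        return spikes

    print(lif_neuron([0.3, 0.4, 0.5, 0.1, 0.95]))  # -> [0, 0, 1, 0, 1]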

Full Article
Oh What Entangled Web We Weave
The Economist
March 9, 2017


Quantum networks could underpin undecryptable communications links, if the technology's shortcomings can be addressed. Quantum cryptography is seeing wider deployment thanks to advances that are securing more links, but the networks needed to connect senders and receivers are essential if the technology is to have a future. China last year completed work on a government-funded quantum network linking Shanghai and Beijing via the 50-node Jinan and 46-node Hefei metropolitan networks. Distance creates a security loophole, and possible remedies include developing quantum analogs of the repeater and teleporting quantum states. A second approach goes airborne, teleporting quantum states via satellite. Country-wide networks and quantum-enabled satellites could help realize a worldwide "quantum Internet" in which every link offers hack-proof security, and such advances also will be useful for transmitting information within, and between, future quantum-computing devices.

Full Article
Kurzweil Claims That the Singularity Will Happen by 2029
Futurism
Dom Galeon; Christianna Reedy
March 15, 2017


Google director of engineering and ACM Fellow Ray Kurzweil (who received the ACM Grace Murray Hopper Award for 1978) this week at the SXSW Conference in Austin, TX, predicted the technological singularity--the emergence of human-level computer intelligence--will arrive by 2029. Kurzweil also dismissed concerns that a single artificial intelligence will dominate the human race, and said he envisions the event as an opportunity for mankind to improve. Kurzweil notes implanting computers within human brains and linking them to the cloud already has begun, "and it's going to accelerate." He predicts a computer-neocortex connection will make people smarter, and he expects this will enable people "to exemplify all the things that we value in humans to a greater degree." Kurzweil also expects to see the invention of technology that can be implanted in the brain as a memory enhancement in the 2030s. "Ultimately, it will affect everything," he says. "We're going to be able to meet the physical needs of all humans."

Full Article

Association for Computing Machinery

2 Penn Plaza, Suite 701
New York, NY 10121-0701
1-800-342-6626
(U.S./Canada)



ACM Media Sales

If you are interested in advertising in ACM TechNews or other ACM publications, please contact ACM Media Sales at (212) 626-0686, or visit ACM Media for more information.

To submit feedback about ACM TechNews, contact: [email protected]