Welcome to the January 12, 2015 edition of ACM TechNews, providing timely information for IT professionals three times a week.
Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets, iPhones, and iPads.
HEADLINES AT A GLANCE
Scientists and Investors Warn on AI
Financial Times (01/12/15) Tim Bradshaw
Dozens of scientists, investors, and other influential individuals have signed an open letter from the recently founded Future of Life Institute (FLI) suggesting priorities for "robust and beneficial" research and development in the field of artificial intelligence (AI). The letter and the FLI are a response to increasingly vocal nervousness about the potential impact that AI could have on humanity. Elon Musk, a signatory to the FLI letter and a member of the group's science advisory committee, recently said he worries AI could pose a greater existential threat to humanity than nuclear weapons. In its letter, the FLI is careful to balance fear and optimism. The group recognizes the benefits of AI technologies are potentially huge and could include "the eradication of disease and poverty." However, the letter says research into AI needs to have a greater focus on the possible ramifications of the technology, from increasing unemployment to potentially threatening the human race. In addition to Musk, the letter has attracted the signatures of other luminaries, including astrophysicist Stephen Hawking, Nobel laureate Frank Wilczek, Machine Intelligence Research Institute director Luke Muehlhauser, and the founders of AI companies DeepMind and Vicarious.
FBI Is Broadening Surveillance Role, Report Shows
The New York Times (01/11/15) Charlie Savage
The New York Times has obtained an unclassified version of a 2012 report from the U.S. Justice Department's inspector general outlining the role played by the Federal Bureau of Investigation (FBI) in the government's warrantless wiretapping program. Although that program is most commonly associated with the U.S. National Security Agency (NSA), the report demonstrates that since the passage of the FISA Amendments Act of 2008, the FBI has played an ever-larger role in the program. For example, in 2008 the FBI assumed the power to review the email accounts NSA wanted monitored through its Prism system, and in October 2009 the bureau began retaining copies of unprocessed communications gathered through the warrantless wiretapping program. By 2012, the bureau was nominating new email accounts for collection through NSA's upstream system. The report also includes previously unknown details of how a series of court rulings shaped the warrantless wiretapping program prior to the passage of the 2008 FISA amendments. The report suggests a May 31, 2007 order issued by Judge Roger Vinson constrained the scope of the program and resulted in a significant drop in the number of so-called "foreign selectors" NSA was able to track under the program, helping to spur the passage of the FISA amendments.
Quantum Hard Drive Breakthrough
Australian National University (01/08/15) Phil Dooley
A solid-state technique has enabled physicists from Australian National University (ANU) and the University of Otago to develop a prototype quantum hard drive that achieved a record storage time of six hours. The team stored quantum information in atoms of the rare earth element europium embedded in a crystal. The researchers view crystals as portable optical hard drives for quantum entanglement, according to ANU's Manjin Zhong. The team used laser light to write a quantum state onto the nuclear spin of the europium crystal and then subjected it to a combination of fixed and oscillating magnetic fields to preserve the fragile quantum information. The researchers say the boost in storage time by a factor of more than 100 is a major step for those who envision a secure worldwide data encryption network based on quantum information, and they note the network could be used for banking transactions and personal emails. "We believe it will soon be possible to distribute quantum information between any two points on the globe," Zhong says.
Nissan, NASA Team Up for Self-Driving Car Tech
PC Magazine (01/09/15) Stephanie Mlot
Nissan and the U.S. National Aeronautics and Space Administration (NASA) have announced a five-year research and development partnership that aims to produce an autonomous vehicle. The researchers will focus on autonomous drive systems, human-machine interface solutions, network-enabled applications, and software analysis and verification. "The partnership will accelerate Nissan's development of safe, secure, and reliable autonomous drive technology that we will progressively introduce to consumers beginning in 2016 up to 2020," says Nissan CEO Carlos Ghosn. By 2020, Nissan hopes to introduce self-driving cars that can navigate in nearly all situations. NASA will benefit from Nissan's expertise in component technologies and research on the development of vehicular transport applications. "All of our potential topics of research collaboration with Nissan are areas in which Ames has strongly contributed to major NASA programs," says Ames Research Center director S. Pete Worden. He notes Ames developed the Mars rover's planning software and helped put robots onboard the International Space Station. "We look forward to applying knowledge developed during this partnership toward future space and aeronautics endeavors," Worden says.
The 8080 Chip at 40: What's Next for the Mighty Microprocessor?
Computerworld (01/08/15) Lamont Wood
The Intel 8080 microprocessor, introduced in 1974, gave rise to the personal computer industry, and the descendants of that groundbreaking chip promise to lead the way to another 40 years of computer technology evolution. "The last four decades were about creating the technical environment, while the next four will be about merging the human and the digital domains, merging the decision-making of the human being with the number-crunching of a machine," predicts industry analyst Rob Enderle. Such merging is expected to involve people learning how to control machines via direct brain interaction, says Lee Felsenstein, who helped design early portable computers. He believes learning a computer/brain interface will be an interactive process starting in middle school and initially using toy-like systems. "A synthesis of people and machines will come out of it, and the results will not be governed by the machines nor by the designers of the machines," Felsenstein says. Retired Intel chip designer Stan Mazor predicts, "When computers can see, we will have a large leap forward in compelling computer applications. Although typical multiprocessors working on a single task saturate at around 16 [central-processing units (CPUs)], if a task can be partitioned, then we might see 100,000 CPUs on a chip."
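Mazor's saturation point is the intuition behind Amdahl's law: if any fraction of a task is serial, adding processors yields diminishing returns. A rough sketch (not from the article; the 95-percent figure is an illustrative assumption) shows why a single unpartitioned task stalls while a fully partitionable one keeps scaling:

```python
# Amdahl's law: speedup from n CPUs when only a fraction p of a task
# can run in parallel. Illustrative only -- the 0.95 parallel fraction
# is an assumed example value, not a figure from the article.

def amdahl_speedup(p, n):
    """Speedup with n processors when fraction p of the work is parallel."""
    return 1.0 / ((1.0 - p) + p / n)

# With 95% of the work parallelizable, the serial 5% caps speedup
# near 20x no matter how many CPUs are added.
for n in (16, 100, 100_000):
    print(n, round(amdahl_speedup(0.95, n), 2))
```

A perfectly partitionable task (p = 1.0) scales linearly, which is the case where 100,000 CPUs on a chip would pay off.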
How Reverse-Engineering the Brain Could Help Machines Learn
NextGov.com (01/08/15) Frank Konkel
The research arm of the U.S. intelligence community is interested in improving the way supercomputers and high-end machines learn. The U.S. Intelligence Advanced Research Projects Activity (IARPA) has announced a new research and development program called Machine Intelligence from Cortical Networks (MICrONS), which seeks to reverse-engineer algorithms brains use. The aim is to "achieve a quantum leap in machine learning that uses neurally-inspired architectures and mathematical abstractions of the representations, transformations, and learning rules employed by the brain," according to IARPA. The goal is human-like proficiency in processing tasks such as one-shot learning, unsupervised clustering, and scene parsing. IARPA envisions an algorithm-driven machine that could potentially collect data sets relevant to its mission of its own volition, and the group will accept proposals from companies for the five-year program. Companies are expected to use neuroscience data to improve cortical computation and refine algorithms for smarter machines.
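Of the tasks MICrONS names, "unsupervised clustering" means grouping unlabeled data by similarity with no human-provided categories. A minimal sketch (this is a textbook 1-D k-means, not IARPA's method) shows the idea:

```python
# Minimal 1-D k-means: an illustration of unsupervised clustering,
# one of the brain-like tasks MICrONS targets. Not IARPA's algorithm.

def kmeans_1d(points, centers, iters=10):
    """Cluster 1-D points around the given initial centers; returns final centers."""
    for _ in range(iters):
        # Assignment step: attach each point to its nearest center.
        groups = {c: [] for c in centers}
        for p in points:
            nearest = min(centers, key=lambda c: abs(p - c))
            groups[nearest].append(p)
        # Update step: move each center to the mean of its group.
        centers = [sum(g) / len(g) if g else c for c, g in groups.items()]
    return sorted(centers)

data = [1.0, 1.2, 0.8, 9.9, 10.1, 10.0]
print(kmeans_1d(data, [0.0, 5.0]))  # two clusters emerge near 1 and 10
```

No labels were supplied; the structure (two clusters) is discovered from the data alone, which is what distinguishes this family of methods from supervised learning.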
Toward Quantum Chips
MIT News (01/09/15) Larry Hardesty
Massachusetts Institute of Technology researchers have built an array of light detectors sensitive enough to register the arrival of individual photons, and mounted them on a silicon optical chip. The researchers note the arrays are important components of devices that use photons to perform quantum computations. They developed a procedure for fabricating and testing the detectors separately and then transferring those that work to an optical chip built using standard manufacturing processes. The researchers say their new approach can yield much denser and larger arrays, in addition to increasing the detectors' sensitivity. During testing, the researchers found the detectors were up to 100 times more likely to accurately register the arrival of a single photon than those found in earlier arrays. The process begins with a silicon optical chip made using conventional manufacturing techniques; then, on a separate silicon chip, the researchers grow a thin, flexible film of silicon nitride, upon which they deposit the superconductor niobium nitride in a pattern useful for photon detection, and deposit gold electrodes at both ends of the resulting detector. The researchers then attach a droplet of polydimethylsiloxane to one end of the silicon nitride film, and press a tungsten probe against the silicon.
New Research Indicates Cybersecurity Skills Shortage Will Be a Big Problem in 2015
Network World (01/08/15) Jon Oltsik
The lack of enough cybersecurity professionals could be a major problem for many organizations in 2015, according to new research by the Massachusetts-based Enterprise Strategy Group (ESG). ESG asked 591 information technology (IT) and security professionals whether their organizations planned to add employees in 2015 and found that half of all responding organizations plan to add a significant or small number of new IT staff positions this year. The study found 43 percent of respondents plan to add positions in information security, 34 percent in IT architecture and planning, and 34 percent in server virtualization and private cloud infrastructure. ESG also asked which areas lack existing skills: 28 percent of respondents think their organization has a problematic shortage of information security skills, 23 percent cite IT architecture and planning skills, and 22 percent cite mobile device management skills. The annual survey has found a problematic shortage of information security skills for four consecutive years and indicates the shortage is worsening.
A Bendable Implant Taps the Nervous System Without Damaging It
Technology Review (01/08/15) Antonio Regalado
Researchers at the Swiss Federal Institute of Technology have developed a soft, flexible electronic implant with the same ability to bend and stretch as dura mater, the membrane that surrounds the brain and spinal cord. Gregoire Courtine teamed with electrical engineer Stephanie Lacour to create the implant, dubbed e-dura, which is made from silicone, gold wires, and electrodes embedded with platinum. Lacour notes if an implant is stiff, it will not stretch as the spinal cord does. "It slides against the tissue and causes a lot of inflammation," she says. "When you bend over to tie your shoelaces, the spinal cord stretches by several percent." The e-dura implant mimics a property of human tissue called viscoelasticity, a consistency between that of rubber and a thick fluid. The researchers report they could overcome spinal injury in rats by wrapping the flexible implant around the spinal cord and sending electrical signals to make the rodent's hind legs move. They also pumped in chemicals via a microchannel in the implant to enhance the process. After two months, they saw few signs of tissue damage compared to conventional electrodes. Eventually, the researchers say this type of soft electronic implant may help restore a paralyzed person's ability to walk.
Team Creates Automated Method to Assemble Story-Driven Photo Albums
Disney researchers have developed a method for selecting photos based on quality and relevance and ordering them in a way that makes narrative sense. Previous efforts on automated album creation have relied on arranging photos based mostly on chronology and geo-tagging. However, the Disney researchers found when people were asked to choose and assemble five photo albums that told a story, the individuals took photos out of chronological order about 40 percent of the time. Preference testing using Mechanical Turk found people preferred these story-driven albums over those chosen randomly or those based on chronology. The researchers developed a model that creates albums based on a variety of photo features, including the presence or absence of faces and their spatial layout, overall scene textures and colors, and the aesthetic quality of each image. The model also incorporated learned rules for how albums are assembled, such as preferences for certain types of photos to be placed at the beginning, in the middle, and at the end of albums. The researchers used a machine-learning algorithm to teach the model how people use those features and what rules they use to assemble photo albums.
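The combination of per-photo features and position preferences can be sketched as a scoring problem. This is a hedged toy illustration, not Disney's model: the feature names, weights, and greedy slot-filling below are invented for exposition, whereas the actual system learns its weights and rules from human-assembled albums.

```python
# Toy sketch of feature-plus-position album assembly. All features and
# weights here are hypothetical; the real model learns them from data.

photos = [
    {"id": "closeup",   "quality": 0.9, "has_faces": True,  "is_wide": False},
    {"id": "landscape", "quality": 0.7, "has_faces": False, "is_wide": True},
    {"id": "group",     "quality": 0.8, "has_faces": True,  "is_wide": False},
]

def score(photo, position):
    """Combine image quality with position preferences for an album slot."""
    s = photo["quality"]
    if position == "start" and photo["is_wide"]:
        s += 0.5   # wide establishing shots favored at the beginning
    if position == "end" and photo["has_faces"]:
        s += 0.3   # face shots favored as a closing image
    return s

# Greedy assembly: pick the highest-scoring photo for each slot in turn.
album, remaining = [], photos[:]
for slot in ("start", "middle", "end"):
    best = max(remaining, key=lambda p: score(p, slot))
    album.append(best["id"])
    remaining.remove(best)
print(album)  # the wide shot opens, the group shot closes
```

Note the ordering departs from the list's original order, mirroring the finding that human-made albums are not strictly chronological.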
Simulations Aimed at Safer Transport of Explosives
Argonne National Laboratory (01/07/15) Jim Collins
University of Utah researchers are using supercomputing resources at the Argonne Leadership Computing Facility (ALCF) to simulate a 2005 accident in which a semi truck hauling 35,000 pounds of explosives through the Spanish Fork Canyon in Utah crashed and caught fire, causing an explosion that left a 30-by-70-foot crater in the highway. The researchers, led by University of Utah professor Martin Berzins, are performing large-scale three-dimensional simulations on Mira, ALCF's 10-petaflops IBM Blue Gene/Q supercomputer, to study the physical mechanisms that led to deflagration-to-detonation transition (DDT), the process that led to the accident. "The main focus of our simulations is to determine why the fire escalated to detonation," says researcher Jacqueline Beckvermit. The researchers spent more than two years and more than 100 million computing hours to successfully simulate the detonation. The simulations were a challenge due to the complex nature of DDT, which involves several strongly correlated processes, such as chemical kinetics, pressure waves, and turbulence, all occurring in multiple spatial and temporal scales. "The ultimate goal of our project is to propose ideas on how to package explosives for transport to make sure accidents like this don't happen any more," Beckvermit says.
No Need to Panic--Artificial Intelligence Has Yet to Create a Doomsday Machine
The Conversation (01/06/15) Tony Prescott
Last year saw several major figures in science and industry raising concerns about artificial intelligence (AI), chief among them physicist Stephen Hawking, who warned AI could surpass human intelligence, potentially leading to the end of humankind. However, University of Sheffield professor Tony Prescott says the day people create such dangerous AI is far off. Numerous tests have been proposed for gauging the progression of computer intelligence, from computing pioneer Alan Turing's Turing Test to the more exacting Lovelace 2.0 test announced last year. The past year also saw a computer "pass" the Turing Test, but Prescott says the winning chatbot, which purported to be a 13-year-old who spoke English as a second language, was more a clever gimmick than a true evolution in computer intelligence. Prescott believes for the foreseeable future AI's intelligence capabilities will be entirely defined by the humans creating and working with it, and "it will be a long time before these tests will demonstrate anything other than how far machine intelligence still has to go before we will have made our match."
Abstract News © Copyright 2015 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.