Association for Computing Machinery
Welcome to the January 6, 2017 edition of ACM TechNews, providing timely information for IT professionals three times a week.

ACM TechNews mobile apps are available for Android phones and tablets, as well as for iPhones and iPads.

HEADLINES AT A GLANCE

Humans Mourn Loss After Google Is Unmasked as China's Go Master
BigGIS: A Continuous Refinement Approach to Master Heterogeneity and Uncertainty in Spatio-Temporal Big Data
Green Is the New Black
Tech Trends in 2017: Energy Harvesting
Racing Robot Cars Will Help AI Learn to Adapt to the Real World
Man vs. AI Machine in Texas Hold'em Matchup
States Wire Up Roads as Cars Get Smarter
Move over Bitcoin--MIT Cryptographer Silvio Micali and His Public Ledger ALGORAND…the Future of Blockchain?
ELFI: Engine for Likelihood-Free Inference Facilitates More Effective Simulation
ZigBee's Dotdot Language Aims for IoT Harmony
Organic Dye-Based Film Boosts Data Storage Density
Quantum Computers Ready to Leap Out of the Lab in 2017
UCLA Mathematicians Bring Ocean to Life for Disney's 'Moana'


Humans Mourn Loss After Google Is Unmasked as China's Go Master
The Wall Street Journal (01/05/17) Eva Dou; Olivia Geng

An upgraded version of Google DeepMind's AlphaGo algorithm was unmasked Wednesday after beating Chinese Go champions in a follow-up to its March 2016 victory over South Korea's top player. Competing under the pseudonym "Master," the updated AlphaGo won 60 games without a loss over the course of a week before Google revealed its true identity. Master bewildered its human competitors by placing pieces in unconventional positions early on and shifting strategies between games. "AlphaGo has completely subverted the control and judgment of us Go players," notes Go player Gu Li. "I can't help but ask, one day many years later, when you find your previous awareness, cognition, and choices are all wrong, will you keep going along the wrong path or reject yourself?" The Master guise was adopted as a way to test the new AlphaGo anonymously, says DeepMind CEO Demis Hassabis. Sun Fuchun, a computer science professor at Tsinghua University in China, believes computers will eventually surpass humans in increasingly complex skills, and predicts they will continue to assume more roles currently performed by humans. "I believe AI will replace humans in many different ways, including medical treatment, the military, teachers, and even intimate relations," he says.
View Full Article - May Require Paid Subscription


BigGIS: A Continuous Refinement Approach to Master Heterogeneity and Uncertainty in Spatio-Temporal Big Data
CCC Blog (01/04/17) Helen Wright

Researchers led by Patrick Weiner at the Karlsruhe University of Applied Sciences in Germany authored a paper that won an award in the Computing Community Consortium-sponsored Blue Sky Ideas Track Competition at the ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems 2016 in San Francisco, CA. Their research aims to make geographic information systems (GIS) capable of deriving meaning from heterogeneous spatio-temporal data in unstructured and unreliable datasets via the BigGIS predictive and prescriptive analytics platform. The team proposes continuously refining BigGIS to steadily improve analysis results and to build users' trust in those results by generating awareness of underlying uncertainties and data provenance. "Our approach consists of an integrated analytics pipeline which blends big data analytics and semantic Web services on system-side with domain expert knowledge on human-side, thereby modeling uncertainty to continuously refine results to generate new knowledge," Weiner said. He noted the continuous refinement model offers a holistic strategy that accounts for all big data dimensions. Including the user in the process enables computers to learn from human analysis and uncover hidden links between the data and the problem domain. Weiner said the BigGIS team is applying the model in the disaster management, smart city and health, and environmental management domains.
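
In outline, the continuous refinement model couples an automated analytics result, reported together with its uncertainty, to repeated rounds of expert feedback that sharpen both. The following minimal sketch is hypothetical: the names and the simple blending scheme are illustrative assumptions, not the BigGIS implementation.

    # Hypothetical sketch of a continuous-refinement loop: an automated
    # result carries an uncertainty estimate, and expert feedback is
    # folded back in to narrow it. Not the BigGIS codebase.
    from dataclasses import dataclass

    @dataclass
    class Result:
        value: float        # e.g., a predicted pollutant concentration
        uncertainty: float  # width of a provenance-aware confidence band

    def analyze(observations):
        n = len(observations)
        mean = sum(observations) / n
        # Crude stand-in for a real uncertainty model: shrinks with data.
        return Result(value=mean, uncertainty=1.0 / n ** 0.5)

    def refine(result, expert_value, weight=0.3):
        # Blend the automated result with domain-expert knowledge; each
        # round of feedback also narrows the reported uncertainty.
        blended = (1 - weight) * result.value + weight * expert_value
        return Result(blended, result.uncertainty * (1 - weight))

    result = analyze([4.2, 3.9, 4.4, 4.1])
    for correction in (4.0, 4.05):          # two rounds of expert feedback
        result = refine(result, correction)
    print(result)

Each pass corresponds to one turn of the pipeline's human-in-the-loop cycle.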


Green Is the New Black
Asian Scientist (01/03/17) Rebecca Tan

As supercomputers continue their upward performance climb, power consumption soars, leading scientists to develop more energy-efficient high-performance computing (HPC) machines. Supercomputer performance scales with the size of the machine, which in turn is proportional to its demand for energy; with large machines now surpassing the megawatt scale, further performance gains depend on reducing power consumption. Japan's Tsubame-KFC supercomputer ranked in first place on both the Green500 and Green Graph500 lists when it was unveiled in 2013, drawing on various technological elements and unconventional approaches to drive down energy costs. KFC's efficiency partly comes from its adoption of many-core technologies in the form of densely packed graphics-processing units (GPUs) and proactive control of the GPUs' voltage and frequency. Another major element is oil immersion cooling, which significantly reduces the power required for cooling. Because oil has superior thermal conductivity compared with air, the machine's components run at much cooler temperatures than in air-cooled systems, eliminating the need for power-draining fans. Satoshi Matsuoka, lead architect of the KFC project (and recipient of the ACM Gordon Bell Prize in 2011), is working with the Japanese government's 2020 Post-K program to develop exascale supercomputers, which will require further innovations in power efficiency and scalability.
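
The payoff from controlling GPU voltage and frequency follows from the standard dynamic-power model for CMOS logic, P ≈ C·V²·f, in which power falls with the square of supply voltage. A back-of-the-envelope sketch with purely illustrative numbers (not Tsubame-KFC measurements):

    # Dynamic power of CMOS logic scales roughly as P ~ C * V^2 * f, so
    # trimming voltage and frequency together cuts power superlinearly.
    # Illustrative numbers only, not figures from Tsubame-KFC.
    def dynamic_power(capacitance, voltage, frequency_ghz):
        return capacitance * voltage ** 2 * frequency_ghz

    baseline = dynamic_power(1.0, 1.00, 1.00)   # nominal voltage and clock
    throttled = dynamic_power(1.0, 0.85, 0.80)  # 15% lower V, 20% lower f
    print(f"power falls to {throttled / baseline:.0%} of baseline")  # ~58%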


Tech Trends in 2017: Energy Harvesting
Computing (01/05/17) John Leonard

Some experts see energy harvesting technology as essential to successfully realizing the Internet of Things, and research into micro-scale energy harvesting has gained momentum in the last few years. For example, Stanford University researchers have prototyped HitchHike, an integrated radio/processor that applies backscattering to incoming radio waves and re-transmits data on a different Wi-Fi channel, which dramatically extends the Wi-Fi device's battery life. Meanwhile, Swiss researchers last year tested prototype medical implants powered by subcutaneous solar cells. Another notable innovation by scientists at the University of Cambridge in the U.K. is an ultra-low-power transistor capable of running for months or years at a time on ambient radiation; it could enable wearable or implantable electronics. The Cambridge researchers say the device harvests a "leakage" of electrical current characteristic of all transistors. Experiments with static electricity also are proceeding, with scientists testing triboelectric nanogenerators (TENGs) to tap the charge that accumulates when two materials repeatedly come into contact and then separate, exchanging electrons. TENGs could power small medical implants, or be incorporated into hybrid energy harvesters that use both TENGs and solar cells to harness energy generated by wind and sunshine.
View Full Article - May Require Paid Subscription


Racing Robot Cars Will Help AI Learn to Adapt to the Real World
New Scientist (01/04/17) Timothy Revell

The robotic games laboratory at the University of Essex in the U.K. has scheduled a competition for miniature cars controlled by artificial intelligence (AI) for this August. For the past three years, Essex has hosted a competition in which teams develop intelligent systems that take on different video games, but now the lab will take the concept out of simulated environments and into the real world. AIs will control physical robots in contests that will not be so straightforward: in one scenario, for example, the cars' brakes might be disabled, and in another, obstacles could be added to the track. Developers will not know in advance what challenges they will face, so their systems will need to determine for themselves how to win each time. AIs excel at very specific tasks, but in real-world environments where situations are unpredictable, the ability to adapt is crucial. Essex's new robotics arena, which can accommodate other types of robots, could host competitions involving AI-controlled drones in the future. "With the right games [AI] can slowly figure out how to do more complex tasks," says New York University professor Julian Togelius.


Man vs. AI Machine in Texas Hold'em Matchup
Computerworld (01/04/17) Sharon Gaudin

Four top professional poker players will challenge an artificial intelligence (AI) system in a 20-day Heads-Up No-Limit Texas Hold'em poker tournament billed as an "epic rematch" by the AI's developers at Carnegie Mellon University (CMU). The participants will compete for a $200,000 jackpot against CMU's Libratus AI system beginning Wednesday, Jan. 11, at the Rivers Casino in Pittsburgh. Unlike games such as chess and Go, at which AI systems have excelled in past competitions with human champions, "poker...poses a far more difficult challenge...as it requires a machine to make extremely complicated decisions based on incomplete information while contending with bluffs, slow play, and other ploys," says CMU professor Tuomas Sandholm, recipient in 2001 of the ACM/SIGART Autonomous Agents Research Award. Sandholm previously built an AI system called Claudico, which was beaten by professional players in a 2015 poker tournament. CMU researchers say the 80,000 hands played in that contest were too few to establish either human or machine supremacy with statistical significance. The new contest will involve 120,000 hands, with Libratus' strategies computed on the Pittsburgh Supercomputing Center's Bridges system. "We don't write the strategy," Sandholm says. "We write the algorithm that computes the strategy."
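
The article does not detail Libratus' internals, but the standard foundation for computing strategies in imperfect-information games is regret minimization, the core idea behind the counterfactual regret minimization family widely used in poker AI. Below is a minimal regret-matching sketch on rock-paper-scissors, a toy illustration rather than CMU's method:

    # Regret matching: play each action in proportion to the positive
    # regret accumulated for not having played it. A toy illustration of
    # the principle behind poker-strategy computation, not CMU's code.
    import random

    def payoff(a, b):
        """Rock-paper-scissors payoff to the player choosing a (0/1/2)."""
        return 0 if a == b else (1 if (a - b) % 3 == 1 else -1)

    def strategy(regret):
        pos = [max(r, 0.0) for r in regret]
        total = sum(pos)
        return [p / total for p in pos] if total else [1 / 3] * 3

    ROUNDS = 100_000
    regret = [[0.0] * 3, [0.0] * 3]
    avg = [[0.0] * 3, [0.0] * 3]
    for _ in range(ROUNDS):
        strats = [strategy(r) for r in regret]
        moves = [random.choices(range(3), weights=s)[0] for s in strats]
        for p in (0, 1):
            opp = moves[1 - p]
            for alt in range(3):  # regret for not having played alt
                regret[p][alt] += payoff(alt, opp) - payoff(moves[p], opp)
            for a in range(3):
                avg[p][a] += strats[p][a]
    print([round(a / ROUNDS, 3) for a in avg[0]])  # ~[0.333, 0.333, 0.333]

In self-play the average strategies converge to the game's Nash equilibrium (uniform play here); poker solvers apply the same principle across vastly larger state spaces, which is where supercomputers like Bridges come in.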


States Wire Up Roads as Cars Get Smarter
The Wall Street Journal (01/02/17) Paul Page

To prepare for the day when self-driving cars travel on technology-aided roads, U.S. state transportation planners aim to outfit those roadways with fiber optics, cameras, and linked signal devices to make traffic safer and more efficient. Planners say the technology will lead to faster trips, fewer accidents, and fuel savings. Smart-road technology currently is being installed on only a few miles of highway in a handful of states, but state transportation authorities hope the incoming presidential administration will help accelerate the move to smart roads by honoring its promise to boost infrastructure budgets. Ohio in December announced it would invest $15 million in smart-road technology along a stretch of Route 33. States will need to choose how to communicate with autos as various manufacturers and technology firms develop self-driving car applications independently--a job made challenging by a lack of standards. The biggest obstacle cited by highway researchers is ensuring that smart-road technology, especially costly software, will work. Road-to-car connections mostly use wireless dedicated short-range communications, but experts say the industry may opt for cellular-data systems or Wi-Fi if they offer sufficiently reliable and rapid transmission.
View Full Article - May Require Paid Subscription


Move over Bitcoin--MIT Cryptographer Silvio Micali and His Public Ledger ALGORAND…the Future of Blockchain?
Blockchain News (01/05/17) Richard Kastelein

Massachusetts Institute of Technology professor Silvio Micali recently published a paper describing a decentralized and secure way to manage a shared ledger that provides a solution to the Byzantine generals problem. Micali, recipient of the 2012 ACM A.M. Turing Award, the 1993 Gödel Prize, and the 2004 RSA Prize in cryptography, developed an alternative to proof of work that cryptographically selects a set of verifiers in charge of constructing a block of valid transactions, dispensing with the computation-heavy mining of traditional blockchains. The overall concept is called cryptographic sortition: a small group of users is selected randomly and unpredictably to take charge of the next block, and is rewarded with a percentage of the block's transactions. In addition, the whole process is executed in a way that cannot be manipulated by an adversary. The group decides the next block via a redesigned Byzantine agreement in which a leader is picked randomly from the group; if the chosen leader proves a bad choice, no agreement is reached and the group receives no money. "I believe the public ledger is going to be as beautiful and as useful as any physical infrastructure that we have created and I really urge you to devote all of your attention to it," Micali says.
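
The sortition idea can be pictured as every user privately "rolling dice" with a shared random seed. Algorand's actual construction uses verifiable random functions (VRFs) so each draw can be proven to others; in the hypothetical sketch below a plain hash stands in for the VRF, and all names are invented for illustration.

    # Toy cryptographic sortition: hash a public round seed with each
    # user's key; hashes below a threshold join the committee, and the
    # lowest hash selects the leader. A real design (e.g., Algorand)
    # uses VRFs so the draw is provable; SHA-256 is a stand-in here.
    import hashlib

    def draw(seed, user_key):
        digest = hashlib.sha256(seed + user_key).digest()
        return int.from_bytes(digest[:8], "big") / 2 ** 64  # in [0, 1)

    def sortition(seed, users, committee_fraction):
        committee = [u for u in users if draw(seed, u) < committee_fraction]
        leader = min(committee, key=lambda u: draw(seed, u), default=None)
        return committee, leader

    users = [f"user-{i}".encode() for i in range(1000)]
    committee, leader = sortition(b"round-42", users, committee_fraction=0.02)
    print(len(committee), leader)  # roughly 20 verifiers, one leader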


ELFI: Engine for Likelihood-Free Inference Facilitates More Effective Simulation
Aalto University (01/04/17)

New technology developed by researchers at Aalto University in Finland and colleagues could be used to model reality as accurately as possible in a simulator. The team says the Engine for Likelihood-Free Inference (ELFI) will significantly reduce the number of simulation runs necessary for the estimation of unknown parameters and make it easier to add new inference methods. "The ELFI inference software we have developed makes this previously extremely difficult task as easy as possible: software developers can spread their new inference methods to widespread use, with minimal effort," says Aalto professor Samuel Kaski. The researchers say ELFI will likely be beneficial to scientists from fields in which traditionally used statistical methods cannot be applied. The team describes ELFI as easy to use and scalable, and notes the inference problem can be easily defined with a graphical model. The team believes the engine has the potential to revolutionize many fields that use computational simulation. "For example, a simulation of a disease can take into account how the disease is transmitted to another person, how long it will take for a person to recuperate or not recuperate, how a virus mutates, or how many unique virus mutations exist," says Aalto professor Aki Vehtari. "A number of simulation runs will therefore produce a realistic distribution describing the actual situation."
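
ELFI is a software package and the snippet below does not use its API; it is a generic sketch of the problem class ELFI targets. When a model exists only as a simulator, with no tractable likelihood, parameters can still be inferred by approximate Bayesian computation: draw candidate parameters, run the simulator, and keep the candidates whose output lands near the observed data.

    # Generic likelihood-free inference via ABC rejection sampling --
    # an illustration of the problem ELFI addresses, not ELFI's API.
    import random
    import statistics

    def simulator(rate, n=50):
        # Stand-in simulator: an exponential process with unknown rate.
        return [random.expovariate(rate) for _ in range(n)]

    observed = simulator(rate=2.0)            # pretend the rate is unknown
    obs_mean = statistics.mean(observed)

    accepted = []
    for _ in range(20_000):
        candidate = random.uniform(0.1, 10.0)        # prior over the rate
        sim_mean = statistics.mean(simulator(candidate))
        if abs(sim_mean - obs_mean) < 0.05:   # keep near-matching runs
            accepted.append(candidate)

    print(statistics.mean(accepted))          # posterior mean, near 2.0

Plain rejection like this wastes most simulation runs; ELFI's stated purpose is to package far more sample-efficient inference methods behind a common interface, which is what reduces the number of runs needed.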


ZigBee's Dotdot Language Aims for IoT Harmony
IDG News Service (01/03/17) Stephen Lawson

The ZigBee Alliance, a group of more than 400 companies that make products using the ZigBee wireless protocol, has announced Dotdot, a universal language for the Internet of Things (IoT). Dotdot is intended for use with any wireless technology, defining how devices tell each other what they are and what they do. Application layers such as Dotdot are widely expected to matter more for unifying the IoT than a common networking protocol, because it is becoming easier to include many different wireless radios in one device. Dotdot is more mature than other application layers because it is based on the ZigBee Cluster Library (ZCL), which works with devices on other types of Internet Protocol networks. Dotdot eventually could handle application-layer tasks such as device discovery throughout a home or over networks such as Wi-Fi, Bluetooth, Ethernet, and Narrowband IoT, a low-power cellular system, according to the ZigBee Alliance's Daniel Moneta. At this week's Consumer Electronics Show, ZigBee demonstrated about a dozen Dotdot prototypes from various vendors, including thermostats, lights, window blinds, and sensors.
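
The "what they are and what they do" portion can be pictured as a self-describing capability document in the spirit of ZCL clusters. The structure and field names below are invented for illustration; they are not the actual Dotdot schema.

    # Hypothetical self-description for an IoT device: typed "clusters"
    # (a ZCL concept) group attributes and commands so any controller on
    # any transport can discover capabilities. Field names are invented,
    # not the real Dotdot schema.
    import json

    dimmable_light = {
        "device": "dimmable-light",
        "clusters": {
            "on_off": {
                "attributes": {"on": False},
                "commands": ["on", "off", "toggle"],
            },
            "level_control": {
                "attributes": {"level": 254},
                "commands": ["move_to_level"],
            },
        },
    }

    # The same description could travel over Wi-Fi, Bluetooth, Ethernet,
    # or Narrowband IoT, which is the point of a universal language.
    print(json.dumps(dimmable_light, indent=2))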


Organic Dye-Based Film Boosts Data Storage Density
EE Times Asia (01/03/17) Graham Prophet

Researchers from Imperial College London (ICL) in the U.K. say they have discovered an electro-optical phenomenon that could boost the density of information storage in optical media by many orders of magnitude. Their approach could circumvent the diffraction limit on focusing a light spot onto a recording medium, which has fundamentally restricted achievable data storage density. The researchers accomplished the increased storage density using organic dyes based on azobenzene and a special light antenna. They found that shining a laser on azobenzene molecules in an electric field causes the molecules to flip, producing a change in the optical properties of the dye, which enables the molecules to act as information carriers. The researchers used azobenzene films to create optical memory that "violates" the diffraction limit. They also developed methods for recording and reading information from such films using a nanoantenna, which absorbs the laser light, amplifies it, and focuses it on the point where the information is written or read. "With further improvement to this technology we could reach data storage densities of petabytes per square inch, in other words a conventionally sized disc would hold about a million times more information than a modern DVD and hundreds of times more than the most capacious modern hard disks," says ICL professor Sergei Kazarian.
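
For a sense of scale, conventional optics cannot focus a spot much smaller than roughly λ/(2·NA), which caps how densely bits can be packed. A back-of-the-envelope calculation with illustrative Blu-ray-class values (not figures from the ICL paper):

    # Diffraction-limited spot size d ~ wavelength / (2 * NA) caps the
    # bit density of conventional optical storage. Illustrative values.
    wavelength_nm = 405                  # Blu-ray-class laser
    numerical_aperture = 0.85
    spot_nm = wavelength_nm / (2 * numerical_aperture)   # ~238 nm

    bits_per_m2 = 1 / (spot_nm * 1e-9) ** 2              # one bit per spot
    bits_per_in2 = bits_per_m2 * 0.0254 ** 2
    print(f"spot ~{spot_nm:.0f} nm -> ~{bits_per_in2:.1e} bits per sq in")

That works out to roughly 10^10 bits (a little over a gigabyte) per square inch, so the petabyte-per-square-inch densities Kazarian describes require beating the diffraction limit by several orders of magnitude, which is what the nanoantenna approach is after.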


Quantum Computers Ready to Leap Out of the Lab in 2017
Nature (01/03/17) Davide Castelvecchi

This could be the year quantum computing technology makes the transition to real-world applications, with Google and Microsoft setting challenging goals and startups and academic labs harboring similar ambitions. Google this year, in pursuit of its goal of "quantum supremacy," plans to execute a computation with a "chaotic" quantum algorithm that generates what appears to be a random output, demonstrating quantum computers' superiority at certain tasks and drawing potential customers, says Google team leader John Martinis. Meanwhile, Microsoft is focusing on topological quantum computing, which encodes data by "braiding" the excitations of exotic matter, a form more resistant to outside disruption that can make error correction easier. Yale University physicist Robert Schoelkopf is racing to construct a quantum computer, and says "we have demonstrated all the components and all the functions we need." Schoelkopf performed pioneering work in building quantum bits (qubits) by encoding quantum states as oscillating currents in superconducting loops--an approach Google adopted. A second method encodes qubits in single ions isolated by electric and magnetic fields in vacuum traps. The IonQ startup has embraced this method to pursue machines with up to 64 qubits and more flexible and scalable designs than superconducting circuits allow, says IonQ co-founder Christopher Monroe.


UCLA Mathematicians Bring Ocean to Life for Disney's 'Moana'
UCLA Newsroom (01/03/17) Stuart Wolpert

University of California, Los Angeles (UCLA) researchers applied their knowledge of math, physics, and computer science to help animate "Moana," a three-dimensional (3D) computer-animated Disney film in which the ocean is a character. To address the challenge of animating flowing water realistically, UCLA mathematicians leveraged APIC (the affine particle-in-cell method), a state-of-the-art fluid simulation technique in computer graphics. Alexey Stomakhin, a former UCLA doctoral student, led the development of the code used to simulate the movement of the ocean in the film. "The increased demand for realism and complexity in animated movies makes it preferable to get assistance from computers," Stomakhin says. "This means we have to simulate the movement of the ocean surface and how the water splashes, for example, to make it look believable." The animation's algorithms closely approximate the partial differential equations that govern fluid motion while preserving angular momentum and energy. Stomakhin also was involved in creating waves that break at a certain time and place, and in choreographing the movement of water around boats. In addition to providing visual effects for movies, the work done by the UCLA team could be used for plasma simulations, 3D printing, and surgical simulations.
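
Hybrid particle/grid solvers like APIC carry the fluid on particles, transfer mass and momentum to a grid where forces are applied, then interpolate velocities back to the particles. The 1-D sketch below shows only this basic particle-in-cell transfer; APIC's distinguishing addition, a per-particle affine velocity matrix that preserves angular momentum across the transfers, is noted in a comment but not implemented. All names are illustrative.

    # Minimal 1-D particle-in-cell transfer, the skeleton that APIC
    # builds on. APIC additionally stores an affine velocity matrix per
    # particle so these transfers preserve angular momentum (omitted).
    N_CELLS, DX = 8, 1.0

    def hat_weights(x):
        """Linear interpolation weights to the two nearest grid nodes."""
        i = int(x // DX)
        frac = x / DX - i
        return [(i, 1.0 - frac), (i + 1, frac)]

    particles = [  # (position, velocity, mass)
        (2.3, 1.0, 1.0),
        (2.7, -0.5, 1.0),
        (5.1, 0.2, 1.0),
    ]

    # Particle-to-grid: splat mass and momentum, then recover velocity.
    grid_mass = [0.0] * (N_CELLS + 1)
    grid_mom = [0.0] * (N_CELLS + 1)
    for x, v, m in particles:
        for i, w in hat_weights(x):
            grid_mass[i] += w * m
            grid_mom[i] += w * m * v
    grid_vel = [p / m if m else 0.0 for p, m in zip(grid_mom, grid_mass)]

    # (Grid forces, pressure projection, and collisions would act here.)

    # Grid-to-particle: interpolate the updated grid velocity back.
    particles = [
        (x, sum(w * grid_vel[i] for i, w in hat_weights(x)), m)
        for x, v, m in particles
    ]
    print(particles)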


Abstract News © Copyright 2017 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]