Association for Computing Machinery
Welcome to the March 28, 2016 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets and for iPhones and iPads.

HEADLINES AT A GLANCE


The Race Is On to Control Artificial Intelligence, and Tech's Future
The New York Times (03/26/16) John Markoff; Steve Lohr

Major technology companies are engaged in an intense "platform war" with one another to become the preferred provider of artificial intelligence (AI) software, which advocates believe could control the direction of the industry for years to come. "Whoever wins this race will dominate the next stage of the information age," says machine-learning specialist Pedro Domingos. Among the resources the rival companies are tapping for an edge in the AI wars are startups and gifted researchers, with strategies that include holding man-versus-machine contests. IDC estimates the machine-learning application market will reach $40 billion by 2020, with 60 percent of those apps operating on platform software from Google, IBM, Amazon, and Microsoft. IDC also predicts at least half of developers will embed AI features in their products by 2018. IBM is making the most wide-ranging entry into AI by positioning its Watson division as both a software and a services business, and the division's David Kenny says the overarching goal "is to have hundreds of millions of people use Watson as self-service AI." Google, which recently scored a victory with its AI's defeat of a master Go player, is opening such technology to outside developers, and in November it open-sourced its core machine-learning component, TensorFlow. Google researcher Jeff Dean anticipates intelligent software apps becoming commonplace, "and machine learning will touch every industry."


It's Your Fault Microsoft's Teen AI Turned Into Such a Jerk
Wired (03/25/16) Davey Alba

Microsoft unveiled a new online chatbot on Twitter Wednesday morning but took it offline by the evening after Twitter users coaxed it into regurgitating offensive language. Called Tay, the chatbot was designed to respond to messages in an "entertaining" way, impersonating 18- to 24-year-olds in the U.S., with no humans making the final decision on what it would publicly say. Tay was likely built on neural networks trained on vast troves of online data to talk like a teenager. The system evaluates the weighted relationships of two sets of text--often questions and answers--and decides what to say by picking the strongest relationship. Microsoft's Xiaoice chatbot has a big cult following in China, with millions of young people interacting with it on their smartphones every day, and that success probably gave Microsoft the confidence it could replicate the project in the U.S. Dennis R. Mortensen, CEO and founder of x.ai, says humans should not necessarily blame the artificial-intelligence technology because it is just a reflection of who we are. Ryan Calo, a law professor at the University of Washington who studies AI policy, says developers could in the future employ a labeling mechanism to make it more transparent where Tay is pulling responses from.
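The matching scheme the article describes--scoring stored question-answer pairs against an incoming message and replying with the answer behind the strongest match--can be sketched in a few lines. This is purely illustrative: Microsoft has not published Tay's architecture, and the corpus and similarity measure below are invented for the example.

```python
# Toy retrieval-style chatbot: score stored question-answer pairs
# against an incoming message and reply with the answer whose
# question matches most strongly. (Illustrative only; Tay's real
# model, corpus, and scoring are not public.)
import re

def tokens(text):
    """Lowercased word set, ignoring punctuation."""
    return set(re.findall(r"[a-z']+", text.lower()))

def overlap(a, b):
    """Jaccard similarity between two token sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

CORPUS = [  # hypothetical training pairs
    ("what is your name", "I'm a chatbot."),
    ("how old are you", "Old enough to tweet."),
    ("do you like music", "I love pop music."),
]

def reply(message):
    scored = [(overlap(tokens(q), tokens(message)), a) for q, a in CORPUS]
    best_score, best_answer = max(scored)
    return best_answer

print(reply("hey, how old are you?"))  # -> Old enough to tweet.
```

The failure mode Tay exhibited follows directly from this design: whatever text ends up in the corpus, offensive or not, becomes a candidate response.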


Male Computer Programmers Shown to Be Right Up There With Chefs, Dentists on Gender Pay Gap Scale
Network World (03/23/16) Bob Brown

Male computer programmers' salaries are far higher than those of their female peers, according to a new Glassdoor report. The study showed women earn about 76 cents for every dollar men earn in the U.S. Glassdoor names "women and men sorting into different jobs or industries with varying pay" the biggest contributor to the U.S. gender pay gap. Age also plays a role, with the adjusted gap among employees 55 or older being about twice as large as the national average of 5.4 percent. In terms of occupation, male computer programmers earn 28.1 percent more than their female counterparts, a slightly larger gap than that for chefs and dentists. The other prominent information technology job cited is information security specialist, where the gap favors men by 14.7 percent. "To help close the [overall] gender pay gap, we should focus on creating policies and programs that provide women with more access to career development and training, such as pay negotiation skills, to support them through their lives in any job or field they choose to enter," recommends Glassdoor's Dawn Lyon. Glassdoor also supports greater pay transparency at organizations.


Quantum Networkers, Watch Out: Twisting Light Slows It Down
IDG News Service (03/23/16) Stephen Lawson

University of Ottawa researchers say light twisted into a corkscrew shape travels 0.1 percent slower, which could create serious bottlenecks for future networks and even quantum computing. Twisting beams of light and changing how they travel through space can generate another variable for encoding information. Each kind of twist produces a different corkscrew pattern, and University of Ottawa professor Ebrahim Karimi says the number of patterns that can be created is theoretically infinite. He speculates such encoding could expand network data loads by up to four-fold using existing fiber. However, Karimi notes the slowdown effect complicates the role twisted light could potentially play in quantum computing. The researchers demonstrated delays of up to 23 femtoseconds, and Karimi says the problem is that bits transmitted over a network must arrive at precisely the right time to be rebuilt into meaningful data, unless the system can correct for the delay. Battelle Memorial Institute researcher Nino Walenta says although the reduction in speed is problematic, the twisting effect might have useful applications as well.
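For a sense of scale, the 0.1-percent figure can be turned into arrival-time arithmetic. This is back-of-envelope only: the article reports the 23-femtosecond delays for the researchers' specific experimental path, and the slowdown of a structured beam need not accumulate this way over long fiber runs.

```python
# Back-of-envelope: extra arrival delay from a 0.1 percent reduction
# in propagation speed, as a function of path length. (Illustrative
# arithmetic only, assuming the slowdown applies uniformly along the
# path, which the article does not claim.)

C = 299_792_458.0   # speed of light in vacuum, m/s
SLOWDOWN = 0.001    # 0.1 percent speed reduction

def extra_delay(path_m, slowdown=SLOWDOWN):
    """Seconds of extra delay vs. an unslowed beam over path_m meters."""
    v = C * (1 - slowdown)
    return path_m / v - path_m / C

# A 23-femtosecond delay corresponds to only about 7 millimeters of path;
# over a full meter the same fractional slowdown costs picoseconds.
for meters in (0.007, 1.0):
    print(f"{meters} m -> {extra_delay(meters) * 1e15:.1f} fs")
```

This is why Karimi frames the effect as a timing problem: femtosecond-scale offsets on short paths grow with distance unless the receiver compensates.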


Phone-Based Laser Rangefinder Works Outdoors
MIT News (03/25/16) Larry Hardesty

Massachusetts Institute of Technology researchers working in the Computer Science and Artificial Intelligence Laboratory have developed a new infrared depth-sensing system, built from a smartphone with a $10 laser attached to it, that works indoors and outdoors. The researchers will present the system at the International Conference on Robotics and Automation in May. The new system performs several measurements, timing them to the emission of low-energy light bursts. The system captures four frames of video, two of which record reflections of laser signals and two of which record only the ambient infrared light. The system then subtracts the ambient light from its other measurements. The prototype uses a phone with a 30-frame-per-second camera, so capturing four images imposes a delay of about an eighth of a second. However, a 240-frame-per-second camera, which is already commercially available, would reduce that delay to a 60th of a second. The system uses a technique called active triangulation, in which a laser mounted at the bottom of the phone emits light in a single plane; the angle of the returning light can thus be measured from where it falls on the camera's two-dimensional sensor. At ranges of three to four meters, the system gauges depth to an accuracy measured in millimeters.
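The active-triangulation geometry above reduces to similar triangles: the laser sits a fixed baseline from the camera, and the row where its reflection lands on the sensor encodes depth. The sketch below uses assumed numbers (focal length, baseline); the article does not give MIT's actual calibration.

```python
# Active-triangulation depth sketch: a laser offset from the camera
# by a fixed baseline emits light in a single plane; the displacement
# of the return spot on the image sensor encodes depth via similar
# triangles. (Parameters here are assumptions for illustration, not
# the MIT prototype's calibration.)

def depth_from_pixel(pixel_offset, focal_px, baseline_m):
    """Depth in meters from Z = f * b / d.

    pixel_offset: displacement (pixels) of the laser return from the
                  row where an infinitely distant return would land.
    focal_px:     camera focal length expressed in pixels.
    baseline_m:   laser-to-camera separation in meters.
    """
    if pixel_offset <= 0:
        raise ValueError("return at or beyond the infinity row")
    return focal_px * baseline_m / pixel_offset

# Example: 1500-px focal length, 5-cm baseline, return displaced 25 px:
print(depth_from_pixel(25, focal_px=1500, baseline_m=0.05))  # 3.0 (meters)
```

Note how precision falls with range: at a few meters a one-pixel error shifts the estimate by centimeters, consistent with the article's millimeter-level accuracy only out to three or four meters.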


Unlocking the Gates to Quantum Computing
Griffith University (03/28/16) Helen Wright

Griffith University researchers have experimentally implemented the quantum Fredkin gate for the first time. The experiment, which also involved researchers from the University of Queensland, demonstrated how to build large quantum circuits more directly, without assembling them from many small logic gates. Normally, the Fredkin gate requires a circuit of five logic operations, but the researchers used the quantum entanglement of photons to implement the controlled-SWAP operation directly. "There are quantum computing algorithms, such as Shor's algorithm for finding prime factors, that require the controlled-SWAP operation," says University of Queensland professor Tim Ralph. In addition, the quantum Fredkin gate can be used to perform a direct comparison of two sets of qubits to determine whether they are the same or not, a feature that is useful in computing and essential for some secure quantum communication protocols. "What is exciting about our scheme is that it is not limited to just controlling whether qubits are swapped, but can be applied to a variety of different operations, opening up ways to control larger circuits efficiently," says Griffith University professor Geoff Pryde.
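On classical basis states, the Fredkin (controlled-SWAP) gate has a simple action: when the control bit is 1, the two target bits are exchanged; otherwise nothing happens. A quantum implementation applies this same rule linearly to superpositions of basis states; the sketch below covers only the classical truth table.

```python
# Classical action of the Fredkin (controlled-SWAP) gate: when the
# control bit is 1, the two target bits are exchanged. The quantum
# gate extends this rule linearly to superpositions; this sketch
# shows only the basis-state truth table.

def fredkin(control, a, b):
    return (control, b, a) if control else (control, a, b)

# The gate is reversible and its own inverse: applying it twice
# restores any input, a requirement for all quantum logic.
for bits in [(c, a, b) for c in (0, 1) for a in (0, 1) for b in (0, 1)]:
    assert fredkin(*fredkin(*bits)) == bits

print(fredkin(1, 0, 1))  # control=1 swaps the targets -> (1, 1, 0)
```

The five-operation decomposition the article mentions is what this single gate normally costs when built from smaller one- and two-qubit gates, which is why a direct photonic implementation matters.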


Quantum Computing Gets a Performance Boost--Is It Enough to Break RSA?
Security Intelligence (03/25/16) Larry Loeb

A team from the Massachusetts Institute of Technology (MIT) and the University of Innsbruck in Austria has developed a way to solve a scalability problem that affects current implementations of quantum computers. The researchers built a proof-of-concept machine that could factor the number 15 using five qubits instead of the 15 qubits normally required, but their breakthrough goes beyond this particular computer. The team demonstrated how a different quantum computer architecture could be constructed to enable the scaling of Shor's algorithm, yielding much greater efficiency than is possible with a traditional computer. "This algorithm has been realized scalably within an ion-trap quantum computer and returns the correct factors with a confidence level exceeding 99 percent," the researchers note. MIT professor Isaac Chuang says this achievement shows "Shor's algorithm, the most complex quantum algorithm known to date, is realizable in a way where...all you have to do is go in the lab, apply more technology, and you should be able to make a bigger quantum computer." The work indicates machines will someday run Shor's algorithm at practical scale, and enterprises must prepare for this threat to encryption.
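The classical scaffolding around Shor's algorithm can be run in ordinary code for N = 15. The quantum hardware's sole contribution is finding the period r of a^x mod N efficiently; the sketch below finds r by brute force, which is exponentially slower in general but instant here. This is an illustration of the algorithm's structure, not the MIT/Innsbruck implementation.

```python
# The number-theoretic core of Shor's algorithm, run classically for
# N = 15. A quantum computer's job is only the period-finding step,
# done here by brute force (fine for 15, exponential in general).
from math import gcd

def period(a, n):
    """Smallest r > 0 with a**r % n == 1 (assumes gcd(a, n) == 1)."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    g = gcd(a, n)
    if g != 1:
        return g, n // g           # lucky guess already shares a factor
    r = period(a, n)
    if r % 2:                      # need an even period; retry with new a
        return None
    y = pow(a, r // 2, n)
    if y == n - 1:                 # trivial square root of 1; retry
        return None
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_factor(15, 7))  # period of 7 mod 15 is 4 -> factors (3, 5)
```

The quantum speedup replaces the `period` loop with phase estimation, which is exactly the part the ion-trap architecture is designed to scale.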


Bristol and Lund Set a New World Record in 5G Wireless Spectrum Efficiency
University of Bristol News (03/23/16)

The Universities of Bristol and Lund, in collaboration with National Instruments (NI), have demonstrated a 12-fold boost in spectrum efficiency over current 4G cellular technology using a massive antenna system. The Bristol scheme deploys a massive multiple-input, multiple-output (MIMO) architecture with 128 base-station antennas, using a flexible prototyping platform from NI based on LabVIEW system design software and PXI hardware. Bristol's system runs at a carrier frequency of 3.5 GHz and supports simultaneous wireless connectivity to up to 12 single-antenna clients, with all clients sharing a common 20-MHz radio channel. Digital signal processing algorithms disentangle the individual data streams in the space domain observed by the antenna array. "This activity reinforces our well-established propagation and system modeling work by offering a new capability in model validation for Massive MIMO architectures," says Bristol professor and dean of engineering Andrew Nix. Spectrum- and power-efficient wireless communications are a core focus of Bristol's Communication Systems & Networks Group and the EPSRC Center for Doctoral Training in Communications. "We see massive MIMO as the most promising 5G technology and we have pushed it forward together with partners in Bristol and in our [European Union] project MAMMOET," notes Lund University professor Ove Edfors.
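The "disentangling" step can be illustrated with the simplest linear MIMO detector, zero forcing: invert the channel matrix that mixes the users' signals at the antennas. The toy below is a 2-antenna, 2-user, real-valued, noise-free example; the Bristol testbed uses complex channels, 128 antennas, and more sophisticated processing, none of which is modeled here.

```python
# Toy zero-forcing MIMO detector: the base station observes a mix of
# user signals, y = H x, and recovers x by inverting the channel
# matrix H. (2x2 real-valued, noise-free sketch; real massive-MIMO
# systems use complex channels, many antennas, and regularization.)

def zero_forcing_2x2(H, y):
    """Solve H x = y for a 2x2 channel via the explicit inverse."""
    (a, b), (c, d) = H
    det = a * d - b * c
    return ((d * y[0] - b * y[1]) / det,
            (-c * y[0] + a * y[1]) / det)

H = ((1.0, 0.5), (0.2, 1.0))        # toy channel gains (assumed values)
x = (1.0, -1.0)                      # symbols sent by two users
y = (H[0][0]*x[0] + H[0][1]*x[1],    # what the two antennas receive
     H[1][0]*x[0] + H[1][1]*x[1])
print(zero_forcing_2x2(H, y))        # recovers (1.0, -1.0) noise-free
```

With 128 antennas serving 12 users, the channel matrix is tall rather than square, which is what lets the array separate all 12 streams on one 20-MHz channel.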


New Open Source Software for High Resolution Microscopy
Bielefeld University (03/23/16)

Bielefeld University researchers have developed a new open source software solution that can process raw microscopy data quickly and efficiently. Researchers currently try to attain better resolution by using structured illumination, but they cannot use the raw images gained with this method right away. "The data obtained with the microscopy method require a very laborious mathematical image reconstruction," which turns the raw data recorded with the microscope into a high-resolution image, according to Bielefeld professor Thomas Huser. This stage previously required a complicated mathematical procedure available to only a few research groups, and no easily accessible open source solution existed--a major obstacle to the use and further development of the technology. The new software fills this gap, Huser says. For the necessary post-processing, researchers "no longer need to develop their own complicated solutions but can use our software directly, and, thanks to its open source availability, they can adjust it to fit their problems," says Bielefeld researcher Marcel Muller. The software is freely available to the global scientific community.


Record-Speed Data Transmission Could Make Big Data More Accessible
University of Illinois News Bureau (03/22/16) Liz Ahlberg

As big data has become more expansive, the need has grown for a high-speed data transmission infrastructure that can accommodate the massive amounts of data being transferred from one place to another. To help alleviate this issue, University of Illinois (UI) researchers have advanced oxide-VCSEL technology, which underlies fiber-optic communications systems. UI professor Milton Feng and his research team have been pushing VCSEL technology to higher speeds in recent years, and in 2014 achieved the first error-free data transmission at 40 Gbps. The team's most recent achievement is 57-Gbps error-free data transmission at room temperature, as well as 50-Gbps transmission at temperatures up to 85 degrees Celsius. Achieving high speeds at high temperatures is very difficult because the materials used work better at lower temperatures. Feng hopes his research proves high-speed operation at high temperatures is scientifically possible and useful for commercial applications. "This type of technology is going to be used not only for data centers, but also for airborne, lightweight communications, like in airplanes, because the fiber-optic wires are much lighter than copper wire," Feng says.


Breakthrough Technology to Improve Cybersecurity
Phys.Org (03/21/16)

A collaborative project between physicists at the University of Sydney's Center for Ultrahigh Bandwidth Devices for Optical Systems (CUDOS) and electrical engineers from the School of Electrical and Information Engineering has yielded a major breakthrough in generating single photons as carriers of quantum information in security systems. Photons are generated simultaneously in pairs, one in each of two photon streams, and the detection of photons in one stream signals the timing of those in the other. Tapping this information, the system dynamically applies timing control to those photons so they appear at regular intervals. The new method boosts the rate of photons arriving at the regular interval, which is very useful for quantum secure communication and quantum photonic computation. "This research has demonstrated that the odds of being able to generate a single photon can be doubled by using a relatively simple technique--and this technique can be scaled up to ultimately generate single photons with 100-percent probability," says lead author Chunle Xiong. CUDOS director and professor Ben Eggleton predicts single-photon generation "will drive the development of local secure communications systems--for safeguarding defense and intelligence networks, the financial security of corporations and governments, and bolstering personal electronic privacy." He reports the demonstration exploits a photonic chip CUDOS has been developing over the last 10 years.
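The "doubling the odds" claim follows from basic probability: if each heralded source fires in a given time slot with probability p, combining two sources and routing from whichever one fired succeeds with probability 1 - (1 - p)^2, which is close to 2p when p is small. The Monte Carlo sketch below checks that arithmetic; it is a probability illustration, not a model of the CUDOS chip.

```python
# Why multiplexing heralded photon sources boosts single-photon odds:
# with two independent sources each firing with probability p per time
# slot, at least one fires with probability 1 - (1 - p)**2, roughly 2p
# for small p. (Probability sketch only, not a device model.)
import random

def multiplexed_success(p, sources=2, trials=200_000, seed=1):
    """Monte Carlo estimate of P(at least one source fires)."""
    rng = random.Random(seed)
    hits = sum(
        any(rng.random() < p for _ in range(sources))
        for _ in range(trials)
    )
    return hits / trials

p = 0.05
print(f"single source: {p}")
print(f"two sources:   {multiplexed_success(p):.3f}")  # ~ 1-(1-p)^2 = 0.0975
```

Scaling up the number of multiplexed sources pushes the success probability toward 1, which is the "100-percent probability" limit Xiong describes.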


Computer Scientist Codes His Way to Success
News Hub (03/19/16) Dan Satherley

Victoria University professor James Noble has been awarded the 2016 AITO Dahl-Nygaard Senior Prize, considered one of the most prestigious awards of its kind in the world, for his work on Grace, a computer language designed from the ground up to be easy to use. "Grace has flexibility--that is, students can be introduced to it in stages, and can grow to the full version at their pace," Noble says. Grace can be run inside a Web browser and is open source, meaning anyone can use it free of charge. "In JavaScript there are lots of things that you have to do in just the right way or something weird will happen, and we try and at least make [Grace] regular, predictable, and understandable," Noble says. He likes the idea of teaching coding in schools, but says it is not as simple as adding it to the curriculum. "There's always two difficulties--one of which is, what do you displace to put coding in?" he notes. "And the other one is, who do you get to teach it?" Prior to winning the AITO Dahl-Nygaard Senior Prize, Noble was the first computer scientist to be awarded a James Cook Research Fellowship. When the fellowship is over, Noble plans to focus on teaching.


Abstract News © Copyright 2016 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe