Association for Computing Machinery
Welcome to the March 4, 2016 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets (click here) and for iPhones (click here) and iPads (click here).


Florida Senate Approves Making Coding a Foreign Language
USA Today (03/01/16) Madison Iszler

Florida senators have approved a bill allowing high school students to take computer coding classes in place of foreign language requirements. The bill, introduced by Sen. Jeremy Ring, will take effect during the 2018-2019 school year. "It's ahead of its time, but in reality, it's in its time," Ring says. "If you don't have an understanding of technology, you will be left behind. It's a basic skill, as much as reading and writing." The bill permits high schools to offer students the opportunity to take computer coding courses, and requires Florida College System institutions and state universities to accept two coding credits in place of the current two-credit foreign language requirement. For those school districts that do not offer coding courses, students can take classes through Florida Virtual School, a state-funded online school. Ring notes the bill received a boost from technology companies and educational institutions, and he calls learning to code "the great educational equalizer." He says the legislation puts Florida out in front of what will soon become a national trend in education.

Conversing With Computers
National Science Foundation (03/02/16) Aaron Dubrow

University of Southern California (USC) researchers are developing high-speed language-processing systems that rival the speed and efficiency of human speakers in specific settings. "While we human speakers can often understand and respond to what someone is saying to us in a fraction of a second, a typical voice interface will require much longer--often a second or two--to try to understand what you have said and respond in an appropriate way," says USC professor David DeVault. The researchers are studying new techniques that can streamline human-machine conversations by enabling the system to perform all necessary computer-processing steps in real time while the user is talking. The researchers note the systems often can determine what the speaker means and how it should respond before the user finishes speaking. To demonstrate this breakthrough, the researchers developed a high-performance game-playing agent called Eve. In the game, users describe the pictures they see on their computer screen and the agent tries to guess which picture they are talking about as fast and accurately as possible. The system uses "incremental" speech-processing algorithms to increase the agent's speed of understanding. "These findings underscore the importance of enabling systems to not only understand what users are saying, but to do so as quickly as a human would," DeVault says.
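The incremental idea, scoring candidate interpretations word by word and committing as soon as one guess is confidently ahead rather than waiting for the utterance to end, can be illustrated with a toy sketch. The picture names, scoring scheme, and margin below are invented for illustration and are not the Eve system's actual algorithms:

```python
# Toy incremental understander: update scores for candidate pictures after
# each word and answer as soon as one candidate is clearly ahead, instead
# of waiting for the full utterance. (Illustrative sketch only.)
pictures = {
    "red circle":  {"red", "circle", "round"},
    "blue circle": {"blue", "circle", "round"},
    "red square":  {"red", "square", "box"},
}

def guess_incrementally(words, margin=1.0):
    scores = {name: 0.0 for name in pictures}
    for i, word in enumerate(words, 1):
        for name, vocab in pictures.items():
            if word in vocab:
                scores[name] += 1.0
        ranked = sorted(scores.values())
        if ranked[-1] - ranked[-2] >= margin:
            # Commit early: the best guess leads by a clear margin.
            return max(scores, key=scores.get), i
    return max(scores, key=scores.get), len(words)

print(guess_incrementally("the red round one you know the circle".split()))
# -> ('red circle', 3): committed after word 3 of 8, before the speaker finished
```

A real system would replace the keyword scores with probabilities from an incremental speech recognizer and language-understanding model, but the control flow, deciding after every partial input whether to respond, is the same.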

Do Universities, Research Institutions Hold the Key to Open Data's Next Chapter?
Government Technology (03/02/16) Ben Miller

The solution to an overabundance of unused government-produced data may lie in open-data alliances between government and research institutions, with the former offering raw data and the latter applying expertise and analytics. "We believe in feedback loops from city officials and citizens while developing a project," says the Massachusetts Institute of Technology's Carlo Ratti. "By working in the same city, these flows of information speed up, with great benefits for all the parties involved." The John D. and Catherine T. MacArthur Foundation recently helped launch the MetroLab Network, an organization of cities teaming with local universities and research institutes for smart city projects. One project under its auspices is a collaborative effort between the Urban Center for Computation and Data (CCD), the Argonne National Laboratory, and Chicago officials, for the purpose of applying data toward enhancing municipal knowledge and solving problems. Urban CCD is pursuing projects such as the Array of Things, which aims to build a metropolitan Internet of Things demo using sensors installed within Chicago's infrastructure that can take environmental measurements and recognize objects. The incentives of such collaboration for research institutes include students' exposure to practical experience. These partnerships often revolve around project evaluation and impact analysis, and such efforts have encouraged government officials to make more data open to researchers and citizens.

Log In to Your Phone With a Finger-Drawn Doodle Instead of a Password
Technology Review (03/03/16) Rachel Metz

Researchers at Rutgers and Aalto universities are exploring the usefulness of "free-form gesture authentication," in which finger-drawn figures traced on a touchscreen can prove a person's identity in place of a text password. After testing the method on Android smartphones with one group of people and comparing it to conventional text-password authentication with another group, the researchers say their technique is faster and as easy to remember as text passwords. They found people using gestures logged into accounts 22 percent faster, and spent 42 percent less time creating their passwords. The most frequent types of gesture passwords people invented were shapes, including squares, hearts, stars, and envelopes. The gesture-password cohort committed nearly twice as many errors in entering their passwords, but the researchers think such errors will decline as users gain practice. Rutgers professor Janne Lindqvist says gestures can offer better security than text passwords because they can be more randomized. The researchers will present a paper on their work in May at the ACM CHI 2016 conference in San Jose, CA.
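The article does not describe the researchers' recognizer in detail; one common way to compare a drawn gesture against an enrolled template is to resample both strokes to a fixed number of points and threshold the mean point-to-point distance. The sketch below takes that generic approach, with all names and the acceptance threshold invented for illustration:

```python
import math

def path_length(pts):
    return sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))

def resample(pts, n=32):
    # Resample a stroke to n points evenly spaced along its arc length,
    # so strokes drawn at different speeds become comparable.
    interval = path_length(pts) / (n - 1)
    pts = list(pts)
    out, acc, i = [pts[0]], 0.0, 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and acc + d >= interval:
            t = (interval - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # keep measuring from the interpolated point
            acc = 0.0
        else:
            acc += d
        i += 1
    out = out[:n]
    while len(out) < n:  # guard against floating-point shortfall
        out.append(pts[-1])
    return out

def gesture_distance(a, b, n=32):
    # Mean point-to-point distance between two resampled strokes.
    return sum(math.dist(p, q)
               for p, q in zip(resample(a, n), resample(b, n))) / n

# Accept a login attempt when the drawn gesture is close to the enrolled one.
template = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]             # a "square"
attempt = [(0.03, 0), (1, 0.02), (0.98, 1), (0, 0.97), (0, 0.04)]
print(gesture_distance(template, attempt))
```

A deployed scheme would also normalize for scale and position and calibrate the threshold per user; this sketch shows only the core matching step.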

Monkeys Navigate a Wheelchair With Their Thoughts
IEEE Spectrum (03/03/16) Prachi Patel

Monkeys can successfully navigate a robotic wheelchair by thought in real time, thanks to a wireless brain-machine interface (BMI) developed by Duke University researchers. Previous thought-controlled wheelchairs mainly relied on noninvasive electroencephalogram electrodes affixed to the subject's scalp, but those low-frequency signals lack sufficient information for continuous, real-time control, while implanted electrodes linked by wire to a computer have proved impractical. The Duke researchers decoded neural signals for whole-body movement through two-dimensional space and translated them into translational and rotational velocities for a wheelchair, so a monkey with the BMI implant could navigate the wheelchair toward a reward. The BMI features a 512-channel wireless interface that transmits signals to a computer. The researchers recorded the monkeys' neuronal signals as they perceived the wheelchair's trajectory, and these signals were used to train a decoder program. The animals then attempted to control the wheelchair by thought, and the decoder converted their brain signals into motor commands for the wheelchair. "This is the first wireless brain-machine interface for whole-body locomotion," says Duke professor and project leader Miguel Nicolelis. "Even severely disabled patients who cannot move any part of their body could be placed on a wheelchair and be able to use this device for mobility."
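The train-then-control loop described above can be sketched with a simple linear decoder fit by least squares; the channel counts, firing rates, and velocities below are synthetic, and this is an illustrative stand-in rather than the Duke team's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_channels = 500, 64

# Synthetic "recording" phase: neural firing rates observed while the
# wheelchair moves, paired with its (translational, rotational) velocities.
true_map = rng.normal(size=(n_channels, 2))  # hidden rates -> velocity map
rates = rng.poisson(5.0, size=(n_samples, n_channels)).astype(float)
velocities = rates @ true_map + rng.normal(0.0, 0.1, size=(n_samples, 2))

# Training phase: fit decoder weights to the recorded trajectories.
weights, *_ = np.linalg.lstsq(rates, velocities, rcond=None)

# Control phase: decode motor commands from fresh neural activity alone.
new_rates = rng.poisson(5.0, size=(1, n_channels)).astype(float)
v_translational, v_rotational = (new_rates @ weights)[0]
print(v_translational, v_rotational)
```

Real BMI decoders typically add temporal history and more sophisticated models, but the essential structure, supervised fitting on observed movement followed by open-ended decoding, is as shown.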

The Beginning of the End for Encryption Schemes?
MIT News (03/03/16) Jennifer Chu

Researchers at the Massachusetts Institute of Technology (MIT) and the University of Innsbruck have designed and built a quantum computer from five atoms in an ion trap. The computer uses laser pulses to carry out the quantum algorithm of Peter Shor, Morss Professor of Applied Mathematics at MIT, on each atom, to correctly factor the number 15. The system is designed in such a way that more atoms and lasers can be added to build a bigger and faster quantum computer, which would enable it to factor much larger numbers. The researchers say their computer represents the first scalable implementation of Shor's algorithm. "It might still cost an enormous amount of money to build...but now it's much more an engineering effort, and not a basic physics question," says MIT professor Isaac Chuang. Although it normally takes about 12 qubits to factor the number 15, the researchers have found a way to reduce that number to five qubits, each represented by a single atom that can simultaneously be held in a superposition of two different energy states. After using laser pulses to perform logic gates on four of the five atoms, the researchers can store, forward, extract, and recycle the results via the fifth atom, thereby carrying out Shor's algorithm in parallel. "In future generations, we foresee it being straightforwardly scalable, once the apparatus can trap more atoms and more laser beams can control the pulses," Chuang says.
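The classical scaffolding around Shor's algorithm, everything except the order-finding step, fits in a few lines. In the sketch below the order is found by brute force; that is precisely the step the quantum hardware replaces with fast period finding:

```python
from math import gcd

def order(a, n):
    # Multiplicative order r of a mod n: smallest r with a^r = 1 (mod n).
    # This is the step Shor's algorithm speeds up on a quantum computer;
    # here it is brute-forced classically for illustration.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    # Classical post-processing: for a coprime to n with even order r and
    # a^(r/2) != -1 (mod n), the gcds of a^(r/2) +/- 1 with n are
    # nontrivial factors of n.
    assert gcd(a, n) == 1
    r = order(a, n)
    half = pow(a, r // 2, n)
    if r % 2 != 0 or half == n - 1:
        return None  # unlucky base a; try another
    return gcd(half - 1, n), gcd(half + 1, n)

print(shor_factor(15, 7))  # base 7 has order 4 mod 15 -> (3, 5)
```

The security implication follows directly: RSA's hardness rests on factoring, and the only classically expensive piece of this procedure is `order`, which a large enough quantum computer could run efficiently.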

Aging Voting Machines Cost Local, State Governments
Stateline (03/02/16) Sarah Breitenbach

Many U.S. voters are still relying on outdated electronic voting machines at least 10 years old, which raises the specter of massive voter disenfranchisement in the event of breakdowns. However, cash-strapped state and local governments are wrestling with the high costs of replacing the systems with newer models, while some officials and lawmakers are concerned the new machines could be hacked to perpetrate voter fraud. States such as Maryland are opting for electronically scanning paper ballots, and although optical scanners are expensive, relatively few need to be procured as a single device can be used for each polling place. Meanwhile, Los Angeles County has set aside $70 million to create a voting system that uses touchscreens to cast ballots. Verified Voting president Pamela Smith estimates approximately 25 percent of U.S. voters will use electronic voting systems this year, versus 30 percent to 40 percent in years past. A report from the Brennan Center for Justice says officials in at least 31 states want to acquire new voting systems within five years, but at least 22 are uncertain of how to pay for it. The center estimates it would cost more than $1 billion to replace existing machines.

U. Researchers Recruited for Tor Project, an Online Platform for Anonymity
The Daily Princetonian (03/01/16) Kristin Qian; Marcia Brown

The nonprofit Tor Project recently enlisted three Princeton University researchers to assist in the organization's current focus on mitigating "malicious nodes," or instances in which one Tor network user violates the anonymity or confidentiality of another user. The Tor website says the platform anonymizes the identity and location of Internet users by rerouting messages and data through multiple layers. Princeton postdoctoral researcher Philipp Winter says a node is a computer operated by an unknown volunteer, and a Tor network user sends data to another Web server through several nodes, via their own browser. In the malicious node scenario, the volunteer can rig the system to extract sensitive data such as passwords to secure accounts and record high volumes of personal information. Winter notes one protective strategy is to systematically spot and block malicious nodes while keeping users' traffic encrypted. An analysis of such methods Winter co-authored is lauded by the Tor Project's Kate Krauss. "This particular research enables us to both understand the network better and detect certain attacks before they can do harm," she says.

New NSF Partnership With the Semiconductor Research Corporation on Energy-Efficient Computing
CCC Blog (03/01/16) Helen Wright

The Energy-Efficient Computing: from Devices to Architectures program recently announced by the U.S. National Science Foundation (NSF) seeks to support research that can pave the way for the next computing paradigm by maximizing the energy efficiency of future computer systems. The joint project of the NSF's Computer & Information Science & Engineering Directorate and Engineering Directorate with the Semiconductor Research Corp. aligns with interagency programs such as the National Strategic Computing Initiative and the Nanotechnology-Inspired Grand Challenge for Future Computing. The program synopsis specifies "evolutionary approaches to addressing [the] challenge [of energy consumption in the processing, storage, and transfer of data] are no longer adequate. Truly disruptive breakthroughs are now required, and not just from any one segment of the entire technology stack. Due to the complexity of the challenges, revolutionary new approaches are needed at each level in the hierarchy. Furthermore, simultaneous co-optimization across all levels is essential for the creation of new, sustainable computing platforms." Successfully identifying and deploying these new solutions will require "a comprehensive and collaborative approach," which includes coders, system architects, circuit designers, chip-processing engineers, material scientists, and computational chemists, according to the synopsis.

These Engineers Are Developing Artificially Intelligent Hackers
The Guardian (03/03/16) Olivia Solon

In August, seven teams will compete in the U.S. Defense Advanced Research Projects Agency's (DARPA) Cyber Grand Challenge, the goal of which is an artificially intelligent hacking machine that can autonomously spot and correct vulnerabilities in computer systems before criminals can exploit them. Each team has a mandate to develop software that can attack vulnerabilities in the other teams' systems while also finding and fixing weaknesses in its own software without compromising performance and functionality. "Fully automated hacking systems are the final frontier," says University of California, Santa Barbara professor Giovanni Vigna, whose team built a system to participate in the Cyber Grand Challenge. "Humans can find vulnerabilities but can't analyze millions of programs." Other researchers are working on robo-hackers outside of the DARPA contest, including BT Americas' Konstantinos Karagiannis. His project is a hacking system that employs neural networks to model the human brain's learning and problem-solving processes. "Using this approach a security scanner could identify intricate flaws using creative approaches you would have never thought of," Karagiannis says. "And it can be written with very modest hardware."

Big Bang Data: How 'Citizen Data Scientists' Will Help Astrophysicists Look Back to the Dawn of Time
ZDNet (03/02/16) Danny Palmer

University of Manchester professor Danielle George says the Square Kilometre Array (SKA) project will rely significantly on the assistance of "citizen data scientists" to analyze a massive volume of generated data. Described as the largest radio telescope of its kind in the world, the SKA is 50,000 times more sensitive than any other existing radio instrument. It will search for cosmic radio signals in the hope of unlocking the universe's origins. "The amount of data the SKA will produce is enormous; it's estimated the dishes alone will generate 10 times the global Internet traffic," George notes. She adds the SKA supercomputer will also need to perform one quintillion operations per second in order to process all the data from the telescopes. The project calls for real-time, automated data processing, and George says releasing this data to the public to help analyze it could be the solution. She believes physicists, technologists, engineers, and citizen data scientists must work together if the SKA project is to fully analyze the signals the telescope picks up. "Innovation is most likely to reside at the boundaries of all our disciplines and I think that's a pointer towards progress in this area; get experts together from different fields, give them challenging problems, and build up a culture of sharing those outcomes with everybody," George says.

World's Top Cryptographers on Encryption Backdoors: No Way
Network World (03/02/16) Tim Greene

A panel of leading cryptographers at this week's RSA Conference agreed inserting backdoors to unscramble encrypted communications is a threat to confidentiality, and Congress should act to balance this with law enforcement and national security needs. "The question is, where do you put the line?" asked Weizmann Institute of Technology professor Adi Shamir, co-inventor of the RSA algorithm. The panelists raised Apple's opposition to a court order to unlock the encryption of an iPhone used by one of the San Bernardino terrorists as a case in point. Shamir noted Apple made the mistake of claiming compliance was technically impossible when a loophole existed, and he said what it should have done was close the loophole immediately and upgrade to inhibit compliance. Meanwhile, former Twitter security head Moxie Marlinspike saw flaws in the U.S. Federal Bureau of Investigation's argument that allowing it a backdoor for electronic surveillance is socially beneficial. Moreover, he said if Apple loses the court battle, it could create more intrusive spying opportunities. Public key cryptography co-inventor Whitfield Diffie, who this week was named the co-recipient of the 2015 ACM A.M. Turing Award, predicted the outcome of the encryption debate will determine democracy's future. "We're in a new era of confrontation of humans and machines. It's the major issue of our age," Diffie said. "Who controls the machine is who will control the world."

The Most Important Object in Computer Graphics History Is This Teapot
Nautilus (02/29/16) Jesse Dunietz

A teapot has the distinction of being one of the most influential objects in the history of computer graphics, dating back to 1974, when computer scientist Martin Newell sought a digitized object to test algorithms for realistically rendering three-dimensional (3D) shapes while a student at the University of Utah. The teapot Newell used was ideal for his experiments thanks to its unique configuration, and for having such characteristics as being able to cast shadows on itself. After precisely sketching the teapot on a graph, Newell entered the coordinates on an early text and graphics computer terminal, and then his colleague Jim Blinn adjusted the object so it was a little flatter. The digital teapot's shape was sufficiently rudimentary to input and for computers to process. Moreover, its surface could maintain realism without overlaying an artificial pattern. The computer model of the teapot was released publicly so other researchers could use it as a testbed for 3D objects, and the computer graphics community has used it widely and liberally in the intervening decades. Today, the teapot is an embedded shape in many 3D graphics software packages employed for testing, benchmarking, and demonstration.

Abstract News © Copyright 2016 INFORMATION, INC.
Powered by Information, Inc.

To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe