Association for Computing Machinery
Welcome to the September 20, 2013 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets (click here) and for iPhones (click here) and iPads (click here).


Alan Turing’s Story Could Be Rebooted By Calls to Pardon Late Computer Legend
Washington Post (09/20/13) Anthony Faiola; Karla Adam

Computing pioneer Alan Turing was convicted of gross indecency for homosexuality after World War II and later committed suicide. However, more than 50 years after his death and following global strides in gay rights, a movement is cresting to reboot the record of the British mathematician's short but impactful life. Responding to a campaign backed by Nobel laureates, the British Parliament is moving toward granting Turing a posthumous pardon. However, opponents argue that a pardon could lead to an influx of petitions from families of other deceased convicts whose punishments in their day now seem barbaric. Turing's reputation preceded him in academic circles, even after the scandal of his conviction and subsequent death. His 1936 paper "On Computable Numbers" outlined the theory of a Universal Machine, a device that some scholars now call the conceptual predecessor of program-based computers. In the 1940s, Turing outlined what was arguably the first realizable design for a modern computer, and in 1950, he proposed an early concept of artificial intelligence. "As one of his colleagues once said, it was a very good thing that the government didn't know that Turing was a homosexual during the war, because if they found out, they would have sacked him and we would have lost," says Lord John Sharkey, sponsor of the Turing pardon in Britain's upper house.
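The Universal Machine Turing described in 1936 can be illustrated in a few lines of code. Below is a minimal sketch of a Turing machine simulator; the bit-flipping machine and its transition table are invented here purely for illustration.

```python
# A minimal Turing machine simulator. The example machine (invented here)
# flips every bit on the tape and halts at the first blank cell.
def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    pos = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, blank)
        write, move, state = rules[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Transition table: (state, symbol read) -> (symbol to write, move, next state)
flip_rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
```

Because the transition table is just data, one machine can simulate any other given its table as input, which is the essence of the Universal Machine idea.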

Improving the Big Data Toolkit
New York Times (09/17/13) Steve Lohr

Hadoop allows for relatively inexpensive data analysis, and the next generation, called Hadoop 2.0, will make that analysis possible across many thousands of computers. Hadoop 2.0 is "an important step," making the technology "a far more versatile data operating environment," says Gartner analyst Merv Adrian. Hadoop 2.0 can handle larger data sets faster than the earlier version, and it opens the door to analyzing data in real-time streams. The new version of Hadoop also is designed to work more easily with traditional database tools such as SQL. Many modern organizations are trying to find cost-cutting or sales-improving insights in sensor, Web, and social media data. "Everybody has the amount of data Yahoo and Google did five years ago," says Hadoop 2.0 release manager Arun Murthy. Hadoop uses a more permissive open source license than Linux, allowing organizations to mix additional features into Hadoop. A recent Gartner survey of 720 companies found that 64 percent are investing in or planning to invest in big data projects within the next two years, an increase from 58 percent last year.
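Hadoop's core programming model, MapReduce, underlies the kind of analysis the article describes. The sketch below runs the model's three phases (map, shuffle, reduce) in a single Python process to count words; a real Hadoop cluster distributes the same phases across thousands of machines.

```python
from collections import defaultdict

# The MapReduce model in one process: map emits (key, value) pairs,
# shuffle groups them by key, and reduce aggregates each group.
def map_phase(lines):
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {key: sum(values) for key, values in groups.items()}

counts = reduce_phase(shuffle(map_phase(["Big data", "big analysis"])))
```

Because map and reduce are independent per key, each phase can be split across machines, which is what makes the model scale.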

NSA Wants Even Closer Partnership With Tech Industry
Network World (09/19/13) Ellen Messmer

The way to realize confidence in cyberspace is to boost collaboration between the government and the high-tech industry, according to the U.S. National Security Agency's (NSA's) Debora Plunkett, who was speaking at the New York Institute of Technology Cyber Security Conference. The keynote address was intended to get hardware and software vendors to work in ever-closer partnership with the NSA. The government today is largely dependent on commercial hardware and software for which the NSA itself cannot provide indemnification, Plunkett says. However, the NSA documents recently leaked by former contractor Edward Snowden show that the agency views its partnership with industry in part as a way to subvert security in commercial products and services to facilitate cyber-spying. The future of network security will revolve around large-scale automated cyberdefense that reduces the need for manpower.

Teaching Computers to See--By Learning to See Like Computers
MIT News (09/19/13) Larry Hardesty

Massachusetts Institute of Technology (MIT) researchers at the Computer Science and Artificial Intelligence Laboratory have created a system that provides humans with the perspective of an object-recognition system, with the aim of improving the software. Most object-recognition systems rely primarily on machine learning, and succeed only about 30 to 40 percent of the time, although researchers have not been able to pinpoint the reasons for the high failure rate. The MIT system converts an ordinary image into the mathematical representation used by an object-recognition system and back into a conventional image using new algorithms. The team discovered that human volunteers make classification errors similar to those made by the software when presented with the retranslation of a translation, which suggests the feature selections and not the learning algorithms are to blame. The most commonly used feature set in computer-vision research is the histogram of oriented gradients (HOG), which breaks an image into square chunks and summarizes each chunk with a histogram of the directions of local intensity changes (gradients). The reconstruction algorithm with the most reliable results involves a dictionary consisting of a large group of HOGs with regular properties. The algorithm analyzes thousands of images from the Internet and selects the dictionary whose elements ("atoms") can reconstruct each image's HOG using the fewest atoms on average.
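A stripped-down version of the HOG feature set mentioned above can be written in a few lines. The sketch below splits an image into square cells and builds a magnitude-weighted histogram of gradient orientations per cell; real HOG implementations add block normalization and interpolation, omitted here.

```python
import numpy as np

# Toy HOG: describe each square cell of an image by a magnitude-weighted
# histogram of gradient orientations, folded into [0, 180) degrees.
def toy_hog(image, cell=8, bins=9):
    gy, gx = np.gradient(image.astype(float))
    magnitude = np.hypot(gx, gy)
    orientation = np.degrees(np.arctan2(gy, gx)) % 180.0
    h, w = image.shape
    features = []
    for y in range(0, h - cell + 1, cell):
        for x in range(0, w - cell + 1, cell):
            mag = magnitude[y:y + cell, x:x + cell]
            ori = orientation[y:y + cell, x:x + cell]
            hist, _ = np.histogram(ori, bins=bins, range=(0, 180), weights=mag)
            features.append(hist)
    return np.concatenate(features)

# A 16x16 ramp image (brightness increasing left to right) yields four
# 8x8 cells whose histograms all peak in the horizontal-gradient bin.
ramp = np.tile(np.arange(16), (16, 1))
descriptor = toy_hog(ramp)
```

Feeding such descriptors (instead of raw pixels) to a classifier is the standard pipeline the MIT work set out to probe.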

Security Researchers Create Undetectable Hardware Trojans
Computerworld (09/17/13) Jaikumar Vijayan

U.S. and European researchers have released a paper showing how integrated circuits used in computers, military equipment, and other critical systems can be compromised during the manufacturing process through almost undetectable changes at the transistor level. The paper describes how the method could be used to modify and weaken the hardware random number generator on processors and the encryption protections on a smartcard without anyone noticing the changes. This paper is the first to describe how cybercriminals can insert a hardware Trojan into a microchip without any additional circuitry, transistors, or other logic resources, according to Ruhr University's Christof Paar. Although previous research has described hardware Trojans consisting of small to medium-sized integrated circuits added to a chip's design at the hardware description language stage, the new research shows how a hardware Trojan can be introduced at a later stage of the design process by changing the doping on some of the chip's transistors. Switching the doping on a few transistors causes parts of the integrated circuit to stop working as intended. The approach the research describes is "undetectable by function testing and optical inspection," says security researcher Bruce Schneier.
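To see why a weakened random number generator can pass functional testing yet be cryptographically useless, consider the toy below. It illustrates the general class of attack, not the paper's dopant-level circuit: the output looks uniformly random, but only 16 bits of hidden state feed it, so an attacker can enumerate every possible output.

```python
import hashlib

# Illustration only (not the paper's circuit): a sabotaged generator whose
# output looks uniform -- hash whitening makes simple statistical tests
# pass -- but is derived from only 16 bits of hidden entropy, so an
# attacker can brute-force every candidate seed.
def sabotaged_rng(seed_16bit, n_bytes=16):
    return hashlib.sha256(seed_16bit.to_bytes(2, "big")).digest()[:n_bytes]

def attacker_recovers(observed):
    for guess in range(2 ** 16):          # only 65,536 candidates
        if sabotaged_rng(guess) == observed:
            return guess
    return None

key = sabotaged_rng(0xBEEF)               # victim's "random" session key
```

Recovering the seed takes a fraction of a second, while per-output statistical tests on the generator see nothing amiss, which is exactly the property Schneier's quote highlights.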

Intelligence Community Seeks Energy-Efficient Supercomputer
Federal Computer Week (09/17/13) Frank Konkel

The intelligence community wants to develop superconducting supercomputers that could be much faster and use much less energy than today's conventional supercomputers. This supercomputing alternative could end up being a viable option for the United States in its quest to break the exaflop barrier in supercomputing, according to the Office of the Director of National Intelligence. The Intelligence Advanced Research Projects Activity (IARPA) is currently seeking partners for its Cryogenic Computing Complexity program, which aims to demonstrate a small-scale computer based on superconducting logic and cryogenic memory that is energy-efficient, scalable, and able to solve interesting problems. "In the past, significant technical obstacles prevented serious exploration of superconducting computing, but recent innovations have created foundations for a major breakthrough," says the Office of the Director of National Intelligence's solicitation. "For example, the new single flux quantum logic circuits have no static power dissipation, and new energy-efficient cryogenic memory designs would allow operation of memory and logic in close proximity within the cold environment." In addition, superconducting systems would offer a significant reduction in power demand, potentially as low as 200 kilowatts for a 100-petaflop system, saving tens or hundreds of millions of dollars compared to traditional supercomputers.
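The solicitation's numbers imply a striking efficiency target, which a back-of-envelope calculation makes concrete. Note that the 2 GFLOPS/W figure used below for a conventional 2013-era system is our rough assumption for comparison, not a number from the solicitation.

```python
# Back-of-envelope check of the efficiency implied by the solicitation:
# 100 petaflops on 200 kilowatts. The conventional-system figure of
# 2 GFLOPS/W is an assumed round number, not from the solicitation.
superconducting_flops = 100e15   # 100 petaflops
superconducting_watts = 200e3    # 200 kilowatts
efficiency_gflops_per_watt = superconducting_flops / superconducting_watts / 1e9
conventional_gflops_per_watt = 2.0   # assumption for a 2013-era system
improvement = efficiency_gflops_per_watt / conventional_gflops_per_watt
```

The target works out to 500 GFLOPS/W, which under the stated assumption would be more than two orders of magnitude beyond conventional machines, hence the projected power-bill savings.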

iPad App Teaches Students Key Skill for Success in Math, Science, Engineering
UCSD News (CA) (09/16/13) Ioana Patringenaru

University of California, San Diego (UCSD) researchers have developed an iPad app designed to help students learn spatial visualization, which is important for doing well in science, math, and engineering. The app was inspired by Ohio State University professor Sheryl A. Sorby, who developed a course on spatial visualization that taught students to become better at visualizing objects in three dimensions. "With our iPad app, users can rotate 3D objects with their fingertip to learn how they look from different perspectives," says UCSD's Nate Delson. The app includes a sketch feature, which enables users to draw 3D shapes on a two-dimensional grid, and has an immediate feedback feature that tells users whether their sketches are correct. The researchers tested the app on 23 high school students enrolled in a summer residential science outreach program at UCSD. Eleven of the students struggled with spatial visualization at first, but after working with the app for about three weeks, more than half of that group reached a level where they no longer needed training. The researchers expect that initially colleges will adopt the app, but they believe it also could be used in K-12 classes, enabling students to gain spatial visualization skills at a younger age.
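The rotate-with-a-fingertip interaction the article describes comes down to multiplying an object's vertices by rotation matrices. A minimal sketch follows; the app's actual implementation is not described in the article.

```python
import numpy as np

# Rotating a 3D vertex with a rotation matrix -- the operation behind any
# view-from-another-perspective feature. Here, rotation about the z-axis.
def rotation_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

corner = np.array([1.0, 0.0, 0.0])        # one corner of a unit cube
rotated = rotation_z(np.pi / 2) @ corner  # quarter turn about the z-axis
```

Applying the same matrix to every vertex of a shape redraws it from the new viewpoint, which is what the fingertip gesture drives interactively.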

Large-Scale Pilot to Get a Read on Super Wi-Fi
Government Computer News (09/16/13)

The Gigabit Libraries Network (GLN) will conduct a large-scale pilot of Super Wi-Fi at U.S. public libraries in the months ahead. The participating library systems will deploy Super Wi-Fi access points on e-bookmobiles and at other publicly accessible locations. Super Wi-Fi potentially could be used to bring free wireless service to underserved, mostly rural areas. The U.S. Federal Communications Commission opened up TV white space, the unlicensed, low-frequency bands in the radio frequency spectrum that Super Wi-Fi uses, in 2010 after TV broadcasters switched from analog to all-digital signals. Although the lower frequency limits throughput, Super Wi-Fi expands the range to enable signals to go for several miles and pass through walls and buildings. In comparison, many libraries across the country have Wi-Fi access, but their short-range signals require people to be on the premises. Super Wi-Fi is not technically Wi-Fi because it does not conform to the set of IEEE 802.11 standards designated by the Wi-Fi Alliance as Wi-Fi, but it functions on the same interoperable principles. The pilot is very important for assessing Super Wi-Fi's ability to help bridge the digital divide, says GLN coordinator Don Means.
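The range advantage of low-frequency white-space signals follows from how path loss grows with frequency. The free-space model below is a simplification (it ignores the wall and terrain penetration that also favors TV-band frequencies), but it shows the effect.

```python
import math

# Free-space path loss: lower frequency means less loss over the same
# distance, which is one reason TV-band "white space" signals carry for
# miles. Simplified model; real-world propagation favors low frequencies
# even more strongly.
def fspl_db(distance_m, freq_hz):
    c = 299_792_458.0  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Loss saved at 1 km by using 600 MHz white space instead of 2.4 GHz Wi-Fi:
advantage_db = fspl_db(1000, 2.4e9) - fspl_db(1000, 600e6)
```

The roughly 12 dB advantage is distance-independent in this model: it comes entirely from the 4x frequency ratio (20·log10(4) dB).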

Massive Data Analysis Fraught With Challenges, Says National Research Council
FierceGovernmentIT (09/16/13) Molly Bernhart Walker

Massive data analysis is the focus of "Frontiers in Massive Data Analysis," a new book published by the U.S. National Research Council. Federal agencies with missions related to science and technology are spending more on research in this area with the aim of building capabilities, and the book outlines emerging challenges and opportunities for the government with regard to funding. The book's authors note that the tools, skills, and approaches used for smaller datasets will not translate to big data. "A major part of the challenge of massive data analysis is that of developing statistically well-founded procedures that provide control over errors in the setting of massive data, recognizing that these procedures are themselves computational procedures that consume resources," the authors say. The challenge of gleaning meaningful information from massive data is not isolated to one field, and the solution to the problem may come from different fields. The authors note that massive data analysis is an interdisciplinary enterprise, and both the public and private sectors face a shortage of big data experts. "The availability of benchmarks, repositories (of data and software), and computational infrastructure will be a necessity in training the next generation of data scientists," the authors say.

Cyber Security: The New Arms Race for a New Front Line
Christian Science Monitor (09/15/13) Anna Mulrine

As the U.S. government expands its cyberdefense capabilities to prepare for modern cyberwarfare, concerns are growing that these cybersecurity initiatives will come at the expense of privacy. In addition, legal questions abound regarding who should respond to cyberthreats and what actions should be taken. The Pentagon has built a cyberwar simulator called CyberCity that enables military officials to model attacks on financial networks, power grids, and water supply systems. Over the next two years, the military is adding between 900 and 4,900 cyberspecialists to its Cyber Command. Although the military is barred from participating in law enforcement endeavors, a new Department of Defense publication contends that the Pentagon can provide "law enforcement actions that are performed primarily for a military purpose, even when incidentally assisting civil authorities," including cyberattacks. In the future, military cyberspecialists will use their skills offensively as well as defensively to defend U.S. interests. Private companies also sell cybersecurity services, and some companies take matters into their own hands when victimized by cyberattacks. Although government officials contend that the cost of cyberattacks is significant, most attacks today are intellectual property theft and other corporate espionage, rather than the doomsday scenarios outlined by the military.

Google's Quest to End the Language Barrier
Der Spiegel (Germany) (09/13/13) Thomas Schulz

German scientist Franz Josef Och is leading Google's ambitious effort to build a universal translation tool. Progress has been made, with Google Translate currently capable of back-and-forth text translation between 71 languages. The Translate researchers also have devised an application that enables smartphones to function as speaking translators, which thus far accommodates about 24 languages. The app's performance depends on the simplicity of the sentences, and Och says drawbacks include the awkwardness of pushing buttons and inconsistent translation quality. The Translate team is notable in that it contains no linguists, which fits with Och's strategy. "I have trouble learning languages, and that's precisely the beauty of machine translation: the most important thing is to be good at math and statistics, and to be able to program," Och says. He notes that the Google Translate system correlates existing translations so that it can train itself to translate. "The algorithms sift through the Internet, collect data, and learn as they go." He says the translations tend to be better when there is a great deal of data and the languages are grammatically and structurally similar. Ultimately, Och says he wants to produce a translation machine that is so fast and inconspicuous "that you hardly notice it all, except as a whisper in your ear."
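The train-itself-from-existing-translations idea Och describes can be caricatured in a few lines: estimate word translations by counting co-occurrences across aligned sentence pairs. The tiny German-English corpus below is invented for illustration; production systems use vastly larger corpora and far richer statistical models.

```python
from collections import Counter

# Caricature of corpus-driven translation: count how often each source
# word co-occurs with each target word across aligned sentence pairs,
# then translate a word as its most frequent partner.
def train(pairs):
    cooc = Counter()
    for src, tgt in pairs:
        for s in src.split():
            for t in tgt.split():
                cooc[(s, t)] += 1
    return cooc

def translate_word(cooc, word):
    candidates = [(count, t) for (s, t), count in cooc.items() if s == word]
    return max(candidates)[1] if candidates else word

corpus = [("das haus", "the house"),
          ("das buch", "the book"),
          ("ein buch", "a book")]
model = train(corpus)
```

Even this toy shows why more data helps: with only one sentence pair, every source word co-occurs equally with every target word, and the counts cannot disambiguate.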

Rice Prof’s Sleuthing Helps ID Lost Van Gogh
Rice University (09/12/13) David Ruth; Mike Williams

Rice University professor Don H. Johnson performed a statistical analysis of X-ray images of the canvas behind the previously unknown "Sunset at Montmajour" to help identify the painting as an authentic work by Vincent Van Gogh. The researchers compared the canvas of the painting in question to "The Rocks," another Van Gogh painting at the Museum of Fine Arts, Houston (MFAH). "I pointed out the very close, but not exact, relationship of this painting's canvas to the canvas of the only Van Gogh in the MFAH," Johnson says. "Apparently, this pointed them in the direction of examining the Houston painting for a more detailed comparison." Johnson used a signal-processing algorithm that automatically analyzes the thread density in X-rayed canvases to reveal previously unavailable details about the materials. The software shows how loosely or tightly a canvas is woven, producing a map of weave-density variations that can be matched across different paintings. Although the weaving patterns revealed for "Sunset at Montmajour" and "The Rocks" do not line up perfectly, they were found to be from the same bolt of fabric.
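The idea behind the thread-counting analysis can be sketched with standard signal processing: a woven canvas produces a near-periodic intensity pattern in an X-ray, so the dominant spatial frequency of a scan line gives threads per centimeter. The synthetic example below illustrates the principle only and is not Johnson's actual algorithm.

```python
import numpy as np

# Estimate thread density from one X-ray scan line: remove the mean,
# take the FFT, and read off the dominant spatial frequency.
def thread_density(profile, sample_spacing_cm):
    spectrum = np.abs(np.fft.rfft(profile - profile.mean()))
    freqs = np.fft.rfftfreq(len(profile), d=sample_spacing_cm)
    return freqs[spectrum.argmax()]       # threads per centimeter

x = np.arange(1024) * 0.01                # samples every 0.01 cm
profile = np.sin(2 * np.pi * 13 * x)      # canvas woven at 13 threads/cm
estimate = thread_density(profile, 0.01)
```

Repeating this over small patches of the whole X-ray yields the weave-density map; because density varies along a bolt of fabric, matching maps is evidence two canvases were cut from the same bolt.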

The '50-50' Chip: Memory Device of the Future?
ScienceDaily (09/13/13)

A material built from aluminum and antimony shows promise for building next-generation phase-change memory devices, according to Xilin Zhou and colleagues at the Shanghai Institute of Microsystem and Information Technology at the Chinese Academy of Sciences. The environmentally friendly electronic alloy, consisting of equal numbers of aluminum and antimony atoms, is more thermally stable than the most popular material used for phase-change memory devices, compounds with germanium, antimony, and tellurium. The researchers note the alloy has three distinct levels of resistance, enabling a single memory cell to store more than one bit of data; this characteristic suggests that it could be used for multilevel data storage (MLS). Phase-change memory is being actively pursued as an alternative to flash memory for data-storage applications. A phase-change memory device can operate faster, squeeze more memory into tinier spaces, and would be relatively inexpensive. Such devices potentially could be the data-storage technology of the future. "The aluminum-antimony material looks promising for use in high-density nonvolatile memory applications because of its good thermal stability and MLS capacity," Zhou notes.
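The payoff of multiple resistance levels is easy to quantify: a cell with R distinguishable levels stores log2(R) bits, so three-level cells store more per cell than binary cells, and a pair of three-level cells offers enough combined states for three full bits. A quick check:

```python
import itertools
import math

# A cell with R distinguishable resistance levels stores log2(R) bits.
# Three-level cells therefore beat two-level (binary) cells, and a pair
# of three-level cells gives 9 combined states -- enough for 3 full bits.
def bits_per_cell(levels):
    return math.log2(levels)

def combined_states(cells, levels):
    return list(itertools.product(range(levels), repeat=cells))

binary = bits_per_cell(2)        # 1.0 bit per cell
three_level = bits_per_cell(3)   # about 1.58 bits per cell
pair = combined_states(2, 3)     # 9 states from two three-level cells
```

The same arithmetic is why commercial multilevel flash uses four or more levels per cell to reach whole numbers of bits.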

Abstract News © Copyright 2013 INFORMATION, INC.
Powered by Information, Inc.


Copyright © 2014, ACM, Inc.