Association for Computing Machinery
Welcome to the October 18, 2013 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets, and for iPhones and iPads.

HEADLINES AT A GLANCE


IBM Seeks to Marry Biological and Artificial Computing
CNet (10/17/13) Stephen Shankland

IBM aims to gain insights about biological and artificial computing by building next-generation systems that meld concepts from both worlds. For example, IBM is using the human brain as a model for new designs, such as a fluid-based system for cooling and distributing electricity throughout a computer, which could enable processing power to be packed within three-dimensional volumes rather than spread out flat in two dimensions. The system would address the inability of communication links between processing elements to keep pace with data-transfer demands, and their over-consumption of power. IBM also is contributing to Europe's Human Brain Project, which seeks to simulate the brain computationally to obtain new knowledge about cognitive disorders and other brain-related behaviors. Researchers will use supercomputers to replicate the formation of the brain, and then observe responses to input signals from simulated senses and nervous systems. The research efforts are part of IBM's emphasis on cognitive systems, which is exemplified by the company's shift of the metrics it uses to judge computing success from operations per second to operations per liter. "If we want to make an impact in the cognitive systems era, we need to understand how the brain works," says IBM Research's Matthias Kaiserswerth.


Graphics Chips Help Process Big Data Sets in Milliseconds
Technology Review (10/18/13) David Talbot

Harvard University graduate student Todd Mostak and Massachusetts Institute of Technology professor Samuel Madden have developed MapD, software that uses the graphics processors found in everyday computers to quickly process massive amounts of data. MapD achieves big speed gains by storing the data in the onboard memory of graphics processing units (GPUs), instead of in the memory attached to central processing units. A single high-performance GPU card can make data processing up to 70 times faster. MapD was used to create a database of Twitter visualizations that enables users to explore different search terms, extrapolate geographical trends, or zoom in on a specific tweet. "We have built a new kind of database system," Madden says. "It will answer and also map every request by scanning through every tweet in the database, which can be done in just a few milliseconds." MapD's ability to access data faster than existing technology will enable researchers to test hypotheses and refine visualizations more quickly. "By building a tool to explore data sets like this in a truly interactive fashion, with latencies measured in milliseconds rather than seconds or minutes, we hope to remove a computational bottleneck from the process of hypothesis formulation, testing, and refinement," Mostak says.
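The access pattern behind those speedups can be pictured with a minimal CPU-side sketch (the function and data layout here are illustrative assumptions, not MapD's actual API): a column of term identifiers stored contiguously can be filtered in a single streaming pass, and it is exactly this pattern that a GPU parallelizes across thousands of cores operating on its fast onboard memory.

```c
#include <stddef.h>

/* Illustrative columnar scan: count tweets whose term id matches a
   query. One contiguous pass over a flat array is the memory-friendly
   pattern that GPU databases parallelize; names are hypothetical. */
int count_matches(const int *term_ids, size_t n, int query_term) {
    int hits = 0;
    for (size_t i = 0; i < n; i++)   /* single streaming pass */
        if (term_ids[i] == query_term)
            hits++;
    return hits;
}
```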


A Symptom of a Larger Problem for Women in STEM
Associated Press (10/17/13) Michelle R. Smith

Brown University professor Anne Fausto-Sterling and university graduate Maia Weinstock recently organized a Wikipedia "edit-a-thon" with the goal of adding more information about female scientists. Part of the issue surrounding the lack of women in science, technology, engineering, and math (STEM) fields is the lack of recognition for women who have done pioneering work in those fields. Fausto-Sterling and Weinstock gathered dozens of students and some faculty members to train them on how to add and edit pages. They also provided lists of women to add, entries needing cleanup or more detail, and links to source material. "You're helping change what everybody else gets to see on a particular topic," Weinstock says. The researchers also hope to increase the number of women who contribute to Wikipedia, as fewer than 20 percent of Wikipedia editors are women. They held the event to coincide with Ada Lovelace Day, an annual observance started in Great Britain in 2009 to highlight women in technology.


Taking the Internet Underwater
University at Buffalo News (10/14/13) Cory Nealon

University at Buffalo researchers are developing a deep-sea Internet framework that could lead to improvements in tsunami detection, offshore oil and natural gas exploration, surveillance, pollution monitoring, and other activities. "A submerged wireless network will give us an unprecedented ability to collect and analyze data from our oceans in real time," says Buffalo professor Tommaso Melodia. "Making this information available to anyone with a smartphone or computer, especially when a tsunami or other type of disaster occurs, could help save lives." The network would transmit data from existing and planned underwater sensor networks to laptops, smartphones, and other wireless devices in real time. Melodia also says the network would encourage collaboration among researchers and potentially eliminate the duplicative deployments of sensors and other equipment. In addition, he says the wireless framework could be applied to the energy industry, which typically relies on seismic waves to search for underwater oil and natural gas. "We could even use it to monitor fish and marine mammals, and find out how to best protect them from shipping traffic and other dangers," Melodia says.


Internet Infrastructure Groups Move Away From U.S. Gov't Over Spying
IDG News Service (10/16/13) Grant Gross

After recent revelations about the U.S. National Security Agency's surveillance of Internet communications, the coordination of the Internet's technical infrastructure should move away from U.S. government oversight, according to 10 groups involved in the Internet's technical governance. The groups, which include the Internet Corporation for Assigned Names and Numbers (ICANN), the Internet Society, the Internet Engineering Task Force, and the World Wide Web Consortium, also said Internet organizations should accelerate the globalization of the Internet domain name functions performed by ICANN and traditionally overseen by the U.S. government. In a joint statement, the 10 groups "expressed strong concern over the undermining of the trust and confidence of Internet users globally due to recent revelations of pervasive monitoring and surveillance." Although China, Russia, and other countries have long sought to limit U.S. influence on Internet governance, controversy over the NSA's surveillance programs has brought the issue to the fore. "Controversy over the extent of the NSA's PRISM program, the very concept of cybersurveillance, and the reactions of stakeholders...are just the most recent developments highlighting the complexity and reach of these issues," says .au Domain Administration CEO Chris Disspain.


Dude, Where's My Code?
MIT News (10/16/13) Larry Hardesty

Massachusetts Institute of Technology (MIT) researchers have developed Stack, a system that automatically combs through programmers' code, identifying lines that compilers might discard even though they could be functional. When hundreds of developers work on an application with millions of lines of code that have been continually revised over long periods, some functions never end up being executed. This unexecuted code, known as dead code, can safely be removed. Problems arise, however, when compilers also remove code whose behavior the language specification leaves undefined. "It turns out that the C programming language has a lot of subtle corners to the language specification, and there are things that are undefined behavior that most programmers don't realize are undefined behavior," says MIT professor Frans Kaashoek. The researchers combed through the C language specification and tried to identify every possible undefined behavior that a programmer might inadvertently invoke. Stack compiles a program once while removing only dead code, and then a second time while removing both dead code and code that invokes undefined behavior. Stack then identifies all of the code that was cut the second time but not the first, and warns the programmer that it could pose problems.
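A classic instance of the pattern Stack is designed to catch looks something like the following sketch (an illustrative example of the general phenomenon, not Stack's actual output): because dereferencing a null pointer is undefined behavior in C, an optimizing compiler may assume the pointer is non-null after the dereference and silently delete the safety check that follows it.

```c
#include <stdlib.h>

/* Hypothetical example: the dereference on the first line lets the
   compiler assume p != NULL, so the null check below it may be
   optimized away entirely, even though the programmer clearly
   intended it as a guard. */
int read_field(int *p) {
    int value = *p;        /* undefined behavior if p == NULL */
    if (p == NULL)         /* compiler may discard this branch */
        exit(1);
    return value;
}
```

Moving the check above the dereference removes the undefined behavior, and with it the compiler's license to drop the guard.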


The Rapid Advance of Artificial Intelligence
The New York Times (10/14/13) John Markoff

The annual Computer Vision and Pattern Recognition conference is a showcase for the latest work in the realm of artificial intelligence (AI) by scientists, roboticists, and others, which could potentially transform the world within the next five years. Replacing humans with robots for hazardous jobs is a key focus of such research, and in December the U.S. Defense Advanced Research Projects Agency will hold the first of two events in a $2-million contest to construct a machine that could substitute for rescue workers in dangerous environments. Some contest entrants will use the humanoid Atlas robot as the basis for their innovations, while others will take non-humanoid forms. Also forthcoming are intelligent workplace machines that can monitor and even tactilely sense coworkers so as to avoid harming them. Experts say that rapidly advancing computer-vision technology is just one of several artificial intelligence-oriented technologies that are leading to a sea change in the technology industry beyond the personal computer and the Internet. "During the next decade we're going to see smarts put into everything," says the University of Washington's Ed Lazowska. "Smart homes, smart cars, smart health, smart robots, smart science, smart crowds, and smart computer-human interactions."


The Algorithms That Shape the World
HPC Wire (10/14/13) Tiffany Trader

Massachusetts Institute of Technology professor Kevin Slavin is studying how algorithms have evolved to become the dominant mechanism for financial services. In an interview, Slavin notes that algorithmic trading grew out of the need to move millions of shares through the financial markets. As a result, a method was developed to break up the transactions into millions of smaller transactions using algorithms. "The magic and the horror of that, is that the same math that is used to break up the big thing into a million little things can be used to find a million little things and sew them back together to figure out what's actually happening in the market," Slavin says. He notes that these kinds of algorithms also are being used to set prices in online marketplaces, such as eBay and Amazon, while Netflix is using algorithms to power its recommendations. "When you think about this, that we're running through the United States with dynamite and rock saws so that an algorithm can close the deal three microseconds faster, all for a communications framework that no human will ever know, that's a kind of manifest destiny--and we'll always look for a new frontier," Slavin says.


Coders Go the Distance for Charity
SD Times (10/10/13) Suzanne Kattau

More than 2,000 independent developers in 21 cities around the world recently participated in #hack4good, a 48-hour coding marathon to create applications that solve technical problems for charities. Geeklist, a social networking site for developers, hosted the event, and the winning app in each city was chosen based on its ability to address the issue for which it was developed. "But what was different about this event was that it was the first-ever, tandem, global hackathon where all the cities participated all at the same time--with the same start and end times in their respective time zones," says Geeklist CEO Reuben Katz. Several technology firms provided tools and other assistance to the participants, and the winning apps will benefit charities such as Amnesty International and Friends of the Earth. For example, one app enables pet rescue agencies to pinpoint where dogs and cats are found, record their characteristics, and develop a registry. Katz notes that people who participate in these events are not seeking financial rewards. "The reward that they get is that they've done something for social good that helps," he says. "And to maintain that sanctity of social good, we really try to focus in very heavily on running events that developers really get personal, long-term value out of."


Google Offers White Hats $3,000 for Open Source Project Security Patches
V3.co.uk (10/10/13) Alastair Stevenson

Google will pay bug hunters and security professionals up to $3,133 to improve the security of several open source projects. In announcing the extended Patch Rewards program, Google's Michal Zalewski says current bug bounty programs need to do more to improve the overall security of open source projects. Google is taking the model it pioneered with its Vulnerability Reward Program and applying it to improve the security of key third-party software critical to the health of the entire Internet, Zalewski notes. "So we decided to try something new: provide financial incentives for down-to-earth, proactive improvements that go beyond merely fixing a known security bug," he says. "Whether you want to switch to a more secure allocator, to add privilege separation, to clean up a bunch of sketchy calls to strcat, or even just to enable ASLR--we want to help!" Rewards will range from $500 to $3,133, and will be determined by a panel of Google judges. Current projects covered by the program include OpenSSH, BIND, ISC DHCP, libjpeg, libpng, giflib, Chromium, Blink, OpenSSL, zlib, and certain security-critical parts of the Linux kernel. Zalewski notes that Google intends to extend the program to include upgrades to Apache httpd, lighttpd, nginx, Sendmail, Postfix, Exim, GCC, binutils, llvm, and OpenVPN in the near future.
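The strcat cleanup Zalewski mentions can be illustrated with a generic before-and-after sketch (illustrative code, not taken from any covered project): an unbounded string concatenation that can overflow its destination buffer is replaced with a single bounded formatting call.

```c
#include <stdio.h>
#include <string.h>

/* The "sketchy" pattern: strcat writes past the end of dst if the
   combined inputs are longer than the buffer. */
void build_path_unsafe(char *dst, const char *dir, const char *file) {
    strcpy(dst, dir);
    strcat(dst, "/");
    strcat(dst, file);     /* overflows dst on long inputs */
}

/* The proactive fix: snprintf never writes more than dstlen bytes
   and always null-terminates the result. */
void build_path_safe(char dst[], size_t dstlen,
                     const char *dir, const char *file) {
    snprintf(dst, dstlen, "%s/%s", dir, file);  /* always bounded */
}
```

Systematic changes of this kind, rather than one-off bug fixes, are what the reward panel is meant to evaluate.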


Building Strong Research Infrastructures for the Future
CORDIS News (10/10/13)

The European Union (EU)-funded EuroRIs-Net+ project has made a significant contribution toward improving Europe's long-term research capabilities. The project focused on building on the existing Network of National Contact Points for the Research Infrastructures program (RIs NCPs), which provide guidance, practical information, and assistance on all aspects of participation in EU funding schemes. The project increased the number of services offered to the RI scientific communities, industry, and public stakeholders at the European and international levels. In addition, the RI Observatory, an interactive information infrastructure that facilitates access to integrated and updated RI-related information, boosts the dialogue between various RI stakeholders. As part of the EuroRIs-Net+ project, the Stakeholder Forum, which enables online discussions between members, has been upgraded with several new functionalities. The main goal of the research infrastructures part of the FP7 Capacities program is to optimize the use and development of the best research infrastructures existing in Europe. The program also aims to help create new research infrastructures of pan-European interest in all fields of science and technology.


The Future Fabric of Data Analysis
Quanta Magazine (10/09/13) Jennifer Ouellette

A new approach to computation is required for managing big data because of the shift to a decentralized, distributed computer architecture. "The challenge is not how to solve problems with a single, ultra-fast processor, but how to solve them with 100,000 slower processors," says Stanford University's Stephen Boyd. A new kind of computing fabric must accommodate growth in data sets' size and complexity that outpaces the expansion of computing resources, according to the California Institute of Technology's Harvey Newman. Boyd's consensus algorithms use a strategy in which a data set is split into bits and distributed across 1,000 agents that analyze their individual bit and generate a model based on the data they have processed, and all of the models must ultimately agree. The iterative process supports a feedback loop, in which initial consensus is shared with all agents, which update their models in view of the new information and reach a second consensus, and so on until all the agents are in agreement. Meanwhile, quantum computing might aid big data by searching large, unsorted data sets, and key to this advance is a quantum memory accessible in quantum superposition. Massachusetts Institute of Technology professor Seth Lloyd has conceived of a prototype system and accompanying app that he believes could uncover patterns within data while preserving superposition by not actually looking at any individual records.
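The iterative scheme described above can be sketched in a few lines (a minimal illustration of the consensus idea, not Boyd's actual algorithm; the blending weight and iteration count are arbitrary assumptions): each agent fits a trivial "model" on its own shard, here just the shard's mean, and then repeatedly blends its model with the current consensus until all agents agree.

```c
#define AGENTS 4
#define PER_AGENT 3

/* Toy consensus: with equal-sized shards, the agents converge on the
   global mean of the full data set without any agent seeing all of it. */
double consensus_mean(const double data[AGENTS][PER_AGENT]) {
    double model[AGENTS];
    for (int a = 0; a < AGENTS; a++) {          /* local fit per agent */
        double s = 0.0;
        for (int i = 0; i < PER_AGENT; i++)
            s += data[a][i];
        model[a] = s / PER_AGENT;
    }
    for (int iter = 0; iter < 50; iter++) {     /* iterate to agreement */
        double consensus = 0.0;
        for (int a = 0; a < AGENTS; a++)
            consensus += model[a];
        consensus /= AGENTS;                    /* shared with all agents */
        for (int a = 0; a < AGENTS; a++)        /* blend toward consensus */
            model[a] = 0.5 * model[a] + 0.5 * consensus;
    }
    return model[0];                            /* all agents now agree */
}
```

In a real deployment each agent would run on its own machine and the averaging step would be a communication round, but the feedback loop is the same.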


Abstract News © Copyright 2013 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe