Association for Computing Machinery
Welcome to the June 27, 2012 edition of ACM TechNews, providing timely information for IT professionals three times a week.

ACM TechNews mobile apps are available for Android phones and tablets, as well as for iPhones and iPads.


How Many Computers to Identify a Cat? 16,000
New York Times (06/25/12) John Markoff

Google researchers have created one of the largest neural networks for machine learning, consisting of 16,000 computer processors and more than one billion connections. The researchers fed the system 10 million digital images from YouTube for processing, and it performed much better than previous efforts by doubling its accuracy in recognizing objects from a list of 20,000 distinct items. The researchers also note the network taught itself to recognize cats. The software-based neural network closely mirrors theories developed by biologists that suggest individual neurons are trained inside the brain to detect significant objects. "The idea is that instead of having teams of researchers trying to find out how to find edges, you instead throw a ton of data at the algorithm and you let the data speak and have the software automatically learn from the data," says Google and Stanford University researcher Andrew Y. Ng. The system created a dream-like digital image of a cat by utilizing a hierarchy of memory locations to successively identify general features after being exposed to millions of images. The researchers say the neural network provides new evidence that current machine-learning algorithms improve as the systems are given access to larger pools of data.
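
The core idea Ng describes, letting unlabeled data reveal its own structure, can be illustrated on a far smaller scale with plain k-means clustering (a sketch only; Google's system is a multilayer neural network, and the data and cluster count below are invented):

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means: discover groups in unlabeled 2D points."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2
                                + (p[1] - centers[c][1]) ** 2)
            clusters[i].append(p)
        # Move each center to the mean of its assigned points.
        for i, cl in enumerate(clusters):
            if cl:
                centers[i] = (sum(p[0] for p in cl) / len(cl),
                              sum(p[1] for p in cl) / len(cl))
    return centers

# Two unlabeled blobs; nothing tells the algorithm where they are.
data = [(0.1, 0.2), (0.0, 0.0), (0.2, 0.1),
        (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
centers = sorted(kmeans(data, 2))  # converges near (0.1, 0.1) and (5.03, 5.0)
```

No hand-coded rules describe the two groups; the structure is learned from the data alone, the principle the Google experiment scales up by many orders of magnitude.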

Graph500 Adds New Measurement of Supercomputing Performance
Sandia National Laboratories (06/25/12) Sue Holmes

The Graph500 executive committee recently announced new specifications for a more representative way to rate large-scale data analytics in high-performance computing. Graph500 rates machines on their ability to solve complex problems with seemingly infinite numbers of components, rather than simply ranking machines on how fast they solve them. Graph500 executive committee member and Georgia Institute of Technology professor David A. Bader says the latest benchmark "highlights the importance of new systems that can find the proverbial needle in the haystack of data." The new specification will measure the shortest path between two entities, such as the smallest number of people connecting two random members of the LinkedIn network, says Sandia National Laboratories researcher Richard Murphy. Large data problems are especially important in cybersecurity, medical informatics, data enrichment, and social and symbolic networks. "A machine on the top of this list may analyze huge quantities of data to provide better and more personalized health-care decisions, improve weather and climate prediction, improve our cybersecurity, and better integrate our online social networks with our personal lives," Bader says.
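
The "smallest number of people between two random people" query Murphy describes is a shortest-path search. A toy breadth-first search (not the actual Graph500 benchmark kernel; the network below is invented) counts the hops between two members of a small network:

```python
from collections import deque

def degrees_of_separation(graph, start, goal):
    """Breadth-first search: fewest hops between two people, or None."""
    if start == goal:
        return 0
    seen = {start}
    frontier = deque([(start, 0)])
    while frontier:
        person, dist = frontier.popleft()
        for friend in graph.get(person, []):
            if friend == goal:
                return dist + 1
            if friend not in seen:
                seen.add(friend)
                frontier.append((friend, dist + 1))
    return None  # no connecting path

network = {
    "alice": ["bob"],
    "bob": ["alice", "carol"],
    "carol": ["bob", "dave"],
    "dave": ["carol"],
}
# degrees_of_separation(network, "alice", "dave") == 3
```

Graph500 runs this kind of traversal over graphs with billions of edges, which is why the benchmark stresses memory and communication rather than raw floating-point speed.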

Computer Graphics Pioneer Ivan Sutherland Wins Kyoto Prize
IDG News Service (06/25/12) Agam Shah

Portland State University scientist Ivan Sutherland was awarded the Kyoto Prize in Advanced Technology by the Inamori Foundation for his contributions to the field of computer graphics. "Numerous computer graphic-based applications--ranging from films, games, and virtual reality systems to educational materials, scientific and technological simulations, and other design aids for engineers--are descendants of Dr. Sutherland's original work on Sketchpad," the Inamori Foundation says. Sketchpad, which Sutherland submitted as part of his doctoral thesis at the Massachusetts Institute of Technology in 1963, enabled users to manipulate visible objects on a computer screen with a pointing device. The Sketchpad innovation has been linked to the commercialization of Microsoft's Windows and Apple's Macintosh user interfaces. Sutherland also has been recognized for his contributions to virtual reality and three-dimensional computer graphics. He was awarded ACM's A.M. Turing Award in 1988 for his contributions to the field of computer graphics. Sutherland's legacy also includes the work of former students and employees, such as Silicon Graphics founder James Clark, Pixar Animation Studios co-founder Edwin Catmull, and Adobe Systems co-founder John Warnock.

Bot With Boyish Personality Wins Biggest Turing Test
New Scientist (06/25/12) Celeste Biever

The chatbot Eugene Goostman fooled Turing test judges into thinking it was human 29 percent of the time, taking first place in the recent contest in the United Kingdom. The event marked the 100th anniversary of the birth of Alan Turing, who predicted that a machine that fooled humans 30 percent of the time would pass the test. Eugene creator Vladimir Veselov says the results were more statistically significant than those of previous Turing tests because the contest was the largest of its kind. Thirty human judges had more than 150 separate conversations via a text interface with 25 hidden humans and five software programs, and had to determine whether they were chatting with a human or a machine. Typical tests involve just four humans and four machines. Robby Garner's JFred took second place and Rollo Carpenter's Cleverbot took third. Several bots put sentences together by imitating people they had spoken to before or by searching through Twitter transcripts for conversation ideas, but Veselov gave Eugene the specific personality of a 13-year-old boy. "He has created very much a person where Cleverbot is everybody," Carpenter says.

Immigrants Are Crucial to Innovation, Study Says
New York Times (06/26/12) Andrew Martin

Immigrants played a role in more than three out of four patents at the top U.S. research universities, according to a study by the nonprofit Partnership for a New American Economy. Of these patents, nearly all were in science, technology, engineering, and math (STEM) fields that help spur job growth. The study sought to measure the potential costs of immigration policies by examining 1,469 patents from the 10 universities and university systems that obtained the most patents in 2011. The schools included the University of California system, Stanford University, and the Massachusetts Institute of Technology. The study's authors say patents are a barometer of a nation's level of innovation and a key way for the United States to maintain an edge in STEM fields. The study found that nine out of 10 patents at the University of Illinois system in 2011, for example, had at least one foreign-born inventor. Of those, 64 percent listed a foreign-born inventor who was not yet a professor but rather a student, researcher, or postdoctoral fellow, categories that are more likely to be subject to immigration restrictions. The study also found that although many leading foreign-born innovators are trained at U.S. universities, they frequently leave the country following graduation due to such obstacles.

An Online Encyclopedia That Writes Itself
Technology Review (06/26/12) David Talbot

The U.S. Defense Advanced Research Projects Agency (DARPA) recently collaborated with Raytheon BBN researchers to develop a system that can follow global news events and provide intelligence analysts with useful summaries in close to real time. The system gathers information from 40 news Web sites written in English, Chinese, and Arabic, and it will eventually include hundreds of news sites in all major languages, as well as links to an existing TV broadcast monitoring system. The BBN system captures everything that appears on news sites and constantly and automatically adds information, says BBN researcher Sean Colbath. The system starts by detecting a name or organization and then it identifies other entities that are connected to it, as well as statements made by and about the subject. "Here the machine has learned, by being given examples, how to put these relationships together and fill in those slots for you," Colbath says. The technology incorporates recent improvements in machine learning, enabling it to do a better job of understanding when the same underlying event is described in different ways, says DARPA's Bonnie Dorr.
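
The slot-filling idea Colbath describes can be caricatured with a single hand-written pattern (the names and quotes below are invented; the real BBN system learns its extractors from examples rather than relying on regular expressions):

```python
import re

def extract_statements(text, entities):
    """Toy slot-filler: collect 'X said, "..."' quotes for known entities."""
    statements = {}
    for name in entities:
        # Matches e.g.: Lee said, "We will proceed."
        pattern = re.compile(re.escape(name) + r'\s+said,?\s+"([^"]+)"')
        statements[name] = pattern.findall(text)
    return statements

article = ('Mayor Lee said, "The budget is balanced." Critics disagreed. '
           'Lee said, "We will proceed."')
quotes = extract_statements(article, ["Lee"])
# quotes["Lee"] == ["The budget is balanced.", "We will proceed."]
```

Learned extractors generalize where a fixed pattern would miss paraphrases such as "according to Lee," which is the advance the machine-learning approach brings.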

What if There Were No More Disasters?
CCC Blog (06/25/12) Robin Murphy; Erwin Gianchandani

Texas A&M University professor Robin Murphy recently released "Computing for Disasters: A Report from the Community Workshop," which details the role of computing in disaster management, including preparedness, prevention, response, and recovery. Murphy, along with Trevor Darrell, recently co-chaired the Workshop on Computing for Disaster Management, which was jointly sponsored by the U.S. National Science Foundation and the Computing Community Consortium. The workshop, which brought together 45 participants, aimed to formalize what the participants were individually seeing that made computing research for disasters unique. The participants found that disasters are more than an application area; conducting research with far-reaching impacts requires a significant understanding of the larger socio-technical system. The report concludes that making disasters and emergency situations obsolete will require a robust, multidisciplinary community in which researchers partner with practitioners on fundamental new research into socio-technical systems that support decision making at extreme scales and under extreme conditions.

Sifting Through a Trillion Electrons
Lawrence Berkeley National Laboratory (06/26/12) Linda Vu

Researchers at Lawrence Berkeley National Laboratory, the University of California, San Diego (UCSD), Los Alamos National Laboratory, Tsinghua University, and Brown University have developed software strategies for storing, mining, and analyzing huge datasets, focusing on data generated by VPIC, a state-of-the-art plasma physics code. When the researchers ran VPIC on the U.S. Department of Energy's National Energy Research Scientific Computing Center's supercomputer, they generated a three-dimensional (3D) magnetic reconnection dataset of one trillion particles. VPIC simulated the process in thousands of time-steps, periodically writing a 32-terabyte file to disk at specified times. The researchers applied an enhanced version of the FastQuery tool to index the massive dataset in about 10 minutes. "This is the first time anyone has ever queried and visualized 3D particle datasets of this size," says UCSD researcher Homa Karimabadi. VPIC accurately models the complexities of magnetic reconnection by breaking the process down into distinct pieces, each of which is assigned, using the message passing interface, to a group of processors to compute, according to Lawrence Berkeley's Surendra Byna. By dividing up the work, the researchers can simultaneously use thousands of processors to simulate the complex phenomenon of magnetic reconnection.
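
The benefit of indexing before querying can be shown in miniature. FastQuery builds bitmap indexes over massive parallel datasets; the stand-in below (an illustration only, with made-up particle energies) merely sorts once so that later threshold queries become binary searches instead of full scans:

```python
import bisect

def build_index(values):
    """Sort once up front; a crude stand-in for a bitmap index."""
    order = sorted(range(len(values)), key=values.__getitem__)
    sorted_vals = [values[i] for i in order]
    return order, sorted_vals

def query_above(index, threshold):
    """Return the ids of all records with value > threshold."""
    order, sorted_vals = index
    pos = bisect.bisect_right(sorted_vals, threshold)
    return order[pos:]

# Hypothetical per-particle energies, indexed by particle id.
energies = [0.5, 9.1, 3.3, 7.7, 0.2]
idx = build_index(energies)
hot = sorted(query_above(idx, 5.0))  # particle ids 1 and 3
```

At a trillion particles, avoiding repeated full scans of 32-terabyte files is the point of building the index up front.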

'Twisted Light' Carries 2.5 Terabits of Data Per Second
BBC News (06/25/12)

Researchers at the University of Southern California (USC), Tel Aviv University, and the U.S. National Aeronautics and Space Administration's Jet Propulsion Laboratory have developed a method for dramatically increasing the amount of data a beam of light can carry. The researchers used orbital angular momentum (OAM) to create light beams that can carry 2.5 terabits of data per second. OAM, which only recently has been recognized as a viable means of transmitting data, is used to create light waves with different amounts of twist, like screws with different numbers of threads. The researchers prepared two sets of four light beams, each with a set level of OAM twist and each carrying its own data stream. At the receiving end, the process is undone and the single beam is unpacked to yield its eight constituent beams, which together carry about 2.5 terabits of data per second. "For situations that require high capacity... over relatively short distances of less than 1km, this approach could be appealing," says USC professor Alan Willner. "Of course, there are also opportunities for long-distance satellite-to-satellite communications in space, where turbulence is not an issue."
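
The underlying principle, independent data streams carried on mutually orthogonal twist modes, can be sketched numerically (a schematic of mode multiplexing only, not the actual optics; the sample count and symbols below are invented):

```python
import cmath

N = 16  # angular sample points around the beam axis

def mode(l):
    """Sampled helical phase profile exp(i*l*theta): a beam with l twists."""
    return [cmath.exp(2j * cmath.pi * l * k / N) for k in range(N)]

def multiplex(symbols):
    """Superpose one complex symbol per twist number into a single beam."""
    return [sum(s * mode(l)[k] for l, s in symbols.items()) for k in range(N)]

def demultiplex(beam, l):
    """Project onto mode l; orthogonality isolates that stream's symbol."""
    m = mode(l)
    return sum(beam[k] * m[k].conjugate() for k in range(N)) / N

tx = {1: 1 + 0j, 2: -1 + 0j, 3: 0 + 1j}  # three streams on twists 1..3
rx = demultiplex(multiplex(tx), 2)       # recovers the twist-2 symbol, -1+0j
```

Because distinct twist numbers are orthogonal, each stream is recovered without interference from the others, which is what lets eight twisted beams share a single physical channel.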

You Are Where You E-Mail: Global Migration Trends Discovered in Email Data
Max Planck Gessellschaft (06/25/12) Silvia Leek

Researchers at the Max Planck Institute for Demographic Research (MPIDR) and Yahoo! Research recently completed a large migration database based on the global flow of millions of emails. MPIDR's Emilio Zagheni and Yahoo! Research's Ingmar Weber traced 43 million anonymous Yahoo! users' emails from September 2009 through June 2011 in order to infer the residence of the sender. In addition to the date and geographic location of each message, the researchers incorporated the self-reported age and gender of the sender. When a user began sending emails from a new location and continued to do so, the researchers assumed the user had changed residence, which allowed them to determine rates of migration to and from almost every country in the world. In the United States, the researchers were able to produce a chart of emigration by age and gender. The data is supported by the vast number of emails available, as well as by a mathematical model that adjusts for typical shortcomings of email statistics, such as the fact that older generations tend to use email less or not at all, and thus are underrepresented. "This research has the most potential in developing countries, where the Internet spreads much faster than registration programs develop," Zagheni says.
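
The residence-change inference can be sketched with a hypothetical rule: treat a user as having moved once messages arrive from a new country for several consecutive sends (the actual MPIDR/Yahoo! model and thresholds are more sophisticated; the run length and message sequence below are invented):

```python
def infer_residence(countries, min_run=3):
    """Mark a change of residence after `min_run` consecutive messages
    from the same new country; return (message_index, country) changes."""
    changes = []
    residence = None
    run_country, run_len = None, 0
    for i, country in enumerate(countries):
        if country == run_country:
            run_len += 1
        else:
            run_country, run_len = country, 1
        if run_len >= min_run and country != residence:
            residence = country
            changes.append((i - min_run + 1, country))
    return changes

# Per-message sender locations: settled in DE, a brief trip to the US,
# then a lasting move to the US.
msgs = ["DE", "DE", "DE", "DE", "US", "DE", "US", "US", "US", "US"]
moves = infer_residence(msgs)  # [(0, "DE"), (6, "US")]
```

Note that the single "US" message at index 4 is ignored as travel; only the sustained run starting at index 6 counts as a migration.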

Computer Science Tackles 30-Year-Old Economics Problem
MIT News (06/25/12) Larry Hardesty

Massachusetts Institute of Technology (MIT) researchers have developed an algorithm for finding an almost perfect approximation of the optimal design of a multi-item auction. The difficulty of finding the optimal multi-item auction suggests that the space of possible designs is so vast that no simple description captures the optimal outcome, says MIT professor Constantinos Daskalakis. To maximize revenue, the auctioneer might have to agree to sell an item at some fraction of the highest bid, a fraction that could depend on several factors, including the difference between the top two bids, the final price of the previous item on the docket, and the populations from which the bidders are drawn. The researchers demonstrated that the optimal auction can be described as a probabilistic combination of many simple auctions, known as Vickrey-Clarke-Groves (VCG) auctions. "The crucial thing is understanding what deterministic algorithms are in this decomposition," Daskalakis says. The researchers studied a specific type of VCG auction in which the bids are modified according to the populations from which the bidders are drawn. They found that awarding an item to the lower bidder, as often happens with modified bids, can increase the auctioneer's revenue.
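
The building block here, the Vickrey (second-price) auction, is simple to state in code. The sketch below covers only the classic single-item case with invented bidders and bids, not the researchers' randomized combinations of modified VCG auctions:

```python
def vickrey(bids):
    """Second-price sealed-bid auction: the highest bidder wins
    but pays the second-highest bid, making truthful bidding optimal."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1]
    return winner, price

winner, price = vickrey({"ann": 120, "bo": 95, "cy": 110})
# winner == "ann", price == 110
```

Charging the second-highest bid removes any incentive to shade one's bid, which is why VCG-style mechanisms make natural building blocks for the decomposition described above.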

Researchers Create the First GPS for the Blind
UAB Barcelona (06/21/12)

Autonomous University of Barcelona researchers have developed OnTheBus, an Android application for traveling around big cities. They say its universal design principles make it appropriate for people with visual, hearing, or cognitive impairments. OnTheBus uses the phone's compass and accelerometer, along with global positioning system, voice-recognition, and 3G or Wi-Fi technologies. The app provides a list of optimal routes; the user makes a choice, and the app then guides the user to the nearest bus stop and indicates the time remaining until the bus arrives. Once on the bus, the app informs the user of the number of stops, signals when it is time to press the button to exit, and continues to guide the user to the destination. The researchers have made the app available on Google Play, and it can be used in Barcelona, Madrid, and Rome. They also plan to make the app available in other markets and to expand it beyond the current Spanish, Catalan, English, and Italian versions. The team also wants to incorporate other transportation services such as taxis, use augmented reality techniques to locate stop signs and public transport stops, and integrate social networks.

North American Software Developers Getting Younger: Study
eWeek (06/21/12) Darryl K. Taft

Thirty-eight is now the median age of software developers in North America, a significant decline from a peak of 46 in 2008, according to a recent Evans Data study. North American developers also are statistically younger than their counterparts in the Europe, Middle East, and Africa region, which has a median age of 39. However, North American developers are older than those in the Asia-Pacific region or Latin America, where the median ages are 34 and 35, respectively. "The median age decline in North America is interesting and most likely reflects two situations we've been experiencing since 2008," says Evans Data CEO Janel Garvin. "The advent of mobile with new devices and distribution channels has attracted younger developers, while at the same time, the recession has resulted in many older developers retiring or being laid off." The mean age in North America remains relatively high at 44, however, suggesting that a cohort of much older developers remains active in programming. In other regions, the median and mean ages have risen steadily in recent years.
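
The gap between a median of 38 and a mean of 44 is the signature of a right-skewed distribution. A toy example with entirely made-up numbers shows how a small cohort of much older developers pulls the mean above the median:

```python
import statistics

# Hypothetical ages: mostly younger developers plus a few much older ones.
ages = [25] * 30 + [35] * 40 + [45] * 20 + [75] * 10
median_age = statistics.median(ages)
mean_age = statistics.mean(ages)
# The mean exceeds the median: 38 vs. 35.
```

Here ten developers aged 75 raise the mean three years above the median while leaving the median untouched, the same pattern Evans Data reports for North America.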

Abstract News © Copyright 2012 INFORMATION, INC.


Copyright © 2014, ACM, Inc.