Association for Computing Machinery
Welcome to the April 10, 2013 edition of ACM TechNews, providing timely information for IT professionals three times a week.


HEADLINES AT A GLANCE


Laureates Urge No Cuts to Budgets for Research
New York Times (04/09/13) William J. Broad

More than 50 Nobel laureates are urging Congress to save the federal research establishment from budget cuts, which they say could endanger the innovation engine that is essential to the U.S. economy. "We urge you to keep the budgets of the agencies that support science at a level that will keep the pipelines full," the laureates say in a letter to Congress. The group's concern is mostly for young scientists who might be poised to produce the breakthroughs of tomorrow, but who are not at the top of the list for federal funding. From 2009 to 2012, the federal financing of research and development (R&D) dropped 18 percent from $172.5 billion to $140.6 billion, according to the American Association for the Advancement of Science. This year's budget cuts would lower the R&D budget another 6 percent, to about $131 billion, the association adds. The letter notes that R&D spending has previously received bipartisan support. "The agreement exists, because of recognition that this sort of research fuels the innovation engine that is essential to our economy," the letter says.


Supercomputers Could Generate Warning System for Stock Market Crashes
San Jose Mercury News (04/09/13) Lisa M. Krieger

Lawrence Berkeley National Laboratory's Edison supercomputer can track ultra-fast trading across U.S. financial markets, detecting precursors to a crash. "If improved monitoring and regulation can build some greater trust in the market, everyone benefits," says David H. Bailey, director of Berkeley Lab's Center for Innovative Financial Technology. When fully deployed later this year, Edison will be able to perform up to two quadrillion operations a second, fast enough to track every trade, in real time, on every U.S. stock exchange. In 2011, using Berkeley Lab's Hopper supercomputer, researcher David Leinweber found that a supercomputer could use a recently identified measure to warn of a looming flash crash. The measure, called Volume-Synchronized Probability of Informed Trading (VPIN), detects an imbalance between buy and sell orders, and growing volatility, about 45 minutes before a crash. A second measure of market instability, the Herfindahl-Hirschman Index, also rose sharply for some stocks, although not for others. These two signals of instability might be useful "for regulators to impose some rules that might slow down the market so we don't get into a sort of feeding frenzy," says Berkeley Lab researcher John Wu.
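
The VPIN construction itself is simple enough to sketch. The Python below is a minimal, illustrative approximation, not the researchers' implementation; the trade feed format, bucket size, and window are assumptions. Trades are grouped into equal-volume buckets, and the signal is the average absolute buy/sell imbalance over recent buckets.

```python
# Minimal VPIN-style order-flow imbalance sketch (illustrative only).
# Assumption: trades arrive as (volume, side) pairs, side = +1 for
# buyer-initiated, -1 for seller-initiated.

from collections import deque

def vpin(trades, bucket_volume=1000, window=50):
    """Mean |buy - sell| imbalance over the last `window` volume buckets."""
    buckets = deque(maxlen=window)  # one imbalance value per full bucket
    buy = sell = filled = 0
    for volume, side in trades:
        while volume > 0:
            take = min(volume, bucket_volume - filled)
            if side > 0:
                buy += take
            else:
                sell += take
            filled += take
            volume -= take
            if filled == bucket_volume:  # bucket complete
                buckets.append(abs(buy - sell) / bucket_volume)
                buy = sell = filled = 0
    return sum(buckets) / len(buckets) if buckets else 0.0

print(vpin([(600, +1), (500, +1), (900, -1)]))  # -> 0.9
```

A reading drifting toward 1.0 indicates persistently one-sided ("toxic") order flow, the condition the measure flags before a flash crash.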


UC San Diego Computer Scientists Develop First-person Player Video Game That Teaches How to Program in Java
UCSD News (CA) (04/08/13) Ioana Patringenaru

University of California, San Diego (UCSD) researchers have developed CodeSpells, an immersive first-person video game designed to teach students how to program in Java. The researchers tested the game on a group of 40 girls, ages 10 to 12, who had never been exposed to programming before. The researchers found that within just one hour of play, the girls had mastered some of Java's basic components and were able to use the language to create new ways of playing with the game. "CodeSpells is the only video game that completely immerses programming into the game play," says UCSD professor William Griswold. The researchers plan to release the game for free and make it available to any educational institution that requests it. The researchers designed the game to keep children engaged while dealing with the difficulties and frustrations of programming. "We're hoping that they will get as addicted to learning programming as they get addicted to video games," says UCSD graduate student Stephen Foster. The game is based on research Foster and fellow UCSD graduate student Sarah Esper conducted on what makes programmers successful; their survey of 30 computer scientists identified five keys to learning programming outside of a classroom.


Should Universities Offer Cobol Classes?
Computerworld (04/08/13) Patrick Thibodeau

There are billions of lines of Cobol code still in use at large businesses and in government agencies, and many experts say that will be the case for years. However, most universities have dropped Cobol from their curricula. Syracuse University professor David Dischiave wants students to emerge from college as critical thinkers with practical skills, including experience with Cobol. "Employers are knocking on our door trying to hire as many [Cobol-trained students] as they can," Dischiave says. A recent Micro Focus survey of 119 universities found that 73 percent do not offer Cobol programming as part of their curriculum, 18 percent have Cobol as a core part of the curriculum, and 9 percent offer Cobol as an elective. The survey also found that 71 percent of respondents believe businesses will continue to rely on Cobol-based applications for at least the next 10 years. Carnegie Mellon University professor Ray Scott sees value in familiarity with Cobol, noting that it helps students understand, for example, how a legacy backend system puts out a payroll. "They really don't see that, so much of what they do is Web interface," Scott says.


Technique Finds Software Bugs in Surgical Robots and Helps Developers Fix Flaws, Ensure Safety
Carnegie Mellon News (PA) (04/08/13) Byron Spice

Researchers at Carnegie Mellon and Johns Hopkins universities have demonstrated that methods for reliably detecting software bugs and verifying software safety can be applied to surgical robots. The researchers used theorem-proving techniques to analyze a control algorithm for a research robot that would help a doctor perform surgery at the base of the skull. "These techniques are going to change how people build robotic surgery systems," says Johns Hopkins professor Yanni Kouskoulas. The researchers developed an approach based on differential dynamic logic and an associated tool called KeYmaeraD, which can model a hybrid system and its safety properties and then either verify that a design is safe or generate counter-examples showing how the system can fail. The researchers applied their approach to the control algorithm for the skull-base surgery robot, which is designed so that as the surgical tool approaches the boundary of the safe region, the robot exerts force feedback to warn the surgeon, and if the tool reaches the boundary, the robot stops it from going farther. "This study shows that formal verification methods can be applied successfully to medical robotics and that further development is warranted," says Carnegie Mellon professor Andre Platzer.
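
To give a flavor of what such verification looks like, here is a simplified safety statement in the style of differential dynamic logic; it is a hypothetical illustration, not the actual model from the study. It says that if the tool's speed v leaves enough stopping distance before a boundary at position b, then under maximal braking a_max the tool's position x never crosses b:

```latex
% Hypothetical dL-style safety theorem (illustration, not the paper's model)
v^2 \le 2\,a_{\max}\,(b - x)
  \;\rightarrow\;
  \left[\,\{\,x' = v,\; v' = -a_{\max} \;\&\; v \ge 0\,\}\,\right]\,(x \le b)
```

Tools such as KeYmaeraD prove statements of this form by finding an invariant that holds along the continuous dynamics (here, the stopping-distance inequality itself is preserved, since its time derivative is zero), rather than by testing individual trajectories.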


Tech Titans Plot to Reprogram Internet of the Future
Wired News (04/08/13) Cade Metz

The Internet and corporate networks still largely depend on switches, routers, and other networking hardware, but these components run on aging software that makes it nearly impossible for the gear to do anything new. Google hopes to change this by designing its own networking hardware and running its own software on it, an approach known as software-defined networking (SDN) that gives buyers more leeway in choosing hardware. Many tech firms are now collaborating on an open source SDN project known as OpenDaylight. Its backers include Cisco, Juniper, Hewlett-Packard, IBM, and Microsoft, which hope to create software for building networks that are more flexible than conventional ones. Ideally, networks will be able to mix and match software and hardware from multiple sources. "With mobile and social and cloud computing, networks are growing to unprecedented sizes, and we've been forced to find new ways of building them," says Wiretap Ventures' Matthew Palmer. OpenDaylight is comparable to an existing project called Floodlight, overseen by Big Switch Networks, which has also joined the OpenDaylight initiative.
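
The core abstraction behind SDN is a programmable match-action flow table: a logically centralized controller installs rules, and switches merely apply them. The sketch below is hypothetical Python illustrating that idea; it is not the OpenDaylight API, and the rule and packet formats are invented.

```python
# Hypothetical sketch of SDN's match-action abstraction (illustrative
# Python, not the OpenDaylight API): a controller installs rules, and
# the switch applies the first rule whose fields all match a packet.

flow_table = []  # (match_dict, action) pairs in priority order

def install_rule(match, action):
    """Controller-side: push a forwarding rule down to the switch."""
    flow_table.append((match, action))

def handle_packet(packet):
    """Switch-side: return the action of the first matching rule."""
    for match, action in flow_table:
        if all(packet.get(field) == value for field, value in match.items()):
            return action
    return "send_to_controller"  # unmatched traffic goes to the controller

install_rule({"proto": "ssh"}, "drop")
install_rule({"dst_ip": "10.0.0.5"}, "forward:port2")

print(handle_packet({"dst_ip": "10.0.0.5", "proto": "http"}))  # forward:port2
print(handle_packet({"dst_ip": "10.0.0.9", "proto": "dns"}))   # send_to_controller
```

Because the rules live in software rather than in each vendor's firmware, the same controller logic can drive hardware from multiple sources, which is the flexibility the OpenDaylight backers are after.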


Radical Roads Drive Robot Cars
BBC News (04/08/13) Jon Stewart

Los Angeles has completed its Automated Traffic Surveillance and Control system, which synchronizes the city's 4,400 traffic lights using roadway cameras and sensors to measure traffic and a centralized computer system that adjusts signal timing to maintain flow. The $400-million system, designed to enable drivers to cross the city without stopping, is expected to boost travel speeds by 16 percent and reduce trip times by 12 percent. Although the system is currently cutting-edge, it could become obsolete as engineers redesign streets to make way for autonomous vehicles. Driverless cars are projected to line up close together and travel at speeds faster than currently permissible, maneuvering through unmarked intersections without traffic lights or lane markings. Autonomous cars will represent 75 percent of all vehicles on U.S. roads by 2040, according to the Institute of Electrical and Electronics Engineers. Vehicles such as Google's self-driving car are emerging, and some cities are experimenting with linked-up roads. Such cars could potentially operate on intelligent road networks without congestion or collisions, sharing their intended routes and current positions through a vehicle-to-infrastructure communication system; the Vehicle Infrastructure Integration Consortium is working toward this in the United States.
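
The basic arithmetic of signal synchronization is straightforward: offsetting each light's green phase by the travel time from the previous intersection produces a "green wave" at the target speed. The toy Python below uses assumed distances and speed; the article does not describe LA's actual algorithms.

```python
# Toy "green wave" offset timing (assumed corridor speed and spacing;
# illustrative only). Each light's green phase starts after the travel
# time from the start of the corridor at the target speed.

speed_mps = 13.4                        # ~30 mph target speed (assumption)
intersections_m = [0, 400, 650, 1100]   # distances along the corridor

offsets_s = [round(d / speed_mps, 1) for d in intersections_m]
print(offsets_s)  # [0.0, 29.9, 48.5, 82.1] seconds after the first green
```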


The Potential and the Risks of Data Science
New York Times (04/07/13) Steve Lohr

Columbia University recently held a symposium to introduce its new Institute for Data Sciences and Engineering, a collection of interdisciplinary centers covering cybersecurity, financial analytics, health analytics, new media, and smart cities. The event featured presentations by Columbia professors as well as computer scientists from companies such as Google, Facebook, and Bloomberg on the technology's potential in a range of fields. Although privacy issues did not play a significant role in the symposium, Google CIO Ben Fried said his biggest "concern is that the technology is way ahead of society," suggesting risks of public rejection or runaway technology if only a limited number of people understand big data. Massachusetts Institute of Technology Media Lab computational social scientist Alex Pentland said he is leading a group that is examining the implications of "a data-driven society," while Columbia journalism professor Mark Hansen is teaching journalism majors some data programming so that "society's explainers of last resort" have a better grasp of the technology. One panelist suggested the Columbia institute might eventually incorporate a center similar to Harvard University's Berkman Center for Internet and Society, which focuses on technology's effects on society.


Baseball Meets Internet of Things: Goodbye, Bad Umpires?
InformationWeek (04/06/13) Michael Endler

Researchers affiliated with the Institute of Electrical and Electronics Engineers (IEEE) are exploring how the Internet of things could enhance sporting events. For example, connected baseball equipment, in combination with sensors and transmitters located throughout a stadium, could deliver much more precise verdicts for frequently contested situations such as foul balls, stolen bases, and borderline pitches. The researchers say the technology should be accurate to an extent that human observers, such as umpires, cannot equal. Sensors also could be used to improve technique during training. For example, connected bats could measure anything from a player's posture to his swing speed to how firmly he grips the bat. Combined with the corresponding big data algorithms, these measurements could pinpoint small imperfections in a player's form and lead to more effective training strategies. Similar approaches could be applied to better understand sports injuries, or to enhance the experience of fans watching on TV. "Players are experiencing the plays and viewers are viewing them, but how could we provide a more real-world experience for the fans?" says IEEE senior member and University of Texas at Dallas professor Roozbeh Jafari.
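
As a rough illustration of the kind of measurement involved, the Python sketch below estimates peak bat speed from timestamped 3-D positions; the sensor format and sample data are assumptions, not details from the article.

```python
# Illustrative sketch with assumed data (the article does not specify a
# sensor format): estimate peak swing speed from timestamped 3-D positions
# reported by a hypothetical sensor in the bat.

import math

samples = [  # (time_s, x_m, y_m, z_m)
    (0.00, 0.00, 0.0, 1.0),
    (0.05, 0.60, 0.2, 1.0),
    (0.10, 1.50, 0.5, 0.9),
]

def peak_speed(samples):
    """Largest distance/time ratio between consecutive samples."""
    return max(
        math.dist(p0, p1) / (t1 - t0)
        for (t0, *p0), (t1, *p1) in zip(samples, samples[1:])
    )

print(f"peak swing speed: {peak_speed(samples):.1f} m/s")  # ~19.1 m/s
```

Streams like this, aggregated across many swings and players, are the raw material the big data algorithms mentioned above would mine for flaws in form.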


Japan Lab Claims Its Software Can Read Dreams
IDG News Service (04/05/13) Jay Alabaster

Software developed by a Japanese research institute can determine what people are dreaming about by analyzing their brain waves. A team at the Advanced Telecommunications Research Institute International (ATR) connected test subjects to an electroencephalogram, had them sleep inside a magnetic-resonance imaging machine, asked what they were dreaming about, and then showed them images that corresponded to the general dream categories. ATR's Brain Information Communication Research Lab created a database of brain activity recorded when subjects were dreaming about objects and when they were actually viewing objects. The researchers say the software was able to match the subject of dreams to one of about 20 general categories with about 70 percent accuracy. The method could be used with people who are awake but have trouble expressing themselves, such as people who suffer from hallucinations or are mentally ill. The researchers say ATR's goal is to improve the field of brain-machine interfaces.
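
The decoding idea can be sketched as an ordinary supervised-learning problem: train a classifier on brain-activity vectors labeled with the viewed object category, then apply it to activity recorded during sleep. The Python below uses random stand-in data and a generic model; it is not ATR's pipeline, and the voxel and category counts are assumptions.

```python
# Minimal sketch of category decoding with stand-in data (not ATR's
# actual pipeline). Train on activity recorded while viewing objects,
# then predict the category of activity recorded during sleep.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_categories, n_voxels = 20, 500

# Stand-in "viewing" database: activity vectors with known object labels.
X_train = rng.normal(size=(2000, n_voxels))
y_train = rng.integers(0, n_categories, size=2000)
model = LogisticRegression(max_iter=200).fit(X_train, y_train)

# Stand-in activity sampled just before waking the subject.
dream_activity = rng.normal(size=(1, n_voxels))
print("predicted dream category:", model.predict(dream_activity)[0])
```

The key empirical finding is that a model trained only on waking perception transfers to dreams at well above chance, which for 20 categories would be 5 percent.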


New Tool Promises Private Photo-Sharing--Even Using Facebook and Flickr
USC News (04/05/13)

University of Southern California researchers have developed Privacy-Preserving Photo Sharing (P3), software designed to ensure the privacy of photos shared through cloud services such as Facebook and Flickr and to enable the owner to retain the rights to the photos. P3 extracts a small amount of crucial data from a photo and encrypts it, so that only a degraded, unrecognizable version is uploaded. Facebook, for example, has a non-exclusive license to use a photo until the account is deleted, but with P3 the site would only gain rights to the uploaded portion of the photo, while the user retains the rights to the complete image. "Nobody doubts the convenience of cloud-based sharing, the question is whether we can trust third parties to protect our photos from unauthorized distribution or use," says USC professor Antonio Ortega. "With P3, you decide how your photos can be used, without losing the convenience of sharing them through the cloud."
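
A heavily simplified sketch of the split-and-encrypt idea follows. The real P3 splits JPEG DCT coefficients rather than raw pixels, and the threshold here is an arbitrary assumption; the point is that a small encrypted "secret" part makes the large public part unrecognizable on its own.

```python
# Heavily simplified sketch of P3's split-and-encrypt idea (assumptions:
# raw pixels and an arbitrary threshold; the real system splits JPEG DCT
# coefficients). The small encrypted "secret" part stays with the owner;
# only the degraded "public" part is uploaded to the sharing service.

import numpy as np
from cryptography.fernet import Fernet

def split_photo(pixels, threshold=64):
    public = np.minimum(pixels, threshold)       # degraded, uploadable part
    secret = (pixels - public).astype(np.uint8)  # small but crucial residual
    return public, secret

pixels = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
public, secret = split_photo(pixels)

key = Fernet.generate_key()                      # held only by the owner
token = Fernet(key).encrypt(secret.tobytes())

# An authorized viewer with the key reconstructs the original exactly.
restored = public + np.frombuffer(
    Fernet(key).decrypt(token), dtype=np.uint8).reshape(public.shape)
assert (restored == pixels).all()
```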


British Library Sets Out to Archive the Web
Associated Press (04/05/13) Jill Lawless

The British Library is archiving every British website and e-book in a monumental undertaking aimed at preserving the country's digital memory for future researchers, regardless of technological change. As communication has shifted to computers and mobile phones, valuable material, such as firsthand accounts of the 2005 London transit bombings, has been lost to future historians. "The average life of a Web page is only 75 days, because websites change, the contents get taken down," says the British Library's Lucie Burgess. "If we don't capture this material, a critical piece of the jigsaw puzzle of our understanding of the 21st century will be lost." Although the British Library has been archiving pieces of the Web for years and has collected about 10,000 sites, in the past it had to obtain permission from website owners to do so. A 2003 law changed the permission requirement, but legislative and technological issues have slowed the archiving project. The effort relies on an automated Web harvester that will scan and record 4.8 million sites comprising 1 billion Web pages. To protect the archive for future generations, multiple self-replicating copies will be kept on servers across the United Kingdom, and files will be converted into updated formats as technology evolves.
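
In outline, an automated Web harvester is a breadth-first crawler that fetches pages, saves snapshots, and follows in-domain links. The Python sketch below is illustrative only; the British Library's production crawler is far more sophisticated (politeness rules, media formats, deduplication), and the domain, page limit, and file naming here are assumptions.

```python
# Illustrative breadth-first Web harvester (not the British Library's
# crawler): fetch pages, snapshot them to disk, follow in-domain links.

import urllib.request
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkParser(HTMLParser):
    """Collect href targets from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [v for k, v in attrs if k == "href" and v]

def harvest(seed, limit=10):
    domain, queue, seen = urlparse(seed).netloc, deque([seed]), set()
    while queue and len(seen) < limit:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue
        with open(f"archive_{len(seen)}.html", "w", encoding="utf-8") as f:
            f.write(html)  # snapshot to disk
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            if urlparse(absolute).netloc == domain:  # stay within the domain
                queue.append(absolute)
    return seen

# harvest("https://example.co.uk/")  # archives up to 10 pages on one domain
```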


Welcome to the Data Driven World
NextGov.com (04/05/13) Joseph Marks

The University of Wisconsin's GeoDeepDive is a huge database that aims to leverage big data in the geosciences by helping geologists access data that otherwise would be buried under mountains of accumulated information. Funded by a White House initiative launched in March 2012 to help government agencies, businesses, and researchers make better use of big data, GeoDeepDive will gather scanned pages from pre-Internet science journals, generations of websites, archived spreadsheets, and video clips into a database that aims to include all geological data. The system will eventually use contextual clues and technology similar to IBM's Watson to enable geologists to query what professors Miron Livny and Christopher Re, the project's creators, refer to as dark data. "These new tools have that promise--to change the types of questions we're able to ask and the nature of answers we get," says University of Wisconsin geologist Shanan Peters. Although vast and ever-growing seas of data have long existed, the ability to make that data useful is relatively new. Three factors have converged to make big data usable: the growth of large computer clouds, new software that connects hundreds of computers to act as a single system, and a greatly improved ability to parse unstructured data.
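
As a toy example of surfacing "dark data" from text, the Python below pulls rock-unit ages out of free-text sentences with a simple contextual pattern. The sentences and pattern are invented for illustration; GeoDeepDive's actual pipeline uses statistical natural-language processing, not regular expressions.

```python
# Toy "dark data" extraction (invented sentences and pattern; GeoDeepDive
# uses statistical NLP). Pull rock-unit ages from free text using a
# simple contextual pattern.

import re

sentences = [
    "The Barnett Shale was deposited approximately 340 Ma ago.",
    "Samples from the Tully Limestone date to 382 Ma.",
]

pattern = re.compile(r"[Tt]he (?P<unit>[A-Z]\w+ \w+)\D*?(?P<age>\d+) Ma")
for sentence in sentences:
    match = pattern.search(sentence)
    if match:
        print(f"{match.group('unit')}: {match.group('age')} Ma")
# Barnett Shale: 340 Ma
# Tully Limestone: 382 Ma
```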


Abstract News © Copyright 2013 INFORMATION, INC.
Powered by Information, Inc.

