Association for Computing Machinery
Welcome to the January 20, 2012 edition of ACM TechNews, providing timely information for IT professionals three times a week.

ACM TechNews mobile apps are available for Android phones and tablets, as well as for iPhones and iPads.

HEADLINES AT A GLANCE

Public Outcry Over Antipiracy Bills Began as Grass-Roots Grumbling
The Faster-Than-Fast Fourier Transform
CCC Launches Undergraduates Summer Research Listing Site
Software Could Spot Face-Changing Criminals
Yahoo! Predicts America's Political Winners
Federal Researchers Push Limits of Cloud Computing
Cracking Open the Scientific Process
A New Artificial Intelligence Technique to Speed the Planning of Tasks When Resources Are Limited
Improving Web Search
Map Making, Made Easy
Computer Models That Predict Crowd Behavior Could Be Used to Prevent the Spread of Infections at Mass Gatherings
Robots for Brain Surgery? EU Project Shows How
Microsoft to Launch Real-Time Threat Intelligence Feed


Public Outcry Over Antipiracy Bills Began as Grass-Roots Grumbling
New York Times (01/19/12) Jenna Wortham; Claire Cain Miller

Online protests against two U.S. antipiracy bills were backed by 115,000 Web sites and prompted 3 million people to email Congress and voice their opposition to the legislation, according to Fight for the Future. The protests demonstrate the growing power of the Internet and social media to influence public policy and forced many lawmakers to change their positions on the bills. The protests are offshoots of a broader grass-roots movement whose origins can be traced to some less-mainstream Web segments, such as the Reddit social news site. "The tech community is using its own technology to rally around the issue," notes Silicon Valley entrepreneur Ron Conway. The movement against the Stop Online Piracy Act (SOPA) and the Protect Intellectual Property Act began to gather momentum when the blogging service Tumblr added a feature that blacked out the dashboard users see when they log in and directed them to information about SOPA. Reddit members started tracking down SOPA advocates and pressuring them to drop support for the bill or lose business. New York University professor Clay Shirky compares the SOPA protest movement to a 21st century phone tree, noting that "it pervaded people's consciousness more and more as time went on."


The Faster-Than-Fast Fourier Transform
MIT News (01/18/12) Larry Hardesty

Massachusetts Institute of Technology (MIT) researchers say they have developed an algorithm that improves on the fast Fourier transform (FFT). They say the algorithm could be particularly useful for image compression, enabling smartphones to wirelessly transmit large video files without draining their batteries or consuming their monthly bandwidth allotments. FFT takes a digital signal containing a certain number of samples and expresses it as the weighted sum of an equivalent number of frequencies. The new algorithm determines the weights of a signal's most heavily weighted frequencies, and if the signal is sparse enough, the algorithm can sample it randomly instead of reading the entire signal. The researchers' algorithm relies on dividing a signal into narrower slices of bandwidth, so that a slice will generally contain only one frequency with a heavy weight. The MIT researchers also developed a more efficient technique that borrows a signal-processing method from 4G cellular networks. The algorithm "greatly expands the number of circumstances where one can beat the traditional FFT," says University of Michigan professor Martin Strauss.
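
As a rough illustration of the sparsity idea the article describes, here is a minimal numpy sketch of picking out a signal's few heavy frequencies. It is not the MIT algorithm, which avoids computing the full transform; it only shows why sparsity makes such shortcuts possible.

```python
import numpy as np

def top_k_frequencies(signal, k):
    # Compute the full FFT, then keep only the k largest-magnitude
    # coefficients; for a frequency-sparse signal these carry nearly all
    # of the information. (The MIT algorithm skips the full FFT by
    # filtering the signal into narrow bandwidth slices and sampling
    # randomly; this sketch uses the ordinary FFT for simplicity.)
    spectrum = np.fft.fft(signal)
    idx = np.argsort(np.abs(spectrum))[-k:]
    return idx, spectrum[idx]

# A signal that is sparse in the frequency domain: three tones plus noise.
n = 1024
t = np.arange(n)
signal = (np.sin(2 * np.pi * 50 * t / n)
          + 0.5 * np.sin(2 * np.pi * 120 * t / n)
          + 0.25 * np.sin(2 * np.pi * 300 * t / n)
          + 0.01 * np.random.randn(n))

idx, _ = top_k_frequencies(signal, k=6)  # 6 = 3 tones x 2 conjugate bins
print(sorted({min(i, n - i) for i in idx}))  # -> [50, 120, 300]
```

Keeping six of 1,024 coefficients here preserves the signal almost perfectly, which is the property that lets a sparse transform sample far fewer points than a standard FFT.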


CCC Launches Undergraduates Summer Research Listing Site
CCC Blog (01/06/12) Erwin Gianchandani

The Computing Community Consortium (CCC) is offering a new Web site for listing undergraduate summer research positions. Researchers will be able to post their summer research opportunities on the listing site for free. The site will enable students to find summer research programs, and will enable the CCC to promote a pipeline of young talent for careers in computing research. The CCC's relatively new Computer Science Research Opportunities & Graduate School (CSGS) site will offer a link to the listings. The CSGS site provides information on summer research opportunities, a Q&A on "why do research," and links to summer programs such as the U.S. National Science Foundation's Research Experiences for Undergraduates, programs of the CRA Committee on the Status of Women in Computing Research, and the Canadian Collaborative Research Experiences for Undergraduates, among others. Students also will be able to find information and advice on applying to graduate school in computing fields. Features of the CSGS site include Q&As with faculty from around the U.S. and current Ph.D. students, and a "Day in the Life" blog about student graduate school experiences.


Software Could Spot Face-Changing Criminals
New Scientist (01/18/12) Jacob Aron

University of Notre Dame researchers have developed facial-recognition software that can match faces before and after plastic surgery. The researchers found that matching individual facial features was more successful than trying to match whole faces. The researchers, led by Kevin Bowyer and Gaurav Aggarwal, were inspired by a facial-recognition technique known as sparse recognition, which matches an image of a face by comparing it with combinations of individual features from faces already in the database. However, the new system uses two databases, one filled with random faces and the other containing all of the before-surgery pictures. When an after-surgery picture is presented, it is analyzed against the pictures in the before-surgery database to produce a composite picture. If the composite created from the after-surgery picture closely matches any of the composites derived from the before-surgery pictures, the two are declared a match. Combining the matches of all facial features gave the team a 78 percent success rate when comparing pre- and post-surgical photos.
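
To make the matching logic concrete, here is a minimal sketch of sparse-representation-style matching on made-up feature vectors. It illustrates the general technique the article names, not the Notre Dame system, which works feature by feature and uses a second database of random faces.

```python
import numpy as np

def sparse_match(query, gallery, labels):
    # Express the query as a linear combination of all gallery columns,
    # then score each identity by how well its own columns alone
    # reconstruct the query; the smallest residual wins.
    coeffs, *_ = np.linalg.lstsq(gallery, query, rcond=None)
    best_label, best_residual = None, np.inf
    for label in set(labels):
        mask = np.array([l == label for l in labels])
        residual = np.linalg.norm(query - gallery[:, mask] @ coeffs[mask])
        if residual < best_residual:
            best_label, best_residual = label, residual
    return best_label

# Toy example: 8-dimensional features, two identities, two images each.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(8, 4))            # columns are gallery images
labels = ["alice", "alice", "bob", "bob"]
query = gallery[:, 1] + 0.05 * rng.normal(size=8)  # noisy "alice" image
print(sparse_match(query, gallery, labels))  # -> alice
```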


Yahoo! Predicts America's Political Winners
Technology Review (01/19/12) Christopher Mims

Yahoo! researchers are using prediction markets, polls, Twitter sentiment analysis, and search query trends to create a political prediction engine called the Signal. The researchers, led by Yahoo!'s David Rothschild and Dave Pennock, plan to create data visualizations that convey probability to the general public. They say the key to the system is called Fantasy Politics, which is based on Yahoo!'s success with fantasy sports and enables users to bet on the outcomes of almost anything. Pennock says users’ bets will take Yahoo!'s political prediction markets to a level of complexity and predictive power not seen elsewhere. The prediction markets run by the Signal are constantly polled, as are the results of Yahoo! search queries. In addition, Rothschild says Twitter sentiment analysis can provide a level of detail unmatched by polls because polls are generally binary systems, while Twitter sentiment can be analyzed as a fluctuating scale. Prediction markets already seem better than polls at taking the longevity of a trend into account. For example, in the recent Republican Presidential primary race various candidates took turns at the top of polls, but Intrade and other prediction markets monitored by the Signal always had Mitt Romney coming out ahead. Pennock, who crunches the numbers for the Yahoo! team, says their latest result puts the chances for President Obama's victory at 52.9 percent.
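
As a toy illustration of the article's point about sentiment as a fluctuating scale rather than a poll's binary support/oppose answer, here is a hypothetical word-list scorer (not Yahoo!'s analysis): each tweet gets a graded score, and the running mean moves continuously as new tweets arrive.

```python
# Hypothetical word lists; real systems use far richer lexicons and models.
POSITIVE = {"great", "win", "strong", "love"}
NEGATIVE = {"bad", "lose", "weak", "hate"}

def score(tweet: str) -> float:
    # Return a sentiment value in [-1, 1] from positive/negative word counts.
    words = tweet.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 0.0 if pos + neg == 0 else (pos - neg) / (pos + neg)

tweets = ["Great debate, a strong showing", "Weak answers and a bad night"]
print(sum(map(score, tweets)) / len(tweets))  # a graded value, not a yes/no
```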


Federal Researchers Push Limits of Cloud Computing
InformationWeek (01/18/12) John Foley

Researchers at Argonne National Laboratory and Lawrence Berkeley National Laboratory recently completed the Magellan project, which determined that although cloud computing offers many advantages for scientific researchers, there are several hurdles to overcome, including a steep learning curve, performance and scalability shortcomings, and missing pieces in the cloud software stack. The report also found that commercial cloud services could be several times more expensive than the high-performance computing (HPC) environments the labs currently operate. In general, "the cloud is seven to 13 times more expensive," according to the report. However, the report also provides insights into some of the ways cloud computing could be used for leading-edge research. Magellan researchers concluded that the cloud model is well suited to certain types of scientific applications, specifically those with minimal communication between nodes, but it cannot outperform HPC systems for most national lab requirements. The Magellan team found that the top motivations for using the cloud are easier access to computing resources, the ability to control the software environment, and the ability to share the setup of software and experiments with peers.


Cracking Open the Scientific Process
New York Times (01/17/12) Thomas Lin

Dissatisfied with a traditional scientific research publication process that practices elitism while also being expensive, protracted, and hidebound, proponents of open science support a friction-free online collaborative environment that promotes faster knowledge sharing. Momentum is building for such concepts, as is the establishment of open access archives and journals such as arXiv and the Public Library of Science. One open science venture that is growing increasingly popular is ResearchGate, a social networking site where scientists can answer each other's queries, exchange papers, and find collaborators. ResearchGate's membership currently tops 1.3 million, says founder and researcher Ijad Madisch. Although editors of traditional scientific journals say open science may be positive from a theoretical perspective, they note that the scientific community itself is very conservative in practice. Established journals also contend that such conservatism is needed to defray the costs of peer-reviewing and publishing research. Still, some scientists are refusing to provide peer reviews for scientific journals and instead are participating in open online communities. "We’re not talking about new technologies that have to be invented," says the Massachusetts Institute of Technology's Scott Aaronson, an active member of MathOverflow. "Journals seem noticeably less important than 10 years ago.”


A New Artificial Intelligence Technique to Speed the Planning of Tasks When Resources Are Limited
Carlos III University of Madrid (Spain) (01/17/12)

Universidad Carlos III de Madrid (UC3M) researchers have developed an artificial intelligence technique that can automatically create plans and quickly solve problems in situations with limited resources. They say the technique can be applied to logistics, autonomous control of robots, fire extinguishing, and online learning. The goal is to get the system to independently find an ordered sequence of actions that will enable objectives to be reached. "With regard to time, our technique is three to 10 times faster, and with regard to quality, our solutions offer similar quality to that obtained by the best technique that is currently available," says UC3M's Angel Garcia Olaya. "Now we are making modifications that we hope will allow us to give still greater quality to our solutions." The technique can be applied to any industry in which it makes sense to implement automatic planning. For example, Spain's Ministry of Industry, Tourism and Commerce used the technique to create a system of automatic planning for the multimodal transport of goods. The researchers also are using the system in conjunction with the European Space Agency for the planning of observation operations in space.
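
For readers unfamiliar with automated planning, here is a minimal sketch of the underlying problem the article describes: finding an ordered action sequence under a resource limit. It is a toy breadth-first planner on a made-up logistics domain, not UC3M's technique, which is substantially faster.

```python
from collections import deque, namedtuple

# Each action has preconditions, facts it adds, facts it deletes, and a
# resource cost; a search node is a frozenset of facts plus resources spent.
Action = namedtuple("Action", "name pre add delete cost")

def plan(initial, goal, actions, budget):
    # Breadth-first search for an ordered action sequence reaching `goal`
    # without spending more than `budget` units of the limited resource.
    start = (frozenset(initial), 0)
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        (state, spent), steps = queue.popleft()
        if goal <= state:
            return steps
        for a in actions:
            if a.pre <= state and spent + a.cost <= budget:
                nxt = ((state - a.delete) | a.add, spent + a.cost)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, steps + [a.name]))
    return None  # no plan fits the resource limit

# Toy problem: deliver a package from A to B with a limited fuel budget.
actions = [
    Action("load",      frozenset({"truck-at-A", "pkg-at-A"}),
           frozenset({"pkg-in-truck"}), frozenset({"pkg-at-A"}), 0),
    Action("drive-A-B", frozenset({"truck-at-A"}),
           frozenset({"truck-at-B"}), frozenset({"truck-at-A"}), 2),
    Action("unload",    frozenset({"truck-at-B", "pkg-in-truck"}),
           frozenset({"pkg-at-B"}), frozenset({"pkg-in-truck"}), 0),
]
print(plan({"truck-at-A", "pkg-at-A"}, frozenset({"pkg-at-B"}),
           actions, budget=2))  # -> ['load', 'drive-A-B', 'unload']
```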


Improving Web Search
Victoria University of Wellington (New Zealand) (01/16/12)

New algorithms developed by Victoria University of Wellington researcher Daniel Crabtree would enable search engines to better understand the meaning of a user's query. Crabtree notes, for example, that a search for "jaguar" would yield mixed results that include the animal, the car, and even an Apple operating system or the 1990s video game console. "Search engines currently don't deal with that ambiguity because they simply search for Web pages that contain the words you've entered," he says. Crabtree's algorithms cluster pages together to separate different interpretations, and they use statistical language models to "see through" the search terms and capture the intended meaning of a query. He says the model recognizes word order to help search engines group related pages together and return what the user is "really searching for." Crabtree has tested the model on a small scale. "Search engines don’t appear to have improved that much in recent years," he says. "That’s partly because they’ve been focused on other issues, such as revising their search algorithms to stop spam or companies ‘gaming’ the search results."
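
A minimal sketch of the clustering idea, assuming scikit-learn is available (illustrative only, not Crabtree's algorithms): pages about different senses of an ambiguous query use different vocabulary, so their TF-IDF vectors separate into distinct clusters.

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# Toy result snippets for the ambiguous query "jaguar".
snippets = [
    "The jaguar is a large cat native to the Americas.",
    "Jaguar cats hunt prey near rivers in the rainforest.",
    "The Jaguar XK is a luxury sports car with a V8 engine.",
    "Jaguar unveiled a new car model at the auto show.",
]
vectors = TfidfVectorizer(stop_words="english").fit_transform(snippets)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)
for label, snippet in zip(labels, snippets):
    print(label, snippet)  # animal pages in one cluster, car pages in the other
```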


Map Making, Made Easy
Harvard Gazette (01/17/12) Peter Reuell

Harvard University researchers have developed WorldMap, a cloud-based open source Web map-making platform that facilitates the use of large, detailed datasets and supports a number of formats. The researchers say WorldMap will make it easier for scholars to share maps and other geospatial data, and increase the amount of high-quality spatial data in the public sphere. Scholars will be able to integrate data from various sources by overlaying data in their own computers with materials on the Web, as well as incorporate paper maps, perform online digitizing, and link locations to other media. WorldMap is a collaborative tool: all participants in a group have editorial rights to interactive publications aimed at large audiences, and users can keep information private before making it available to larger groups for refinement and eventually releasing it to the public. A beta version of the program was released last July, and it already has 1,250 users from more than 100 countries who have contributed more than 1,700 mapping layers and created more than 500 map collections to support their research. New features under development include the ability to visualize change over time, search place names for current and historic locations, and create and edit online map layers.


Computer Models That Predict Crowd Behavior Could Be Used to Prevent the Spread of Infections at Mass Gatherings
University of Bristol News (01/16/12) Joanne Fryer

University of Bristol researchers are studying how crowd behavior can be sensed, analyzed, and modeled, and how this knowledge can be used to manage environments in which mass gatherings (MGs) take place to improve safety and security. The researchers note that although the objective of MGs is to bring people together, crowd management strategies aim to keep people separated. The researchers developed agent-based computer models using fine-scale data on the actual movements of individuals, taken from detailed video recordings, global positioning systems, or mobile phone tracking, to identify points of congestion and overcrowding that are useful for crowd management. The models were used at the Notting Hill Carnival to simulate the ways crowds interact and disperse under different conditions of movement and congestion. The researchers also describe how models of crowd movement can be adapted to take other scenarios into account, such as how individuals in confined spaces might spread disease through their proximity. "Such models would allow us to test various interventions on a virtual population with a computer and measure their success rates before testing them on real populations, possibly saving both resources and life," the researchers say.
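
As a hedged sketch of what an agent-based crowd model looks like (a toy simulation with made-up parameters, not the Bristol models): agents move toward an exit, push apart when too close, and a grid count flags congested cells.

```python
import numpy as np

rng = np.random.default_rng(1)
n_agents, steps = 200, 100
pos = rng.uniform(0, 50, size=(n_agents, 2))  # positions in a 50 m x 50 m area
exit_point = np.array([50.0, 25.0])

for _ in range(steps):
    # Attraction: each agent steps up to 0.5 m toward the exit per tick.
    to_exit = exit_point - pos
    dist_to_exit = np.maximum(np.linalg.norm(to_exit, axis=1, keepdims=True), 0.5)
    pos += 0.5 * to_exit / dist_to_exit
    # Repulsion: agents closer than 1 m push each other apart.
    diff = pos[:, None, :] - pos[None, :, :]
    dist = np.linalg.norm(diff, axis=2)
    near = (dist > 0) & (dist < 1.0)
    pos += 0.1 * np.where(near[:, :, None], diff, 0.0).sum(axis=1)

# Count agents per 5 m x 5 m cell to spot points of congestion near the exit.
cells, _, _ = np.histogram2d(pos[:, 0], pos[:, 1], bins=10,
                             range=[[0, 50], [0, 50]])
print("most crowded cell holds", int(cells.max()), "agents")
```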


Robots for Brain Surgery? EU Project Shows How
CORDIS News (01/13/12)

A European Union-funded research group working on the Robot and Sensors Integration as Guidance for Enhanced Computer Assisted Surgery and Therapy (ROBOCAST) project has developed a robotic system that can help neurosurgeons perform keyhole brain surgery. ROBOCAST is equipped with enhanced memory features, 13 types of movement, and haptic feedback, which enables physicians to assess tissue and perceive the amount of force applied during surgery. The system combines mechatronics, the hardware that makes up the robot's body and nervous system, with software that provides its intelligence; the software integrates multiple robots, independent trajectory planning, an advanced controller, and a set of field sensors. The main ROBOCAST robot positions its miniature companion robot through six degrees of freedom (translation left to right, up and down, and backward and forward, plus rotation about each of those axes), which together let it place the companion anywhere in three-dimensional space. The miniature robot holds the probe that is used during surgery. The robot is currently being tested on dummies to ensure accurate performance during keyhole surgeries.


Microsoft to Launch Real-Time Threat Intelligence Feed
Network World (01/12/12) Colin Neagle

Microsoft announced plans to launch a real-time threat intelligence feed at the recent International Conference on Cyber Security. The project’s goal is to stream the company's security information on dangerous and high-profile threats to organizations running the gamut from business partners and private corporations to domestic and foreign governments. If the beta test is successful, Microsoft may make the feed publicly available. Microsoft's T.J. Campana says the feed will be served from a Hadoop-based cluster integrated with Windows Server, streaming information from a database that now contains data on the Kelihos botnet Microsoft first reported on in September. "I don't see a decrease in threats, but I do see this [feed] limiting the possible damage from a given threat as the community will be able to respond faster," says Lumension analyst Paul Henry. Microsoft will need to allay the concerns of privacy advocates, particularly since the feed will circulate the Internet Protocol addresses of systems discovered to be elements of large botnets. However, Henry says that security threat information can be exchanged without infringing privacy, noting that the Microsoft feed will resemble practices at the SANS Internet Storm Center.


Abstract News © Copyright 2012 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe