Association for Computing Machinery
Welcome to the May 25, 2011 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Also, please download our new ACM TechNews iPhone and iPad apps from the iTunes Store.


IT Professionals in High Demand: Report
eWeek (05/23/11) Nathan Eddy

There are more technology job openings in a single day on Dice's career Web site than there are computer science graduates joining the U.S. workforce, according to a Dice Holdings report. The report also found that 18 states have shortages of local graduates compared to job openings, particularly in key tech markets such as Silicon Valley, Seattle, Dallas, Boston, New York, Washington, D.C., Los Angeles, and Chicago. The report notes that the gap between graduates and job openings has created competition for talent among tech companies. There are at least two or three jobs for every computer science graduate, says Massachusetts Institute of Technology's Ann Hunter. " 'America's Tech Talent Crunch' is a snapshot of how businesses, educational institutions, and employees are dealing with palpable shortages in real time," the report says. An earlier Dice report found that, on average, technology professionals had not received salary raises in the last two years. However, "companies can no longer get away with paltry salary increases for their technology staffs based on the demand we are seeing for talent," says Dice's Tom Silver.

Tim Berners-Lee Calls for 'Sophisticated' Social Network
(05/23/11) Rosalie Marshall

Academics and scientists should have a next-generation social network for sharing knowledge more efficiently and collaborating on projects, says World Wide Web inventor Sir Tim Berners-Lee. Speaking at the recent Profiting From the New Web conference, Berners-Lee noted that social networks do not talk to each other. He says that scientists need a different type of medium for working on the Web, but current social networks are not designed to connect them. "Twitter is not really designed for middle of the way discussion," Berners-Lee says. Web scientists should help with the structure of this new network, but Berners-Lee notes that they also need to be more aware of social and economic factors. "Most systems are done with a hunch and are not done with understanding," he says. The conference focused on how the Web has transformed business models and what companies should do to take advantage of the opportunities offered by the Web.

Homemade Cyberweapon Worries Federal Officials
Washington Times (05/24/11) Shaun Waterman

Security researchers Dillon Beresford and Brian Meixell recently developed a cyberweapon similar to the Stuxnet computer worm that disrupted Iran's nuclear program computer systems last year. The researchers' ability to develop the program working at home on laptops has raised concerns at the U.S. Department of Homeland Security (DHS), which has asked the researchers to cancel their planned presentation of the technology at a computer security conference next week. DHS officials are worried that if the researchers' method is made public, other hackers will replicate the malicious software and cripple federal computer controls. The software was tested on equipment made by Siemens, and while Beresford worked with DHS officials on ways to protect industrial computer programs, he says Siemens' officials have been slow to respond to the hole in their security systems. "They requested that I not share the data, but it was absolutely my decision to cancel," Beresford says. The researchers' work is alarming because experts initially believed that it would take significant resources and access to detailed information on the intended target to duplicate the Stuxnet worm.

Stanford Computer Scientists Find Internet Security Flaw
Stanford Report (CA) (05/23/11) Melissae Fellet

Stanford University researchers have found a security flaw in audio CAPTCHAs (completely automated public Turing tests to tell computers and humans apart), which are designed to provide Internet security for the visually impaired. Audio CAPTCHAs require users to listen to a string of spoken letters or numbers disguised with background noise. However, Stanford professor John Mitchell and postdoctoral fellow Eli Bursztein developed Decaptcha, a program that can understand commercial audio CAPTCHAs used by Digg, eBay, Microsoft, Yahoo, and reCAPTCHA. During testing, Decaptcha was able to decode Microsoft's audio CAPTCHA about 50 percent of the time. In addition, it broke about one percent of reCAPTCHA's codes, and even this small a success rate can result in a major security breach for Web sites such as YouTube and Facebook, which get hundreds of millions of page views a day. Decaptcha can recognize the distinct sounds of each letter and number, and compares the sounds it hears in audio CAPTCHAs to those sounds stored in its memory. The researchers created four million audio CAPTCHAs mixed with white noise, echoes, or music, and found that music gave the computer systems the most trouble.
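In the abstract, comparing heard sounds against stored per-symbol examples is nearest-neighbor matching over audio feature vectors. A toy sketch of that idea, with invented feature vectors standing in for the per-letter sound profiles (not Decaptcha's actual representation or data):

```python
import numpy as np

# Hypothetical stored "templates": one feature vector per symbol,
# standing in for the learned sound profile of each letter or digit.
templates = {
    "3": np.array([0.9, 0.1, 0.3]),
    "7": np.array([0.2, 0.8, 0.5]),
    "b": np.array([0.4, 0.4, 0.9]),
}

def classify_segment(features):
    """Return the symbol whose stored template is nearest (Euclidean)."""
    return min(templates, key=lambda s: np.linalg.norm(templates[s] - features))

# A noisy observation close to the "7" template should still match "7".
observed = np.array([0.25, 0.75, 0.55])
```

A real attack would first segment the audio into per-symbol chunks and extract robust features despite the added noise; the matching step itself stays this simple.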

What Makes an Image Memorable?
MIT News (05/24/11) Anne Trafton

Massachusetts Institute of Technology (MIT) researchers have developed an algorithm that can identify which photos humans will find memorable, based on statistical data taken from a study of people remembering images. The researchers, led by MIT's Phillip Isola, found that the most memorable photos are those that contain people, followed by static indoor scenes and human-scale objects, while landscapes are not as memorable. After gathering the data, the researchers created memorability maps of each image by asking people to label all of the objects in the images. A computer model analyzed the maps to determine which objects make an image memorable. Then the researchers used machine-learning techniques to create a computational model that analyzed the images and their memorability rate, which enabled them to create an algorithm that can predict the memorability of images the computer has not previously analyzed.
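At its core, the pipeline described above maps image features to a memorability score learned from human data. A minimal regression sketch with made-up features and scores (the MIT model uses far richer image descriptors and machine-learning machinery):

```python
import numpy as np

# Made-up training data: rows are per-image features
# (person present, indoor scene, landscape); scores are the
# fraction of viewers who remembered the image.
features = np.array([
    [1, 0, 0],  # contains a person
    [0, 1, 0],  # indoor scene
    [0, 0, 1],  # landscape
    [1, 1, 0],  # person in an indoor scene
], dtype=float)
scores = np.array([0.85, 0.70, 0.40, 0.90])

# Fit a linear model by least squares (with a bias column).
X = np.hstack([features, np.ones((len(features), 1))])
w, *_ = np.linalg.lstsq(X, scores, rcond=None)

def predict(feat):
    """Predicted memorability for an unseen image's feature vector."""
    return float(np.append(np.asarray(feat, dtype=float), 1.0) @ w)
```

Consistent with the study's finding, a model trained on such data ranks images with people above landscapes.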

Pittsburgh Supercomputing Center Accelerates Machine Learning With GPUs
Pittsburgh Supercomputing Center (05/23/11) Shandra Williams

Researchers at the Pittsburgh Supercomputing Center and HP Labs recently used code designed to take advantage of graphical processing units (GPUs) to significantly boost the performance of k-means clustering algorithms. Using a GPU-based implementation, the researchers were able to cluster all of the five-word sets of the 1,000 most common words occurring in all books published in 2005, which totaled more than 15 million data points and 1,000 dimensions, in less than nine seconds. That was nearly 10 times faster than performing the same operation running on a central processing unit-based program. Faster execution speeds will enable researchers to develop more complex algorithms layered on top of k-means clustering. K-means clustering "is often used as a subroutine in spectral clustering and other unsupervised or semi-supervised learning methods," notes Carnegie Mellon University professor William Cohen. The advanced clustering algorithm also is very easy to use, which facilitated the quick implementation, says HP Labs' Ren Wu. "The key for any high-performance algorithm on modern multi/many-core architecture is to minimize the data movement and to optimize against memory hierarchy," Wu says.
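K-means clustering alternates between assigning each point to its nearest centroid and moving each centroid to the mean of its assigned points. A minimal NumPy sketch of the algorithm (not the GPU implementation described above, which parallelizes these same two steps):

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means: alternate assignment and centroid update."""
    rng = np.random.default_rng(seed)
    # Initialize centroids by sampling k distinct points.
    centroids = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # Assignment step: index of the nearest centroid for each point.
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: each centroid moves to the mean of its cluster.
        for j in range(k):
            members = points[labels == j]
            if len(members):
                centroids[j] = members.mean(axis=0)
    return labels, centroids

# Two well-separated blobs should be split cleanly into two clusters.
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0, 0.1, (50, 2)), rng.normal(5, 0.1, (50, 2))])
labels, centroids = kmeans(data, k=2)
```

The distance computation in the assignment step dominates the cost and is embarrassingly parallel, which is why GPUs deliver the order-of-magnitude speedup the researchers report.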

U.S. International Cyberspace Policy Sounds Good; Will Be Hard to Implement
Network World (05/23/11) Tim Greene

Although ambitious, some experts say the White House's recently issued International Strategy for Cyberspace could be difficult to implement, as some of its objectives conflict and pose seemingly insurmountable technical challenges. The strategy has been touted by U.S. Secretary of State Hillary Rodham Clinton as a framework to devise, implement, and coordinate policies that address all cybersecurity issues. "As we work to achieve a cyberspace that is open, interoperable, secure, and reliable, there is no one-size-fits-all, straightforward route to that goal," Clinton says. However, some experts say the strategy's goals are conflicting. For example, the policy urges support for free expression and commerce through the Internet while also denying those benefits to criminals and terrorists, with the challenge being to distinguish citizens from criminals while maintaining online privacy. Participants at a recent cybersecurity and privacy protection panel at the Massachusetts Institute of Technology CIO Symposium stressed that the government should get more involved with safeguarding Web infrastructure. Security consultant Jeffrey Carr says that although the White House strategy calls for shielding critical infrastructure, U.S. statutes calling for such measures lack force. Another problem is that the ideal of unfettered Internet use is contradicted by the fact that governments usually do what is in their own best interest.

Blind Children of India Helping Scientists See Into the Brain
Boston Globe (05/23/11) Karen Weintraub

Massachusetts Institute of Technology (MIT) researchers are working with blind children on the Indian subcontinent, home to more than 300,000 of them, to treat their blindness and learn more about how the brain processes visual information and what goes awry with that processing in autism and other disorders. The research also is leading to advances in robotics, as computer scientists develop new algorithms for how the human brain sees. "When people succeed in combining that desire to learn and real health-provision service, that is fantastic," says Harvard Medical School professor Alvaro Pascual-Leone. In exchange for receiving treatment for the ailments that cause blindness, the children are observed, given magnetic resonance imaging scans before and after surgery, and questioned by the MIT researchers. "In our experience, the children actually enjoy doing these tests and treat them like fun games," says MIT's Pawan Sinha, who is leading the research. Pascual-Leone says there has, up to now, been no systematic analysis on a large group who all gained sight after years of visual impairment.

Microsoft Wants to Rule the White Spaces
Technology Review (05/23/11) Tom Simonite

Microsoft recently applied to the U.S. Federal Communications Commission (FCC) to become an approved administrator of a system that makes use of unused TV spectrum known as white spaces to provide wireless broadband connections. Microsoft's SenseLess system threads long-range wireless data signals through the white spaces, and the company says it could be put into use later this year. Google and eight other companies already have been given permission to operate white spaces databases. Microsoft's trial white spaces network can provide high-speed Internet at a range of more than a mile. The system involves a device supplying its location to the database using a permanently free frequency. SenseLess combines knowledge of every licensed TV signal in the United States with topographic maps to determine how best to send signals over distance and terrain. Microsoft researchers are working to fit more connections into the white spaces in an effort to make more space available. The SenseLess system has already been altered for use in other countries. "A database system like this could be the basis of a new system of access control, a universal intermediary between devices that want wireless connections and those with the capability," says University of Pennsylvania researcher Kevin Werbach.

A New System Increases Network Communication Security and Anonymity
Universidad Politecnica de Madrid (Spain) (05/23/11) Eduardo Martinez

Universidad Politecnica de Madrid researcher Carlos Caselles Jimenez has developed an anonymous communications system with automatic routing management. Using multipoint software based on client-server applications, the system organizes a data transmission network whose users cannot be identified, protecting user privacy and improving the security of information exchanges. The Windows-based application was developed in Java and includes OpenSSL-based security features. The system's tools can implement SSL/TLS security protocols, including HTTPS, which gives Web browsers secure access to sites that require the transfer of personal information. The system also manages data traffic: a routing algorithm calculates the most efficient routes over time, allowing the network to be extended without the performance loss that normally comes with a growing number of connected machines.

Google Search Patterns Could Track MRSA Spread
Wired News (05/20/11) Brandon Keim

Google searches might enable public health experts to better fight drug-resistant staph infections. Researchers led by the University of Chicago's Diane Lauderdale compared records of Google searches for methicillin-resistant staphylococcus aureus (MRSA) between 2004 and 2008 with MRSA-related hospitalization records and found that the numbers matched, except for a search burst after a 2007 U.S. Centers for Disease Control and Prevention (CDC) study. The research suggests that search data would be a reliable indicator of infection. Currently, there is no surveillance system to track MRSA infections overall, and the CDC uses the Active Bacterial Core surveillance program, which focuses on nine regions. Google searches have the potential to provide near real-time, city-by-city information about the spread of MRSA. "Potentially, we can get from Google a more timely measure of trends," Lauderdale says. The methodology is similar to that used by Google Flu Trends, but more tests are needed to be certain it works.
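The methodology described above boils down to checking how strongly two time series move together: weekly search volume and MRSA-related hospitalization counts. A minimal sketch with invented figures (not the study's data):

```python
import numpy as np

# Invented weekly series: MRSA-related search volume and
# MRSA-related hospitalization counts over the same weeks.
searches = np.array([120, 135, 150, 160, 180, 210, 230, 250], dtype=float)
hospitalizations = np.array([40, 44, 49, 52, 60, 68, 75, 83], dtype=float)

# Pearson correlation: values near +1 mean the series rise and fall
# together, which is what would make searches a usable proxy for
# actual infection trends.
r = np.corrcoef(searches, hospitalizations)[0, 1]
```

A real analysis would also have to discount bursts driven by news coverage rather than illness, such as the spike the researchers saw after the 2007 CDC study.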

University of Pennsylvania's PR2 Robot Learns to Read
(05/20/11) Katie Gatto

The University of Pennsylvania's PR2 robot now has the ability to read for itself. Nicknamed Graspy, the robot can read anything from simple signs to full-length warnings, and further research could lead to the ability to read longer texts. According to researchers at Pennsylvania's GRASP Lab, Graspy is learning to read just like a human toddler does. Graspy watches words in order to recognize the shapes they take, then gives a meaning to the shapes, associating the sound of the letter with the way it looks. Next, the robot begins to sound out words, and attempts to match new and less familiar words to known words. Graspy has the capacity to learn, unlike the simple word recognition ability of the current generation of text-to-speech systems. However, handling fonts is a challenge for robots. Although a human has an innate ability to deal with all but the most ornate and obtuse fonts, robots respond more literally and need more processing power and learning time to keep track of the many different ways to write letters.

Sound of Sex Could Alert Internet Porn Filter
New Scientist (05/20/11) Jacob Aron

Although automatic image-analysis systems are already used to catch unwanted pornography before it is shown on a computer monitor, they often mistake innocent images, such as swimmers or close-up faces, for sexual ones. However, instead of analyzing images, Korea Advanced Institute of Science and Technology researchers are analyzing audio to flag unwanted pornography. Their system uses audio-analysis technology to distinguish a sexual scream or moan from more common audio clips. The researchers used a technique known as the Radon transform to create spectrograms of different audio clips, noting that speech signals are normally low pitched and vary gradually over time, while pornographic sounds tend to be higher pitched and change quickly. During testing, the researchers say their technique outperformed other audio-based methods, correctly identifying 93 percent of the pornographic content from the test clips.
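The pitch-based cue the researchers exploit can be illustrated with a much simpler feature than their Radon-transform spectrograms: the spectral centroid of each audio frame, a crude proxy for how high-pitched a sound is. A sketch with synthetic tones (the frame size, rate, and signals are illustrative choices, not the researchers' setup):

```python
import numpy as np

def pitch_profile(signal, rate=8000, frame=256):
    """Spectral centroid per frame: a crude proxy for pitch height.

    The classification idea from the article: low, slowly varying
    centroids look like ordinary speech; high, rapidly changing
    ones are the cue the filter looks for.
    """
    centroids = []
    freqs = np.fft.rfftfreq(frame, d=1.0 / rate)
    for start in range(0, len(signal) - frame + 1, frame):
        spectrum = np.abs(np.fft.rfft(signal[start:start + frame]))
        if spectrum.sum() > 0:
            centroids.append(float((freqs * spectrum).sum() / spectrum.sum()))
    return np.array(centroids)

# Synthetic one-second signals: a low 150 Hz tone vs. a high 2 kHz tone.
t = np.arange(8000) / 8000.0
low = np.sin(2 * np.pi * 150 * t)
high = np.sin(2 * np.pi * 2000 * t)
```

The per-frame centroids of the high tone sit well above those of the low tone; a real classifier would combine such pitch features with how quickly they change over time.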

Abstract News © Copyright 2011 INFORMATION, INC.
Powered by Information, Inc.

To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe