Association for Computing Machinery
Welcome to the January 3, 2011 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Also, please download our new ACM TechNews iPhone App and our new ACM TechNews iPad App from the iTunes Store.

HEADLINES AT A GLANCE

Computers That See You and Keep Watch Over You
Computing on Multiple Graphic Cards Accelerates Numerical Simulations by Orders of Magnitude
Power to Women in Tech
Robot, Robot, Wherefore Art Thou Robot?
Perfecting Animation, Via Science
Rules of Engagement for Cyberwars See Slow Progress
UPC Team Presents a System for Analyzing Information on WikiLeaks
2011 Preview: Peak Internet Comes Into View
The A.I. Revolution Is On
Interdisciplinary Research Partnerships Set Out to Uncover the Physics of Cancer
7 Programming Languages on the Rise
Fujitsu Accelerates Verification of Java Software Through Parallel Processing
1,760 PlayStation 3s Form New Supercomputer


Computers That See You and Keep Watch Over You
New York Times (01/01/11) Steve Lohr

As computer-vision technology advances, many scientists predict that people will be surrounded by technologies that can understand what they see. High-resolution, low-cost cameras have become more common in devices such as smartphones and laptop computers, and it is becoming easier and more affordable to store images. Meanwhile, software for mining, matching, and analyzing visual data is rapidly advancing. "Machines will definitely be able to observe us and understand us better," says Google researcher Hartmut Neven. Although some researchers are uneasy about computer-vision systems invading the public's privacy, the U.S. Justice Department wants to use the technology to spot terrorists, identify lost children, and locate patients with medical problems. Microsoft's Kinect uses digital cameras to recognize users and their gestures, as well as voice commands. "It's a world where technology more fundamentally understands you, so you don't have to understand it," says Microsoft's Alex Kipman. Bassett Medical Center uses computer-vision technology to monitor doctors, making sure caregivers wash their hands before and after coming into contact with patients. The system also watches patients and can alert nurses if a patient is close to falling out of bed.


Computing on Multiple Graphic Cards Accelerates Numerical Simulations by Orders of Magnitude
Fraunhofer SCAI (01/03/11) Michael Krapp

The Fraunhofer Institute for Algorithms and Scientific Computing (SCAI) and the University of Bonn have been chosen as one of the first German NVIDIA CUDA research centers. The researchers will focus on developing parallel multi-graphics processing unit (GPU) software for numerical simulation. "Our vision is to develop a massively parallel, completely multi-GPU-based high-performance molecular dynamics software package, as well as a massively parallel, completely multi-GPU-based high-performance fluid dynamics code," says SCAI professor Michael Griebel. Numerical simulations can take days to compute, but SCAI's research could significantly shorten that time. The CUDA parallel computing architecture uses a GPU's massive computing power to increase computing performance. The researchers want to modify the Tremolo-X software package, which simulates the molecular dynamics of atoms and molecules, for use on multiple graphics cards. Tremolo-X simulates materials at the nano scale, making it possible to efficiently design new and innovative materials. GPUs also are much more energy efficient than standard CPUs.
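
For readers unfamiliar with what a molecular dynamics code actually computes, the sketch below shows the kind of inner loop such packages run: pairwise Lennard-Jones forces and a velocity-Verlet time step, written in plain Python with NumPy. It is a generic single-CPU illustration of the workload that SCAI aims to spread across GPUs, not code from Tremolo-X, and every parameter value is made up.

```python
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces, O(N^2) reference version."""
    diff = pos[:, None, :] - pos[None, :, :]   # displacement vectors r_i - r_j
    r2 = (diff ** 2).sum(axis=-1)
    np.fill_diagonal(r2, np.inf)               # no self-interaction
    inv_r6 = (sigma ** 2 / r2) ** 3
    # F_ij = 24*eps*(2*(sigma/r)^12 - (sigma/r)^6) * (r_i - r_j) / r^2
    coeff = 24.0 * eps * (2.0 * inv_r6 ** 2 - inv_r6) / r2
    return (coeff[..., None] * diff).sum(axis=1)

def step(pos, vel, dt=1e-3, mass=1.0):
    """One velocity-Verlet time step."""
    half = vel + 0.5 * dt * lj_forces(pos) / mass
    pos = pos + dt * half
    vel = half + 0.5 * dt * lj_forces(pos) / mass
    return pos, vel

# 64 atoms on a simple cubic lattice, spacing 1.5*sigma, initially at rest.
grid = np.arange(4) * 1.5
pos = np.stack(np.meshgrid(grid, grid, grid), axis=-1).reshape(-1, 3)
vel = np.zeros_like(pos)
for _ in range(100):
    pos, vel = step(pos, vel)
```

The O(N^2) force evaluation is exactly what GPU parallelization targets: each pair interaction is independent, so thousands of GPU threads can compute them simultaneously.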


Power to Women in Tech
Daily Star (Bangladesh) (12/30/10)

The Bangladesh Women in Technology (BWIT) platform was recently launched with the goal of empowering women with technology and helping them find computing careers. BWIT is exclusive to women entrepreneurs and is completely managed by women. "Our intention is to take entrepreneurship further and our effort is to make our professionals more competent and productive by being a part of the global knowledge industry," says Dohatec New Media chairman Luna Shamsuddoha, who was named BWIT's president. The group is open to information technology (IT) business entrepreneurs, IT professionals, senior corporate executives, and computer science professionals. "Although a lot of women IT graduates are created at university level, they are finding difficulties in building a career in IT due to the social stereotype towards women," says BWIT's vice president for computer science professionals and Dhaka University professor Suraiya Parveen. "Such organizations can contribute to the growth of IT industry in Bangladesh given the growing importance women are having in the economic growth," says Mahboob Zaman, president of the Bangladesh Association of Software and Information Services.


Robot, Robot, Wherefore Art Thou Robot?
New Scientist (12/30/10) Celeste Biever

Robotics researchers have begun to test robots in the theater and other venues in their quest to develop machines that can be socially aware, which is a key breakthrough if robots are to become effective assistants and companions. Data, a humanoid robot comedian developed by Carnegie Mellon University's (CMU's) Heather Knight, begins its performance with a preprogrammed routine but listens for laughter, applause, and chatter, and then uses software to pick jokes that are more likely to elicit laughs. The jokes are categorized according to their theme, degree of interactivity with the audience, and other traits. Data selects jokes according to where it is in its routine, saving the best ones for last. Knight intends to extend these methods to machines outside the theatrical venue, such as a CMU tour guide that will tailor its route for guests and suggest activities that guests might like. Knight also plans to instill variation in other aspects of the robot's behavior besides joke selection so that both its words and actions evoke a unique personality. The machine will perform the same gag or script to different audiences, while nonverbal cues such as gestures, volume, and mode of delivery of its speech will vary. The concept is to allow the software to use audience reactions to deduce which mixes of nonverbal communication work for the script, and Knight hopes audience feedback will provide insights for generating a spectrum of convincing robot personalities.
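
Knight's selection software is not published, so the following is only a rough sketch, under the assumption that something like an epsilon-greedy bandit is at work: jokes are grouped by theme, each theme keeps a running score from audience reactions, and the robot mostly plays the best-scoring theme while occasionally exploring others. All names and numbers are invented for illustration.

```python
import random
from collections import defaultdict

class JokeSelector:
    """Epsilon-greedy choice over joke themes, scored by audience reaction."""

    def __init__(self, jokes_by_theme, epsilon=0.2):
        self.jokes = {theme: list(js) for theme, js in jokes_by_theme.items()}
        self.score = defaultdict(float)   # running mean reaction per theme
        self.count = defaultdict(int)
        self.epsilon = epsilon

    def next_joke(self):
        themes = [t for t, js in self.jokes.items() if js]  # themes with jokes left
        if random.random() < self.epsilon or not self.count:
            theme = random.choice(themes)                     # explore
        else:
            theme = max(themes, key=lambda t: self.score[t])  # exploit best theme
        return theme, self.jokes[theme].pop(0)

    def feedback(self, theme, reaction):
        """reaction: e.g. laughter volume normalized to [0, 1]."""
        self.count[theme] += 1
        self.score[theme] += (reaction - self.score[theme]) / self.count[theme]

selector = JokeSelector({"robots": ["joke A", "joke B"], "humans": ["joke C"]})
theme, joke = selector.next_joke()        # deliver the joke, measure the laughs
selector.feedback(theme, reaction=0.8)    # strong laughter raises that theme's score
```

Saving the best jokes for last, as Data does, would add an ordering constraint on top of this scoring.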


Perfecting Animation, Via Science
New York Times (12/29/10) Patricia Cohen

Columbia University's Computer Graphics Group has been working with artists at Walt Disney Studios, Pixar, Weta Digital, and Adobe Systems to help them solve computer-generated imagery (CGI) problems. The Columbia researchers use mathematical equations based on natural physics to create realistic images, and use discrete differential geometry to develop the equations. "We find equations that describe lots of different kinds of physical systems, the shape of a cable on a bridge, a spinning top, cilia," says group director Eitan Grinspun. The researchers took a new approach to designing CGI systems. For example, instead of forcing a system that was designed to simulate straight hair to also produce curly hair, the Columbia team created a more advanced program that could do both. The new approach significantly reduces the time it takes to produce the finished product, Grinspun says. Johns Hopkins Medical Center is using the technology to simulate how needles move through human flesh, so that doctors can practice surgeries on virtual bodies instead of real ones.
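
The article stays high level, so as a concrete taste of what "equations that describe physical systems" means here, the toy below simulates a strand of point masses joined by springs swinging under gravity. It is a crude, made-up stand-in for the discrete models the Columbia group builds, not their actual methods, and all constants are arbitrary.

```python
import numpy as np

# A chain of point masses joined by stiff springs: the simplest
# relative of physics-based hair or cable models.
N, REST, K, MASS, DT = 10, 0.1, 500.0, 0.01, 1e-3
GRAVITY = np.array([0.0, -9.8])

pos = np.stack([np.arange(N) * REST, np.zeros(N)], axis=1)  # strand starts horizontal
vel = np.zeros_like(pos)

for _ in range(2000):  # two simulated seconds
    force = np.tile(MASS * GRAVITY, (N, 1))
    seg = pos[1:] - pos[:-1]
    length = np.linalg.norm(seg, axis=1, keepdims=True)
    spring = K * (length - REST) * seg / length   # Hooke's law along each segment
    force[:-1] += spring                          # each segment pulls both endpoints
    force[1:] -= spring
    vel += DT * force / MASS                      # semi-implicit Euler step
    vel[0] = 0.0                                  # pin the first mass in place
    pos += DT * vel

print(pos[-1])  # the free end has swung down under gravity
```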


Rules of Engagement for Cyberwars See Slow Progress
Financial Times (12/29/10) Joseph Menn

Rules of cyberwar engagement are lagging behind world powers' development of their cyberwarfare capabilities. Sources say U.S. officials have held preliminary dialogues on the use of cyberweapons with their Russian counterparts, as well as with the Chinese, but analysts say little progress has been made so far. Meanwhile, cyberwarfare funding continues to rise sharply, as does the potential of cyberarms to wreak havoc on both civilian and military networks. International Telecommunication Union (ITU) Secretary-General Hamadoun Toure says the emergence of the Stuxnet computer worm "should serve as a wake-up call for all nations regarding the threats we face." Toure is seeking a code of conduct that prohibits behavior opposed by all countries, such as data theft and network disablement. The United States favors an ITU plan that places the burden of probing cyberattacks on the nations from which those attacks originate, and it also supports a Russian effort urging the development of a cyberarms limitation treaty by a United Nations panel. However, the panel is not slated to convene for two years, and the U.S. State Department has yet to appoint a senior official to guide international initiatives on such issues.


UPC Team Presents a System for Analyzing Information on WikiLeaks
Universitat Politecnica de Catalunya (12/28/10)

Universitat Politecnica de Catalunya (UPC) researchers have developed DEX, a system for finding information in networks that is designed to complement Internet search engines. DEX offers fast processing, configurable data entry from different sources, and the ability to manage networks with billions of nodes and connections from a desktop PC, says UPC's Josep Lluis Larriba. The researchers say the technology has social media applications, such as the ability to analyze data on WikiLeaks. They note that DEX is already being used by groups in a variety of fields. For example, the Notary Certification Agency uses DEX to discover fraudulent activity in real estate transactions, and the Catalan Institute of Oncology uses the system to study the evolution of cancer in Catalonia. As part of the Social Media project, the UPC group also is studying how quickly information spreads across the Internet. In addition, the researchers are working on an e-learning project designed to study audiovisual content for primary and secondary schools. Another system, called BIBEX, explores scientific publications and relates them to other forms of literature.
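
The article does not show DEX's query interface, so as a generic illustration of the kind of linked-data analysis a graph engine enables, here is the same idea in miniature with the networkx library: model people and property transactions as one graph, then flag people tied to suspiciously many transactions. The data and the threshold are invented.

```python
import networkx as nx

# Toy graph: person nodes connected to the transactions they appear in.
g = nx.Graph()
transactions = {
    "tx1": ["alice", "bob"],
    "tx2": ["alice", "carol"],
    "tx3": ["alice", "dave"],
    "tx4": ["eve", "frank"],
}
for tx, parties in transactions.items():
    for person in parties:
        g.add_edge(person, tx)

# Crude red flag: anyone linked to three or more distinct transactions.
people = [n for n in g if not n.startswith("tx")]
suspicious = [p for p in people if g.degree(p) >= 3]
print(suspicious)   # ['alice']
```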


2011 Preview: Peak Internet Comes Into View
New Scientist (12/28/10)

The Internet is ubiquitous in developed countries, but only 20 percent of individuals in developing countries are online. Such low penetration, cheaper devices, the ability to connect via cell phone networks, and increasing broadband reach will ensure that the global online population continues to rise for decades. However, the Internet is mature enough that the number of new adopters each year will soon begin to decline. To determine whether that turning point would arrive in 2011, New Scientist plotted the growth of the world's online population since 1990, which appears to be consistent with a logistic curve. Assuming adoption continues to follow a logistic curve, the magazine estimated the inflection point at which adoption would hit 50 percent. The calculation suggested this would occur in 2013, assuming Internet access eventually reaches 100 percent of the population. However, if Internet access plateaus at 80 percent, the inflection point would be reached in 2012.
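
New Scientist does not publish its calculation, but the method it describes is easy to reconstruct: fit a logistic curve to the adoption series and read off the midpoint, which for a logistic function is the inflection point. A minimal version with SciPy follows; the data points are rough illustrative values, not the magazine's series.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, k, t0, ceiling):
    """Fraction of the world online in year t; 'ceiling' is the saturation level."""
    return ceiling / (1.0 + np.exp(-k * (t - t0)))

# Rough illustrative values: year, fraction of world population online.
years = np.array([1995.0, 2000.0, 2005.0, 2010.0])
share = np.array([0.007, 0.065, 0.16, 0.29])

# Fix the ceiling, then fit growth rate k and midpoint t0.
for ceiling in (1.0, 0.8):
    (k, t0), _ = curve_fit(lambda t, k, t0: logistic(t, k, t0, ceiling),
                           years, share, p0=(0.2, 2010.0))
    # The inflection point of a logistic curve falls at t0, where adoption
    # reaches half of the ceiling.
    print(f"ceiling {ceiling:.0%}: inflection near {t0:.0f}")
```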


The A.I. Revolution Is On
Wired (12/27/10) Steven Levy

Artificial intelligence (AI) technology has evolved into systems that use machine learning, huge data sets, complex sensors, and new algorithms to complete specific tasks, and these systems are designed to master those tasks in ways that humans never could. In the 1980s, researchers began to concentrate on the kinds of skills computers were good at, building groups of systems that operated according to their own brand of reasoning. The researchers used probability-based algorithms to teach computers how humans completed a certain task, and then let the systems determine how best to emulate those behaviors. The researchers also used genetic algorithms, which evaluate randomly generated chunks of code, pick out the highest-performing ones, and splice them together to create new code. As the process is repeated, a highly efficient program evolves, and these developments have led to a variety of AI systems. For example, the Massachusetts Institute of Technology's Rodney Brooks designed a six-legged, insect-inspired robot that can navigate complicated terrain autonomously, while Google is working on a car outfitted with AI systems. "If you told somebody in 1978, 'You're going to have this machine, and you'll be able to type a few words and instantly get all of the world's knowledge on that topic,' they would probably consider that to be AI," says Google cofounder Larry Page. "That seems routine now, but it's a really big deal."
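
As a concrete picture of the genetic-algorithm loop described above (score random candidates, keep the best performers, splice them together, and mutate), here is a toy run that evolves a bit string toward a fixed target. Real systems evolve program fragments rather than bits, but the select/crossover/mutate cycle is the same; all constants are arbitrary.

```python
import random

TARGET = [1] * 20                          # stand-in for the "ideal program"
POP, GENERATIONS, MUTATION = 50, 100, 0.01

def fitness(genome):
    """Score a candidate by how closely it matches the target."""
    return sum(g == t for g, t in zip(genome, TARGET))

def crossover(a, b):
    """Splice two high-performing genomes at a random cut point."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(genome):
    return [1 - g if random.random() < MUTATION else g for g in genome]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP)]
for gen in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break                               # a perfect candidate has evolved
    parents = population[: POP // 2]        # keep the top half
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

print(f"best candidate after generation {gen}: fitness {fitness(population[0])}")
```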


Interdisciplinary Research Partnerships Set Out to Uncover the Physics of Cancer
Scientific American (12/27/10) Olivia Koski

The U.S. National Institutes of Health (NIH) has called on scientists, engineers, and thinkers outside the field of cancer research to join the war on cancer. NIH has awarded more than $60 million since October 2009 to Physical Sciences-Oncology Centers (PSOCs) to think about cancer in new ways and devise new tools for treatment. "They threw up their hands and said, 'We're not winning this battle; we have to invite people in with different points of view,'" says Daniel Hillis, the project's principal investigator. The University of Southern California's PSOC is working to build a computer model of cancer that would capture everything from how a single molecule moves all the way up to the physics of how a tumor spreads in a host organism. The team of 20 investigators from nine institutions has five years and $16 million to complete the ultimate computer model. Doctors would be able to take patient medical data, run it through a computer simulation, and determine whether a specific treatment would work before trying it in the real world. "Maybe it's so complicated that we can't do it, but eventually someone has to do this," Hillis says.


7 Programming Languages on the Rise
InfoWorld (12/25/10) Peter Wayner

Python, Ruby on Rails, MATLAB, JavaScript, R, Erlang, Cobol, and CUDA are among the niche programming languages and platforms that are becoming increasingly popular in enterprises. Python has a structure that makes it easy to scale in the cloud, and it is popular with scientists because its flexibility allows users to improvise and quickly see results. Ruby on Rails is popular for prototyping and for cataloging data that can be stored in tables. MATLAB was built to help mathematicians solve complex linear equations, but it has found a market in the enterprise due to the large amounts of data that modern organizations must analyze. There are several open source alternatives to MATLAB, including Octave, Scilab, Sage, and SciPy. Meanwhile, several new applications, including CouchDB and Node.js, have boosted the popularity of JavaScript. The R programming language carries out multiple functions for numerical and statistical analysis of large data sets, and is used to examine the feasibility of business or engineering projects. Erlang combines traditional aspects of functional programming with a modern virtual machine that organizes machine code. Cobol appeals to programmers who work well with syntax that is closer to natural language. CUDA extensions are being used by enterprise programmers for machine vision, huge simulations, and complicated statistical computations.


Fujitsu Accelerates Verification of Java Software Through Parallel Processing
PhysOrg.com (12/24/10)

Fujitsu Laboratories has developed a way to verify Java software with parallel processing, using cloud computing services to shorten the time needed for verification. The technique expands on symbolic execution, which automatically executes tests on Java programs, making it possible to process character string data. In a recent experiment using multiple processing nodes, the technique outperformed existing technology about tenfold. The program uses parallelization to divide the symbolic execution processing among multiple processing nodes, which accelerates the testing. The technique also uses dynamic load balancing, which redistributes the processing of overloaded nodes to idle nodes. If a certain node takes too long to process its data, the program moves some of the data to other nodes that have finished processing, thereby achieving a faster total processing time through parallelization.
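
Fujitsu's verifier is not public, so the sketch below illustrates only the load-balancing idea the article describes: workers pull path-exploration tasks from a shared queue, so a node that finishes early automatically absorbs work that would otherwise pile up on a busy node. Python's multiprocessing stands in for the cloud nodes, and the "task" is a trivial placeholder for symbolically executing one program path.

```python
import multiprocessing as mp

def explore(task):
    """Placeholder for symbolically executing one program path."""
    path_id, depth = task
    return f"path {path_id} explored to depth {depth}"

def worker(tasks, results):
    # Each node pulls the next task as soon as it is free, so fast nodes
    # naturally take over work from slow ones (dynamic load balancing).
    while True:
        task = tasks.get()
        if task is None:          # sentinel: no work left
            return
        results.put(explore(task))

if __name__ == "__main__":
    tasks, results = mp.Queue(), mp.Queue()
    n_tasks, n_workers = 100, 4
    for i in range(n_tasks):
        tasks.put((i, 10))
    for _ in range(n_workers):    # one sentinel per worker
        tasks.put(None)
    procs = [mp.Process(target=worker, args=(tasks, results))
             for _ in range(n_workers)]
    for p in procs:
        p.start()
    done = [results.get() for _ in range(n_tasks)]   # drain before joining
    for p in procs:
        p.join()
    print(len(done), "paths explored")
```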


1,760 PlayStation 3s Form New Supercomputer
Air Force Times (12/24/10) Jill Laster

The U.S. Air Force Research Laboratory at Wright-Patterson Air Force Base recently unveiled a supercomputer, named the Condor Cluster, consisting of 1,760 Sony PlayStation 3s. The Condor Cluster is the 35th fastest supercomputer in the world and can perform about 500 trillion floating-point operations per second (500 teraflops), according to the laboratory's high-performance computing director Mark Barnell. "We're striving hard to make affordable and constrained systems, where they can really use them and make a difference," Barnell says. The Condor Cluster will process high-resolution satellite images and enhance surveillance capabilities. Some researchers want to use the supercomputer for neuromorphic computing, which mimics the brain's ability to solve complex problems. "We have quite a few research and development efforts working on those kind of applications to do confabulation and prediction, and that will open up a variety of areas which could help with a lot of other efforts and a lot of the areas in which the Air Force would like to go," Barnell says. He notes that the Condor Cluster also is energy efficient and ranks as the world's seventh "greenest" supercomputer.


Abstract News © Copyright 2011 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe