Association for Computing Machinery
Welcome to the August 29, 2012 edition of ACM TechNews, providing timely information for IT professionals three times a week.

ACM TechNews mobile apps are available for Android phones and tablets, as well as for iPhones and iPads.


Supercomputers Help New Orleans Prepare for Hurricane Isaac
Computerworld (08/29/12) Patrick Thibodeau

Researchers are using supercomputing centers at Louisiana State University (LSU) and the Texas Advanced Computing Center (TACC) to enable emergency planners to model what will happen once Hurricane Isaac sends water into canals, levees, and neighborhoods. The computer models also are being used to help determine the best staging areas for positioning people and supplies needed for the recovery, says LSU's Robert Twilley. The researchers are using an unstructured grid that enables them to concentrate the nodes where the analysis is needed most, such as near inland waterways and in flood-prone areas. The supercomputers being used can produce simulations with about 1.5 million nodes in just 90 minutes. Meanwhile, University of Texas at Austin researchers have been running computer models at TACC to assess the impact of the storm surge on Texas. Emergency planners "can look down at neighborhood scale and say 'on this street along the levee we're going to have water this high,' and plan accordingly," says University of Texas at Austin's Casey Dietrich.

Post-'Pinch'? Apple Patent Case Could Point to New Digital Age for Smartphones
Washington Post (08/28/12) Craig Timberg; Hayley Tsukayama; Cecilia Kang

The future of smartphone development has been muddied by Apple's recent court win against Samsung in a patent-infringement case involving "pinch to zoom" and other key smartphone features. Analysts expect handset manufacturers to avoid features that could potentially infringe on Apple patents in future products, which could significantly slow the pinch gesture's apparent march toward universal consumer acceptance. The decision has spurred searches for possible gestural substitutes for the pinch, while also raising the question of whether patenting the way humans interact with their devices should be permissible once those interactions become standardized. Samsung is likely to file an appeal that, if it goes to court, would test the validity of such patents. However, smartphone application developer Ken Yarmosh says Samsung could develop other interactive features, such as zoom buttons or apps for automatic Web page optimization. Other experts project a cooling of competition among smartphone sellers if the jury award in the Apple case surpasses the patents' market value. Analysts note the average smartphone relies on about 250,000 patented technologies. "Any time a patent covers what becomes a de facto industry standard, you have problems," says Santa Clara University professor Brian J. Love.

Smartphones Challenge Chip Limits
Wall Street Journal (08/27/12) Shara Tibken

The conventional photographic process used to print circuitry on computer chips is not believed to be capable of creating the smaller patterns that chips for future devices are expected to need. Meanwhile, chip manufacturers have encountered problems developing a new technique called extreme ultraviolet (EUV) lithography: it is prohibitively expensive, and it cannot yet process chips fast enough to be practical for high-volume manufacturing. "It's not like the industry is totally hosed if it doesn't happen, but it will be bad," says analyst David Kanter. Current lithography systems use light to project patterns of circuitry onto chips, which are fabricated on silicon wafers. However, the wavelength of conventional light is now larger than the features being defined; EUV offers shorter wavelengths than current lithography tools. While research on EUV continues, manufacturers are developing methods to extend conventional lithography techniques as far as they can. "We have a path to achieve our technology goals both with and without EUV," says Intel's Yan Borodovsky. One alternative approach is directed self-assembly, in which chemicals are combined to build superfine patterns on a chip.

Capturing Movements of Actors and Athletes in Real Time With Conventional Video Cameras
Saarland University (08/28/12)

Saarland University researchers have developed a method that integrates mathematics, computer science, and video cameras to simultaneously capture the movements of several people. The researchers say their method could be useful to animation specialists, medical scientists, and athletes. Several major motion pictures recently have been made using motion-capture technology that requires the actor to wear a suit with special markers. However, the actors "dislike wearing these suits, as they constrain their movement," says Saarland professor Christian Theobalt. The researchers developed an approach that works without these special markers and captures motion in real time. "The part which is scientifically new is the way in which we represent and compute the filmed scene," Theobalt says. "It enables new speed in capturing and visualizing the movements with normal video cameras." The approach also works when the movements of several people are captured at once, when the subjects are partly obscured by objects in the studio, and against a noisy background. "Therefore we are convinced that our approach even enables motion capture outdoors, for example in the Olympic stadium," Theobalt says.

Boo! Robots Learn to Jump Like Frightened Mammals
New Scientist (08/28/12) David Hambling

Researchers in the United Kingdom are working to give robots the equivalent of the mammalian amygdala, which is the part of the brain that responds quickly to threats. Mike Hook and colleagues at Roke Manor Research of Romsey have developed STARTLE, software that uses an artificial neural network to look out for abnormal or inconsistent data. Once the software has learned what counts as ordinary, it can recognize dangers in the environment. "If it sees something anomalous then investigative processing is cued; this allows us to use computationally expensive algorithms only when needed for assessing possible threats, rather than responding equally to everything," Hook says. The amygdala helps small animals focus on complex, fast-changing surroundings, enabling them to ignore most sensory stimuli. "The key is that it's for spotting anomalous conditions, not routine ones," Hook notes. The software has been tested in robot health monitoring to respond to danger signs, in computer networks to detect security threats, and in vehicle navigation systems to monitor driving conditions.
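The gating idea Hook describes (a cheap, always-on anomaly check that cues expensive analysis only for abnormal readings) can be sketched as follows. This is an illustrative toy, not Roke Manor's STARTLE software: the z-score test, threshold, and all data are assumptions standing in for the learned neural-network model.

```python
# Toy sketch of anomaly-gated processing: a cheap statistical check
# decides when to invoke costly investigative analysis.
from statistics import mean, stdev

def is_anomalous(value, baseline, z_threshold=3.0):
    """Flag readings far outside what the system has learned is normal."""
    mu, sigma = mean(baseline), stdev(baseline)
    return abs(value - mu) > z_threshold * sigma

def expensive_threat_assessment(value):
    # Placeholder for the computationally expensive processing that is
    # only cued when something anomalous appears.
    return f"investigating reading {value}"

baseline = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1]   # learned "ordinary" data
for reading in [10.0, 10.3, 25.0]:
    if is_anomalous(reading, baseline):
        print(expensive_threat_assessment(reading))
```

Only the wildly out-of-range reading triggers the expensive path; routine readings are ignored, which is the point of the design.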

Scientists Investigate Using Artificial Intelligence for Next-Generation Traffic Control
University of Southampton (United Kingdom) (08/24/12)

Researchers at the University of Southampton continue to study the use of artificial intelligence (AI) technology for controlling traffic lights. As part of their research, the team used computer games and simulations to examine what makes good traffic control, and the results show that under the right conditions, humans are excellent at controlling traffic. The research demonstrates that humans can perform significantly better than existing urban traffic control computers. The team has developed machine-learning traffic control computers that can learn how to control lights like a human would. The machine-learning traffic control computers also can learn their own improved strategies through experience. "In transport research we are always looking ahead, and we can consider a future where all vehicles are equipped with Wi-Fi and [global positioning systems] and can transmit their positions to signalized junctions," says Southampton's Simon Box. "This opens the way to the use of artificial intelligence approaches to traffic control such as machine learning." He says AI-based approaches have the potential to improve the use of existing urban and road capacity while reducing the environmental impacts of road traffic.

Stanford Researchers Discover the 'Anternet'
Stanford Report (CA) (08/24/12) Bjorn Carey

Stanford University researchers have found that a species of harvester ant determines how many foragers to send out of the nest in much the same way that Internet protocols discover how much bandwidth is available for the transfer of data. "The algorithm the ants were using to discover how much food there is available is essentially the same as that used in the Transmission Control Protocol [TCP]," says Stanford professor Balaji Prabhakar. A feedback loop allows TCP to run congestion avoidance, and the harvester ants behave in a similar manner when searching for food: the rate at which foragers leave the nest rises and falls with the rate at which foragers return carrying food. "Ants have discovered an algorithm that we know well, and they've been doing it for millions of years," Prabhakar says. The Stanford researchers also found that the ants mirror two other phases of TCP. One, known as slow start, describes how a source sends out a large wave of packets at the beginning of a transmission to gauge bandwidth. Another mechanism, called time-out, occurs when a data transfer link breaks or is disrupted, and the source stops sending packets. Stanford professor Deborah Gordon says there are potentially many other aspects of ant colony behavior that could be used in the design of networked systems.
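The feedback loop the article compares to TCP can be sketched as a toy simulation in which outgoing foragers play the role of packets and returning foragers the role of acknowledgments. Everything here is illustrative: the update rules loosely mimic TCP slow start, additive increase, and time-out back-off, not the ants' actual dynamics or the researchers' model.

```python
# Toy TCP-style feedback for forager dispatch. All names and
# constants are invented for illustration.

def next_outflow(current_outflow, returns, slow_start=False,
                 timeout_threshold=1):
    """Adjust how many foragers leave per interval based on how many
    returned with food in the previous interval (the 'ack' signal)."""
    if returns >= timeout_threshold:
        if slow_start:
            return current_outflow * 2       # ramp up fast, like TCP slow start
        return current_outflow + 1           # additive increase while food flows
    return max(1, current_outflow // 2)      # back off when returns dry up

rate = 1
history = []
food_returns = [1, 2, 4, 8, 0, 0, 3]         # toy sequence of successful returns
for i, r in enumerate(food_returns):
    rate = next_outflow(rate, r, slow_start=(i < 3))
    history.append(rate)
print(history)                                # → [2, 4, 8, 9, 4, 2, 3]
```

The dispatch rate doubles early, creeps up while returns are plentiful, then collapses when the "link" (food supply) goes quiet, which is the qualitative behavior the researchers observed in the colony.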

'Frankenstein' Programmers Test a Cybersecurity Monster
UT Dallas News (08/24/12) LaKisha Ladson

University of Texas (UT) at Dallas researchers have developed Frankenstein, a software system that can cloak itself as it steals and reconfigures information in a computer program. "Criminals may already know how to create this kind of software, so we examined the science behind the danger this represents, in hopes of creating countermeasures," says UT Dallas professor Kevin Hamlen. He says Frankenstein is not a computer virus, but it could be used in cyberwarfare to provide a cover for a virus or another type of malicious software. "Just as Shelley's monster was stitched from body parts, our Frankenstein also stitches software from original program parts, so no red flags are raised," Hamlen says. "It looks completely different, but its code is consistent with something normal." Frankenstein takes code from programs already on a computer and repurposes it, threading it together to fulfill the malware's task with new instructions. "We wanted to build something that learns as it propagates," Hamlen says. He says the research is the first published example describing this type of stealth technology. "As a proof-of-concept, we tested Frankenstein on some simple algorithms that are completely benign," Hamlen notes.

Can 'Serious Games' Be an Effective Tool for Workplace Learning?
University College London (08/23/12)

University College London (UCL) researchers are analyzing TARGET, a computer game that could help workers develop skills such as negotiating and trust building. TARGET aims to use the Technology Enhanced Learning (TEL) environment provided by the game to support the development of workers. In each game, the user interacts with computer-based characters within a three-dimensional virtual environment and has to solve simulated project management missions, tasks, and problems. "Serious game applications include edutainment, higher education, health care, corporate, military, and non-government organizations," says UCL researcher Charlene Jennett. The researchers developed three learning measures (multiple-choice, scenario, and self-assessment questions) to assess different levels of learning in TARGET. The researchers' initial findings suggest that TARGET could be helping learners with interpreting different scenarios in the workplace. "Our findings also indicate that the TARGET system needs to be further developed in order to improve the experiences of users," Jennett says. She notes that not all serious games are effective as learning tools, which demonstrates the value of evaluation activities and "investigating whether a serious game achieves its intended learning outcomes with its intended target audience."

How to Feed Data-Hungry Mobile Devices? Use More Antennas
Rice University (08/23/12) Jade Boyd

Rice University researchers recently unveiled Argos, a multi-antenna technology that could help wireless providers keep up with the demands of data-hungry smartphones and tablets. The technology aims to increase network capacity by enabling cell towers to simultaneously beam signals to more than 12 customers on the same frequency. Rice researchers developed a prototype with 64 antennas that enables one wireless base station to simultaneously communicate directly to 15 users using narrowly focused directional beams. "The key is to have many antennas, because the more antennas you have, the more users you can serve," says Rice professor Lin Zhong. Before Argos, many researchers struggled to develop prototype test beds with just a few antennas. "There are all kinds of technical challenges related to synchronization, computational requirements, scaling up, and wireless standards," Zhong notes. The Argos prototype was developed with Rice-developed software and off-the-shelf hardware. "In Argos' case, we need only about one-sixty-fourth as much energy to serve those 15 users as you would need with a traditional antenna," says Rice researcher Clayton Shepard.
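Shepard's "one-sixty-fourth as much energy" figure matches what idealized conjugate (matched) beamforming predicts: with M antennas combining coherently, the transmit power needed for a fixed received signal level falls roughly as 1/M. The toy calculation below is not Argos code; it assumes unit-magnitude channel coefficients with random phases, a simplification of real wireless channels.

```python
# Toy illustration of the array-gain argument: matched beamforming
# with M antennas needs ~1/M the transmit power of a single antenna.
import cmath, math, random

random.seed(0)

def required_power(num_antennas, trials=200):
    """Average transmit power needed to deliver unit received power
    to one user, over random channel phases."""
    total = 0.0
    for _ in range(trials):
        # random unit-magnitude channel coefficient per antenna
        h = [cmath.exp(1j * random.uniform(0.0, 2.0 * math.pi))
             for _ in range(num_antennas)]
        # matched weights are the channel conjugates, normalized to
        # unit total transmit power; coherent sum gives amplitude
        # gain sqrt(M)
        gain = abs(sum(x * x.conjugate() for x in h)) / math.sqrt(num_antennas)
        total += 1.0 / gain ** 2          # power needed for unit reception
    return total / trials

print(required_power(1), required_power(64))   # ~1.0 vs ~1/64
```

With 64 antennas the required power drops to about 1/64 of the single-antenna case, consistent with the researchers' claim for serving their 15 users.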

Robots to Rescue Coral Reefs
Heriot-Watt University (08/22/12) Lea-Anne Henry

Heriot-Watt University researchers are developing a swarm of intelligent robots to help save coral reefs. The "coralbots," each individually following simple rules, will fix damaged parts of coral, enabling them to regrow. The approach is based on the behavior of insect swarms such as bees, wasps, and termites, which collectively build sustainable and complex structures. The researchers say the project provides an innovative solution to restore the function of reefs, both shallow and deep, around the world. The system requires that the robots first be driven by a computer and trained to distinguish coral fragments from other objects such as rock, litter, sponges, and other sea creatures. The swarm of autonomous underwater robots then will work based on a set of micro-rules to seek out coral fragments and re-cement them to the reef. "This project explores one of the most intriguing and impressive feats of natural 'swarm intelligence,' whereby collections of simple-minded individuals collaborate to construct complex and functional structures," says Heriot-Watt professor David Corne. He notes that coralbot swarms can reduce the engineering requirements for the robots.

America's Top Spies Go Up Against a Crowd
Los Angeles Times (08/21/12) Ken Dilanian

The Aggregative Contingent Estimation (ACE) program is a four-year, $50 million effort run by the U.S. Intelligence Advanced Research Projects Activity that uses crowdsourcing, in which people are paid to predict major world incidents such as terrorist strikes. The study's purpose is to determine whether the U.S. intelligence community's 17 agencies can aggregate the assessments of its many analysts to issue more precise advisories to policymakers before and during global events. Corporate and university teams compete in the program, with each team featuring more than 12 social scientists and up to 2,000 participants. One of the program's participating teams is led by University of Pennsylvania professor Philip Tetlock, who says the initial results show promise. The teams use algorithms that, over time, assign greater importance to the most accurate forecasters among their participants and winnow out the least accurate. Theoretically, crowdsourcing would entail surveying large groups across the 200,000-person intelligence community, or security-cleared outside specialists, to aggregate their opinions about how current events would unfold. Although skepticism about the value of crowd wisdom is deeply rooted in the intelligence sector, others say data-driven social science can enhance intelligence analysis.
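The weighting scheme the teams use, giving accurate forecasters more influence and winnowing out the least accurate, can be sketched as follows. The update rule, the winnowing floor, and the forecaster names are invented for illustration; the competing teams' actual algorithms are not public in this article.

```python
# Toy accuracy-weighted forecast aggregation with winnowing.

def aggregate(forecasts, weights):
    """Weighted average of event probabilities from several forecasters."""
    total = sum(weights.values())
    return sum(p * weights[f] for f, p in forecasts.items() if f in weights) / total

def update_weights(weights, forecasts, outcome, floor=0.1):
    """Shrink each forecaster's weight by its error against the realized
    outcome (1.0 if the event happened, 0.0 if not); drop anyone whose
    weight falls below the floor."""
    updated = {f: w * (1.0 - abs(forecasts[f] - outcome))
               for f, w in weights.items()}
    return {f: w for f, w in updated.items() if w >= floor}

weights = {"alice": 1.0, "bob": 1.0, "carol": 1.0}
forecasts = {"alice": 0.9, "bob": 0.2, "carol": 0.7}
print(round(aggregate(forecasts, weights), 3))   # equal weights: plain average
weights = update_weights(weights, forecasts, outcome=1.0)
print(weights)                                   # alice gains influence, bob fades
```

After a few rounds, forecasters who keep missing shrink toward the floor and are removed, so the aggregate increasingly reflects the demonstrated skill of the best participants.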

Cutting the Costs of Secure, Evolving Software
Europe's Newsroom (08/15/12)

The European Union-funded SecureChange project found that only about one-third of a program's code changed from one version to the next. However, a substantial number of vulnerabilities were inherited by each new version from its predecessor. The SecureChange researchers developed a methodology, techniques, and tools to make the software lifecycle more efficient, flexible, and secure, as well as less costly. "Our main idea was to consider change itself as a first-class citizen, using evolution rules for the software to make sure that each change respects the desired security properties," says University of Trento professor Fabio Massacci. "In this way, you automatically know that any modification satisfies your desired properties." The approach focuses on the difference between the old and new release of the software. "Test engineers can quickly and easily identify which tests are needed, what is new and what is obsolete, thereby avoiding the need to re-test millions of lines of code that have not changed and enabling them to focus their efforts on what is really new and hence potentially more risky," Massacci says. The approach also focuses on designing changes to the software in a granular fashion, so modifications to one element of the software do not impact other elements.

Abstract News © Copyright 2012 INFORMATION, INC.
Powered by Information, Inc.

To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe