Association for Computing Machinery
Welcome to the January 7, 2011 edition of ACM TechNews, providing timely information for IT professionals three times a week.



European Exascale Project Drives Toward Next Supercomputing Milestone
HPC Wire (01/06/11)

The goal of the European Exascale Software Initiative (EESI) is to help effect the migration from petascale to exascale systems over the next 10 years by bringing together industry and government organizations. "The expected outputs of the project are an exascale roadmap and set of recommendations to the funding agencies shared by the European [high performance computing (HPC)] community, on software--tools, methods, and applications--to be developed for this new generation of supercomputers," says EESI program leader Jean-Yves Berthou. EESI's first international workshop in Amsterdam convened 80 experts in the fields of software development, performance analysis, applications knowledge, funding models, and governance aspects in HPC. Eight working groups (WGs)--four focused on application grand challenges and four concentrating on enabling exaflop computing methods--have been organized to identify and classify the main challenges in their scientific area or technology component. Netherlands National Computing Facilities foundation's Peter Michielse says the workshop's purpose was twofold--to ensure that each WG was mulling the correct challenges within its scientific and technology discipline, and to become familiar with Asian and U.S. initiatives with respect to their exascale software projects. "An important role of EESI is to make sure that Europe is involved in global discussions on hardware, software, and applications design," Michielse says.

Gaming in Disaster Management With Constraints, Objectives
LiveMint (01/06/11) Priyanka Pulla

The development of disaster management protocols is the impetus behind the creation of a multiplayer game that simulates catastrophic events through a collaborative project between India's Center for Study of Science, Technology, and Policy (CSTEP) and the Defense Research and Development Organization's Center for Artificial Intelligence and Robotics. The simulation utilizes CryEngine3, the latest version of the game engine that drives first-person shooter games such as Crysis and Far Cry. Researchers at CSTEP's Next Generation Infrastructure Lab (NGIL) have been designing both paper and computer-based games in the fields of energy policy, power price discovery, and supply-chain management over the past 18 months. The games operate on the same basic model--an artificial environment that assigns players a series of constraints, lines of action, and a common goal. "Each player or stakeholder in the game has a different version of the problem," says NGIL computer scientist Bharath Palavalli. "So they work out their trade-offs and arrive at a consensus that may or may not have been expected." In addition to modeling real life, the games often present unexpected risks and challenges.

U.Va. Computer Scientists Look to Biological Evolution to Strengthen Computer Software
UVA Today (01/04/11) Zak Richards

Computer scientists at the universities of Virginia and New Mexico recently received a $3.2 million U.S. Defense Advanced Research Projects Agency grant to develop more resilient software systems based on the biological concepts of immunity and evolution, with the goal of stopping cyberattacks. The researchers say the technology could have applications in a wide range of products, including laptops, cell phones, anti-lock brakes, and artificial-heart pumps. "In biological systems, the skin and the immune system work together to fight off threats, and diverse populations mean that not every individual is vulnerable to the same disease," says Virginia professor Westley Weimer. The researchers are using genetic programming techniques to develop software that can defend against attacks and self-repair, and then pass those traits on to later generations of the software. The researchers want to ensure that the software can automatically diversify programs, which will improve resiliency. "With millions of people using the same programs, it's also easier for a single virus or invader to find just one attack surface and destroy everything," Weimer says. The researchers also want to develop adaptable software that can learn to fend off the new attacks that accompany the creation of new programs. The software also will use a distributed, decentralized search technique based on the behavior of ants, says New Mexico professor Melanie Moses.
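The evolutionary loop the article describes--mutate candidate programs, score them against a test suite, and let the fittest variants pass their traits forward--can be sketched in miniature. Everything below (the toy "program" representation, the operator set, and the test suite) is invented for illustration and is not the researchers' actual system:

```python
import random

random.seed(0)  # reproducible run for this sketch

# Toy "program": a sequence of (op, operand) steps applied to an input number.
OPS = [("add", 1), ("add", -1), ("mul", 2), ("mul", 3)]

def run(program, x):
    for op, k in program:
        x = x + k if op == "add" else x * k
    return x

# Test suite a "repaired" program must satisfy: f(x) = 2x + 1.
TESTS = [(0, 1), (1, 3), (5, 11)]

def fitness(program):
    return sum(1 for x, want in TESTS if run(program, x) == want)

def mutate(program):
    # Copy the parent and randomly replace one instruction.
    child = list(program)
    child[random.randrange(len(child))] = random.choice(OPS)
    return child

def evolve(generations=500, pop_size=20, length=2):
    pop = [[random.choice(OPS) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == len(TESTS):
            return pop[0]  # every test passes: a "repair" has been found
        # Keep the fittest half, refill with mutated copies of survivors.
        survivors = pop[: pop_size // 2]
        pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return max(pop, key=fitness)

best = evolve()
print(fitness(best), "of", len(TESTS), "tests pass")
```

The search space here is tiny, so the loop converges almost immediately; the research challenge is making the same selection pressure work on real programs with real test suites.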

Software for Programming Microbes
Technology Review (01/05/11) Katherine Bourzac

University of California, San Francisco (UCSF) researchers are working with Life Technologies to develop software that would automate the process of creating the biochemical pathways of genetically modified microbes. The goal is to enable biological engineers to design circuits for genes, proteins, and other biomolecules at a high level of abstraction, similar to the way programmers can write a new computer program without having to think about how electrons move through the gates in integrated circuits. "The vision is to take these software modules and develop them so that the process of biological parts selection and circuit design is far more automated and simplified than it is today," says Life Technologies' Todd Peterson. Thus far, the team has built a basic circuit component, a NOR gate, in E. coli bacteria. The researchers also have shown that the quality of the output of bacterial circuits can be improved by having the bacteria work collectively, forming a circuit of NOR gates, one in each cell. "If we apply computational processes to things that bacteria can already do, we can get complete control over making spider silk, or drugs, or other chemicals," says UCSF professor Christopher Voigt.
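As background for why a single gate type matters: NOR is functionally complete, so any Boolean circuit can in principle be wired from NOR gates alone, which is what makes one reliable biological NOR a general-purpose building block. A minimal sketch of that composition:

```python
def NOR(a, b):
    return not (a or b)

# NOR is functionally complete: NOT, OR, and AND can all be built
# by wiring NOR gates together.
def NOT(a):
    return NOR(a, a)

def OR(a, b):
    return NOT(NOR(a, b))

def AND(a, b):
    return NOR(NOT(a), NOT(b))

# Verify the full truth table against Python's built-in operators.
for a in (False, True):
    for b in (False, True):
        assert OR(a, b) == (a or b)
        assert AND(a, b) == (a and b)
        assert NOT(a) == (not a)
print("AND, OR, and NOT all recovered from NOR alone")
```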

CMU Research Finds Regional Dialects Are Alive and Well on Twitter
Carnegie Mellon News (PA) (01/06/11) Byron Spice

Regional slang is evident in Twitter postings, but such dialects appear to be evolving in social media, as determined by a Twitter word usage analysis method developed by Jacob Eisenstein and colleagues in Carnegie Mellon University's (CMU's) Machine Learning Department. Eisenstein says Twitter offers a new means of examining regional lexicon, since tweets are informal and conversational, while tweeters using cell phones can opt to tag their messages with global positioning system coordinates. The CMU researchers collected seven days' worth of Twitter messages in March 2010, and chose geotagged messages from Twitter users who wrote at least 20 messages. From this they generated a database of 9,500 users and 380,000 messages. The team used a statistical model to identify regional variation in word use and topics that could predict the geographical whereabouts of a microblogger in the continental United States with a median error of approximately 300 miles. The researchers can only speculate on the profiles of the microbloggers, and Eisenstein says it is reasonable to assume that users who send many tweets via cell phone are younger than average Twitter users--and this appears to be mirrored by the subjects these users tend to discuss. Through automated analysis of Twitter message streams, linguists can observe the real-time evolution of regional dialects.
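The underlying idea--that regional word choice carries a location signal--can be illustrated with a toy classifier. The regional lexicons and the add-one-smoothed unigram scoring below are invented for illustration; the CMU model is a far more sophisticated statistical model trained on real geotagged data:

```python
import math
from collections import Counter

# Hypothetical regional lexicons (invented frequencies, not the CMU data):
# relative rates of a few slang terms per region.
REGION_WORDS = {
    "north": Counter({"hella": 1, "wicked": 5, "yall": 1}),
    "south": Counter({"hella": 1, "wicked": 1, "yall": 6}),
    "west":  Counter({"hella": 6, "wicked": 1, "yall": 1}),
}

def predict_region(tweet_words):
    # Score each region by the log-likelihood of the observed words
    # under its add-one-smoothed unigram distribution.
    def score(region):
        counts = REGION_WORDS[region]
        total = sum(counts.values()) + len(counts)
        return sum(math.log((counts[w] + 1) / total) for w in tweet_words)
    return max(REGION_WORDS, key=score)

print(predict_region(["yall", "yall"]))  # leans toward "south"
```

The real model also has to infer the regions themselves and cope with the vast majority of words that carry no geographic signal, which is where the statistical machinery comes in.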

IIT-M to Aid Research on Innovative Projects
The Hindu (India) (01/05/11)

IIT-Madras will establish an interdisciplinary center of excellence for facilitating research on embedded systems, very large scale integration (VLSI) design, and enabling technologies. IIT-Madras professor Kamakoti Veezhinathan, speaking during the inaugural session of the 24th international conference on VLSI Design, said partnerships will be formed with industrial players to pursue innovative projects. More than 100 researchers, designers, and industry experts will discuss various aspects of electronic design automation and embedded systems that underpin the semiconductor industry during the three-day gathering. Participants also will address the challenges of India's VLSI sector, which is growing and needs to attract more young engineers. "Instead of training students of engineering to be 'industry-ready,' engineering colleges should equip them with the fundamentals of design and engineering, that would help them understand processes better," Veezhinathan says. India also has a shortage of skilled faculty in specialized fields such as circuit design and VLSI.

NASA Tries to Waken Robotic Mars Rover
Computerworld (01/06/11) Sharon Gaudin

The U.S. National Aeronautics and Space Administration (NASA) has been working to regain communication with the Mars rover Spirit after losing contact with the robot nine months ago. NASA researchers need to reestablish communication with the robot before the Martian spring ends in mid-March, or risk losing communication permanently. The amount of solar energy, which helps power the robot, is still increasing every day, and as long as that continues NASA will keep trying to contact the rover, according to Mars Rover project manager John Callas. Spirit's wheels got stuck in the Martian soil sometime last year, and the robot has been trying to conserve solar energy to keep itself warm over the planet's winter. However, the robot's twin, Opportunity, has had more success, having recently received new artificial intelligence software that allows the system to make some of its own decisions, such as where and when to stop and examine rocks as it moves across the Martian surface. NASA says the software offers a good test of robotic autonomy, technology that is expected to play an increasing role in future space missions.

Apache Object-Oriented Data Project Goes Top-Level
eWeek (01/05/11) Darryl K. Taft

The Apache Software Foundation (ASF) announced that its Object-Oriented Data Technology (OODT) has moved from the Apache Incubator to become a top-level project (TLP). Apache OODT is middleware for metadata, data-processing workflow, hardware, and file management. The OODT system enables distributed computing and data resources to be searched by any user. The platform is used at the U.S. National Cancer Institute's Early Detection Research Network, as well as by several programs at the U.S. National Aeronautics and Space Administration (NASA). "OODT had been successfully operating within the [Jet Propulsion Laboratory] for years; the time had come to realize the benefits of open source in the broader, external community," says Apache OODT's Chris Mattmann. OODT is the first NASA-developed project to become an ASF TLP. "The Apache Software Foundation has a long history of software innovation through collaboration--the larger the pool of potential contributors, the more innovation we see," Mattmann says.

The Power of 'Convergence'
MIT News (01/04/11)

Convergence, a new model for scientific research, has the potential to revolutionize biomedicine and other scientific fields, according to a white paper presented to the American Association for the Advancement of Science by Massachusetts Institute of Technology (MIT) researchers. Convergence involves the combination of life, physical, and engineering sciences to facilitate innovation. "Convergence is a broad rethinking of how all scientific research can be conducted, so that we capitalize on a range of knowledge bases, from microbiology to computer science to engineering design," says MIT professor Phillip Sharp. Convergence research could provide a platform for meeting future medical and healthcare challenges, but federal investment is crucial "and a smart investment if we are to keep our biomedical research the finest in the world," Sharp says. Advances in fields such as information technology, materials, imaging, nanotechnology, computing, modeling, and simulation have transformed the physical sciences and could do the same for life science. In addition to financial support, the report calls for establishing a convergence ecosystem in which connections would be made between funding agencies, reforms to the peer-review process to support interdisciplinary grants, and programs for supporting future convergence researchers.

2011: The Year of the Personal Robot?
Scientific American (01/04/11) Larry Greenemeier

Willow Garage's PR2 personal robot platform, released last year, could lead to new advances in robotic technology. "There are a lot of innovations in the PR2, but the most significant thing from my perspective is that it is a standardized, well-designed, well-tested platform that has a whole bunch of software that works right out of the box," says Georgia Tech professor Charles Kemp. Kemp and his researchers are one of 16 teams that experimented with the PR2 in 2010. The Georgia Tech researchers are focused on creating robots that can help care for senior citizens by opening doors and retrieving objects. Meanwhile, Samsung Electronics is using the PR2 to help develop the company's robotics research. And the Bosch Research and Technology Center has launched a two-year project to equip PR2-based robots with its sensor technology, including microelectromechanical systems, accelerometers, gyroscopes, force sensors, and air-pressure sensors. Future robots will play a crucial role in the aware home, says Georgia Tech professor Wendy Rogers, who is working with Kemp. "We'll be looking to determine what tasks older adults, over 65, are open to having done in the house, and then Charlie's team is going to program its PR2 to do those tasks."

Why Are Health Data Leaking Online? Bad Software, Study Says
Wall Street Journal (01/03/11) Jennifer Valentino-DeVries

A recent Dartmouth College study found that sensitive health care data is being leaked online through peer-to-peer (P2P) file-sharing services. During a two-week period in 2009, the researchers were able to use P2P services to find more than 200 files that contained Social Security numbers, insurance numbers, names, addresses, and dates of birth. In addition, the researchers found that many people were using P2P services to find sensitive documents. During their study, the researchers tracked people using search terms such as "public health passwords" and "Columbia Center for AIDS Research." M. Eric Johnson, the director of Dartmouth's Center for Digital Strategies, says the searches may have been used to find information for corporate espionage, or to find numbers that could be used to commit fraud. Johnson says the biggest culprit for data leakage is hard-to-use software. He says poorly designed programs force health care industry employees to download files onto their home computers, where they are often forgotten. Johnson says that switching to cloud computing technology would make it possible for smaller businesses to have access to software that is easier to use. However, he notes that cloud computing also opens data up to other threats, including large-scale hackers.

Mathematical Model Shows How Groups Split Into Factions
Cornell Chronicle (01/03/11) Bill Steele

A mathematical model of how social networks evolve into opposing factions under strain has been developed by Cornell researchers. Earlier models of structural balance demonstrate that under suitable conditions, a group conflict will facilitate a split into just two factions, while the new model indicates how friendships and rivalries change over time and who ends up on each side. The model consists of a simple differential equation applied to a matrix of numbers that can stand for relationships between persons, countries, or corporations. Cornell's Seth Marvel says people may forge alliances based on shared values, or may consider the social effects of allying with a specific individual. "The model shows that the latter is sufficient to divide a group into two factions," Marvel says. The model traces the division of groups to unbalanced relationship triangles that trigger changes that spread throughout the entire network. All too frequently the final state consists of two factions, each with all favorable links among themselves and all negative connections with members of the opposing faction. The model shows that if the average strength of ties across the entire system is positive, then it evolves into a single, all-positive network.
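One published formulation of this kind of continuous-time balance model takes the strikingly simple form dX/dt = X^2, where X is the symmetric matrix of tie strengths (positive entries friendly, negative hostile); assuming that form, a short NumPy simulation shows a random network hardening into sign-consistent factions as the ties diverge:

```python
import numpy as np

# A minimal sketch, assuming the dynamic dX/dt = X^2 on a symmetric
# tie-strength matrix X; the parameters and network size are arbitrary.
rng = np.random.default_rng(1)
n = 6
X = rng.normal(scale=0.1, size=(n, n))
X = (X + X.T) / 2            # relationships are mutual
np.fill_diagonal(X, 0)

# Forward-Euler integration; tie strengths blow up in finite time,
# so we stop once they are large and read off the sign pattern.
dt = 0.01
while np.max(np.abs(X)) < 1e3:
    X = X + dt * (X @ X)

signs = np.sign(X)
# The faction of node 0 is everyone it ends up on friendly terms with.
faction_of_0 = sorted(j for j in range(n) if j == 0 or signs[0, j] > 0)
print("faction containing node 0:", faction_of_0)
```

Because X stays symmetric, its eigenvectors are fixed under this dynamic and the fastest-growing (most positive) eigenvalue dominates at blow-up, which is why the final sign pattern is balanced: every triangle ends up with either zero or two negative ties.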

Better Benchmarking for Supercomputers
IEEE Spectrum (01/11) Mark Anderson

Many computer scientists say the High-Performance Linpack test used to rate the world's Top 500 supercomputers is not the best measure of supercomputer performance. "What we're most interested in is being able to traverse the whole memory of the machine," says Sandia National Laboratories researcher Richard Murphy. He and his colleagues have developed the Graph500, a new benchmark that rates supercomputers based on gigateps (billions of traversed edges per second) instead of petaflops. By the Graph500 standard, supercomputers have actually been slowing down, according to University of Notre Dame professor Peter Kogge. Over the past 15 years, each 1,000-fold increase in flops has come with a 10-fold decrease in accessible memory. According to the Graph500 standard, the top supercomputer would be Argonne National Laboratory's IBM Blue Gene-based Intrepid, which recorded 6.6 gigateps. The U.S. Defense Advanced Research Projects Agency, the Department of Energy, and the National Science Foundation also have developed a new benchmark called the HPC Challenge, which tests computing power and memory accessibility.
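The traversed-edges-per-second metric itself is easy to illustrate: time a breadth-first search and divide the number of edges examined by the elapsed seconds. The random graph generator and sizes below are arbitrary toy choices, not the actual Graph500 kernel:

```python
import random
import time
from collections import deque

def random_graph(n, m, seed=42):
    # Undirected multigraph with n vertices and m random edges.
    rng = random.Random(seed)
    adj = [[] for _ in range(n)]
    for _ in range(m):
        u, v = rng.randrange(n), rng.randrange(n)
        adj[u].append(v)
        adj[v].append(u)
    return adj

def bfs_teps(adj, source=0):
    # Breadth-first search from one source, counting every edge examined.
    seen = [False] * len(adj)
    seen[source] = True
    queue = deque([source])
    traversed = 0
    start = time.perf_counter()
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            traversed += 1          # each examined edge counts toward TEPS
            if not seen[v]:
                seen[v] = True
                queue.append(v)
    elapsed = time.perf_counter() - start
    return traversed, traversed / elapsed

edges, teps = bfs_teps(random_graph(50_000, 200_000))
print(f"{edges} edges traversed, {teps:,.0f} TEPS")
```

The point of the metric is that the score is bounded by how fast memory can serve irregular accesses, not by how fast the floating-point units run, which is exactly the property Linpack does not stress.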

Abstract News © Copyright 2011 INFORMATION, INC.
Powered by Information, Inc.

To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe