Association for Computing Machinery
Welcome to the June 24, 2009 edition of ACM TechNews, providing timely information for IT professionals three times a week.

HEADLINES AT A GLANCE


NTSB to Look at Possible Computer Role in D.C. Crash
Computerworld (06/23/09) Thibodeau, Patrick

The U.S. National Transportation Safety Board (NTSB) will explore whether computer systems, sensors, or cell phones contributed to the June 22 Washington, D.C., Metrorail accident that killed nine people. Although there are several possible explanations for why the Washington Metropolitan Area Transit Authority (WMATA) train crashed into the rear of another train, the WMATA computer systems will likely be closely examined because they are designed to prevent such rear-end accidents. The computer systems constantly make decisions on train speed using data from track-bed sensors that monitor train movement. NTSB investigators will likely try to rule out possible causes, such as a misconfigured control system, a physical computer or hardware failure, or a security breach, says consultant Kegan Kawano. Security breaches have been known to happen in transportation systems, and Kawano says he is aware of 10 security incidents in transit systems since 2003. For example, a Polish teen allegedly derailed a train by hacking the network, and in 2003 a widespread worm affected systems used by rail hauler CSX Corp., causing the company to halt some passenger and freight service. Kawano says the design of rail automation systems is so specialized that hackers often cannot figure out how to access them. NTSB investigator Debbie Hersman also says the agency will examine the actions of onboard operators and investigate the possibility of a mechanical failure.


IBM, Cray Lead Top 500 Supercomputer Rankings
Network World (06/23/09) Brodkin, Jon

The top two computers from last year are still the most powerful machines on the newest release of the Top 500 Supercomputer Sites list. The total combined performance of all the machines on the list reached 22.6 petaflops, nearly twice the combined performance of last year's list. The top machine remains the IBM Roadrunner system at the U.S. Department of Energy's Los Alamos National Laboratory, which has a performance of 1.105 petaflops. The Cray XT5 Jaguar system at Energy's Oak Ridge National Laboratory remains the second most powerful supercomputer, with a performance of 1.059 petaflops. The highest-ranking new supercomputer on the list is JUGENE, an IBM BlueGene/P system in Germany at the Forschungszentrum Juelich research center. JUGENE, which runs 825.5 trillion calculations per second and is capable of a theoretical peak performance of more than a petaflop, took over the third spot on the list, displacing a NASA machine called Pleiades. Only two supercomputers in the top 10 are located outside the United States: JUGENE and Europa, a computer at the Juelich research center that is capable of 274.8 trillion calculations per second. The last machine on the list, which runs 17.1 trillion calculations per second, would have placed 274th only six months ago.
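As a quick sanity check on the units above (a petaflop is 10^15 floating-point operations per second, and the "trillion calculations per second" figures are teraflops, 10^12), the article's numbers can be converted directly. The variable names below are invented for illustration; the figures themselves come from the article.

```python
TERA = 1e12
PETA = 1e15

def tflops_to_pflops(tflops):
    """Convert trillions of calculations per second (teraflops) to petaflops."""
    return tflops * TERA / PETA

jugene_pflops = tflops_to_pflops(825.5)   # JUGENE's measured rate
europa_pflops = tflops_to_pflops(274.8)   # Europa's rate
last_place_pflops = tflops_to_pflops(17.1)  # slowest machine on the list

print(f"JUGENE: {jugene_pflops:.4f} PFLOPS")  # 0.8255 PFLOPS
```

The conversion confirms that JUGENE's 825.5 teraflops (0.8255 petaflops) sits below the two petaflop-class leaders, consistent with its third-place ranking.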


Living Safely With Robots, Beyond Asimov's Laws
PhysOrg.com (06/22/09) Zyga, Lisa

As robots increasingly move from industrial environments to the real world, human safety has become an important issue. Unlike industrial robots that perform repetitive tasks in controlled environments, household and real-world robots will have a degree of autonomy and will work in ambiguous, human-centered environments. Before such robots become widespread, regulators are trying to determine how to approach the safety and legal issues that may arise. In a study published in the International Journal of Social Robotics, researchers proposed a framework for a legal system for next-generation robot safety issues. The goal is to ensure safer robot design through safety intelligence and to provide a method for handling accidents when they occur. The guiding principle of the study's proposed system is to categorize robots as third-existence entities, since they are neither living/biological (first existence) nor non-living/non-biological (second existence). Third-existence entities will resemble living things in appearance and behavior, but will not be self-aware. Robots are currently legally classified as second existence, or human property, but the authors believe that a third-existence classification could simplify incidents involving accidents and responsibility. A major component of integrating robots into human society will involve dealing with open texture risk, or risk that occurs due to unpredictable interactions in unstructured environments, such as getting robots to understand the subtleties of human language.


Where's the 'C' in STEM?
ACM (06/24/09)

The Computer Science and Information Technology (CS&IT) Symposium will take place Saturday, June 27, 2009, in Washington, D.C. Computer science education must not be left out of the initiative to define strong standards in science, technology, engineering, and math (STEM) for K-12 students, says Chris Stephenson, executive director of the Computer Science Teachers Association (CSTA), which is hosting CS&IT. "Inclusion in the larger skills and standards discussion will put us on the right path to ensuring that students have access to the skills and rigorous courses they need to thrive in the globalized knowledge economy," Stephenson says. CSTA cites the recent study by ACM and WGBH, which found that 74 percent of boys view majoring in computer science favorably, but only 32 percent of girls do. CS&IT will offer sessions on timely issues such as communicating the importance of computational thinking across all subjects and attracting minorities and girls to computing. Speakers at CS&IT, which is held in conjunction with the National Educational Computing Conference, will include University of California, Irvine professor Debra Richardson, who will discuss the need to better prepare K-12 students for college. "This year's symposium will not only show computing educators how to leverage the latest tools and technology to effectively teach computer science, but it will also give them the knowledge needed to ignite students' passion to excel at and pursue computing fields," Stephenson says.


Bringing Girls and Boys to Computer Science With 'Alice'
Duke University News & Communications (06/22/09) Basgall, Monte

An animation program called Alice, developed by the late Randy Pausch of Carnegie Mellon University, is being used by computer scientists around the country to engage young students in computer programming by encouraging them to create their own worlds, without realizing they are actually writing code. Duke University professor Susan Rodger says attracting females is key to the future of computer science. In 2008, only 11.8 percent of U.S. bachelor's degrees in computer science were awarded to women, according to the Computing Research Association. Rodger believes that Alice breaks down stereotypes about what computer science is, and gets students involved by challenging them with problems and allowing them to invent scenarios within virtual worlds that are created by the students and populated with the objects, animals, and characters that they choose. "It's a very easy way to learn programming if you've never done it before," she says. "It's hard to make mistakes. What you learn in the process are computer science concepts." Rodger has introduced Alice in Durham-area middle schools, and the National Science Foundation has funded similar programs in Denver, San Jose, Charleston, S.C., and Oxford, Miss. Duke also is hosting Alice events for teachers, including a symposium and a series of workshops, to encourage them to incorporate Alice into their classes.


Traveling the Web Together
Technology Review (06/23/09) Naone, Erica

Researchers from the College of William & Mary have developed real-time collaborative browsing (RCB) software that makes it easier for users to interact with each other while browsing the Web. Several ways of navigating the Internet collaboratively already exist, but all of them are limited in some way, such as not allowing users to browse together at the same time or limiting interactions to a single Web page. Screen sharing can allow users to browse together as if sharing a single machine, but such programs usually require connecting to an outside server. William & Mary professor Haining Wang says that with RCB, only the person leading the session needs to have the browser extension installed, and all other users only need a standard Web browser. The leader uses RCB to generate a session URL that can be sent to other participants, who can click on the link to join the session. Once connected, both users can interact with a Web page and follow links, with all actions being funneled through the host's browser. The host can add or remove participants as needed, connecting up to 10 participants without a significant drop in performance, though the researchers say RCB works best between two people. RCB is not yet available to the public, but the researchers recently presented their work at the Usenix Technical Conference.
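The session flow described above (a leader generates a joinable URL, participants join with any browser, and every action funnels through the host) can be modeled in a few lines. This is a hypothetical sketch of the workflow only, not the William & Mary implementation; the class, method names, and URL scheme are invented.

```python
import secrets

class CoBrowseSession:
    """Toy in-memory model of leader-hosted collaborative browsing."""

    def __init__(self, leader_host, max_participants=10):
        self.leader_host = leader_host
        self.token = secrets.token_urlsafe(8)   # unguessable session identifier
        self.participants = {leader_host}
        self.max_participants = max_participants
        self.history = []                       # all actions, in host order

    def session_url(self):
        # The leader sends this link to invitees.
        return f"http://{self.leader_host}/rcb/join?session={self.token}"

    def join(self, user, token):
        if token != self.token:
            raise PermissionError("bad session token")
        if len(self.participants) >= self.max_participants:
            raise RuntimeError("session full")
        self.participants.add(user)

    def act(self, user, action):
        # Every participant action is funneled through the host's state.
        if user not in self.participants:
            raise PermissionError("not in session")
        self.history.append((user, action))

session = CoBrowseSession("leader.example.edu")
session.join("guest", session.token)
session.act("guest", ("follow-link", "http://example.com/page2"))
```

The single ordered `history` list is what keeps every browser showing the same page: since all clicks pass through the host, participants cannot diverge.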


Beating the Bullies--Changing Real-World Behaviour Through Virtual Experience
ICT Results (06/22/09)

The European Education through Characters with emotional-Intelligence and Role-playing Capabilities that Understand Social interaction (eCIRCUS) project aims to create virtual worlds with characters that children can interact and empathize with on such a powerful level that it helps them change their own behavior. The project has developed two programs, FearNot! and ORIENT, to give students helpful roles in interactive virtual worlds that can encourage them to change their thoughts, feelings, and actions. ECIRCUS researchers first focused on helping children who were the victims of bullying by drawing from recent psychological theories that highlight the importance of feelings for changing how people treat each other. The researchers updated the FearNot! computer program, which was developed as part of an earlier European research effort, by expanding and enriching its content, and made the program more open-ended. For example, the researchers gave virtual bullying victims the ability to remember strategies they have tried, enabling the characters to reject approaches that have failed and to ask children in the same situation to suggest better solutions. In the second program, ORIENT, three students are equipped with various handheld control devices and sent as a team to save the planet Orient, a virtual planet populated by frog-like aliens called Sprytes. Students must learn about the Sprytes and empathize with them to help them. The purpose of ORIENT is to make players feel alone in an alien culture, and to help them learn to appreciate people who are different.


House S&T Committee Discusses Cyberspace Policy Review Report With Federal Agencies
Computing Research Association (06/19/09) Gandomi, Nathan

The U.S. House Science and Technology Committee recently held a hearing to review responses from the Department of Homeland Security, the National Institute of Standards and Technology (NIST), the National Science Foundation (NSF), and the Defense Advanced Research Projects Agency regarding the Obama administration's recently released Cyberspace Policy Review report. The report presents several near- and mid-term actions that involve federal agency efforts in research and development, education, standards, information coordination, and interagency collaboration. Technology and Innovation Subcommittee chairman David Wu (D-Ore.) said previous federal cybersecurity efforts were too "output oriented" and not "outcome driven," and was hopeful that the new administration will focus on achieving fewer breaches of federal systems, reducing incidents of identity theft, and ensuring the security of smart grid systems and health IT systems. Research and Science Education Subcommittee chairman Daniel Lipinski (D-Ill.) argued for greater collaboration between public and private sectors to expose weaknesses in security and share breach information, and for a multidisciplinary approach to cybersecurity to provide a better understanding of how people interact with computers and information. Cita Furlani, director of NIST's Information Technology Laboratory, said NIST will work with federal, state, local, private, and academic institutions to develop information security standards. Jeannette Wing, assistant director of NSF's Directorate for Computer & Information Science & Engineering, called for increased cybersecurity research openness and more collaborative research between industry and academia.


Indiana University, Technische Universitat Dresden Collaborate on Improved Life Sciences Data Transfer
Indiana University (06/19/09)

Information technology leaders from Indiana University (IU) and Germany's Technische Universitat Dresden (TUD) recently announced a collaborative effort to improve the levels of cooperation between scientists and medical researchers in the United States and Europe. Other partners in the effort include the Center for Information Services and High Performance Computing (ZIH) at TUD, and Indiana University's Pervasive Technology Institute (PTI) and School of Informatics. The first project the two universities will work on involves developing new approaches to biological data set management and trans-Atlantic data transfer. "IU and PTI are doing work related to data management that holds significant value both within the state of Indiana and internationally," says PTI executive director Craig Stewart. IU technologists are currently visiting Dresden, Germany, to work with ZIH on optimizing trans-Atlantic data transfers, using IU's high performance data storage system, the Data Capacitor. The researchers also are developing new techniques for managing data sets, particularly those from the biological sciences. "Having the ability to quickly access very large data sets across the Atlantic greatly expands the potential for successful international research collaboration and new scientific and medical discoveries," says Data Capacitor project leader Stephen Simms.


Interactive Robot Guided By Sensors--Not Remote
Futurity.org (06/16/09) Leonard, Jenny

Brown University professor Chad Jenkins and his team have developed a robot capable of holding a conversation, gesturing, and following a human's movement without using remote control devices. "We need robots that can adapt over time, that can respond to human commands and interact with humans," says Jenkins, the director of Brown's Robotics, Learning, and Autonomy (RLAB) group. One of RLAB's projects uses robotic soccer and a Nintendo Wii remote to enable users to control robots in the game from the robot's perspective. "The player sees what the robot sees, and decides what it should do in a given situation," Jenkins says. "The person knows what he wants the robot to do, yet the robot's control policy--the entity that makes decisions for it--may not be capable of reflecting that." The input from the human players is used to refine the robot control policy, helping the robot to improve its locomotion and manipulation skills. In another RLAB project, the objective is to make robots more closely reflect the will and behavior of humans. With a NASA humanoid upper-body robot, the researchers use motion-capture systems to record human movement in three dimensions and translate that movement into digital models that can be used to create a more effective robot control policy. The new policy has enabled the robot to replicate basic human motion and manipulate objects. Jenkins also is developing interfaces that could be used with a neural cursor control system developed by Brown neuroscience professor John Donoghue.
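The idea of refining a control policy from human input can be illustrated with the simplest form of learning from demonstration: act as the human did in the most similar previously recorded state. This is an illustrative stand-in, not RLAB's actual algorithm; the state features and actions below are invented.

```python
def make_policy(demonstrations):
    """Build a nearest-neighbor policy from (state_vector, action) pairs
    recorded while a human operated the robot."""
    def policy(state):
        # Act like the human did in the most similar recorded state
        # (smallest squared Euclidean distance).
        _, action = min(
            demonstrations,
            key=lambda pair: sum((s - x) ** 2 for s, x in zip(pair[0], state)),
        )
        return action
    return policy

# Toy soccer demonstrations: (ball_distance, ball_angle) -> command.
demos = [
    ((0.2, 0.0), "kick"),
    ((2.0, 0.5), "walk"),
    ((1.0, -1.0), "turn"),
]
policy = make_policy(demos)
print(policy((0.25, 0.1)))  # nearest demo is (0.2, 0.0) -> "kick"
```

Collecting more demonstrations simply grows the list, which is one way "input from the human players" can incrementally refine what the policy does in unfamiliar states.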


Computer Scientists Model Cell Division
Harvard University (06/18/09) Rutter, Michael Patrick

Harvard University computer scientists have developed a model for studying the arrangement of tissue networks that are created by cell division. "We developed a model that allows us to study the topologies of tissues, or how cells connect to each other, and understand how that connectivity network is created through generations of cell division," says Harvard professor Radhika Nagpal. "Given a cell division strategy, even if cells divide at random, very predictable 'signature' features emerge at the tissue level." The new framework could yield new insights into how multicellular systems achieve, or fail to achieve, robustness from the seemingly random behavior in groups of cells, and help researchers looking to artificially emulate complex biological behavior. Using the computational model, Nagpal and her colleagues demonstrated that the regularity of the tissue can act as an indicator for inferring properties about the cell division mechanism itself. "Even with modern imaging methods, we can rarely directly 'ask' the cell how it decided upon which way to divide," Nagpal says. "The computational tool allows us to generate and eliminate hypotheses about cell division." The researchers plan to use the new model to detect and study mutations that adversely affect cell division processes in epithelial tissues, which can cause cancer. "One day we may even be able to use our model to help researchers understand other kinds of natural cellular networks, from tissues to geological crack formations, and, by taking inspiration from biology, design more robust computer networks," Nagpal says.
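The "predictable signature from random division" effect can be illustrated with a toy topological simulation that tracks only each cell's neighbor count ("sides"). When a cell with p sides divides, the cleavage plane cuts two of its junctions, the daughters share one new wall, and the two cut neighbors each gain a side. The bookkeeping rules here are an assumption for illustration, not the Harvard model itself.

```python
import random

def simulate(n_cells=100, divisions=2000, seed=1):
    """Grow a tissue by random cell division, tracking sides per cell."""
    rng = random.Random(seed)
    sides = [6] * n_cells              # start from a hexagonal sheet
    for _ in range(divisions):
        i = rng.randrange(len(sides))
        p = sides.pop(i)               # parent cell with p neighbors
        a = rng.randint(0, p - 2)      # split the p-2 non-cut neighbors
        sides.append(a + 3)            # a neighbors + 2 cut cells + sibling wall
        sides.append((p - 2 - a) + 3)  # the rest + 2 cut cells + sibling wall
        for _ in range(2):             # each cut neighbor gains one side
            sides[rng.randrange(len(sides))] += 1
    return sides

tissue = simulate()
mean_sides = sum(tissue) / len(tissue)
print(round(mean_sides, 2))  # 6.0: each division adds 6 sides and 1 cell
```

Even though every choice is random, a tissue-level signature emerges: each division adds exactly six sides and one cell, so the mean neighbor count stays pinned at six while the shape of the side distribution reflects the division rule, which is the kind of inference-from-regularity the article describes.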


Managing the Data Deluge
University of Texas at Austin (06/05/09) Dubrow, Aaron; Singer-Villalobos, Faith

The Texas Advanced Computing Center (TACC) at the University of Texas recently unveiled Corral, a central repository for data collections designed to handle the processing requirements of data-driven science. Corral features 16 server nodes and 1.2 petabytes of storage, which is four times larger than any other data-collection resource on the TeraGrid. Larger and more sophisticated computing and visualization systems, such as Ranger, Lonestar, and Stallion, generate immense amounts of data, which needs to be properly stored and managed. Furthermore, traditional data-collection repositories, such as museums and physical archives, are reinventing themselves for the 21st century. Corral addresses the need for digital preservation and document and specimen management, and provides archives that allow data to be shared and explored more thoroughly than previously possible. "We're ahead of the curve in terms of providing this kind of dedicated data collection and application resource," says Chris Jordan, who is responsible for data infrastructure at TACC. "A lot of other sites are doing data collections, but very few sites are providing this kind of universally accessible, unified resource." TACC expects the repository to be completely full within two years and has designed the system so that it can be made up to 10 times larger. "The advantage of having Corral is that we have the ability to offer services based on new methodologies, and to support them in a very flexible way," Jordan says. "This gives us the opportunity to learn what some of the best practices are and share that information with a wide variety of projects."


Abstract News © Copyright 2009 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]



