Association for Computing Machinery
Welcome to the June 10, 2009 edition of ACM TechNews, providing timely information for IT professionals three times a week.

HEADLINES AT A GLANCE


China Dominates NSA-Backed Coding Contest
Computerworld (06/08/09) Thibodeau, Patrick

The TopCoder Open, a National Security Agency-supported contest that tests individuals' programming and technology design skills, was dominated by competitors from China and Russia. Of the 70 finalists, 20 were from China, 10 were from Russia, and two were from the United States. Approximately 4,200 people participated in the contest, which is open to anyone, including 894 contestants from China, 705 from India, 380 from Russia, 234 from the United States, 214 from Poland, 145 from Egypt, and 128 from Ukraine. The vast majority of contestants, 93 percent, were male, and 84 percent were between the ages of 18 and 24. TopCoder's Rob Hughes says the success of China, Russia, and Eastern Europe is the result of the importance those countries place on mathematics and science education. "We do the same thing with athletics here that they do with mathematics and science there," Hughes says. The United States needs to make greater efforts to teach math and science in middle school and high school and to get children involved in those subjects, he says. More than 57 percent of the contest's participants held bachelor's degrees, mostly in computer science; 20 percent had earned a master's degree, and 6 percent a Ph.D. The winner of the algorithm competition was an 18-year-old student from China. TopCoder contestants are tested in design, development, and architecture, among other areas. This year, competitors were challenged to unscramble and label two scrambled and erased social networks to determine whether they could be from the same group of people, a problem known as the network isomorphism problem. Two contestants solved the problem.
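At its core, the challenge asks whether two networks are identical under some relabeling of their nodes. The toy sketch below is purely illustrative and is not the contestants' approach (contest-scale instances demand canonical labeling, graph invariants, and aggressive pruning), but it makes the problem concrete for small graphs:

    # Toy brute-force check: are two small graphs the same under some
    # relabeling of nodes 0..n-1? Illustrative only; large inputs need
    # canonical labeling, invariants, and heavy pruning.
    from itertools import permutations

    def are_isomorphic(edges_a, edges_b, n):
        a = {frozenset(e) for e in edges_a}   # undirected edge sets
        b = {frozenset(e) for e in edges_b}
        if len(a) != len(b):
            return False
        for perm in permutations(range(n)):   # try every relabeling
            if {frozenset((perm[u], perm[v])) for u, v in a} == b:
                return True
        return False

    # Two triangles with different labels are one network in disguise.
    print(are_isomorphic([(0, 1), (1, 2), (0, 2)],
                         [(2, 0), (0, 1), (2, 1)], 3))   # True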


Opening Doors on the Way to a Personal Robot
New York Times (06/09/09) P. D2; Markoff, John

The ability to open doors is a significant step forward for robotics and an important milestone on the way to a personal robot industry. That milestone has been reached by Willow Garage's PR2, an experimental wheeled machine that opened and passed through 10 doors and plugged itself into 10 standard wall sockets to recharge, all in less than 60 minutes. The PR2 has a maximum travel speed of 1.25 miles per hour, and it perceives its surroundings with a combination of sensors that include scanning lasers and video cameras. Stanford University roboticist Andrew Ng says PR2 appears to be the first robot capable of repeatedly and reliably opening doors and plugging itself in. Willow Garage is attempting to develop a new generation of robotic personal assistants by building the Robot Operating System (ROS), an open source project that seeks to take advantage of contributions from robotics experts around the world. Willow Garage CEO Steve Cousins says that a team of University of Tokyo roboticists recently tweaked the Willow Garage ROS to function on a machine they were developing. "The eventual goal is to provide a set of capabilities that are so generic and so universal that they can be used as building blocks in more complicated applications," says Willow Garage board member and Stanford Artificial Intelligence Laboratory director Sebastian Thrun.
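To illustrate the building-block idea, here is a minimal sketch of a node written against early rospy conventions; the node name, topic, and message content are invented for illustration, and API details vary across ROS releases:

    #!/usr/bin/env python
    # Minimal node sketch: publish a status message once per second.
    # Any other node (a battery monitor, a logger) can subscribe to
    # the 'status' topic without knowing this node's internals.
    import rospy
    from std_msgs.msg import String

    def talker():
        pub = rospy.Publisher('status', String)   # advertise a topic
        rospy.init_node('door_opener')            # register with the master
        rate = rospy.Rate(1)                      # loop at 1 Hz
        while not rospy.is_shutdown():
            pub.publish(String('door opened'))
            rate.sleep()

    if __name__ == '__main__':
        talker()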


Extracting Meaning From Millions of Pages
Technology Review (06/10/09) Talbot, David

University of Washington researchers have developed an automated information extraction software engine that mines meaning from more than 500 million Web pages, contributed by Google, by analyzing fundamental relationships between words. The project expands the scale of the TextRunner application in terms of the number of pages and the breadth of topics it can examine. "The significance of TextRunner is that it is scalable because it is unsupervised," says Google research director Peter Norvig. "It can discover and learn millions of relations, not just one at a time. With TextRunner, there is no human in the loop: It just finds relations on its own." University of Washington researcher and project leader Oren Etzioni says the prototype still has a simple interface and is meant to function as a demonstration of automated information extraction rather than as a public search tool. "This work reflects a growing trend toward the design of search tools that actively combine the pieces of information they find on the Web into a larger synthesis," notes Cornell University scientist Jon Kleinberg. The University of Washington researchers are now working on drawing inferences from natural-language queries, using TextRunner as a jumping-off point.
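As a rough illustration of the unsupervised tuple-extraction idea (TextRunner itself uses a self-trained extractor over parsed text, not the single hand-written pattern shown here), consider this toy sketch:

    # Toy open information extraction: pull (argument, relation,
    # argument) tuples from raw text with one hand-written pattern.
    import re

    PATTERN = re.compile(
        r'([A-Z][a-z]+) (is a|was born in|invented) ([A-Z][a-z]+)')

    def extract(text):
        return [m.groups() for m in PATTERN.finditer(text)]

    corpus = ("Edison invented Phonograph. Einstein was born in Ulm. "
              "Paris is a City. Turing was born in London.")
    for arg1, rel, arg2 in extract(corpus):
        print(arg1, '--', rel, '-->', arg2)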


Computing Research That Changed the World--VIDEOS!
Computing Community Consortium (06/07/09)

The Computing Community Consortium held a day-long symposium at the Library of Congress in late March titled "Computing Research That Changed the World: Reflections and Perspectives." Ed Lazowska of the University of Washington gave the introductory talk, "Changing the World," and the symposium offered four sessions, which are now available as videos at http://www.cra.org/ccc/locsymposium_slides.php. The first session, The Internet and the World Wide Web, included a presentation by the University of California, Berkeley's Eric Brewer on "The Magic of the 'Cloud': Supercomputers for Everybody, Everywhere," Google's Alfred Spector's talk "Why We're Able to Google," and Carnegie Mellon University's Luis von Ahn's presentation "Human Computation." The second session, Evolving Foundations, included the talk "Global Information Networks" by Jon Kleinberg of Cornell University, winner of the 2008 ACM-Infosys Foundation Award in the Computing Sciences, and the presentation "Security of Online Information" by the Massachusetts Institute of Technology's (MIT's) Barbara Liskov, winner of the 2008 ACM A.M. Turing Award. The third session, Transformation of the Sciences via Computation, featured "Supercomputers and Supernetworks are Transforming Research" by Larry Smarr of the University of California, San Diego, and "Computing and Visualizing the Future of Medicine" by Chris Johnson of the University of Utah. In the fourth session, Computing Everywhere!, UCLA's Deborah Estrin discussed sensing, Stanford University's Pat Hanrahan focused on pixels, and MIT's Rodney Brooks addressed robotics. Videos of all the presentations will soon be available on the ACM Queue Web site, queue.acm.org.


No More Geeky Glasses to Watch 3D
ICT Results (06/10/09)

European researchers are working on technology that may eliminate the need to wear special eyewear to watch three-dimensional (3D) video. The breakthrough was accomplished through the HOLOVISION project, which ended in April 2008, and the OSIRIS project, which will be completed at the end of 2009. The chief goal of the HOLOVISION effort was to develop technologies that could generate a very high-resolution 3D image. "We basically organized projection engines in a special way and used holographic imaging film for the display screen," says Akos Demeter of Holografika. "The combination of these, with the projection engines being driven by a cluster of nine high-end PCs, and new sophisticated software, allowed us to achieve our aims." HOLOVISION yielded a prototype system with about 10 times the resolution of high-definition TV at 25 frames per second in six colors, rather than the standard three. One of OSIRIS' key objectives is the development of high-resolution, big-screen, reflective-projection 3D cinema, and the prototype being worked on has a wall-mounted screen and a ceiling-mounted projector. The OSIRIS technology employs an array of mirrors and light sources to provide the re-projected images and give the screen a depth of between 15 and 20 inches. Holografika's Zsuzsa Dobranyi envisions military combat training and gaming as potential applications for the technology, while computer-aided design and other industrial and professional applications could be rolled out in a matter of months.


The Future of Robots Is Rat-Shaped
Agence France Presse (France) (06/07/09) Hautefeuille, Annie

Some roboticists believe that artificial intelligence researchers are following the wrong path by trying to replicate human intelligence, and that a better approach would be to start at a lower level and work out the simpler abilities that humans and animals have in common, such as navigating, avoiding danger, and searching for food. France's Institute for Intelligent Systems and Robotics (ISIR) is working on a robot whose intelligence and body are modeled after the rat. ISIR's rat robot, called Psikharpax, uses wheels to move around and has two cameras for eyes and a pair of microphones for ears. The robot rat is outfitted with artificial whiskers to sense obstacles, as its real-life counterpart does. Data from the whiskers is fed into a chip whose software hierarchy mirrors the structures in a rat's brain that process and analyze visual, auditory, and tactile input. The goal of the experiment is to train the rodent droid to survive in new environments by detecting and circumventing objects, avoiding collisions, and spotting opportunities to recharge itself. "We want to make robots that are able to look after themselves and depend on humans as little as possible," says ISIR researcher Agnes Guillot. "If we want to send a robot to Mars, or to help someone in a flat that we don't know, the robot has to have the ability to figure things out for itself."
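As a hypothetical illustration of the lowest layer of such a controller, the sketch below maps whisker deflections directly to wheel speeds so the robot veers away from whatever it brushes; the real Psikharpax layers learned, brain-inspired processing on top of reflexes of this kind:

    # Reflex layer: map whisker deflections (0 = no contact, 1 = full
    # contact) to wheel speeds with a crossed coupling, so a touch on
    # the left slows the right wheel and turns the robot away.
    def avoid(left_whisker, right_whisker, base_speed=0.2):
        left_wheel = base_speed * (1.0 - right_whisker)
        right_wheel = base_speed * (1.0 - left_whisker)
        return left_wheel, right_wheel

    # Obstacle brushing the left whisker: robot veers right.
    print(avoid(0.8, 0.0))   # (0.2, 0.04)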


World's Best Data Mining Knowledge and Expertise on Show in Paris at KDD-09
Business Wire (06/08/09)

Knowledge Discovery and Data Mining 2009 (KDD-09), organized by ACM's Special Interest Group on Knowledge Discovery and Data Mining, will offer more than 120 presentations by data-mining experts from around the world and is expected to draw more than 600 leading data-mining researchers, academics, and practitioners. "Some of the best minds from the scientific and business communities will be there, ready and willing to share the results of their cutting-edge research and data-mining projects with end users," says KDD-09 joint chair Francoise Soulie Fogelman. "No other industry event offers anything like the depth and breadth of expertise on offer here." Social network analysis will be a focus of KDD-09, and experts also will examine the use of data mining in real-time Web applications for customized advertising and personalized offers. KDD-09 will take place June 28 through July 1 in Paris.


Is the Hacking Threat to National Security Overblown?
Wired News (06/03/09) Singel, Ryan

U.S. President Obama recently made cybersecurity a national priority, but at ACM's Computers, Freedom, and Privacy conference, Threat Level editor Kevin Poulsen asked whether hacking and cyberattacks are an actual threat to the United States or simply the latest exaggerated threat to national security. Former Bush administration cybersecurity czar Amit Yoran says that hacking is absolutely a national security threat, citing the denial-of-service attacks against Estonia, attacks against government contractor Booz Allen Hamilton, and the recently reported breach of defense contractor computers that gave the attackers access to information on the Joint Strike Fighter. Poulsen calls the threat of cyberterrorism "preposterous," pointing to the long-predicted hacker attack on the power grid, which has never happened, and arguing that labeling such potential attacks national security threats means that information about attacks that have been defeated is unnecessarily classified. "If we can't publicly share info that the attackers already have--since it's about them--then we are doing far more harm than good," says Poulsen, who argues that classification makes it impossible for the security community as a whole to prepare defenses against such attacks. Furthermore, Poulsen points out that the Joint Strike Fighter attack involved only unclassified information. However, security expert Bruce Schneier says there will be cyberattacks that affect the real world, though the current threat is exaggerated. "Passive defenses alone are not sufficient," says National Research Council cyberattack expert Herb Lin. "You have to impose costs on an attacker and maybe the only way to do that is a cyberattack yourself."


Visual System that Detects Movement, Colors and Textures Created in Granada
Plataforma SINC (06/08/09)

A University of Granada research team has evaluated the accuracy of several models that estimate movement, and combined the responses of four movement detection cells, two of which are static, responding in an on/off fashion, and two of which are transitory, responding in varying degrees. The objective was to create an artificial retina by combining movement and attention data based on information provided by a visual system capable of selectively capturing moving objects in real time. An event-driven model, which allows the system to focus only on areas of activity, was critical to the process in both the movement processing model and in the multimodal selective attention model. One result of the study is the ability to estimate movement reasonably accurately using the responses from the four cells. "By selecting only 10 to 20 percent of the information, which we selected on the basis of reliability of the measurements, we obtained precise results at a lower computational cost and with greater stability," says Granada researcher Fran Barranco. The researchers also developed advanced integrated intelligent sensors that can pre-process a scene using techniques similar to those used by retinas. The devices created by the project are designed for use in video surveillance and monitoring applications, though their low power consumption also makes them well suited for implants in patients or for research dedicated to understanding the brain, particularly the visual system.
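The reliability-gating idea can be sketched as follows; the data and the confidence measure here are invented for illustration and are not the Granada team's code:

    # Keep only the most confident 15 percent of local motion
    # measurements, then average them into a global estimate.
    import random

    random.seed(1)
    # simulated (motion_estimate, confidence) pairs from 200 detectors
    measurements = [(random.gauss(2.0, 0.3), random.random())
                    for _ in range(200)]

    best = sorted(measurements, key=lambda m: m[1], reverse=True)
    best = best[:len(best) * 15 // 100]        # top 15% by confidence

    estimate = sum(v for v, _ in best) / len(best)
    print('motion estimate: %.2f px/frame from %d samples'
          % (estimate, len(best)))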


Supercomputing From Clusters to Clouds
Northwestern University News Center (06/03/09) Monroe, Ian; Berret, Charles; Leonard, Michael Scott

The Intrepid supercomputer is the largest installation of IBM's Blue Gene architecture to date. Part of the Argonne Leadership Computing Facility, the $77 million supercomputer, the fifth fastest in the world, is used in a limited number of projects selected by the U.S. Department of Energy for their potential scientific significance. Some of the projects that will use the supercomputer this year include an experiment by University of Washington researcher David Baker designed to predict protein structures, and an Argonne project that will examine fluid flow inside nuclear reactor cores. The supercomputer draws a massive amount of power, 1.2 megawatts, for both its computing and ancillary systems, which is why Argonne scientists are working on ways of optimizing when software runs based on the heat it generates. Argonne's Mike Papka says some software runs hotter than others, and programs that continuously run a processor at full capacity could run approximately 20 degrees hotter than programs that run intermittently. Part of Argonne's efforts to advance supercomputing is Nimbus, cloud computing software that allows nontechnical users to set up a virtual supercomputer at a low cost and in only a few minutes. Cloud computing can be used to pool the processing power, memory, and storage space of thousands of remote, small, and inexpensive computers to create virtual supercomputers. Nimbus is intended for scientific users, who can access a variety of science-related clouds for free, providing universities and researchers with access to computing power that previously may have been unobtainable.
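A hypothetical sketch of such heat-aware scheduling, with invented job names and heat labels, might simply interleave compute-bound "hot" jobs with intermittent "cool" ones so that no node runs hot jobs back to back:

    # Alternate compute-bound ("hot") and intermittent ("cool") jobs
    # on a node so hot jobs never run back to back.
    from collections import deque

    hot = deque(['cfd_core', 'md_run', 'dft_sim'])       # invented names
    cool = deque(['io_stage', 'checkpoint', 'viz_pass'])

    schedule = []
    while hot or cool:
        if hot:
            schedule.append(hot.popleft())
        if cool:
            schedule.append(cool.popleft())

    print(schedule)   # hot and cool jobs interleaved in time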


Swiss Supercomputer 'Monte Rosa' Switches On
Swiss National Supercomputing Centre (06/04/09)

The Swiss National Supercomputing Center (CSCS) has completed the upgrade to its Monte Rosa supercomputer, making it the most powerful computer in Switzerland with a peak performance of 141.6 teraflops. Monte Rosa's peak performance has improved more than eightfold over its predecessor's, making CSCS one of the world's leading high-performance computing centers in terms of computing capacity. CSCS scientists say Monte Rosa will place in the top 50 when the list of the 500 fastest supercomputers is announced at the International Supercomputing Conference in Hamburg this June. The machine's 14,752 processors are capable of performing 141 trillion computer operations per second. Swiss researchers have been called upon to submit high-impact projects involving challenging simulations that were previously impossible due to a lack of sufficient large-scale computing facilities, and CSCS is using these projects to test the capabilities of Monte Rosa. Despite the recent completion of Monte Rosa, CSCS team members are already planning the next upgrade, aiming to provide users with a petaflop computer capable of performing a quadrillion computer operations per second.
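A back-of-the-envelope check relates the two quoted figures:

    # 141.6 teraflops of peak performance over 14,752 processors
    # works out to roughly 9.6 gigaflops per processor.
    peak_flops = 141.6e12
    cores = 14752
    print('%.1f gigaflops per processor' % (peak_flops / cores / 1e9))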


Improved Techniques Will Help Control Heat in Large Data Centers
Georgia Institute of Technology (06/02/09) Toon, John

About a third of the electricity consumed by large data centers is used for cooling the servers, an amount that is expected to rise as computer processing power expands and as cloud computing grows increasingly popular. "Some people have called this the Moore's Law of data centers," says Georgia Institute of Technology professor Yogendra Joshi. "The growth of cooling requirements parallels the growth of computing power, which roughly doubles every 18 months. That has brought the energy requirements of data centers into the forefront." Georgia Tech researchers are using a 1,100-square-foot simulation data center to optimize cooling strategies and develop new heat transfer models that designers can use to improve future data centers and equipment. The researchers' goal is to reduce the percentage of electricity used to cool data centers by as much as 15 percent. Most data centers rely on large air conditioning units to pump cool air into server racks, and traditional data centers have used raised floors to allow air to circulate beneath equipment. As cooling demands have increased, data center designers have developed systems of alternating cooling outlets and hot air returns throughout the facilities. "How these are arranged is very important to how much cooling power will be required," Joshi says. "There are ways to rearrange equipment within data centers to promote better airflow and greater energy efficiency, and we are exploring ways to improve those." The Georgia Tech simulated data center features different cooling systems, partitions to change room sizes, and both real and simulated server racks. Fog generators and lasers are used to visualize airflow patterns, infrared sensors quantify heat, airflow sensors measure the output of fans and other systems, and thermometers measure temperatures on server motherboards.
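As a worked illustration of the growth claim (the starting figure is invented, not Georgia Tech's data), a cooling load that doubles every 18 months grows sixteenfold in six years; and if a 15 percent cut applies to cooling energy while cooling accounts for one third of total consumption, overall facility usage drops by about 5 percent:

    # Illustrative growth of a cooling load that doubles every 18
    # months (the 300 kW starting figure is hypothetical).
    base_kw = 300.0
    for months in range(0, 73, 18):
        print('month %2d: %6.0f kW'
              % (months, base_kw * 2 ** (months / 18.0)))

    # A 15 percent cut in cooling energy, with cooling at one third
    # of total consumption, trims overall usage by about 5 percent.
    print('overall savings: %.0f%%' % (0.15 * (1.0 / 3.0) * 100))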


Abstract News © Copyright 2009 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]



