Association for Computing Machinery
Welcome to the September 28, 2009 edition of ACM TechNews, providing timely information for IT professionals three times a week.


Google Opens Voting on Ideas to Change the World
Computerworld (09/25/09) Gaudin, Sharon

Google's Project 10 to the 100th--a contest celebrating the company's 10th anniversary--asked participants in September 2008 to submit ideas that could help change the world. Google will give the five winners a collective $10 million to fund their projects. However, with more than 150,000 submissions to sort through, Google is still deliberating among 16 newly released finalists. Google says many of the submissions were similar, so the 16 finalists are idea themes rather than specific entries. Google has posted the finalists and is asking users to vote on their favorites over the next two weeks; that input will be considered when the five winners are selected. Once the winners are picked, Google will ask the individuals and teams behind them to put together project proposals. Ideas include new public transportation technology, a news service updated by regular computer users, a genocide tracking system, and a natural crisis monitoring system. "We hoped to capture the imagination of people around the world and offer a way to bring their best ideas to fruition," the Google Project 10^100 Team wrote in a blog post. "We were overwhelmed by the response. It took more than 3,000 Googlers in offices around the world to review the submissions."

Ants Versus Worms: Computer Security Mimics Nature
Wake Forest University (09/21/09) Frazier, Eric

Pacific Northwest National Laboratory (PNNL) researcher Glenn Fink is working with Wake Forest University professor Errin Fulp to develop a computer security program that models itself after the defensive techniques of ants. The new anti-malware system uses itinerant digital ants to find problems in a large network. When an ant comes across something suspicious, it leaves behind a "scent trail" that draws an army of ants to the problem. The massing of ants alerts human operators to a possible invasion. "Our idea is to deploy 3,000 different types of digital ants, each looking for evidence of a threat," Fulp says. Rather than equipping every digital ant with the same memory-heavy defenses, the program assigns each type of threat to a specific kind of digital ant. The digital ants report to a "sentinel" located at each computer, which in turn is supervised by a network-level "sergeant." All sergeants are monitored and controlled by human users. To test the program, the researchers sent a computer worm into the system, and the digital ants were able to corner it. PNNL has given the researchers more time to study the program. If successful, the researchers say, the program would be ideal for universities, government agencies, and corporations that rely on large networks.
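The division of labor described above--many cheap, single-purpose ants reinforcing a shared "scent trail" that fades unless renewed--can be sketched in a few lines. This is an illustration only, not the researchers' actual code; the host names, telemetry fields, and pheromone values are invented for the example:

```python
import random

random.seed(0)  # seeded so the sketch is reproducible

# Hypothetical per-host telemetry, invented for this example.
metrics = {
    "web01": {"cpu": 0.95, "odd_ports": True},
    "db01":  {"cpu": 0.10, "odd_ports": False},
}

class DigitalAnt:
    """A lightweight agent that checks exactly one threat indicator."""
    def __init__(self, check):
        self.check = check  # function: host name -> bool

def patrol(ants, hosts, pheromone, deposit=1.0, evaporation=0.9, threshold=2.0):
    """One patrol round: old trails evaporate so stale alarms fade, each ant
    visits a random host and deposits 'scent' where its single check fires,
    and hosts with strong trails are escalated to the per-machine sentinel."""
    for h in hosts:
        pheromone[h] = pheromone.get(h, 0.0) * evaporation
    for ant in ants:
        h = random.choice(hosts)          # itinerant: wander to some host
        if ant.check(h):
            pheromone[h] += deposit       # suspicious: strengthen the trail
    return [h for h in hosts if pheromone[h] > threshold]

# Two ant types, each carrying one cheap check instead of a full scanner.
ants = ([DigitalAnt(lambda h: metrics[h]["cpu"] > 0.9) for _ in range(20)] +
        [DigitalAnt(lambda h: metrics[h]["odd_ports"]) for _ in range(20)])
pheromone = {}
flagged = patrol(ants, list(metrics), pheromone)  # web01's trail builds; db01's stays flat
```

The evaporation term is what keeps memory costs low: no ant remembers anything between rounds, yet a persistent threat keeps its trail strong while false alarms decay away.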

Inducing Innovation With Prizes
Computing Community Consortium (09/25/09) Libeskind-Hadas, Ran

The successful completion of the Netflix Prize competition demonstrates that prizes are a viable mechanism for encouraging research in the computing fields, writes Harvey Mudd College professor Ran Libeskind-Hadas. In the broader computing community, the Clay Mathematics Institute now offers the Millennium Prizes, awards of $1 million for the solution of each of seven famous open problems, such as whether P=NP. Researchers might not gear their work toward prizes, but in 2007 one researcher proved that a specific small (2-state, 3-symbol) Turing machine is universal, winning a $25,000 prize sponsored by Wolfram Research. The history of prizes for technical innovation goes back to the early 18th century, when the British Parliament offered the Longitude Prize to encourage researchers to come up with a practical method for determining a ship's longitude. Libeskind-Hadas says the industry should consider which problems could be incentivized by prizes, but also address the potential risks of such an approach to research.

Canadian Research Network Led by University of Toronto Computer Scientist Gets $5 Million Boost to Improve Business Intelligence
University of Toronto (09/24/09) Dolman, Leslie; Bettam, Sean

University of Toronto researchers have received a $5 million grant from the Natural Sciences and Engineering Research Council of Canada to lead the creation of the Business Intelligence Network (BIN), which is designed to improve information management systems for business applications. "Business intelligence means using information to make informed decisions," says Toronto professor Renee Miller, BIN's principal investigator. "Our goal is to provide new solutions for business, scientific, and government organizations to enable them to solve modern problems and make decisions using integrated, trustworthy, and up-to-date data." BIN will create a system for enhancing collaboration between leading Canadian knowledge and information management researchers and Canadian business intelligence companies. "Through BIN, we will bring together organizations that are in the business of using, creating, and delivering business intelligence solutions, who need to strengthen their innovation pipeline," Miller says. "Our network will provide a venue for engaging business and government organizations to articulate the real problems that limit their efficiency and burden them with costs." In addition to the University of Toronto, researchers from the universities of Alberta, British Columbia, Ottawa, Waterloo, Carleton, and Dalhousie will be involved in the project, as well as researchers from IBM, SAP, Palomino, IC-Agency, and Zerofootprint.

Greener Computing in the Cloud
Technology Review (09/24/09) Talbot, David

Researchers say that custom data centers being built for cloud computing systems are very energy efficient. "There are issues with property rights and confidentiality that people are working out for mass migration of data to the cloud," says Yale University professor Jonathan Koomey. "But in terms of raw economics, there is a strong argument." Surging server-related energy consumption is a major issue. Energy consumption in data centers doubled between 2000 and 2005 and now accounts for 1.5 percent of worldwide electricity consumption. The Uptime Institute says that data center energy consumption could quadruple by 2020. Cloud computing could offer a solution by focusing on energy efficiency within large data centers. For example, Yahoo's Scott Noteboom says a Yahoo data center being built near Buffalo, N.Y., will use as little as 25 percent of the electricity used by older data centers. The new servers will be more efficient--they will use less electricity when performing fewer computations--and the building will primarily use natural air flows for cooling. Only when the temperature rises above 27 degrees Celsius will the center use air conditioning, which should only be needed about 212 hours per year. The new data center mimics the design of manufacturing facilities in the area that were built before air conditioning and will take advantage of winds coming off of Lake Erie. Koomey says moving data to the Internet has helped reduce overall energy consumption. "Moving bits is inherently environmentally superior to moving atoms," he says. "People worry about energy use of data centers, but they forget that IT enables structural transformation throughout the economy."

Software That Gets Reduced, Reused, Recycled
ICT Results (09/28/09)

European researchers have developed SeCSE, a service-centric software engineering platform for industrial-strength applications. In service-centric software, each task a computer completes, such as creating or printing a document, is a separate service, and complex, sophisticated applications can be formed by combining many services. Once a task is complete, the services disappear. When a new way of completing a task is found, the relevant services are simply exchanged, rather than the application going through a complex integration or redevelopment process. So far, services have been deployed only on a limited, small scale by Web developers. However, the SeCSE project has developed a service-centric development platform that spans the entire software lifecycle, from design to deployment. The SeCSE platform uses Java to create software that integrates services from different providers, regardless of the underlying operating systems or programming languages. The Gartner Group predicts that this service-centric approach will see widespread deployment for creating business applications.
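The key property described above--swapping a service implementation without redeveloping the application--comes from binding services by name at call time rather than at build time. A toy sketch of that idea (not SeCSE's actual API; the registry, service names, and functions here are invented for illustration, and Python is used for brevity even though SeCSE itself is Java-based):

```python
# Each task is a named service; applications are compositions of services
# and never name a concrete implementation.
registry = {}

def provide(name):
    """Register a callable as the current implementation of a service."""
    def wrap(fn):
        registry[name] = fn
        return fn
    return wrap

def call(name, *args):
    """Look up the service at call time, so late rebinding just works."""
    return registry[name](*args)

@provide("spellcheck")
def basic_spellcheck(text):
    return text.replace("teh", "the")

@provide("format")
def plain_format(text):
    return text.strip().capitalize()

def publish_document(text):
    # The application is a composition of named services.
    return call("format", call("spellcheck", text))

print(publish_document("  teh quick brown fox  "))  # prints "The quick brown fox"
```

If a better spellchecker is later registered under the name "spellcheck", `publish_document` picks it up automatically, which is the "exchange instead of redevelop" property the article describes.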

Machines Could Ultimately Match Human Intelligence, Says Intel CTO
Network World (09/21/09) Brodkin, Jon

Intel CTO Justin Rattner believes that machine singularity is plausible, although he does not know when it might be achieved. Rattner says that machine development is growing at a rate best captured by Moore's Law. "There will be a surprising amount of machines that do exhibit human-like capabilities," he says. "Not to the extent of what humans can do today, but in an increasing number of areas these machines will show more and more human-like intelligence, particularly at perceptual tasks." Rattner also says that machines could enhance human capabilities. "Assuming that interface technology progresses in an accelerating way, the possibilities of augmenting human intelligence with machine intelligence become increasingly real and more diverse," he says. Scientists are developing brain chips and sensors that would give humans the ability to move prosthetic limbs with their minds, for example, and Rattner predicts that this will become common in a matter of years. Supercomputers are developing at a rate similar to machine intelligence. The first petaflop machines came online in 2008, and Rattner says the next goal is an exaflop machine, which he hopes could become operational in about 10 years with government funding. He says power considerations are one of the biggest stumbling blocks to boosting supercomputer performance. "You'd need a 500-megawatt nuclear power station to run the thing," says Rattner, who will deliver the opening address at the SC09 supercomputing conference, which takes place Nov. 14-20 in Portland, Ore.

Berkeley Lab Scientists' Computer Code Gives Astrophysicists First Full Simulation of Star's Final Hours
Lawrence Berkeley National Laboratory (09/22/09) Bashor, Jon

Researchers from the U.S. Department of Energy's Lawrence Berkeley National Laboratory have developed the first full-star simulation of the conditions inside a white dwarf star in the hours before a Type Ia supernova. Type Ia supernovae are of particular interest to astrophysicists, who believe they can be used to measure the expansion of the universe. For the past three years, the researchers have been developing MAESTRO, a program that simulates the flow of mass and heat throughout the star over time, using the power of supercomputers to model the entire star. The simulation is designed for processes that unfold far more slowly than the speed of sound, allowing it to produce detailed results while using far less supercomputing time than traditional programs. MAESTRO's approach is different because the sound waves have been removed, allowing the program to run far more efficiently. The simulation provided a valuable view into the end of a process that starts several billion years before the final supernova. A Type Ia supernova starts as a white dwarf, the compact remnant of a low-mass star that never became hot enough to fuse its carbon and oxygen. However, if another star is close enough, the white dwarf can accrete mass from its neighbor until it reaches a critical limit. The simulations show that at the early stages the motion of the fluid that carries heat within the star appears as random swirls, but as the heat in the center of the star increases, the flow clearly moves into the star's core on one side and out the other. Other simulations have seen this pattern, but the MAESTRO simulations are the first to capture the full star in three dimensions.
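The efficiency gain from removing sound waves can be made concrete with the standard timestep-stability (CFL) argument; the symbols below are the conventional ones, not taken from the Berkeley Lab work. An explicit compressible code must resolve the fastest signal on the grid, which is sound, while a low-Mach-number formulation like MAESTRO's only needs to resolve the fluid motion itself:

```latex
\Delta t_{\mathrm{compressible}} \;\lesssim\; \frac{\Delta x}{|u| + c_s},
\qquad
\Delta t_{\mathrm{low\text{-}Mach}} \;\lesssim\; \frac{\Delta x}{|u|}
```

Because the convective flow deep inside a white dwarf is highly subsonic ($|u| \ll c_s$), the permitted timestep grows by roughly a factor of $c_s/|u| \sim 1/M$, where $M$ is the Mach number, which is where the large savings in supercomputing time come from.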

Is Wikipedia a Victim of Its Own Success?
Time (09/28/09) Manjoo, Farhad

The exponential growth of Wikipedia has flattened out since 2007, a development that may signal the limits of crowdsourcing. One explanation is that Wikipedia has reached the threshold of knowledge expansion, and most of the information left to be collated is obscure and thus less alluring to participants. On the other hand, Palo Alto Research Center scientist Ed Chi views the slackening of Wikipedia's growth as the first implosion of a major online ecosystem. Editors' excitement that the revisions they make will endure has given way to frustration that their contributions are usually undone by an elite cohort of editors with more seniority. "People begin to wonder, 'Why should I contribute anymore?' " Chi says. The Wikimedia Foundation has been trying to enhance the encyclopedia's appeal by, for example, improving its archaic, often difficult-to-understand editing interface. But it has no plan for how to draw more diverse participants. "The average Wikipedian is a young man in a wealthy country who's probably a grad student--somebody who's smart, literate, engaged in the world of ideas, thinking, learning, writing all the time," says Wikimedia Foundation executive director Sue Gardner. This means that people from developing countries are significantly underrepresented.

Quantum Computers Are Coming--Just Don't Ask When
New Scientist (09/21/09) Brooks, Michael

The development of quantum computers was heralded as being just over the horizon several years ago, only to fade into the background as progress turned out to be slower than advocates anticipated. But now the U.S. National Institute of Standards and Technology's David Wineland says the age of the quantum computer is almost upon us thanks to a "dramatic" rate of progress. In August, Wineland's group disclosed using laser pulses to execute simple computations with two ions and reading out the results, moving the ions around a processor without losing the information encoded on them, and repeating the process. Other approaches to quantum computation have emerged, such as superconducting dots. These are minute dots of aluminum containing billions of atoms whose movements are slowed by extreme chilling, making them ideal quantum bits, or qubits. The key milestone for quantum computing is scaling things up. However, the decoherence problem--the disruption of qubits' superposed states by outside forces--is still thwarting the viability of quantum computing schemes, whether based on ion traps or superconducting dots.

Pedestrian Crossings Could Be Monitored
Plataforma SINC (09/18/09)

A surveillance system for monitoring whether cars and pedestrians are acting normally at crosswalks has been developed by researchers at Spain's University of Castilla-La Mancha (UCLM). "We have developed intelligent surveillance software and a related theoretical model in order to define 'normality' in any setting one wishes to monitor, such as a traffic scenario," says UCLM's David Vallejo. Normal behavior is defined as moving when lights are green, and stopping and not crossing safety lines when they are red. The artificial intelligence system uses software agents to monitor pedestrian crossings. The team developed the monitoring tool to determine the effectiveness of its model. "In this way we are able to identify any drivers and pedestrians behaving abnormally, meaning the program could be used in order to penalize such behaviors," Vallejo says. The researchers say the intelligent surveillance system also could be used to analyze behavior indoors, such as at museums, or to detect overcrowding.
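The definition of "normality" given above reduces to a small set of rules over observed states. A tiny rule-based sketch in that spirit (illustrative only; the event fields and thresholds are invented, not the UCLM model's actual representation):

```python
# Normal behavior per the article: move on green; on red, stop and stay
# behind the safety line. Anything else is flagged as abnormal.

def is_normal(light, moving, past_safety_line):
    """Return True if one observed car/pedestrian state matches the rules."""
    if light == "green":
        return True                      # moving or waiting is fine on green
    if light == "red":
        return not moving and not past_safety_line
    return False                         # unknown light state: flag for review

observations = [
    ("green", True,  False),   # crossing on green: normal
    ("red",   False, False),   # waiting behind the line: normal
    ("red",   True,  False),   # moving on red: abnormal
    ("red",   False, True),    # stopped past the line: abnormal
]
abnormal = [o for o in observations if not is_normal(*o)]  # flags the last two
```

In the real system the agents would derive `moving` and `past_safety_line` from video tracking rather than receive them as booleans; the point of the sketch is that once states are extracted, "abnormal" is just the complement of a declarative rule set.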

American Graduate Programs With Overseas Partners Are on the Rise
The Chronicle of Higher Education (09/21/09) Labi, Aisha

International joint- and dual-degree programs are becoming more numerous in Europe, but interest also is rising in the United States, according to presenters at the annual meeting of the European Association for International Education. The Council of Graduate Schools detailed the results of a new poll of joint and dual degrees at U.S. graduate schools. The council says that enthusiasm for such programs is rising due to the dwindling interest in science and engineering degrees among U.S. graduate students, international realization that graduate education is essential to economic competitiveness, and signs that U.S. universities can no longer rely on a guaranteed steady flow of foreign graduate students to fill their programs. Most master's-level joint- and dual-degree programs in both Europe and the United States are in business and engineering, while the physical sciences and engineering are the most common fields for formal collaboration at the doctoral level. A 2009 study by the Institute of International Education and the Free University of Berlin found that both European and U.S. universities are more likely to have collaborative degree programs with European partners than with institutions elsewhere in the world. Motivating factors for international collaboration include a desire for development and for halting the movement of talent from the developing world. Analysts say that joint and dual degrees will probably become more common in the United States as U.S. universities strive to cost-effectively broaden their academic offerings on a global scale.

Abstract News © Copyright 2009 INFORMATION, INC.
Powered by Information, Inc.

To submit feedback about ACM TechNews, contact: [email protected]

Change your Email Address for TechNews (log into myACM)