Association for Computing Machinery
Welcome to the April 27, 2011 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Also, please download our new ACM TechNews iPhone App and our new ACM TechNews iPad App from the iTunes Store.


Digging Deeper, Seeing Farther: Supercomputers Alter Science
New York Times (04/25/11) John Markoff

Expanding computing power is transforming scientific research, making vast collections of digitized information the primary materials scientists study. "The profound thing is that today all scientific instruments have computing intelligence inside, and that's a huge change," says California Institute for Telecommunications and Information Technology (Calit2) director Larry Smarr. Indiana University computer scientist Katy Borner says software-based scientific tools are making previously invisible and incomprehensible phenomena and processes detectable. Borner says that computing is changing the research focus from individual scientists to research teams, while Web 2.0 software has eased the sharing of research findings, helped support a boom in collaboration, and expanded the range of cross-disciplinary initiatives. Calit2 researchers and others have started designing OptIPortal display systems that offer better representation of scientific data to bring visualization up to scale with growing computing capacity. Software-based tools are rapidly expanding functions in many scientific disciplines as open source systems let small groups and individuals add features that enable customization.

A New System Increases the Reliability of Opinion Polls
Universidad Politecnica de Madrid (Spain) (04/26/11) Eduardo Martinez

Universidad Politecnica de Madrid researchers have developed a fuzzy neural network that uses a numerical and categorical imputation method to reconstruct incomplete data sets. The system can determine, with nearly 90 percent accuracy, the voting intention of a voter who has not answered all of an opinion poll's questions. Developed by Jesus Cardenosa and Pilar Rey del Castillo, it also can be used for medical diagnosis or other surveys based on categorical variables. The system works by first defining the distances between categories using fuzzy logic. It then uses the neural network to determine where each category is located within the different dataset spaces. Finally, the system extends the network architecture to all the data and processes the missing data.
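The article does not detail the Cardenosa/Rey del Castillo network itself, but the first step it describes, defining distances between answer categories and using them to fill in a missing answer, can be sketched with a toy distance-weighted vote. Everything here (the question names, the hand-picked category distances standing in for the fuzzy-logic distances, and the respondents) is hypothetical illustration, not the researchers' actual system:

```python
# Toy sketch of distance-based categorical imputation. The hand-picked
# CATEGORY_DIST table plays the role of the fuzzy-logic distances the
# article describes; all names and values are illustrative assumptions.
from collections import defaultdict

# Hypothetical distance between answer categories of a poll question
# (0 = identical, 1 = maximally different).
CATEGORY_DIST = {
    ("left", "left"): 0.0, ("left", "center"): 0.4, ("left", "right"): 1.0,
    ("center", "center"): 0.0, ("center", "right"): 0.4,
    ("right", "right"): 0.0,
}

def dist(a, b):
    # Look up the distance in either order; unknown pairs are maximally far.
    return CATEGORY_DIST.get((a, b), CATEGORY_DIST.get((b, a), 1.0))

def respondent_distance(r1, r2):
    """Mean category distance over the questions both respondents answered."""
    shared = [q for q in r1
              if q in r2 and r1[q] is not None and r2[q] is not None]
    if not shared:
        return 1.0
    return sum(dist(r1[q], r2[q]) for q in shared) / len(shared)

def impute(target, others, question):
    """Fill in target[question] by a distance-weighted vote among respondents."""
    votes = defaultdict(float)
    for other in others:
        if other.get(question) is None:
            continue
        weight = 1.0 - respondent_distance(target, other)  # closer => more weight
        votes[other[question]] += weight
    return max(votes, key=votes.get)

respondents = [
    {"economy": "left", "immigration": "left", "vote": "left"},
    {"economy": "left", "immigration": "center", "vote": "left"},
    {"economy": "right", "immigration": "right", "vote": "right"},
]
partial = {"economy": "left", "immigration": "left", "vote": None}
print(impute(partial, respondents, "vote"))  # -> left
```

The real system replaces the hand-picked distance table and nearest-neighbor vote with fuzzy-logic distances and a trained neural network, but the shape of the problem, inferring a missing categorical answer from answered questions, is the same.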

Computers, Too, Can Give Away Location
Wall Street Journal (04/27/11) Amir Efrati; Jennifer Valentino-DeVries

Google and Apple recently revealed that in addition to collecting location information from mobile devices, the companies also collect and store location data from personal computers using local Wi-Fi networks. The companies obtain the information after a computer scans the area for available Wi-Fi networks and after users give a Web site permission to determine the computer's location. Both companies have said the data they collect are anonymous, and the information can be used to improve users' experience on sites such as Google Maps. Google collects the information through the Chrome browser and the Google toolbar, which is included in many other browsers. Google says the data it receives from computers when they scan the area for wireless networks is used to improve the quality of the Google location database. Meanwhile, Apple recently sent a letter to Congress describing its location-gathering methods. Macintosh computers running the Snow Leopard operating system send the company information about a computer's location if the user is on a Wi-Fi network, according to the letter. However, users can keep the information from being collected by turning off the data collection feature.

PlayStation 3 Clusters Providing Low-Cost Supercomputing to Universities
Government Technology (04/25/11) Colin Wood

The Air Force Research Laboratory's (AFRL) Condor Supercomputer consists of 1,716 Sony PlayStation 3 (PS3) video game consoles, 168 general-purpose graphics processing units, and 78 compute servers. The AFRL recently opened up the supercomputer to universities to conduct research on predictive modeling, computational intelligence, parallel data searches, and computational linguistics. University of Dayton researchers led by professor Tarek Taha have been using Condor to create artificial neural networks that could be used in computing and neuroscience. Taha is using Condor to develop algorithms that deal with visual problems such as facial recognition and maze navigation. "The PS3's Cell processor handles a lot of parallelism, like neurons in a brain," he says. University of Massachusetts, Dartmouth professor Gaurav Khanna uses a cluster of 16 PS3s to conduct research on gravitation and black holes. Khanna also is helping AFRL develop benchmarking code to demonstrate that Condor is among the world's 40 fastest computers.

Really Remote Data
Technology Review (04/26/11) Christopher Mims

Cambridge University researchers want to move data centers to exotic locations such as deserts, mountaintops, and the middle of the Atlantic Ocean to harness renewable solar and wind energy that would otherwise go unused. As an example, the researchers describe putting one solar- and wind-powered data center in the Australian desert and another in Egypt to take full advantage of the sun's energy. They found that connecting a remote renewable energy plant to a power grid would be prohibitively expensive, but running fiber-optic cable to the site would be relatively easy and inexpensive. "We envisage data centers being put in places where renewable energy is being produced and you could never economically bring it back to heat a house," says Cambridge researcher Andy Hopper. The key to incorporating far-flung, intermittently available data centers into a cloud infrastructure is to be choosy about which processes are offloaded to them, says Cambridge researcher Ripduman Sohan.

Machines Will Achieve Human-Level Intelligence in the 2028 to 2150 Range: Poll
(04/26/11)

There is a 90 percent chance that machines will achieve human-level intelligence by 2150, according to the results of an informal poll conducted at the recent Future of Humanity Institute (FHI) Winter Intelligence conference on machine intelligence. Respondents said there is a 10 percent chance that it would occur by 2028, and a 50 percent chance by 2050. Industry, academia, and the military are most likely to develop human-level machine intelligence. Respondents said the consequence would either be extremely good or extremely bad, and not in between. Of the 32 responses to a question on the similarity to the human brain, eight thought human-level machine intelligence would be due to biologically inspired/emulated systems, 12 said brain-inspired artificial general intelligence (AGI), and 12 said it would arise from an entirely new AGI. "Human-level machine intelligence, whether due to a de novo AGI or biologically inspired/emulated systems, has a macroscopic probability of occurring mid-century," say report authors and FHI researchers Anders Sandberg and Nick Bostrom.

The Really Smart Phone
Wall Street Journal (04/23/11) Robert Lee Hotz

Researchers worldwide are studying mobile phone data in an attempt to find patterns in human behavior that could reveal how people interact. Meanwhile, advances in statistics, psychology, and social networks have provided researchers with tools to find patterns of human dynamics that were too subtle to be seen before. Northeastern University researchers studied more than 16 million European mobile-phone users and found that with enough information about past movements they could predict a user's future location with 93.6 percent accuracy. "We can quantify human movement on a scale that wasn't possible before," says Santa Fe Institute's Nathan Eagle, who used cell phone data to study how slums can be a catalyst for economic potential. Harvard University's Nicholas Christakis is using phone data to study how diseases, behavior, and ideas spread throughout social networks. AT&T researchers recently analyzed millions of call records from mobile phone users in New York and Los Angeles to compare commuting habits in the two cities. Indiana University's Johan Bollen found that analyzing millions of Twitter messages captured swings in national mood and could predict changes in the stock market up to six days in advance with 87 percent accuracy.
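The summary above does not describe the model behind the Northeastern team's 93.6 percent prediction figure, but the underlying idea, that past movements make future locations predictable, can be illustrated with a toy first-order frequency model. The locations, the trajectory, and the predictor itself are all hypothetical stand-ins, not the researchers' method:

```python
# Illustrative sketch only: a toy predictor that guesses the most
# frequently observed next location given the current one. All data
# here is invented for the example.
from collections import Counter, defaultdict

def train(trajectory):
    """Count, for each location, which locations tend to follow it."""
    following = defaultdict(Counter)
    for here, nxt in zip(trajectory, trajectory[1:]):
        following[here][nxt] += 1
    return following

def predict(following, here):
    """Most frequently observed successor of `here` (None if unseen)."""
    if here not in following:
        return None
    return following[here].most_common(1)[0][0]

# Hypothetical sightings over several days: home -> office -> cafe -> home ...
history = ["home", "office", "cafe", "home", "office", "cafe",
           "home", "office", "gym", "home", "office", "cafe"]
model = train(history)
print(predict(model, "office"))  # -> cafe (seen after "office" 3 of 4 times)
```

Because daily routines are highly repetitive, even a crude frequency count like this does far better than chance, which is the intuition behind the study's finding that regular movement patterns make individuals highly predictable.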

Scientists Launch Internet Protocol Research Center
InformationWeek (04/25/11) Chandler Harris

The Internet2 networking consortium, Indiana University, and Stanford University's Clean Slate Program recently launched the Network Development and Deployment Initiative (NDDI) to create new network platforms for global scientific research. NDDI will create multiple virtual networks that enable network researchers to test and experiment with new Internet protocols and architectures. NDDI includes a new Internet2 networking service called the Open Science, Scholarship, and Services Exchange (OS3E), which will connect Internet2's regional network connectors with international exchange points. "While the OS3E will be of immediate benefit to scientists, the NDDI will introduce major new capabilities for network researchers and other academic disciplines in the future," says Internet2 CEO Dave Lambert. The NDDI and OS3E platform will support the global exchange of massive datasets and large-scale network research initiatives such as the U.S. National Science Foundation's Global Environment for Network Innovations project.

Conducting Ferroelectrics May Be Key to New Electronic Memory
Oak Ridge National Laboratory (04/25/11) Morgan McCorkle

Oak Ridge National Laboratory (ORNL) researchers have found new properties of ferroelectric materials that could lead to a new paradigm of electronic memory storage. The researchers found that domain walls, which are the separation zones only a few atoms wide between opposing states of polarization in ferroelectric materials, act as dynamic conductors rather than static ones. "Our measurements identified that subtle and microscopically reversible distortions or kinks in the domain wall are at the heart of the dynamic conductivity," says ORNL's Peter Maksymovych. This behavior is unlike that of traditional electronics, which rely on silicon transistors that act as switches in an electric field. The researchers want to use the discovery to advance memory storage and nanoelectronics technologies. "Finding functionality intrinsic to nanoscale systems that can be controlled in a novel way is not a path to compete with silicon, but it suggests a viable alternative to silicon for a new paradigm in electronics," Maksymovych says.

Obama: STEM Education a Must-Have
eSchool News (04/21/11)

Speaking at a recent town hall event held on Facebook's Palo Alto, Calif., campus, President Barack Obama talked about the importance of science, technology, engineering, and math (STEM) education for students and U.S. competitiveness. Prioritizing STEM education, especially for girls and minorities, is one of the most important efforts the U.S. can make if it hopes to produce college- and career-ready students, Obama says. He says recent federal programs, such as Race to the Top and Investing in Innovation, have encouraged local jurisdictions to think creatively, examine every aspect of their educational system, and develop solutions to better inform teachers and to keep students engaged in learning. "We've got to do such a better job when it comes to STEM education," Obama says. "That's one of the reasons that we had our first science fair at the White House ... because we want to start making science cool. That's how we're going to stay competitive for the future."

SDSC to Venture Capitalists: Data-Intensive Supercomputing Is Here
UCSD News (CA) (04/22/11) Jan Zverina

The University of California, San Diego's San Diego Supercomputer Center (SDSC) plans to deploy a new data-intensive supercomputer system later this year. Named Gordon, it will be the first high-performance computer to use large amounts of flash-based solid state drive memory. Gordon will have 250 trillion bytes of flash memory and 64 input/output nodes, and will be capable of handling massive databases while operating up to 100 times faster than hard disk drive systems for some queries. "We are reengineering the entire data infrastructure in SDSC to support the capabilities offered by Gordon," says SDSC director Michael Norman. He says the supercomputer will be ideal for data mining and data exploration, where researchers need to run through large amounts of data in order to find a small amount of valuable information. Gordon also will be used for researching genomic medicine, conducting interaction network analysis for new drug discovery, and researching modestly scalable codes in quantum chemistry.

Highway Robbery: Car Computer Controls Could Be Vulnerable to Hackers
Scientific American (04/20/11) Larry Greenemeier

Researchers at the University of California, San Diego and University of Washington recently found that a hacker could use a cell phone to unlock a car's doors and start its engine remotely. The researchers inserted malware into a car's computer system using its Bluetooth and cell phone links. Their research "shows the need for security measures in vehicular onboard networks," says Fraunhofer Institute for Secure Information Technology researcher Olaf Henniger. He is a member of Europe's E-Safety Vehicle Intrusion Protected Application (EVITA) project, which aims to develop a security blueprint that car manufacturers can follow to build more secure onboard networks. The EVITA project is focused on protecting vehicle-to-vehicle and vehicle-to-infrastructure communication designed to prevent traffic accidents. A secondary project, called Preparing Secure Vehicle-to-X Communication Systems, will use EVITA's specifications to create standardized security hardware that would be less expensive to implement.

The New Cultural Form: Perfection Versus Mortality in Games and Simulation at Rensselaer
Rensselaer Polytechnic Institute (04/18/11) Mary L. Martialay

Ben Chang, co-director of the Rensselaer Polytechnic Institute's Games and Simulation Arts and Sciences program, sees the dialogue between perfection and mortality as an important influence in the world of games and simulation. Chang and his colleagues have been working on several projects, including Willy Nilly's Surf Shack, which gives flawless avatars real-world problems such as clumsiness, and Becoming, a computer-driven video installation in which the attributes of two animated figures are interchanged. "Over time, this causes each figure to take on the attributes of the other, distorted by the structure of their digital information," Chang says. His work has been exhibited in numerous venues and festivals worldwide, including Boston CyberArts, SIGGRAPH, the FILE International Electronic Language Festival, and the Athens MediaTerra Festival. Chang teaches a two-semester course on developing games that incorporates computer science, writing, art, design, and programming. "Think of it as a foundation into developing games that you can take into experimental game design and stretch beyond it," Chang says.

Abstract News © Copyright 2011 INFORMATION, INC.
Powered by Information, Inc.

To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe