Association for Computing Machinery
Welcome to the February 11, 2011 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Also, please download our new ACM TechNews iPhone App from the iTunes Store by clicking here and our new ACM TechNews iPad App by clicking here.


Exabytes: Documenting the 'Digital Age' and Huge Growth in Computing Capacity
Washington Post (02/10/11) Brian Vastag

The global capacity to store digital information totaled 276 exabytes in 2007, according to a University of Southern California (USC) study. However, that data is not distributed equally, with a distinct line dividing rich and poor countries in the digital world, says USC's Martin Hilbert. In 2007, people in developed countries had access to about 16 times the bandwidth available to those in poor countries. "If we want to understand the vast social changes underway in the world, we have to understand how much information people are handling," Hilbert says. The study found that 2002 marked the first year that worldwide digital storage capacity exceeded total analog capacity. "You could say the digital age started in 2002," Hilbert says. "It continued tremendously from there." Digital media accounted for 25 percent of all information stored in the world in 2000, but just seven years later 94 percent of all the information storage capacity on Earth was digital, with the remaining 6 percent made up of books, magazines, video tapes, and other non-digital media. The study found that digital storage capacity grew 23 percent a year from 1986 to 2007, while computing power increased 58 percent a year during the same period. Hilbert notes that people generate 276 exabytes of digital data every eight weeks, but much of that information is not stored long term.
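Those annual rates compound dramatically over the study's 21-year window. A quick back-of-the-envelope check, using only the growth rates quoted above (an illustration, not a figure from the study):

```python
# Compound annual growth over the study period, 1986-2007.
years = 2007 - 1986  # 21 years

storage_factor = 1.23 ** years  # storage grew 23% a year: roughly 77-fold
compute_factor = 1.58 ** years  # computing power grew 58% a year: roughly 15,000-fold

print(f"Storage capacity grew about {storage_factor:.0f}x over {years} years")
print(f"Computing power grew about {compute_factor:.0f}x over {years} years")
```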

Super Computer to Be Used for Agricultural Research
Daily News & Analysis (India) (02/09/11) Arun Jayan

The Indian Council of Agricultural Research is building a national agricultural bioinformatics grid with assistance from the Center for Development of Advanced Computing (C-DAC). The grid is designed to improve agricultural productivity and help address issues such as food security. "Now scientists have to wait for a production cycle to get over to analyze various issues like quality of seed, produce, and weather pattern," says C-DAC's Goldi Misra. However, high-performance computers could be used for such analysis instead, Misra says. The first phase of the project will focus on connecting government agencies with high-speed networks. Agricultural universities and research centers also could be added to the grid to enable researchers to perform complex analytical processes. The grid will provide computational support for high-quality research in agriculture and biotechnology, says Indian Agricultural Statistics Research Institute researcher Anil Rai. "This will lead to the development of superior varieties [of] seeds, the right fertilizers, and will help various other processes to enhance agricultural productivity on sustainable basis," he says.

Exploration of Distributed Creativity in Multi-Site 3D Tele-Immersive Environments
Computing Community Consortium (02/11/11)

Researchers at the University of Illinois at Urbana-Champaign and the University of California, Berkeley recently developed teleimmersive technology that enables users to work together remotely and have a face-to-face meeting in a virtual environment. The technology creates a full-body, three-dimensional (3D) reconstruction of each user and can be applied to videoconferencing, work on large data sets, design models, remote training, distance learning, interactive gaming, and advanced social networking. Teleimmersion offers advantages over existing videoconferencing technology because it can humanize remote communication by placing all users in a common virtual space. The teleimmersion system consists of 48 cameras that can record images in all directions. Multiple servers run the stereo reconstruction algorithm that creates a full-body 3D reconstruction of the users inside the space. The researchers are developing a low-cost portable system that could be used in the office or home.

Computing Science Rewriting the Program to Get Girls in the Game
University of Alberta (02/10/11) Jamie Hanlon

University of Alberta researchers have found that high school girls become more interested in computer science if video game creation is incorporated into the lesson plans. The researchers conducted a study to determine if girls would have as much interest in developing video games as boys. "We thought we should have female students create games and see if they are just as excited about making games as male students and see whether it's an attractor to computing science that is independent of gender," says Alberta professor Duane Szafron. The study found that female students enjoyed developing games just as much as males, and preferred game design to other activities such as creative writing. "In terms of the quality of the games developed and the abstraction skills that the students learned, which could translate to knowledge of computing science--and in terms of the amount of fun that they had--there was no difference between the two groups," Szafron says.

Researchers at Harvard and MITRE Produce World's First Programmable Nanoprocessor
Harvard University (02/09/11) Caroline Perry

Researchers at Harvard University and the MITRE Corporation have developed a nanoprocessor featuring nanocircuits that can be programmed electronically to perform arithmetic and logical functions. "This work represents a quantum jump forward in the complexity and function of circuits built from the bottom up, and thus demonstrates that this bottom-up paradigm, which is distinct from the way commercial circuits are built today, can yield nanoprocessors and other integrated systems of the future," says Harvard's Charles M. Lieber. The researchers say the nanowire components have shown the reproducibility to build working electronic circuits and the tiled structure is fully scalable, which makes building larger nanoprocessors more feasible. The nanoprocessors also use very little power because they contain nonvolatile transistor switches, which means they do not need additional power to maintain memory. "Because of their very small size and very low power requirements, these new nanoprocessor circuits are building blocks that can control and enable an entirely new class of much smaller, lighter-weight electronic sensors and consumer electronics," says MITRE's Shamik Das.

Foreign Enrollments Explain Obama's Skills Push
Computerworld (02/09/11) Patrick Thibodeau

Foreign student enrollments at U.S. universities for doctorates in computer science, engineering, and math are rising. In 1980, fewer than 3,000 science and engineering doctorates were awarded to foreign students, about 16 percent of the total receiving such degrees. By 2005, however, more than 11,000 foreign students received science-related doctorates, representing 38 percent of the total, according to U.S. National Aeronautics and Space Administration researcher Robert Hamilton. "These student pipelines to the United States appear to be facing growing competition from other nations desiring the best and brightest foreign students of the world," Hamilton notes. Policies that give residency to foreign advanced degree holders can benefit the U.S. economy, says the Center for Technology Innovation's Darrell West. However, the Institute for the Study of International Migration's Lindsay Lowell disagrees, citing a similar Australian program that led students to emigrate to Australia for residency status rather than to study. Meanwhile, just 5 percent of H-1B holders applied for permanent residency between 2007 and 2009, according to Infosys.

Ancient Buildings Brought to Life From Historic Maps
New Scientist (02/09/11) Jacob Aron

The process of digitally reconstructing long-lost cities from historic maps should become much easier with new software developed by a team led by the University of East Anglia's Stephen Laycock. The software automates the digitization of paper maps on which buildings are shown in characteristic colors, and it is 10 to 100 times faster than tracing outlines by hand. The software detects blocks of color on a scanned map, then highlights the edge of each block to generate an outline of a building. To work with black-and-white maps, users must click a point inside each building, but the software is still at least twice as fast as working by hand. The researchers also say the software can correct the distortion of scale in old maps by overlaying the building outlines on an accurately surveyed modern map. The building outlines can then be imported into CityEngine, a program that generates three-dimensional images from information about how buildings looked in a particular period. Museum curators could use the software to create interactive exhibits of historic locations.
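The detect-a-color-block-then-trace-its-edge idea can be sketched in a few lines. This toy version (not the UEA team's actual code) flood-fills one same-color region of a rasterized map and then marks the pixels that touch a different color as the building's outline:

```python
# Toy sketch of the color-block approach described above: flood-fill a
# connected region of one color, then take the pixels bordering any
# other color as the building outline.
from collections import deque

def find_block(grid, start):
    """Flood-fill from `start`, returning all connected same-color pixels."""
    h, w = len(grid), len(grid[0])
    color = grid[start[0]][start[1]]
    seen, queue = {start}, deque([start])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < h and 0 <= nc < w
                    and (nr, nc) not in seen and grid[nr][nc] == color):
                seen.add((nr, nc))
                queue.append((nr, nc))
    return seen

def outline(grid, block):
    """Pixels in the block that touch a different color or the map edge."""
    h, w = len(grid), len(grid[0])
    r0, c0 = next(iter(block))
    color = grid[r0][c0]
    edge = set()
    for r, c in block:
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if not (0 <= nr < h and 0 <= nc < w) or grid[nr][nc] != color:
                edge.add((r, c))
                break
    return edge

# 'B' marks a building on a tiny 5x5 map scan; '.' is background.
grid = [list(row) for row in [".....", ".BBB.", ".BBB.", ".BBB.", "....."]]
block = find_block(grid, (1, 1))
print(len(block), len(outline(grid, block)))  # 9 pixels, 8 on the edge
```

Real map scans would of course need color quantization and noise handling first; the black-and-white mode the article mentions corresponds to seeding the flood fill with the user's click.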

Neural Networks Make Intelligent Sensors and Smarter Smart Grids
Idaho National Laboratory (02/08/11) Kortny Rolston

Idaho National Laboratory researcher Milos Manic is combining neural networks with fuzzy mathematics to create self-organizing data maps, which reveal similarities and interdependencies in data for researchers to analyze. The maps are sent to a computer-assisted virtual environment (CAVE), where researchers can view the data in three dimensions. Manic says that self-organizing maps enable computers to organize and visualize data in a way that reveals correlations that might otherwise go unnoticed. One research project involves smart cybersensors for industrial control and smart grid systems. "All of this allows researchers to see data in new ways and pick out patterns and explain scientifically why something is occurring," Manic says.
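A self-organizing map clusters similar inputs onto nearby map nodes without any labels. A minimal one-dimensional sketch of the classic Kohonen training loop (illustrative only, not INL's system):

```python
# Minimal 1-D self-organizing map (Kohonen) sketch. Each node holds a
# weight vector; training pulls the best-matching node and its immediate
# neighbors toward each input, so similar inputs cluster on nearby nodes.
import math
import random

def train_som(data, n_nodes=5, epochs=200, lr=0.3):
    dim = len(data[0])
    rng = random.Random(0)  # fixed seed for reproducibility
    nodes = [[rng.random() for _ in range(dim)] for _ in range(n_nodes)]
    for _ in range(epochs):
        for x in data:
            # best-matching unit: the node closest to this input
            bmu = min(range(n_nodes),
                      key=lambda i: sum((nodes[i][d] - x[d]) ** 2
                                        for d in range(dim)))
            # pull the BMU and its neighbors toward the input,
            # with influence decaying with distance from the BMU
            for i in range(max(0, bmu - 1), min(n_nodes, bmu + 2)):
                w = lr * math.exp(-abs(i - bmu))
                for d in range(dim):
                    nodes[i][d] += w * (x[d] - nodes[i][d])
    return nodes

def best_match(nodes, x):
    return min(range(len(nodes)),
               key=lambda i: sum((nodes[i][d] - x[d]) ** 2
                                 for d in range(len(x))))

# Two well-separated clusters should land on different map nodes.
data = [(0.1, 0.1), (0.15, 0.05), (0.9, 0.95), (0.85, 0.9)]
nodes = train_som(data)
print(best_match(nodes, (0.1, 0.1)), best_match(nodes, (0.9, 0.9)))
```

Production SOMs decay the learning rate and neighborhood radius over time and use 2-D node grids, which is what makes the 3-D CAVE visualizations described above possible.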

Signal Moment for Drivers as Software Sends Calls to Voicemail
Sydney Morning Herald (Australia) (02/09/11) Nicky Phillips

Researchers are developing technology designed to reduce driver distraction. A team at Queensland University of Technology has developed software that determines the safest way for drivers to interact with technology while behind the wheel. The software uses a brain monitor to measure the driver's cognitive workload, and a global positioning system to chart the maneuvers being performed. By combining the two streams of information, the software can provide an overall assessment of the driver's workload. "If the driver is in the middle of a complex maneuver then we need to delay the delivery of a message," says the Center for Accident Research and Road Safety-Queensland's Andry Rakotonirainy. "If a driver is performing a right-hand turn at an intersection that requires a lot of cognitive workload, an incoming phone call would automatically be sent to voicemail." The researchers also have assessed the best way to deliver a message to a driver, finding that a tactile warning such as a vibrating seat is less distracting than a visual or audible one.
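The routing decision the article describes boils down to a threshold rule over the combined workload signals. A hypothetical sketch (the function name, inputs, and threshold are illustrative assumptions, not the QUT system's design):

```python
# Hypothetical call-routing rule: hold incoming calls while the driver's
# measured workload is high. Inputs are normalized 0-1 scores, e.g. from
# a brain monitor (cognitive load) and GPS-derived maneuver analysis.
def route_call(cognitive_load, maneuver_complexity, threshold=0.7):
    """Return where an incoming call should go given current workload."""
    workload = max(cognitive_load, maneuver_complexity)
    return "voicemail" if workload >= threshold else "driver"

print(route_call(0.9, 0.4))  # mid-maneuver -> "voicemail"
print(route_call(0.2, 0.1))  # light workload -> "driver"
```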

HP's Open Innovation Strategy: Leveraging Academic Labs
Technology Review (02/08/11) Neil Savage

Hewlett-Packard Labs launched the Innovation Research Program four years ago as part of its effort to embrace ideas and technologies that come from other sources, and to enable other researchers to adopt its technology. The program seeks proposals on 26 topics within the eight broad themes of HP Labs, including cloud computing, digital printing, and sustainability, and gives grants of $50,000 to $75,000 to university researchers. Rich Friedrich, director of the Open Innovation Office at HP Labs, says the idea is to identify long-term goals and determine what it takes to reach them. HP researchers and grant recipients have co-authored about 200 journal papers, and at least 21 invention disclosures, the first step in securing a patent, have been filed. Grant winners must sign an agreement on sharing any intellectual property that results from the research. Alan Willner, a grant winner and an electrical engineer at the University of Southern California, is working to improve signal processing in optical interconnects on computer chips. HP plans to double the performance of such interconnects by next year and increase it 20-fold by 2017.

Can 'Encrypted Blobs' Help With Secure Cloud Computing?
Network World (02/04/11) Ellen Messmer

IBM researchers say cloud computing security can be improved by using fully homomorphic encryption to send data as encrypted blobs, which can be processed without ever having to be decrypted. Breakthrough mathematical work in fully homomorphic encryption done by IBM researcher Craig Gentry is providing a "foundation for the encrypted path" that IBM thinks could radically improve how data can be kept secret and confidential, says IBM's J.R. Rao. The idea is to create encrypted blobs that do not have to be decrypted and still provide practical applications by being combined with and processed by other encrypted blobs. Such applications include computational arithmetic on encrypted data and privacy enhancements for Web services. Encrypted blobs also would offer a way to store data in the computer cloud so that there could be authorized processing of it without it having to be converted into cleartext and then re-encrypted. The goal is that "if end users are submitting private and sensitive information in the cloud," they would know that data would be kept confidential as encrypted blobs, Rao says.
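The homomorphic property itself can be demonstrated with textbook RSA, which is multiplicatively homomorphic: multiplying two ciphertexts yields a ciphertext of the product. This toy example (tiny, insecure key sizes; not Gentry's scheme, which supports arbitrary computation) shows the core idea of operating on "blobs" without decrypting them:

```python
# Toy demonstration of homomorphic encryption using textbook RSA:
# E(a) * E(b) mod n = E(a * b), so a third party can multiply values
# it cannot read. Fully homomorphic encryption extends this to both
# addition and multiplication, and hence arbitrary computation.

p, q = 61, 53                # toy primes -- far too small for real use
n = p * q                    # modulus, 3233
phi = (p - 1) * (q - 1)      # 3120
e = 17                       # public exponent, coprime with phi
d = pow(e, -1, phi)          # private exponent (modular inverse, Python 3.8+)

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

a, b = 7, 6
blob = (encrypt(a) * encrypt(b)) % n  # multiply ciphertexts only
print(decrypt(blob))                  # 42 == a * b, computed "in the blob"
```

The catch, and the reason Gentry's 2009 result was a breakthrough, is supporting unlimited additions *and* multiplications on the same ciphertext while keeping decryption correct.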

Researchers Develop Transistor With High On/Off Switching Ratio
The Engineer (United Kingdom) (02/04/11)

Southampton University researchers have developed a graphene transistor with an on/off switching ratio that is 1,000 times higher than existing devices, a development that could increase the performance and functionality of future electronic devices. Southampton's Zakaria Moktadir led the development of a graphene field-effect transistor (GFET) with a nanoscale channel structure. "Other researchers had looked at graphene as a possibility, but found that one of the drawbacks was that graphene's intrinsic physical properties make it difficult to turn off the current flow," Moktadir says. The researchers solved this problem by introducing geometrical singularities into bilayer graphene nanowires using a helium-ion beam microscope, which produced the high on/off switching ratio. "Introducing geometrical singularities into the graphene channel is a new concept that achieves superior performance while keeping the GFET structure simple and therefore commercially exploitable," says Southampton professor Harvey Rutt.

What Is the Best Way to Protect U.S. Critical Infrastructure From a Cyber Attack?
Scientific American (02/04/11) Larry Greenemeier

Egypt's shutoff of Internet access has spurred anxieties of similar suppression elsewhere through an Internet kill switch, and compounding this fear is legislation that would empower the White House to sever critical U.S. infrastructure from the Internet in the event of a major cyberattack. James Lewis with the Center for Strategic and International Studies' Technology and Public Policy Program says the Protecting Cyberspace as a National Asset Act of 2010 provides the Department of Homeland Security with the authority to require security in critical infrastructure, which is essential because "when it comes to national security we can't depend on voluntary action." The bill is being redrafted as the authors try to determine an effective strategy for deterring cyberattacks. Lewis says the bill does not truly give the government control over the Internet, but rather authorizes it to disconnect compromised companies from the Internet without shutting it down. He says there must be serious debate on whether the government should have the right to intercede through regulation or through disconnection capability. He notes that many major militaries now possess cyberattack capability, and probably 20 to 30 nations are attempting to obtain it.

Abstract News © Copyright 2011 INFORMATION, INC.
Powered by Information, Inc.

To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe