Association for Computing Machinery
Welcome to the February 23, 2011 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Also, please download our new ACM TechNews iPhone App and our new ACM TechNews iPad App from the iTunes Store.

HEADLINES AT A GLANCE


More Companies Plan to Put R&D Overseas
Wall Street Journal (02/22/11) Joe Light

Emerging markets will handle more core research and development (R&D) from developed nations in the years to come, according to a recent Ernst & Young study. Sixteen percent of companies in developed nations now spend more than a quarter of their R&D budget in emerging markets, but 28 percent plan to surpass that mark in five years. In North America, about 11 percent of companies said emerging markets account for more than a quarter of their R&D spending, but more than 23 percent expect to reach that amount in the next few years. Only 7 percent of companies in Western Europe spend more than a quarter of their R&D budget in emerging markets, but about 19 percent plan to hit that mark within five years. "China, India, and Brazil are becoming true centers of innovation and research," says University of Maryland professor Anil Gupta. He says the work skills in emerging markets continue to improve, and researchers in emerging markets can be hired for about a quarter of the salary of a U.S. researcher.


CREAF Hosts the Start of a Project to Improve the Usability of Global Environmental Data
Center for Ecological Research and Forestry Applications (02/18/11)

A project headed by the Center for Ecological Research and Forestry Applications (CREAF) and co-funded by the European Commission aims to enable the assessment and mitigation of catastrophic environmental effects and to provide a rigorous scientific foundation for debates on climate change and other controversial subjects. CREAF recently hosted the inaugural conference of the European GeoViQua project with a workshop that brought together all of the project partners and various international experts. The project aims to develop a rigorous scheme for estimating the quality, uncertainty, and usability of global environmental data and to distribute that information to users through browsers and data viewers. GeoViQua will play a role in the development of the Global Earth Observation System of Systems (GEOSS), whose chief objective is to produce a global, public environmental data network that is understandable and available in real time. GEOSS content and accessibility will deepen the understanding of planetary dynamics and shape environmental policies. GeoViQua will take environmental information from various international entities and supply methodologies to improve data documentation by formalizing quality parameters and metrics and extracting them from data sources, validation procedures, or expert users' commentary.


Intel Aims to Reshape Chips for Next-Gen Mobile Devices
IDG News Service (02/21/11) Agam Shah

Intel researchers are working to advance computer chip technology to improve the security and functionality of mobile devices. The company is studying specialized graphics accelerators, hardware layers, and new sensors and accelerators to secure mobile devices, measure air temperature and quality, check speed, and monitor location. Intel researchers already have developed smartphone sensors to measure air quality, targeting carbon monoxide, nitrogen oxide and ozone. However, advanced accelerators and security hardware still need to be developed to work within power restrictions of the software and devices, says Intel's Dadi Perlmutter. Intel researchers also are developing integrated security hardware that could store passwords and capture voice, fingerprint, or eye images to identify users. In addition, the researchers are studying methods to develop more advanced mobile chips and new software to deliver better visual experiences. For example, Intel is developing software tools that produce interactive graphics, such as three-dimensional images that can identify and react to human gestures, says Intel's Shekhar Borkar.


New Technology Shares Online Video in High Quality
Cornell Chronicle (02/21/11) Bill Steele

Most video-sharing sites compress files, reducing their quality and limiting their length. However, Cornell University computer scientists led by professor Gün Sirer have developed a video-sharing service called FlixQ that removes the limits on video quality and length without increasing bandwidth or storage costs. Sirer says that FlixQ technology could substantially reduce the cost of distributing data by minimizing the amount of central server space needed. The key to the technology is a hybrid peer-to-peer system that pairs a central server with a distributed network, enabling FlixQ to cache its files so a user can get all of the data from peers, the server, or parts from both. "People have tried peer-to-peer before, but hybrid peer-to-peer combines the efficiency of peer-to-peer with the centralized control and management of traditional systems," Sirer says. The technology could help large organizations reduce the power requirements of data centers, and it also could help smaller groups more efficiently share information.
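
As a rough illustration of the hybrid approach (a minimal Python sketch, not FlixQ's actual code; the peer and server objects and their get_chunk method are hypothetical), a client can request each chunk of a video from peers first and fall back to the central server only for chunks no peer can supply:

    def fetch_video(chunk_ids, peers, origin_server):
        """Assemble a video from peer caches where possible, the central server otherwise."""
        chunks = {}
        for chunk_id in chunk_ids:
            data = None
            for peer in peers:                    # try the distributed network first
                data = peer.get_chunk(chunk_id)   # assumed to return None on a cache miss
                if data is not None:
                    break
            if data is None:                      # no peer had it: fall back to the server
                data = origin_server.get_chunk(chunk_id)
            chunks[chunk_id] = data
        return b"".join(chunks[c] for c in chunk_ids)

Because most chunks end up served by peers, the central server handles only the misses, which is where the bandwidth and storage savings described above would come from.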


New Approach to Programming May Boost 'Green' Computing
Binghamton University (02/21/11) Rachael Coker

Binghamton University computer scientist Yu David Liu recently received a U.S. National Science Foundation grant to develop energy-efficient software. "Saving energy is an activity that should come from many layers," says Liu, who plans to build energy-related parameters into a programming language. A change at the programming language level would enable and encourage programmers to directly incorporate energy-saving ideas into software development. Energy-efficient solutions for programming languages also require a high level of platform independence, enabling them to have an impact in various technological devices, including phones and servers. Liu points out that although energy-aware programming is not currently supported by mainstream computer languages, language designers frequently generate an extensible template. New energy-efficient language designs could influence how programmers develop new applications in the future, Liu says. "Sometime in the future, every Computer Science 101 class may include a lecture or two on energy-aware programming," he notes.
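
To make the idea concrete, here is a minimal, hypothetical Python sketch of what source-level energy awareness could look like; it is not Liu's language design, and all names are illustrative. A program declares an accurate and a cheap variant of an operation, and the chosen variant depends on the platform's power state:

    ENERGY_MODE = "low"   # in practice this would come from the platform's power state

    def energy_variants(high, low):
        """Return a callable that picks an implementation based on the energy mode."""
        def chooser(*args, **kwargs):
            impl = low if ENERGY_MODE == "low" else high
            return impl(*args, **kwargs)
        return chooser

    def precise_render(frame):
        return [round(p ** 0.5, 6) for p in frame]        # full-precision path

    def coarse_render(frame):
        return [round(p ** 0.5, 1) for p in frame[::2]]   # fewer pixels, lower precision

    render = energy_variants(high=precise_render, low=coarse_render)
    print(render([4.0, 9.0, 16.0]))                       # uses the low-energy variant here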


Few Students Make Time to Study Computer Science
Pittsburgh Tribune-Review (PA) (02/20/11) Amy Crawford

ACM and the Computer Science Teachers Association's recent Running on Empty report examined the decline in the study of computer science in U.S. public schools. "As the digital age has transformed the world and workforce, U.S. K-12 education has fallen woefully behind in preparing students with the fundamental computer science knowledge and skills they need for future success," the report says. The study found that between 2005 and 2009 the number of secondary schools offering introductory computer science courses dropped by 17 percent, and the number of high schools offering Advanced Placement computer science fell by 35 percent. Some of the study's researchers, such as Carnegie Mellon University graduate student Leigh Ann Sudol-DeLyser, say states should have computer science standards that are a required part of the curriculum. Computer software engineering and information technology are among the fastest growing careers, with more than 300,000 additional jobs expected to be created by 2018, according to the U.S. Bureau of Labor Statistics. "It's not only important for a student to learn to write a letter in Microsoft Word," says Sudol-DeLyser, explaining that every student should learn about basic computer security, media production and simple programming, and interested students should be encouraged to study computer science in depth.


Security and Privacy Issues in the PDF Document Format
Universidad Politecnica de Madrid (Spain) (02/17/11) Eduardo Martinez

Researchers at Universidad Politecnica de Madrid (UPM) recently conducted a study examining security and privacy threats related to digital document publishing. The study focused on the PDF document format and addressed publisher-related information that is leaked once the document is distributed over the Internet. The UPM researchers developed several tools that extract information from PDF documents. The researchers say that users can be in danger every time a digital document is downloaded. For example, the study notes that metadata such as the user name or the last day the document was edited can lead to privacy breaches, since most document authors are not aware that the information remains available once the document is published. Meanwhile, the researchers found that poor document format design is responsible for leaking other potentially sensitive information. For example, the researchers note that when a paragraph is deleted, PDF authoring applications do not remove the text but instead mark it as invisible. As a result, the data can be read by malicious users who know what to look for. The researchers' main goal is to make users aware of the risks associated with publishing a document on the Internet and to provide effective guidelines to minimize the leakage of sensitive information.
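
As a rough illustration of how little effort such extraction takes (a hedged sketch, not the UPM researchers' tools), a short Python script can scan a PDF's raw bytes for common Info-dictionary keys such as /Author and /ModDate:

    import re

    def leaked_metadata(path):
        """Report plain-text Info-dictionary entries found in a PDF file."""
        data = open(path, "rb").read()
        found = {}
        for key in (b"/Author", b"/Creator", b"/Producer", b"/ModDate"):
            match = re.search(key + rb"\s*\(([^)]*)\)", data)
            if match:
                found[key.decode()] = match.group(1).decode("latin-1", "replace")
        return found

    # Example: print(leaked_metadata("report.pdf"))  ->  {'/Author': 'jsmith', ...}

Real documents may also store these values as hexadecimal strings or inside compressed object streams, so dedicated tools dig considerably deeper, but even a naive scan like this can recover an author's user name and editing dates.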


Behind the Information Overload Hype
Wall Street Journal (02/19/11) Carl Bialik

Despite findings from a study published in Science that the world's information storage, communication, and computation abilities have increased by at least 23 percent annually since 1986, the accompanying information glut is not as immense as those statistics imply. A great deal of the expansion mirrors the growth in high-resolution video and photos, while each piece of new information is being consumed by far fewer people than before, on average. Almost 50 percent of the general growth stems from rapid upgrades in hard drive technology, making possible the storage of high-resolution videos, photos, video games, and digital music. The Science study determined that in 2007 the human race was able to store 295 exabytes of information, equivalent to about 500 million times an average desktop computer's capacity. The preliminary results of a study to measure information in terms of how much time is devoted to its consumption found that in 2005 people spent approximately one minute consuming media for every 1,000 minutes available, and this ratio has grown by about a factor of 10 since 1960. University of Michigan professor W. Russell Neuman, who is leading the study, says our capacity to use or filter information also can grow rapidly along with the volume of information. Neuman says that by quantifying the world's information in terms of bytes, researchers commit the error of "focusing simply on capacities of machines, and not on how people are responding to the capacities of machines."
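
For scale, a quick back-of-the-envelope check of that comparison, using the study's round figures (a sketch, not the study's own calculation):

    EXABYTE = 10 ** 18
    total_bytes = 295 * EXABYTE                  # the study's 2007 storage figure
    per_desktop = total_bytes / 500_000_000      # "500 million times an average desktop"
    print(per_desktop / 10 ** 9)                 # roughly 590.0, i.e. about 590 GB per machine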


New Anti-Laser Tech Paves Way for Optical Computing
InfoWorld (02/18/11) Joab Jackson

Yale University researchers have created an anti-laser, a device that can eliminate beams of light produced by a laser, which could be a key component of future optical computers. The researchers say their work is the first demonstration of a device that can completely absorb coherent light of a particular wavelength. "After some research, we found that several physicists had hinted at the concept in books and scientific papers, but no one had ever developed the idea," says Yale's A. Douglas Stone. The researchers call their device a Coherent Perfect Absorber (CPA): a silicon wafer that can trap and dissipate incoming coherent light of a predefined wavelength. In other words, just as a laser generates coherent light, the CPA absorbs it, dissipating the light's energy as heat. The researchers say the CPA could help solve the problem of managing and manipulating the light used to encode information, which has been a major issue for optical computing development. The CPA could lead to optical switches that replace modern transistors in optical computers, allowing for smaller, more powerful devices. Although the current CPA absorbs 99.4 percent of all the light it receives, the researchers would like to achieve a 99.999 percent absorption rate.


SSDs Harder to Securely Purge of Data Than HDDs
eWeek (02/18/11) Fahmida Y. Rashid

University of California researchers have found that current disk sanitization techniques do not work on solid state drives (SSDs) because the technology's internal architecture differs from that of a hard disk drive. Sanitization is defined as erasing all or part of a storage device in such a way that the data it contained is difficult or impossible to recover, according to the paper. "Reliable SSD sanitization requires built-in, verifiable sanitize operations," the researchers say. Although manufacturers implement standard sanitization techniques on most of their drives, the researchers found that many of the implementations were flawed. The researchers tested two standard commands, ERASE UNIT and ERASE UNIT ENH, and found that just four of the seven drives tested executed ERASE UNIT properly. "Standardized commands should work correctly almost indefinitely," the researchers say. Although current software applications for erasing an entire drive work a majority of the time, the researchers found that none of the available software techniques for securely removing individual files was effective on flash-based SSDs.
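
The kind of verification the paper calls for can be illustrated with a hedged Python sketch (not the researchers' actual tooling): write an unmistakable fingerprint to the drive, run the sanitize operation, and then scan the raw device for surviving copies:

    FINGERPRINT = b"SANITIZE-TEST-0123456789ABCDEF"

    def count_survivors(raw_device_path, block_size=1 << 20):
        """Count copies of the fingerprint still readable after an erase operation."""
        survivors = 0
        tail = b""
        with open(raw_device_path, "rb") as dev:
            while True:
                block = dev.read(block_size)
                if not block:
                    break
                survivors += (tail + block).count(FINGERPRINT)
                tail = block[-(len(FINGERPRINT) - 1):]   # catch matches spanning block edges
        return survivors

On an SSD, copies hidden behind the flash translation layer may not even be visible through the normal block interface, which is exactly why the researchers stress built-in, verifiable sanitize operations rather than software overwriting.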


Scientists Steer Car With the Power of Thought
Freie University Berlin (Germany) (02/17/11)

Researchers at Freie University Berlin and AutoNOMOS have developed a system that enables human brain waves to control a car's steering. The system uses electroencephalogram (EEG) sensors to measure brain waves and employs algorithms to differentiate between the bioelectrical wave patterns for control commands such as left, right, accelerate, and brake. The researchers also developed an interface that connects the sensors to a computer-controlled car, enabling it to be driven with the user's thoughts. "In our test runs, a driver equipped with EEG sensors was able to control the car with no problem--there was only a slight delay between the envisaged commands and the response of the car," says Freie University Berlin professor Raul Rojas.
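
A minimal, hypothetical sketch (not the AutoNOMOS system's algorithm) of how such a pipeline can map an EEG window to a driving command: compute band-power features per channel and match them against reference patterns recorded during calibration:

    import numpy as np

    COMMANDS = ["left", "right", "accelerate", "brake"]

    def band_power(window, fs, lo=8.0, hi=30.0):
        """Mean spectral power per EEG channel in the lo-hi Hz band (window: channels x samples)."""
        spectrum = np.abs(np.fft.rfft(window, axis=1)) ** 2
        freqs = np.fft.rfftfreq(window.shape[1], d=1.0 / fs)
        band = (freqs >= lo) & (freqs <= hi)
        return spectrum[:, band].mean(axis=1)

    def classify(window, fs, centroids):
        """centroids maps each command to a reference feature vector learned offline."""
        features = band_power(window, fs)
        return min(COMMANDS, key=lambda c: np.linalg.norm(features - centroids[c]))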


3D Video Without the Goggles
University of Southampton (United Kingdom) (02/17/11) Joyce Lewis

Researchers at the University of Southampton are working on a three-dimensional (3D) video system that would be capable of supporting flawless videoconferencing and home entertainment without the use of goggles. The telepresence research project will make use of 3D Avatar-style stereoscopic video and audio communications. The team recently made significant investments in 3D cameras and displays as well as holographic visualization facilities. "Our system is expected to become more 'immersive' by dispensing with the inconvenience of wearing goggles," says Southampton's Lajos Hanzo. "The first stage is to conceive flawless, immersive videoconferencing concepts and then to transfer the design principles to shirt-pocket-sized compact mobile devices, such as camera phones, within the next decade." In addition to ensuring the video is transmitted without errors, the team wants to develop a green wireless system that requires less energy.


Cerf: 2011 Will Be Proving Point for 'InterPlanetary Internet'
Network World (02/18/11) Julie Bort

The InterPlanetary Internet initiative to extend the Internet into outer space is using a new Bundle Protocol that was developed as part of a more general approach to delay- and disruption-tolerant networking, says Google's Vint Cerf. The InterPlanetary protocols have been uploaded to the EPOXI spacecraft for testing. Cerf says the focus this year is to "space qualify" the interplanetary protocols in order to standardize them and make them available to other countries. "So what can happen over time is that we can literally grow an interplanetary network that can support both man and robotic exploration," Cerf says. Meanwhile, he also is promoting the SPDY Internet protocol, a Google effort to make Internet implementations more efficient. Cerf is unwilling to rule out the Semantic Web effort, which has been ongoing for about a decade, as a potential positive outcome, because Tim Berners-Lee has been successful in the past. A number of groups are working on cloud standards, but the real issue will be implementation and testing, Cerf says. "Until we have some serious experience in getting clouds to interact with each other in various ways, I think we won't know what works and what doesn't," he says.
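
The store-and-forward idea behind such a delay- and disruption-tolerant network can be sketched in a few lines of Python (a toy illustration, not the Bundle Protocol implementation): a node holds bundles in custody while no link is available and forwards them when a contact window to the next hop opens, instead of dropping traffic the way a conventional IP router would:

    from collections import deque

    class DTNNode:
        def __init__(self, name):
            self.name = name
            self.stored = deque()          # bundles held in custody

        def receive(self, bundle):
            self.stored.append(bundle)     # keep the bundle, however long the wait

        def contact(self, next_hop):
            """Forward stored bundles while a (possibly brief) link to the next hop is up."""
            while self.stored:
                next_hop.receive(self.stored.popleft())

    # earth = DTNNode("ground station"); orbiter = DTNNode("relay orbiter")
    # earth.receive(b"command bundle")
    # ...hours later, when the orbiter is in view...
    # earth.contact(orbiter)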


Abstract News © Copyright 2011 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe