Association for Computing Machinery
Welcome to the December 15, 2010 edition of ACM TechNews, providing timely information for IT professionals three times a week.

HEADLINES AT A GLANCE


China Surpasses Japan in R&D as Powers Shift
Wall Street Journal (12/14/10) Gautam Naik

China is expected to become the world's second-biggest spender on research and development (R&D) in 2011, overtaking Japan and trailing only the United States, according to a Battelle Memorial Institute report. Battelle projects that China will spend $153.7 billion on R&D in 2011, compared to Japan's $144.1 billion, while the U.S. is expected to increase its $395.8 billion 2010 R&D budget by 2.4 percent. "China has sustained this kind of growth [in R&D spending] for a number of years and they're sticking to it regardless of what's going on in the global economic cycle," says Battelle's Martin Grueber. China is focusing its R&D efforts on certain fields, including alternative energy, life sciences, and advanced materials. Although U.S. companies are investing more in R&D now that the economy appears to be trending upward, they are still below their historic R&D spending rate. "In a perfect world, the industry rate would be greater than five percent or even seven percent," Grueber says. Many large firms cut R&D spending in the first nine months of 2009, compared to the same period in 2008, but increased spending in the first part of 2010.


IBM Xeon-Based Supercomputer to Hit Three Petaflops
eWeek Europe (United Kingdom) (12/14/10) Matthew Broersma

IBM plans to build an Intel Xeon-based supercomputer that will reach a peak speed of three petaflops and use a hot water-cooling system, which will result in 40 percent less power consumption than an air-cooled machine. The system, called SuperMUC, will be housed at the Leibniz Supercomputing Centre (LRZ) as part of the Partnership for Advanced Computing in Europe high-performance computing (HPC) infrastructure, according to Intel. The SuperMUC system will be IBM's second water-cooled supercomputer, following the Aquasar system that was set up at the Swiss Federal Institute of Technology Zurich in July. IBM's hot-water cooling technique cools HPC components with warm water and uses micro-channel liquid coolers hooked directly to the processors. Water generally removes heat 4,000 times more efficiently than air, according to Intel. "SuperMUC will provide previously unattainable energy efficiency along with peak performance by exploiting the massive parallelism of Intel's multicore processors and leveraging the innovative hot-water cooling technology pioneered by IBM," says LRZ's Arndt Bode. The system will use more than 14,000 Xeon processors. IBM's development team will rely on Intel researchers for energy-efficiency contributions, and LRZ researchers for their expertise in high-end supercomputing systems.
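
The cooling claim can be sanity-checked with the basic heat-transport relation Q = flow rate x specific heat x temperature rise. The short Python sketch below runs that arithmetic; the flow rate and temperatures are illustrative assumptions, not published SuperMUC figures.

    # Back-of-the-envelope estimate of heat carried away by a water loop,
    # using Q = m_dot * c_p * dT. The flow rate and temperatures are
    # assumed values for illustration, not SuperMUC specifications.
    WATER_SPECIFIC_HEAT = 4186.0  # J/(kg*K)
    FLOW_RATE_KG_S = 10.0         # assumed coolant mass flow
    INLET_TEMP_C = 40.0           # "hot water" cooling uses a warm inlet
    OUTLET_TEMP_C = 55.0          # assumed outlet after absorbing heat

    delta_t = OUTLET_TEMP_C - INLET_TEMP_C
    heat_removed_watts = FLOW_RATE_KG_S * WATER_SPECIFIC_HEAT * delta_t
    print(f"Heat removed: {heat_removed_watts / 1e3:.0f} kW")  # ~628 kW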


At SIGGRAPH Asia, the Sky Is the Limit
JoongAng Daily (South Korea) (12/14/10) Jung Seung-hyun

The focal point of SIGGRAPH Asia 2010 is the Computer Animation Festival, which showcases the best in computer animation and visual effects. The festival includes the Electronic Theater, which screens several computer animation and visual-effects movies, and the Animation Theater, which screens digital short films. "The SIGGRAPH Asia 2010 Computer Animation Festival has received more entries this year, allowing us to broaden the scope of programs and include stronger content," says festival chair Jinny HyeJin Choo. SIGGRAPH Asia 2010 features 761 pieces from 47 countries, with more than 40 percent coming from Asian countries. The Courses program, a series of educational sessions on the theory, application, and inspiration behind computer graphics advancements, is another big part of SIGGRAPH Asia 2010. "There is a good balance between technical and artistic courses, with tracks covering gaming, animation techniques, visual effects, real-time techniques, GPU computing, and the fundamentals of computer graphics," says Courses chair Sophie Revillard. The SIGGRAPH Asia conference also features the Technical Papers, Technical Sketches and Posters, and Trade Exhibition programs.


Cryptographers Chosen to Duke It Out in Final Fight
New Scientist (12/13/10) Celeste Biever

The U.S. National Institute of Standards and Technology (NIST) has selected five entrants as finalists in its SHA-3 competition to find a successor to the current gold-standard hash algorithm, SHA-2. The finalists include BLAKE, devised by a team led by Jean-Philippe Aumasson of the Swiss company Nagravision, and Skein, which is the work of computer security expert and blogger Bruce Schneier. "We picked five finalists that seemed to have the best combination of confidence in the security of the algorithm and their performance on a wide range of platforms" such as desktop computers and servers, says NIST's William Burr. "We wanted a set of finalists that were different internally, so that a new attack would be less likely to damage all of them, just as biological diversity makes it less likely that a single disease can wipe out all the members of a species." The finalists incorporate new design ideas that have arisen in recent years. The Keccak algorithm, from a team led by STMicroelectronics' Guido Bertoni, uses a novel idea called sponge hash construction to produce a final string of 1s and 0s. The teams have until Jan. 16, 2011, to tweak their algorithms, after which an international community of cryptanalysts will spend a year looking for weaknesses. NIST will pick a winner in 2012.
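
For readers unfamiliar with what these algorithms compute: a cryptographic hash maps any message to a fixed-length digest, and flipping one input bit scrambles the whole output. The sketch below illustrates this with SHA-256 from Python's standard library, which stands in here for the SHA-3 candidates (not yet available in common libraries).

    import hashlib

    # A hash function maps an arbitrary message to a fixed-length digest;
    # SHA-256 stands in for the SHA-3 candidates in this illustration.
    for message in (b"hello world", b"hello worle"):
        print(message, "->", hashlib.sha256(message).hexdigest())
    # The two digests share no visible structure even though the inputs
    # differ by a single character (the "avalanche effect").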


Physical Protection for the Internet
AlphaGalileo (12/14/10)

Swiss Federal Institute of Technology Zurich (ETH) researchers say that protecting critical Internet infrastructure, such as servers and data hubs, from physical attacks could be just as important to maintaining network functions as defending against cyberattacks. The ETH team found that physical damage to important communication networks can result in massive failures of different critical infrastructures, largely due to the interdependency of these systems. "Effective functioning of today's societies is based on critical infrastructures, i.e., large-scale infrastructures whose degradation, disruption, or destruction would have a serious impact on the health, safety, security, or well-being of citizens or the effective functioning of governments and/or economy," says ETH researcher Ling Zhou. The researchers used the Swiss national network for research and education to study how physical damage to network components could affect the Internet as a whole, and they developed three strategies for protecting network infrastructure. First, common connection points should be viewed and protected as important Internet infrastructure. Second, Internet service providers should be supported in protecting themselves and encouraged to diversify the physical routing of fiber-optic cables. Finally, national governments must cooperate with service providers to develop safety standards for the currently unregulated sector of information and communication technologies.
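
The kind of interdependency analysis described can be approximated by removing a node from a network graph and measuring how much of the network stays connected. The Python sketch below does this with the networkx library on a toy topology; the node names and links are invented stand-ins, not the Swiss research network the team actually studied.

    import networkx as nx

    # Toy topology; the real study used the Swiss research and
    # education network, not these invented links.
    g = nx.Graph([
        ("Zurich", "Bern"), ("Zurich", "Basel"), ("Bern", "Geneva"),
        ("Basel", "Geneva"), ("Zurich", "Lugano"), ("Geneva", "Lugano"),
    ])

    def connected_fraction(graph):
        """Fraction of nodes remaining in the largest connected piece."""
        biggest = max(len(c) for c in nx.connected_components(graph))
        return biggest / graph.number_of_nodes()

    # Simulate the physical destruction of each hub in turn.
    for hub in list(g.nodes):
        damaged = g.copy()
        damaged.remove_node(hub)
        print(f"Losing {hub}: {connected_fraction(damaged):.0%} still connected")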


Senator Proposes Cybersecurity Standards
InformationWeek (12/13/10) Elizabeth Montalbano

U.S. Sen. Ben Cardin (D-Md.) has introduced the Internet and Cybersecurity Safety Standards Act, legislation that requires top government officials to determine whether it would be cost effective to mandate that Internet service providers and others develop and enforce cybersecurity safety standards. The bill also requires the secretary of homeland security, the attorney general, and the commerce secretary to determine what impact the standards would have on homeland security, the economy, innovation, individual freedoms, and privacy. Before the standards are finalized, officials also must work together with several private-sector organizations, including companies that would be impacted by the standards and experts on technology. The bill gives officials one year to offer Congress recommendations on standards that would cover all Internet-connected devices. "We live in a digital world and we need to arm ourselves with the right tools to prevent a digital 9/11 before it occurs," Cardin says. "Failure to take such steps to protect our nation's infrastructure and its key resources could wreak untold havoc for millions of Americans and businesses, as well as our national security."


Maths Research to Improve Internet Reliability
University of Adelaide (12/13/10)

The Internet Traffic-Matrix Synthesis project will help make Internet services more reliable and efficient, according to University of Adelaide researchers. They say the project will synthesize Internet traffic matrices, based on the patterns of real network traffic, which will enable network researchers to test the designs of communication networks. "A traffic matrix looks at the volume, movement and type of data traffic over all starting points and all destinations," says Adelaide professor Matt Roughan. "This work is about discovering the mathematical underpinnings of these patterns and using them to generate new traffic matrices, against which network designs can be tested." Network researchers have not been able to use real Internet traffic data to test designs due to commercial sensitivity and privacy issues. Comparing synthetic traffic matrices with real matrices will enable network providers and researchers to have a better understanding of actual traffic patterns. Researchers from AT&T Labs Research will participate in the project.
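
One widely used starting point for synthesizing such matrices is the gravity model, in which the traffic from origin i to destination j is proportional to the total traffic leaving i times the total traffic entering j. The numpy sketch below generates a small synthetic matrix this way; the per-node totals are random stand-ins, and the project's actual models may be more sophisticated.

    import numpy as np

    rng = np.random.default_rng(0)

    # Assumed per-node traffic totals (random stand-ins for measured
    # ingress/egress volumes at each network node).
    traffic_out = rng.exponential(scale=10.0, size=5)  # leaving node i
    traffic_in = rng.exponential(scale=10.0, size=5)   # entering node j

    # Gravity model: demand i->j is proportional to out[i] * in[j].
    matrix = np.outer(traffic_out, traffic_in) / traffic_in.sum()
    print(np.round(matrix, 2))  # synthetic origin-destination demands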


Cloud Essential to R&D in Australia: NICTA
Computerworld Australia (12/10/10) Chloe Herrick

National ICT Australia (NICTA) is using cloud computing to access computational and storage resources at an unprecedented scale. "The ability to process literally billions and billions of records of data at a very short completion time means we can conduct science experiments in particular domains that we haven't been able to do so before," says NICTA principal research leader Anna Liu. "The other value of cloud computing is we can use it right now, we do not necessarily have to spend a lot of time to secure a large infrastructure grant in order to build up our own compute clusters and then to do science experiments with it." Microsoft recently announced a partnership with NICTA, the Australian National University, and the Commonwealth Scientific and Industrial Research Organization to provide the organizations with three years of free access to Microsoft's Windows Azure cloud computing platform. "What we need to do is let scientists be scientists; they don't want to be system administrators, they want to focus on the science and be able to access very large amounts of data and the tools to analyze that data in easy ways, and they want to be able to do it from their desktop," says Microsoft Research director Dennis Gannon.


UCLA Receives $5.5M for Ongoing Research on High-Speed, High-Density Computer Memory
UCLA Newsroom (12/09/10) Wileen Wong Kromhout; Matthew Chin

University of California, Los Angeles (UCLA) researchers recently received a $5.5 million U.S. Defense Advanced Research Projects Agency grant to continue developing technology that could lead to low-power computers that need almost no start-up time when activated. The researchers are working on a high-speed, high-capacity computer memory, known as spin-transfer torque magnetoresistive random access memory (STT-RAM), which is compatible with current standards and has advantages over other types of memory such as dynamic random access memory (DRAM) and static random access memory (SRAM). The researchers say that STT-RAM could combine the benefits of DRAM and SRAM, as well as flash memory common in USB drives, into a single scalable memory technology with great endurance and extremely low power requirements. The researchers recently completed the first phase of the project a year ahead of schedule by meeting the standards for speed, energy consumption, and stability for STT-RAM bits. As part of the second phase, the team will improve the energy and stability metrics and build prototype STT-RAM chips. "An important emphasis of the second phase ... will be statistical studies needed to facilitate integration with CMOS to realize a product," says UCLA researcher Pedram Khalili.


What Social Networks Reveal About Interaction
Irish Times (Ireland) (12/10/10) Karlin Lillington

Social network systems can reveal insights into how groups of people can efficiently analyze, filter, and use information to harness the wisdom of the crowd, according to researchers at the Palo Alto Research Center (PARC). A team led by PARC principal scientist Ed Chi is studying how people work together to produce Wikipedia entries in an attempt to understand how social computing systems can enhance the ability of a group of people to remember, think, and reason. Wikipedia's informative entries are assumed to be the result of benevolent cooperation, but the team has found that conflict actually drives the productivity. Meanwhile, Wikipedia's growth model shows that, after its initial rapid expansion, the encyclopedia is now growing in a more linear manner, influenced most by members who readily adapt and gain the most expertise. Issues of growth mode and maintenance mode, the latter of which is similar to the stabilization of a biological system, have serious business implications, Chi says. PARC researchers also want to understand what motivates people to contribute to social network systems such as Twitter, and how information gets delivered to groups of people, which could help businesses better filter their information overload. Chi says the research could become useful for businesses because "the optimal distribution of knowledge across an organization is how an organization operates at ultimate efficiency."


Technique Turns Computer Chip Defects Into an Advantage
OSU News (12/09/10) Pam Frost Gorder

Tiny defects on computer chips can be used to adjust the properties of important atoms on the chip, according to Ohio State University (OSU) researchers. The method entails reconfiguring the holes left by missing atoms to adjust the characteristics of the chemical impurities, known as dopants--in this case, manganese--which give the semiconductors in computer chips their special properties. "Once industry takes this effect into account, our discovery could not only enable future computers with faster speeds, but could also enable new paradigms for computing--based, for example, on quantum mechanics," says OSU professor Jay Gupta. The researchers found that an arsenic atom on the chip's surface could be moved by pushing it into an adjacent hole, leaving a new hole where the atom had been. They discovered that moving the atoms around changed the dopant's energy level. "A dopant's ability to donate an electron or take it away depends on its binding energy, and by creating a vacancy near the manganese, we were tuning its binding energy," Gupta says. He says chip manufacturers could use the discovery to tune the tiny electric fields across the metal electrodes that already exist on chips. The method also could be used to improve current chip designs or to develop new ones.


New Application Allows Scientists Easy Access to Important Government Data
Rensselaer Polytechnic Institute (12/10/10) Gabrielle DeMarco

Computer scientists at Rensselaer Polytechnic Institute's Tetherless World Constellation (TWC) have developed US Government Dataset Search, software that uses Semantic Web technology to make it easier to access the U.S. government's Data.gov data warehouse. The scientists say that US Government Dataset Search provides usable, relevant, searchable, and replicable datasets on many government-related issues. The software works with Elsevier's SciVerse Web sites. "Elsevier's tool-based systems show a new way for publishers to join this movement without sacrificing copyrights," says Rensselaer professor James Hendler. The application displays a customized list of government datasets that are closely related to the topic that is being searched for. The application also searches TWC's Linking Open Government Data portal, which hosts Data.gov datasets that have been enhanced with Semantic Web technologies. "Through this application and others developed within the Tetherless World, we are empowering researchers with new tools for the basic practice of science by introducing semantics into the exploration of data," says Rensselaer's John Erickson.
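
Semantic Web portals like this typically expose their data through SPARQL queries. The sketch below shows what querying such a portal from Python might look like using the SPARQLWrapper library; the endpoint URL and vocabulary are placeholders for illustration, not the actual LOGD interface.

    from SPARQLWrapper import SPARQLWrapper, JSON

    # Placeholder endpoint and vocabulary; the real LOGD portal's
    # SPARQL URL and dataset schema may differ.
    sparql = SPARQLWrapper("http://example.org/logd/sparql")
    sparql.setQuery("""
        PREFIX dcterms: <http://purl.org/dc/terms/>
        SELECT ?dataset ?title WHERE {
            ?dataset dcterms:title ?title .
            FILTER regex(?title, "energy", "i")
        } LIMIT 10
    """)
    sparql.setReturnFormat(JSON)
    for row in sparql.query().convert()["results"]["bindings"]:
        print(row["dataset"]["value"], "-", row["title"]["value"])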


Supercomputing Research Opens Doors for Drug Discovery
Oak Ridge National Laboratory (12/09/10) Morgan McCorkle

Oak Ridge National Laboratory (ORNL) researchers have modified an existing program for use on supercomputers that sorts large molecular databases to find certain chemical compounds that can be used as potential drugs and medicines. "Our research is the missing link between supercomputers and the huge data available in molecular databases like the Human Genome Project," says ORNL's Jerome Baudry. "We have an avalanche of data available to us, and now we need to translate that data into knowledge." The software helps researchers find new chemicals that interact with a target, such as a protein, in the body. There are thousands of known proteins and millions of chemicals that can be potential drugs, leading to an enormous number of possible combinations. The new research method enables scientists to take risks with previously untested drug candidates, which could lead to the development of new and exotic drug classes. "Now, every molecule can be examined without worrying about wasting resources," Baudry says. The researchers plan to use ORNL's Jaguar supercomputer to search for chemicals that can treat debilitating diseases.
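
Virtual screening of this kind is embarrassingly parallel: each candidate molecule can be scored against the target independently, which is why it maps well onto machines like Jaguar. The sketch below captures that structure with Python's multiprocessing module; the random "compound library" and toy scoring function are stand-ins for real molecular data and docking software.

    from multiprocessing import Pool
    import random

    random.seed(42)

    # Stand-in library: real screening runs cover millions of molecular
    # structures, not random feature vectors.
    library = [(f"compound_{i}", [random.random() for _ in range(8)])
               for i in range(10_000)]

    def docking_score(item):
        """Toy score standing in for a real docking calculation; each
        worker handles one candidate independently."""
        name, features = item
        return name, -sum(f * w for f, w in zip(features, range(8)))

    if __name__ == "__main__":
        with Pool() as pool:                  # one worker per core
            scores = pool.map(docking_score, library)
        for name, score in sorted(scores, key=lambda s: s[1])[:5]:
            print(name, round(score, 3))      # five best-scoring hits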


President's Council Calls for Universal Data Exchange
InformationWeek (12/09/10) Nicole Lewis

The United States needs a universal data exchange language for facilitating a robust and secure exchange of health information electronically across institutions, according to a report from the President's Council of Advisors on Science and Technology. The report says the U.S.'s chief technology officer, the Office of Management and Budget, and the Department of Health and Human Services (HHS) should devise a set of metrics over the next year for measuring progress on creating an operational health information technology (IT) infrastructure nationwide. The report also found some weaknesses in the federal government's current efforts to spur health IT modernization. "Federal efforts are not optimized to achieve the President's goals of improving the quality of healthcare and reducing its cost," the report says. HHS secretary Kathleen Sebelius says the report notes the huge potential of health IT to improve the overall quality of care, but also points out its many challenges. Sebelius says it takes time to learn new technology, securely sharing information is an issue, and systems can be expensive.


Abstract News © Copyright 2010 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe