Welcome to the December 11, 2009 edition of ACM TechNews, providing timely information for IT professionals three times a week.
HEADLINES AT A GLANCE
Researchers Make Significant Advances in Molecular Computing
University of Kent (12/10/09)
The fundamental limits of molecular computing have been defined by researchers at the University of Kent, who published their findings in the Journal of the Royal Society Interface. The research also examines how fast molecular computers can perform a computation, a question that must be answered in order to design machines that use components of organisms to run calculations inside living cells. The metabolic rate, or the ability to process energy, would determine the speed of bio-molecular computers, says Kent's Dominique Chu. "One of our main findings is that a molecular computer has to balance a trade-off between the speed with which a computation is performed and the accuracy of the result," Chu says. "However, a molecular computer can increase both the speed and reliability of a computation by increasing the energy it invests in the computation." He says this energy could be derived from food sources. Moreover, he believes the findings could be of practical importance for computing in general.
Two Internet Policy Groups Strengthen Ties
The Wall Street Journal (12/11/09) P. B9; Fowler, Geoffrey A.
The Internet Society (ISOC) and the World Wide Web Consortium (W3C) have agreed to work more closely together to promote open technical standards. ISOC also will donate $2.5 million over three years to W3C to aid in its development efforts. "The risk is without open standards, like ISOC and W3C produce, Internet applications that users have available won't interoperate," says W3C's Ralph Swick. W3C will use the ISOC donation to reach out to developers in the startup phase of inventing new technology. Common operating standards are what has allowed the Internet to grow, says ISOC president Lynn St. Amour. "There is no hierarchy or centralized control, so it doesn't limit any possible futures," St. Amour says. "We work to make sure it is kept open--that governments don't put regulatory environments in place that restrict that openness." The ISOC is currently promoting the adoption of Internet Protocol version 6 (IPv6), a new Internet address standard that likely will eventually replace the IPv4 protocol, which most Internet services use today. IPv6 would address the limited number of remaining addresses in the current standard. Although the technical foundation for IPv6 is ready, ISOC is still trying to have the new standard adopted on devices that connect to the Internet.
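The scale of the address-exhaustion problem IPv6 solves can be seen with a quick back-of-the-envelope calculation (a minimal sketch; the figures follow directly from the 32-bit and 128-bit address widths of the two protocols, not from the article):

```python
# Compare the address spaces of IPv4 and IPv6.
ipv4_addresses = 2 ** 32    # IPv4 uses 32-bit addresses
ipv6_addresses = 2 ** 128   # IPv6 uses 128-bit addresses

print(f"IPv4: {ipv4_addresses:,} addresses")            # about 4.3 billion
print(f"IPv6: {ipv6_addresses:.3e} addresses")          # about 3.4e38
print(f"Ratio: {ipv6_addresses / ipv4_addresses:.1e}")  # about 7.9e28
```

IPv4's roughly 4.3 billion addresses were a practical ceiling even in 2009, whereas IPv6's 2^128 space removes address scarcity as a constraint for the foreseeable future.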
A Week to Focus on Computer Science Education
National Science Foundation (12/08/09) Cruikshank, Dana W.
The U.S. House of Representatives has designated Dec. 6-12 Computer Science Education Week to help raise awareness of the dearth of computer science in grades K-12 and the need to remedy the situation. A 2002 Computer Science Teachers Association (CSTA) survey determined that only 17 percent of teachers mentioned having a state-mandated computer science curriculum at the high school level, while just 1 percent said the course was obligatory. That is in contrast to a 2006 CSTA report estimating that computer science departments produced more than 45,000 baccalaureate degrees and 850 Ph.D.s that year. Countries such as Canada, New Zealand, and Israel, unlike the United States, have put in place a comprehensive computer science curriculum. The United States has to do the same for grades K-12 if it wishes to keep up with the changing global economy. The U.S. Bureau of Labor Statistics estimates that 854,000 professional information technology (IT) jobs will be added between 2006 and 2016, and all of these jobs will demand a solid computer science background. Furthermore, many professions outside of the IT industry--including marketing, journalism, and the creative arts--now require proficiency in computer science and computer programming. The National Science Foundation has set up new research grant programs to improve computer science education, one of which is the Broadening Participation in Computing program, which funds research on increasing participation in computer science education.
NTU, IBM Join Hands to Drive Service Innovation
Nanyang Technological University (Singapore) (12/09/09)
Nanyang Technological University (NTU) and IBM are working with government, academia, and industry partners to develop the Service Science, Management, and Engineering (SSME) program in Singapore. The partners also will offer jobs and internships to SSME graduates and students, and provide case studies for industry-academic projects during the second phase of the relationship. "We are happy to extend our partnership with IBM, and we look forward to seeing a more concerted effort in incorporating SSME into our curriculum and research work," says NTU professor Tan Ah-Hwee. The SSME academic curriculum was created to help produce a new breed of multidisciplinary and skilled professionals for the services sector, which accounted for 65.6 percent of Singapore's gross domestic product in 2008. The economy is becoming more technology-based and services-led, and Singapore needs professionals who have a combination of business, technology, and social sciences skills. "These partnerships have helped the university produce graduates and research that is of relevance to the industry through the years," Tan says.
So, That Explains the Headache
Investor's Business Daily (12/08/09) P. A5; Deagon, Brian
A new University of California, San Diego (UCSD) study found that the average U.S. citizen consumes 34 gigabytes of information per day outside of the workplace, and overall U.S. households consumed approximately 3.6 trillion gigabytes of information in 2008. "It's a snapshot of the information revolution," says UCSD report co-author Roger Bohn. The study focused on information people come in contact with through TV, radio, computers, phones, print media, music, and theatrical movies--all outside the workplace. Even with the Internet boom of the last 20 years, the study found that TV still accounts for more than 42 percent of time spent receiving information, the most of any medium. However, when measured in bytes, computers and video games account for 54.6 percent of total data entering the home, largely due to game consoles that create huge streams of graphics data. TVs accounted for 34.7 percent of bytes delivered. Bohn says the personal computer industry has done more to take advantage of processing power advancements than the TV industry. The study also found that about 70 percent of U.S. residents play some kind of computer or video game, a number that surprised researchers. Bohn predicts that data consumption will only increase in the future, thanks in part to the growing distribution of high-definition TVs and smartphones.
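The study's headline figures are roughly self-consistent, as a quick sanity check shows (a sketch only; the 300 million population figure is an assumption for illustration, not from the article):

```python
# Rough cross-check of the UCSD study's totals: 34 GB per person per day,
# scaled to a full year across an assumed U.S. population of 300 million.
gb_per_person_per_day = 34
us_population = 300_000_000   # approximate 2008 U.S. population (assumed)
days_per_year = 365

total_gb = gb_per_person_per_day * us_population * days_per_year
print(f"{total_gb:.2e} GB per year")  # on the order of the reported 3.6 trillion GB
```

The result, about 3.7 trillion gigabytes, lands close to the study's 3.6 trillion figure, suggesting the per-person and aggregate numbers were derived from the same underlying data.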
New Silicon-Germanium Nanowires Could Lead to Smaller, More Powerful Electronic Devices
UCLA Newsroom (12/09/09) Kromhout, Wileen Wong
Silicon-germanium semiconducting nanowires that could potentially be used in next-generation transistors have been successfully grown by researchers from the University of California, Los Angeles (UCLA); Purdue University; and IBM. UCLA professor Suneel Kodambaka says the nanowires could help accelerate the development of smaller, faster, and more powerful electronics. The researchers demonstrated that they could produce nanowires layered with different materials that were free of defects and atomically sharp at the junction. "The nanowires are so small you can place them in virtually anything," Kodambaka says. "Because of their small size, they are capable of having distinctly different properties, compared to their bulk counterparts." The nanowires are created by first melting tiny particles of a gold-aluminum alloy in a vacuum chamber, and then injecting a silicon-impregnated gas that causes silicon to precipitate and form wires under the droplets. Germanium wires are then produced by introducing a germanium vapor to the chamber. Kodambaka says that silicon-germanium nanostructures have thermoelectric applications, in which heat is transformed into electricity.
Smart CCTV Learns to Spot Suspicious Types
New Scientist (12/09/09) Fleming, Nic
An international team of computer scientists led by Shaogang Gong at Queen Mary, University of London is developing intelligent video-surveillance software designed to spot suspicious individuals for a next-generation closed-circuit television system called Samurai. The system employs algorithms to profile behavior, and it also can account for changes in lighting conditions so it can track people as they move from one camera's viewing field to another. The system also can learn the probable routes people will take as well as follow targets as they move in a crowd, zeroing in on their distinctive shape, their luggage, and the people they are walking with. The system issues alerts when it spots deviant behavior, and is designed to adjust its reasoning according to feedback from the operator. "The use of relevant feedback from human operators will be a very important part of these technologies," says Paul Miller of Queen's University's Center for Secure Information Technologies. "The key is developing learning algorithms that work not only in the lab but that are robust in real-world applications." The Samurai team demonstrated a prototype system in November; using footage captured at Heathrow airport, the system successfully recognized potential threats that human operators may have missed.
The Battle to Preserve an Old Accelerator's Data
Technology Review (12/11/09)
CERN's Andre Holzner and colleagues are attempting to preserve the 100 terabytes of information generated since 1989 by the Large Electron Positron (LEP) collider, which was shut down in 2000 and dismantled. Transferring that amount of data is not a major concern, considering that LEP's replacement, the Large Hadron Collider, will shortly produce 100 terabytes of information every day. The harder problem is preserving the software needed to make sense of the data, as much of the original LEP software was written in Fortran. Even if expertise in Fortran is preserved, Holzner and colleagues might be unable to show future researchers how the raw data was processed into the form in which it appears in scientific journals, because the physicists stored most of the analytical software in personal directories, which were deleted a year after they left the lab. Moreover, some of the original data has been lost, likely because overuse caused wear and tear on LEP's storage tapes.
Light-Generating Transistors to Power Labs on Chips
ICT Results (12/11/09)
European researchers engaged in two projects, ILO and OLAS, investigated the more efficient extraction of light from organic thin films fashioned from carbon-based plastics. The objective of the initiatives was the development of an electrically powered laser. The project partners' expertise and experience in a number of technologies were combined to develop a multifunctional transistor. "Not only did we create a fully functional electronic device in the form of a field-effect transistor, but we were also able to get it to generate light," says project coordinator Michele Muccini. The results of the OLAS project are viewed as an international benchmark for what Muccini terms the photonic field-effect heterojunction approach. "Using transistors instead of external sources allows you to greatly increase the efficiency of light generation and extraction," he says. "You expend much less energy driving devices because they are not only efficient but made from disposable organic material which is compatible with other platforms made from other material like silicon or glass." Muccini says several project partners are proposing to incorporate the new framework into lab-on-a-chip devices for biomedical diagnosis. "What this would eventually mean to a doctor on the ground is the development of an affordable, portable, disposable device able to screen for a number of illnesses," he says.
Facebook (and Systems Biologists) Take Note: Network Analysis Reveals True Connections
Northwestern University News Center (IL) (12/07/09) Fellman, Megan
Northwestern University researchers Roger Guimera and Marta Sales-Pardo have developed a universal method that can correctly analyze a variety of complex networks. The researchers tested their method on five diverse networks: a karate club, a social network of dolphins, the neural network of a worm, the air transportation network of Eastern Europe, and the metabolic network of Escherichia coli. For each of the five networks, the researchers introduced errors and applied an algorithm to the distorted network. Each time, the algorithm created a new network with the errors separated out, and each new network construction was closer to the original true network. "The flexibility of our approach, along with its generality and its performance, will make it applicable to many areas where network data reliability is a source of concern," say Guimera and Sales-Pardo. The central idea of the new method is that, even though every network has unique characteristics, they all have nodes that can be put into specific groups, with the nodes connecting to each other based on group membership. The method averages all possible groupings of the nodes and gives each group a weight that reflects its explanatory power. "There are many ways to map nodes in a network, not just one," says Sales-Pardo. "We consider all the possible ways. By taking the sum of them all, we can identify both missing and spurious connections."
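The partition-averaging idea can be sketched on a toy graph. This is a simplified illustration of the approach described above, not the authors' exact algorithm; the example graph and the scoring details are assumptions made for the demonstration:

```python
from itertools import combinations
from math import exp, lgamma

def partitions(nodes):
    """Enumerate every way of splitting `nodes` into groups."""
    if not nodes:
        yield []
        return
    first, rest = nodes[0], nodes[1:]
    for smaller in partitions(rest):
        for i, block in enumerate(smaller):
            yield smaller[:i] + [block + [first]] + smaller[i + 1:]
        yield smaller + [[first]]

def link_score(nodes, edges, i, j):
    """Average the group-to-group connection probability of pair (i, j)
    over all partitions, weighting each partition by how well it
    explains the observed edges."""
    edge_set = {frozenset(e) for e in edges}
    total_w, score = 0.0, 0.0
    for part in partitions(nodes):
        group_of = {n: g for g, block in enumerate(part) for n in block}
        pairs, hits = {}, {}
        for a, b in combinations(nodes, 2):
            key = tuple(sorted((group_of[a], group_of[b])))
            pairs[key] = pairs.get(key, 0) + 1
            if frozenset((a, b)) in edge_set:
                hits[key] = hits.get(key, 0) + 1
        # Weight: likelihood of the observed edges, integrated over the
        # unknown connection probability of each group pair.
        w = 1.0
        for key, r in pairs.items():
            l = hits.get(key, 0)
            w *= exp(lgamma(l + 1) + lgamma(r - l + 1) - lgamma(r + 2))
        key_ij = tuple(sorted((group_of[i], group_of[j])))
        score += w * (hits.get(key_ij, 0) + 1) / (pairs[key_ij] + 2)
        total_w += w
    return score / total_w

# Two triangles {0,1,2} and {3,4,5} joined by edge (2,3), with the
# within-triangle edge (0,1) deliberately removed.
nodes = list(range(6))
edges = [(1, 2), (0, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
print(link_score(nodes, edges, 0, 1))  # missing within-group link
print(link_score(nodes, edges, 0, 4))  # unlikely cross-group link
```

On this toy graph, the deliberately removed edge (0, 1) receives a noticeably higher score than the cross-group pair (0, 4), which matches the behavior the article describes: partitions that explain the observed data well dominate the average and flag the missing connection.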
Bristol Postdoc Shares in High-Performance Computing Prize
University of Bristol News (12/07/09)
University of Bristol postdoctoral researcher Rio Yokota was a member of the team that won the Gordon Bell prize in the price-per-performance category of high-performance computing at the recent SC09 conference, the first time the prize has been awarded in that category since 2001. Yokota carried out the programming remotely, while the actual calculations were run on a cluster of graphics processing units (GPUs) at Nagasaki University. At Bristol, Yokota is developing the fast multipole method for GPUs with a group led by Lorena Barba in the Department of Mathematics. Yokota collaborated with researchers in Japan, and in November the team reached a sustained performance of 57.3 teraflops (Tflops) on a cluster of 760 GPUs that cost $428,134. At 138 megaflops per dollar, the mark is 32 times better than that of the 2001 price-per-performance winner. The level also is an increase from the 42 Tflops performance Yokota and colleagues recorded in a paper published in August 2009.
Scientists, IT Community Await Exascale Computers
Computerworld (12/07/09) Thibodeau, Patrick
The information technology (IT) industry discussed the challenges it faces in developing exascale systems, a new generation of supercomputers that promise to be far more powerful than existing technology, during the recent SC09 supercomputing conference in Portland, Oregon. With a peak performance of 2.3 petaflops, Jaguar, a Cray XT5 system at Oak Ridge National Laboratory, is the fastest supercomputer today; an exaflop, by contrast, equals 1,000 petaflops. Exascale systems are expected to appear by 2018, but developers have to improve hardware performance while limiting power consumption. Exascale supercomputers will need to use less memory per core and more memory bandwidth, and systems running 100 million cores will have to cope with continuous core failures. The IT community will have to rethink such issues "in a dramatic kind of way," says IBM's Dave Turek. "There are serious exascale-class problems that just cannot be solved in any reasonable amount of time with the computers that we have today," says Buddy Bland, project director at the Oak Ridge Leadership Computing Facility.
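The performance gap between today's fastest machine and an exascale system can be made concrete with a small calculation (a sketch only; the 10^21-operation workload is an assumed figure chosen for illustration, not from the article):

```python
# Time to complete a fixed workload on Jaguar versus a hypothetical
# exascale machine, assuming sustained performance at the quoted rates.
jaguar_flops = 2.3e15     # Jaguar peak: 2.3 petaflops
exascale_flops = 1.0e18   # 1 exaflop = 1,000 petaflops

workload_ops = 1e21       # assumed workload size for illustration

hours_jaguar = workload_ops / jaguar_flops / 3600
hours_exascale = workload_ops / exascale_flops / 3600
print(f"Jaguar:   {hours_jaguar:.1f} hours")    # roughly five days
print(f"Exascale: {hours_exascale:.2f} hours")  # under 17 minutes
```

A job that would occupy Jaguar for about five days would finish on an exascale machine in under twenty minutes, which is why the exascale-class problems Bland mentions are out of reach today.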
San Diegans--and Their Cell Phones--Will Help Computer Scientists Monitor Air Pollution
UCSD News (12/04/09) Kane, Daniel
University of California, San Diego (UCSD) computer scientists are developing a network of environmental sensors called CitiSense designed to enable cell phone users to receive up-to-the-minute information on air quality. CitiSense derives its data from information gathered from hundreds, and eventually thousands, of sensors attached to San Diego residents as they move about the city. However, monitoring air quality is just one of several tasks the UCSD researchers hope CitiSense will be able to complete. The goal is to build a wireless network in which thousands of environmental sensors carried by the public rely on cell phones to transport information to central computers. The information will be analyzed and distributed back to individuals, public health agencies, and San Diego at large. The researchers say that building a large-scale network that integrates sensors and other digital technology into the physical world will require advancements in a number of computer science areas. "It is a tremendous challenge to integrate a number of technologies and then deploy them outside--in the wild," says UCSD professor William Griswold. The UCSD team is working on powering the sensors by solar, wind, or vibrational energy instead of batteries. Other hurdles the researchers must overcome include security and privacy issues, software architecture, and cyberinfrastructure.
Abstract News © Copyright 2009 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: [email protected]
Change your Email Address for TechNews (log into myACM)