Association for Computing Machinery
Welcome to the February 20, 2009 edition of ACM TechNews, providing timely information for IT professionals three times a week.

HEADLINES AT A GLANCE


CSIRO: Mathematical Firepower (and Cash) Key to Solving Computing Problems
Computerworld Australia (02/20/09) McKinnon, Emma

Louise Ryan of Australia's Commonwealth Scientific and Industrial Research Organisation (CSIRO) says increasing funding for the mathematical sciences is essential for the development of computer-based research. "In recent years, funding has been cut to support the quantitative sciences," Ryan says. "This is critical, it's fundamental, it underlies every other field of science and you can't cut corners by cutting funding for mathematical research and training." Ryan oversees CSIRO mathematical and statistical scientists working on applications in agriculture, light metal production, environmental modeling, genetics, supercomputing, financial risk, and transport logistics, among others. The researchers provide mathematical and statistical modeling to support a variety of national projects. Ryan says these areas require very sophisticated mathematical and quantitative techniques to characterize unique situations, as well as sophisticated computing platforms to run the resulting simulations. "If you don't have those really strong underpinnings in the mathematical, statistical, and quantitative sciences, you can't get to the applied answers that you need," she says. Ryan says committing more money to the development of advanced computing technology could help Australia keep up with the demands of modern science. "The ability to transmit and gather information is just changing at an exponential rate," she says. "And so our statistical and mathematical fields have to evolve to be able to handle that data and that's where that very advanced computing comes in." She says Australia also must maintain a cutting-edge professional environment to prevent young and talented Australian scientists from looking for work in other countries.


New Method to Assemble Nanoscale Elements Could Transform Data Storage Industry
UC Berkeley News (02/19/09) Yang, Sarah

Scientists at the University of California, Berkeley and the University of Massachusetts (UMass) Amherst say a new technique in which nanoscale elements precisely assemble themselves over large surfaces could lead to significant improvements in data storage media. "I expect that the new method we developed will transform the microelectronic and storage industries, and open up vistas for entirely new applications," says UMass Amherst professor Thomas Russell. Russell created the new approach with UC Berkeley professor Ting Xu. Xu says the density achievable with the new technology could potentially allow the contents of 250 DVDs to be stored on a surface the size of a quarter. Russell and Xu discovered a new way to create block copolymers, chemically dissimilar polymer chains that join together by themselves. Xu says the molecules in block copolymers self-assemble into an extremely exact, equidistant pattern when spread on a surface. For over a decade, researchers have been trying to exploit this characteristic for semiconductor manufacturing, but have been limited because the order starts to break down as the size increases. Russell and Xu overcame this problem by laying the film of block copolymers onto the surface of a commercially available sapphire crystal, which, when cut at an angle and heated to 1,300 to 1,500 degrees Celsius for 24 hours, reorganizes its surface into a highly ordered pattern that can be used to guide the self-assembly of the block copolymers. Russell and Xu were able to achieve defect-free arrays of nanoscopic elements with features as small as 3 nanometers, creating a density of 10 terabits per square inch.


Video: Computer Science Meets the Grid
EE Times (02/13/09) Merritt, Rick

University of California, Berkeley professor Randy Katz says future computers need to be designed to meet the energy efficiency needs of data centers, and the power grid needs to adapt and become more like the Internet. At a gathering of Berkeley researchers, Katz demonstrated that computers use far too much energy when not in use, and described the trend toward massive data centers that will contain warehouses full of containers that house computers, communications, power, and cooling systems. "It used to be you designed computers at the chip level, then we moved to the board level, and then racks," Katz says. "Now something like a 20-foot container is the basic building block of the aggregated systems we are building, and there is a completely new set of knowledge needed to build it." Katz says tomorrow's million-server data centers represent the industrialization of the information technology industry. Katz also described a new effort at Berkeley called LoCal, which is exploring the technologies that are critical to upgrading the modern electric distribution grid. Katz called for a future grid capable of generating, storing, and switching energy flows on demand, similar to the Internet. "This will be a switching network that has energy and data flows integrated into it," he says. The switching network could allow users to buy and sell power based on demand. LoCal will develop algorithms that would enable power-based transactions.


Princeton Computer Scientists Guide Internet Transparency Project
Princeton University (02/18/09) Emery, Chris

Princeton University's Measurement Lab project is investigating the inner workings of Internet traffic. The Measurement Lab features a global network of computers that computer scientists can use to explore how data travels across the Internet, with the goal of creating transparency in the debate over who should regulate Internet traffic. The project will provide tools to the general public, and may help Internet users understand why their connection to the Internet can be slower at certain times. "Many debates about the Internet and policy would benefit from more and better facts," says Edward Felten, the director of Princeton's Center for Information Technology Policy. "Measurement Lab provides that. It provides a facility for aggregating information and fostering a fact-based public debate about what's going on and what government should do." The Measurement Lab research network was launched with three servers in Google's Mountain View, Calif., headquarters, and will soon expand to 36 servers in 12 locations worldwide. In addition to Princeton, Google, and the New America Foundation, a public policy think tank sponsoring the initiative, project participants include several universities and computer science research centers, and project organizers hope the network will expand further. The network will be used to test Internet connections for speed and functionality by sending data between Measurement Lab computers, which will be able to measure precisely when data leaves and arrives at a computer.
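The article's last point, measuring precisely when data leaves and arrives at a computer, is the core of this kind of testing. The sketch below is a minimal illustration of round-trip timing against a loopback echo server; it is not Measurement Lab's actual tooling, and the server, port selection, and payload are all illustrative choices.

```python
import socket
import threading
import time

def echo_server(sock):
    """Accept one connection and echo back whatever it receives."""
    conn, _ = sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)

def measure_rtt(host, port, payload=b"ping"):
    """Record when data leaves and when the echo arrives; return the round trip in seconds."""
    with socket.create_connection((host, port)) as s:
        t_sent = time.perf_counter()
        s.sendall(payload)
        s.recv(1024)
        t_received = time.perf_counter()
    return t_received - t_sent

# Stand up a loopback echo server and time one round trip.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

rtt = measure_rtt("127.0.0.1", port)
print(f"round-trip time: {rtt * 1000:.3f} ms")
```

Running the same measurement against geographically distributed servers, as Measurement Lab does, is what lets researchers compare connection speed and functionality across providers and times of day.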


SIGGRAPH 2009 Seeks Live Real-Time Video Games Content
Business Wire (02/11/09)

The SIGGRAPH 2009 Computer Animation Festival will include a segment on real-time rendering projects, in which the top selections will be played and demonstrated live on various platforms. A jury of industry professionals and experts will judge the entries on creativity, innovation, performance, and the ability to render in real time in front of the live audience. Selected contributors will have an opportunity to show their projects to the Computer Animation Festival Evening Theater audience, and to showcase their work in The Sandbox, where attendees get a hands-on gaming experience. "The addition of live, real-time rendering work is going to make this one of the most dynamic and innovative festivals in SIGGRAPH history," says Ronen Barzel, SIGGRAPH 2009 conference chair. "The gaming industry has been a substantial innovator in real-time graphics, and now the best gaming work can be showcased at SIGGRAPH alongside the latest developments in research, science, art, and technology from around the globe." The real-time rendering segment will focus on mathematical or other industrial simulations, research projects, real-time artist explorations, new uses of pioneering technology, and scientific visualizations. SIGGRAPH 2009 will be held in New Orleans from August 3-7 at the Ernest N. Morial Convention Center.


The Case for 4D Immersive Holographic Spaces
Computing Community Consortium (02/17/09) Bajcsy, Ruzena; Nahrstedt, Klara

The United States continues to fall further behind Asian and European nations in broadband penetration, which is hindering the development of consumer applications and new industries, write University of California, Berkeley professor Ruzena Bajcsy and University of Illinois at Urbana-Champaign professor Klara Nahrstedt. They argue that new high-bandwidth applications such as tele-immersive systems are needed to justify the investment in extending broadband. Bajcsy and Nahrstedt say that quick action is needed to ensure that American universities and industries can capitalize on the next generation of tele-immersive systems, such as four-dimensional (4D) Immersive Holographic Spaces. 4D Immersive Holographic Spaces are joint multi-view, multimedia-rich spaces that allow people to immerse themselves in a shared cyber-physical space with other people to perform physical activities or observe detailed full-body social behaviors and communication cues in real time. The professors say that such systems would create new economic opportunities, strengthen the energy-efficiency movement, and decrease the divide between regions with expert resources and those without. If major investments are committed to developing and building a national tele-immersive infrastructure, the United States could become the leader in broadband information-rich immersive spaces. Future tele-immersive systems will require significant advancements in real-time computer vision and graphics; the integration of speech, vision, and tactile sensory information; dynamic and task-dependent signal compression; broadband wired and wireless networking; and operating systems and architectures.


Lighting Up the Lives of the Elderly - Adaptively
ICT Results (02/20/09)

Researchers working on the European Union-funded Aladin project say lighting that can adapt automatically to meet a user's needs could have significant benefits for anyone who spends long periods in artificially lit buildings. "Studies have shown that the quality and type of lighting can have a significant impact on our health and comfort," says project coordinator Edith Maier, a researcher at Austria's Vorarlberg University of Applied Sciences. Aladin united academic and industrial researchers from Austria, Germany, Hungary, Italy, and Romania in an effort to develop an ambient lighting system that can intelligently adapt to a person's needs and wishes. The system uses biosensors worn by the occupants of a room or building to adjust the lighting settings. The researchers want to use the technology to improve the well-being of the elderly, people suffering from age-related illnesses, and people with reduced mobility. Maier says poor lighting can exacerbate vision problems and reading difficulties, cause depression, and disrupt sleep. Most adaptive ambient lighting systems rely on a preset time cycle to brighten and dim at certain times. The Aladin system instead uses data from sensors in a glove worn by the user to measure heart rate and skin conductance response, which varies when a user is active or at rest. The bio-data tells the system when to be in a bright active setting and when to create a more subdued, relaxing atmosphere.
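The sensor-to-lighting rule described above can be sketched as a simple decision function. The thresholds and brightness levels below are purely illustrative assumptions, not values from the Aladin project, which presumably uses a more sophisticated adaptation model.

```python
def lighting_level(heart_rate_bpm, skin_conductance_us,
                   active_level=1.0, rest_level=0.3):
    """Hypothetical rule: bright light when the wearer's bio-signals suggest
    activity, a subdued level when they suggest rest.

    The 80 bpm and 5.0 microsiemens thresholds are illustrative only.
    Returns a brightness fraction between 0 and 1.
    """
    active = heart_rate_bpm > 80 or skin_conductance_us > 5.0
    return active_level if active else rest_level

print(lighting_level(95, 6.2))   # elevated readings -> bright, active setting
print(lighting_level(62, 2.1))   # resting readings -> subdued setting
```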


San Diego Supercomputer Center Begins Cloud Computing Research Using the Google-IBM CluE Cluster
UCSD News (02/17/09) Zverina, Jan

The National Science Foundation (NSF) has awarded a two-year, $450,000 grant to San Diego Supercomputer Center (SDSC) researchers to investigate new ways to dynamically provision and manage massive data sets hosted on immense, Internet-based commercial computer clusters. The grant focuses on Google and IBM's distributed computing resource, or Cluster Exploratory (CluE), which the NSF praised as an alliance between private enterprise and the federal science agency to broaden the research infrastructure's accessibility to institutions throughout the United States. "The broader impact of this research will be to reassess how scientific data archives are implemented, and how data sets are hosted and served to the scientific community at large, using on-demand and dynamic approaches for provisioning data sets as opposed to the current static approach," says SDSC distinguished scientist Chaitan Baru. "This project has the potential to offer scientific researchers compute clouds as a complement to conventional supercomputing architectures used today, while creating new tools and techniques for commercial cloud computing." The cloud computing model is perceived as a way for researchers to migrate from local to off-site data set processing and management using commercially managed data clusters. SDSC's effort will concentrate on using its GEON LiDAR Workflow application, which enables users to subset remote sensing data stored as "point cloud" data sets, process it with different algorithms, and visualize the output. SDSC researchers also think that user communities can be exposed to the CluE-based or similar cluster-based deployments through a service-oriented architecture for access via other scientific workflow environments, visualization tools, and portals.


Trapped Rainbows Could Make Optical Computing a Reality
New Scientist (02/14/09) No. 2695, P. 22; Ananthaswamy, Anil

If successful, a technique that traps a rainbow inside a specially designed grating could revolutionize computing and telecommunications networks. In 2007, University of Surrey researcher Ortwin Hess and colleagues demonstrated that they could, in theory, trap light inside a tapering waveguide, a structure that guides light waves along its length, made from a substance that can radically bend light. As the waveguide narrows, each frequency of the light stops in a different place, creating a trapped rainbow of sorts. This model only worked at terahertz frequencies, which will not work for computing, but Lehigh University's Filbert Bartoli and Qiaoqiang Gan repeated the process using visible and infrared light. Bartoli and Gan simulated a silver grating 25 micrometers long, with a depth that gradually changed from 140 to 230 nanometers over its length. Light enters the grating and interacts with surface plasmons, which are electromagnetic waves on the surface of the metal. As light travels down the grating, the waves are slowed, with different frequencies becoming trapped at different spots. Numerical analysis shows that visible light can be stored for a few picoseconds using this technique, which could have a significant impact in numerous applications.


Development of Mobile Applications Is Easy With the Elephant Platform
Fraunhofer Institute (02/17/09) Baumer, Susanne

The Fraunhofer Institute for Communications Systems ESK's Elephant research project has produced a software platform that makes it easier for programmers to develop applications for mobile devices. With the Elephant platform, which is based on Web technologies that facilitate the compiling and preparing of existing information, developers only need to understand their content and how they want it disseminated. Elephant makes use of templates and a drag-and-drop interface to give an application its structure, and a package is generated with a few mouse clicks to prepare the content for use on different mobile devices. Elephant supports text and XML formats, and diverse image, audio, and video files. Developers can make the templates available to other users with Web 2.0 technologies. The Fraunhofer Institute plans to unveil Elephant at the Mobile World Congress in Barcelona.


Google, Yahoo and Microsoft Collaborate to Clean Up Web
New York Times (02/12/09) Helft, Miguel

Yahoo! and Microsoft announced that they will support Google's technology for detecting duplicate pages on Web sites. Web publishers will be able to use Google's new Web standard, dubbed the Canonical Link Tag, to specify the principal URL that search engines should index. Companies doing business online sometimes have large sites with multiple URLs that all point to the same page, which can confuse search engines and cause them to index the same page multiple times. Some experts believe up to 20 percent of URLs may be duplicates. "There is a lot of clutter on the Web and with this, publishers will be able to clean up a lot of junk," says Matt Cutts, an engineer who heads the spam-fighting efforts at Google. "This is a clear benefit for publishers as it gives them an opportunity to get more exposure through search engines," says Microsoft's Nathan Buggia. Most search engines have technology to sort out duplicates, but Google's standard will make it easier for both publishers and search engines to address the problem.
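The tag itself is a single line in a page's head section: `<link rel="canonical" href="...">`, pointing at the URL the publisher wants indexed. The sketch below shows how a crawler might honor it, collapsing several duplicate URLs to one index entry; this is an illustration of the idea, not Google's implementation, and `example.com` and the URLs are made up.

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Extract the href of a <link rel="canonical"> tag, if the page declares one."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

def canonical_url(page_url, html):
    """Return the URL a crawler should index for this page."""
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical or page_url   # no tag: fall back to the fetched URL

# Three URLs that serve the same product page, all declaring one canonical URL.
page = ('<html><head>'
        '<link rel="canonical" href="http://example.com/product">'
        '</head></html>')
for url in ["http://example.com/product?sessionid=42",
            "http://example.com/product?ref=sidebar",
            "http://example.com/product"]:
    print(url, "->", canonical_url(url, page))
```

A search engine that groups pages by this canonical URL indexes the content once instead of three times, which is the deduplication benefit the article describes.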


Carnegie Mellon's Synthetic Interview Technology Enables Virtual Chats With Darwin's Ghost
Carnegie Mellon News (02/05/09) Spice, Byron; Sloss, Eric

Researchers at Carnegie Mellon University's Entertainment Technology Center (ETC) and Duquesne University biologists are using ETC's Synthetic Interview Technology to enable virtual chats with Charles Darwin. The researchers have created an interactive experience that enables users to ask the virtual Darwin a wide variety of questions. Users will be able to access more than 15 hours of videotaped responses by Darwin's ghost, portrayed by an actor, and have virtual conversations that are unique to each user. The Ask Darwin system also provides access to more than a dozen modern-day biologists, religious authorities, and other experts who provide alternative views or answers that go beyond Darwin's 19th-century knowledge. "A synthetic interview enables users to interact with video clips so they can explore questions and topics that intrigue or trouble them, rather than passively view a film that simply provides an overview," says ETC's Don Marinelli. Darwin's synthetic interview is now open as a permanent exhibit at the Carnegie Science Center in Pittsburgh, and discussion is underway to provide different versions at other museums throughout the United States, including eventually providing a Web-based version for classroom use. ETC's Synthetic Interview Technology has been used to create interactive experiences with numerous historical figures, but the Ask Darwin implementation is the first time that supplemental materials relevant to an answer also are displayed.
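At its core, a synthetic interview maps a free-form question to the best prerecorded clip. The article does not describe ETC's matching method, so the sketch below uses simple keyword overlap as a stand-in; the clip names and descriptions are hypothetical.

```python
# Hypothetical clip index: filename -> a text description of the recorded answer.
clips = {
    "finches.mp4": "beaks of finches on the Galapagos islands varied with diet",
    "voyage.mp4": "the voyage of the Beagle lasted five years",
    "selection.mp4": "natural selection favors traits that aid survival",
}

def best_clip(question, index):
    """Pick the prerecorded answer whose description shares the most words
    with the user's question (a crude stand-in for real question matching)."""
    q_words = set(question.lower().split())
    def overlap(item):
        return len(q_words & set(item[1].lower().split()))
    return max(index.items(), key=overlap)[0]

print(best_clip("How long was the voyage of the Beagle?", clips))
```

A production system would need far better language understanding than word overlap, which is presumably where the engineering effort in a tool like this goes.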


A New Kind of Counting
Max Planck Society (02/06/09) Abrell, Barbara

Scientists at Germany's Max Planck Institute for Dynamics and Self-Organization (MPIDS) and Cornell University have developed a computer algorithm to crack previously unsolvable counting problems. Researchers visualize such counting problems as a network of lines and nodes, which means only one basic challenge must be met: determining the number of different ways to color the nodes with a certain number of colors without assigning the same color to nodes joined by a line. Depending on the application, a node's color can take on a completely different significance. "The existing algorithm copies the whole network for each stage of the calculation and only changes one aspect of it each time," says MPIDS scientist Frank van Bussel. The researchers instead move through the network on a node-by-node basis, and the program never looks at the entire network but only at the next node. At the first node, the program cannot finalize the color selection, as it would have to know how all the other nodes are linked to each other. Instead, the program notes down a formula for the first node, which contains this uncertainty as an unknown quantity. As the program moves through the network, all the connections are exposed and the unknown quantities are removed. The program's knowledge of the network is complete once it has reached the final node. With conventional methods, the calculation for a square lattice the size of a chessboard is estimated to take many billions of years, but MPIDS's Denny Fliegner says the new program can accomplish it in just seven seconds.
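The counting problem itself is easy to state in code. The brute-force sketch below enumerates every coloring of a small graph and keeps the proper ones; it illustrates what is being counted, not the MPIDS node-by-node algorithm, whose whole point is to avoid this exponential enumeration.

```python
from itertools import product

def count_colorings(num_nodes, edges, num_colors):
    """Count the color assignments in which no edge joins two same-colored nodes.

    Brute force: tries all num_colors ** num_nodes assignments, so it is only
    feasible for tiny graphs -- which is why smarter algorithms are needed.
    """
    total = 0
    for coloring in product(range(num_colors), repeat=num_nodes):
        if all(coloring[u] != coloring[v] for u, v in edges):
            total += 1
    return total

# A square: four nodes joined in a cycle.
square = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(count_colorings(4, square, 3))  # chromatic polynomial of C4 at k=3: (3-1)^4 + (3-1) = 18
```

For an 8x8 lattice with even a handful of colors, the number of assignments to check already exceeds 10^38, which makes concrete why the symbolic, one-node-at-a-time approach described above matters.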


Abstract News © Copyright 2009 INFORMATION, INC.


To submit feedback about ACM TechNews, contact: [email protected]



