Association for Computing Machinery
Welcome to the December 8, 2008 edition of ACM TechNews, providing timely information for IT professionals three times a week.

HEADLINES AT A GLANCE


U.S. Is Losing Global Cyberwar, Commission Says
BusinessWeek (12/07/08) Epstein, Keith

The United States is woefully unprepared for the challenges of 21st century cybersecurity, concludes a new report from the U.S. Commission on Cybersecurity, which calls for the establishment of a Center for Cybersecurity Operations to be supervised by a special White House advisor. The center would function as a new regulator of computer security in both the public and private sectors, while active policing of government and corporate networks would incorporate new rules and a "red team" to test computers for flaws currently being exploited by cybercriminals. The report notes 2007 break-ins at the U.S. departments of Defense, State, Homeland Security, and Commerce, and at NASA and the National Defense University. Both military and corporate networks have been hit by the malicious agent.btz program, for example, and the attacks have become more sophisticated and tougher to track down. The U.S. military has uncovered approximately 7 million unprotected electronic devices. Cybersecurity commission member Tom Kellermann says Homeland Security Department-led initiatives to bolster cybersecurity have been impeded by bureaucratic confusion and by agencies' and corporations' refusal to share information about data breaches. He adds that several members of the commission are working to persuade President-elect Barack Obama to take appropriate action. Obama's July 16 pledge to "declare our cyberinfrastructure a strategic asset" and to "bring together government, industry, and academia to determine the best ways to guard the infrastructure that supports our power" has given members hope, as has his promise to appoint a national cyber advisor who would report directly to the president.


Intel Cites Advance in Using Silicon in Data Products; Claim Is Challenged
Wall Street Journal (12/08/08) P. B7; Clark, Don

Intel announced that it has combined silicon with the element germanium to make an avalanche photo detector, a device that could reduce the cost and increase the speed of optically transmitting computer data. Intel says the discovery marks the first time that a silicon-based optical component has exceeded the performance of an equivalent device made from more expensive materials. However, the importance of Intel's discovery was challenged by researchers at Luxtera, which is already producing silicon-based optical components. Optical communication uses streams of light particles, generated by lasers, that are encoded with information to transmit data at significantly faster speeds than is possible with standard electrical components. However, optical connections are currently used primarily for high-volume, long-distance communications, or in massive supercomputers, due to the high cost of optical components. Researchers are working to incorporate silicon materials into optical components to reduce their cost. Intel's Mario Paniccia says the avalanche photo detector increases capacity to 340 gigahertz, the highest result to date on that key metric of performance. Luxtera CEO Greg Young calls Intel's performance claims "tremendous," but he says Luxtera researchers were the first to best the performance of indium-phosphide photo detectors, more than a year ago. Young also says that Intel's photo detector is incompatible with conventional semiconductor-production processes, so it cannot be used alongside other components on one piece of silicon.


Revolutionary High Speed 'Cloud' Computing Software Announced by New University of Melbourne Start-Up
University of Melbourne (11/27/08)

Australian startup company Manjrasoft Pty Ltd and University of Melbourne researchers are developing Aneka, software that uses the power of networked computers to analyze data at high speeds. Aneka enables cloud computing, offering a choice of multiple processing models for the analysis of large tasks as well as the ability to negotiate task timeframes and pricing. "This is about being able to readily distribute, scale, manage, and quickly reconfigure computing tasks across discrete pieces of infrastructure," says Melbourne professor Raj Buyya, who led the research. Buyya says industries with large processing demands will continue to drive the adoption of cloud technology. "You no longer need to invest in or maintain expensive infrastructure," he says. "Manjrasoft's Aneka software can utilize desktop computers to drive its processing power--hence replacing expensive supercomputers and data centers." The researchers have completed Aneka's prototyping and are about to release a beta version. Ivan Mellado of Melbourne Ventures, the university's commercial branch, says Aneka could provide a powerful way for enterprise developers and independent software developers to deploy computationally and data-intensive applications within a Windows environment.
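
Aneka's programming interfaces are not described in the article; the following is only a minimal Python sketch of the general task-farming idea it alludes to--independent work items distributed to a pool of workers and gathered as they finish--with a local process pool standing in for remote machines and analyze_chunk as a purely hypothetical workload.

# Illustrative task-farming sketch (not Aneka's actual API): independent
# chunks of work are submitted to a worker pool and results collected as
# they complete. A real deployment would dispatch to networked machines.
from concurrent.futures import ProcessPoolExecutor, as_completed

def analyze_chunk(chunk):
    # Hypothetical stand-in for a compute-heavy analysis step.
    return sum(x * x for x in chunk)

def run_job(chunks, max_workers=4):
    results = {}
    with ProcessPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(analyze_chunk, c): i for i, c in enumerate(chunks)}
        for fut in as_completed(futures):
            results[futures[fut]] = fut.result()
    return [results[i] for i in sorted(results)]

if __name__ == "__main__":
    data = [list(range(i, i + 1000)) for i in range(0, 10000, 1000)]
    print(run_job(data))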


Thieves Winning Online War, Maybe Even in Your Computer
New York Times (12/06/08) P. A1; Markoff, John

Malware continues to overcome security professionals' efforts to defend against it. "Right now the bad guys are improving more quickly than the good guys," says SRI International's Patrick Lincoln. As businesses and individuals become increasingly involved in online communities, cybercriminals are given more opportunities to infect machines and commit crimes. The Organization for Security and Cooperation in Europe estimates that credit card thefts, bank fraud, and other online scams rob computer users of $100 billion annually. In late October, the RSA FraudAction Research Lab discovered a cache of 500,000 credit-card numbers and bank account log-ins that were stolen by a network of zombie computers run by an online gang. "Modern worms are stealthier and they are professionally written," says British Telecom chief security technology officer Bruce Schneier. "The criminals have gone upmarket, and they're organized and international because there is real money to be made." Meanwhile, malicious programs are becoming increasingly sophisticated, with some programs searching for the most recent documents on the assumption that they are the most valuable and others stealing log-in and password information for consumer finances. Microsoft researchers recently discovered malware that runs Windows Update after it infects a machine to ensure the machine is protected from other pieces of malware. Purdue University computer scientist Eugene Spafford is concerned that companies will cut back on computer security to save money. "In many respects, we are probably worse off than we were 20 years ago," he says, "because all of the money has been devoted to patching the current problem rather than investing in the redesign of our infrastructure."


Sweeping Changes Coming With Smart Dust
Investor's Business Daily (12/08/08) P. A1; Bonasia, J.; Tsuruoka, Doug

Dozens of companies are developing wireless, nanoscale sensor networks, also known as smart dust, in the hope of enabling a wide range of radical applications, including climate monitoring and advanced drug-delivery systems. The sensors are already being used to monitor building controls, pipelines, factory equipment, and drug-making processes, says Dust Networks founder Kris Pister. He says smart dust can reliably track different industrial systems because the sensors communicate with each other over a mesh of wireless radio links, and that this, combined with their being small enough to be placed almost anywhere, gives them revolutionary potential. Smart dust is based on microelectromechanical systems that can measure temperatures, vibrations, or surface pressures and transmit the readings back to a command computer. Smart dust is finding use in industrial automation and factory applications, and building energy monitoring is another area generating great interest, Pister says. "People see all kinds of potential value in trying new applications, but in many cases the wireless sensor technology is not quite mature enough yet," cautions ARC Advisory analyst Harry Forbes. More industry standards must be approved if wireless smart dust networks are to grow, because manufacturers need to align various sensors with a spectrum of wireless networks and software programs. Pister says the incorporation of encryption and other protective measures keeps smart dust networks from becoming a security risk.
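
The article gives no implementation details, but the mesh-relay idea can be sketched in a few lines of Python: each node takes a reading, and readings hop neighbor to neighbor until they reach a base station. The Node class, the greedy routing, and the temperature values are illustrative assumptions, not Dust Networks' protocol.

# Illustrative mesh-relay sketch: sensor readings hop node-to-node until
# they reach the base station (node 0).
import random

class Node:
    def __init__(self, node_id, neighbors=None):
        self.node_id = node_id
        self.neighbors = neighbors or []   # nodes within radio range

    def sense(self):
        # Stand-in for a MEMS measurement such as temperature or vibration.
        return {"node": self.node_id, "temp_c": round(random.uniform(18, 30), 2)}

def route_to_base(node, base_id, visited=None):
    """Greedy flood toward the base station; returns the hop path."""
    visited = visited or set()
    if node.node_id == base_id:
        return [node.node_id]
    visited.add(node.node_id)
    for n in node.neighbors:
        if n.node_id not in visited:
            path = route_to_base(n, base_id, visited)
            if path:
                return [node.node_id] + path
    return None

# Chain topology: sensor 3 -> 2 -> 1 -> base (node 0).
base, n1, n2, n3 = Node(0), Node(1), Node(2), Node(3)
n3.neighbors, n2.neighbors, n1.neighbors = [n2], [n1], [base]
print(n3.sense(), "via path", route_to_base(n3, 0))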


New Tool to Audit the Use of Private Data
University of Southampton (ECS) (12/05/08) Lewis, Joyce

University of Southampton computer scientists have developed a method for analyzing personal and confidential information to determine where it has come from, how it is being used, and how it can be made secure. Luc Moreau and Rocio Aldeco-Perez of the School of Electronics and Computer Science have developed a case study based on private data in a university and the requirements of the Data Protection Act. Using the tool, called Provenance, systems could be redesigned to include secure auditing strategies, which ultimately would make them more robust and trusted. "At the moment when data is leaked, there is no systematic way to analyze the scenario," Moreau says. "We are now working towards the first prototype capable of auditing this data." A Provenance prototype should be available in six months.
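
The Southampton prototype is not yet available, so the following is only a minimal sketch of the underlying idea of provenance-aware data handling: every access to a piece of personal data is logged with who touched it, what they did, and for what declared purpose, so an auditor can later reconstruct the data's history. All class and field names are hypothetical.

# Minimal provenance-logging sketch (not the Southampton prototype).
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ProvenanceRecord:
    actor: str          # who touched the data
    action: str         # e.g. "collected", "read", "shared"
    purpose: str        # declared purpose, as data-protection rules require
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

@dataclass
class PersonalDatum:
    subject: str
    description: str
    history: list = field(default_factory=list)

    def record(self, actor, action, purpose):
        self.history.append(ProvenanceRecord(actor, action, purpose))

datum = PersonalDatum("student-42", "home address")
datum.record("registry", "collected", "enrolment")
datum.record("finance", "read", "fee invoice")
for entry in datum.history:          # the audit trail an auditor would inspect
    print(entry)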


Economic Rescue: Can Supercomputers Help Save the Day?
Computerworld (12/08/08) Thibodeau, Patrick

Making supercomputing resources available to more companies could stimulate the U.S. economy by creating new jobs, products, and services. The untapped economic benefits of high-performance computing (HPC) have led some U.S. universities and state governments to launch programs that provide companies with access to supercomputing systems and technical help. For example, the Ohio Supercomputer Center and the Edison Welding Institute (EWI) recently launched a beta program that gives welding engineers access to HPC capabilities through a Web-based interface. The engineers enter a wide variety of information on joining materials and use the supercomputing capabilities to create simulations that show how certain welds will perform. EWI president Henry Cialone says U.S. industry has only scratched the surface of HPC simulation and modeling, and that such efforts could greatly enhance the competitiveness of U.S. manufacturing. Indiana University, Purdue University, and the state of Indiana also are working to make supercomputing more accessible and profitable by making 20 teraflops of computing capacity available to Indiana businesses. Indiana University CIO Brad Wheeler says offering supercomputing power to companies as a shared utility provides them with standardized software, a place to host their application code, and help with parallel programming. "We need to remember what made this country successful in terms of technology," says Purdue CIO Gerry McCartney. "It was aggressive adoption of technology."


Safer, Better, Faster: Addressing Cryptography's Big Challenges
ICT Results (12/04/08)

The European Union-funded ECRYPT network of excellence united 32 research institutions, universities, and companies in a four-year cryptography research effort. International Association for Cryptologic Research president and ECRYPT leader Bart Preneel, a professor at Katholieke Universiteit Leuven in Belgium, says the three major issues facing cryptographers are cost, speed, and long-term security. Cost and speed are closely connected as a result of the trend of storing increasing amounts of information in distributed systems. Cost refers both to the cost of hardware capable of managing complex encryption and to the energy used to run cryptographic processes on increasingly small, low-power devices. "In a few years we will have devices in our pockets with 10 terabytes of storage capacity--current methods are far too slow to encrypt that amount of data practically," Preneel says. Another major problem is that much of the data being generated today will need to be kept secure for decades, or even centuries, but cryptography becomes increasingly easy to crack as it ages. The ECRYPT project was structured into five core research areas, in which the researchers developed improved cryptographic algorithms, ciphers, and hash functions. Among the project's major achievements are eight new algorithms that can outperform the Advanced Encryption Standard. The project also developed a new method for creating cryptographic protocols based on game theory, and created lighter cryptographic algorithms for use in small, low-power devices such as smart cards and RFID tags.
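
As a point of reference for the benchmark mentioned above, here is a minimal Python sketch of authenticated encryption with the Advanced Encryption Standard, using the third-party cryptography package (assumed to be installed); it is not one of the project's new algorithms.

# Minimal AES-GCM sketch using the third-party "cryptography" package
# (pip install cryptography). AES is the standard the new algorithms
# were measured against.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit AES key
nonce = os.urandom(12)                      # 96-bit nonce, never reused per key
aesgcm = AESGCM(key)

ciphertext = aesgcm.encrypt(nonce, b"archive this for decades", None)
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
assert plaintext == b"archive this for decades"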


Reconfigurable Computing Prospects on the Rise
HPC Wire (12/03/08) Feldman, Michael

Reconfigurable computing with field programmable gate arrays (FPGAs) has not received as much hype as GPU-accelerated high-performance computing (HPC), but it is generating interest. The appeal of FPGAs lies in their ability to be custom configured to run specific application workloads efficiently; like GPUs, they can offer one to two orders of magnitude of performance gain for certain applications while consuming just a fraction of the power used by CPU-only implementations. Applications well suited to FPGAs include most bioinformatics applications, image recognition, encryption/decryption, and FFT-based codes. The practicality of FPGA coprocessing has been greatly improved by the opening up of AMD's HyperTransport interface and the subsequent licensing of Intel's Front Side Bus over the last two years. Making FPGAs socket-friendly also carries significant benefits, since it allows the arrays to communicate with the CPU directly over the native processor bus. The Convey Computer hybrid-core platform is currently the most integrated and ambitious reconfigurable computing solution in existence. The platform bundles multiple FPGAs into a reconfigurable coprocessor alongside an x86 CPU, and its use of application workload "personalities" shields the developer from hardware design issues and even explicit parallel programming, leaving the mapping of code onto the CPU and FPGAs to the compiler and runtime system. Convey's interoperability with standard programming languages gives it the potential to dramatically broaden the availability of coprocessor acceleration to HPC users.
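
Convey's compiler and runtime are proprietary, but the dispatch idea--run a kernel on the coprocessor when an accelerated implementation exists for the loaded personality, otherwise fall back to the CPU--can be sketched roughly as follows; the function and personality names are invented for illustration.

# Rough coprocessor-dispatch sketch (not Convey's actual toolchain).
def fft_cpu(samples):
    return sum(samples)          # placeholder for a real CPU FFT routine

def fft_fpga(samples):
    return sum(samples)          # placeholder for a call into FPGA logic

ACCELERATED = {
    ("fft", "signal-processing"): fft_fpga,   # hypothetical personality name
}

def run_kernel(name, personality, *args):
    impl = ACCELERATED.get((name, personality))
    if impl is None:
        impl = globals()[name + "_cpu"]       # CPU fallback path
    return impl(*args)

print(run_kernel("fft", "signal-processing", [1, 2, 3]))   # accelerated path
print(run_kernel("fft", "none", [1, 2, 3]))                # CPU fallback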


First Superconducting Transistor Promises PC Revolution
New Scientist (12/03/08) Marks, Paul

Andrea Caviglia and colleagues at the University of Geneva in Switzerland have applied a voltage to a single crystal containing strontium titanate and lanthanum aluminate, which created a superconducting version of a field effect transistor (FET). A year ago, the team grew a single crystal containing the two metal oxides as separate segments, and found a layer of free electrons at the interface of the materials. The electron gas flowed without resistance just above absolute zero. Applying the voltage to the interface enabled the team to switch the superconductivity on and off. The team made the first superconducting transistor by using the lanthanum aluminate side of the crystal as a source-drain channel and the strontium titanate layer as the gate. "With no electric field, there is zero resistance between the source and drain as the device is superconducting," Caviglia says. However, applying an electric field to the strontium titanate shifts the layer of free electrons away from the interface and the lanthanum aluminate stops conducting current. Computers that use a superconducting FET would be "much faster than the gigahertz speeds currently available," Caviglia says.


Project to Develop Conservation Software
Stanford Report (CA) (11/25/08) Shwartz, Mark

A collaborative effort between Stanford University's Woods Institute for the Environment, the Nature Conservancy, and the World Wildlife Fund has been awarded a two-year, $1.97 million grant to develop software for mapping and evaluating the economic benefits of temperate marine ecosystems. The effort, called the Natural Capital Project, will develop the Marine Integrated Valuation of Ecosystems Services and Tradeoffs (Marine InVEST) program, which is designed to give policymakers and other stakeholders an easy way to calculate the multiple benefits people derive from ocean ecosystems and incorporate those values into the planning process. Natural Capital Project lead scientist Heather Tallis says Marine InVEST will give people the ability to view a comprehensive ocean management plan and determine the monetary consequences of different courses of action across a variety of ecosystem services, including fishing, shoreline preservation, shipping, tourism, and recreation. Tallis says that when the software is ready next year it will be tested at a coastal site in California or British Columbia. Earlier this year, the Natural Capital Project released a beta version of a land-based InVEST resource-planning program, which will serve as the technical framework for Marine InVEST. "We're now testing and refining InVEST for major resource decisions in landscapes around the world," says Natural Capital Project chair Gretchen Daily. "There is tremendous demand for this tool and approach to guide strategic investment in natural capital."
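
Marine InVEST itself is not yet released; the following is only a toy Python sketch of the kind of comparison the article describes--totalling the value of each ecosystem service under alternative management plans--with entirely made-up scenario names and dollar figures.

# Toy scenario comparison (not the InVEST model itself).
SCENARIOS = {
    "expand fishing": {"fisheries": 4.0, "tourism": 1.5, "shoreline protection": 0.8},
    "marine reserve": {"fisheries": 2.5, "tourism": 2.8, "shoreline protection": 1.6},
}   # illustrative values, e.g. millions of dollars per year

for name, services in SCENARIOS.items():
    print(f"{name}: total annual value {sum(services.values()):.1f}")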


Dig This: RoboClam
MIT News (11/25/08) Thomson, Elizabeth A.

The Massachusetts Institute of Technology's (MIT's) RoboClam is a robot, modeled on the razor clam, that is designed to act as a "smart" anchor able to bury itself in and remove itself from the ocean floor. "Our original goal was to develop a lightweight anchor that you could set then easily unset, something that's not possible with conventional devices," says MIT professor Annette Hosoi. RoboClam could be used to tether small robotic submarines that regularly need to reposition themselves to monitor variables such as water current and temperature, or it could burrow into the seabed and be directed to a specific location to detonate and destroy underwater mines. When designing RoboClam, the researchers looked for an animal that is well adapted to moving through sediment on the sea floor. They created a glass-sided box filled with beads and water to observe the razor clam's digging technique. The clam's tongue-like "foot" first wiggles into the sand. The clam then makes a quick up-and-down movement while opening and closing its shell, which turns the waterlogged sand into a liquid-like quicksand and significantly reduces the drag on the clam's body. RoboClam uses pressure regulators, pistons, and other devices to control its movements and how hard it is pushed in each direction.


Lawrence Livermore Erects HPC Testbed
Government Computer News (11/25/08) Jackson, Joab

The U.S. Department of Energy's Lawrence Livermore National Laboratory (LLNL) is erecting Hyperion, a modestly sized supercomputer that will have 1,152 nodes and 9,216 processor cores. When completed, it will be able to process about 100 trillion floating-point operations per second and will serve as a testbed for high-performance computing applications. LLNL project leader Mark Seager says Hyperion will primarily be used for testing hardware and software infrastructure, allowing engineers to test their work at scale. Seager says a piece of hardware or software frequently will not scale when introduced into a production environment, forcing operational staff to troubleshoot the system while it is running production jobs. "Building this testbed will provide these resources upfront," he says. Several vendors contributed equipment to help build Hyperion in exchange for computing time; getting vendors to share the cost of building a testbed is new, at least in government. The vendors own a small portion of the machine, which they can use at any time, and the system is designed so that researchers can carve out a section for testing new hardware or software without risking a crash of the entire machine. Vendors also receive some time on the entire machine to run benchmarks and application scalability tests. Hyperion is expected to be completed by March 2009.
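
A quick back-of-envelope check of the quoted figures (an editorial aside, not from the article): 9,216 cores across 1,152 nodes works out to 8 cores per node, and 100 teraflops spread over 9,216 cores is roughly 10.9 gigaflops per core.

# Back-of-envelope arithmetic on the Hyperion figures quoted above.
nodes, cores, total_flops = 1152, 9216, 100e12   # 100 teraflops
print(cores / nodes)                 # 8.0 cores per node
print(total_flops / cores / 1e9)     # ~10.9 gigaflops per core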


How to Improve Email Communication
Chicago Journals (11/24/08) Igbinedion, Ofurhe

Effective email interaction requires strategies distinct from those used in face-to-face conversation, which relies on cues such as tone of voice, body language, and context, write University of Chicago researchers in the latest issue of the American Journal of Sociology. Daniel A. Menchik and Xiaoli Tian use as an example a well-known scientific organization that replaced occasional meetings of a research panel with ongoing email interaction. "People innovate in response to the challenges of a new context for the communication of essential elements of language," Menchik and Tian write. Participants used capital letters, quotations, emoticons, exclamation points, punctuation, bullet points, style, and even color to communicate the meaning of a word or message; cut and pasted from previous emails and reused subject lines from past discussions to maintain conversational flow; and relied on signatures, disclaimers, and other information to help the recipient understand their state of mind when writing an email. "People can cultivate ways of communicating in online contexts that are equally as effective as those used offline," they write. Email interaction has also become a focus for software developers. For example, Eudora's MoodWatch feature warns the sender about a potentially inflammatory email and also alerts the receiver about the content of incoming email.
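
MoodWatch's implementation is not described in the article; the sketch below only illustrates the general kind of check such a feature performs--flagging a draft whose wording looks heated before it is sent--using an invented word list and threshold, not Eudora's actual method.

# Toy tone-warning sketch, in the spirit of features like MoodWatch.
HEATED_TERMS = {"idiot", "stupid", "useless", "ridiculous"}

def tone_warning(draft, threshold=1):
    words = {w.strip(".,!?").lower() for w in draft.split()}
    hits = words & HEATED_TERMS
    shouting = draft.isupper() and len(draft) > 10   # all-caps "shouting"
    if len(hits) >= threshold or shouting:
        return "Warning: message may read as inflammatory (" + \
               (", ".join(sorted(hits)) or "all caps") + ")"
    return None

print(tone_warning("This plan is ridiculous and the tool is useless."))
print(tone_warning("Thanks, the draft looks good to me."))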


Smart Traffic Systems Just Around the Corner
Nikkei Weekly (12/01/08) P. 16

Japan's National Institute of Information and Communications Technology (NICT) has developed a traffic-accident prevention system that uses wireless communications to transmit important data to motorists. NICT research manager Hiroshi Harada expects the technology to be ready in or after 2012. In August, Fujitsu began testing a system in which wireless devices mounted atop roadside poles relay information to drivers, and researcher Jun Sato notes that "we've achieved stable communications even when the cars are farther than I expected." The technology could ultimately be added to traffic signals and combined with cameras and radar to notify drivers about road conditions and speed limits. NICT is also working on a wireless device that offers better reliability even at high speeds, as part of a car-to-car communications concept that could prevent accidents at intersections that lack traffic lights or have poor lines of sight. All of these experiments use wireless communications in the 720 MHz band of the radio spectrum, which is currently used for analog TV broadcasts. The 720 MHz band allows wireless signals to pass through obstructions, and following the end of analog TV broadcasting the Ministry of Internal Affairs and Communications will permit the band's use for intelligent traffic systems.
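
NICT's message format is not specified in the article; the following Python sketch only illustrates the car-to-car idea of periodically broadcasting a small status record (position and speed) that nearby vehicles can use to warn the driver. The field names, units, and 50-meter warning radius are assumptions.

# Illustrative car-to-car status broadcast and proximity warning.
import json, math
from dataclasses import dataclass, asdict

@dataclass
class VehicleStatus:
    vehicle_id: str
    x_m: float        # position in a shared local frame, meters
    y_m: float
    speed_mps: float

def encode(status):
    return json.dumps(asdict(status)).encode()   # what the radio would send

def collision_warning(own, other, radius_m=50.0):
    dist = math.hypot(own.x_m - other.x_m, own.y_m - other.y_m)
    return dist < radius_m

a = VehicleStatus("car-a", 0.0, 0.0, 13.9)
b = VehicleStatus("car-b", 30.0, 20.0, 16.7)
print(encode(b))
print("warn driver:", collision_warning(a, b))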


Abstract News © Copyright 2008 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]



