Welcome to the March 23, 2011 edition of ACM TechNews, providing timely information for IT professionals three times a week.
HEADLINES AT A GLANCE
Open Networking Foundation Pursues New Standards
New York Times (03/22/11) John Markoff
About two dozen large information technology companies, including IBM, Google, Microsoft, Facebook, and Cisco, are forming the Open Networking Foundation, which plans to establish new standards for computer networking. The group wants to make computer networks of all sizes programmable, similar to the way individual computers are. The benefits of a new system would include more flexible and secure networks that are less likely to suffer from congestion. The new system also would enable large telecommunications companies to use software to combine several fiber-optic networks during brief periods of heavy load. New technologies also could improve computer security and boost individual privacy, according to the foundation's organizers. "This answers a question that the entire industry has had, and that is how do you provide owners and operators of large networks with the flexibility of control that they want in a standardized fashion," says Stanford University professor Nick McKeown. The new system also would make it possible for data managers to program their networks and prioritize certain types of data.
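The foundation's specifications were still being drafted at press time, but the underlying idea, forwarding behavior defined by operator-installed rules rather than fixed firmware, can be sketched with a toy flow table. This is a hypothetical illustration, not the foundation's standard:

```python
# Toy illustration of a programmable flow table: an operator-supplied rule
# set decides how traffic classes are forwarded and prioritized.
# Hypothetical API for illustration only; not the foundation's standard.
from dataclasses import dataclass, field

@dataclass(order=True)
class FlowRule:
    priority: int                       # higher wins when several rules match
    match: dict = field(compare=False)  # header fields the rule must match
    action: str = field(compare=False)  # e.g., "forward:port2", "queue:video"

class FlowTable:
    def __init__(self):
        self.rules = []

    def install(self, rule):
        """An operator (or controller program) installs rules at runtime."""
        self.rules.append(rule)
        self.rules.sort(reverse=True)   # keep highest priority first

    def lookup(self, packet):
        for rule in self.rules:
            if all(packet.get(k) == v for k, v in rule.match.items()):
                return rule.action
        return "drop"                   # default action

table = FlowTable()
table.install(FlowRule(100, {"proto": "rtp"}, "queue:video"))    # prioritize video
table.install(FlowRule(10,  {"proto": "tcp"}, "forward:port2"))  # best-effort rest
print(table.lookup({"proto": "rtp", "dst": "10.0.0.5"}))  # -> queue:video
```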
Study Analyzes Role of Mobile Software in the Future Internet
PhysOrg.com (03/22/11) Lisa Zyga
Donghua University researchers Yongsheng Ding and Lei Gao have performed a macrodynamics analysis of mobile agents' migration behaviors that could provide the basis for a ubiquitous Internet framework in which mobile agents autonomously move between computers and interact with each other. "The study not only favors the design of composite services in a type of self-organizing network architecture, but also benefits the future deployment of an Internet-scale mobile agent system that holds myriads of hosts and migratory movements of mobile agents," Gao says. Previous studies have predicted a future Internet based on mobile agents that fulfill users' requests. The researchers say mobile agents could be part of an evolutionary Internet framework in which they are rewarded for successfully performing a service, or discontinued when they fail. "The characteristics of the future Internet we envision resemble the self-organizing and the self-healing properties of natural ecosystems that have evolved over billions of years," Gao says.
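As a loose illustration of that reward-or-discontinue dynamic, and not Ding and Gao's actual macrodynamics model, consider a toy population of agents:

```python
# Toy model of an ecosystem of mobile agents: agents that serve requests
# successfully are replicated, persistent failures are discontinued.
# Illustrative only; not the authors' actual model.
import random

agents = [{"skill": random.random(), "energy": 5.0} for _ in range(50)]

for step in range(100):
    for a in agents:
        served = random.random() < a["skill"]    # did the agent fulfill a request?
        a["energy"] += 1.0 if served else -1.0   # reward or penalize
    # discontinue failed agents, replicate successful ones (with mutation)
    agents = [a for a in agents if a["energy"] > 0]
    offspring = [{"skill": max(0.0, min(1.0, a["skill"] + random.gauss(0, 0.05))),
                  "energy": 5.0}
                 for a in agents if a["energy"] > 10]
    agents.extend(offspring)
    agents = agents[:200]                        # cap the population

if agents:
    print(f"surviving agents: {len(agents)}, "
          f"mean skill: {sum(a['skill'] for a in agents) / len(agents):.2f}")
```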
Georgia Tech to Pursue 'Transparent Internet' With $1M Google Focused Research Award
Georgia Institute of Technology (03/22/11) Brendan Streich
Georgia Tech researchers are using a $1 million Google Focused Research Award to develop systems designed to make Internet access more transparent. The researchers plan to develop a set of Web-based, Internet-scale measurement tools that users can access for free. The tools would enable users to determine whether Internet service providers are supplying the kind of service being paid for, and whether the data being transmitted has been tampered with by governments or other organizations. "Ultimately we hope this project will help create a 'transparency ecosystem,' where more and more users will take advantage of the measurement tools, which in turn will improve the accuracy and comprehensiveness of our analysis," says Georgia Tech professor Wenke Lee. The project will focus on reachability from a variety of access networks, the performance of user networks, and the integrity of information moving through the networks. "In addition to new network measurement and security monitoring algorithms, we want to create and deploy a 'transparency watchdog' system that uses monitoring agents to keep constant tabs on network performance and availability in strategic Internet locations around the world," says Georgia Tech professor Nick Feamster.
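A minimal sketch of the kind of measurement such tools start from, timing connection setup to targets from the user's own access network. This is illustrative only, not Georgia Tech's toolset:

```python
# Minimal transparency measurement: time TCP connection setup to a set of
# targets from the user's own access network. Illustrative only.
import socket, time

def connect_time(host, port=80, timeout=5.0):
    """Return TCP connect latency in milliseconds, or None on failure."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return (time.monotonic() - start) * 1000.0
    except OSError:
        return None   # unreachable: itself a useful data point

for target in ["example.com", "example.org"]:
    latency = connect_time(target)
    status = f"{latency:.1f} ms" if latency is not None else "unreachable"
    print(f"{target}: {status}")
```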
Judge Rejects Google's Deal to Digitize Books
New York Times (03/22/11) Miguel Helft
U.S. federal judge Denny Chin has rejected Google's $125 million legal settlement with groups representing authors and publishers, which would have given the company exclusive rights to create the world's largest digital library. The digital library is a pet project of Google co-founder Larry Page and had garnered widespread support within the company. However, Chin ruled that the agreement would give Google a "de facto monopoly and the right to profit from books without the permission of copyright owners." Google already has scanned about 15 million books for its Book Search service, which displays about 20 percent of the content from copyrighted titles the company has licensed, but only small excerpts from books it has not. The settlement would have allowed the service to go much further, making millions of out-of-print books available online and selling access to them, and it would have given authors and publishers new ways to earn money from digital copies of their works. Chin suggested he might approve the agreement if it were rewritten to apply only to books whose copyright owners "opt in" to its terms. Google and the parties to the agreement said they would work together to draft a new version that could satisfy the court's objections.
A New System Has Been Developed for an ID in a Mobile Phone
Carlos III University of Madrid (Spain) (03/22/11)
Carlos III University of Madrid (UC3M) researchers are developing a way to integrate electronic identification (eID) data into a mobile phone's SIM card so the device can be used as a means of personal identification. "SIM cards are like small computers that we carry in our mobile devices, allowing us to store information and execute applications, with the advantage of providing a high level of security," says UC3M professor Celeste Campo. The system collects the user's personal information and allows identification and authorization based on the eID stored in the SIM card. "These data interact with services offered by the company Secuware and allow us to access Web services through authentication with the eID," Campo says. Although the researchers have developed a prototype, they still face some hurdles before the technology can be released to the public. "For widespread use of the system we have developed, the application must be ported to the different mobile platforms, principally iPhone OS and Android; in fact, under our agreement with Telefonica R+D we are already focusing on the latter," Campo says.
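The authentication pattern described, in which a key that never leaves the SIM signs a fresh challenge from the Web service, can be sketched as follows. A real eID uses certificate-based signatures; the HMAC here is only a self-contained stand-in:

```python
# Challenge-response pattern behind SIM-based authentication: the service
# sends a fresh nonce, the SIM signs it with a key that never leaves the
# card, and the service verifies the response. HMAC stands in for the
# certificate-based signatures a real eID would use.
import hmac, hashlib, os

SIM_KEY = os.urandom(32)     # provisioned inside the SIM's secure element
SERVER_COPY = SIM_KEY        # the identity provider's matching credential

def sim_sign(challenge: bytes) -> bytes:
    """Runs on the SIM: the key is used but never exported."""
    return hmac.new(SIM_KEY, challenge, hashlib.sha256).digest()

def server_verify(challenge: bytes, response: bytes) -> bool:
    expected = hmac.new(SERVER_COPY, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

nonce = os.urandom(16)       # fresh per login attempt, foils replay
print(server_verify(nonce, sim_sign(nonce)))   # -> True
```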
Rome Lab's Supercomputer Is Made Up of 1,700 Off-the-Shelf PlayStation 3 Gaming Consoles
Syracuse Post-Standard (NY) (03/23/11) Dave Tobin
U.S. Air Force Research Lab computer scientists, based in Rome, N.Y., have created the Condor Supercomputer from 1,716 PlayStation 3 gaming consoles, which they say could be one of the 40 fastest systems in the world. The supercomputer is designed to process, manipulate, and interpret huge amounts of imaging data. The Air Force is using new radar technology, called Gotcha, that has a higher resolution than older systems and needs Condor's power in order to run properly. Video processed from the radar signals can be viewed in real time or played back to investigate what led to an event, and a user can change perspectives, going from air to ground in an instant. "You can literally rewind or predict forward (in the future), based on the information you have," says Air Force Research Lab's Mark Barnell. The Rome Lab used a $2.5 million Department of Defense grant to build Condor, a project that Barnell says would likely have cost 10 times as much if the researchers had not used off-the-shelf PlayStation 3s. Rome Lab is sharing Condor's capabilities with other government agencies and universities, including Cornell University, Dartmouth College, and the universities of Florida, Maryland, and Tennessee.
Move Over, Einstein: Machines Will Take It From Here
New Scientist (03/22/11) Justin Mullins
Cornell University Ph.D. student Michael Schmidt and professor Hod Lipson have developed a research technique that reverses the usual scientific method: rather than starting with a hypothesis to test, the researchers first perform experiments and feed the results into a computer, which discovers the laws of nature on its own. The success of this method hinges on evolutionary computing, in which robots or computers are presented with a goal and then generate candidate programs, combining the most promising ones until the objective is achieved. Through evolutionary computing, machines can execute tasks that they have not been programmed to do, and the technique is already being employed on problems ranging from aircraft design to creating train timetables. Schmidt and Lipson's evolutionary algorithm, Eureqa, has been automated and released for free online. One key to the algorithm's success is its ability to look for invariance in the equations it produces. Lipson believes Eureqa will enable laws of nature to be extracted from data at currently unheard-of rates, and that this type of machine learning will become routine.
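A drastically simplified version of the evolutionary search behind such systems, in which candidate formulas compete to fit data and the most promising are mutated, might look like this. It is a toy sketch, not Eureqa itself, and may only approximate the hidden law:

```python
# Toy evolutionary symbolic regression: random candidate formulas compete
# to fit data; the best survive and are mutated. Vastly simplified.
import random

X = [i / 10 for i in range(-20, 21)]
Y = [3 * x * x + 2 * x for x in X]      # hidden "law" to rediscover
OPS = ["+", "-", "*"]

def random_expr(depth=0):
    if depth > 2 or random.random() < 0.3:
        return random.choice(["x", str(random.randint(1, 5))])
    return f"({random_expr(depth+1)} {random.choice(OPS)} {random_expr(depth+1)})"

def fitness(expr):
    try:
        return sum((eval(expr, {"x": x}) - y) ** 2 for x, y in zip(X, Y)) / len(X)
    except Exception:
        return float("inf")

def mutate(expr):
    if random.random() < 0.5:
        return random_expr()
    return f"({expr} {random.choice(OPS)} {random_expr(1)})"

pop = [random_expr() for _ in range(200)]
for gen in range(60):
    pop.sort(key=fitness)
    survivors = pop[:50]                # keep the most promising formulas
    pop = survivors + [mutate(random.choice(survivors)) for _ in range(150)]

best = min(pop, key=fitness)
print(best, "error:", fitness(best))
```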
3-D Models Created by a Cell Phone
Technology Review (03/23/11) Tom Simonite
Microsoft researchers have developed a cell phone application that uses overlapping digital photographs to create a photo-realistic three-dimensional (3D) model that can be viewed from any angle. "We want everybody with a cell phone or regular digital camera to be able to capture 3D objects," says Microsoft researcher Eric Stollnitz. After taking several photos of an object from different angles, the user uploads them to a cloud server for processing. The app downloads a photo-realistic model of the object that the user can manipulate in three dimensions. "These 3D scans take up less bandwidth than a video because they are based on only a few images, and are also interactive," Stollnitz says. The technology is based on Microsoft's Photosynth software, which provides a sense of three-dimensional imagery by jumping between different views.
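A core step in building a 3D model from overlapping photos is triangulation: once the same point is matched in two images taken from estimated camera poses, its 3D position can be recovered. A minimal NumPy sketch, assuming the camera matrices are already known (feature matching and pose estimation are separate steps):

```python
# Linear (DLT) triangulation: recover a 3D point from its projections in
# two images with known 3x4 camera matrices P1 and P2. One core step of
# multi-view reconstruction; matching and pose estimation come first.
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """uv1, uv2: the matched pixel (u, v) in each image."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)   # solve A X = 0 in the least-squares sense
    X = Vt[-1]
    return X[:3] / X[3]           # dehomogenize

# Two toy cameras: identity pose, and a one-unit translation along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
point = np.array([0.5, 0.2, 4.0, 1.0])        # ground-truth 3D point
uv1 = (P1 @ point)[:2] / (P1 @ point)[2]
uv2 = (P2 @ point)[:2] / (P2 @ point)[2]
print(triangulate(P1, P2, uv1, uv2))          # ~ [0.5, 0.2, 4.0]
```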
Open-Source Software Designed to Minimize Synthetic Biology Risks Is Released
Virginia Bioinformatics Institute (03/21/11) Susan Trulove
Virginia Tech researchers have developed GenoTHREAT, an open-source program for screening synthetic DNA orders for sequences that could be used as bioterrorism agents. The researchers implemented the "best match" screening protocol recommended by the U.S. government for safeguarding toxins and other potentially dangerous materials from unauthorized individuals or those with malicious intent. "Since [the federal guidance on synthetic DNA] is only one of many regulations and policies that providers of synthetic DNA need to comply with, our current efforts aim at developing a more comprehensive biosecurity solution that can be customized for a variety of users," says Virginia Tech professor Jean Peccoud. The team included five undergraduate students who were trained in interdisciplinary strategies, including students in computer science, biology, biochemistry, and engineering. "This project exemplifies how it is possible to train students to use interdisciplinary strategies to confront today's most important scientific problems," says Virginia Tech's Daniel Wubah. "By combining engineering and life science expertise, this team has made a valuable contribution to a real-world problem directly related to the security of our nation."
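The screening idea can be sketched in miniature: slide a window across an incoming synthesis order and flag any window that matches a watchlist. The actual protocol uses "best match" similarity search over 200-bp windows against a curated database; exact matching here merely keeps the toy self-contained:

```python
# Toy sequence-of-concern screening: slide a window across an incoming
# synthesis order and flag windows found in a watchlist. The real protocol
# uses "best match" similarity search (e.g., BLAST) over 200-bp windows.
WATCHLIST = {"ATGCGTACGTTAGC"}   # stand-in for sequences of concern
WINDOW = 14                      # real guidance screens 200-bp windows

def screen(order: str):
    hits = []
    for i in range(len(order) - WINDOW + 1):
        window = order[i:i + WINDOW]
        if window in WATCHLIST:
            hits.append((i, window))   # flag for human review
    return hits

order = "CCGG" + "ATGCGTACGTTAGC" + "TTAA"
print(screen(order))                   # -> [(4, 'ATGCGTACGTTAGC')]
```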
Scientists Work Toward 'Forever' Storage
Computerworld (03/21/11) Lamont Wood
IBM researchers are exploring ways to facilitate data storage migration and make it easier to preserve data. "Most storage products have a five-year warranty, and most users are in the practice of replacing their systems every five years, with infrastructure becoming completely transformed, like a snake shedding its skin, in a maximum of 10 years," says Gartner analyst John Monroe. The researchers are developing storage technology that should be reliable for decades instead of years. For example, each nanowire track in IBM's racetrack memory carries about 100 bits, encoded as nonvolatile spots of magnetism. Racetrack memory also features superfast response times and is much more compact than today's hard drives. "Most magnetic devices are designed for 10 years, and ours should last at least that long," says IBM research fellow Stuart Parkin. "I think it will be the storage Utopia. There are no trade-offs." Meanwhile, Hewlett-Packard (HP) researchers are developing memristors for long-term data storage. "We think that memristor data will have a significantly longer shelf life--20 to 30 years at least," says HP researcher Alistair Veitch.
Software Opens up Redistricting
USA Today (03/21/11) P. 8A Gregory Korte
College students, professional software developers, and political activists across the United States are using newly developed software to draw better political districts. For example, Virginia and Michigan are holding public contests using software developed at George Mason University to remove gerrymandering from the political process. "The goal is to move beyond just having forums with citizens to talk about redistricting, but give citizens the tools to draw their own maps," says Michigan Center for Election Law director Jocelyn Benson. Columbia University law students are using drawcongress.org to draw districts for all 435 U.S. House of Representatives seats. "People can see how the process works, so it's a little less mysterious than it was 10 years ago," says Dave Bradley, a Seattle-based software developer who created DavesRedistricting.com. The programs address the many complications inherent in redistricting, such as creating political districts that are compact and follow natural boundaries. "The technology has evolved so much that it's become almost entirely democratized," says Bob Holsworth, chairman of Virginia's bipartisan redistricting commission. "This will be a fact of political life from now on."
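One widely used measure of the compactness such tools report is the Polsby-Popper score, 4πA/P², which is 1.0 for a circle and near 0 for contorted shapes. The specific tools named above may use other measures as well:

```python
# Polsby-Popper compactness: 4*pi*area / perimeter**2. A circle scores
# 1.0; contorted, gerrymandered shapes approach 0.
import math

def polsby_popper(polygon):
    """polygon: list of (x, y) vertices in order."""
    area, perimeter = 0.0, 0.0
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        area += x1 * y2 - x2 * y1          # shoelace formula
        perimeter += math.hypot(x2 - x1, y2 - y1)
    area = abs(area) / 2.0
    return 4 * math.pi * area / perimeter ** 2

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
sliver = [(0, 0), (10, 0), (10, 0.1), (0, 0.1)]
print(f"square: {polsby_popper(square):.2f}, sliver: {polsby_popper(sliver):.2f}")
```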
Dueling Algorithms
MIT News (03/18/11) Larry Hardesty
Researchers at the Massachusetts Institute of Technology's Computer Science and Artificial Intelligence Laboratory, in collaboration with colleagues at Northwestern University, the University of Toronto, and the University of Pennsylvania, have developed a mathematical framework for analyzing dueling algorithms. The research could help answer questions such as which Web service best serves the public. The researchers, led by Ph.D. student Ankur Moitra, have developed faster methods for finding equilibrium strategies, as well as techniques for representing the results of different strategies probabilistically, making it easier to calculate equilibria. The researchers also developed computational methods for finding specific sets of strategies that coincide with the statistical profiles of the equilibrium calculations. The researchers considered several cases of dueling algorithms, including two computers seeking an item in a database, two businesses attempting to hire employees from the same applicant pool, and two racers trying to plot routes across a town with erratic traffic patterns. In each instance they found that an equilibrium strategy for besting an opponent may not be good in the abstract.
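For zero-sum cases, one classic way to approximate an equilibrium strategy is fictitious play, in which each player repeatedly best-responds to the opponent's empirical mixture. This is a generic illustration of equilibrium computation, not the paper's algorithms:

```python
# Fictitious play on a two-player zero-sum game: each player repeatedly
# best-responds to the opponent's empirical strategy mixture. Converges
# to an equilibrium for zero-sum games. Generic illustration only.
import numpy as np

# Payoff to the row player; matching pennies has equilibrium (0.5, 0.5).
A = np.array([[ 1.0, -1.0],
              [-1.0,  1.0]])

row_counts = np.ones(2)
col_counts = np.ones(2)
for _ in range(10000):
    row_mix = row_counts / row_counts.sum()
    col_mix = col_counts / col_counts.sum()
    row_counts[np.argmax(A @ col_mix)] += 1   # row best-responds
    col_counts[np.argmin(row_mix @ A)] += 1   # column best-responds

print("row strategy:", np.round(row_counts / row_counts.sum(), 3))
print("col strategy:", np.round(col_counts / col_counts.sum(), 3))
```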
Solving the Bandwidth Bottleneck
University of Texas at Austin (03/17/11)
Engineers at the University of Texas at Austin, Cornell University, the University of California, San Diego, the University of Southern California, and Moscow State University are participating in a multi-year research effort to develop algorithms that could improve wireless networks' ability to store, stream, and share video. The researchers hope to provide high perceptual quality video to the nearly two billion Internet users worldwide, says UT Austin professor Robert W. Heath Jr. "Doing so requires the delivery of fewer, more perceptually relevant bits per video stream, communicating those bits more efficiently throughout the network, and creating a more capable perception and video-content-aware network infrastructure," Heath says. The researchers plan to develop algorithms that can determine which portions of a video are most important to viewers and then select those sections to send through the limited bandwidth that is available. The researchers also are developing a more intelligent system equipped with base stations that can communicate with each other in real time and manage bandwidth so that more can be allocated as demand increases. "In a more intelligent wireless system, the base stations could communicate and would know to give more juice to the area where video traffic is highest," says UT Austin professor Alan Bovik.
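The selection problem Heath describes resembles a knapsack: given a bandwidth budget, send the segments with the most perceptual value per bit. A toy greedy sketch with hypothetical segment data, illustrative only:

```python
# Toy perceptual bit allocation: within a bandwidth budget, greedily send
# the video segments with the best perceptual value per bit. The project's
# actual algorithms are far more sophisticated and operate on live streams.
segments = [
    # (name, size_kbits, perceptual_importance)
    ("talking head", 300, 9.0),
    ("static slide", 500, 2.0),
    ("fast motion",  800, 8.0),
    ("background",   400, 1.0),
]

def allocate(segments, budget_kbits):
    chosen, spent = [], 0
    # best importance-per-bit first (greedy knapsack heuristic)
    for name, size, score in sorted(segments, key=lambda s: s[2] / s[1], reverse=True):
        if spent + size <= budget_kbits:
            chosen.append(name)
            spent += size
    return chosen, spent

chosen, spent = allocate(segments, budget_kbits=1200)
print(chosen, f"({spent} kbits used)")   # -> ['talking head', 'fast motion']
```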
Abstract News © Copyright 2011 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: [email protected]