Association for Computing Machinery
Welcome to the December 15, 2008 edition of ACM TechNews, providing timely information for IT professionals three times a week.

HEADLINES AT A GLANCE


Google Wants Its Own Fast Track on the Web
Wall Street Journal (12/15/08) P. A1; Kumar, Vishesh; Rhoads, Christopher

Google recently approached major cable and phone Internet service providers with a proposal to create a dedicated lane for Google content. Analysts say the move would be a major loss for supporters of net neutrality, who oppose giving preferential treatment to any Internet traffic. Until recently, Google had been one of the most outspoken supporters of net neutrality. Phone and cable companies argue that Internet content providers should share network costs, particularly as network traffic has grown by more than 50 percent annually, according to some estimates. Internet providers say the rising amount of traffic, largely created by the widespread creation and viewing of online videos, means they need to increase revenue to upgrade their networks, and charging companies for dedicated fast lanes is a possible solution. However, a major cable operator that has been in talks with Google says it is hesitant to reach an agreement because of concerns that such a deal may violate U.S. Federal Communications Commission guidelines on network neutrality. Google's proposed arrangement, called OpenEdge, would place Google servers directly within the networks of Internet service providers, which would accelerate Google's services for end users. Although Google says that other companies can create similar arrangements if they want to, analysts say that if OpenEdge is approved it would give Google a significant advantage that few firms would be able to match. Google's Richard Whitt says the proposal does not violate network neutrality rules.


The Next Big Sensation?
Washington Post (12/15/08) P. C1; Garreau, Joel

Computer scientists are developing machines that enable users to feel digital content. "The world is going digital, but people are analog," says Immersion Corp.'s Gayle Schaeffer. "In the digital world, touch is so much more personal and private and non-intrusive." Touch-sensitive screens are becoming increasingly common and are being used in a wide variety of devices, notes the Consumer Electronics Association's Steve Koenig. However, a major problem is that touching a computer screen still feels like touching a computer screen, no matter what content is being displayed. The goal is to create touch screens that actually feel like the user is manipulating the content on the screen and working with three-dimensional objects. Allison Okamura, director of the Haptics Laboratory at Johns Hopkins University, is developing an experimental surgical robot that steadies a surgeon's hands through haptic feedback. "If I shake, it holds me steady," Okamura says. "I can force it to make me move very slowly and deliberately, so it makes me extremely accurate." Haptics Laboratory undergraduate Kathryn Smith is working to capture, record, and convey the feeling of something through the skin's sense of touch, possibly using sophisticated vibrators to replicate texture, similar to how speakers reproduce audio. Okamura notes that haptics technology tends to attract female students. She says that haptics "slops over into fields like physiology and psychology--it's grounded in the business of figuring out exactly how humans tick. Psychology, as a field, is loaded with women."


Computing for Good: Web Technology to Solve Human Problems
Network World (12/11/08) Cox, John

The Georgia Institute of Technology recently launched the Computing for Good project (C4G), a course that encourages Georgia Tech computer science students and faculty to examine how computer technology can be used to help people. The C4G course was inspired by a faculty presentation last year by professor Santosh Vempala, who also taught the first course in Spring 2007, along with professors Michael Best and Ellen Zegura. In the class, students worked on seven projects. For one of the projects the students developed a Web tool to monitor the safety of African blood supplies. The tool is expected to go live in 14 African countries in a few weeks. Initially, the tool was only going to be used to create quarterly data reports for the U.S. Centers for Disease Control, but the researchers realized it could be used to manage the blood supply through real-time data collection, Vempala says. The Web-monitoring tool will give National Blood Transfusion Service (NBTS) directors access to current and accurate data on blood inventories. During the project, NBTS directors regularly asked for more data fields for summary and analytical reports, including trend analysis and regional data comparisons. The result was a new database that allows for more flexible data analysis and reporting. Internet access at some sites remained a problem, with many sites relying on low-bandwidth dial-up or satellite connections, so the team redesigned the tool to allow each office to customize selected parts of the application to meet local requirements for data collection and reporting. They also redesigned the tool with advanced Ajax function calls to help the initial page load quickly, so users can start working immediately while other elements are downloaded in the background.
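
The deferred-loading approach described above is a common Ajax pattern. The TypeScript sketch below illustrates it with hypothetical endpoint paths and renderer functions; it is an illustration of the technique, not code from the C4G tool.

```typescript
// Render the core data-entry form immediately, then fetch heavier report
// elements in the background so users on slow links can start working at once.
// Endpoint paths and renderer names are hypothetical placeholders.
async function fetchJson(url: string): Promise<unknown> {
  const response = await fetch(url);
  if (!response.ok) throw new Error(`Request failed: ${response.status}`);
  return response.json();
}

function renderBloodInventoryForm(): void { console.log("form ready"); }
function renderSummaryReport(data: unknown): void { console.log("summary", data); }
function renderTrendChart(data: unknown): void { console.log("trends", data); }

function loadPage(): void {
  renderBloodInventoryForm();  // visible and usable right away

  // Non-critical pieces load in the background over the slow connection.
  fetchJson("/api/summary-report").then(renderSummaryReport).catch(console.error);
  fetchJson("/api/trend-analysis").then(renderTrendChart).catch(console.error);
}

loadPage();
```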


W3C Upgrades Web Accessibility Standards
InternetNews.com (12/11/08) Adhikari, Richard

The World Wide Web Consortium (W3C) has released version 2.0 of its Web Content Accessibility Guidelines (WCAG), which are designed to help developers make Web sites and online content easier to access for users with disabilities. W3C plans to make WCAG 2.0 an international standard. "There were a lot of different guidelines for accessibility developed by different countries and organizations, so in WCAG 2.0 we had this huge effort to develop something that would work well across all these different needs and become like a unified international standard," says W3C Web Accessibility Initiative director Judy Brewer. "There's been considerable international interest in accepting this as a standard." So far, WCAG 2.0 has received support from Adobe, IBM, Boeing, Microsoft, the Chinese and Japanese governments, and the European Commission for Information Society and Media. WCAG 2.0 offers several improvements over the previous version, including support for new Web technologies. "WCAG 1.0 was specifically for HTML, but the Web uses other technologies now, and we wanted an updated standard that would cover any technologies and also give developers more flexibility," Brewer says.


Digging Out From Piles of Sticky Notes
MIT News (12/10/08) Trafton, Anne

Massachusetts Institute of Technology computer scientists have developed several software programs designed to replace sticky notes and to-do lists. After studying how people use small pieces of paper to record information, the researchers developed and tested several programs that capture the broad context for every piece of information, as well as streamlined, Web-based note-taking programs. The researchers found that, in addition to sticky notes, people use multiple improvised tools, including notebooks, whiteboards, text files, Internet bookmarks, and emails sent to themselves. People also recorded all types of information, from fantasy football rosters, to travel price comparisons, to lists of guitar chords. Capturing such information can be difficult because there are no specific computer applications designed to store it, and many people do not use dedicated systems such as calendars and address books due to a lack of what the researchers term "lightweight capture," meaning that if too much time or effort is needed to record a piece of information, most people will not bother. The first tool created, Jourknow, captures context every time a user makes a note, such as where the user was and what Web site was being viewed. Another program, Inky, aims to "understand" the notes users write and place them in the right application, such as automatically placing notes about a meeting in the calendar application. The most recent effort, list.it, focuses on minimizing the time and effort needed to capture information. List.it is a Web browser-based tool that enables users to write short notes.


Sevenfold Accuracy Improvement for 3-D 'Virtual Reality' Labs
NIST Tech Beat (12/09/08)

National Institute of Standards and Technology (NIST) researchers have devised software that improves the accuracy of the tracking devices in the institute's virtual research environment at least sevenfold. Such environments are usually composed of several walls onto which three-dimensional (3D) images are displayed, and users wear special glasses and carry wands that enable them to travel within and interact with the virtual world in conjunction with the underlying graphics system. The NIST software corrects the tracked location and orientation of the sensors affixed to the glasses and wands. NIST scientists mapped two sets of data points--the sensor's actual position and the position the computer says the sensor is at--and the resulting software reduced average location errors and average orientation errors by factors of 22 and 7.5, respectively. "This improvement in motion tracking has furthered our goal of turning the immersive environment from a qualitative tool into a quantitative one--a sort of virtual laboratory," says NIST mathematician John Hagedorn.
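
As a rough illustration of the kind of correction described above, the TypeScript sketch below fits a per-axis scale-and-offset model from paired samples of reported and measured positions and then applies it to new readings. The article does not describe NIST's actual error model, so this is only a stand-in for the idea of mapping one data set onto the other.

```typescript
// Toy calibration: fit a least-squares scale and offset for each axis from
// paired (reported, actual) positions, then correct future readings.
// This per-axis linear model is an assumption, not NIST's published method.
type Vec3 = [number, number, number];

function fitAxis(reported: number[], actual: number[]): { scale: number; offset: number } {
  const n = reported.length;
  const meanR = reported.reduce((s, v) => s + v, 0) / n;
  const meanA = actual.reduce((s, v) => s + v, 0) / n;
  let num = 0;
  let den = 0;
  for (let i = 0; i < n; i++) {
    const d = reported[i] - meanR;
    num += d * (actual[i] - meanA);
    den += d * d;
  }
  const scale = num / den;  // least-squares slope
  return { scale, offset: meanA - scale * meanR };
}

function buildCorrection(reported: Vec3[], actual: Vec3[]): (p: Vec3) => Vec3 {
  const fits = [0, 1, 2].map(axis =>
    fitAxis(reported.map(p => p[axis]), actual.map(p => p[axis])));
  return p => [0, 1, 2].map(axis => fits[axis].scale * p[axis] + fits[axis].offset) as Vec3;
}

// Example: reported x is off by +5 cm; the fitted correction removes it.
const correct = buildCorrection(
  [[1.05, 0, 0], [2.05, 1, 0], [3.05, 2, 1]],   // what the tracker reported
  [[1.00, 0, 0], [2.00, 1, 0], [3.00, 2, 1]]);  // what was actually measured
console.log(correct([4.05, 3, 2]));             // approximately [4.00, 3, 2]
```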


What's Next for Computer Interfaces?
Technology Review (12/11/08) Greene, Kate

Touch-screen technology is gaining widespread recognition as a practical interface for computers. Over the next few years touch screens will likely become increasingly prevalent, as devices with miniature touch screens or wall-sized displays become available. A problem with touch-screen technology is that fingers tend to cover up important information on the screen, but making touch screens larger would make devices too large to be used as mobile devices. A Microsoft Research project called nanoTouch tackles this problem by adding touch interaction to the back of devices, which have a display on the front and touch-sensitive controls on the back. A prototype device can display a transparent finger or cursor on screen to help people operate the device reliably. Meanwhile, Perceptive Pixel is supplying wall-sized touch screens to several U.S. government agencies and news outlets. The screens were widely used during the U.S. presidential election in November. Perceptive Pixel's displays use a physical phenomenon called total internal reflection, in which light shone into an acrylic panel, which acts as the display, is completely contained within the material. When a finger or object touches the surface, the light scatters and is detected by cameras positioned behind the display. The screens also can detect the amount of pressure with which an object touches the screen, adding further functionality.
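
To make the sensing principle concrete, the sketch below shows one hypothetical way touch points could be extracted from a camera frame behind such a screen: scattered light appears as bright blobs, and blob brightness gives a rough pressure estimate. It illustrates the general idea only, not Perceptive Pixel's actual processing pipeline.

```typescript
// Find bright blobs in a normalized camera frame (values 0..1). Each blob's
// centroid is a touch point and its mean brightness is a crude pressure proxy.
// A hypothetical illustration of the principle, not a production pipeline.
interface Touch { x: number; y: number; pressure: number }

function detectTouches(image: number[][], threshold = 0.5): Touch[] {
  const visited = image.map(row => row.map(() => false));
  const touches: Touch[] = [];

  const flood = (startY: number, startX: number): Touch => {
    let sum = 0, sumX = 0, sumY = 0, count = 0;
    const stack: Array<[number, number]> = [[startY, startX]];
    while (stack.length > 0) {
      const [y, x] = stack.pop()!;
      if (y < 0 || y >= image.length || x < 0 || x >= image[0].length) continue;
      if (visited[y][x] || image[y][x] < threshold) continue;
      visited[y][x] = true;
      sum += image[y][x]; sumX += x; sumY += y; count++;
      stack.push([y + 1, x], [y - 1, x], [y, x + 1], [y, x - 1]);
    }
    return { x: sumX / count, y: sumY / count, pressure: sum / count };
  };

  for (let y = 0; y < image.length; y++)
    for (let x = 0; x < image[0].length; x++)
      if (image[y][x] >= threshold && !visited[y][x]) touches.push(flood(y, x));
  return touches;
}

// One finger pressing near the lower-right corner of a tiny 4x4 frame.
console.log(detectTouches([
  [0.0, 0.0, 0.0, 0.0],
  [0.0, 0.1, 0.0, 0.0],
  [0.0, 0.0, 0.8, 0.9],
  [0.0, 0.0, 0.7, 0.9],
]));
```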


LMU Computer Scientists Involved in Galileo Research
Ludwig-Maximilians-Universitat Munchen (12/12/08)

Mobile and Distributed Systems Group researchers at Germany's Ludwig-Maximilians-Universitat Munchen (LMU) are developing services that support the European Union's Galileo navigation system. The Federal Ministry of Education and Research recently approved the Indoor project, in which LMU computer scientists will develop positioning and navigation technologies that can be used for traffic logistics and emergency services. The project's goal is to improve specific algorithms that will increase the energy and cost efficiency of location-based service applications. The project will include the enhancement of localization algorithms for indoor applications, the evaluation of existing platforms and concepts, and a user study on the technology. So far, location services have relied on the terminal device periodically sending position data to a server, even when the user and the terminal are not moving. LMU researchers instead assign boundary circles to users according to given queries and the movement of the persons or objects being monitored. The terminal only sends data when the user moves beyond the circle, a more effective and economical method because data is only sent when there is movement. Currently, the technology works best with Global Positioning System (GPS)-supported terminals, but it will be adapted for Galileo modules and indoor positioning. The researchers will be challenged to create precise localization inside large buildings, which is more difficult than outdoor positioning.
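
A minimal sketch of that boundary-circle behavior appears below; the radius, distance approximation, and reporting callback are illustrative assumptions rather than the LMU project's actual interface.

```typescript
// A terminal keeps a circle around its last reported position and transmits
// a new fix only after leaving that circle, so a stationary device stays
// silent. Radius and callback are placeholders, not the LMU project's API.
class BoundaryCircleReporter {
  private lastReported: { lat: number; lon: number } | null = null;

  constructor(private radiusMeters: number,
              private send: (lat: number, lon: number) => void) {}

  onPositionFix(lat: number, lon: number): void {
    if (this.lastReported === null ||
        distanceMeters(this.lastReported.lat, this.lastReported.lon, lat, lon) > this.radiusMeters) {
      this.send(lat, lon);               // transmit only on real movement
      this.lastReported = { lat, lon };
    }
  }
}

// Equirectangular approximation, adequate for circles of a few hundred meters.
function distanceMeters(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const R = 6371000;
  const rad = Math.PI / 180;
  const x = (lon2 - lon1) * rad * Math.cos(((lat1 + lat2) / 2) * rad);
  const y = (lat2 - lat1) * rad;
  return Math.sqrt(x * x + y * y) * R;
}

const reporter = new BoundaryCircleReporter(100, (lat, lon) => console.log("report", lat, lon));
reporter.onPositionFix(48.1500, 11.5800);  // first fix -> reported
reporter.onPositionFix(48.1500, 11.5800);  // no movement -> silent
reporter.onPositionFix(48.1527, 11.5800);  // roughly 300 m north -> reported
```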


HP, ASU Unveil Paper-Like, Flexible Display
InformationWeek (12/09/08) Gonsalves, Antone

Hewlett-Packard (HP) and Arizona State University's Flexible Design Center have developed a prototype, paper-like, unbreakable, flexible computer display made almost entirely of plastic using self-aligned imprint lithography (SAIL) technology. SAIL allows an image on the display to maintain its form despite the display being bent and flexed. The technology consumes less power and requires 90 percent less material by volume than current computer displays. The researchers say the technology is a milestone in the effort to create a mass market for high-resolution, flexible displays. The technology could be used to reduce the costs of manufacturing laptops, smart phones, and other electronic devices, or to create electronic paper and signage. "In addition to providing a lower-cost process, SAIL technology represents a more sustainable, environmentally sensitive approach to producing electronic displays," says HP's Carl Taussig. The plastic material used in the display is flexible Teonex Polyethylene Naphthalate (PEN), which was developed by DuPont Teijin Films. The display also uses E Ink's Vizplex imaging film, which allows images to remain on the display without applying a voltage, significantly reducing power consumption when viewing text.


Stanford Launches 'Clean Slate' Internet Lab With Deutsche Telekom and NEC
Campus Technology (12/08/08) Schaffhauser, Dian

The Clean Slate Internet Design Program at Stanford University, a research effort launched in 2007 that aims to rethink the global communications infrastructure, recently announced the formation of the Clean Slate Lab, which will unite professors, students, research staff, and engineers in an effort to deploy program prototypes in research and operational networks. Stanford researchers will be joined by five full-time engineers, three from Deutsche Telekom and two from NEC. "The goal of the Clean Slate program is to reinvent the Internet to meet the needs of the future through fundamental and 'disruptive' advances, rather than incremental patches and work-arounds," says Clean Slate program executive director Guru Parulkar. One of the lab's initiatives will be to design, prototype, deploy, and disseminate technology developed as part of the Programmable Open Mobile Internet (POMI) 2020 project. Parulkar says the widespread adoption of smart handheld devices is revolutionary, and provides an opportunity for new software services and applications. However, he says such a revolution will only occur if the currently closed and incompatible networks are replaced with an open development platform. Part of the POMI 2020 initiative involves developing a technology called OpenFlow that uses a standardized interface to make closed and incompatible switches and routers programmable. Programmable routers and switches enable researchers to experiment with ideas on a production network with real applications and users by writing their own network services.
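
The sketch below conveys the match-action idea behind such a programmable flow table in drastically simplified form; the packet fields, classes, and default-drop behavior are illustrative assumptions, not the actual OpenFlow specification.

```typescript
// Simplified sketch of a match-action flow table: a controller installs flow
// entries, and packets are handled by the first matching entry. The fields
// here are far simpler than the real OpenFlow wire protocol.
interface Packet { srcIp: string; dstIp: string; dstPort: number }
type Action = { type: "forward"; port: number } | { type: "drop" };

interface FlowEntry {
  match: Partial<Packet>;   // any field left undefined acts as a wildcard
  action: Action;
}

class FlowTable {
  private entries: FlowEntry[] = [];

  install(entry: FlowEntry): void { this.entries.push(entry); }

  handle(packet: Packet): Action {
    for (const { match, action } of this.entries) {
      const matches = (Object.keys(match) as Array<keyof Packet>)
        .every(field => match[field] === packet[field]);
      if (matches) return action;
    }
    return { type: "drop" };  // default: no matching flow entry
  }
}

// A "controller" program: send web traffic out port 2, drop everything else.
const table = new FlowTable();
table.install({ match: { dstPort: 80 }, action: { type: "forward", port: 2 } });

console.log(table.handle({ srcIp: "10.0.0.1", dstIp: "10.0.0.2", dstPort: 80 }));  // forward
console.log(table.handle({ srcIp: "10.0.0.1", dstIp: "10.0.0.2", dstPort: 22 }));  // drop
```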


U.S. Role as Internet Hub Starts to Slip
Guardian Unlimited (UK) (12/08/08) Johnson, Bobbie

The Internet is no longer as U.S.-centric as it used to be, reveals a new TeleGeography Research survey. TeleGeography examined data traffic from Internet backbone providers worldwide and found that the percentage of data from Asia that passed through the United States at some point has fallen from 91 percent in 1999 to 54 percent in 2008. Meanwhile, the percentage of data from Africa that passed through the United States fell from 70 percent to 6 percent during the same time period. TeleGeography analyst Eric Schoonover says the changing Internet traffic patterns are due to the growing number of hubs and Internet exchanges in Latin America, Asia, and Africa. However, TeleGeography notes that the growth of Internet connectivity in North America and Europe has not slowed down--in fact, it has accelerated. The survey found that Internet connectivity growth rates reached 63 percent last year after several years of slower growth. The results of the survey were released amid other signs of America's declining role in the Internet. For instance, the number of Internet users in China surpassed the number of users in the United States for the first time ever earlier this year. In addition, ICANN has approved plans that will allow the creation of domain names in languages that do not use the Roman alphabet, such as Chinese, Arabic, and Russian; those domains will begin to be created next year.


Georgia Tech Tests Mobile Alert System for Cell Phones
Georgia Institute of Technology (12/03/08) Terraso, David

The Georgia Institute of Technology's Wireless Emergency Communications project recently tested the U.S. Federal Communications Commission's Commercial Mobile Alert System to determine how effective the system is at alerting people with vision and hearing impairments. The Commercial Mobile Alert System was created in 2008 to provide a method for commercial mobile service providers to voluntarily transmit emergency alerts to mobile subscribers. Software to make the emergency alert system accessible for people with disabilities was developed by the Rehabilitation Engineering Research Center for Wireless Technology's Wireless Emergency Communications project. The Georgia Tech study found that although 90 percent of people with vision-based disabilities said the alert attention signal was loud and long enough to gain their attention, only 70 percent of deaf and hearing-impaired participants thought the vibrating cadence used was strong enough to alert them to an emergency. All hearing participants said they missed the early part of the message because the tone ended too quickly and the 90-character spoken alert started too soon. The Commercial Mobile Alert System message first describes the event type, such as tornado or flood, so crucial information could be lost to those using text-to-speech software for the alerts. Deaf and hearing-impaired participants said they would like warnings such as a strobe light or a flashing screen, as well as stronger vibrations.


Profs Stand Up to Secure Laptops
Pittsburgh Tribune-Review (12/01/08) Cronin, Mike

Wireless Internet connections are more difficult to secure than cable connections because users send out information on radio waves that travel in every direction. Hackers can find ways to identify individual wireless users, pinpoint their location, intercept information from those users, and even piggyback on their paid Internet sessions for free. University of Pittsburgh professor Jose Carlos Brustoloni says it is possible to read emails and attachments sent over wireless connections at hotspots using programs available for free on the Internet. Carnegie Mellon University (CMU) professor Srinivasan Seshan says he is willing to give up some privacy in exchange for useful services, but right now he has no idea what information he is disclosing or who can see that information when using a wireless connection. Seshan and CMU computer science doctoral student Jeffrey Pang are working with Intel on two wireless security projects. Pang is the lead student on an effort to hide a wireless user's unique address and encrypt any emails, attachments, or other forms of information. Seshan is working to build virtual fences around Wi-Fi hotspots using electronic "steerable" antennas to create areas in which only authorized users can send and receive messages. Meanwhile, CMU professor Priya Narasimhan is studying how small devices such as pacemakers and handheld devices can be embedded with enhanced security features.


What Happens When Silicon Can Shrink No More?
New Scientist (12/05/08) No. 2685, P. 35; Heber, Joerg

Moore's law postulates that the number of components squeezed onto a single silicon chip doubles about every two years. At the current rate of shrinkage, the smallest components will reach the size of a few silicon atoms by about 2020, at which point silicon will lose its microelectronic properties, forcing the industry either to find a successor to silicon or to accept an upper limit on computing power, at least until more exotic computing architectures achieve commercial viability. Wei Lu of the University of Michigan thinks a "bottom-up" approach to microelectronics, in which molecules are allowed to self-assemble into tiny structures, may be the key to extending the life of silicon. However, it is difficult to assemble uniform components such as transistors with this method, and Lu is an advocate of crossbar array technology that bypasses the replication problems. Building microelectronics via a bottom-up approach can yield materials with capabilities that silicon does not possess. Efforts to find an alternative base material to silicon hint that carbon could emerge as the ideal nanoelectronic material to continue the computer's evolution. Graphene, a material composed of sheets of carbon just one atom thick, is abundant and boasts extremely strong atomic bonds and mechanical stability. Electrons can travel along graphene's single atomic layer without striking the obstacles they would encounter in a three-dimensional structure. However, the reliable and reproducible production of graphene transistors requires mass-production fabrication technologies.
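
As a back-of-envelope check on that timeline, the snippet below projects linear feature size under the stated doubling rate, starting from an assumed roughly 45 nm process in 2008; the starting figure and the lattice-constant comparison are rough assumptions, not numbers from the article.

```typescript
// If component count per chip doubles every two years at fixed die size,
// linear feature size shrinks by about sqrt(2) per doubling. Starting from
// an assumed ~45 nm process in 2008, the projection reaches a few nanometers,
// i.e. on the order of ten silicon lattice spacings, around 2020.
const SILICON_LATTICE_NM = 0.543;  // silicon lattice constant, roughly 0.5 nm

function projectedFeatureSizeNm(startYear: number, startNm: number, year: number): number {
  const doublings = (year - startYear) / 2;          // one doubling every two years
  return startNm / Math.pow(Math.SQRT2, doublings);  // area halves, so length shrinks by sqrt(2)
}

for (let year = 2008; year <= 2020; year += 4) {
  const nm = projectedFeatureSizeNm(2008, 45, year);
  console.log(`${year}: ~${nm.toFixed(1)} nm (~${(nm / SILICON_LATTICE_NM).toFixed(0)} lattice spacings)`);
}
// 2008: ~45 nm, 2012: ~22.5 nm, 2016: ~11.3 nm, 2020: ~5.6 nm
```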


Abstract News © Copyright 2008 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]



