Association for Computing Machinery
Welcome to the December 3, 2010 edition of ACM TechNews, providing timely information for IT professionals three times a week.


F.T.C. Backs Plan to Honor Privacy of Online Users
New York Times (12/01/10) Edward Wyatt; Tanzina Vega

The U.S. Federal Trade Commission (FTC) has proposed a broad framework for the commercial use of Web data and announced its support of a plan that would let consumers choose whether their Internet browsing activity is monitored. The framework features a simple "do not track" designation similar to the national "do not call" registry. If the FTC recommendations are widely accepted, online advertising and technology companies could be forced to change their methods of collecting specific information about consumers. The FTC will likely need support from Congress to enact many of its recommendations. In the meantime, the FTC plans to create a system called "privacy by design," which would require companies to build protections into their business practices. "We'd like to see companies work a lot faster to make consumer choice easier," says FTC chairman Jon Leibowitz. "Our main concern is the sites and services that are connecting the dots between different times and places that a consumer is online and building a profile of what a consumer is doing." The online advertising industry has generally accepted the concepts of the FTC proposal, although it opposes some of the stricter measures consumer advocates prefer, according to the Interactive Advertising Bureau's Mike Zaneis. The FTC is seeking industry and public comment on the recommendations, as well as other suggestions.

U.S. Tech Lead Is at Risk, Says Obama's Top Scientist
Computerworld (12/01/10) Patrick Thibodeau

The United States is in danger of losing its technological edge amid a deterioration in its global competitiveness, warns U.S. Energy Secretary Steven Chu. Particularly alarming is the country's loss of leadership in high-technology manufacturing. Chu notes that China's share of the global tech export market rose from 6 percent in 1995 to 20 percent in 2008, while the U.S. share fell from about 25 percent in 1998 to the current 12-13 percent. He says U.S. tech innovations such as the Internet did "wonderful things" to generate wealth in the United States in previous years, and stresses the development of alternative energy vehicles, renewable energy, high-speed rail, and supercomputing as crucial to sustaining U.S. tech leadership. Chu says the United States still has the opportunity to lead the world in generating cheap, carbon-free technology, and thus to enable the country's "future prosperity." However, he cautions that "time is running out."

More Focus on Finances Needed to Increase Latino Science and Math Graduates
UCR News (CA) (12/01/10) Sean Nealon

A greater focus needs to be placed on finances in order to increase the number of Latino students graduating with science, technology, engineering, and mathematics (STEM) degrees, according to a University of California, Riverside report. The report found that STEM majors with significant financial support from family members were more likely to graduate than students with less support. U.S. President Barack Obama recently signed the Health Care and Education Reconciliation Act, which will provide $100 million annually for the next nine years to increase the number of STEM degrees awarded at Hispanic Serving Institutions (HSIs). The report offers a list of recommendations for HSIs hoping to receive federal grants, such as providing research opportunities in core subjects, increasing support for upperclassmen research experiences, and developing support programs involving industry experts and faculty members. The report divided Latino students into three groups--self-support, parental support, and balanced support. Just 26 percent of self-supported Latino students graduated from a research school, while 46 percent of those with parental support and 42 percent of those with balanced support obtained a STEM degree.

A European Network of Excellence for Large-Scale Data Management Over the Internet Is Launched
Universidad Politecnica de Madrid (Spain) (12/01/10) Eduardo Martinez

The Universidad Politecnica de Madrid (UPM) is participating in PlanetData, a European project to help researchers publish their data in a more serviceable form on a large scale over the Internet. PlanetData will offer data stream representations and scalable techniques for integrating, publishing, and assessing structured and unstructured data streams, microblog entries, digital files, e-science resources, public-sector datasets, and linked data in a cloud. The project will define best practices for capturing the context of the produced data. PlanetData will make sense of data--including provenance, and space, time, and social characteristics--as a means of increasing the effectiveness of data processing and retrieval techniques. The project also will provide strategies for assessing, recording, preserving, and improving data quality via repair techniques during information processing. In addition, the project will address issues of access control and privacy. UPM's Oscar Corcho will research the annotation of data sources to improve the processing, mining, and fusion of data resources as well as the modeling, integration, and adaptation of unknown and changing data sources.

FCC Chief Previews Proposed Net Neutrality Rules
CNet (12/01/10) Marguerite Reardon

U.S. Federal Communications Commission (FCC) chairman Julius Genachowski recently previewed new Net neutrality rules designed to preserve the Web "as a platform for innovation, investment, competition, and free expression." Broadband providers wanted guarantees that the rules' language would not bar them from managing their networks or charging different prices for different levels of service, while wireless carriers wanted wireless networks exempted from Net neutrality regulation; consumer groups opposed both measures. Genachowski's proposals would permit broadband providers to institute usage-based charges, so that customers consuming more bandwidth would be charged more than customers using less, and the FCC also will let providers experiment with offering specialized services over dedicated bandwidth that could supply higher-quality access to consumers. In addition, the new rules would ban wired broadband providers "from blocking lawful content, applications, services, and the connection of nonharmful devices to the network," and would subject them to transparency mandates regarding how their networks are managed. The transparency requirement also will apply to wireless service providers, which will be banned from blocking or degrading most traffic as well.

Web Bug Reveals Browsing History
BBC News (12/02/10)

Computer science researchers at the University of California, San Diego studied the 50,000 most visited Web sites and found that 485 of them use a browser bug to track a visitor's history. The flaw exploits the way browsers handle links users have visited, such as by changing the color of the text to reflect an earlier visit. The bug can be abused with software that resides on the site and interrogates the visitor's browser to see what it does to a given list of sites. Links that are displayed in a different color reveal that the visitor has already seen those sites. The researchers say that 63 Web sites copy the data the bug reveals and 46 hijack a visitor's history. The researchers also examined other popular techniques for mapping and monitoring what visitors do, such as running scripts that track the trail of a user's mouse pointer across Web pages. "Our study shows that popular Web 2.0 applications like mashups, aggregators, and sophisticated ad targeting are rife with different kinds of privacy-violating flows," according to the researchers, who warn that a pressing need exists to develop defenses against history hijacking.
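The logic of the attack can be illustrated with a toy simulation. This is an illustrative sketch only: a real history-sniffing page runs JavaScript in the browser and reads back the computed color of each injected link, whereas here a Python set stands in for the browser's history and a function for its link rendering; all names and URLs are hypothetical.

```python
# Typical default link colors in 2010-era browsers: blue for unvisited,
# purple for visited. The attack infers history from which color renders.
UNVISITED, VISITED = "#0000ee", "#551a8b"

def rendered_color(url, history):
    """Stand-in for reading a link's computed style in a vulnerable browser."""
    return VISITED if url in history else UNVISITED

def sniff_history(candidate_urls, history):
    """The attack: inject a list of candidate links into the page, then
    report every one whose rendered color reveals an earlier visit."""
    return [url for url in candidate_urls
            if rendered_color(url, history) == VISITED]

candidates = ["bank.example", "news.example", "mail.example"]
user_history = {"news.example"}        # hypothetical browsing history
print(sniff_history(candidates, user_history))   # ['news.example']
```

Modern browsers defend against exactly this probe by lying to scripts about the styling of visited links.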

Energy Use in the Media Cloud
University of Bristol News (12/03/10) Joanne Fryer

University of Bristol researchers are studying ways to reduce energy consumption in digital media. The researchers estimate that by 2030, overall data demand will reach 3,200 megabytes per person per day, totaling 2,570 exabytes per year worldwide. They calculate that the average power needed to support this activity would be 1,175 gigawatts at current levels of efficiency, and that a factor-of-60 improvement in efficiency would be required if the infrastructure's energy is to be provided by one percent of renewable energy capacity in 2030. If historical trends continue, this level of energy efficiency could be reached by 2021. However, new applications, such as high-definition online video and Internet radio, could require bandwidth that exceeds the researchers' estimates. The researchers offer strategies that could help reduce the demand for bandwidth, including techniques to reduce digital waste, eliminating data that is downloaded but not used, and designing Web pages that encourage users to pursue less data-heavy options. "This research suggests that in a future which is increasingly environmentally constrained, there is still a good chance that broadband connectivity can be provided equitably to the majority of the world," says Bristol's Chris Preist.
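The per-person and aggregate figures above can be related by simple unit conversion. As a back-of-envelope sketch (assuming decimal megabytes and a 365-day year; only the 3,200 MB/day and 2,570 EB/year constants come from the article), the two figures together imply a user base of roughly 2.2 billion people:

```python
MB = 1e6                                   # decimal megabytes assumed
EB = 1e18

per_person_per_day = 3_200 * MB            # bytes/person/day (article figure)
annual_total = 2_570 * EB                  # bytes/year (article figure)

# population implied by combining the two estimates
implied_population = annual_total / (per_person_per_day * 365)
print(f"{implied_population:.2e} people")          # ~2.2e9

# annual demand per person, for reference
per_person_per_year = per_person_per_day * 365 / 1e12
print(f"{per_person_per_year:.2f} TB/person/year")  # ~1.17 TB
```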

Hackers Take the Kinect to New Levels
Technology Review (12/02/10) Timothy Carmody

Massachusetts Institute of Technology (MIT) Media Lab researchers have developed DepthJS, a Chrome Web browser extension that enables users to surf the Web using the Kinect gaming device. DepthJS uses JavaScript to translate hand gestures into commands that the browser can complete. The researchers plan to use DepthJS as the interface between different Web applications and Kinect gestures. "Getting Kinect's events into the Web browser is all about lowering the cost of entry to exploring and creating applications using depth information," says MIT's Doug Fritz. "Right now we are in that state of rapid change where people are remixing familiar interaction techniques with what feels natural." Adafruit Industries recently held a competition to find software that could connect the Kinect to a normal computer. Developers have submitted videos of different Kinect applications. "These videos are really just proof-of-concepts that show some of the possibilities for further development," says Adafruit's Limor Fried. A major hurdle is translating human gestures into input computers can use. Most Kinect games overcome this issue by matching users with an onscreen avatar that mimics their movements. Another solution could be using light projectors to create virtual objects that users can interact with.

IBM Chip Breakthrough May Lead to Exascale Supercomputers
Computerworld (11/30/10) Agam Shah

IBM's new CMOS Integrated Silicon Nanophotonics technology boosts the data transfer rate between computer chips using pulses of light, a development that could increase the performance of supercomputers by a thousand times or more. CMOS Integrated Silicon Nanophotonics combines electrical and optical components on one piece of silicon. The new technology can replace the copper wires that are used in most chips today. The integrated silicon converts electrical signals into pulses of light, making the communication between chips faster, says IBM researcher Will Green. He says the photonics technology could boost supercomputing calculations to speeds approaching an exaflop, and IBM hopes to develop an exascale computer by 2020. "This is an interesting milestone for system builders [who are] looking at building ... exascale systems in 10 years," Green says. IBM also plans to use the optics technology to develop new types of transistors. "The nice thing about it is we have a platform which allows us to address many different places simultaneously," he says.

Project Pioneers Silicon-Germanium for Space Electronics
Georgia Tech News (11/30/10) John Toon

Georgia Tech researchers have developed a method using silicon-germanium (SiGe) technology to construct space electronics that could make space vehicles and instruments highly resistant to extreme temperature changes and space radiation. "The team's overall task was to develop an end-to-end solution for [the National Aeronautics and Space Administration]--a tested infrastructure that includes everything needed to design and build extreme-environment electronics for space missions," says Georgia Tech professor John Cressler. The researchers say that SiGe technology can enable lighter, smaller, more powerful, less expensive, and more complex components, in addition to providing greater reliability and adaptability. "Without changing the composition of the underlying silicon-germanium transistors, we leveraged SiGe's natural merits to develop new circuit designs--as well as new approaches to packaging the final circuits--to produce an electronic system that could reliably withstand the extreme conditions of space," Cressler says. SiGe-based electronics have been shown to function from plus-120 degrees to minus-180 degrees Celsius and can withstand different types of radiation. The SiGe technology also does not require the same amount of shielding as other alloys, which reduces overall weight.

Could 135,000 Laptops Help Solve the Energy Challenge?
U.S. Department of Energy (11/30/10)

The U.S. Energy Department (DOE) has awarded the highest allocations of supercomputing time to date to 57 research projects that will use computer modeling to execute virtual experiments that would be unworkable in the natural world, under the department's Innovative and Novel Computational Impact on Theory and Experiment program. The projects will tap Oak Ridge National Laboratory's Cray XT5 and Argonne National Laboratory's IBM Blue Gene/P supercomputers to address such challenges as improving biofuel production, accelerating the development of more efficient solar cells, and developing more effective Parkinson's disease medications. The awards feature approximately 1.7 billion processor hours on the machines, mirroring both the increasing refinement of the computer modeling and simulation discipline and the swift expansion of supercomputing capabilities at DOE's laboratories. Among the projects are both commercial and academic research efforts, including alliances with companies to employ sophisticated computer simulation in the development of improved jet engines and wind turbines.

Inside the Labs at Microsoft, HP and IBM
Network World (11/29/10) James Niccolai; Nancy Gohring; Joab Jackson

Microsoft, Hewlett-Packard (HP), and IBM each take different approaches to their research efforts. In 2007, HP hired former University of Illinois-Chicago dean of engineering Prith Banerjee, who narrowed the lab's focus from about 150 projects to about 20 "big bet" projects in areas that HP views as central to its future. "It's the research output that matters, not the amount of dollars you put in," Banerjee says. HP currently devotes about one third of its resources to exploratory research in areas such as nanotechnology and quantum computing, while another third is allocated to applied research and the rest is dedicated to research into specific products. Microsoft prides itself on being one of the few public companies that still conducts pure research. "I think of research as one of the things that we have to do and elect to do in order to ensure we survive over the long term," says Microsoft's Craig Mundie. Microsoft encourages its researchers to work together and to try to collaborate on ideas that can lead to products. Since 2002, Microsoft's research and development budget has grown from about $250 million to about $500 million per year. Research also is essential to IBM's survival, says IBM's Robert Morris. Since 2002, IBM has increased research and development spending by 21 percent, and it has received more U.S. patents than any other company for each of the last 17 years. IBM has developed an effective method for transferring research into business, a major hurdle for many technology companies.

Robots Learn to Read the Writing on the Wall
New Scientist (11/29/10) Colin Barras

University of Oxford researchers believe that the next step in robotics technology is developing artificial intelligence machines that can read. Because optical character recognition (OCR) systems already exist, adapting the technology for use in robots should be straightforward, says Oxford's Ingmar Posner. Google recently released Goggles, a smartphone application that analyzes scanned images of text and has been able to translate languages. However, Goggles requires users to identify text and point the phone's camera at the words before the OCR technology can work. "The OCR software doesn't cater for the fact that it might not be seeing text," Posner says. The Oxford team developed text-spotting software that identifies a horizontal area of uniform color just above and below an area with two-tone color variation, traits that are common in text on posters and street signs. After the software has identified the text, an image of it is analyzed by the OCR software. The team also loaded a dictionary and spell-checking system into a prototype robot, which uses news Web sites to identify and understand the names it reads. Posner says that with those systems in place, the robot is ready to read text in the world much as a human does.
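The text-spotting heuristic described above can be sketched in a few lines. This is a toy reimplementation from the article's description only, not Oxford's actual code: a grayscale image is modeled as a list of pixel rows, and a candidate "text" region is a band of high-contrast (two-tone) rows bordered above and below by near-uniform rows; the thresholds are illustrative.

```python
def row_variation(row):
    """Mean absolute difference between neighboring pixels in one row."""
    return sum(abs(a - b) for a, b in zip(row, row[1:])) / (len(row) - 1)

def find_text_bands(image, high=50, low=5):
    """Return (start, end) row ranges that look like sign or poster text:
    runs of high-contrast rows sandwiched between uniform rows."""
    var = [row_variation(r) for r in image]
    bands, start = [], None
    for i, v in enumerate(var):
        if v >= high and start is None:
            start = i                          # band begins
        elif v < high and start is not None:
            bands.append((start, i - 1))       # band ends
            start = None
    if start is not None:
        bands.append((start, len(var) - 1))
    # keep only bands with a uniform row directly above and below
    return [(s, e) for s, e in bands
            if 0 < s and e < len(var) - 1
            and var[s - 1] <= low and var[e + 1] <= low]

# synthetic 8-row image: flat background around a high-contrast "text" band
flat = [128] * 16
text = [0, 255] * 8
image = [flat, flat, text, text, text, flat, flat, flat]
print(find_text_bands(image))   # [(2, 4)]
```

Rows flagged this way would then be cropped and handed to the OCR stage.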

Abstract News © Copyright 2010 INFORMATION, INC.

To submit feedback about ACM TechNews, contact: [email protected]