Association for Computing Machinery
Welcome to the September 23, 2009 edition of ACM TechNews, providing timely information for IT professionals three times a week.


Two Young Berkeley Faculty Members Receive MacArthur 'Genius' Award
UC Berkeley News (09/22/09) Sanders, Robert

University of California, Berkeley professors Maneesh Agrawala and Lin He have been named MacArthur Fellows, which provides them with five-year, $500,000 grants to work on independent projects. Agrawala, the recipient of ACM's 2008 SIGGRAPH Significant New Researcher Award, is currently designing new algorithms and techniques to process large amounts of information. "SIGGRAPH honored Maneesh last year for his innovative approach to visualization, computer graphics, and human-computer interaction," says ACM vice president Alain Chesnais. "His work, demonstrated over recent years at various SIGGRAPH conferences, reflects the way ACM encourages its Special Interest Groups like SIGGRAPH to raise awareness of computing and its vital role in driving innovations that enrich our lives and advance society." Agrawala says he uses art as a model for data processing. "People often create images, illustrations, and photographs to convey an idea or information to other people," he says. "In this process, they ideally highlight things that are important and de-emphasize the things that are not. In my lab, we are developing algorithms that automatically focus on the key information people need to understand the message being conveyed." While working as a doctoral student in computer science at Stanford, Agrawala designed a map generator called LineDrive. The program creates simple maps that emphasize the route itself, omitting details normally included in automatically generated directions, such as the names of unrelated cities, roads, and parks. Agrawala also has designed algorithms that produce detailed instructions for projects that range from setting up software to assembling furniture or an airplane engine. Other projects include a video puppetry program for users with little animation experience and a photograph editing program.
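The core idea behind route maps like LineDrive's can be sketched in a few lines: distort segment lengths non-uniformly so that short turns stay legible next to long highway stretches. The scaling rule and names below are invented for illustration and are not Agrawala's actual algorithm.

```python
import math

# Hypothetical sketch of non-uniform route scaling: clamp each segment's
# drawn length into a narrow display range so every turn stays visible.
def simplify_route(segments, min_len=1.0, max_len=10.0):
    """Map each (road, miles) segment to a display length that preserves
    the route's order and turns while compressing its huge length range."""
    out = []
    for road, miles in segments:
        # Log scaling keeps a 0.2-mile ramp and a 120-mile freeway stretch
        # on the same drawable map.
        display = min(max_len, max(min_len, math.log1p(miles) * 3.0))
        out.append((road, round(display, 2)))
    return out

route = [("Elm St", 0.2), ("I-80 W", 120.0), ("Exit 12 ramp", 0.3), ("Main St", 1.5)]
for road, d in simplify_route(route):
    print(road, d)
```

A real route renderer must also preserve turn angles and avoid overlaps, but the length distortion above is the part that distinguishes such maps from to-scale cartography.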

President Obama Touts Role of Basic Research in Innovation
Computing Research Association (09/21/09) Harsha, Peter

In his recent address at Hudson Valley Community College in Troy, N.Y., U.S. President Barack Obama cited the need for the United States to strengthen its commitment to basic research, which he says is vital to the country's global competitiveness. He stressed that tapping the Internet's full potential is essential to U.S. innovation, and this entails faster and more wide-scale broadband deployments and the implementation of rules to guarantee that the Internet remains fair and open to all U.S. residents. Obama noted Federal Communications Commission Chairman Julius Genachowski's announcement of guidelines designed to achieve this goal. He announced his proposal of grants through the U.S. Defense Advanced Research Projects Agency and the National Science Foundation to investigate "the next communications breakthroughs, whatever they may be." Obama also said that he has established a goal of investing three percent of U.S. Gross Domestic Product in research and development, which he said has been badly neglected for decades. "When we fail to invest in research, we fail to invest in the future," Obama said. "Yet, since the peak of the Space Race in the 1960s, our national commitment to research and development has steadily fallen as a share of our national income."

FBI’s Data-Mining System Sifts Airline, Hotel, Car-Rental Records
Wired News (09/23/09) Singel, Ryan

Recently declassified documents show that the U.S. Federal Bureau of Investigation's (FBI's) National Security Branch Analysis Center (NSAC), a data-mining system that was proposed as a tool to track down terrorists, is being used in hacker and domestic criminal investigations. NSAC's database now contains more than 1.5 billion government and private-sector records on U.S. citizens and foreigners, including tens of thousands of records from private databases, according to the declassified documents. Critics say the database increasingly resembles the Total Information Awareness system first proposed by the Pentagon following the Sept. 11, 2001 terrorist attacks. The FBI is currently looking to quadruple the staff of the NSAC program. However, the proposal has been heavily criticized by privacy groups as being both invasive and ineffective, and critics say the declassified documents show that the plan is being continued in private and without sufficient oversight. NSAC contains more than 55,000 entries on customers of the Cendant Hotel chain, with entries for hotel customers whose names match those on a list the FBI provided to the company. An additional 730 records come from the Avis rental car company, collected through a one-time search of Avis' database against the State Department's terrorist watch list, and 165 entries come from credit card transaction histories from the Sears department store chain. An analysis of the documents shows that the FBI has continuously expanded the NSAC system since 2004, and by 2008, NSAC had 103 full-time employees and contractors. The FBI wants to add 71 additional employees and is seeking $8 million for outside contractors to help analyze the data.
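The one-time batch match described above, customer records screened against a supplied name list, reduces to a set-membership check. Everything below (data, field names, normalization rule) is invented for illustration and is not the FBI's or Avis' actual system; the naive exact-name matching it uses also hints at why critics call such screening error-prone for common names.

```python
# Normalize names so trivial case/whitespace differences do not hide a match.
def normalize(name):
    return " ".join(name.lower().split())

def match_records(customers, watch_list):
    """Return customer records whose normalized name appears on the list."""
    wanted = {normalize(n) for n in watch_list}
    return [c for c in customers if normalize(c["name"]) in wanted]

customers = [
    {"name": "Jane Doe", "rental": "2004-06-01"},
    {"name": "John  Smith", "rental": "2004-06-02"},
]
hits = match_records(customers, ["john smith"])
print(hits)  # matches despite case and whitespace differences
```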

New Computing Tool Could Lead to Better Crops and Pesticides, Say Researchers
Imperial College London (09/22/09)

Researchers from Imperial College London (ICL) and Syngenta have developed a computational tool that uses machine learning to find complex patterns in data about the behavior of plants. The use of machine learning means agricultural researchers will be able to analyze plant biology in minutes, even if they lack certain information about the inner workings of plants. With previous mathematical modeling tools, finding nuggets of information could take months or even years. "We believe our computing tool will revolutionize agricultural research by making the process much faster than is currently possible using conventional techniques," says ICL professor Stephen Muggleton. "We hope that our new technology will ultimately help farmers to produce hardier, longer lasting, and more nutritious crops." The researchers are currently testing a prototype of the tool, which will help agricultural scientists predict how genes inside plants will react to different chemicals or environmental conditions. For one project, the tool will be used to analyze how different genes affect the way a tomato's flesh hardens and tastes, and the findings could lead to new tomato strains that are tastier, or redden earlier and soften later to prevent them from spoiling when they are transported to market.
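The kind of pattern-finding described above, predicting a plant trait from measured features, can be illustrated with a toy supervised learner. The data, features, and 1-nearest-neighbour method here are invented; the article does not say which learning algorithm the ICL/Syngenta tool uses.

```python
# Toy 1-nearest-neighbour classifier: predict a trait label for a new
# sample from the most similar training sample.
def nearest_neighbour(train, query):
    """Return the label of the training sample closest to `query`."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(train, key=lambda row: dist(row[0], query))[1]

# (gene_expression_a, gene_expression_b) -> observed fruit firmness
train = [
    ((0.9, 0.1), "firm"),
    ((0.8, 0.2), "firm"),
    ((0.1, 0.9), "soft"),
    ((0.2, 0.8), "soft"),
]
print(nearest_neighbour(train, (0.85, 0.15)))  # -> firm
```

The appeal of machine learning in this setting is exactly what the researchers describe: the model learns the gene-to-trait association from examples without requiring a mechanistic model of the plant's inner workings.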

Minority Students Needed in Math and Science to Combat 'Brain Drain,' Professors Say
Chronicle of Higher Education (09/22/09) Nelson, Libby

Three mathematics and science professors called on the U.S. government to support institutional programs that have succeeded in attracting and retaining minority students during a recent Congressional briefing session. Arizona State University professor Carlos Castillo-Chavez said that most of the Hispanic and American-Indian students who participate in the university's math and science honors program, a high school summer residential program, pursue science majors. Spelman College professor Sylvia T. Bozeman said her school's summer programs, in addition to the recruitment, advising, and mentoring efforts of the math and science faculty, have helped to get more young black females interested in mathematics. Castillo-Chavez said the talents of U.S. citizens will need to be developed if the country is to make up for the brain drain of Chinese and Indian scientists and mathematicians who have begun to return to their home countries. He noted that institutions such as Harvard University and the Massachusetts Institute of Technology will not be able to produce all the math and science graduates that the U.S. needs. "We have to produce large numbers of extremely well-qualified scientists and mathematicians," he said. "It's not going to take place at the elite universities, but at schools with limited resources." The mathematics-education experts also said Congress should increase spending on undergraduate scholarships and for the National Science Foundation.

Rethinking the Long Tail Theory: How to Define 'Hits' and 'Niches'
Knowledge@Wharton (09/16/09)

The Long Tail theory suggests that the Internet drives demand away from popular products with mass appeal and directs it to more obscure niche offerings, by easing distribution and deploying sophisticated recommendation systems, but this theory is being challenged by new Wharton School research. A paper by Wharton professor Serguei Netessine and doctoral student Tom F. Tan details their use of data from the Netflix movie rental company to investigate consumer demand for blockbusters and lesser-known films. The researchers have determined that, contrary to the Long Tail effect, mass appeal products retain their high rankings when expanding product variety and consumer demand are factored in. "There are entire companies based on the premise of the Long Tail effect that argue they will make money focusing on niche markets," Netessine says. "Our findings show it's very rare in business that everything is so black and white. In most situations, the answer is, 'It depends.' The presence of the Long Tail effect might be less universal than one may be led to believe." The number of rated film titles at Netflix climbed from 4,470 to 17,768 between 2000 and 2005, and if this diversity is factored in so that product popularity is estimated relative to the total product variety, the Wharton researchers do not uncover any proof of the Long Tail effect. Netessine says a relative analysis yields more meaning when applying the Long Tail theory to companies because it accounts for the costs involved in maintaining a supply chain that meets demand for many niche products.
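The paper's key distinction, absolute versus relative definitions of a "hit", can be demonstrated with synthetic Zipf-like demand (the real Netflix data and the authors' exact methodology are not reproduced here). Hits defined by a fixed absolute rank capture less demand as the catalogue grows, which looks like a Long Tail effect, while hits defined relative to catalogue size need not.

```python
# Synthetic rank-based demand: title at rank r gets demand 1/r.
def zipf_demand(n_titles):
    return [1.0 / rank for rank in range(1, n_titles + 1)]

# Share of total demand captured by the top n_hits titles.
def hit_share(demand, n_hits):
    return sum(demand[:n_hits]) / sum(demand)

for catalogue in (4470, 17768):  # Netflix title counts, 2000 vs 2005
    d = zipf_demand(catalogue)
    print(catalogue,
          round(hit_share(d, 100), 3),               # absolute: top 100 titles
          round(hit_share(d, catalogue // 100), 3))  # relative: top 1 percent
```

Under this toy model the absolute-cutoff hit share falls as the catalogue grows while the relative-cutoff share does not, mirroring why the two definitions lead to opposite conclusions about the Long Tail.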

Taming the Vast--and Growing--Digital Data-Sphere
ICT Results (09/21/09)

European researchers are attempting to connect thousands of digital repositories into a massive network of easily searchable online data under the auspices of the DRIVER project. The researchers already have developed a search engine that aggregates more than 1 million open access articles from 260 of Europe's top institutions. The DRIVER system uses a suite of open source software known as D-NET, which lets users or institutions individually tailor their experience regardless of their location. D-NET enables users to gather open access content from various institutional repositories and presents the content in a manner that is uniform and openly accessible. Institutional repositories also can employ the software to plug in and make their content available from DRIVER or any other implemented portals. Thus far the DRIVER project has set up a stable platform capable of accessing any type of text document, but the researchers will develop the system to access content from any media. "This is a project that will never end, because there will always be something else to do, or new standards and technologies will emerge that need to be added to D-NET," says DRIVER coordinator Yannis Ioannidis. "We have achieved a milestone, but it is the first of many."
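The aggregation idea behind a system like D-NET can be sketched as a mapping step: harvest records from repositories that each use their own field names and map them onto one uniform schema so a single search index can serve them all. The field mappings and records below are invented; D-NET's real schema and harvesting protocol are not shown.

```python
# Map one repository's record onto a uniform schema via a field mapping
# from uniform field name -> that repository's field name.
def normalise(record, field_map, repository):
    out = {uf: record.get(sf, "") for uf, sf in field_map.items()}
    out["repository"] = repository
    return out

# Two hypothetical repositories with different native schemas.
repo_a = [{"dc_title": "Open Data in Europe", "dc_creator": "A. Ruiz"}]
repo_b = [{"headline": "Text Mining at Scale", "byline": "K. Novak"}]

index = (
    [normalise(r, {"title": "dc_title", "author": "dc_creator"}, "repo-a") for r in repo_a]
    + [normalise(r, {"title": "headline", "author": "byline"}, "repo-b") for r in repo_b]
)

# One search now spans every harvested repository.
def search(index, term):
    return [r for r in index if term.lower() in r["title"].lower()]

print(search(index, "mining"))
```

Real repository networks of this era typically exposed records over a harvesting protocol such as OAI-PMH; the transport layer is omitted here to keep the normalisation step visible.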

How the Netflix Prize Was Won
Wired News (09/22/09) Van Buskirk, Eliot

The secret to the success of BellKor's Pragmatic Chaos and The Ensemble, first- and second-place winners of the Netflix Prize, was teamwork. The Netflix Prize offered $1 million to the first team that could improve the accuracy of Netflix's movie recommendation algorithm by 10 percent. Both of the winning teams combined the strengths of several smaller teams that had worked independently before being absorbed by a larger group. "In combination, these teams could get better and better and better," says Netflix's Neil Hunt. BellKor's Pragmatic Chaos and The Ensemble independently combined their members' algorithms to design more complex ones that represented everyone's input. Rather than appointing a few team leaders who did most of the designing, the teams say they worked communally and thus increased their strength--even if some of the ideas seemed unrelated to the initial problem. One of the algorithms BellKor's Pragmatic Chaos used tracked the number of movies a Netflix viewer rated in one day. People who rate a large number of movies at once have most likely seen them a while ago, which affects their judgment. Although this data would not have contributed to the team's success alone, when combined with other algorithms it increased the group's performance. The Big Chaos team, which was incorporated into BellKor's Pragmatic Chaos, found that users rate movies more negatively or positively depending on the day. By providing an algorithm that took this seemingly irrelevant data into account, Big Chaos helped the winning team succeed. "One of the big lessons was developing diverse models that captured distinct effects, even if they're very small effects," says The Ensemble's Joe Sill.
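The blending idea described above, combining several imperfect predictors into one stronger one, can be shown in miniature. The weights here are fit by a simple grid search on RMSE; the winning teams' actual blends (hundreds of models with far more elaborate combiners) are not reproduced.

```python
# Weighted average of per-item predictions from several models.
def blend(predictions, weights):
    return [sum(w * p for w, p in zip(weights, row)) for row in zip(*predictions)]

# Root-mean-square error, the Netflix Prize's accuracy measure.
def rmse(pred, truth):
    return (sum((p - t) ** 2 for p, t in zip(pred, truth)) / len(truth)) ** 0.5

truth   = [4.0, 3.0, 5.0, 2.0]           # hypothetical true star ratings
model_a = [4.4, 2.5, 4.6, 2.6]           # one model's predictions
model_b = [3.6, 3.4, 5.2, 1.6]           # another model, errors roughly opposite

# Grid-search the blend weight; the blend beats either model alone
# precisely because the two models' errors partly cancel.
best = min(
    ((w, rmse(blend([model_a, model_b], (w, 1 - w)), truth))
     for w in (i / 10 for i in range(11))),
    key=lambda wr: wr[1],
)
print(best)
```

This is why even a weak signal, such as the rated-in-one-day counts, helps: any model whose errors differ from the rest of the blend lowers the combined RMSE a little.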

Controlling the Language of Security
Science Centric (09/19/09)

A security policy specification that guarantees the reliability and availability of home networks has been developed by computer scientists at Kyungpook National University and the Electronics and Telecommunications Research Institute in Korea. "Whenever a new access to the home network is found, it should be able to authenticate and authorize it and enforce the security policy based on rules set by the home administrator," the researchers say. The researchers developed the Home security Description Language (xHDL), which includes the notation needed to consistently describe and specify the security policy, and ultimately to secure a home network. xHDL consists of combining-rule, authentication, user, object, object-group, role, and rule elements. Each term could be used to run a browser-based control center. The home administrator would have simple controls for granting specific devices access to the home network and for controlling the packets of information that pass through the gateway to and from the Internet. xHDL would protect home networks from cyberattacks and help ensure that they remain available for use.
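A policy written with the elements listed above might be evaluated as sketched below. The XML schema, attributes, and deny-overrides logic are hypothetical illustrations built from the element names in the summary; the actual xHDL specification is not shown.

```python
import xml.etree.ElementTree as ET

# Hypothetical xHDL-like policy: users carry roles, objects belong to
# groups, and rules grant or deny a role access to an object group.
POLICY = """
<policy combining-rule="deny-overrides">
  <user name="guest"  role="visitor"/>
  <user name="parent" role="admin"/>
  <object name="door-lock" group="critical"/>
  <rule role="admin"   object-group="critical" action="allow"/>
  <rule role="visitor" object-group="critical" action="deny"/>
</policy>
"""

def is_allowed(policy_xml, user, obj):
    root = ET.fromstring(policy_xml)
    role = {u.get("name"): u.get("role") for u in root.iter("user")}[user]
    group = {o.get("name"): o.get("group") for o in root.iter("object")}[obj]
    decisions = [r.get("action") for r in root.iter("rule")
                 if r.get("role") == role and r.get("object-group") == group]
    # deny-overrides: any matching deny rule wins over any allow.
    return bool(decisions) and "deny" not in decisions

print(is_allowed(POLICY, "parent", "door-lock"))  # True
print(is_allowed(POLICY, "guest", "door-lock"))   # False
```

Indirecting access decisions through roles and object groups, rather than naming every device pair, is what keeps a browser-based control center simple enough for a home administrator.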

Too Scary to Be Real, Research Looks to Quantify Eeriness in Virtual Characters
Indiana University (09/22/09) Chaplin, Steve

Indiana University professor Karl MacDorman, director of the Indiana University-Purdue University Indianapolis Android Science Center, is researching human photorealism in robots, androids, and computer-generated characters. He questions whether people actually are unsettled by slight imperfections in human-looking technology, a reaction known as the uncanny valley. Developing perfectly human-like computer-generated characters that do not fall into the uncanny valley is the ultimate goal for animators and video game companies. MacDorman has discovered that many of the traditional rules of art school do not apply to computer-generated characters. "For example, artists will typically enlarge the eyes of someone they paint to make the person look more attractive," he says. "But our results indicate that eye enlargement can have a decidedly negative effect when photorealistic textures are applied to a human model." The same is true of other traditional facial distortions, such as eye separation and face height. MacDorman has devised an eeriness index to help quantify how virtual characters influence human decision making. He believes that developing design principles for human-looking characters would not only have a significant economic impact on computer animation and video games, but could be a major part of developing long-term, beneficial relationships between robots and humans.

Moving Toward a New Vision of Education
Plataforma SINC (09/15/09)

Researchers from the University of the Basque Country have finished a 10-year study examining the effects of information and communications technologies (ICT) in the classroom. Study co-author Asun Martinez says "the studies carried out at compulsory education level were not able to show the transformation and improvement of learning in schools that had been promised as a result of incorporating technology into the classroom." Martinez says the problem is that ICTs are being used to supplement traditional teaching rather than helping it evolve. Instead of encouraging innovation, they have become another tool for the laborious task of memorization. "We started this work stressing that ICTs have the potential to spread knowledge beyond the physical limits of time, space, and the people to whom one has access within the four walls of the classroom," the researchers write in their study. Since ICTs are not being used in the spirit of student creativity and autonomy, the researchers propose using ICTs in the way they were designed--to challenge old assumptions and promote personal research and growth. To change how ICTs are used in school, the researchers say educators must reevaluate the current classroom structure. By offering students learning opportunities not just in the classroom, but outside it, schools would be "better adapted to the needs of society today," the study concludes.

IBM Develops Denser, Faster Chip Memory
InformationWeek (09/21/09) Gonsalves, Antone

IBM says it has developed a prototype of the densest and fastest on-chip memory. The prototype consists of embedded dynamic random access memory (eDRAM) that is integrated on the same die as a multicore processor. IBM used 32-nanometer, silicon-on-insulator (SOI) technology that protects the transistors on the chip with a layer of insulation, which reduces electrical leakage. The company says the SOI technology improves chip performance by 30 percent and reduces power consumption by 40 percent, compared to standard silicon technology. The eDRAM cell is up to four times as dense as 32-nm embedded transistor-based static random access memory, which means it can help produce smaller, more efficient processors that can process more data. IBM says the embedded memory has latency and cycle times of less than two nanoseconds. The new memory chip will be used in next-generation 32-nm processors, and IBM says it will improve the performance and energy savings of servers, printers, storage, and networking applications as well as mobile, consumer, and game applications. IBM will offer the 32-nm SOI technology to a wide range of application-specific integrated circuitry and foundry clients and use it in chips for its own servers.

Abstract News © Copyright 2009 INFORMATION, INC.
Powered by Information, Inc.

To submit feedback about ACM TechNews, contact: [email protected]

Change your Email Address for TechNews (log into myACM)