Association for Computing Machinery
Welcome to the June 29, 2009 edition of ACM TechNews, providing timely information for IT professionals three times a week.

HEADLINES AT A GLANCE


And the Winner of the $1 Million Netflix Prize (Probably) Is ...
New York Times (06/26/09) Lohr, Steve

After three years and more than 50,000 entries, a multinational team claims that it has met the requirements of Netflix's $1 million contest to develop a powerful algorithm that is at least 10 percent better at suggesting movies than the Cinematch software the company currently employs. The accuracy of Netflix's recommendations has a major impact on the service's appeal to its customers, so the movie rental service started a contest in October 2006, offering $1 million to the first contestant to improve its recommendation system's accuracy by at least 10 percent. A coalition of four teams, calling itself BellKor's Pragmatic Chaos, which includes statisticians, machine-learning experts, and computer engineers from the United States, Austria, Canada, and Israel, declared that it has developed a program that improves accuracy by 10.05 percent. Following the rules of the contest, other contestants have 30 days to try to do better. The contest has been praised as an example of prize economics and the crowdsourcing of innovation. BellKor's Pragmatic Chaos is a collection of the 2007 and 2008 winners of the Netflix Progress Prizes, which were awarded annually to the teams that made the most progress toward the 10 percent improvement objective. "What we've seen is that collaboration has taken hold," says Netflix's Steve Swasey. "They realized how difficult the challenge is, and they have assembled people with complementary skills."
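The contest scored entries by root-mean-square error (RMSE) on held-out ratings, with the prize requiring at least a 10 percent reduction relative to Cinematch. A minimal sketch of that scoring arithmetic, with purely illustrative RMSE values (not the official contest figures):

```python
import math

def rmse(predicted, actual):
    """Root-mean-square error between predicted and actual ratings."""
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual))

def improvement_over_baseline(candidate_rmse, baseline_rmse):
    """Percentage reduction in RMSE relative to the baseline system."""
    return 100.0 * (baseline_rmse - candidate_rmse) / baseline_rmse

# Illustrative numbers: a candidate RMSE of 0.8567 against a baseline of
# 0.9525 would land just over the 10 percent threshold.
improvement_over_baseline(0.8567, 0.9525)
```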


Computer Failures Are Probed in Jet Crash
Wall Street Journal (06/27/09) P. A1; Pasztor, Andy; Michaels, Daniel

Aviation investigators looking for a cause of the crash of Air France Flight 447 believe that a rapid chain of computer and equipment failures may have stripped the flight crew of the airplane's automation technology, which pilots generally rely on to control large jets. The possible scenario behind the jet's crash starts with malfunctioning airspeed sensors and rapidly evolves to what appears to be widespread computer failures, according to people familiar with the investigation. The initial physical evidence recovered from the crash, and automatic maintenance messages sent by the aircraft, indicates that the plane bucked through heavy turbulence caused by a thunderstorm without the full protection of its flight-control systems, which experts say many pilots now take for granted. Investigators believe the pilots, having only backup instruments, had difficulty restarting flight-management computers, and the plane may have started breaking up due to excessive speed. The investigators stress that it is too early to determine any specific causes. Regardless of the final findings, the crash is already prompting some flight safety experts to question whether pilots receive enough training to handle widespread flight-computer failures. Most modern jetliners are heavily automated, and pilots generally monitor instruments and rarely intervene with the controls. Many safety experts question how proficient most crews would be at falling back on less sophisticated backup systems when computer failures occur in today's increasingly computerized jetliners.


Metrorail Crash May Exemplify Automation Paradox
Washington Post (06/29/09) P. A9; Vedantam, Shankar

The fatal collision of two trains on Washington, D.C., Metro's Red Line may come to symbolize the core problem of automation, which is the relationship between humans and their automated control systems. "The better you make the automation, the more difficult it is to guard against these catastrophic failures in the future, because the automation becomes more and more powerful, and you rely on it more and more," says University of Wisconsin at Madison professor John D. Lee. As such systems become more reliable, the greater the likelihood that supervising humans will become less focused, which makes it increasingly probable that unanticipated variables will tangle up the algorithm and lead to disaster. The University of Toronto's Greg Jamieson notes that many automated systems explicitly instruct human operators to disengage, as they are designed to remove human "interference." "The problem is when individuals start to overtrust or over-rely or become complacent and put too much emphasis on the automation," he says. Lee, Jamieson, and George Mason University psychologist Raja Parasuraman say there is growing agreement among experts that automated systems should be designed to augment the accuracy and performance of human operators rather than to replace them or make them complacent. A number of studies illustrate that operators can retain their alertness and skills through regular training exercises in which they switch from automated to manual control. Parasuraman has determined that "polite" feedback from a machine can enhance the machine-operator relationship to facilitate measurable safety improvements.


U.S. and Russia Differ on a Treaty for Cyberspace
New York Times (06/28/09) P. A1; Markoff, John; Kramer, Andrew E.; Wong, Edward; et al.

The United States and Russia disagree about the best way to shield computer systems and the Internet from the growing menace of cyberattacks, with Russia favoring an international pact akin to those negotiated for chemical weaponry and the United States preferring better cooperation between international law enforcement organizations. Russia's proposed treaty would prohibit a country from clandestinely incorporating malicious codes or circuitry that could be later triggered remotely in the event of war. "We really believe it's defense, defense, defense," says an anonymous official of the U.S. State Department. "They want to constrain offense." U.S. officials are particularly opposed to agreements that would permit governments to censor the Internet, arguing that they would provide cover for repressive regimes. They also are concerned that a treaty would be ineffective because determining if a cyberattack is perpetrated by a government, a hacker loyal to that government, or an independent rogue agent is nearly impossible. U.S. officials say the discord over the proper cyberdefense approach has impeded global law enforcement cooperation, especially since a substantial number of the assaults against U.S. government targets originate from China and Russia. The Russians, meanwhile, perceive the lack of an accord as encouraging a cyberarms race. The Pentagon intends to set up a military cybercommand to get ready for both offensive and defensive cyberwarfare.


CIFellows Status Report
Computing Community Consortium (06/27/09) Lee, Peter; Lazowska, Ed

The Computing Innovation Fellows (CIFellows) Project has received 526 applications for CIFellowships from 145 distinct colleges and universities. The applications reveal a total of 949 different applicant-mentor pairs, with mentors from 198 universities, companies, and non-profits. Twenty-seven percent of the applicants are female, 42 percent are U.S. citizens, 5 percent are permanent residents, and 6 percent come from an underrepresented racial/ethnic group. AI/Machine Learning/Robotics/Vision was the leading research subdiscipline for the applicants at 21 percent, followed by Networks/Operating Systems at 9 percent, Scientific/Medical Informatics at 8 percent, Hardware/Architecture at 7 percent, HCI/CSCW at 7 percent, and Information Assurance/Security/Privacy/Cryptography at 7 percent. More than 1,200 people have expressed interest in hosting a CIFellow. The Selection Committee has begun the review process, and the final decisions could be made as early as July 10.


Designers to Share Real-World Experiences at the 46th Design Automation Conference User Track
Business Wire (06/25/09)

The new User Track at the 46th Design Automation Conference (DAC) technical program will feature more than 80 technical papers and posters presented by designers from around the world. The User Track will focus on innovations in tool use and design methodologies throughout the design process, from system design exploration and embedded software synthesis in the front end to constraint generation and physical verification at the back end. User Track co-chair Leon Stok says the program will give designers a unique opportunity to learn from other designers how design tools can be applied to solve complex design problems. "Based on the incredible industry response and exciting lineup of presentations, we anticipate that User Track will be a very valuable part of DAC this year for all who participate," says Tufts University's Soha Hassoun, the other co-chair. The User Track features nine sessions over three days, covering a variety of front- and back-end design processes, such as power planning and analysis and real-world timing analysis. A Front-End Power Planning and Analysis session will feature a NEC presentation on its use of an automated flow to pre-characterize the power consumption of a set of basic components. The Timing Analysis in the Real World session will have representatives from Atmel Corp., the European Space Agency, FishTail Design Automation, Fujitsu, IBM, Sun Microsystems, and Texas Instruments, who address issues involved in mixing gate-level and transistor-level timing analysis, among others. DAC, co-sponsored by ACM, takes place July 26-31 in San Francisco.


Graduate Science Enrollment Rises, Bringing More Diversity
InformationWeek (06/25/09) Claburn, Thomas

Enrollment in graduate science and engineering (S&E) programs has risen to new levels, including greater percentages of non-White ethnic groups and women, according to a new National Science Foundation (NSF) report. The report says the recent growth toward ethnic and racial diversity represents the largest change in the demographic composition of S&E graduate students in the United States. White, non-Hispanic students accounted for 71 percent of all U.S. citizens and permanent residents enrolled in these programs in 2000, according to the report, but only 66 percent in 2007. The NSF report also says that enrollment in U.S. S&E programs grew by about 3.3 percent in 2007, the largest annual growth rate since 2002, and almost double the 1.7 percent growth rate in 2006. The number of post-doctoral appointments at academic institutions also reached a new record, about 36,000, up from about 30,000 in 2001. The proportion of men to women among U.S. citizens and permanent residents enrolled in U.S. S&E programs was 52 percent to 48 percent, and among foreign students men outnumbered women 66 percent to 34 percent. U.S. citizens and permanent residents represented the majority of graduate students, according to the report, but the majority of postdoctoral appointments, 58 percent, were given to temporary visa holders.


The Grill: Using Computer Models to Predict War
Computerworld (06/22/09) Forrest, Sara

New York University professor Bruce Bueno de Mesquita has developed a computer model that can forecast the outcomes of international conflicts, and the U.S. Defense Department has found the model very useful. De Mesquita says the model begins by assuming that everyone is interested in two dimensions on any policy issue--getting the outcome as close to what they desire as possible, and getting credited as playing an essential role in reaching or thwarting an agreement. "The model estimates the way in which individual decision-makers trade off between credit and policy outcomes," he notes. De Mesquita says the model has been generally welcomed by people more oriented toward quantitative modeling, while those who tend to focus on area studies or historical case study analysis have been less receptive. "The problems I look at with my model typically involve many dozens of players, sometimes more than 200," he says. "There is no way to construct biased data to produce a desired outcome except to make the data appear transparently wrong to anyone looking at the data." The model is founded on game theory, and de Mesquita points out that advances in computing power have made this kind of modeling unrestrained by memory or processing limitations.
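The policy/credit trade-off described above can be sketched as a toy expected-utility calculation. Everything here (the function names, the [0, 1] position scale, the capability-times-salience weighting) is an illustrative assumption in the spirit of expected-utility bargaining models, not de Mesquita's actual model:

```python
def actor_utility(position, outcome, credit, salience):
    """Toy trade-off: an actor values both a policy outcome near its own
    position and credit for brokering (or blocking) the deal.
    Positions and outcome are scaled to [0, 1]; salience in [0, 1]
    weights the policy payoff against the credit payoff."""
    policy_payoff = 1.0 - abs(position - outcome)
    return salience * policy_payoff + (1.0 - salience) * credit

def weighted_forecast(actors):
    """First-cut forecast of the issue outcome: the mean of actor
    positions weighted by capability * salience, a common ingredient
    of expected-utility models of multi-player bargaining."""
    total = sum(a["capability"] * a["salience"] for a in actors)
    return sum(a["position"] * a["capability"] * a["salience"]
               for a in actors) / total
```

With dozens or hundreds of players, a full model iterates pairwise challenges and position shifts from a starting point like this weighted mean; the sketch shows only the base arithmetic.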


Holt: When it Comes to Voting, a Paper Ballot System Is a Must
NewJerseyNewsroom.com (06/22/09) Lagomarsino, Andy

New Jersey Rep. Rush Holt (D) recently reintroduced the Voter Confidence and Increased Accessibility Act, a bill that would create a national voting standard that would require paper-ballot voting systems and accessible ballot-marking devices coupled with routine random audits of electronic voting tallies. "Congress should pass a national standard ensuring that all voters can record their votes on paper and requiring that in every election, randomly selected precincts be audited," Holt says. In every federal election since 2003, when the Help America Vote Act was enacted, citizen watchdog groups have collected information on voting machine failures. In 2004, the Election Incident Reporting System received more than 4,800 voting machine complaints from all but eight states, and in 2006 a sampling of voting machine problems gathered by election integrity groups and the media exposed more than 1,000 incidents in more than 300 counties in all but 14 states. In 2008, the Our Vote Live hotline received almost 2,000 voting machine problem reports in all but a dozen states, and 19 states conducted completely unauditable elections. Paperless electronic voting is preferred by many election officials, but it is unverifiable and unauditable, and computer scientists say that computers are unreliable without an independent audit mechanism. "The clear trend is towards paper ballots," says Holt. "In fact, every jurisdiction that has chosen to change its voting system since 2006 has chosen to use paper ballots with optical scan counting. That should be the standard."
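The random precinct audit the bill calls for can be sketched in a few lines; the function name and fixed-percentage design below are illustrative assumptions, showing only the simplest form of such an audit:

```python
import random

def select_audit_precincts(precincts, fraction, seed=None):
    """Pick a random sample of precincts whose paper ballots will be
    hand-counted against the electronic tallies (a simple
    fixed-percentage audit). At least one precinct is always chosen."""
    rng = random.Random(seed)
    k = max(1, round(len(precincts) * fraction))
    return sorted(rng.sample(precincts, k))
```

Real audit designs can go further, e.g. sizing the sample by the reported margin so that close races get a deeper hand count; a fixed percentage is the minimal version.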


IBM Aims for a Battery Breakthrough
BusinessWeek (06/23/09) Hamm, Steve

IBM has announced a multiyear effort to increase the performance of rechargeable batteries 10-fold, with the goal of designing batteries that enable electric vehicles to travel 300 to 500 miles on a single charge. Currently, electric vehicles can only go about 50 to 100 miles before needing to recharge. "We want to see if we can find a radically different battery technology," says Chandrasekhar Narayan, manager of the Science & Technology Organization at IBM Research's Almaden lab. IBM is leading a consortium that will develop batteries using lithium and oxygen instead of the potentially combustible lithium-ion mix that is commonly used in consumer electronics and early electric vehicle batteries. The new batteries also could be used to store energy in electric grids. Industry and government leaders have called for this type of effort due to concern that the U.S. will miss out on the switch from gasoline to electricity as the primary power source for light vehicles. The concern is that the U.S. dependency on the Middle East for oil will be replaced by a new dependency on Asia for batteries. "We lost control of battery technology in the 1970s," says former Intel chairman Andy Grove. "Battery technology will define the future, and if we don't act quickly it will go to China and Japan." IBM expects approximately 300 top scientists and battery experts to attend a conference on the project scheduled for late August. Narayan says that IBM's expertise in nanotechnology, materials science, chemistry, and supercomputing makes it well-positioned to lead the project, and he says, "we'll know in two years if there are any show-stoppers."


Less Fuss, More Muscle in Quantum Data Transfer
ANU News (06/22/09) Cox, Penny

Australian National University (ANU) researchers have discovered a more efficient way to use light to convey information. The approach to generating quantum entanglement, or coding information in the physical relationship between two objects, uses fewer light beams and components. "Until now, the amount of information that could be conveyed using optical entanglement was limited by levels of complexity," says lead researcher Jiri Janousek from ANU's ARC Center of Excellence for Quantum-Atom Optics. "The ability to scale up information transfer is hampered by the fact that you need to increase the number of nonclassical light sources, splitters, and receivers each time you want to add another channel of information." The researchers' findings on mode manipulation in light indicate that only one light source and one receiver are needed for optical entanglement, which suggests that it would be easier to scale up for conveying more information channels. Janousek sees the approach playing a role in the development of quantum technologies such as quantum communication and information processing, and even quantum computers.


Report Calls for Grassroots But Comprehensive Changes
Science (06/19/09) Vol. 324, No. 5934, P. 1498; Mervis, Jeffrey

A $1.5 million study from the Carnegie Corp. of New York focuses on weaknesses in U.S. math and science education. The report calls for more comprehensive math and science content, higher standards and evaluation, improved training for educators, and more innovative institutions. The success of this initiative requires participation from all stakeholders, including business leaders, politicians, principals, and professors. The report supports current reform initiatives, which include a 46-state consortium that aims to develop a common set of "fewer, clearer, and higher" reading, math, and science standards. At the study's rollout, Carnegie commissioners noted that the improvement of science, technology, engineering, and mathematics education will demand far more than simply requiring yearly student progress on reading and math tests. "Even though the target is better math and science education, you probably can't achieve it without looking at the entire system," said commissioner Katherine Ward. The study supports a strategy that involves scaling up what U.S. Education Secretary Arne Duncan terms local "islands of excellence" such as Urban Advantage, a program that taps the resources of New York City museums to teach middle school science. Among the things the program does is fulfill a requirement that all eighth-graders in the New York public schools carry out a long-term scientific investigation.


Abstract News © Copyright 2009 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]



