Association for Computing Machinery
Welcome to the November 5, 2008 edition of ACM TechNews, providing timely information for IT professionals three times a week.

HEADLINES AT A GLANCE


E-Voting: What Will It Take for a Smooth Election?
IDG News Service (11/04/08) Gross, Grant

Only a few problems were reported with the voting machines used in Tuesday's U.S. election, mostly involving touch-screen and optical-scan equipment. U.S. Election Assistance Commission (EAC) chairwoman Rosemary Rodriguez is confident that voting systems can be improved and that elections can run more smoothly. The EAC launched a new program to certify the integrity of e-voting machines early in 2007, but the commission has yet to certify any machines. Rodriguez says the EAC is taking its time to make sure its certification program is extensive and focused on the right issues. However, a major question is whether fixing voting systems will be a top priority, says ACM U.S. Public Policy Committee Chairman Eugene Spafford. "The question comes down to, how much are we willing to spend, and how confident do we want to be in the results?" Spafford says. In some cases, problems with optical-scan machines appeared to be connected to rain, which moistened the paper ballots and caused machines to jam. In other places, voters questioned why their ballots were put in boxes instead of being scanned immediately. Spafford says many of the problems are the result of poor poll-worker training. In some cases, problems with voting machinery were compounded by high voter turnout, a strain that was eased in states that offered early voting. Spafford says that in the rush to replace voting systems after the problems of the 2000 election, many states adopted unproven technology.


Wall Street's Extreme Sport
New York Times (11/05/08) P. B1; Lohr, Steve

Finance experts and economists say the current economic woes are largely the result of mathematical risk models, put together by financial engineers, that ignored the human factor. "The technology got ahead of our ability to use it in responsible ways," says Andrew W. Lo with the Massachusetts Institute of Technology's Sloan School of Management. The application, understanding, and management of the risk models were flawed, and warnings to that effect from some quantitative finance analysts years ago fell on deaf ears as the market boom encouraged more and more trading of sophisticated securities, causing debt to pile up. "Complexity, transparency, liquidity, and leverage have all played a huge role in this crisis," notes Capital Market Risk Advisors president Leslie Rahl. "And these are things that are not generally modeled as a quantifiable risk." Math, statistics, and computer modeling also apparently failed to adequately calibrate the lending risk on individual mortgage loans. In recent years lenders have been spurred to migrate to automated underwriting systems and to depend chiefly on computerized credit-scoring models rather than human assessment. The quantitative models usually stem from academia, where the stress is on problems that can be solved, proved, and published--an approach that analysts say is poorly suited to the messy realities of financial markets.


Harnessing Network Anarchy for the Common Good
ICT Results (10/31/08)

European researchers are working on the Delis project, an effort to solve the problems inherent in keeping the Internet operating efficiently while maintaining users' personal freedoms. "Our purpose was not to change the Internet but to understand it," says project coordinator Friedhelm Meyer auf der Heide, a computer science professor at the University of Paderborn in Germany. Delis researchers combined the algorithmic techniques used in computer science with insights from biological and social behavior studies, statistical physics, economics, and game theory to develop better methods of managing such networks. Delis researchers already have created a search engine based on peer-to-peer (P2P) sharing techniques, a new network management system for Internet providers, and a spam database. Large-scale networks such as the Internet have become so big that they can no longer be deployed and managed using traditional techniques, Meyer auf der Heide says. To solve this problem, Delis researchers have developed a new approach partially based on P2P networks. The strength of P2P networks is that they lack a central server or router; instead, each computer acts as a shared server on the network. In theory, traffic is balanced evenly across a large network of peers and the network is resilient to failure, allowing it to handle huge amounts of data in a distributed, self-organizing manner. The new approach gives users direct control over which aspects of their online behavior can be collected and forwarded to other users to help in their searches.
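
The article does not say which algorithms Delis uses, but the server-less load balancing described above is commonly achieved with consistent hashing: each peer owns a slice of a hash ring, every query or data item is routed to the nearest peer on that ring, and a failed peer disturbs only its own slice. The sketch below is a hypothetical illustration of that general technique, not Delis code; the peer names and the FNV-1a hash are arbitrary choices.

```cpp
// Hypothetical sketch of peer-to-peer load balancing via consistent hashing.
// Not Delis project code; peer names and hash choice are illustrative only.
#include <cstdint>
#include <iostream>
#include <map>
#include <string>

// FNV-1a: a simple, well-known 64-bit hash, used here only for illustration.
uint64_t fnv1a(const std::string& s) {
    uint64_t h = 14695981039346656037ULL;
    for (unsigned char c : s) {
        h ^= c;
        h *= 1099511628211ULL;
    }
    return h;
}

class HashRing {
public:
    void addPeer(const std::string& peer) { ring_[fnv1a(peer)] = peer; }
    void removePeer(const std::string& peer) { ring_.erase(fnv1a(peer)); }

    // Route a key (e.g., a search term) to the first peer at or after its hash,
    // wrapping around the ring; no central server is consulted.
    const std::string& peerFor(const std::string& key) const {
        auto it = ring_.lower_bound(fnv1a(key));
        if (it == ring_.end()) it = ring_.begin();  // wrap around the ring
        return it->second;
    }

private:
    std::map<uint64_t, std::string> ring_;  // hash position -> peer id
};

int main() {
    HashRing ring;
    ring.addPeer("peer-A");
    ring.addPeer("peer-B");
    ring.addPeer("peer-C");

    // Each query lands on some peer; if a peer leaves, only its keys move.
    const std::string queries[] = {"query:delis", "query:p2p", "query:spam"};
    for (const std::string& q : queries)
        std::cout << q << " -> " << ring.peerFor(q) << "\n";

    ring.removePeer("peer-B");
    std::cout << "after peer-B fails: query:p2p -> "
              << ring.peerFor("query:p2p") << "\n";
}
```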


FCC Expands Use of Airwaves
Washington Post (11/05/08) P. D1; Kang, Cecilia

The U.S. Federal Communications Commission (FCC) has approved a plan that allows unused wireless spectrum, known as white space, to be used by electronic devices to connect to the Internet once the spectrum becomes available after the transition from analog to digital TV in February. Critics say that using white space spectrum could cause interference with broadcast channels and wireless microphones used for public speaking and performances. However, supporters of the plan, including many technology companies, say it will open more wireless technologies for consumers. FCC Chairman Kevin J. Martin says the FCC normally adopts prospective rules about interference and then certifies devices to ensure they are in compliance, but in this case the FCC took "the extraordinary step" of conducting interference testing first to prove the concept that white space devices can be safely deployed. Tech firms describe the spectrum as Wi-Fi on steroids, and say the industry will benefit as more people use it to access the Internet through smart phones, digital music and video players, laptops, and other devices. Public interest groups praised the decision, saying the use of white space would provide an alternative to services offered by wireless companies.


ES&S Voting Machines in Michigan Flunk Tests, Don't Tally Votes Consistently
Wired News (11/03/08) Zetter, Kim

Optical-scan voting machines manufactured by Election Systems & Software (ES&S) failed recent pre-election tests in Oakland County, Mich., creating different tallies for the same ballots tested multiple times, according to county election officials. The problems occurred during logic and accuracy tests prior to the election, says Oakland County clerk Ruth Johnson in a letter submitted to the U.S. Election Assistance Commission. The problematic machines are ES&S M-100 optical-scan machines, which read and tally election results from paper ballots. In the letter, Johnson expressed concern that such problems, tentatively linked to a paper dust build-up in the machines, could affect the integrity of the election results. "The same ballots, run through the same machines, yielded different results each time," Johnson wrote. "This begs the question--on Election Day, will the record number of ballots going through the remaining tabulators leave even more build-up on the sensors, affecting machines that tested just fine initially? Could this additional build-up on voting tabulators that have not had any preventative maintenance skew vote totals? My understanding is that the problem could occur and election workers would have no inkling that ballots are being misread." ES&S touch-screen and optical-scan machines were responsible for counting 50 percent of the votes in the last four major U.S. elections, according to ES&S, which has optical-scan machines deployed in 43 states. Johnson says that eight percent of county communities reported inconsistent vote totals during logic and accuracy tests with the ES&S machines, and that conflicting vote totals also surfaced in other areas of Michigan.


Stanford Conference Explores the State of AI
Gamasutra (11/03/08) Kline, Dan

International artificial-intelligence (AI) developers and researchers showcased their latest work at the annual Artificial Intelligence and Interactive Digital Entertainment (AIIDE) Conference at Stanford University. The AIIDE conference focuses on bringing AI game programmers together with AI academics to share research findings and discuss future collaboration opportunities. The three-day conference featured 30 talks on a variety of topics, including non-player character reasoning, designer tools, and procedural dialogue creation. Steve Rabin, Nintendo developer and AI Game Programming Wisdom editor, gave an overview of where game AI comes from, where it is going, and why conferences such as AIIDE are important. Rabin cited several recent landmark developments, including Nintendo's universal speech recognition technology; F.E.A.R. and its enemy-planning system; Forza's use of neural nets to control cars; and Facade's interactive story and language-processing technology. Rabin also discussed the challenges the industry faces, including rising costs, greater risks, and the difficulty of finding AI strong enough to use the growing CPU power now available. Borut Pfeifer, lead AI programmer on Electronic Arts LA's Project LMNO, focused on the problem of creating believable human characters. Pfeifer wants characters to be understandable and intelligent, and identified six key human qualities AI should express: conflicting motives, reactions, attention, meaning, emotional simulation, and physicality.


Cracking Open Internet Hardware
Technology Review (11/05/08) Greene, Kate

Stanford University professor Nick McKeown is leading the OpenFlow project, an effort to open up some of the most commonly used Internet network hardware with the goal of making the Internet more secure, reliable, energy efficient, and pervasive. "In the last 10 years, there's been no transfer of ideas into the [Internet] infrastructure," McKeown says. "What we're trying to do is enable thousands of graduate students to demonstrate ideas at scale. That could lead to a faster rate of innovation, and ultimately these ideas can be incorporated into products." McKeown's team has secured permission from equipment vendors to write a small amount of code that grants access to a critical part of a network router or switch called the flow table. When a packet of data arrives at a switch, software in the switch looks up instructions in the flow table to decide where to send the packet. McKeown says OpenFlow gives researchers the ability to add and delete instructions in the flow table, which is necessary for testing new ideas. Researchers say OpenFlow is creating a new field of research. "This could take over the Internet," says HP Labs' Rick McGeer. "This actually looks like an elegant, efficient solution that we can use to take all of these ideas that we've been exploring for the past five years and start implementing them, and start putting them in the network."
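
The real OpenFlow protocol defines much richer match fields and actions than shown here, but the core mechanism described above, a switch matching incoming packets against flow-table entries that researchers can add or delete, can be sketched roughly as follows. The field names, the priority rule, and the "send-to-controller" fallback are simplifications for illustration, not the project's actual interface.

```cpp
// Rough sketch of a flow table: packets are matched against entries, and
// entries can be added or deleted to steer traffic. Not OpenFlow reference code.
#include <cstdint>
#include <iostream>
#include <string>
#include <vector>

struct PacketHeader {
    std::string srcIP, dstIP;
    uint16_t dstPort;
};

struct FlowEntry {
    // An empty string or 0 acts as a wildcard in this simplified match.
    std::string matchSrcIP, matchDstIP;
    uint16_t matchDstPort;      // 0 = any port
    std::string action;         // e.g., "forward:port2", "drop"
    int priority;
};

class FlowTable {
public:
    void addEntry(const FlowEntry& e) { entries_.push_back(e); }

    void deleteEntriesForDst(const std::string& dstIP) {
        for (auto it = entries_.begin(); it != entries_.end();) {
            if (it->matchDstIP == dstIP) it = entries_.erase(it);
            else ++it;
        }
    }

    // Return the highest-priority matching action; if nothing matches, a real
    // switch would typically punt the packet to the controller for a decision.
    std::string lookup(const PacketHeader& p) const {
        const FlowEntry* best = nullptr;
        for (const auto& e : entries_) {
            bool match = (e.matchSrcIP.empty() || e.matchSrcIP == p.srcIP) &&
                         (e.matchDstIP.empty() || e.matchDstIP == p.dstIP) &&
                         (e.matchDstPort == 0  || e.matchDstPort == p.dstPort);
            if (match && (!best || e.priority > best->priority)) best = &e;
        }
        return best ? best->action : "send-to-controller";
    }

private:
    std::vector<FlowEntry> entries_;
};

int main() {
    FlowTable table;
    table.addEntry({"", "10.0.0.5", 80, "forward:port2", 10});
    table.addEntry({"", "", 0, "drop", 1});  // low-priority default rule

    PacketHeader p{"10.0.0.1", "10.0.0.5", 80};
    std::cout << table.lookup(p) << "\n";    // forward:port2

    table.deleteEntriesForDst("10.0.0.5");   // researchers can remove rules too
    std::cout << table.lookup(p) << "\n";    // drop
}
```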


The Next Generation Wireless Chips
University of Cologne (11/04/08) Kollner, Raphael

Europe's Integrated Circuit/Electromagnetic Simulation and design Technologies for Advanced Radio Systems-on-chip (ICESTARS) project will enable the development of low-cost wireless chips that can operate in a frequency range of up to 100 GHz. "In the future, mobile devices will provide customers with services ranging from telephony and Internet to mobile TV and remote banking, anytime, anywhere," says University of Cologne professor Caren Tischendorf. "It is impossible to realize the necessary, extremely high data transfer rates within the frequency bands used today." ICESTARS project leader Marq Kole says that by the end of the project in 2010, project participants hope to have accelerated the chip development process in the extremely high frequency range with new methods and simulation tools. ICESTARS is funded by the European Commission and is led by NXP Semiconductors. German semiconductor company Qimonda will develop advanced analog simulation techniques for the project. Other partners include Finland-based software developer AWR-APLAC, which will focus on frequency-domain simulation algorithms, and Belgium's MAGWEL, which will focus on electromagnetic simulations. In addition to the University of Cologne, university partners include Upper Austria University of Applied Sciences, Germany's University of Wuppertal, and the University of Oulu in Finland. University partners will focus on modeling questions, algorithmic problems, and simulation issues that need to be solved for testing analog circuits with digital signal processing in the extremely high frequency range.


Multicore: New Chips Mean New Challenges for Developers
IDG News Service (11/04/08) Krill, Paul

The development of multicore processors is forcing software developers to restructure their software so that it runs across multiple cores and fully utilizes the performance the new hardware makes available. This task has proven challenging, with developers struggling with concurrency issues and potential performance bottlenecks. Already, 71 percent of organizations are developing multithreaded applications for multicore hardware, according to a recent IDC survey. IDC analyst Melinda Ballou says developers need to approach multicore with a commitment to better practices throughout an organization and from a project perspective. Multicore processors are becoming increasingly common as single-core chips reach their limits and as power-consumption issues become more important. As hardware continues to change, the pressure will be on software developers to adapt and capitalize on the new capabilities. Developers must learn new techniques and use new tools to maximize performance. Intel's James Reinders acknowledges that developing multicore applications requires a much more complicated way of thinking about software design than most developers are used to. "By and large, the majority of programmers don't have experience with it and are in need of tools and training and so forth to help take advantage of it," Reinders says. Intel offers its Threading Building Blocks template library to help C++ programmers with parallel programming. The Intel Thread Checker helps find nondeterministic programming errors, and the Intel Thread Profiler helps visualize a program to check what each core is doing.
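
As a rough illustration of what this looks like in practice, the sketch below uses the Threading Building Blocks parallel_for mentioned above to spread a simple loop over the available cores; the workload (doubling an array) and the way the range is handled are assumptions made for the example, not taken from Intel's materials.

```cpp
// Minimal sketch of loop parallelism with Intel Threading Building Blocks (TBB).
// The workload (doubling an array) is an arbitrary illustration.
#include <iostream>
#include <vector>
#include <tbb/blocked_range.h>
#include <tbb/parallel_for.h>

int main() {
    std::vector<double> data(1000000, 1.0);

    // TBB splits the index range into chunks and runs the lambda on worker
    // threads, one chunk at a time, so the loop spreads across all cores.
    tbb::parallel_for(tbb::blocked_range<size_t>(0, data.size()),
        [&](const tbb::blocked_range<size_t>& r) {
            for (size_t i = r.begin(); i != r.end(); ++i)
                data[i] *= 2.0;
        });

    std::cout << "first element: " << data[0] << "\n";  // prints 2
}
```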


Cyberinfrastructure Tools Improve Remote Use of Scientific Instruments
Ohio Supercomputer Center (10/31/08)

Researchers at the Ohio Supercomputer Center (OSC) have developed cyberinfrastructure tools that will enable industrial researchers to share scientific instruments over the Internet. OSC's remote instrumentation cyberinfrastructure includes Web portals to provide access to multiple researchers, robust networking to provide fast and efficient data transmission, and mass storage for data archiving and subsequent retrieval. OSC's Prasad Calyam says the goal is to foster research and training activities that can drastically shorten the innovation process in fields such as materials modeling and cancer research. By creating Web portals that integrate with OSC's Remote Instrumentation Collaboration Environment (RICE) software, the OSC can support multi-user session presence, user control management, live video feeds between Ohio labs, and collaboration tools such as Voice over IP and chat. Calyam says the RICE software allows researchers to control a microscope in real time through remote operation as they examine a sample, or it can restrict remote users to just viewing the sample's images and communicating with the operator. The Ohio Board of Regents funded this research to create a greater return on investment for instruments such as electron microscopes, nuclear magnetic resonance spectrometers, Raman spectrometers, and ion accelerators at Ohio universities.


Minimising Downtime by Decentralising Control
ICT Results (10/30/08)

When complex processes controlled by computerized systems encounter a malfunction, the process tends to shut down completely until the problem can be diagnosed and corrected. However, European researchers working on the NeCST project have developed a set of algorithms that use decentralization to prevent such shutdowns. When fully realized, the software could help power stations, oil refineries, factories, and other industrial plants work around localized faults. The NeCST project has made network control systems more fault tolerant by making individual components of the overall system as autonomous as possible. Once a fault is diagnosed or predicted, the problem can be preempted or fixed while the rest of the network operates normally. NeCST project coordinator Eric Rondeau says the systems can be viewed as distributed networks of nodes operating under highly decentralized control yet unified in accomplishing complex system-wide goals. In the petroleum refinery used to test the prototype NeCST algorithm, several processes run under a single networked control system; if a fault develops in one of them, NeCST isolates the fault and allows the rest of the system to continue operating.
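
The summary does not describe NeCST's algorithms in detail, but the basic behavior, each unit diagnosing itself so that only the faulty part is taken offline while the rest of the plant keeps running, can be sketched as below; the node names, the health check, and the isolation policy are all hypothetical.

```cpp
// Hypothetical sketch of decentralized fault handling in a networked control
// system: each node diagnoses itself, and only faulty nodes are isolated while
// the rest of the plant keeps running. Not NeCST project code.
#include <iostream>
#include <string>
#include <vector>

struct ProcessNode {
    std::string name;
    bool faulty;
    bool isolated;

    // Stand-in for a local diagnostic (sensor checks, model-based residuals, ...).
    bool selfDiagnose() const { return !faulty; }

    void runControlStep() const {
        std::cout << name << ": running control loop\n";
    }
};

int main() {
    std::vector<ProcessNode> plant = {
        {"distillation", false, false},
        {"pump-station", false, false},
        {"heat-exchanger", false, false}};

    plant[1].faulty = true;  // simulate a local fault at the pump station

    // Decentralized supervision: each node checks itself; a faulty node is
    // isolated for maintenance while every other node continues normally,
    // instead of the whole plant shutting down.
    for (auto& node : plant) {
        if (!node.selfDiagnose()) {
            node.isolated = true;
            std::cout << node.name << ": fault detected, isolating this unit\n";
            continue;
        }
        node.runControlStep();
    }
}
```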


Contest Aims to Expose Wider Audience to Supercomputing
Rice University (10/29/08) Boyd, Jade

In an effort to jumpstart the development of free classroom lessons and materials on parallel computing, Rice University is co-sponsoring the 2008-2009 Open Education Cup, a contest with $500 cash prizes for the five best lessons submitted to the open-education site Connexions. The contest will take place from Nov. 15-21 in Austin at SC08, cosponsored by ACM. "Reports have said over and over again that we need more and better high-performance-computing education," says Microsoft's Dan Reed, one of the contest's judges. "Projects like this are a way to build that education from the ground up." Reed also is a member of the President's Council of Advisors on Science and Technology (PCAST) and a co-author of PCAST's 2007 report on the challenges faced by the U.S.'s technology industry. The report found that almost every sector of the economy depends on IT, and the nation's workforce needs IT training to keep up with changes such as multicore processing. Reed, Rice University professor Jan Odegard, and others say the new power available in multicore processors will go unused unless the industry can find a way to give users the tools and knowledge to utilize the new power. "With the introduction of dual-core, quad-core, and soon, many-core chips, as well as the understanding that chips with hundreds of cores will be in your typical PCs within just a few years, parallel processing is suddenly something that everybody needs to be familiar with," Odegard says.


New Way of Measuring 'Reality' of Virtual Worlds Could Lead to Better Business Tools
North Carolina State University (10/29/08) Shipman, Matt

North Carolina State University researchers led by professor Mitzi M. Montoya have developed Perceived Virtual Presence (PVP), a tool that enables businesses to design more effective online virtual worlds for training employees and collaborating on projects. PVP enables users to measure how "real" virtual worlds are, taking into consideration how users interact with the virtual environment, with their work in that environment, and with other users. "This is an important issue because we believe that if users feel they are 'present' in the virtual world, they will collaborate better with other members of their team--and the more effective the virtual world will be as a setting for research and development or other collaborative enterprises," Montoya says. "An increased sense of presence in the virtual world leads to better comprehension and retention of information if the technology is being used for training purposes, and trainees are happier with the process."


New Communications Tools Help Emergency Responders
CNN (10/29/08) Walton, Marsha

Researchers at the National Institute of Standards and Technology (NIST) are working to improve the communications devices that emergency workers rely on through a variety of projects, including placing equipment inside buildings scheduled for implosion and testing a robot's ability to send audio and video from abandoned mines. NIST conducts research in places that are notorious for causing problems with emergency communications, such as tunnels, collapsed buildings, and oil refineries. During such tests, researchers monitor radio communications, noting where signals fade, at what frequencies they fail, and how far into a tunnel or structure they reach before communication is lost, examining both the video signal sent by a robot and the control communications to the robot. For example, tunnel research revealed a "sweet spot," a particular frequency in mines, subways, and tunnels where radio signals travel farthest, which varies depending on a tunnel's dimensions. The discovery of the sweet spot could help researchers design wireless systems that are more likely to function in an emergency. Creating smarter robots also is part of the effort to improve disaster communications. A smarter robot would be able to monitor its own signal strength and know when it is starting to lose communication with its operator, which could allow it to automatically deploy a repeater to relay the signal and continue exploring.
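
The article describes that behavior only in general terms; a hypothetical version of the control loop might look like the sketch below, in which the RSSI samples, the signal threshold, and the deployRepeater step are all invented for illustration.

```cpp
// Hypothetical sketch of the behavior described above: a robot watches its own
// radio signal strength and drops a relay repeater before the link is lost.
// The readings, threshold, and repeater interface are invented for illustration.
#include <iostream>
#include <vector>

const double kWeakSignalDbm = -85.0;  // assumed threshold for "about to lose link"

void deployRepeater(int meters) {
    std::cout << "Deploying repeater at " << meters << " m into the tunnel\n";
}

int main() {
    // Simulated signal-strength samples (dBm) as the robot drives deeper in.
    std::vector<double> rssiSamples = {-52, -60, -68, -75, -83, -88, -93};
    int metersTravelled = 0;
    int repeatersLeft = 2;

    for (double rssi : rssiSamples) {
        metersTravelled += 10;  // assume one sample every 10 m of travel
        std::cout << metersTravelled << " m, RSSI " << rssi << " dBm\n";

        // When the link to the operator weakens, drop a repeater so the robot
        // can keep exploring instead of going silent.
        if (rssi < kWeakSignalDbm && repeatersLeft > 0) {
            deployRepeater(metersTravelled);
            --repeatersLeft;
        }
    }
}
```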


Gadgets Designed at CMU Work Off Gestures, Brain Signals
Pittsburgh Tribune-Review (10/29/08) Leonard, Kim

Gestris is a Tetris-style game, developed by researchers at Intel's Pittsburgh Research Lab, in which players use body gestures to manipulate pieces on the screen. Intel's Padmanabhan Pillai says that in the future the researchers would like to build significantly richer interfaces for use in the home. "Imagine that you could point to the TV and say, 'volume up,'" Pillai says. Computers and robots that can more easily determine what humans want and need were a common theme among the dozens of projects recently demonstrated at Intel's lab. The lab includes 23 researchers who work with faculty from Carnegie Mellon University and the University of Pittsburgh and medical experts from the University of Pittsburgh Medical Center, as well as 20 to 25 students. "Mostly, Intel research is engaged in trying to understand where computing is going," says Intel's Andrew A. Chien. "And where it is going is a combined enterprise of hardware and software and, ultimately, human beings." Pittsburgh lab researcher Rahul Sukthankar says machines must be able to interact with people in more natural ways, such as reading their gestures or even brain signals. "It's no longer just spreadsheets and word processors," Sukthankar says. "It's really interacting with humans in their environment." In addition to gesture and voice recognition, computers may one day be able to respond to facial expressions, or know what a person is thinking about by reading brain activity patterns. Intel's Jason Campbell described projects that could use millions of tiny sphere-shaped computers to build objects that change their shape on command, such as a cell phone that can morph into a keyboard or wrap around a person's ear.


Abstract News © Copyright 2008 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]