Association for Computing Machinery
Welcome to the November 21, 2008 edition of ACM TechNews, providing timely information for IT professionals three times a week.

HEADLINES AT A GLANCE


NASA Tests First Deep-Space Internet
NASA Jet Propulsion Laboratory (11/18/08) Borja, Rhea; Brown, Dwayne; Trinidad, Katherine

NASA Jet Propulsion Laboratory (JPL) engineers have successfully tested the first deep-space communications network based on the Internet, using the Disruption-Tolerant Networking (DTN) protocol to transmit dozens of images to and from a spacecraft more than 20 million miles from Earth. NASA and Google's Vint Cerf jointly developed the DTN protocol, which replaces the Internet's TCP/IP protocol for managing data transmissions. "This is the first step in creating a totally new space communications capability, an interplanetary Internet," says NASA's Adrian Hooke. An interplanetary Internet needs to be robust enough to withstand the delays, disruptions, and lost connections that space can cause. For example, errors can occur when a spacecraft slips behind a planet, or during solar storms and long communication delays. Even traveling at the speed of light, communications between Mars and Earth take anywhere from three-and-a-half to 20 minutes. Unlike TCP/IP, DTN does not assume a constant end-to-end connection. DTN is designed so that if a destination path cannot be found, the data packets are not discarded but are kept in a network node until it can safely communicate with another node. In October, engineers started a month-long series of demonstrations, with data being transmitted using NASA's Deep Space Network twice a week. Researchers say the interplanetary Internet could enable new types of complex space missions that involve multiple landed, mobile, and orbiting spacecraft, as well as ensure reliable communications for astronauts on the surface of the moon.
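
The store-and-forward idea at the heart of DTN can be illustrated with a short sketch. This is a minimal toy model, not NASA's implementation; the node names, the Bundle class, and the contact() method are all invented for illustration:

    import collections
    import time

    class Bundle:
        """A self-contained unit of data with a destination and a lifetime."""
        def __init__(self, destination, payload, ttl_seconds=3600):
            self.destination = destination
            self.payload = payload
            self.expires_at = time.time() + ttl_seconds

    class DTNNode:
        """Store-and-forward node: keeps custody of bundles between contacts."""
        def __init__(self, name):
            self.name = name
            self.stored = collections.deque()  # bundles awaiting a contact window

        def receive(self, bundle):
            if bundle.destination == self.name:
                print(f"{self.name}: delivered {bundle.payload!r}")
            else:
                # Unlike TCP/IP, no end-to-end path is assumed: the node takes
                # custody and holds the bundle until a neighbor comes into view.
                self.stored.append(bundle)

        def contact(self, neighbor):
            """A communication window to a neighbor has opened; pass custody on."""
            for _ in range(len(self.stored)):
                bundle = self.stored.popleft()
                if time.time() > bundle.expires_at:
                    continue  # only expired bundles are ever dropped
                neighbor.receive(bundle)

    # A relay that is out of contact simply accumulates data until a link returns.
    earth, orbiter, lander = DTNNode("earth"), DTNNode("orbiter"), DTNNode("lander")
    earth.receive(Bundle("lander", "command sequence"))
    earth.contact(orbiter)   # uplink window opens: the orbiter takes custody
    orbiter.contact(lander)  # a later pass over the lander delivers the bundle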


Robots With Emotions on Display at the ICT'08 Event in Lyon
University of Hertfordshire (11/21/08) Roberts, Emma

A European project is developing robots that are capable of growing emotionally, responding to humans, and expressing their own emotional states during interaction with people. The researchers behind the FEELIX GROWING project will display their mid-term results at ICT 2008, which takes place in Lyon from November 25-27, 2008. "The aim is to develop robots that grow up and adapt to humans in everyday environments," says University of Hertfordshire professor Lola Canamero, coordinator of FEELIX GROWING. "If robots are to be truly integrated in humans' everyday lives as companions or care-givers, they cannot be just taken off the shelf and put into a real-life setting, they need to live and grow interacting with humans, to adapt to their environment." The FEELIX GROWING project plans to offer live demonstrations of a baby pet robot learning to control its stress as it explores a new environment, and robotic heads responding with facial expressions to human faces and voices. Other prototypes to be shown include humanoid robots learning to execute simple tasks by observing and imitating humans, and an interactive floor responding to human touch and movement with different light and sound patterns.


Argonne, Oak Ridge Labs Sweep HPC Challenge
Government Computer News (11/20/08) Jackson, Joab

The U.S. Energy Department's Argonne National Laboratory and Oak Ridge National Laboratory have been recognized by DARPA's High Performance Computing Challenge for their superior performance. Unlike the Top500 supercomputer list, which ranks supercomputers on a single metric, the HPC Challenge evaluates machines in four categories, naming the best performer for each benchmark. Argonne's 163,840-core BlueGene/P IBM system won the global access to memory category, which involves writing data to random parts of memory and is measured in giga-updates per second. The Argonne BlueGene/P also won the global fast Fourier transform category, which measures how quickly a system can execute a discrete Fourier transform. Oak Ridge's 150,152-core Jaguar system won the other two categories: best performance on the High Performance Linpack test, in which the system completed 902 trillion floating-point operations per second, and the STREAM category, which measures the sustainable rate at which data can be moved between memory and the processor. Funded by DARPA, the National Science Foundation, and the Department of Energy, the HPC Challenge was developed to provide a well-rounded set of benchmarks for gauging supercomputer performance. The winners were announced at this week's SC08 supercomputer conference, co-sponsored by ACM.
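
The idea behind the STREAM category is simple enough to sketch. The toy kernel below (a rough Python/NumPy illustration, not the official STREAM benchmark) times the classic "triad" operation and reports an effective memory bandwidth; because NumPy evaluates it in two passes rather than one fused loop, the printed figure somewhat understates the hardware:

    import time
    import numpy as np

    N = 20_000_000                 # large enough that CPU caches cannot hold the data
    a = np.zeros(N)
    b = np.random.rand(N)
    c = np.random.rand(N)
    scalar = 3.0

    start = time.perf_counter()
    np.multiply(c, scalar, out=a)  # a = scalar * c
    np.add(a, b, out=a)            # a = b + scalar * c  -- the "triad" kernel
    elapsed = time.perf_counter() - start

    # A fused triad touches three arrays of 8-byte doubles: read b and c, write a.
    bytes_moved = 3 * N * 8
    print(f"effective bandwidth: {bytes_moved / elapsed / 1e9:.2f} GB/s")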


Minn. Senate Race Could Hinge on Scanning Machine Mistakes
CNet (11/19/08) Condon, Stephanie

The result of a U.S. Senate race in Minnesota is still undecided, and a hand recount could show that several thousand votes were mistakenly rejected by optical-scan voting machines. Republican Sen. Norm Coleman holds a lead of about 200 votes over his challenger, Democrat Al Franken. Minnesota's secretary of state estimates that the machines may have mistakenly rejected as many as two votes out of every 1,000 cast, or about 6,000 votes overall. The scanners could have rejected ballots that were not correctly filled out, such as when a voter circled a candidate's name instead of filling in the bubble next to it. Minnesota law mandates that any vote in which the voter's intention is clear must be counted, so a manual recount would admit votes that a machine would reject. Officials hope to finish the recount by December 5th, and the state canvassing board will reconvene on December 16th with the goal of obtaining the final results by December 19th. Beth Fraser, an official from the secretary of state's office, maintains that although the optical-scan machines rejected some votes, they are still the best option because they are fast and provide an audit trail for recounts.
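
The kind of machine mistake at issue is easy to see in miniature: optical scanners typically decide a bubble is "filled" when enough of its pixels are dark, so a circled name or a check mark outside the bubble never registers, even though a human recount would count it. A toy sketch of that threshold logic (illustrative only; real tabulators are considerably more involved, and the thresholds here are invented):

    def read_bubble(pixels, fill_threshold=0.5, blank_threshold=0.1):
        """Classify one ballot bubble from a grid of 0 (white) / 1 (dark) pixels.

        Returns 'filled', 'blank', or 'ambiguous'. Marks made outside the
        bubble region (a circled name, a check mark in the margin) never
        reach this function at all, which is how a machine can reject a
        ballot whose intent is perfectly clear to a human reader.
        """
        dark = sum(map(sum, pixels)) / (len(pixels) * len(pixels[0]))
        if dark >= fill_threshold:
            return "filled"
        if dark <= blank_threshold:
            return "blank"
        return "ambiguous"   # e.g., a light pencil swipe: flagged or rejected

    # A half-hearted mark covering ~25% of the bubble is not counted as a vote:
    print(read_bubble([[1, 1, 0, 0],
                       [1, 1, 0, 0],
                       [0, 0, 0, 0],
                       [0, 0, 0, 0]]))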


First SIGGRAPH Asia Sees Significant Participation From the Region's Talents
TAXI Design Network (11/19/08)

The first ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia opens in Singapore on December 10, 2008, reflecting the region's growing importance in computer graphics and digital media. From 1998 to 2005, the number of SIGGRAPH technical paper submissions from Asia increased by 300 percent. This year, 30 percent of the accepted materials came from Asia, and 14 universities from across the region will be represented. SIGGRAPH Asia will feature special sessions focusing on some of the most important developments in the industry, including a panel discussion on the issues and challenges of establishing a new production studio in Asia. The Computer Animation Festival (CAF), a popular attraction at SIGGRAPH, will present selected screenings drawn from feature films, games, and visual effects work. Asia, and specifically China, Japan, and Korea, contributed almost half of this year's CAF entries. Another highlight of the conference will be the courses program workshop hosted by Animation Options LLC CEO Kevin Geiger, director of the nonprofit Animation Co-op, who will share organizational insights on better planning and management of the production pipeline and workflow.


A Future Without Programming
IDG News Service (11/20/08) Kaneshige, Tom

Do-it-yourself application development is on the rise as business users increasingly turn to codeless programming tools to create applications. "We also have a whole new wave of business users that are not intimidated by the notion of application development," says Forrester analyst Mike Gualtieri. Consultant Kevin Smith says applications and tools such as Coghead, a Web application for code-free development of other Web applications, make him wonder how traditional developers who have to code everything from scratch stay in business. "It's such a game changer," Smith says. "I think it turns developers from wizards who read the magic book and know the syntax into business analysts who understand the processes and goals of what they're trying to achieve." Coghead, Caspio, Zoho, Wufoo, and other products make it easier for nonprogrammers to create applications. Coghead CEO Paul McNamara says cloud computing tools could increase the number of potential software builders in the world tenfold. Codeless software development makes a lot of sense for business applications with multiple records, business logic, notifications, and other straightforward features. Gualtieri cautions that codeless software development is not as easy as some would suggest, and says that nonprogrammers should expect to confront a variety of challenges.


EPSRC Funds Cybersecurity Research
Engineering & Physical Sciences Research Council (11/19/08)

The United Kingdom's Engineering and Physical Sciences Research Council (EPSRC) and the Technology Strategy Board will fund research to fight virtual crime, along with efforts to transform that research into commercial opportunities. The investment will be used to create two Innovation and Knowledge Centres that will combine business knowledge with the most current research in an effort to capitalize on the full potential of emerging technologies. The centers will be based at Queen's University Belfast and the University of Leeds. The Belfast center will develop technologies for secure information architectures and for protecting the trustworthiness of electronically stored information, including combating cyberattacks. "Taking exciting research from the university laboratory to the commercial sector through close collaboration with user stakeholders is vital to ensuring the UK's economy continues to be innovative and globally competitive," says EPSRC chief executive David Delpy. The ubiquitous nature of mobile communications makes connectivity easier than ever, but global connectivity introduces global vulnerabilities in terms of privacy, security, and reliability. The Belfast center also will work to develop secure solutions for protecting mobile phone networks.


IT Sector Adds Jobs Despite Economic Turmoil
Network World (11/19/08) Brodkin, Jon

New statistics from the U.S. Bureau of Labor Statistics suggest that the IT profession may be stronger than the rest of the job market. Although 240,000 jobs were lost nationwide in October, the IT profession actually grew during this period, with about 5,500 jobs added in computer systems design and related services, and another 300 in management and technical consulting services. Foote Partners CEO David Foote says the growth of the IT industry at a time when most other industries are shrinking demonstrates the health of the industry today. In contrast, during the economic downturn earlier this decade, many IT jobs were cut. The difference between then and now, Foote says, is that today large portions of the IT budget are tied to lines of business, and companies think about IT in relation to profit instead of simply as an expense. He says many IT professionals are now knowledge workers, not just systems maintenance or programming workers. Nevertheless, he cautions that highly paid IT senior and middle-management positions are still vulnerable because of their higher salaries. Between January and July of this year, the U.S. high-tech industry added 78,300 jobs, a significant decrease from the 111,400 tech jobs added during the same period in 2007.


Real-Time Beethoven
Norwegian University of Science and Technology (11/21/08) Oksholen, Tore

A student at the Norwegian University of Science and Technology has developed a computer instrument that takes the skills of jazz musicians to the next level. For his Ph.D. research, Oyvind Brandtsegg has developed a computer program and a musical instrument for improvisation and variation. The computer instrument takes recorded music and splits the sound into particles lasting between one and 10 milliseconds, infinitely reshuffling the fragments and making it possible to vary the music without changing its fundamental theme. "It's easy to change a bit of music into something that can't be recognized," Brandtsegg says. "It's the opposite that is the challenge: to create variations in which the musical theme remains clear." The instrument allows composers to add new tonal variations and timbres to the musical palette and to work in real time. Brandtsegg worked with the Department of Computer and Information Science to develop the software architecture, and with the acoustics group at the Department of Electronics and Telecommunications to create the particle synthesizer.
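
The technique Brandtsegg's instrument builds on is commonly known as granular synthesis: audio is diced into millisecond-scale grains that can be reordered and overlapped without destroying the overall theme. A bare-bones sketch of the idea (assuming a mono NumPy signal; the function and parameter names are invented, and Brandtsegg's actual instrument is far richer):

    import numpy as np

    def granulate(signal, sr, grain_ms=10, jitter_grains=3, seed=0):
        """Split audio into short grains and locally reshuffle them.

        Shuffling only within a small neighborhood (jitter_grains) keeps
        the large-scale theme recognizable while varying the fine texture.
        """
        rng = np.random.default_rng(seed)
        n = int(sr * grain_ms / 1000)               # samples per grain
        grains = [signal[i:i + n] for i in range(0, len(signal) - n, n)]
        order = np.arange(len(grains), dtype=float)
        order += rng.uniform(-jitter_grains, jitter_grains, len(grains))
        window = np.hanning(n)                      # fade grain edges to avoid clicks
        return np.concatenate([grains[i] * window for i in np.argsort(order)])

    # Example: granulate one second of a 440 Hz tone sampled at 44.1 kHz.
    sr = 44100
    tone = np.sin(2 * np.pi * 440 * np.arange(sr) / sr)
    varied = granulate(tone, sr)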


The Invisible Network
ICT Results (11/18/08)

The European Union-funded E2RII project brought together 32 organizations from 14 countries to simplify communications by enabling devices to determine the best mode of communication depending on the situation. "Most users don't care about the technology, what they care about is communicating," says E2RII project coordinator Didier Bourse of Motorola Labs. "You may be in different environments--at home, in the office, on a train, and so on--but what you want is to be connected and to enjoy a seamless experience." Project researchers are working to create a mobile device that could set up a call on its own. Such end-to-end connectivity would require the exchanges, routers, and other hardware between communications nodes to be able to adapt to multiple technologies. Project researchers incorporated various technologies, including software-defined radio, in which functions that are normally hardwired can be managed by software, and cognitive radio and cognitive networks, in which communication nodes become increasingly intelligent and reconfigurable. In the near future, the industry will shift to pervasive services, in which a device can evolve to cope with new technologies and new services, Bourse says. Developers and vendors will be able to modify communications standards and equipment without having to invest in new hardware. In the more distant future, dynamic and flexible resource management will be available, which Bourse describes as a communications cube in which one side represents radio frequency, another represents the range of radio techniques available, and a third side maps all possible services.
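
A minimal sketch of the kind of decision such a reconfigurable device would make: score whatever links happen to be visible against the user's current context and pick the best one. The link attributes and scoring weights below are entirely invented for illustration; the E2RII architecture itself is far more elaborate:

    def best_link(available_links, context):
        """Pick the most suitable bearer for the current situation."""
        def score(link):
            s = link["bandwidth_mbps"] if context.get("need_bandwidth") else 0.0
            s -= 100 * link["cost_per_mb"]   # penalize metered connections
            s -= link["latency_ms"] / 10     # penalize sluggish links
            return s
        return max(available_links, key=score)

    links = [
        {"name": "wifi-home", "bandwidth_mbps": 54, "cost_per_mb": 0.00, "latency_ms": 20},
        {"name": "cellular",  "bandwidth_mbps": 2,  "cost_per_mb": 0.05, "latency_ms": 120},
    ]
    print(best_link(links, {"need_bandwidth": True})["name"])   # -> wifi-home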


Carnegie Mellon Theory of Visual Computation Reveals How Brain Makes Sense of Natural Scenes
Carnegie Mellon News (11/19/08) Spice, Byron; Watzman, Anne

A new computational model from researchers at Carnegie Mellon University helps explain how the brain processes foreground and background images to interpret natural scenes. Michael S. Lewicki of the Computer Science Department and the Center for the Neural Basis of Cognition worked with graduate student Yan Karklin to build a model that uses an algorithm to analyze the patterns that compose natural scenes and to determine which patterns are most likely to be associated with each other. "Our model takes a statistical approach to making these generalizations about each patch in the image," says Lewicki, a computational neuroscientist. The model's behavior matched well with that of neurons involved in more complex visual processing, not just visual neurons that detect lines. "We were astonished that the model reproduced so many of the properties of these cells just as a result of solving this computational problem," says Lewicki. Computer vision systems should improve as researchers learn more about how the brain processes contours and surfaces. Better computer vision algorithms could make it easier for computers to understand the three-dimensional nature of objects and to recognize what they see around them as a complete picture. Karklin earned a Ph.D. in computational neuroscience, machine learning, and computer science last year, and is now a post-doctoral fellow at New York University.


How Google's Ear Hears
Technology Review (11/20/08) Greene, Kate

Google recently announced the addition of voice search capabilities to its iPhone mobile application, which allows people to speak search terms into their phones. To build a voice-operated search engine, Google relied on the massive amount of data it has on how people use search, training its algorithms so that if the system has trouble understanding a word in a query, it can consult the data to see which terms are regularly grouped together. Google also has data on speech samples correlated with written words, taken from its Goog411 free directory service. Google researcher Mike Cohen says samples from the directory service were the main source of acoustic data for training the system. Once the system is operational, Google will be able to collect much more data, which will help improve voice recognition even further. Massachusetts Institute of Technology researcher Jim Glass says the beauty of search engines is that they do not need to be exact; Google's algorithms can simply answer the query to the best of their ability, and the user can either select the right result or try again. The search application also uses the iPhone's location-awareness capabilities to prioritize results based on the user's location.
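
The rescoring idea can be sketched simply: when the recognizer returns several candidate transcriptions, rerank them by how often their words actually co-occur in past queries. The counts, weights, and function names below are invented toy values; Google's actual models are vastly larger:

    from itertools import combinations

    # Toy co-occurrence counts, as if mined from past search queries.
    cooccur = {
        ("pizza", "recipe"): 9000, ("near", "pizza"): 12000,
        ("piazza", "recipe"): 3,   ("near", "piazza"): 40,
    }

    def rescore(hypotheses):
        """Rank recognizer hypotheses by query-log plausibility.

        Each hypothesis is (acoustic_score, words); the final score blends
        the acoustic evidence with how often the word pairs co-occur.
        """
        def total(hyp):
            acoustic, words = hyp
            evidence = sum(cooccur.get(tuple(sorted(p)), 1)
                           for p in combinations(words, 2))
            return acoustic + 0.001 * evidence
        return max(hypotheses, key=total)

    # The acoustically preferred "piazza recipe" loses to "pizza recipe":
    print(rescore([(-10.0, ["piazza", "recipe"]),
                   (-10.5, ["pizza", "recipe"])]))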


IBM Tries to Bring Brain Power to Computers
IDG News Service (11/19/08) Shah, Agam

IBM Research has been working on a project to give computers the same processing capabilities as the human brain. The goal is to integrate brain-like capabilities such as perception and interaction into hardware and software so that computers can process and understand data faster while consuming less power, says IBM researcher Dharmendra Modha. Modha says neuroscience, nanotechnology, and supercomputing are being combined in the effort to create a new computing platform. "If we could design computers that could be in real-world environments and sense and respond in an intelligent way, it would be a tremendous step forward," Modha says. A typical computing process starts by defining objectives and then creating algorithms to achieve them. Modha says the brain works the opposite way, with a pre-established algorithm that is applied to problems as they arise, creating a platform that can address a wider variety of problems. For example, the brain-based approach could help manage the world's water supplies through real-time analysis of data gathered by a network of sensors that monitor variables to discover new patterns. Such an approach could also be applied to world markets. The researchers are not yet working on concrete applications, but rather on understanding what the brain does and how that can be implemented in computing, Modha says.


IT Security Education Continues to Evolve
CSO Online (11/17/08) Spafford, Gene; Goodchild, Joan

IT security and cyberforensics are two areas with a critical need for more workers, writes Purdue University professor Eugene Spafford, chair of ACM's US Public Policy Committee. Spafford says computer science education has shifted away from teaching the fundamentals of program construction and systems toward a focus on all the places where computing can make a difference. Where computer science once concentrated on programming solutions for individual host computers and only some distributed computation, the focus is now on higher-level concepts in languages, graphics, and network computation. Spafford says the security implications of this shift, and of the industry's move toward cloud computing and large-scale networks, are largely unknown. Information security is generally not part of the regular IT curriculum, and a reasonable core curriculum for information security has not yet been determined. Some schools, including Purdue, offer courses in secure programming as electives. Almost every curriculum actually warns against many common programming flaws, Spafford says, but problems arise because students do not pay attention, are pressed for time, switch languages, or end up working in environments where productivity is stressed over quality.


Cell Phones Linked to Track Real-Time Traffic
PC Magazine (11/10/08) Hachman, Mark

The Mobile Millennium trial, a real-time wireless traffic network for San Francisco, launched this month and will link together GSM-based cell phones equipped with special software. The pilot project, which hopes to have 10,000 participants by April, will be a real-world test of the technology used in the Mobile Century trial last February, which, like the new trial, was a partnership of Nokia; the University of California, Berkeley; the California Department of Transportation; Navteq; Safetrip-21; and the Center for Information Technology Research in the Interest of Society (CITRIS). The Web site 511.org already provides real-time traffic information in the region based on data from FasTrak transponders, which are used for paying bridge tolls. The cell phone-based method will be less expensive and will not be limited to major freeway infrastructure. Organizers say the Mobile Millennium project will focus at first on commuters who drive between the Bay Area and the Lake Tahoe ski area, with the first phase limited to highways and later phases adding arterial routes. The software used for Mobile Millennium is called Virtual Trip Lines, which organizers describe as "a data sampling paradigm that anonymizes the GPS-based position information and aggregates it into a single data stream." This data is combined with other traffic data and then broadcast back to the phones and the Internet. A customized urban-focused version of the system, which models traffic in lower Manhattan, also is under development.
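
The virtual trip line concept can be sketched in a few lines: instead of streaming a continuous GPS trace, the phone emits a speed sample only when it crosses a predefined line on the map, with no identifier attached. Everything below (the function name, coordinates, and delay scheme) is invented to illustrate the idea, not the project's actual code:

    import random

    def report_if_crossing(line_id, prev_lat, cur_lat, line_lat, speed_mph):
        """Emit an anonymized speed sample only when a trip line is crossed.

        The continuous GPS trace never leaves the phone; the server sees
        only (line, speed) samples with no identity attached, and many
        such samples per line aggregate into a traffic estimate.
        """
        if prev_lat < line_lat <= cur_lat:        # toy northbound crossing test
            return {"line": line_id,
                    "speed_mph": round(speed_mph),
                    "delay_s": random.uniform(0, 5)}  # random delay resists timing attacks
        return None                               # between trip lines: report nothing

    print(report_if_crossing("I-80_mile_12", 37.8714, 37.8716, 37.8715, 61.3))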


Abstract News © Copyright 2008 INFORMATION, INC.


To submit feedback about ACM TechNews, contact: [email protected]