Association for Computing Machinery
Welcome to the November 3, 2008 edition of ACM TechNews, providing timely information for IT professionals three times a week.

HEADLINES AT A GLANCE


E-Voting Groups Are Watching a Handful of States
IDG News Service (11/02/08) Gross, Grant

Several U.S. states, including Virginia and Pennsylvania, will be watched closely on election day for problems with electronic-voting equipment, says Verified Voting president Pamela Smith. She says expected long lines and the lack of early voting options in some states could be problematic if there is any kind of equipment breakdown. "This is an election that will sort of stress-test the [election] systems," she says. "Any problem that's going to come up is going to be amplified." Several states have already reported long lines during early voting, and other states do not have enough voting machines available to replace malfunctioning equipment. The problem will be most severe in states with touch-screen machines, such as Pennsylvania and Virginia, which offer no early voting and do not require paper-trail backups. Meanwhile, University of South Alabama professor Alec Yasinsac will closely monitor Florida and Ohio, as both states have a history of tight races and voting equipment problems. Florida has scrapped most of its touch-screen e-voting machines in favor of an optical-scan system in which paper ballots are scanned electronically. Ohio experienced problems with its e-voting machines during the 2004 presidential election and a primary election this year, and like Florida, it has switched from touch-screen machines to optical-scan systems. Yasinsac, who serves on ACM's voting subcommittee, says that frequently switching voting machines creates the potential for problems because there is not enough time to train poll workers or test the systems. Other states to watch include Maryland, New Jersey, Delaware, Louisiana, Georgia, and South Carolina, all of which use touch-screen voting machines without paper-trail backups.


Scientists Crack Possible Future Quantum Computer Age Encryption
TG Daily (11/03/08) Gruener, Wolfgang

A cryptosystem considered strong enough to survive the coming age of quantum computing has been cracked by researchers at Eindhoven University of Technology in the Netherlands. The team built software that speeds up attacks on McEliece, a 30-year-old public-key encryption algorithm, and used a cluster of 200 computers to decrypt a ciphertext in just one week. Still, the McEliece cryptosystem could remain viable against more powerful computers because its key size can be scaled up to guard against such attacks. Despite this strength and scalability, the system has not won substantial acceptance in the cryptographic community. The McEliece cryptosystem has a very large public key (about 2^19 bits, or roughly 64 kilobytes), and encrypted messages are much larger than the plaintext, which increases the chance of transmission errors. Also, it is an asymmetric-key algorithm that cannot be used for authentication or signature schemes.
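
The scheme's basic mechanics can be pictured with a toy sketch of the McEliece idea, assuming nothing about the Eindhoven attack itself: the public key is the generator matrix of an error-correcting code, encryption encodes the message and deliberately adds random bit errors, and the ciphertext ends up longer than the plaintext. The tiny [7,4] Hamming code below stands in for the much larger scrambled Goppa code the real system uses.

    import random

    # Toy illustration of the McEliece idea, not the real cryptosystem: the public
    # key is the generator matrix G of an error-correcting code; encryption encodes
    # the k-bit message into an n-bit codeword and adds t random bit errors that
    # only the holder of the private code structure can remove efficiently.
    G = [[1, 0, 0, 0, 1, 1, 0],   # generator matrix of the small [7,4] Hamming code
         [0, 1, 0, 0, 1, 0, 1],   # (k = 4 message bits -> n = 7 ciphertext bits)
         [0, 0, 1, 0, 0, 1, 1],
         [0, 0, 0, 1, 1, 1, 1]]

    def encrypt(message_bits, t=1):
        codeword = [sum(m * g for m, g in zip(message_bits, column)) % 2
                    for column in zip(*G)]                        # encode: 4 bits -> 7 bits
        for position in random.sample(range(len(codeword)), t):   # add t deliberate errors
            codeword[position] ^= 1
        return codeword

    message = [1, 0, 1, 1]
    print("plaintext :", message)
    print("ciphertext:", encrypt(message))   # longer than the message, as noted above

    # The original McEliece parameters are roughly n = 1024, k = 524, t = 50, which
    # is why the public key alone is on the order of n*k, i.e. about 2^19 bits.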


Education in 2015: Cyberlearning for Digital Natives
Network World (10/30/08) Cox, John

At the recent Educause conference in Orlando, Florida, University of California, Los Angeles (UCLA) professor Christine Borgman outlined what learning could look like in 2015 if educators, teachers, researchers, and policy makers systematically leverage emerging technology trends. Borgman said the future learning environment will offer pervasive high-bandwidth wireless networks, cloud-based processing, and fast-growing repositories of digital information, including data from networked sensors and information analysis tools. Borgman's presentation was based on "Fostering Learning in the Networked World: The Cyberlearning Opportunity and Challenge," a report released by the National Science Foundation's Task Force on Cyberlearning, which Borgman chaired. Borgman is a researcher at UCLA's Center for Embedded Networked Sensing, which develops wireless sensing systems and explores their impact on a range of scientific and social issues. Borgman noted that today's students, dubbed "digital natives," are already using many digital tools to advance their understanding of their world and to learn, often in nontraditional ways. However, she said that when students enter a classroom, they effectively step back in time and are cut off from their real-time relationship with the Web and information. By 2015, Borgman said, learning will be fully accessible at school, at home, and elsewhere. Simulations, remote virtual labs, and data-visualization tools will enable students to work with massive amounts of data in real time. Students will have access to information kept in a variety of online digital repositories, and will be able to share a variety of experiences online with classmates, teachers, and others.


IBM Researchers Show Off New Weapon in Fight Against Online Fraud
eWeek (10/29/08) Prince, Brian

A new device developed by IBM researchers is designed to protect online banking transactions from man-in-the-middle attacks and malware-infected PCs by establishing a direct, secure communications channel to online banking servers. The Zone Trusted Information Channel (ZTIC) plugs into the USB port of any computer to set up the server link, effectively bypassing the user's PC. The transaction is shown on the ZTIC's own display, so if the PC is tainted with malware the user can terminate the transaction from the device. Man-in-the-middle attacks are countered because the device encrypts the data and uses its own hardware to carry out authentication and confirmation of the transaction, says IBM's Gunter Ollmann. "The various phases of the validation and acceptance of a transaction are moved from the PC over to the ZTIC," he says, adding that the system "can use bank-supplied smart-card technologies to further boost this encryption/security." IBM says pilot ZTIC devices are ready for trial. "In the presence of an ever-more professionally operating e-crime scene, it became obvious that PC software-based authentication solutions were potentially vulnerable and that we needed to innovate to stay ahead," says Peter Buhler with IBM's Zurich Research Lab. "That was the starting point for developing the ZTIC."
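
The general out-of-band confirmation pattern the ZTIC relies on can be sketched in a few lines; the sketch below is an assumption-laden stand-in, not IBM's actual protocol, and the shared key and message format are hypothetical. The point it illustrates is that the transaction the bank accepts is the one the user confirmed on the trusted device, not whatever the PC forwarded.

    import hmac, hashlib

    DEVICE_KEY = b"secret-provisioned-into-the-trusted-device"   # hypothetical shared secret

    def device_confirm(transaction, user_approves):
        # Runs on the trusted device: the transaction is shown on the device's own
        # display, and only an explicit user approval yields a confirmation MAC.
        print("DEVICE DISPLAY:", transaction)
        if not user_approves:
            return None
        return hmac.new(DEVICE_KEY, transaction.encode(), hashlib.sha256).digest()

    def bank_accepts(transaction, confirmation):
        # Runs at the bank: recompute the MAC over the transaction actually received;
        # malware that altered the request on the PC cannot forge this value.
        if confirmation is None:
            return False
        expected = hmac.new(DEVICE_KEY, transaction.encode(), hashlib.sha256).digest()
        return hmac.compare_digest(expected, confirmation)

    tx = "pay 250.00 EUR to account NL00TEST0123456789"
    print("bank accepted:", bank_accepts(tx, device_confirm(tx, user_approves=True)))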


On Security, Microsoft Reports Progress and Alarm
New York Times (11/03/08) P. B9; Markoff, John

The security of the Windows operating system has significantly improved, but the threat of computer viruses, fraud, and other online attacks has become far more serious, concludes Microsoft's biannual "Security Intelligence Report." Microsoft blames organized crime, naive users, and its competitors for the deteriorating situation. Microsoft reports that the amount of malicious or potentially harmful software removed from Windows computers increased by 43 percent during the first half of 2008. The report also says that improved Windows security has caused attackers to shift their attention to security holes in individual programs. For example, the report notes that 90 percent of newly reported vulnerabilities in the first half of 2008 involved applications, while only 10 percent involved operating systems. Microsoft says that software practices must change industry-wide; otherwise the improvements in Windows will be meaningless. Security researchers agree. "The only thing that Microsoft can patch is their own software," says F-Secure chief security advisor Patrik Runald. "That's not what the bad guys are using to get into computers these days. It's certainly a challenge." The computer security industry has been fighting a losing battle as computer criminals are increasingly able to profit from identity theft and a variety of other scams. Microsoft has tried to combat the problem by building a variety of safeguards into its operating systems and its Internet Explorer browser, with mixed success. The Microsoft report notes that the infection rate of U.S. computers rose 25.5 percent in the last six months.


Optical Firewall Aims to Clear Internet Security Bottlenecks
ICT Results (10/29/08)

Researchers working on the European Union-funded WISDOM project have developed a firewall capable of analyzing data on fiber-optic networks at speeds of 40 gigabits per second. As demand for data-intensive services increases, telecommunications providers are expanding fiber-optic networks, and while performance has improved, the electronic processes and algorithms used to filter data for security threats are struggling to keep up. Using custom algorithms, WISDOM's optical firewall looks for patterns in the header content of data packets to isolate possible viruses, attacks, and other threats. The WISDOM firewall acts as a primary, high-speed filter that routes suspect packets to electronic processes for additional analysis. The WISDOM firewall was built using an integrated photonic technology platform in which silica-on-silicon circuits form an optical equivalent of an electronic printed circuit board. WISDOM researchers say the hybrid boards can be fitted with components for a variety of uses, including sensor systems, avionics, data transmission, optical processing, and network security.
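
The two-stage design described above is easy to picture in software terms. The sketch below is only an analogy: the real WISDOM pre-filter is implemented in optics, and the patterns shown are placeholders, but it shows how a fast header pre-filter can hand just the suspect packets to a slower, more thorough second stage.

    SUSPECT_HEADER_PATTERNS = [b"\xde\xad\xbe\xef", b"EVIL"]   # placeholder signatures

    def fast_prefilter(packet):
        header = packet[:40]                 # inspect only the header bytes
        return any(p in header for p in SUSPECT_HEADER_PATTERNS)

    def deep_inspect(packet):
        # stand-in for the slower electronic analysis stage the article mentions
        return "blocked" if b"EVIL" in packet else "allowed"

    def firewall(packets):
        for pkt in packets:
            # only packets flagged by the fast pre-filter take the slow path
            yield deep_inspect(pkt) if fast_prefilter(pkt) else "allowed"

    print(list(firewall([b"GET / HTTP/1.1", b"EVIL payload", b"normal data"])))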


'Cultured' Robots Make Sweet Music Together
New Scientist (10/29/08)

A composer and computer scientist at the University of Plymouth in the United Kingdom has developed two robots that sing together in human-like voices. One robot sings a random sequence of about six notes, and the other responds with its own attempt at the sound. The first robot then compares the two sequences and nods if they are similar, and the second robot memorizes the settings that produced its sequence. The robots record only what they both know, gradually building a memory of similar sounds that becomes a shared culture. Eduardo Miranda, who programmed the robots, believes they could help him compose music that could not be created by a human. "The robots develop their own musical culture," says Miranda. "There are no pre-programmed musical rules."
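
The imitation game the robots play can be captured in a few lines of simulation. The sketch below is a toy rendering of that loop, not Miranda's software; the note range and similarity measure are invented for illustration.

    import random

    NOTES = list(range(60, 72))          # one octave of MIDI note numbers (assumption)

    def sing_random(length=6):
        return [random.choice(NOTES) for _ in range(length)]

    def imitate(phrase):
        # the second robot answers with small, random pitch deviations
        return [n + random.choice([-1, 0, 0, 0, 1]) for n in phrase]

    def similar(a, b, tolerance=3):
        return sum(abs(x - y) for x, y in zip(a, b)) <= tolerance

    shared_repertoire = []               # the robots' emerging "musical culture"
    for _ in range(20):
        phrase = sing_random()
        answer = imitate(phrase)
        if similar(phrase, answer):      # the first robot "nods": both memorize it
            shared_repertoire.append(answer)

    print(len(shared_repertoire), "phrases in the shared repertoire")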


California, Canada Campuses Combat Greenhouse Gas Emissions With Green IT
University of British Columbia (10/27/08) Chan, Lorraine

The University of British Columbia (UBC) and the University of California, San Diego (UCSD) have pledged to work together to reduce greenhouse gas emissions on their campuses and to develop information technology that improves energy efficiency and reduces the impact of emissions on climate change. "By pooling our knowledge, resources, and best practices, Canada and the U.S. will be that much more able to contribute cutting-edge research on climate change," says UBC's John Hepburn. In the short term, the institutions will work to develop methods for sharing greenhouse gas emissions data in connection with International Organization for Standardization standards for information, computer, and telecommunications equipment, as well as baseline emissions data for cyberinfrastructure and networks. UCSD's Art Ellis says the partnership "will examine how cyberinfrastructure can be used in research universities to create carbon-neutral environments. We are committed to sharing best practices, and working together to realize the promise of our collaboration." CANARIE chief research officer Bill St. Arnaud notes that huge growth has boosted the carbon footprint of high-performance computing (HPC), but he says networking and other trends such as virtualization promise to make HPC part of the solution.


ASU Supercomputer Provides Massive Computational Boost to Biomedical Research at TGen
Translational Genomics Research Institute (10/29/08) Yozwiak, Steve; Kullman, Joe

Arizona State University's new Saguaro 2 supercomputer, which will provide computational power to the Translational Genomics Research Institute (TGen), is capable of performing 700 billion computations in less than 1/60th of a second, says ASU High Performance Computing Initiative director Dan Stanzione. Saguaro 2 can perform 50 trillion mathematical operations per second, and Stanzione says TGen will need that computing power to research a variety of human diseases through the use of data-rich DNA sequencing, genotyping, microarrays, and bioinformatics. The supercomputer will support efforts in translational biomedicine and help develop new therapies for individual patients suffering from Alzheimer's, autism, diabetes, coronary heart disease, melanoma, pancreatic cancer, prostate cancer, colon cancer, multiple myeloma, and breast cancer. TGen CIO Edward Suh says a joint TGen-ASU computer support team is being assembled, and he wants TGen and ASU to create more partnerships. "I am confident this new supercomputer system will help the ASU and TGen scientists expedite their research and accelerate innovation in biomedical and engineering research," Suh says. Saguaro 2 is a partially water-cooled set of 7-foot-tall black monolith computer racks, each containing as many as 512 processor cores, linked by ultra-high-speed InfiniBand cables.
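
The two performance figures quoted above are consistent with each other, as a quick back-of-the-envelope check shows:

    ops = 700e9                # 700 billion computations
    interval = 1 / 60          # performed in 1/60th of a second
    rate = ops / interval      # sustained rate implied by those figures
    print(f"{rate / 1e12:.0f} trillion operations per second")   # ~42, under the 50-trillion peak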


NSF Enables Pakistan to Connect to Global Research Community Through New High Speed Link
National Science Foundation (10/28/08)

A functional U.S.-Pakistan network connection was tested last week during the Internet2 Emerging National Research and Education Networks session at the Fall Internet2 meeting in New Orleans. The Pakistan Higher Education Commission participated in the meeting over a 155 Mbps connection from Islamabad to the TEIN2 network, the National Science Foundation (NSF)-funded TransPAC2 network, and the Internet2 network. "This represents a major milestone in the development of physical network connectivity between Pakistan and the global scientific community," says NSF director Arden L. Bement Jr. "Now we must continue those efforts toward our true goal of enhancing global research and education collaborations." The network connection will enable Pakistani scientists from 60 universities and institutes, linked through the Pakistan Education Research Network, to work with their international peers on research projects that require fast data transfers to share information around the world. "This network connection is the result of the hard work of many people and groups from the U.S., Pakistan, and the EU," says Indiana University principal investigator James Williams. "It is now our responsibility to continue that hard work and cooperation as we transform this link into a valuable piece of international cyberinfrastructure."


'Smart' Robotic Sensors Monitor Activity at Mount St. Helens
Columbian (WA) (10/27/08) Raftery, Isolde

U.S. Geological Survey hydrologist Rick LaHusen, Washington State University Vancouver computer science professor WenZhan Song, a group of WSU Vancouver graduate students, and NASA's Jet Propulsion Laboratory are working to create more robust wireless communication systems for Mount St. Helens, which returned to dormancy in July following three years of eruptions. Aided by a $1.63 million NASA grant, the researchers are designing a dozen smart sensors that talk to each other and link to a central information hub. NASA scientists are monitoring the research in the hope of using the technology to explore Mars. "The sensors are always looking to see what the best route is in case an instrument has some kind of catastrophic event--ice, snow burial, or the ash might blow one way," LaHusen says. "They can relay data between themselves, making short hops that are more energy efficient." The sensors performed exactly as designed when they were recently deployed inside Mount St. Helens' moonlike crater. The sensors can detect movements and relay the information to the Johnston Ridge Observatory at the Mount St. Helens visitor center. The ad hoc network the sensors create could also be applied to radio systems, making radios more reliable in emergencies, or to wireless nodes that could detect where a network is broken to indicate, for example, the location of a mine collapse.
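
The energy argument for the "short hops" LaHusen describes can be illustrated with a simple calculation; the quadratic radio-energy model below is a common textbook assumption, not a measurement from the project.

    def tx_energy(a, b):
        # simplified radio model: transmission energy grows with distance squared
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

    nodes = {"sensor": (0, 0), "relay": (5, 1), "hub": (10, 0)}

    direct = tx_energy(nodes["sensor"], nodes["hub"])
    relayed = tx_energy(nodes["sensor"], nodes["relay"]) + tx_energy(nodes["relay"], nodes["hub"])
    print("one long hop   :", direct)    # 100 energy units
    print("two short hops :", relayed)   # 52 energy units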


Computational Pioneer Erez Lieberman Explains How the Web--and Spam--Evolves
Computerworld (10/27/08) Forrest, Sara

Erez Lieberman, a pioneer in mathematical and computational approaches to the study of evolution, testifies to modern computing's value to biology. "Whether you're studying human physiology or the mouse genome, 21st-century computation gives us the agility to handle huge data sets and to capture at least a fraction of the complexity that makes humans go," he says. Researchers studying social networks and the Internet are using Lieberman's research into evolution on networks, and Lieberman says that the Web, like biological systems, evolves through mutation and selection. He describes spam emails as an evolving population, noting that "just like viruses, which mutate their genome in order to disguise themselves from the immune system, spam emails incorporate typos like V1@gR@ to try and disguise themselves from spam filters. Now all this mutant spam gets sent out, and the emails that get the pitch across to the human reader while avoiding the spam filter will get reused in the future," which is an instance of selection. Lieberman also notes that evolutionary graph theory, which he developed with Harvard professor Martin Nowak, gave researchers insight into how network configuration has an impact on the evolution that occurs within the network.
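
Lieberman's mutation-and-selection analogy translates directly into a toy simulation. The one below is purely illustrative (the substitution table and the filter are invented), but it shows how variants that evade a naive filter come to dominate the population.

    import random

    SUBSTITUTIONS = {"i": "1", "a": "@", "o": "0", "e": "3"}

    def mutate(word):
        chars = list(word)
        i = random.randrange(len(chars))
        chars[i] = SUBSTITUTIONS.get(chars[i].lower(), chars[i])   # mutation step
        return "".join(chars)

    def passes_filter(word):
        return "viagra" not in word.lower()     # the naive keyword filter being evaded

    population = ["viagra"] * 20
    for _ in range(10):
        mutated = [mutate(w) for w in population]
        survivors = [w for w in mutated if passes_filter(w)] or population
        population = random.choices(survivors, k=20)    # selection: survivors get reused

    print(set(population))     # variants such as 'v1agra' or 'vi@gr@'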


If Looks Could Kill
Economist (10/25/08) Vol. 389, No. 8603, P. 95

Intelligent surveillance systems that can determine whether an observed person's intentions are hostile are being developed and implemented thanks to advances in behavior recognition technologies. One example is Project Hostile Intent, an initiative undertaken by the U.S. Department of Homeland Security's Human Factors Division focusing on the analysis of mostly involuntary "micro-expressions" that the human eye usually overlooks. Advocates of such technology say its concentration on behavior avoids racial profiling, while the University of Arizona's Judee Burgoon contends that the systems should be enhanced with cultural input to reduce the likelihood of false positives. The Human Factors Division also is working on the Future Attribute Screening Technology (FAST) program as a complement to Project Hostile Intent. FAST involves a series of sensors that can read skin temperature, blood-flow patterns, perspiration, and heart and breathing rates from a distance of several meters, and it reportedly had a roughly 80 percent success rate in detecting deceit or hostile intent in a recent demonstration. The FAST system does generate some false positives, which has some civil libertarians on alert.


Want an Easy Way to Control Your Gadgets? Talk to Them.
Discover (10/17/08) Cass, Stephen

AT&T is developing Watson, a voice-recognition system that will enable users to control their devices by talking to them. Although some cell phones already have voice-recognition tools for basic tasks, such as looking up phone numbers in a contact list, AT&T believes such devices can handle more complicated voice commands. For example, Watson will be able to help users find the closest ATM or automatically order a pizza when instructed. Watson is so complex that it is more practical to run the software on centralized servers than on individual mobile devices. Moreover, since most wireless devices can connect to the Internet and contain the hardware and software necessary to capture and compress speech, nearly any device can be configured to work with Watson. Captured speech is sent over the Internet or a cell phone network to an AT&T computer running Watson, which analyzes the speech and sends commands back to the device to carry out. AT&T is demonstrating the technology using a voice-operated TV remote control designed to work with AT&T's Internet TV service, allowing users to ask the remote to find programming in specific genres or with their favorite actors.
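
The client/server split described above can be sketched in a few lines. Everything here (function names, the intent format) is hypothetical and stands in for AT&T's actual service, but it shows where the work happens: the device only captures and compresses audio, while recognition runs remotely and a structured command comes back for the device to execute.

    import json

    def device_capture_audio():
        # stand-in for microphone capture and compression on the handset
        return b"...compressed audio bytes..."

    def server_recognize(audio_bytes):
        # stand-in for the server-side recognizer; in practice the audio would be
        # sent over the network and a structured intent would be returned
        return json.dumps({"intent": "find_program", "genre": "comedy"})

    def device_execute(command_json):
        command = json.loads(command_json)
        if command["intent"] == "find_program":
            print("Searching the program guide for", command["genre"], "shows")

    device_execute(server_recognize(device_capture_audio()))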


Symposium Examines Research Topics at Nexus of Digital Humanities and Computing
CLIR Issues (10/08) No. 65, Smith, Kathlin

The research challenges arising from the convergence of the humanities, humanistic social sciences, and technology were the focus of a recent symposium hosted by the Council on Library and Information Resources and the National Endowment for the Humanities. The purpose of the conference was twofold: to comprehend and envision how new media promote and transform the interpretation and analysis of text, image, and other sources of interest to the humanities and social sciences and facilitate new expression and teaching, and to understand how those processes of inquiry pose questions and challenges for research in computer science, the humanities, and the social sciences. One session stressed the opportunities and challenges for research, pedagogy, and learning, with participants noting the basic problem of sharing resources when so many exist in incompatible formats. Participants agreed on the importance of ensuring that scholarly resources are interoperable and that critical search and discovery tools are designed for broad use. Building better connections between cutting-edge and general digital technology users was another issue discussed, as was the challenge of changing faculty members' concept of publication from traditional forms to a services model. Research areas highlighted by participants as stemming from the intersection of computing and the humanities included language representation and computation to isolate characteristics of patterns in data, better techniques for searching and retrieving still and moving images, authoring-system problems, methods for visualizing uncertainty and annotating the premises behind conclusions, and consideration of whether validation methods used in computer science communities may be applicable to some humanistic data applications. Among the collaboration opportunities identified by symposium participants were partnering domain experts with computer scientists to produce domain ontologies and investigating data and tool preservation strategies through collaborations among supercomputing partners, digital humanities partners, and the National Archives and Records Administration.


Abstract News © Copyright 2008 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]