Welcome to the March 20, 2009 edition of ACM TechNews, providing timely information for IT professionals three times a week.
HEADLINES AT A GLANCE
Experts See Shortfall in Cybersecurity Research
InternetNews.com (03/19/09) Corbin, Kenneth
The United States is not prepared to deal with the emerging threats against the country's digital infrastructure, warned cybersecurity experts at a recent U.S. Senate Commerce Committee hearing. The experts also said that Congress needs to make information security and security education a bigger priority. "The simplest way to state this is the nation is under attack," said Purdue University professor Eugene Spafford, executive director of the Center for Education and Research in Information Assurance and Security. "It is a hostile attack, it is a continuing attack, and it has been going on for years, and we have been ignoring it." James Lewis, director of the Technology and Public Policy Program at the Center for Strategic and International Studies, said cyberattacks threaten the long-term economic competitiveness and technological leadership of the United States. Senate Commerce Committee chairman John Rockefeller (D-W.Va.) said he is encouraged by President Obama's focus on cybersecurity, but cautioned that time is critical as computers are increasingly being used to manage the country's infrastructure. Rockefeller said he plans to introduce legislation that would boost cybersecurity education at the university level. Spafford said the U.S. needs more cybersecurity experts, noting that universities graduate just 50 to 60 Ph.D.s a year in fields related to cybersecurity. "Of those perhaps 10 to 15 are going to return to their home countries to start businesses to compete against the U.S. because our visa policies won't let them stay," he said.
Computer Experts Unite to Hunt Worm
New York Times (03/19/09) P. A15; Markoff, John
Computer security experts and ICANN are battling the author of the Conficker botnet to prevent the malware program from causing further disruption on the Internet. Since first appearing late last year, Conficker has spread rapidly, infecting millions of computers and creating a powerful botnet. Conficker's author has updated the program several times in response to efforts to shut it down. An examination of Conficker found that infected computers are programmed to try to contact a control system for instructions on April 1. Speculation over the nature of Conficker's threat has ranged from a wake-up call to a devastating attack. Researchers working on disassembling the Conficker code have not been able to determine where the author, or authors, are located, or whether the program is maintained by one person or a group. The consensus is that the Conficker botnet will ultimately be sold as a computing-for-hire scheme. Several experts who have analyzed various versions of the malware say that Conficker's authors have been monitoring the efforts to restrict the malicious program, and have repeatedly demonstrated that their skills are at the cutting edge of computer technology. The U.S. Federal Bureau of Investigation's Paul Bresson says the bureau is aware of the worm and is working with security companies to address the problem. A report from SRI International says the latest version, Conficker C, represents a major rewrite of the software that makes it far more difficult to block communication with the program, and also gives it the ability to disable many commercial antivirus programs and Microsoft's security update features.
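What makes communication with such a botnet hard to block is that infected machines can derive a fresh list of rendezvous domains from the date, so there is no fixed server for defenders to cut off. The Python sketch below illustrates the general date-seeded domain-generation idea only; the hash scheme, domain count, and TLD pool are hypothetical, not Conficker's actual algorithm.

```python
import hashlib
from datetime import date

TLDS = [".com", ".net", ".org", ".info", ".biz"]  # hypothetical TLD pool

def candidate_domains(day: date, count: int = 500) -> list[str]:
    """Derive a day-specific list of rendezvous domains.

    Infected machines and the botnet operator run the same function, so
    both sides agree on today's domains in advance; the operator needs to
    register only one, while defenders would have to block them all.
    """
    domains = []
    for i in range(count):
        seed = f"{day.isoformat()}-{i}".encode()
        digest = hashlib.sha256(seed).hexdigest()
        # Map the hash to an 10-character lowercase label.
        label = "".join(chr(ord("a") + int(c, 16) % 26) for c in digest[:10])
        domains.append(label + TLDS[i % len(TLDS)])
    return domains

print(candidate_domains(date(2009, 4, 1))[:5])
```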
Diebold Admits Systemic Audit Log Failure; State Vows Inquiry
Wired News (03/17/09) Zetter, Kim
Premier Election Solutions admitted at a recent California state hearing that the audit logs generated by its tabulation software miss significant events, such as someone erasing votes on election day, and that this problem is endemic to all versions of the software. "The audit logs have been the top selling point for vendors hawking paperless voting systems," said California Voter Foundation president Kim Alexander. "They and the jurisdictions that have used paperless voting machines have repeatedly pointed to the audit logs as the primary security mechanism and 'fail-safe' for any glitch that might occur on machines." Premier's Global Election Management System (GEMS) software is used to tabulate votes cast on the company's touch-screen and optical-scan machines. It is used in more than 1,400 election districts in 31 states. The initial investigation was set off by an incident in Humboldt County in which a Premier system lost nearly 200 ballots during the U.S. presidential election in November. The company said the deletion was caused by a programming flaw in the GEMS software. Scrutiny of the audit logs by state officials revealed that the logs did not record critical information, making it impossible to trace the specific mechanism behind the deletion. Furthermore, two of the logs featured a "clear" button that allowed officials to delete them, and although the button was eliminated in a later iteration of the GEMS software, three California counties still used the version with the button. California Secretary of State Debra Bowen described the audit logs as "useless" and pledged to conduct a deeper probe of the issue.
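The "fail-safe" property officials expected from audit logs is tamper evidence: an entry, once written, should be impossible to erase silently. As a hedged illustration of what that takes (not Premier's design), the sketch below chains each log entry to the hash of the previous one, so deleting or editing any entry breaks verification:

```python
import hashlib, json, time

class AuditLog:
    """Append-only log; each entry commits to the previous entry's hash."""

    def __init__(self):
        self.entries = []
        self.prev_hash = "0" * 64  # genesis value

    def append(self, event: str):
        entry = {"ts": time.time(), "event": event, "prev": self.prev_hash}
        self.prev_hash = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(entry)

    def verify(self) -> bool:
        prev = "0" * 64
        for entry in self.entries:
            if entry["prev"] != prev:   # an edited entry breaks the chain
                return False
            prev = hashlib.sha256(
                json.dumps(entry, sort_keys=True).encode()
            ).hexdigest()
        return prev == self.prev_hash   # a deleted tail entry is caught here

log = AuditLog()
log.append("ballot batch 12 imported")
log.append("ballot batch 12 deleted")
assert log.verify()
del log.entries[1]          # a "clear"-style erasure...
assert not log.verify()     # ...is now detectable
```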
Ancient Cultures a Click Away
ICT Results (03/19/09)
European researchers working on the Mosaica project are developing portals that will enable users to virtually travel to both ancient and new destinations. "Mosaica was an opportunity to carve a niche in the semantic web for cultural diversity; to transfer young people's inherent fascination for technology into a learning experience; and to galvanize communities around important topics like preserving cultural heritage," says project leader Raphael Attias. "We've created a platform for playing and learning; with games as a hook for younger generations whose attention is harder to get--and keep--these days." Mosaica's goal is to bring the world into every classroom and every home. The project has developed a demo that features Jewish cultural heritage and has three main modes of use--explorative, guided, and collaborative. Users can explore Jewish heritage on their own time, take a guided tour or virtual expedition that provides them with prompts and recommendations on dynamically generated maps, or work together to share knowledge. Users can make notes on cultural objects, including free-text comments or semantically annotated comments using dynamic ontology creation, and contribute content such as photos. Students also can use Mosaica to record their own virtual expeditions.
Robot Fish to Catch Pollution
Financial Times (03/20/09) P. 6; Harvey, Fiona
Researchers at Britain's University of Essex and the BMT Group have developed robotic fish that will be released into the port of Gijon in northern Spain to monitor the water's quality. The fish are lifelike in appearance and equipped with tiny chemical sensors capable of detecting pollutants in the water. BMT senior researcher Rory Doyle says there are very practical reasons for developing robots based on fish. "In using robotic fish we are building on a design created by hundreds of millions of years' worth of evolution, which is incredibly energy efficient," Doyle says. "This efficiency is something we need to ensure that our pollution detection sensors can navigate in the underwater environment for hours on end." The robots are autonomous and run on batteries that are recharged when the robots automatically return to a charging station. University of Essex professor Huosheng Hu says the fish will be able to detect changes in the environment in the port, and identify the early signs of a pollutant spreading. Hu says the objective is for the fish to detect pollutants early to prevent leaks from getting worse. The robotic fish should be released into the port next year.
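The autonomy pattern the article describes--patrol and sample the chemical sensors, then return to the charging station when power runs low--can be pictured as a simple control loop. The Python sketch below is a toy illustration with made-up sensor values and thresholds, not the Essex/BMT control software:

```python
import random

BATTERY_LOW = 0.2   # recharge threshold (fraction of full charge) -- assumed
ALERT_PPM = 3.0     # pollutant alert level -- assumed

def read_pollutant_ppm() -> float:
    """Stand-in for the onboard chemical sensor."""
    return random.uniform(0.0, 5.0)

def run(steps: int = 100) -> None:
    battery = 1.0
    for t in range(steps):
        if battery < BATTERY_LOW:
            print(f"t={t}: returning to charging station")
            battery = 1.0            # recharge, then resume patrol
            continue
        battery -= 0.01              # swimming costs energy
        reading = read_pollutant_ppm()
        if reading > ALERT_PPM:
            print(f"t={t}: pollutant alert, {reading:.2f} ppm")

run()
```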
Educating NITRD
Computing Research Association (03/18/09) Wilson, Cameron
ACM, the Computing Research Association (CRA), and the National Center for Women & Information Technology (NCWIT) are working to raise the profile of computer science education efforts within the federal government's Networking and Information Technology Research and Development (NITRD) program. NITRD spans several government agencies to coordinate investments in information technology (IT) research and development. In 2007, the President's Council of Advisors on Science and Technology issued a report containing several recommendations for NITRD, including the need to improve computing education and strengthen the IT workforce pipeline. ACM, CRA, and NCWIT are working together on a letter outlining ideas for how NITRD could be improved and used to address computer science education issues, specifically at the K-12 level. The letter aims to strengthen the pipeline by expanding, better utilizing, and coordinating existing education efforts within the NITRD program. The letter offers four specific recommendations: promoting computing education, particularly at the K-12 level, and increasing exposure to computing education and research opportunities for women and minorities as core elements of the NITRD program; requiring the NITRD program to address education and diversity programs in its strategic planning processes; expanding efforts at the National Science Foundation to focus on computer science education by broadening the Math Science Partnership program; and enlisting the U.S. Department of Education and its resources to address computer science education issues.
Internet Can Warn of Ecological Changes
University of East Anglia (03/19/09)
Researchers from Sweden's Stockholm University and Britain's University of East Anglia (UEA) say the Internet could be used as an early warning system for potentially disastrous ecological changes. In an article published in the journal Frontiers in Ecology and the Environment, the researchers explore the option of using Internet data to detect approaching ecosystem changes. "Information and communications technology is revolutionizing the generation of and access to information," says Stockholm University's Victor Galaz, the article's lead author. "Systematic data mining of such information through the Internet can provide important early warnings about pending losses of ecosystem services." The authors focus on three potential approaches that use Web crawlers to identify possible ecological shifts. First, Web crawlers can collect information on the drivers of ecosystem changes, instead of just on the resulting change. Second, future early warning systems can exploit the recent insight that ecosystems sometimes signal an upcoming collapse in advance; the variability of fish populations, for example, has been shown to increase in response to overexploitation. Third, Web crawlers can find information that describes ecological changes at small scales, which could warn of similar shifts in other locations, including warnings of invasive species and reduced resilience of ecosystems at larger scales due to the small-scale loss of interconnected populations or ecosystems.
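A bare-bones version of the first approach--crawling Web sources for mentions of ecosystem-change drivers--might look like the Python sketch below. The URL and keyword list are placeholders for illustration, not the researchers' actual corpus or term set:

```python
import re
import urllib.request

SOURCES = ["https://example.org/"]  # placeholder source list
DRIVER_TERMS = re.compile(
    r"\b(overfishing|algal bloom|invasive species|coral bleaching)\b",
    re.IGNORECASE,
)

def scan(url: str) -> list[str]:
    """Fetch a page and return sentences mentioning ecological drivers."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    text = re.sub(r"<[^>]+>", " ", html)          # crude tag stripping
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text)
            if DRIVER_TERMS.search(s)]

for url in SOURCES:
    for hit in scan(url):
        print(url, "->", hit[:120])
```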
Virtual Breath
University of Texas at Austin (03/11/09) Dubrow, Aaron
Researchers at the University of Auckland and the University of Iowa are developing a system for modeling, interpreting, and diagnosing the movement of air through the human lung, using Texas Advanced Computing Center (TACC) high-performance computing resources, with the goal of determining how abnormalities can cause illness. The researchers are evaluating a new methodology for simulating patient-specific pulmonary airflow on TACC's Lonestar and Ranger supercomputers. "We use computed tomography (CT) images to construct realistic human lung models, and then we use computational fluid dynamics (CFD) models to simulate the airflow through the lung," says Iowa professor Ching-Long Lin, whose algorithms can model up to 23 generations of branching airways to create a very complex and useful mesh simulation. Lin has contributed a numerical scheme that integrates extremely accurate three-dimensional (3D) models of the lungs with less precise but more widespread one-dimensional (1D) models. The data derived from Lin's framework represents the lungs of a specific individual because it is based on actual high-fidelity CT scans. "The multi-scale, CFD framework is essential for delivering this technology 'to the bedside' because it allows us to select a combination of the 3D and 1D domains to give fine-scale computation (3D) in critical areas, and coarse scale computation (1D) in the remainder of the lung," says Auckland Bioengineering Institute research fellow Merryn Tawhai. Lin and his colleagues are using their system to explore biomedical issues in which a realistic airway model is critical, such as the role airflow plays in patients with abnormal tracheal structures. Lin says the use of Lonestar and Ranger is essential to modeling complex physiological structures with sufficient resolution to be useful and meaningful.
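The 1D side of such a multiscale framework can be pictured as a resistance network over the branching generations. Below is a hedged, textbook-style Python sketch using Poiseuille resistance and illustrative Weibel-type airway dimensions--not the group's actual model:

```python
import math

MU = 1.8e-5          # air dynamic viscosity, Pa*s
SCALE = 2 ** (-1/3)  # assumed per-generation scaling of diameter and length

def poiseuille_resistance(length_m: float, radius_m: float) -> float:
    """Laminar-flow resistance of one airway tube, Pa*s/m^3."""
    return 8 * MU * length_m / (math.pi * radius_m ** 4)

def total_resistance(generations: int = 23,
                     d0: float = 0.018, l0: float = 0.12) -> float:
    """Sum resistances down a symmetric branching tree (illustrative
    numbers: trachea diameter 1.8 cm, length 12 cm)."""
    total, d, l = 0.0, d0, l0
    for g in range(generations + 1):
        r_single = poiseuille_resistance(l, d / 2)
        total += r_single / 2 ** g      # 2**g identical airways in parallel
        d *= SCALE
        l *= SCALE
    return total

flow = 0.5e-3  # 0.5 L/s quiet breathing, in m^3/s
print(f"pressure drop ~ {total_resistance() * flow:.1f} Pa")
```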
New Mathematical System Helps to Cut Bus Journey Times
University of Burgos (Spain) (03/17/09)
Researchers from the University of Burgos (UBU) have used heuristic algorithms and the "taboo" (tabu) search method to improve bus service in Burgos, Spain. The approach enables their system to handle imprecise data and to avoid re-examining solutions it has already explored. "For example, when searching for solutions, if a bus has just left a particular stop, this stop is then marked as 'taboo' and it cannot be included as part of that route again for a certain number of iterations [repetitions]," says Joaquin A. Pacheco, coordinator of UBU's Research Group on Metaheuristic Techniques. Slight modifications to current bus lines can be made as a result of the taboo search. The new system lowers the average wait at Burgos bus stops from 20 minutes to 17 minutes, and reduces bus journey times from 16 minutes to 13.5 minutes. "When we face a problem of this kind, we can use an exact method, which gives us an optimal solution but takes a long time to calculate--or we can use an approximate or heuristic technique, which provides a good solution with less calculation time," Pacheco says.
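A minimal tabu-search skeleton in the spirit of Pacheco's stop-marking example is sketched below in Python; the neighborhood, cost function, and toy travel-time matrix are placeholders, not UBU's actual model:

```python
import random

def tabu_search(initial, neighbors, cost, iters=200, tenure=10):
    """Greedily move to the best neighboring solution, but forbid recently
    used moves for `tenure` iterations so the search does not cycle back to
    solutions it has just visited (unless a move improves on the best)."""
    current = best = initial
    tabu = {}  # move -> iteration until which it stays forbidden
    for it in range(iters):
        candidates = [(move, sol) for move, sol in neighbors(current)
                      if tabu.get(move, -1) < it or cost(sol) < cost(best)]
        if not candidates:
            break
        move, current = min(candidates, key=lambda ms: cost(ms[1]))
        tabu[move] = it + tenure          # e.g., "this stop was just served"
        if cost(current) < cost(best):
            best = current
    return best

# Toy usage: order 6 stops to minimize travel time over a random matrix.
random.seed(1)
n = 6
t = [[0 if i == j else random.randint(2, 9) for j in range(n)] for i in range(n)]
route_cost = lambda r: sum(t[r[i]][r[i + 1]] for i in range(len(r) - 1))

def swap_neighbors(route):
    for i in range(1, len(route) - 1):
        for j in range(i + 1, len(route)):
            r = list(route); r[i], r[j] = r[j], r[i]
            yield (route[i], route[j]), tuple(r)

print(tabu_search(tuple(range(n)), swap_neighbors, route_cost))
```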
Browser Coders Make Chrome Shine
Technology Review (03/18/09) Naone, Erica
Following the beta release of its Chrome Web browser, Google launched Chrome Experiments, a project to demonstrate Chrome's full potential. Chrome Experiments showcases applications that require significant data processing, some of them running across multiple Web pages simultaneously. Many of the demos in the project would cause other Web browsers to crash, say the developers of the experiments. Although some developers say the demonstrations highlight new opportunities for building complex Web software, others say it may be difficult to standardize the required features and that browser security should remain a higher priority. One of the experiments, Twitch, designed by University of California, Los Angeles professor Casey Reas, demonstrates Chrome's ability to launch each window or tab as a separate process on a computer, which allows multiple windows to run as if they were separate applications. Darrin Fisher, one of the engineering leads for Google Chrome, notes that all of the experiments use just HTML and JavaScript, and are designed to demonstrate what is possible with basic Web technology. "People typically think that they have to use Flash to get things done," Fisher says. However, he notes that not all browser functions work with plug-ins such as Flash and Java. Chrome Experiments offers "a glimpse of a Web without proprietary plug-ins," Reas says. "This is how the Web and innovation on the Web happened back in the mid-1990s."
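The process-per-tab design that Twitch exploits can be illustrated by analogy (in Python rather than browser code): each "tab" runs in its own OS process, so one crashing does not take the others down. This is a conceptual sketch of the isolation property, not Chrome's implementation:

```python
import multiprocessing as mp

def tab(name: str, crash: bool):
    """Stand-in for one browser tab's work."""
    if crash:
        raise RuntimeError(f"{name} crashed")
    print(f"{name} rendered fine")

if __name__ == "__main__":
    tabs = [mp.Process(target=tab, args=(f"tab-{i}", i == 1)) for i in range(3)]
    for p in tabs:
        p.start()
    for p in tabs:
        p.join()
        # A nonzero exit code means that tab died -- the others keep running,
        # which is the fault isolation a process-per-tab architecture buys.
        print(p.name, "exit code:", p.exitcode)
```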
Researchers Predict Click-Through Behavior in Web Searches
Penn State Live (03/11/09) Spinelle, Jenna; Messer, Andrea
Researchers from Penn State University (PSU) and Queensland University of Technology have developed a way to measure whether search engine users are satisfied with their results. The team used search logs from Dogpile.com to identify factors that would help predict whether user click-throughs increase or decrease. For example, the number of records in a search was among the five positive factors, the number of organic links clicked was among the four negative factors, and user intent was found to have no effect. The researchers created input and output values for neural networks, and used the systems to analyze the way users interacted with the search engine results. "Because click-through is based on each user, we grouped the records according to each unique IP address and cookie to determine a single user," says PSU professor Jim Jansen. The researchers found that more searchers clicked through early in the day and more searchers using Internet Explorer clicked through. "This research explores the online behaviors of users so that commercial search engine companies can utilize the data to improve click-through rates by designing more efficient retrieval and ranking algorithms," Jansen says.
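The shape of this approach--session-level features in, click-through prediction out--can be sketched as follows. Here a single-layer logistic model stands in for the paper's neural networks, and the feature names and data are synthetic placeholders, not Dogpile.com logs:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-session features standing in for the paper's factors,
# e.g., [number of results returned, organic links clicked, hour of day].
X = rng.random((500, 3))
# Synthetic labels (did the user click through?) from a toy generating rule.
y = (0.8 * X[:, 0] - 0.6 * X[:, 1] + 0.1 * X[:, 2]
     + 0.1 * rng.standard_normal(500) > 0.2).astype(float)

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
w, b = np.zeros(3), 0.0

for _ in range(2000):                  # gradient descent on the log loss
    p = sigmoid(X @ w + b)
    w -= 0.5 * (X.T @ (p - y) / len(y))
    b -= 0.5 * (p - y).mean()

print("learned weights:", w.round(2))  # sign indicates positive/negative factor
print("predicted click-through for a new session:",
      round(float(sigmoid(np.array([0.9, 0.1, 0.5]) @ w + b)), 3))
```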
Safer Net Surfing Is Goal of NIST Domain Name Security Experts
NIST Tech Beat (03/10/09) Brown, Evelyn
Scientists from the National Institute of Standards and Technology (NIST) are developing standards, guidance, and testing procedures designed to improve the security of the Domain Name System (DNS). Currently, DNS offers no way to authenticate the source or integrity of a response, making it easier to redirect users away from legitimate addresses to Web sites that participate in phishing or other illegal Internet-based activity. NIST computer scientists led the development of new Internet Engineering Task Force standards to add digital signatures and associated key management procedures to DNS protocols. These additions, known as DNSSEC, let users validate the authenticity and integrity of the data and will supply the foundation for a new trust infrastructure for the DNS and the protocols and systems that depend on it. NIST has posted a draft update of its guidelines for DNS security, which is now available for public comment. Additionally, NIST recently provided technical assistance to help secure the .gov top-level domain. "We hope that the .gov deployment of DNSSEC will encourage rapid deployment in other sectors, including government contractors, trading partners, and general e-commerce sites," says NIST researcher Scott Rose.
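What DNSSEC validation looks like in practice can be sketched with the dnspython library: fetch a signed zone's DNSKEY records along with their signatures and verify that the key set is correctly self-signed. A full validator would also chain trust to the root via DS records; the zone and resolver address here are illustrative:

```python
import dns.dnssec
import dns.message
import dns.name
import dns.query
import dns.rdataclass
import dns.rdatatype

zone = dns.name.from_text("nist.gov.")
query = dns.message.make_query(zone, dns.rdatatype.DNSKEY, want_dnssec=True)
response = dns.query.tcp(query, "8.8.8.8", timeout=5)

# Pull the DNSKEY RRset and the RRSIG records covering it from the answer.
dnskey = response.get_rrset(response.answer, zone,
                            dns.rdataclass.IN, dns.rdatatype.DNSKEY)
rrsig = response.get_rrset(response.answer, zone, dns.rdataclass.IN,
                           dns.rdatatype.RRSIG, dns.rdatatype.DNSKEY)

try:
    dns.dnssec.validate(dnskey, rrsig, {zone: dnskey})
    print("DNSKEY RRset signature verified")
except dns.dnssec.ValidationFailure as err:
    print("validation failed:", err)
```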
Venture Looks to Future of Data Delivery
Nikkei Weekly (03/09/09) Vol. 47, No. 2378, P. 17
Visible light communications is a new data transmission technique in which data is sent to devices via light-emitting diodes (LEDs). The method is being researched by Keio University professor Shinichiro Haruyama, and practical applications could include analyzing shoppers' movements in retail stores using LED lights in the ceiling and sensors attached to the bottom of shopping carts; such a system is about to undergo testing in a supermarket by Nakagawa Laboratories. Practical visible light communications became possible with the rollout of LED lights that can blink at rates of more than 2 million times per second, allowing large volumes of data to be transmitted by encoding digital signals as blink patterns. Visible light communications is viable in places such as hospitals where conventional communications methods are not allowed because of their potential to interfere with the functioning of other devices, and another advantage is the lower equipment investment. NEC, Toshiba, and Nippon Signal have teamed up to study LED traffic signals that can share data using light rather than wires, which promises considerable cost savings. Companies are especially excited about visible light communications' ability to transmit information to cell phones, and in October 2008 the Visible Light Communications Consortium voted to unify visible light communications standards with infrared light communications standards.
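The simplest way to encode digital data as blink patterns is on/off keying: LED on for a 1 bit, off for a 0. The Python sketch below illustrates that encoding and its round trip; real systems add clocking, framing, and error correction, and the payload here is made up:

```python
BLINK_RATE_HZ = 2_000_000  # the article's figure: LEDs toggling 2M+ times/sec

def to_blinks(data: bytes) -> list[int]:
    """Encode bytes MSB-first as on/off keying: 1 = LED on, 0 = LED off."""
    return [(byte >> bit) & 1 for byte in data for bit in range(7, -1, -1)]

def from_blinks(bits: list[int]) -> bytes:
    """Recover bytes from a sampled blink pattern."""
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[k:k + 8]))
        for k in range(0, len(bits), 8)
    )

pattern = to_blinks(b"aisle 7: milk")
assert from_blinks(pattern) == b"aisle 7: milk"
print(f"{len(pattern)} blinks, "
      f"{len(pattern) / BLINK_RATE_HZ * 1e6:.2f} microseconds on air")
```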
Abstract News © Copyright 2009 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: [email protected]