Association for Computing Machinery
Welcome to the July 21, 2010 edition of ACM TechNews, providing timely information for IT professionals three times a week.

HEADLINES AT A GLANCE


Cyberwarrior Shortage Threatens U.S. Security
NPR Online (07/19/10) Gjelten, Tom

The security of U.S. computer systems requires an army of cyberwarriors, but a severe shortage of skilled computer security specialists is leaving the nation vulnerable. "We don't have sufficiently bright people moving into this field to support those national security objectives as we move forward in time," says former U.S. National Security Agency cybersecurity specialist James Gosler. Only about 1,000 people in the United States have the sophisticated skills needed for the most demanding cyberdefense tasks, Gosler says, and demand far outstrips that supply: he estimates that 20,000 to 30,000 similarly skilled cyberspecialists are needed. The Center for Strategic and International Studies (CSIS) highlights the problem in a forthcoming report, "A Human Capital Crisis in Cybersecurity." A key element of a "robust" cybersecurity strategy is "having the right people at every level to identify, build, and staff the defenses and responses," according to the report. Some members of Congress are promoting a U.S. Cyber Challenge, a national talent search at the high school level with the goal of finding up to 10,000 potential cyberwarriors.


EU Boosts Hi-Tech Research Budget
BBC News (07/20/10)

The European Union (EU) plans to increase its budget for scientific research and innovation by 12 percent next year to 6.4 billion euros. The research will focus on energy projects, climate change, health, food security, and Europe's aging population. About 600 million euros will go toward advanced computer technologies, 400 million euros will be spent on computer applications that address the challenges of building a low-carbon economy and managing aging populations, and 270 million euros have been earmarked for nanotechnologies. An additional 600 million euros is targeted for health research, about 206 million euros of which will go into clinical trials for new drugs. About 16,000 research bodies and businesses will receive grants, and more than 165,000 jobs will be created in the effort to make Europe more competitive and greener. "Research and innovation are the only smart and lasting route out of crisis and towards sustainable and socially equitable growth," says EU commissioner Máire Geoghegan-Quinn. She notes that EU-funded research currently accounts for about 5 percent of total public research funding in the EU.


EU Scientists Make Virtual Reality Touchable
CORDIS News (Belgium) (07/20/10)

European researchers working on the immersive multimodal interactive presence (IMMERSENCE) project have developed an approach to generating virtual reality (VR) content using haptic technology that enables them to virtually teleport real objects through cyberspace, "touch" virtual reality, and feel the movements of a virtual dance partner. "We know that the more senses that can be used, the more interaction, the greater the sense of presence," says Technische Universität München researcher Andreas Schweinberger, coordinator of the IMMERSENCE project. The researchers, collaborating from nine universities and research centers across Europe, developed haptic and multimodal interfaces, new signal processing techniques, and a new method to generate VR objects from real-world objects in real time. The latter technology uses a three-dimensional scanner and advanced modeling system to create a virtual representation of a real object, which can then be transmitted to someone at a remote location. Other researchers focused on creating human-to-human interaction in a virtual environment. "The audiovisual aspects of VR have come a long way in recent years, so adding a sense of touch is the next step," Schweinberger says.


Survey Reveals Ed Tech Is Progressing, if Slowly
THE Journal (07/19/10) Aronowitz, Scott

U.S. primary and secondary schools and colleges are showing progress in education technology implementation, but the pace of that progress is very slow, according to a new national education technology survey. The overall progress of the education technology implementation plan, called Vision K-20, has improved in four of the five areas measured since 2009, according to the Software & Information Industry Association (SIIA) survey. However, the average increase in the scores tracking that progress was less than one percent. "America's students are moving ever more quickly to 21st century technologies, but education leaders and institutions are not responding with the educational framework needed to keep pace with either the opportunity or the needs," says SIIA's Karen Billings. The survey indicates that minimal financial resources have limited technology growth. "With scarce resources, it becomes even more critical for institutions to use technology to more efficiently achieve their educational goals and outcomes," Billings says.


The Semantic Web Made Easy
Universidad Politécnica de Madrid (Spain) (07/21/10) Martinez, Eduardo

Researchers at Universidad Autónoma de Madrid (UAM) and Universidad Politécnica de Madrid (UPM) have jointly developed Fortunata, a tool designed to simplify use of the semantic Web. The researchers say Fortunata can be used by developers, graphic designers, and end users without an in-depth knowledge of informatics. Their aim is to improve the Internet by extending interoperability among software systems and creating programs that can search and interrelate information without human operators. Early tests show that users find the applications generated with Fortunata highly usable and satisfying, irrespective of their knowledge of informatics. The tests also demonstrate that graphic designers can create attractive Web templates with Fortunata's tools, which developers can then use to build Web applications that render semantic data. In the future, semantic agents will be able to select the template best suited to each user and adapt it to the user's device, whether a mobile phone, TV, or personal computer.
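
The article does not describe Fortunata's internals, but a small sketch conveys what "rendering semantic data" involves. The Python fragment below uses the rdflib library; the ex: vocabulary and all resource names are invented for illustration and are not Fortunata's API. It builds a tiny semantic graph and queries it with no human interpretation of the data:

    # Minimal semantic-data sketch using Python's rdflib.
    # Illustrative only; this is not Fortunata's actual API.
    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import RDF

    EX = Namespace("http://example.org/")  # invented vocabulary

    g = Graph()
    g.add((EX.fortunata, RDF.type, EX.Tool))
    g.add((EX.fortunata, EX.developedBy, Literal("UAM and UPM")))

    # A program can search and interrelate the data by querying the
    # graph with SPARQL, no human operator required.
    results = g.query(
        """SELECT ?tool ?dev WHERE {
               ?tool a ex:Tool ;
                     ex:developedBy ?dev .
           }""",
        initNs={"ex": EX})

    for tool, dev in results:
        print(tool, dev)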


Offshore R&D Spending Grows, Recession or Not
Computerworld (07/19/10) Thibodeau, Patrick

Global engineering research and development (ER&D) spending reached $1.1 trillion in 2009, according to a new study by Booz & Co. and the National Association of Software and Services Companies. The U.S. accounted for 38 percent of ER&D spending, Japan 14 percent, and Asia 7 percent, with most of the remainder coming from Europe. India's share of the global market increased about 40 percent over the last three years and was $8.3 billion last year. India's amount is expected to rise to $45 billion by 2020, when worldwide ER&D spending is projected to reach $1.4 trillion, with the U.S. in the lead at 35 percent, and with Japan at 13 percent and Asia at 11 percent. True innovation work has not topped 20 percent of offshore spending in India. "The high-end work will be retained in the U.S., so don't expect India to do an iPhone," says Booz's Vikas Seghal. Nevertheless, Seghal says ER&D growth in India has helped put the nation in a strong position to develop its own industries. India also could have a deflationary impact on global engineering services that could keep costs from escalating, since the average hourly engineering salary is $4 in India, compared with $41 in the U.S., Seghal notes.


Virtual Universe Study Proves 80-Year-Old Theory on How Humans Interact
Imperial College London (07/19/10) Goodchild, Lucy

A study by Imperial College London (ICL), in conjunction with the Medical University of Vienna and the Santa Fe Institute, offers large-scale evidence for the psychological theory known as Structural Balance Theory (SBT), which holds that some relationships are more stable than others in a society. The ICL study drew on interactions between players of the virtual universe game Pardus, data more detailed than that from other electronic sources because it records the type of each relationship and whether it is positive or negative. The research shows that positive relationships form stable networks in society, confirming SBT at scale for the first time. "Our new study reveals in more detail than ever before the key ingredients that make these networks stable," says ICL's Renaud Lambiotte. The researchers analyzed the game data by examining individual networks and the interplay among all the networks. They are now using the tools developed for the study to search for patterns of communication among millions of people in mobile phone data.
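
SBT's core test is simple to state: a closed triad of relationships is balanced when the product of its signs is positive (a friend's friend is a friend; an enemy's enemy is a friend). The Python sketch below, using invented toy data rather than the Pardus dataset, counts balanced and unbalanced triads in a small signed network:

    from itertools import combinations

    # Signed relationship network: +1 friendly, -1 hostile.
    # Toy data for illustration, not the Pardus dataset.
    edges = {
        ("a", "b"): +1, ("b", "c"): +1, ("a", "c"): +1,  # mutual friends
        ("a", "d"): -1, ("b", "d"): -1,                  # a and b share an enemy
    }

    def sign(u, v):
        return edges.get((u, v)) or edges.get((v, u))

    nodes = {n for pair in edges for n in pair}
    balanced = unbalanced = 0
    for u, v, w in combinations(sorted(nodes), 3):
        s = [sign(u, v), sign(v, w), sign(u, w)]
        if None in s:
            continue  # not a closed triad
        if s[0] * s[1] * s[2] > 0:
            balanced += 1     # "friend of my friend" / "enemy of my enemy"
        else:
            unbalanced += 1   # strained triad, predicted to be unstable

    print(balanced, unbalanced)  # here: 2 balanced, 0 unbalanced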


Revealing the True Colors of Masterworks
Technology Review (07/19/10) Mashberg, Tom

Northwestern University (NU) researchers employed enhanced image-processing technologies for colorizing black-and-white images to reveal the colors used by Henri Matisse on one of his paintings. The researchers used information about Matisse's other works, as well as color information from test samples of the work itself, to help colorize a 1913 black-and-white photo of the painting--Bathers by a River--while it was still a work in progress. "Matisse tamped down earlier layers of pinks, greens, and blues into a somber palette of mottled grays punctuated with some pinks and greens," says NU professor Sotirios A. Tsaftaris. The researchers made a high-resolution digital version of the 1913 photograph to work from. They also took multiple digital photos of the painting in its current form, going quadrant by quadrant to obtain a resolution of 4,000 by 5,000 pixels. Finally, they used sample data from their collaborators at the Art Institute of Chicago--cross-sections of the hidden paint layers on Bathers, obtained by removing microscopic core samples of the painting for spectroscopic analysis. "This research is an excellent example of collaborative research between computer science, art conservation, and art history," says Rochester Institute of Technology color scientist Roy S. Berns.
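
The article does not give the colorization pipeline itself, but the basic move, using colors known at sample points to constrain the colors assigned to gray values, can be sketched. The Python/NumPy fragment below is a deliberately crude nearest-luminance lookup; the sample values are invented, and the NU team's actual method is far more sophisticated:

    import numpy as np

    # Toy sample-based colorization (not the NU researchers' method):
    # map each gray pixel to the color of the reference sample whose
    # luminance is closest, scaled to track the original brightness.

    # Hypothetical reference samples: (luminance 0-255, (R, G, B)).
    samples = [(60, (90, 100, 140)),    # muted blue
               (120, (120, 150, 110)),  # green
               (200, (220, 180, 190))]  # pink

    ref_lum = np.array([s[0] for s in samples])
    ref_rgb = np.array([s[1] for s in samples])

    gray = np.random.randint(0, 256, size=(500, 400))  # stand-in for the scan

    idx = np.abs(gray[..., None] - ref_lum).argmin(axis=-1)
    color = ref_rgb[idx] * (gray / np.maximum(ref_lum[idx], 1))[..., None]
    color = np.clip(color, 0, 255).astype(np.uint8)  # colorized result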


'Condor' Brings Genome Assembly Down to Earth
University of Wisconsin-Madison (07/19/10) Barncard, Chris

Researchers from the University of Wisconsin-Madison and the University of Maryland (UMD) have assembled a full human genome from millions of pieces of data using a network of computers instead of a supercomputer. UMD professors Mihai Pop and Michael Schatz combined their Contrail genome assembly software with UW-Madison's Condor distributed computing program. Condor, developed at UW-Madison's Center for High Throughput Computing, breaks up long lists of heavy computing tasks and distributes them across networked computer workstations. The Wisconsin team added features from another distributed-computing tool, Hadoop, to manage both the complex workflow chain and the large data management problems involved in processing the billions of letters read from human DNA by a sequencing machine. "By running them together, we're able to efficiently run this biological application--efficient not just in terms of computer time, but efficient in terms of dollars," says UW-Madison's Greg Thain. "Because Condor could efficiently schedule the work, Maryland didn't have to buy a multimillion-dollar disk cluster."
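
Condor's contribution is classic task farming: split a huge job into independent pieces, scatter them across idle machines, and merge the results. A toy Python version of that pattern, using k-mer counting as a stand-in for one stage of assembly (this is not Condor's or Contrail's code), looks like this:

    from collections import Counter
    from multiprocessing import Pool

    K = 5  # k-mer length

    def count_kmers(reads):
        """Count all length-K substrings in a chunk of reads."""
        counts = Counter()
        for read in reads:
            for i in range(len(read) - K + 1):
                counts[read[i:i + K]] += 1
        return counts

    if __name__ == "__main__":
        reads = ["ACGTACGTGG", "TTGACGTACG", "ACGTGGTTGA"] * 1000  # stand-in data
        chunks = [reads[i::4] for i in range(4)]  # 4 independent tasks

        with Pool(4) as pool:               # 4 "workstations"
            partials = pool.map(count_kmers, chunks)

        total = Counter()
        for p in partials:
            total.update(p)                 # merge step, akin to Hadoop's reduce
        print(total.most_common(3))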


Self-Sustaining Robot Has an Artificial Gut
PhysOrg.com (07/20/10)

British researchers have developed an autonomous robot with an artificial stomach that enables it to fuel itself by eating and excreting. Bristol Robotics Laboratory researchers designed the robot, called Ecobot III, so that it consumes partially processed sewage, using the nutrients within the mixture for fuel and excreting the remains. The robot also drinks water to maintain power generation. The meal is processed by 24 microbial fuel cells (MFCs), which are held in a stack of two tiers in the robot's body. Undigested matter passes via a gravity feed to a central trough from which it is pumped back into the feeder tanks to be reprocessed in order to extract as much of the available energy as possible. The bacteria in the MFCs metabolize the organic mixture, producing hydrogen atoms in the process, which help produce an electric current. The robot has maintained itself unaided for up to seven days, but is so far extremely inefficient, using only one percent of the energy available within the food.


Computers Intersect With Sociology to Sift Through 'All Our Ideas'
Princeton University (07/19/10) Emery, Chris

Princeton University researchers have developed a new way for organizations to solicit ideas from large groups and have those groups vote on the merit of the ideas generated. The system, called All Our Ideas, combines sociology and computer science to enable an organization to set up a Web site where people can contribute and rank ideas. The researchers say the system could help governments tap into public opinion and provide sociologists with a new research tool. "This is a hybrid method that combines the quantitative power of surveying with the power of focus groups to generate ideas," says Princeton professor Matthew Salganik. All Our Ideas lets survey creators present respondents with a question and two possible answers; respondents can either pick one of the answers or submit a new one, which joins the pool of answers voted on in pairs. The system avoids pitfalls that have plagued other approaches: it blocks people from repeatedly voting for a favorite entry by randomly selecting the two choices shown in each vote, and it corrects for some ideas having been in the system longer than others, preventing older entries from dominating the rankings simply because they have been voted on more times.
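
The mechanics described, random pairwise ballots plus exposure-corrected scoring, are easy to sketch. The toy Python below illustrates that scheme and is not the All Our Ideas codebase; scoring by win rate rather than raw win count is what keeps long-lived ideas from dominating:

    import random
    from collections import defaultdict

    ideas = ["idea A", "idea B", "idea C"]
    wins = defaultdict(int)
    appearances = defaultdict(int)

    def add_idea(text):
        ideas.append(text)  # respondent-submitted answers join the pool

    def next_pair():
        # Random pairing: a voter cannot repeatedly summon a favorite entry.
        return random.sample(ideas, 2)

    def record_vote(winner, loser):
        wins[winner] += 1
        appearances[winner] += 1
        appearances[loser] += 1

    def scores():
        # Win rate, not raw wins: corrects for unequal time in the system.
        return {i: wins[i] / appearances[i] for i in ideas if appearances[i]}

    add_idea("idea D")  # a respondent submits a new answer
    for _ in range(200):
        a, b = next_pair()
        winner = min(a, b)  # pretend voters prefer alphabetically earlier ideas
        record_vote(winner, a if winner == b else b)

    print(sorted(scores().items(), key=lambda kv: -kv[1]))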


Broadband Picture May Not Be So Bleak
MIT News (07/16/10) Hardesty, Larry

A Massachusetts Institute of Technology (MIT) study concludes that most of the common methods for measuring Internet data rates underestimate the speed of the access network. The study found that the number of devices accessing a home wireless network, the internal settings of a home computer, and the location of test servers can all affect measurements of broadband speed. "If you are doing measurements, and you want to look at data to support whatever your policy position is, these are the things that you need to be careful of," says Steve Bauer, the technical lead on the MIT Internet Traffic Analysis Study. "For me, the point of the paper is to improve the understanding of the data that's informing those processes." The researchers analyzed six systems for measuring the speed of Internet connections, from free applications on popular Web sites to commercial software licensed by most major Internet service providers, and found a different reason why access speed was underestimated in each system. The research calls into question recent claims made by the U.S. Federal Communications Commission that Internet access speeds are only half as high as advertised.
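
A naive speed test of the kind the study scrutinizes is essentially a timed download, which is exactly why it absorbs every bottleneck between the test server and the user's machine, not just the access link. A minimal Python probe (the URL is a placeholder, not a real test server) makes the limitation plain:

    import time
    import urllib.request

    # Naive throughput probe: the result reflects the test server's
    # location and load, the home network, and local machine settings,
    # so it should be read as a lower bound on access-link speed.
    URL = "http://example.com/testfile.bin"  # placeholder test file

    start = time.monotonic()
    data = urllib.request.urlopen(URL).read()
    elapsed = time.monotonic() - start

    mbps = len(data) * 8 / elapsed / 1e6
    print(f"{mbps:.1f} Mbit/s over {elapsed:.1f} s -- treat as a lower bound")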


Circling the Square
Science News (07/17/10) Vol. 178, No. 2, P. 17; Ehrenberg, Rachel

In 1957, U.S. National Bureau of Standards scientist Russell Kirsch made the first digital image using an apparatus that transformed a picture of his infant son into a 176-pixel by 176-pixel grid of zeros and ones. Although using square pixels to create a digital image was a logical choice at the time, Kirsch says people have been suffering from grainy images ever since. To correct the problem, Kirsch has developed software that converts a digital image's square pixels into a smoother picture made of variably shaped pixels. His method examines a square-pixel picture through masks of six by six pixels each and finds the best way to divide each such block into two regions of greatest contrast. He says the program could be useful to the medical community, which must feed images such as X-rays into computers. Kirsch's approach addresses a problem that the field of computational photography continues to struggle with, says Duke University's David Brady.
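
The article describes the split criterion but not the candidate mask set, so the Python sketch below assumes four simple partitions (vertical, horizontal, and the two diagonals). For each six-by-six block it keeps whichever two-region split maximizes the contrast between the regions' mean intensities:

    import numpy as np

    def candidate_masks(n=6):
        # Assumed candidate partitions; Kirsch's actual mask set may differ.
        y, x = np.mgrid[0:n, 0:n]
        return [x < n // 2,        # vertical split
                y < n // 2,        # horizontal split
                x + y < n - 1,     # diagonal /
                x - y < 0]         # diagonal \

    def refine_block(block):
        best, best_contrast = None, -1.0
        for m in candidate_masks(block.shape[0]):
            a, b = block[m], block[~m]
            contrast = abs(a.mean() - b.mean())
            if contrast > best_contrast:
                best_contrast = contrast
                # Redraw the block as two flat regions of maximal contrast.
                best = np.where(m, a.mean(), b.mean())
        return best

    img = np.random.randint(0, 256, size=(36, 36)).astype(float)  # stand-in image
    out = np.zeros_like(img)
    for i in range(0, 36, 6):
        for j in range(0, 36, 6):
            out[i:i+6, j:j+6] = refine_block(img[i:i+6, j:j+6])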


Crunching Cancer With Numbers
New Scientist (07/13/10) Buchen, Lizzie

In 2009, the U.S. National Cancer Institute recruited scientists from a broad range of disciplines to apply their expertise and computational tools toward the discovery of simple laws governing the fate of cancer cells. This differs from the molecular-level approach that cancer research has concentrated on for the last several decades. The researchers are testing a series of interlocking computational models they have devised from fundamental precepts to describe and predict different aspects of cancer. Within five years they hope to have a single, all-encompassing model of mouse lymphoma that seamlessly aligns with the data. The ultimate goal is to generate a model that can anticipate an individual's response to various combinations of cancer therapies by feeding it with key parameters, such as gender, blood pressure, and genetic sequences. One of the researchers engaged in this effort is Paul Newton, project leader at the Scripps Research Institute's physical sciences oncology center. His goal is to unlock the underlying mechanics of metastasis by deconstructing the process into simple steps that can each be modeled using equations.
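
The article does not reveal the teams' equations, but the flavor of modeling from "simple laws" can be shown with the most common textbook starting point, logistic growth. The parameter values in this Python sketch are assumptions chosen purely for illustration:

    # Logistic tumor growth, dN/dt = r * N * (1 - N / K),
    # integrated with forward-Euler steps. Not the NCI teams' models.
    r, K = 0.1, 1e9      # growth rate per day, carrying capacity (assumed)
    N, dt = 1e3, 0.1     # initial cell count, time step in days (assumed)

    for step in range(int(365 / dt)):   # simulate one year
        N += dt * r * N * (1 - N / K)

    print(f"cells after one year: {N:.3e}")  # saturates near K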


Abstract News © Copyright 2010 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe