Association for Computing Machinery
Welcome to the May 4, 2009 edition of ACM TechNews, providing timely information for IT professionals three times a week.

HEADLINES AT A GLANCE

Canada on Verge of IT Student Shortage
Shift in Simulation Superiority
The Hunt for Insights in the Online Chatter About Swine Flu
New Robot With Skin to Improve Human Communication
Computer Hackers R.I.P.--Making Quantum Cryptography Practical
Patent Pending: URI Engineers' System for Combating Manipulation of Online Product Ratings
Web Tool 'as Important as Google'
Sending Cell Phones Into the Cloud
Carnegie Mellon's Sean Green Uses Combination of Computer Tools and Artificial Intelligence to Predict Diarrheal Illness Globally
Could the Net Become Self-Aware?
DNA Origami Seeds Offer Bottom-Up Methods for Molecular Self Assembly
Bringing Efficiency to the Infrastructure

Canada on Verge of IT Student Shortage
New Brunswick Business Journal (Canada) (05/04/09) Bundale, Brett

Declining enrollment in computer science programs, along with an increased demand for computer science workers, could lead to a critical workforce shortage in Canada, says Information and Communications Technology Council president Paul Swinwood. "The shortage of new grads and the continued growth in the industry means there will be trouble if action isn't taken," Swinwood says. He is promoting an information and communications technology (ICT) program for high school students that aims to create stronger links between the ICT industry and the Canadian Ministry of Education and to encourage students to pursue ICT-related programs at a college or university. In the early 1990s, about 150,000 people in Canada worked in the ICT field, and now nearly 650,000 work in the field, but demand is expected to continue to outpace the production of ICT graduates. "Even with the worst economic scenario, we will need 120,000 more people in the sector over the next seven years," Swinwood says. "And if the economy recovers the way we expect it to, that will jump to 150,000 or more." The number of incoming undergraduate students majoring in computer science in the United States fell nearly 70 percent between 2000 and 2005, according to the Computing Research Association. Swinwood says enrollments have fallen in Canada as well, and the skills gap could get worse if college and university computer science enrollments do not improve. There is a demand for at least 18,000 new information technology professionals every year in Canada, but universities produce only 7,000 ICT graduates per year.


Shift in Simulation Superiority
National Science Foundation (04/30/09) Chamot, Joshua A.

The most advanced supercomputers require programming skills that too few U.S. researchers have, and affordable computers and committed national programs outside the United States are whittling down the U.S.'s competitiveness in several simulation-driven fields, according to the International Assessment of Research and Development in Simulation-Based Engineering and Science, which was released by the World Technology Evaluation Center (WTEC). "The startling news was how quickly our assumptions have to change," says the National Science Foundation's (NSF's) Phillip Westmoreland. "Because computer chip speeds aren't increasing, hundreds and thousands of chips are being ganged together, each one with many processors. New ways of programming are necessary." The WTEC report highlighted several areas in which the United States still holds a competitive advantage, including the development of novel algorithms, but also highlighted areas that are increasingly being led by Europe or Asia, such as the creation and simulation of new materials from first principles. Westmoreland says that some of the new high-powered computers are as common as gaming computers, allowing scientific breakthroughs to take place all over the world. "Progress in simulation-based engineering and science holds great promise for the pervasive advancement of knowledge and understanding through discovery," says NSF's Clark Cooper. "We expect future developments to continue to enhance prediction and decision making in the presence of uncertainty."


The Hunt for Insights in the Online Chatter About Swine Flu
New York Times (05/03/09) Cohen, Noam

The Internet's ability to act as a societal thermometer is evident in the prevalence of Web users' swine flu-related searches and commentary on Yahoo!, Twitter, and other Web sites and services. The general insight gathered from such sources is that the recent outbreak is a worrying trend that has yet to break into a fully fledged panic. Giving the public an early warning on influenza outbreaks is the purpose of the Google Flu Trends project, which currently indicates that the country's mood is one of docility because of the relatively few swine flu cases reported so far in the United States. "Right now we are finding out that Google Flu Trends is very specific, but it might not be that sensitive," says University of Iowa professor Philip Polgreen. Online tools capable of mapping out seasonal influenza outbreaks can save money and lives by enabling authorities to schedule inoculations, boost staff at hospitals, and order delivery of treatments. Flu Trends lead engineer Jeremy Ginsberg co-authored an article in Nature detailing how the project works. Two piles of information--five years of material from the government tracking how frequently patients reported flu-like symptoms and five years of Google search data--were collected and then compared for overlap, and one of the main challenges is filtering out "noise," or items that have no actual relevance to the trend being studied. University of Iowa professor Alessio Signorini says that social networking sites such as Twitter could be particularly insightful for public health officials, as people are more likely to report feeling sick on Twitter than go see a doctor.
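To make the approach concrete, the Python sketch below (with entirely synthetic numbers) mimics the core idea the article describes: compare each candidate search query's weekly frequency against reported influenza-like-illness (ILI) rates and keep only the queries that track the signal, discarding the rest as noise. It illustrates the general technique, not the actual Flu Trends model.

# Minimal sketch (not the actual Flu Trends model): compare a search-query
# frequency series against reported influenza-like-illness rates and keep
# only queries whose history correlates strongly with the ILI signal.
# All data below are synthetic placeholders.
from statistics import correlation  # Python 3.10+

# Weekly fraction of searches containing a candidate query (hypothetical).
query_series = {
    "flu symptoms": [0.2, 0.3, 0.9, 1.4, 1.1, 0.5],
    "basketball":   [1.0, 1.1, 0.9, 1.0, 1.2, 1.1],   # a "noise" query
}
# Weekly percentage of doctor visits for ILI over the same weeks (hypothetical).
ili_rate = [1.1, 1.3, 2.8, 4.0, 3.2, 1.6]

# Retain queries that track the ILI signal; discard the rest as noise.
for query, series in query_series.items():
    r = correlation(series, ili_rate)
    status = "keep" if r > 0.9 else "drop as noise"
    print(f"{query!r}: r = {r:.2f} -> {status}")

The selection step shown here corresponds to the noise-filtering challenge mentioned above; the published system combined the best-tracking queries into a single estimator.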


New Robot With Skin to Improve Human Communication
University of Hertfordshire (04/30/09) Murphy, Helene

Computer scientists at the University of Hertfordshire are covering a child-sized humanoid robot with artificial skin in an effort to help children with autism improve the way they interact with the robot, Kaspar, and other people. The robotic skin will be embedded with tactile sensors that provide feedback from the robot's body, and Kaspar will be able to respond to the different ways that children touch it in order to help them play in a more socially appropriate manner. "Children with autism have problems with touch, often with either touching or being touched," says professor Kerstin Dautenhahn. "The idea is to put skin on the robot as touch is a very important part of social development and communication and the tactile sensors will allow the robot to detect different types of touch and it can then encourage or discourage different approaches." The work at Hertfordshire is part of a European consortium's three-year Roboskin project.
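As a purely hypothetical illustration of the feedback loop described above, the Python sketch below maps a patch of tactile readings to an "encourage" or "discourage" response; the sensor format, threshold, and labels are assumptions for illustration and not the actual Roboskin or Kaspar design.

# Hypothetical sketch of mapping tactile-sensor readings to a response;
# the pressure threshold, sampling format, and response labels are
# assumptions for illustration, not the Roboskin/Kaspar implementation.
def classify_touch(pressure_samples, gentle_max=0.4):
    """Return 'encourage' for a gentle touch, 'discourage' for a forceful one.

    pressure_samples: normalized readings (0..1) from one skin patch.
    """
    peak = max(pressure_samples)
    return "encourage" if peak <= gentle_max else "discourage"

print(classify_touch([0.1, 0.2, 0.3]))   # gentle stroke -> encourage
print(classify_touch([0.2, 0.8, 0.9]))   # hard poke     -> discourage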


Computer Hackers R.I.P.--Making Quantum Cryptography Practical
Institute of Physics (04/30/09)

Researchers from Toshiba and Cambridge University's Cavendish Laboratory say quantum communication is possible with practical components for high-speed photon detection. In their paper, "Practical Gigahertz Quantum Key Distribution Based on Avalanche Photodiodes," the researchers discuss using an attenuated laser as a light source and a compact detector, along with a decoy protocol to guard against third-party attacks; the decoy pulses would confuse any intruder but not the legitimate receiver. "With the present advances, we believe quantum key distribution is now practical for realizing high bandwidth information-theoretically secure communication," the researchers say. The high-speed detectors support higher key rates, allowing more information to be exchanged securely in less time, and would make quantum key distribution easier to use. The researchers note that secure communication is of considerable interest to governments, banks, and large businesses.
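Systems of this kind are typically built on the BB84 quantum key distribution protocol. The short Python simulation below illustrates only the basis-sifting step by which the shared key emerges; it abstracts away the attenuated laser, avalanche photodiodes, and decoy pulses, and is not the Toshiba/Cambridge implementation.

# Minimal simulation of the basis-sifting step of a BB84-style key exchange.
# The physics (weak laser pulses, photodiode detectors, decoy states) is
# abstracted away; this only shows how a shared key emerges from the
# positions where sender and receiver happened to use the same basis.
import random

n = 20
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("XZ") for _ in range(n)]
bob_bases   = [random.choice("XZ") for _ in range(n)]

# When Bob measures in Alice's basis he recovers her bit; otherwise his
# result is random and the position is discarded during sifting.
bob_bits = [a if ab == bb else random.randint(0, 1)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

alice_key = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
bob_key   = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
print("keys match:", alice_key == bob_key)        # True on the sifted positions
print("shared key:", "".join(map(str, alice_key)))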


Patent Pending: URI Engineers' System for Combating Manipulation of Online Product Ratings
University of Rhode Island News (04/28/09) McLeish, Todd

University of Rhode Island (URI) researchers have developed algorithms to protect against collaborative, profit-driven manipulations of online rating systems. URI professor Yan Sun says a recent survey found that consumers will pay at least 20 percent more for services that receive a 5-star rating over a 4-star rating. Sun says another survey found that eBay sellers with an established reputation could expect to earn 8 percent more revenue than new sellers marketing the same goods. In 2004, a software error revealed that a large portion of book reviews on Amazon.com were written by the publishers, authors, and competitors of the books being reviewed, and a 2006 study found that many eBay sellers artificially boost their reputation by buying and selling positive ratings. Sun says systems already exist that detect obvious ratings manipulations, but these new algorithms have been designed to detect "smart attackers" who try to make subtle changes to a product's rating. Sun, along with URI professors Steven Kay and Qing Yang and former doctoral student Yafei Yang, merged several traditional signal-processing techniques with their algorithms to create a new detection system that they say reduces manipulation by two-thirds.
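The URI algorithms themselves are not described in detail here, but the flavor of signal-processing-based detection can be shown with a generic change-detection sketch: the Python code below runs a simple CUSUM test over a stream of ratings and flags the point where a subtle, sustained upward shift begins. The baseline, drift, and threshold values are illustrative assumptions, not the team's published parameters.

# Generic change-detection sketch (CUSUM over a stream of ratings), shown
# only to illustrate the signal-processing flavor of such detectors; this
# is not the URI team's algorithm, and the constants are illustrative.
def cusum_flag(ratings, baseline=4.0, drift=0.2, threshold=3.0):
    """Return the index where the cumulative upward shift exceeds `threshold`."""
    s = 0.0
    for i, r in enumerate(ratings):
        s = max(0.0, s + (r - baseline - drift))
        if s > threshold:
            return i          # suspicious burst of inflated ratings starts here
    return None

honest   = [4, 5, 3, 4, 4, 5, 3, 4, 4, 4]
attacked = honest + [5, 5, 5, 5, 5, 5, 5, 5]   # subtle coordinated boost
print(cusum_flag(honest))    # None: no alarm on ordinary ratings
print(cusum_flag(attacked))  # index where the alarm fires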


Web Tool 'as Important as Google'
BBC News (04/30/09)

Physicist Stephen Wolfram says the goal of his free Wolfram Alpha program, which will be available to the public starting in the middle of May, is to "make expert knowledge accessible to anyone, anywhere, anytime." Wolfram's computational knowledge engine was recently demonstrated at Harvard University's Berkman Center for Internet and Society. It is designed to answer questions directly instead of retrieving Web pages in response to queries. Wolfram Alpha employs natural language processing to enable the use of normal, spoken language queries by users, and Wolfram says the program has addressed many of the challenges of interpreting people's questions. The program computes many answers on the fly by capturing raw data from public and licensed databases, along with live feeds. Wolfram says that trillions of pieces of data were selected and managed by a team of experts at Wolfram Research, and that these experts also tweak the information to ensure that it can be read and displayed by the system. He says the system has become proficient at eliminating "linguistic fluff," or words that are unnecessary for the location and computation of relevant data. This statement disappointed Boris Katz of the Massachusetts Institute of Technology, who is head of the Start natural language processing project. "I believe [Wolfram] is misguided in treating language as a nuisance instead of trying to understand the way it organizes concepts into structures that require understanding and harnessing," Katz says.
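As a toy illustration of the "linguistic fluff" idea, the Python sketch below strips common filler words from a query before the remaining terms would be matched against curated data. Wolfram Alpha's actual language processing is far more sophisticated, and the stop-word list here is purely an assumption.

# Toy illustration of removing "linguistic fluff" from a query so that only
# computable terms remain. The stop-word list is an assumption for
# illustration; it is not Wolfram Alpha's language-processing pipeline.
FLUFF = {"what", "is", "the", "of", "a", "an", "please", "tell", "me"}

def strip_fluff(query):
    return [w for w in query.lower().split() if w not in FLUFF]

print(strip_fluff("What is the population of France"))
# ['population', 'france'] -> terms the engine would compute against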


Sending Cell Phones Into the Cloud
Technology Review (05/01/09) Mims, Christopher

Intel Research Berkeley scientists Byung-Gon Chun and Petros Maniatis have developed CloneCloud, a cloud computing-based clone of smartphones that can handle large computational processing tasks. CloneCloud uses a smartphone's high-speed Internet connection to communicate with a copy of itself that exists in a cloud-computing environment on remote servers. The prototype, which runs Google's Android mobile operating system, creates an exact replica of the phone's software and can handle any processor-intensive task that is too much for the phone itself, based on the amount of time and battery life that the task and the associated data transfer would require. CloneCloud's major advantage is battery-life extension due to lower utilization of the phone's processor. Security would be a priority for CloneCloud, particularly as smartphones continue to be used as mini general-purpose computers, which could lead to many of the same problems that plague desktops, including viruses and spyware. Maniatis says the research team is developing ways to secure CloneCloud, such as "taint checking," a processor-intensive technique that examines the variables in data entered from outside sources. "We're using execution in the cloud to run email applications in an environment where you can do this emulation without waiting for the heat death of the universe for your smart phone to finish," Maniatis says.
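The offload-or-not decision described above can be sketched as a simple cost comparison: run the task locally, or ship its state to the clone and back, whichever costs less time and energy. The Python sketch below uses an illustrative cost model and made-up numbers; it is not CloneCloud's actual policy.

# Hedged sketch of the offloading decision: compare the estimated time and
# energy of running a task locally against shipping its state to the clone.
# The cost model and numbers are illustrative assumptions.
def should_offload(local_secs, local_joules,
                   upload_bytes, bandwidth_bps, radio_watts,
                   cloud_secs):
    transfer_secs = upload_bytes * 8 / bandwidth_bps
    offload_secs = transfer_secs + cloud_secs
    offload_joules = transfer_secs * radio_watts    # energy spent on the radio
    return (offload_secs < local_secs) and (offload_joules < local_joules)

# Example: a 60 s, 40 J image-processing task, 2 MB of state over a 5 Mbit/s
# link with a 1 W radio, finishing in 3 s on the server-side clone.
print(should_offload(60, 40, 2_000_000, 5_000_000, 1.0, 3))   # True -> offload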


Carnegie Mellon's Sean Green Uses Combination of Computer Tools and Artificial Intelligence to Predict Diarrheal Illness Globally
Carnegie Mellon News (04/28/09) Swaney, Chriss

Carnegie Mellon University (CMU) Ph.D. student Sean Green is using computer modeling tools to identify the best way to prevent the spread of diarrheal illness in more than 192 countries around the world. Green estimates that improving rural sanitation by 65 percent worldwide could save as many as 1.2 million lives. "We want to show where the money can be best spent in these communities where diarrheal illness kills more than two million people a year, and remains the third-leading cause of child mortality," he says. Green, along with CMU professors Mitchell J. Small and Elizabeth A. Casman, developed a pattern-matching algorithm that uses variables describing information about a country to determine which policies are most effective at preventing the outbreak of disease. The researchers say the most important variable they found for reducing diarrheal outbreaks is improving sanitation in rural areas. Green will travel to Bangalore, India this summer to continue studying the causes and impact of diarrheal illness. Green plans to develop a series of surveys to help urban slum communities near cities and non-government agencies develop the best public policies for curbing deadly diarrheal outbreaks.
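A hypothetical sketch of the pattern-matching idea: describe each country by a small feature vector, find the reference country closest to it, and borrow the intervention that worked best there. The variables, values, and policies below are invented for illustration and are not the CMU model.

# Hedged sketch of pattern matching countries by descriptive variables and
# borrowing the most effective intervention from the closest match. The
# variables, values, and interventions are hypothetical.
import math

# (rural sanitation %, clean water %, GDP per capita in $1000s) -> best policy
reference = {
    "Country A": ((30, 55, 1.2), "rural sanitation"),
    "Country B": ((75, 90, 9.0), "urban water treatment"),
    "Country C": ((45, 60, 2.5), "rural sanitation"),
}

def closest_policy(features):
    def dist(item):
        return math.dist(features, item[1][0])
    name, (_, policy) = min(reference.items(), key=dist)
    return name, policy

print(closest_policy((35, 58, 1.5)))   # matches Country A -> rural sanitation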


Could the Net Become Self-Aware?
New Scientist (04/30/09) Brooks, Michael

The Internet is similar to the human brain in that it has a complex network of nodes for holding, processing, recalling, and transmitting information. The Web may also exhibit a level of consciousness. Francis Heylighen, an expert on consciousness and artificial intelligence at the Free University of Brussels in Belgium, describes consciousness as a system of mechanisms for improving information processing by adding more control over which of the brain's processes get the most resources. "Adding consciousness is more a matter of fine-tuning and increasing control ... than a jump to a wholly different level," Heylighen says. A self-aware Internet could constantly work to improve itself, by reorganizing and filling in the gaps in its own knowledge and abilities. Scientists could wake up the Internet by having it monitor its own knowledge gaps and work to address them. "The outlook for humanity is probably better in the case that an emergent, coherent and purposeful Internet mind develops," adds Ben Goertzel of the Artificial General Intelligence Research Institute. Heylighen believes the Internet could be made more self-aware within a decade.


DNA Origami Seeds Offer Bottom-Up Methods for Molecular Self Assembly
U.S. News & World Report (04/30/09) Fink, Leslie

California Institute of Technology (Caltech) professor Erik Winfree is developing a bottom-up approach for building complex man-made objects in which the order is imposed from within the object being made, so that the object "grows" according to a built-in design. The foundation of this technology is an information-containing DNA "seed" that can direct the self-assembled bottom-up growth of DNA tiles in a precisely controlled manner. "We are finally beginning to understand how to program information into molecules and have that information direct algorithmic processes," Winfree says. "It exhibits a degree of control over information-directed molecular self-assembly that is unprecedented in accuracy and complexity." Bottom-up approaches have not been significantly used in technology because researchers have not had a firm understanding of how to design systems that build themselves, Winfree says. He has been working to understand the processes, or algorithms, that generate organization in both computers and the natural world. Caltech's Paul W.K. Rothemund pioneered the seed-DNA technology that enables minuscule "DNA origami" structures to self-assemble into shapes, and the researchers have designed several versions of a DNA origami rectangle that served as the seeds for the growth of different ribbon-like DNA crystals. The seeds were combined in a test tube with other pieces of DNA, called tiles, and heated and slowly cooled, which caused the material to self-assemble into ribbons with particular widths and stripe patterns prescribed by the original seed.
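An abstract, software-only way to see what seed-directed, algorithmic growth means: in the Python sketch below, a seed row of tiles determines every subsequent row through a purely local rule (here, XOR of two neighbors, which produces a Sierpinski-like stripe pattern). The DNA chemistry, tile set, and geometry of the Caltech experiments are abstracted away entirely.

# Abstract illustration of seeded algorithmic self-assembly: a "seed" row
# fixes every later row through a local rule (XOR of two neighbors), so the
# final stripe pattern is prescribed by the information in the seed.
def grow_ribbon(seed, rows):
    ribbon = [seed]
    for _ in range(rows - 1):
        prev = ribbon[-1]
        # Each new tile's label is determined by its two upper neighbors.
        nxt = [prev[i] ^ prev[(i + 1) % len(prev)] for i in range(len(prev))]
        ribbon.append(nxt)
    return ribbon

seed = [0] * 15 + [1] + [0] * 15           # information-bearing seed row
for row in grow_ribbon(seed, 16):
    print("".join("#" if t else "." for t in row))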


Bringing Efficiency to the Infrastructure
New York Times (04/29/09) Lohr, Steve

Smart infrastructure--greener and more efficient systems for food distribution, railways, and electric grids--is emerging through the deployment of cheap sensors, computing muscle, and software for visualization and analytics. Computers have evolved into powerful tools for calculation and communication, and experts say the next phase is to transform computers into intelligent control instruments connected to data-producing sensors. IBM's smart grid programs employ sensors, software, and computerized household meters to maintain power lines and lower energy consumption, while IBM has a project with Norway's biggest food supplier that uses radio frequency identification (RFID) tags and tracking software over the Internet to optimize shipments and reduce spoilage. In Stockholm, IBM implemented a smart traffic monitoring system that uses Web cameras and RFID cards to reduce traffic in the city, thus lowering congestion and carbon dioxide emissions and increasing ridership on public transportation. Meanwhile, Cisco Systems has experimented with a project to encourage San Francisco commuters to use public transportation by offering a bus equipped with wireless Internet access and onboard touchscreens that are fed constantly updated data on connections and wait times. Smart infrastructure systems often face political opposition, as evidenced by New Yorkers' resistance to New York City Mayor Michael R. Bloomberg's proposal for a smart traffic system in Manhattan. Nevertheless, experts see smart infrastructure as having massive potential for efficiency upgrades. "We've barely scratched the surface of how information technology can help control and conserve energy use," says Harvard Business School professor Rosabeth Moss Kanter.


Abstract News © Copyright 2009 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]




Unsubscribe
Change your Email Address for TechNews (log into myACM)