Association for Computing Machinery
Welcome to the March 7, 2011 edition of ACM TechNews, providing timely information for IT professionals three times a week.


HEADLINES AT A GLANCE


Computer Science Programs Use Mobile Apps to Make Coursework Relevant
Washington Post (03/06/11) Jenna Johnson

College computer science courses are tapping mobile applications to help students create programs that tackle real-world challenges. For example, a Virginia Tech software engineering class collaborated with the Blacksburg city transit system to acquire data from global positioning system devices on numerous city buses, resulting in an algorithm that predicts arrival times and transmits the information to a prototype mobile app. The move toward relevance among college programming courses comes as the demand for computer science graduates exceeds the number of students graduating with computer science degrees. Computer and math fields are expected to add nearly 800,000 jobs between 2008 and 2018, according to the U.S. Bureau of Labor Statistics. "The sky is falling in a sense that we're not engaging kids that we could be engaging," says the U.S. National Science Foundation's Jan Cuny, who is helping to create a new Advanced Placement course. The current AP program primarily concentrates on Java coding, while the new class being tested at several colleges would focus on problem solving and technology creation.
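The article does not detail the Virginia Tech class's algorithm, so the following is only a minimal, hypothetical sketch of such a predictor: estimate the bus's current speed from its two most recent GPS fixes, then extrapolate over the distance remaining to the stop (the names and the route-position representation are assumptions):

from dataclasses import dataclass

@dataclass
class GpsFix:
    t: float             # timestamp, seconds since epoch
    route_pos_m: float   # distance traveled along the route, meters

def predict_arrival(fixes: list[GpsFix], stop_pos_m: float) -> float:
    """Estimate arrival time at stop_pos_m from the last two GPS fixes."""
    a, b = fixes[-2], fixes[-1]
    speed = (b.route_pos_m - a.route_pos_m) / (b.t - a.t)  # meters/second
    if speed <= 0:  # bus stopped or data noisy; refuse to extrapolate
        raise ValueError("cannot extrapolate from a non-positive speed")
    return b.t + (stop_pos_m - b.route_pos_m) / speed

A production predictor would smooth over many fixes and account for stops and traffic; this sketch shows only the core extrapolation.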


Google Schools Its Algorithm
New York Times (03/05/11) Steve Lohr

Researchers at Google and elsewhere are working at the forefront of computer intelligence in their efforts to program machines to understand human language. Improvements to statistical algorithms are possible thanks to an ever-growing corpus of language on the Web and the development of faster computers, but parsing and categorizing language remains a sizable challenge for machines. Google's Web site-ranking algorithm relies heavily on connecting search terms to noun phrases in a Web page, as well as on a site's popularity and how frequently other sites link to it. Google recently announced a major retooling of its ranking formula to downgrade low-quality sites that are mainly set up to siphon traffic from Google's search engine. "As we improve the language understanding of the algorithm, all the cheap tricks that people do will be recognized as cheap tricks instead of tricks that work," says Google fellow Amit Singhal. Computer scientists say the future of search engines such as Google lies in leveraging innovations in machine learning and language processing to transform them into answering machines.
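As a toy illustration only (emphatically not Google's actual formula), a ranker along the lines the article describes might combine a crude text-match signal with a link-popularity signal; both signals and their combination here are invented:

import math

def score(page_text: str, inlink_count: int, query: str) -> float:
    """Toy ranking score: term-frequency match weighted by link popularity."""
    words = page_text.lower().split()
    matches = sum(words.count(term) for term in query.lower().split())
    text_signal = matches / max(len(words), 1)  # crude term-frequency match
    popularity = math.log1p(inlink_count)       # damped link-count signal
    return text_signal * popularity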


Group Seeks Global Protocol to Identify Big Data Sets
Science Insider (03/03/11) Dennis Normile

The National Research Council's Board on Global Science and Technology was established in 2009 to study the implications of global scientific and technological advances for U.S. policy and to develop international partnerships. The board chose big data as a place to start due to significant interest in the topic, says panel chair Ruth David. During a recent symposium on "Realizing the Value from Big Data," the board designated a group to work out the details of a new identification system that would make data sets easier for researchers to locate and use. The group is developing a short digital tag that would identify data sets and provide basic details about the information, says IBM's Bernard Meyerson. He says making the tag easy to create and use is critical so that researchers producing large data sets voluntarily adopt the technology. The group may ask the Internet Engineering Task Force to endorse the identification system. It also wants to develop a way to rate the reliability of data sets as well as methods for merging data in different formats.
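The article does not specify the tag's format, so the sketch below is purely hypothetical: a short, machine-readable tag bundling an identifier, basic descriptive details, and an integrity checksum (every field is an assumption):

import hashlib
import json

def make_dataset_tag(dataset_id: str, title: str, creator: str,
                     size_bytes: int, sample: bytes) -> str:
    """Serialize a small descriptive tag for a data set."""
    tag = {
        "id": dataset_id,  # assumed to be a globally unique identifier
        "title": title,
        "creator": creator,
        "size_bytes": size_bytes,
        "sha256": hashlib.sha256(sample).hexdigest(),  # integrity check
    }
    return json.dumps(tag)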


Graphene Etching to Usher In Computing Revolution
New Scientist (03/03/11) Jessica Griggs

Exotic computing devices such as ultra-fast computer chips could come a step closer to realization thanks to a technique for precisely etching graphene developed by Rice University researcher James Tour and colleagues. Using a method known as sputtering, the researchers coated the top layer of a stack of graphene sheets with zinc metal, causing damage only to that layer, which was then stripped off with hydrochloric acid; the other graphene sheets were left intact. "Before this, lithography could never give you single atom precision," Tour says. The National University of Singapore's Vitor Pereira says the technique could lead to a set of different electronic elements composed of and interconnected by graphene. He says such devices would "explore the advantages of graphene to the fullest" and "help realize one of the ultimate goals in graphene-based electronics: All-graphene electronic circuits." The University of Southampton's Zakaria Moktadir says all-graphene circuits could pave the way for ultra-fast chips, along with more sophisticated touchscreens and sensors. He says that with vertical control established, the next breakthrough would be to achieve precise etching control horizontally, which would enable trillions of single-atom-thick transistors to fit on a chip.


Armies of Expensive Lawyers, Replaced by Cheaper Software
New York Times (03/04/11) John Markoff

The automation of high-level jobs is becoming more common thanks to progress in computer science and linguistics. Recent advances in artificial intelligence have enabled software to inexpensively analyze documents in a fraction of the time it used to take highly trained experts to complete the same work. Some programs can even extract relevant concepts and specific terms, and identify patterns in huge volumes of information. "We're at the beginning of a 10-year period where we're going to transition from computers that can't understand language to a point where computers can understand quite a bit about language," says Carnegie Mellon University's Tom Mitchell. The most basic linguistic approaches use specific search words to find and sort relevant documents, while more sophisticated programs filter documents through large webs of word and phrase definitions. For example, one company has developed software designed to visualize a chain of events and search for digital anomalies, while another company's software uses language analysis to study documents and find concepts instead of keywords. These tools and others are built on the Enron Corpus, an email database of more than five million messages from the Enron prosecution that was made public for scientific and technological research by University of Massachusetts, Amherst computer scientist Andrew McCallum.
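To illustrate the contrast the article draws (the commercial products' internals are not described), a keyword search matches literal terms, while a concept-based filter matches any phrase from a web of related terms; the concept map below is invented:

# Invented concept map: each concept lists phrases treated as evidence of it.
CONCEPTS = {
    "energy trading": {"swap", "hedge", "forward contract", "position"},
}

def keyword_hits(docs: list[str], term: str) -> list[str]:
    """Basic approach: literal keyword match."""
    return [d for d in docs if term.lower() in d.lower()]

def concept_hits(docs: list[str], concept: str) -> list[str]:
    """Concept approach: match any phrase related to the concept."""
    related = CONCEPTS[concept]
    return [d for d in docs if any(p in d.lower() for p in related)]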


Memristor Processor Solves Mazes
Technology Review (03/03/11)

Yuriy Pershin at the University of South Carolina and Massimiliano Di Ventra at the University of California, San Diego have developed a memristor processor that solves mazes. The researchers connect a voltage across the entrance and exit of the graphical puzzle and wait. "The current flows only along those memristors that connect the entrance and exit points," say Pershin and Di Ventra. The state of those memristors changes, enabling them to be easily identified--and the resulting chain of memristors traces a solution, an approach much quicker than conventional maze-solving strategies, which effectively work in series. "The maze is solved in a massively parallel way, since all memristors in the network participate simultaneously in the calculation," Pershin and Di Ventra say. They have tested the approach with a memristor simulator and believe that implementing the device in silicon will become easier over time. In the new approach, the network's structure and layout take part in the calculation, not just the individual memristors.
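A rough software analogue of the idea (using an ordinary linear solver, not the authors' memristor simulator): model each corridor as a unit resistor, drive a current from entrance to exit, and read off which edges carry current. In a perfect maze, with a single solution path, only the path edges do, and every edge participates in the single solve:

import numpy as np

def solution_edges(nodes, edges, entrance, exit_node):
    """Return the maze edges that carry current, i.e., the solution path."""
    idx = {node: i for i, node in enumerate(nodes)}
    n = len(nodes)
    L = np.zeros((n, n))  # graph Laplacian = conductance matrix
    for a, b in edges:    # each corridor acts as a unit resistor
        i, j = idx[a], idx[b]
        L[i, i] += 1; L[j, j] += 1
        L[i, j] -= 1; L[j, i] -= 1
    rhs = np.zeros(n)
    rhs[idx[entrance]] = 1.0            # inject 1 A at the entrance
    L[idx[exit_node], :] = 0.0          # ground the exit node (v = 0)
    L[idx[exit_node], idx[exit_node]] = 1.0
    v = np.linalg.solve(L, rhs)
    # on a unit resistor, current equals the voltage difference across it
    return [(a, b) for a, b in edges if abs(v[idx[a]] - v[idx[b]]) > 1e-9]

For example, solution_edges([0, 1, 2, 3], [(0, 1), (1, 2), (1, 3)], 0, 2) returns [(0, 1), (1, 2)]: the dead-end corridor to node 3 carries no current.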


MSU Researchers Develop Method to Match Police Facial Sketches, Mug Shots
Michigan State University Newsroom (03/03/11) Tom Oswald

Michigan State University (MSU) professor Anil Jain and doctoral student Brendan Klare developed a set of algorithms and software that can automatically match hand-drawn facial sketches to mug shots in law enforcement databases. "We match them up by finding high-level features from both the sketch and the photo; features such as the structural distribution and the shape of the eyes, nose, and chin," Klare says. The researchers say the project, which is being conducted in MSU's Pattern Recognition and Image Processing lab, is the first of its kind, and has had positive results so far. "Using a database of more than 10,000 mug shot photos, 45 percent of the time we had the correct person," Klare says. The MSU team plans to field test the system in about 12 months.
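As a toy sketch in the same spirit (nothing here is the MSU team's algorithm), each sketch and mug shot could be reduced to a vector of high-level facial measurements, with mug shots ranked by distance to the sketch's vector:

import numpy as np

def rank_mugshots(sketch_vec: np.ndarray, mugshot_vecs: np.ndarray) -> np.ndarray:
    """Rank mug shots by Euclidean distance to the sketch's feature vector.

    Each row of mugshot_vecs is one mug shot; the columns are assumed
    high-level features (e.g., eye spacing, nose width, chin-shape terms).
    """
    dists = np.linalg.norm(mugshot_vecs - sketch_vec, axis=1)
    return np.argsort(dists)  # indices of the closest matches first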


Expert Panel: What's Around the Bend for Big Data?
HPC in the Cloud (03/02/11) Nicole Hemsoth

There will be major changes to the way big data is perceived, managed, and harnessed to power analytics-driven projects, according to big data innovators at Yahoo!, Microsoft, IBM, Facebook's Hadoop engineering group, and Revolution Analytics. One of the biggest trends in the next year will be widespread adoption of big data via increased innovation, especially in the enterprise, says Yahoo!'s Todd Papaioannou. The increased adoption is being facilitated by the "expanding Hadoop ecosystem of vendors who are starting to appear, offering commercial tools and support, which is something that these traditional enterprise customers require," Papaioannou says. There will be more emphasis on valuable data that has been overlooked and often thrown out in the past. With more data in use, cloud computing becomes essential in big data management. The cloud will drive down the costs associated with storing and processing big data, offering companies more options for data management, Papaioannou predicts. "The ability to execute statistical operations on and visualize huge data sets is ultimately where the end users of such systems want to go," says Revolution Analytics' David Champagne. As big data's growth continues, more ways to uncover its value will come to light, says Roger Barga with Microsoft's eXtreme Computing Group.


Fast Laser Could Revolutionize Data Communications
Chalmers University of Technology (03/02/11) Christian Borg

Chalmers University of Technology researchers have demonstrated that a surface-emitting laser can provide error-free data transmission at 40 gigabits per second, a new record that could lead to faster Internet traffic, computers, and mobile devices. Surface-emitting lasers emit light from the surface of the laser chip, which enables researchers to fabricate and test the laser on the wafer before it is cut into individual chips, lowering production costs to one tenth those of conventional lasers. The new laser also has a smaller volume and requires less power without giving up speed. "The laser's unique design makes it cheap to produce, while it transmits data at high rates with low power consumption," says Chalmers professor Anders Larsson. He says the technology could be applied to supercomputers as well as large data centers. "In the huge data centers that handle the Internet there are today over one hundred million surface-emitting lasers," Larsson says. "That figure is expected to increase a hundredfold."


Lijun Yin and the Face of the Future
Binghamton University (03/01/11) Rachel Coker

Binghamton University professor Lijun Yin is researching ways to extend the human-computer interface beyond the keyboard and mouse. "Our research in computer graphics and computer vision tries to make using computers easier," Yin says. He has led the development of methods for submitting data to a computer based on where the user is looking, as well as through gestures and speech patterns. Yin's next goal is to develop a computer that can recognize a user's emotional state, and he is working with Binghamton psychologist Peter Gerhardstein to study how the technology could help children with autism. A common symptom associated with autism is difficulty in interpreting others' emotions, and a common therapeutic technique is to use photographs to teach children when others are showing emotions such as happiness or sadness. Yin's research could lead to three-dimensional avatars that can display a range of emotions, including representations of the child's family members that could aid the therapeutic process. "We want not only to create a virtual-person model, we want to understand a real person's emotions and feelings," says Yin. The researchers also are developing algorithms that can identify when people are in pain or lying based only on a photograph.


NIST Expert Software 'Lowers the Stress' on Materials Problems
NIST Tech Beat (03/01/11) Chad Boutin

Computer scientists at the U.S. National Institute of Standards and Technology (NIST) have improved a tool that enables materials designers to determine how stress and other factors impact materials with a complex internal structure. Object-Oriented Finite element analysis (OOF) 2.1 offers a dramatic improvement over the previous version of the program, which has been available since 1998. As a starting point, OOF enables designers to address materials issues by using micrographs--images of a material taken by a microscope. "Version 2.1 greatly improves OOF's ability to envision 'non-linear' behavior, such as large-scale deformation, which plays a significant role in many types of stress response," says NIST's Stephen Langer. "It also allows users to analyze a material's performance over time, not just under static conditions as was the case previously." For example, OOF could be used to see how a ceramic layer on jet turbine blades will respond as the metal blades heat up and expand over time. The new version of OOF also includes templates that enable programmers to include their own details and formulas for describing a particular substance.
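As a textbook-scale illustration of the finite element method that OOF applies to real micrographs (this is not OOF's code), consider a one-dimensional bar of elements with differing stiffnesses, clamped at one end and pulled at the other; assembling and solving K u = f gives the nodal displacements:

import numpy as np

def bar_displacements(stiffnesses, tip_force):
    """1D finite elements: element e spans nodes e and e+1."""
    n = len(stiffnesses)                 # number of elements; n + 1 nodes
    K = np.zeros((n + 1, n + 1))
    for e, k in enumerate(stiffnesses):  # assemble element stiffness matrices
        K[e:e + 2, e:e + 2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])
    f = np.zeros(n + 1)
    f[-1] = tip_force                    # load applied at the free end
    u_free = np.linalg.solve(K[1:, 1:], f[1:])  # clamp node 0 (u = 0)
    return np.concatenate(([0.0], u_free))

A soft element (small stiffness) shows up immediately as a large jump in displacement across it, the kind of heterogeneous response OOF resolves in two dimensions from a micrograph.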


Improving the Information Systems Workplace
University of Arkansas (AR) (02/28/11) Barbara Jaquish

Researchers at the University of Arkansas, Florida State University, and Baylor University collaborated on a research project to determine why so many women leave the information systems (IS) industry. The researchers worked with six single-gender focus groups taken from three large corporations. Three focus groups consisted of male managers and were conducted by male researchers, while the other three focus groups consisted of female managers and were conducted by female researchers. The researchers used revealed causal mapping to identify concepts and the relationships between concepts that came up in each of the groups. The researchers found that men displayed an awareness of the challenges women face, but male managers did not state specifically how these concepts fit into the larger system or connect to other concepts. Meanwhile, the researchers found that "gender was central to women's perceptions of how work was conducted in the [IS] workplace," says Arkansas researcher Margaret Reid. The study also found that there is limited communication, even in the best cases, regarding women and the challenges they face in IS.


Abstract News © Copyright 2011 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe