Association for Computing Machinery
Welcome to the August 15, 2011 edition of ACM TechNews, providing timely information for IT professionals three times a week.



Administration Issues Far-Reaching Plan for Building Cyber Workforce
(08/12/11) Aliya Sternstein

The Obama administration has unveiled the first-ever roadmap for building a U.S. cybersecurity workforce and assessing federal success in raising public awareness of computer threats. "The public is insufficiently aware of the risk of sharing information in cyberspace--which can affect personal and national security," the roadmap says. The plan, developed by the National Initiative for Cybersecurity Education, specifies instructions for how the federal government can "maintain an unrivaled, globally competitive cybersecurity workforce." It stipulates that agencies adopt cybersecurity competency models by 2012 and that the government produce an assessment of the national cybersecurity workforce's health by 2015. The roadmap also includes guidelines for developing qualifications for cyberprofessionals: by 2013 officials are to have established a baseline of all required skills, and leaders should be able to evaluate the strength of federal, state, and local cybersecurity staffing against defined proficiencies. Much of the framework focuses on joint private-sector/academic projects, urging collaboration with companies and universities to identify new workforce requirements created by evolving threats and technological advances. The plan also calls for opening up government resources to citizens and federal workers.

Industry Tries to Streamline Privacy Policies for Mobile Users
New York Times (08/14/11) Tanzina Vega

As concerns grow over data collection, including proposed legislation to more closely protect consumers, mobile application developers are building basic privacy policies into their programs. Although many developers create programs that collect data on users and sell that information to produce customized advertising, they often lack a consistent approach to protecting users' privacy, according to the Association for Competitive Technology's Morgan Reed. "Solving this privacy problem is absolutely critical for us," Reed says. "We want to make sure this revenue stream continues." By 2015, 36 percent of U.S. consumers will use mobile Internet services, and spending on mobile advertising is expected to increase to $2.8 billion, according to Forrester Research. Rising privacy concerns have led Sens. Al Franken (D-Minn.) and Richard Blumenthal (D-Conn.) to propose legislation that would require mobile companies to get a user's consent before collecting location-based data and before sharing that data with third parties. Several firms have responded by developing policies that are both easy for consumers to read and easy for mobile application developers to create.

IBM Exec: The End of the PC Era Is Here
eWeek (08/11/11) Jeffrey Burt

Thirty years after IBM's 5150 personal computer launched the PC era, IBM's Mark Dean says the PC's time is ending. PC sales worldwide grew by 2.3 percent in July, slower than in previous months and below the predicted growth of 6.7 percent, according to a Gartner study. The PC's decline can be attributed to the rise of new technologies such as smartphones and tablets, as well as the social interactions those devices enable. "These days, it's becoming clear that innovation flourishes best, not on devices but in the social spaces between them, where people and ideas meet and interact," Dean says. Technology companies need to understand where computing is headed and embrace "that which is technologically inevitable," meaning a future of varied devices connected to the cloud, says former Microsoft chief software architect Ray Ozzie. However, Intel executives are not giving up on PCs, touting their ultrabook concept, a very light and thin notebook that offers the performance of traditional laptops and the features of tablets. Microsoft's Frank Shaw says the company sees PCs evolving to meet consumers' demands rather than slowly fading out of existence.

LHC@Home Allows Public to Help Hunt for Higgs Particle
BBC News (08/11/11)

The Large Hadron Collider (LHC) team has launched another effort to use the collective computing power of home computers to help simulate particle physics experiments. In 2004, the LHC team used the public's computers to simulate beams of protons, but with LHC@home 2.0 it plans to take advantage of advances in home computers to simulate the much more complex particle collisions themselves. "Volunteers can now actively help physicists in the search for new fundamental particles that will provide insights into the origin of our universe, by contributing spare computing power from their personal computers and laptops," according to a statement from CERN, the European Organization for Nuclear Research, which runs the LHC. The world's most powerful atom smasher, the LHC occupies an underground ring beneath the Swiss-French border. The volunteered computing power will complement the facility's own network, the Worldwide LHC Computing Grid, a computing infrastructure that handles the 15 million gigabytes of data the facility produces each year. LHC@home 2.0 will split the collision simulations into small tasks distributed to volunteers' machines and feed the results back to scientists for comparison.
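The volunteer-computing pattern behind the project can be sketched in a few lines: a server splits a large simulation into independent work units, volunteers process them on their own machines, and the server aggregates the returned results. This is an illustrative sketch only; the real LHC@home runs on the BOINC middleware, and every name below is invented for the example.

```python
import random

def make_work_units(total_events, events_per_unit):
    """Split a large simulation into independent work units."""
    units = []
    start = 0
    while start < total_events:
        n = min(events_per_unit, total_events - start)
        units.append({"first_event": start, "n_events": n})
        start += n
    return units

def volunteer_compute(unit, seed):
    """Stand-in for a collision simulation running on a home PC."""
    rng = random.Random(seed + unit["first_event"])
    # Pretend each event yields one measurement; return their sum.
    return sum(rng.random() for _ in range(unit["n_events"]))

def run_campaign(total_events=1000, events_per_unit=64):
    """Server side: hand out units, then aggregate returned results."""
    units = make_work_units(total_events, events_per_unit)
    results = [volunteer_compute(u, seed=42) for u in units]
    return len(units), sum(results)
```

Calling `run_campaign()` distributes 1,000 mock events across 16 work units and sums the returned values, mimicking the server-side aggregation step.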

Research in Game Theory Tackles IT Complexity
IDG News Service (08/10/11) Nicolas Zeitler

Institute of Science and Technology (IST) Austria professor Krishnendu Chatterjee recently was awarded grants from the European Research Council and Microsoft to support his efforts to develop more complex but efficient IT systems. "As I and my team do theoretical research we don't need much special, costly equipment," Chatterjee says; the main need is qualified research staff. The researchers currently are working on quantitative graph games and their application to the synthesis of correct systems. "Our goal is to develop automated tools that help in the development of software systems," Chatterjee says. They are developing new methods for verifying reactive systems such as operating systems, and will collaborate with Microsoft researchers within the guidelines of the fellowship. "As Microsoft develops operating systems, our work has a close relation to their work," Chatterjee says.

SDSC Readying 'Gordon' Supercomputer for Pre-Production Trials This Month
UCSD News (CA) (08/09/11) Jan Zverina

The University of California, San Diego's San Diego Supercomputer Center (SDSC) this month will launch the pre-production phase of Gordon, the first high-performance supercomputer equipped with large amounts of flash-based solid-state drive memory. Gordon, which has 64 input/output nodes joined by an InfiniBand switch fabric, will be available to U.S. researchers who want to run large-scale database applications, according to SDSC director Michael Norman. Gordon features about 300 trillion bytes of flash memory that can handle massive databases at speeds up to 100 times faster than hard disk drive systems, Norman says. "Now we have enterprise [multi-level cell], and it's available at both attractive prices and with very good durability (or write endurance), which is achieved by over-provisioning and wear leveling," he says. Gordon is designed to help researchers in computational science, visual analytics, and interaction network analysis. "Data of this size is simply becoming unmanageable for analysis, so there is an urgent need for supercomputers like Gordon," Norman says.

Disney, Carnegie Mellon Researchers Build 3D Face Models That Give Animators Intuitive Control of Expressions
Carnegie Mellon News (PA) (08/09/11) Byron Spice; Jennifer Liu

Researchers at Carnegie Mellon University and Disney Research Pittsburgh have developed computerized models derived from actors' faces that reflect a full range of natural expressions and provide animators with the ability to manipulate facial poses. The method translates the motions of actors into a three-dimensional face model and subdivides it into facial regions that enable animators to intuitively create new poses. "We can build a model that is driven by data, but can still be controlled in a local manner," says Disney researcher J. Rafael Tena. The researchers created their models by recording facial motion capture data from an actor performing sentences with emotional content, localized actions, and random motions. The researchers applied 320 markers to the actor's face, which enabled the camera to capture facial motions during the performances. They used a mathematical model to analyze the facial data, dividing the face into different regions and measuring the distances between points as the face moved. The researchers also plan to develop models based on higher-resolution motion data and an interface that can be used by computer animators.

Portable, Super-High-Resolution 3-D Imaging
MIT News (08/09/11) Larry Hardesty

Massachusetts Institute of Technology (MIT) researchers have developed GelSight, a portable imaging system that combines a physical interface with computer-vision algorithms. GelSight consists of a slab of transparent, synthetic rubber with one side coated with a paint containing tiny pieces of metal. When the slab is pressed against an object, the paint-coated side deforms. Cameras mounted on the other side of the slab photograph the results, and algorithms analyze the images. At the recent ACM SIGGRAPH 2011 conference, MIT researchers Edward Adelson and Micah Kimo Johnson presented a new version of GelSight that can register physical features less than a micrometer in depth and about two micrometers across. The new version can produce three-dimensional (3D) models of an object, which can be manipulated on a computer screen. The MIT team also has developed a prototype sensor that an operator can move over the surface of an object to produce a 3D image almost instantly. "I think it's just a dandy thing," says University of Southern California professor Paul Debevec. "It's absolutely amazing what they get out of it."
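The kind of reconstruction GelSight performs can be illustrated with classic photometric stereo: for a matte (Lambertian) surface under known lighting, a pixel's brightness under three light directions gives a 3x3 linear system whose solution is the surface normal scaled by the albedo. The sketch below is a generic textbook version, not MIT's actual algorithm, and the function names are invented.

```python
import math

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    M = [row[:] + [v] for row, v in zip(A, b)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def normal_from_intensities(lights, intensities):
    """Recover the unit surface normal and albedo at one pixel.

    lights: three known light-direction vectors (rows of a 3x3 matrix)
    intensities: the pixel's brightness under each light
    """
    g = solve3(lights, intensities)          # g = albedo * normal
    albedo = math.sqrt(sum(v * v for v in g))
    return [v / albedo for v in g], albedo
```

For example, a flat patch (normal pointing straight up) with albedo 0.5, lit from directions (1, 0, 1), (-1, 0, 1), and (0, 1, 1), yields equal intensities of 0.5, from which the solver recovers the normal (0, 0, 1).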

A New Window to the Face
(08/09/11) Douglas Gantenbein

Researchers from Texas A&M University and Microsoft Research Asia have developed an approach to creating high-fidelity, three-dimensional images of the human face, which can depict large-scale features and expressions as well as subtle wrinkling and the movement of human skin. The research could have applications in computerized filmmaking and in creating realistic user avatars. The team used marker-based motion capture, applying about 100 reflective dots to three volunteer actors' faces. Video cameras captured the actors as they made a series of facial expressions while the researchers collected data on how faces change with the different expressions. The researchers then used a laser scanner to capture high-fidelity facial scans, which were aligned with the corresponding frames in the marker-based facial data. To avoid glitches in the system, the researchers used a two-step registration algorithm, first registering large facial expressions and then refining the scans by dividing the face into smaller areas. Finally, the team combined the motion-capture data with the face scans to reconstruct the actual expressions as they were performed.
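The coarse step of such a two-step registration can be illustrated with rigid Procrustes alignment, which finds the rotation and translation that best map scan points onto the corresponding marker positions; the refinement step then repeats a similar fit per facial region. The 2-D sketch below is illustrative only, not the paper's algorithm.

```python
import math

def rigid_align(src, dst):
    """Return a function applying the least-squares rotation + translation
    that maps 2-D points src onto corresponding points dst."""
    n = len(src)
    cs = [sum(p[i] for p in src) / n for i in (0, 1)]   # source centroid
    cd = [sum(p[i] for p in dst) / n for i in (0, 1)]   # target centroid
    # Optimal rotation angle from the centered cross- and dot-products.
    num = den = 0.0
    for (sx, sy), (dx, dy) in zip(src, dst):
        sx, sy = sx - cs[0], sy - cs[1]
        dx, dy = dx - cd[0], dy - cd[1]
        num += sx * dy - sy * dx
        den += sx * dx + sy * dy
    theta = math.atan2(num, den)
    c, s = math.cos(theta), math.sin(theta)

    def apply(p):
        x, y = p[0] - cs[0], p[1] - cs[1]
        return (c * x - s * y + cd[0], s * x + c * y + cd[1])
    return apply
```

Given markers that have been rotated 90 degrees and shifted, `rigid_align` recovers that transform and can then be applied to every point of the scan.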

New Anticensorship Scheme Could Make it Impossible to Block Individual Sites
University of Michigan News Service (08/09/11) Nicole Casal Moore; Steve Crang

Researchers at the universities of Michigan and Waterloo are developing Telex, an approach to combating Internet censorship that effectively turns the entire Web into a proxy, making it nearly impossible for a censoring government to block individual sites. The system requires that Internet service providers (ISPs) outside the censoring nation install Telex stations. When a user wants to visit a banned Web site, the Telex client software first establishes a secure connection to any password-protected Web site that is not blocked, which serves as a decoy. The software marks the connection as a Telex request by inserting a secret-coded tag into the page headers using public-key steganography. The user's request passes through routers at different ISPs, some of which would be Telex stations holding a private key that enables them to recognize tagged connections from Telex clients. The stations divert those connections so that the user can reach any site on the Internet. "It would likely require support from nations that are friendly to the cause of a free and open Internet," says Michigan professor J. Alex Halderman.
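The tagging idea can be sketched with a toy Diffie-Hellman exchange: the client derives a shared secret from the station's published public key and embeds a keyed tag that looks random to a censor but is verifiable with the station's private key. The real Telex design embeds an elliptic-curve tag in the TLS client nonce; the small modular group and all names below are illustrative only and far too weak for actual use.

```python
import hashlib
import hmac
import secrets

P = 2**127 - 1          # toy prime modulus (nowhere near real-world strength)
G = 5                   # generator

def station_keypair():
    """Telex station generates a private key and publishes the public key."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def client_tag(station_pub):
    """Client embeds a tag only the Telex station can recognize."""
    eph = secrets.randbelow(P - 2) + 1
    eph_pub = pow(G, eph, P)                   # sent in the clear
    shared = pow(station_pub, eph, P)          # Diffie-Hellman shared secret
    mac = hmac.new(str(shared).encode(), b"telex-tag", hashlib.sha256)
    return eph_pub, mac.digest()[:16]          # looks random to a censor

def station_check(priv, eph_pub, tag):
    """Station recomputes the secret and checks the tag; censors cannot."""
    shared = pow(eph_pub, priv, P)
    mac = hmac.new(str(shared).encode(), b"telex-tag", hashlib.sha256)
    return hmac.compare_digest(mac.digest()[:16], tag)
```

A station verifies a genuine client tag, while an untagged (or randomly tagged) connection fails the check and is simply passed through unchanged.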

NSF's Seidel: 'Software Is the Modern Language of Science'
HPC Wire (08/09/11) Jan Zverina

Software is the core technology behind a dramatic acceleration in all areas of science, and the U.S. National Science Foundation (NSF) and the science community are faced with the challenge of building a cyberinfrastructure model that integrates advancing technologies, says NSF's Edward Seidel. NSF's Extreme Science and Engineering Discovery Environment (XSEDE) program, the successor to the TeraGrid project, is intended to be the most powerful cluster of advanced digital resources and services in the world. "I think XSEDE probably marks the beginning of a national architecture with the capability of actually putting some order into [multiple approaches to observation, experimentation, computation, and data analysis]," says Seidel. Technological innovations are creating a multilevel cybercrisis, including the problem of managing the exponentially increasing data volumes produced by various digital resources, according to Seidel. This flood of data offers opportunities for potentially powerful national and possibly global collaborations. "We need to be thinking about developing cyberinfrastructure, software engineering, and capabilities to mix and match components, as well as data sharing policies, that really enable scenarios such as coupled hurricane and storm surge prediction, as well as the human response to such events," Seidel says.

DOE at Work on Scientific 'Knowledgebase'
Internet Evolution (08/08/11) Ariella Brown

The U.S. Department of Energy (DOE) recently completed the research and development phase of the Systems Biology Knowledgebase (Kbase) project, which aims to make plant and microbial life data more accessible for scientific sharing and integration. Kbase will be useful to those who want to apply the project's data, metadata, and tools for modeling and predictive technologies to support the production of renewable biofuels and reduce carbon in the environment, according to DOE researchers. The integration of the data, combined with computational models, is expected to greatly accelerate scientific advancements. "There are many different 'silos' of information that have been painstakingly collected; and there are a number of existing tools that bring some strands of data into relation," says Michael Schatz, a quantitative biologist involved in the project. "But there is no overarching tool that can be used across silos." Kbase would enable different labs to submit data, advancing the body of knowledge and spurring innovations in predictive biology. DOE hopes Kbase will evolve into a system that can grow as needed and be used by scientists without extensive training in applications.

Abstract News © Copyright 2011 INFORMATION, INC.

To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe