Association for Computing Machinery
Welcome to the November 8, 2010 edition of ACM TechNews, providing timely information for IT professionals three times a week.

HEADLINES AT A GLANCE


'Net Pioneers: Open Internet Should Be Separate
Computerworld (11/05/10) Grant Gross

A group of Internet and technology pioneers recently filed a document with the U.S. Federal Communications Commission (FCC), advising the agency to allow for an open Internet separate from specialized services that can prioritize Internet Protocol traffic. The document was filed in response to an FCC request for public comments on proposed network neutrality rules. "Representations as to capacity and speed for the Internet must describe only capacity and speed allocated to Internet service," according to the group, which includes Apple co-founder Steve Wozniak, the open source software movement's founder Bruce Perens, New York University's Clay Shirky, and the Massachusetts Institute of Technology's David Reed. Google and Verizon Communications have suggested that specialized services and mobile carriers be exempted from net neutrality rules. "Today the open Internet is allocated a fraction of the capacity delivered over broadband services over fiber and coax by the providers, yet users and services (such as TV and telephony) are migrating to the open Internet and away from those specialized services," Reed notes. The open Internet would not require network management "unless the congestion was caused by less capacity being available than the provider offers to subscribers," according to the group's paper.


Researcher: Rollable Displays Unfold in 2015
EE Times (11/04/10) Rick Merritt

Taiwan's Industrial Technology Research Institute (ITRI) has developed a flexible display that uses a plastic material able to withstand the high temperatures of thin film transistor liquid crystal display (TFT LCD) manufacturing processes. The approach requires new manufacturing steps--a coating at the beginning and a de-bonding at the end. The flexible display has the same color and contrast capabilities as similar-sized conventional TFT LCDs. Researchers in the field continue to look for a way to enable consumers to roll a smartphone- or tablet-sized display in and out of a system the size of a fat crayon, without damaging the materials. "For bending, we can do this 15,000 times in the lab without deterioration of the screen's performance, but to make a device that's rollable by consumers who are more abusive requires a more sturdy design," says ITRI's Janglin Chen. He says rollable color displays could be available by 2015.


SDSC Part of DARPA Program to Deliver Extreme Scale Supercomputer
University of California, San Diego (11/04/10) Jan Zverina

The University of California, San Diego's San Diego Supercomputer Center (SDSC) will provide expertise to the U.S. Defense Advanced Research Projects Agency's Ubiquitous High Performance Computing multi-year program, which is developing the next generation of extreme scale supercomputers. The program will be completed in four phases; the first two are expected to be finished in 2014, and the final two are expected to result in a full prototype system in 2018. The program's applications of interest include rapid processing of real-time sensor data, establishing complex connectivity relationships within graphs, and complex strategy planning. SDSC's Performance Modeling and Characterization (PMaC) laboratory will analyze and map applications for efficient operation on the project's Intel hardware. "We are working to build an integrated hardware/software stack that can manage data movement with extreme efficiency," says SDSC's Allan Snavely. "The Intel team includes leading experts in low-power device design, optimizing compilers, expressive program languages, and high-performance applications, which is PMaC's special expertise."


W3C Seeks Help, Patience With HTML5 Tests
CNet (11/04/10) Stephen Shankland

The technology industry should focus more on helping the World Wide Web Consortium (W3C) improve its tests for HTML5 rather than trying to draw conclusions from the tests or from the results, says Philippe Le Hegaret, who oversees HTML5 and other standards at W3C. Le Hegaret's remarks come after media reports said Internet Explorer 9 was leading the race to support the new Web page technology. The W3C added 135 new HTML5 compliance checks in October, bringing the total to 232. However, Le Hegaret says many more tests are needed to make the results meaningful. "We need all the help we can get to make the test suite relevant and informative," he says. "Unless the community starts helping W3C, we won't be able to properly test HTML5."


DARPA-Funded Project to Spark Computer Science Education
eSchool News (11/04/10) Jenna Zwang

The U.S. Defense Advanced Research Projects Agency (DARPA) recently awarded TopCoder a contract to develop a virtual community featuring competitions and educational resources in order to boost computer science education and help middle and high school students improve their science, technology, engineering, and math skills. DARPA's Melanie Dumas says the virtual community is needed to help reverse the decline in the number of students pursuing computer science degrees, a drop of 70 percent since 2001. "We've seen staggeringly disappointing results as far as the U.S. population is concerned, both in terms of participation and then, once they do participate, their actual performance," says TopCoder's Robert Hughes. TopCoder will construct a virtual community focused on computer science activities, including logic puzzles and games. "The intent isn't necessarily to improve the quality of education that's out there right now, but more to attract and then retain students … in computer science," Hughes says. He hopes the project also will help get students interested in computer science jobs. "The lack of qualified technologists has really driven the prices [of hiring] to almost a prohibitive level, where new technology development is almost prohibitive because of the cost," Hughes says.


A Software Application Recognizes Human Emotions From Conversation Analysis
Universidad Politecnica de Madrid (Spain) (11/02/10) Eduardo Martinez

Researchers at the Universidad Politecnica de Madrid have developed an application that can recognize human emotion from automated voice analysis. The program, based on a fuzzy logic tool called RFuzzy, analyzes a conversation and can determine whether the speaker is sad, happy, or nervous. If the emotion is unclear, the program can specify how close the speaker is to each emotion in terms of a percentage. RFuzzy also can reason with subjective concepts such as high, low, fast, and slow. The researchers say RFuzzy, which was written in Prolog, also could be used in conversation analysis and robot intelligence applications. For example, RFuzzy was used to program robots that participated in the RoboCupSoccer league. Because RFuzzy's logical mechanisms are flexible, its analysis can be interpreted based on logic rules that use measurable parameters, such as volume, position, distance from the ball, and speed.
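A toy illustration of the fuzzy-logic idea, written in Python rather than the Prolog used by RFuzzy; the voice features, value ranges, and rules below are invented for this sketch and are not those of the actual tool:

```python
# Minimal fuzzy-logic sketch: score emotions from measurable voice
# parameters and normalize the degrees to percentages, mimicking the
# "how close the speaker is to each emotion" output described above.

def triangular(x, low, peak, high):
    """Triangular membership function returning a degree in [0, 1]."""
    if x <= low or x >= high:
        return 0.0
    if x <= peak:
        return (x - low) / (peak - low)
    return (high - x) / (high - peak)

def emotion_percentages(volume, rate):
    """Map volume (dB) and speaking rate (words/min) to emotion scores.

    Each rule is a fuzzy AND (min) of two membership degrees; the
    thresholds here are purely illustrative.
    """
    degrees = {
        "sad":     min(triangular(volume, 20, 35, 55),
                       triangular(rate, 60, 90, 130)),
        "happy":   min(triangular(volume, 45, 65, 85),
                       triangular(rate, 120, 160, 210)),
        "nervous": min(triangular(volume, 40, 60, 80),
                       triangular(rate, 170, 220, 280)),
    }
    total = sum(degrees.values()) or 1.0
    return {e: round(100 * d / total, 1) for e, d in degrees.items()}

print(emotion_percentages(volume=62, rate=150))
```

Loud, moderately fast speech scores highest on "happy" under these made-up rules; a real system would derive its membership functions from annotated speech data.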


Research Team Takes Another Big Step in Creating Small Chips
University of Alberta (11/03/10) Brian Murphy

Researchers at the University of Alberta and the National Institute for Nanotechnology have developed a method for heating plastic in a microwave oven that could help to re-invent the manufacture of computer chips. "When we heat block copolymer plastics, which are two different plastics attached together, the molecules begin separating and naturally self-assemble," says Alberta professor Jillian Buriak. The method reduces the size of computer chips and accelerates their production. "In the case of heat block copolymer plastics, the molecules spontaneously line up, creating nano-sized lines that act as a template for intricate circuitry patterns that can be imprinted on silicon to make computer chips," Buriak says. The researchers' heat and self-assembly technique produces denser patterns of lines on chips than existing methods, which could lead to an overall increase in the processing speed and storage capacity of next generation computers. "The industry is now seeking out a new generation of technologies capable of continuing the miniaturization of computer chips in a cost-effective and practical manner," says the National Institute for Nanotechnology's Ken Harris.


Versatile Algorithms for Nanoscale Designs
University of Cologne (11/04/10) Patrick Honecker

The Integrated Circuit/EM Simulation and Design Technologies for Advanced Radio Systems-on-chip (ICESTARS) project aims to develop and deploy integrated simulation algorithms and prototype tools that will overcome the barriers in existing and future radio frequency (RF) design flows. RF designs are moving into higher frequencies and growing in complexity to meet the market demand for higher bandwidth and more functionality. "The project's research areas have been the efficient connection between the frequency domain, where the RF part of wireless transceiver systems is usually designed, and the time domain, where the digital signal processing and control logic are developed," says ICESTARS researcher Jan ter Maten. The ICESTARS project developed a circuit simulation algorithm that produced a general mathematical framework for different classes of RF circuits by embedding the system of differential-algebraic equations (DAEs) into partial DAEs. The researchers say the ICESTARS algorithms have the potential to reduce the simulation overhead in the RF design process and to help sustain chip development for the next several generations.
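The embedding of circuit DAEs into partial DAEs can be sketched with the standard multivariate formulation from the multirate simulation literature; the article does not give the project's precise equations, so this is only the textbook form of the idea:

```latex
% Circuit equations as a DAE system in one time variable:
%   d/dt q(x(t)) + j(x(t)) = b(t)
% Widely separated RF-carrier and envelope time scales are decoupled by
% introducing two artificial time variables t_1, t_2 and embedding the
% DAE into a multivariate partial DAE (MPDAE):
\[
  \frac{\partial q(\hat{x}(t_1,t_2))}{\partial t_1}
  + \frac{\partial q(\hat{x}(t_1,t_2))}{\partial t_2}
  + j(\hat{x}(t_1,t_2)) = \hat{b}(t_1,t_2).
\]
% The original solution is recovered on the diagonal, x(t) = \hat{x}(t,t),
% so each time scale can be discretized with a step size suited to it.
```

The payoff is that a fast carrier and a slow envelope no longer force the simulator to resolve the whole signal at the carrier's time step.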


'Code for America' Programmers to Work in City Governments
Government Technology (11/03/10) Andy Opsahl

Boston, Seattle, Philadelphia, and Washington, D.C., will each receive a team of five open source Web programmers for 11 months, as chosen by Code for America, a new nonprofit organization that's pairing Web specialists with city governments. Each city paid $250,000 to participate, which included submitting applications and proposals for what they wanted to accomplish. Boston's Code for America fellows will design a Web platform enabling the city to use its educational services to engage students, which could allow students to get suggested readings for homework assignments, discuss their schoolwork with one another, and engage in reading contests. The D.C. team will put together a how-to manual designed to enable other local governments to replicate the city's work with open data programs, such as its Web application contest Apps for Democracy. The Philadelphia fellows will craft an open source mechanism for citizens to collaborate on activities related to local neighborhood services. Seattle's fellows will develop a way for communities to work with one another and public safety officials to improve neighborhood safety. The 20 programmers were chosen from a group of 360 applicants. Code for America executive director Jennifer Pahlka says the group was surprised by the number of applicants who were interrupting successful careers to join the program.


Lift-Off for Ulster Space Age Research
University of Ulster (11/01/10) Martin Cowley

The U.S. National Aeronautics and Space Administration (NASA) is using autonomic software engineering methods in the space agency's ground-control systems. The methods are based on the research of University of Ulster computer scientist Roy Sterritt. Sterritt has used the principles of autonomic computing, which seeks to mirror elements of human physiology in technology, to investigate ways of making next-generation spacecraft self-managing. Space scientists envision swarms of small craft replacing single-craft missions in the future. Sterritt and NASA scientist Mike Hinchey devised programs that could make small robotic craft self-directing and self-controlling, and even self-destructing if their autonomous behavior were to threaten the safety or technical aims of the mission.


Obama Advances Nanotechnology for Agencies, Industry
NextGov.com (11/01/10) Aliya Sternstein

The Obama administration recently detailed the implications of the $1.76 billion National Nanotechnology Initiative (NNI), which aims to reinvent the nanoscale technology industry. The NNI directs agencies and the research community on how to distribute funding for nanotechnology development over at least the next three years. "Continuing to shrink the dimensions of electronic devices is important in order to further increase processing speed, reduce device switching energy, increase system functionality, and reduce manufacturing cost per bit," according to the NNI plan. Researchers are investigating the use of three-dimensional spatial architectures and transmission methods to overcome the physical limitations of nanotechnology. The NNI blueprint calls for establishing a world-class research and development program, promoting commercial production, and developing a skilled workforce, all with minimal health and environmental risks.


UA Part of Multi-Million Dollar Initiative to Improve Internet
UA News (AZ) (11/02/10) La Monica Everett-Haynes

University of Arizona researchers are participating in a U.S. National Science Foundation (NSF) effort that is investigating ways to revolutionize the Internet's architecture. The Named Data Networking (NDN) project is working to take the focus off the TCP/IP connection and place it on the data being transmitted. Changing the Internet's basic structure could solve issues related to application development, content delivery, mobility support, network trustworthiness, and security. "As our reliance on a secure and highly dependable information technology infrastructure continues to increase, it is no longer clear that emerging and future needs of our society can be met by the current trajectory of incremental changes to the current Internet," says NSF's Ty Znati. "We want a model that fits application and user needs to ensure that application development is easier and that there also is more efficient content delivery," says Arizona professor Beichuan Zhang. The new architecture also will have applications in electronic commerce. NDN will improve content delivery and security by identifying data with unique names that carry authentication information.


New Help on Testing for Common Cause of Software Bugs
Government Computer News (11/01/10) William Jackson

As part of the Automated Combinatorial Testing for Software (ACTS) program, the U.S. National Institute of Standards and Technology (NIST) has developed algorithms for automated testing of the multiple variables in software that can cause security faults. Research has shown that at least 89 percent of security faults are caused by combinations of no more than four variables, and nearly all are caused by no more than six variables, according to NIST. "This finding has important implications for testing because it suggests that testing combinations of parameters can provide highly effective fault detection," NIST says. The ACTS program is a collaborative effort by NIST, the U.S. Air Force, the University of Texas at Arlington, George Mason University, Utah State University, the University of Maryland, and North Carolina State University to produce methods and tools to generate tests for any number of variable combinations.
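The finding motivates t-way (combinatorial) test generation: cover every combination of values for every small group of parameters with far fewer tests than exhaustive enumeration. The greedy sketch below, with hypothetical parameter names, shows the core idea for t = 2 (pairwise); NIST's ACTS tools use more refined algorithms:

```python
from itertools import combinations, product

# Illustrative greedy generator for 2-way (pairwise) combinatorial tests:
# pick tests until every value pair of every two parameters appears in
# at least one test.

def pairwise_tests(params):
    """Return a small list of tests covering all parameter-value pairs."""
    names = list(params)
    uncovered = {(a, va, b, vb)
                 for a, b in combinations(names, 2)
                 for va in params[a] for vb in params[b]}
    tests = []
    while uncovered:
        # choose the candidate test covering the most uncovered pairs
        best = max(product(*params.values()),
                   key=lambda t: sum(
                       (names[i], t[i], names[j], t[j]) in uncovered
                       for i, j in combinations(range(len(names)), 2)))
        tests.append(dict(zip(names, best)))
        uncovered -= {(names[i], best[i], names[j], best[j])
                      for i, j in combinations(range(len(names)), 2)}
    return tests

# Hypothetical configuration space: 3 * 3 * 2 = 18 exhaustive tests.
params = {"os": ["linux", "win", "mac"],
          "browser": ["ff", "ie", "chrome"],
          "proto": ["ipv4", "ipv6"]}
tests = pairwise_tests(params)
print(len(tests), "pairwise tests vs", 3 * 3 * 2, "exhaustive")
```

Even on this tiny space the pairwise suite is smaller than the exhaustive one, and the gap widens rapidly as parameters are added; by NIST's figures, extending coverage to 4-way or 6-way combinations catches nearly all interaction faults.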


Abstract News © Copyright 2010 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe