Association for Computing Machinery
Welcome to the September 15, 2014 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets (click here) and for iPhones (click here) and iPads (click here).


Harvard Computer Science Introductory Course Logs Record-Breaking Enrollment Numbers
The Harvard Crimson (09/11/14) Meg P. Bernhard

Nearly 12 percent of Harvard College's students have enrolled in the college's introductory computer science class, Computer Science 50: "Introduction to Computer Science I." With a record-breaking total enrollment of 818 undergraduate students this semester, CS50 is the college's largest course, followed by "Principles of Economics," the previous semester's largest course. Several factors are contributing to the class's popularity. Instructor David J. Malan says the boost in enrollment in part reflects a growing interest among Harvard students and the general public in computer science. Professor Eddie Kohler says CS50's growing popularity also is due to its accessibility, characterizing the course as more of an experience. Harry R. Lewis, Harvard's director of undergraduate studies for computer science, says Harvard students have "figured out that in pretty much every area of study, computational methods and computational thinking are going to be important to the future." Lewis also says he has seen higher enrollment than ever in other computer science courses this semester, including "Introduction to the Theory of Computation," which has 153 students enrolled. The number of computer science concentrators at Harvard also has increased, nearly doubling between 2008 and 2013.

New Species of Electrons Can Lead to Better Computing
University of Manchester (09/12/14) Daniel Cochlin

Researchers at the University of Manchester and the Massachusetts Institute of Technology (MIT) have found that electrons move at a controllable angle to applied fields in an altered form of graphene, a discovery that could lead to next-generation low-energy computers. The new form of graphene is transformed into a superlattice state by placing it on top of boron nitride and then aligning the crystal lattices of the two materials; the graphene superlattice behaves as a semiconductor. Electrons in graphene superlattices behave like neutrinos that have acquired a notable mass, a new relativistic behavior that lets them skew at large angles to applied fields. The researchers say the discovery could help enhance the performance of graphene-based electronics, making it comparable to silicon. Graphene superlattices should consume less energy than conventional semiconductor transistors because charge carriers drift perpendicular to the electric field, which results in little energy dissipation. "The demonstrated transistor highlights the promise of graphene-based systems for alternative ways of information processing," says MIT professor Leonid Levitov.

Bound for Robotic Glory
MIT News (09/15/14) Jennifer Chu

Massachusetts Institute of Technology (MIT) researchers have developed an algorithm for bounding and successfully implemented it in a robotic cheetah. During testing, the robot sprinted up to a speed of 10 miles per hour and was able to continue running after clearing a hurdle. The current version of the robot has the potential to reach speeds of up to 30 miles per hour, according to the researchers. Each of the robot's legs is programmed to exert a certain amount of force in the fraction of a second that it hits the ground, which enables it to maintain a given speed. By adopting a force-based approach, the cheetah-bot can run on rougher terrain, such as a grassy field, according to MIT professor Sangbae Kim. "That's what makes the MIT cheetah so special: you can actually control the force profile for a very short period of time, followed by a hefty impact with the ground, which makes it more stable, agile, and dynamic," Kim says. By precisely prescribing the force each leg exerts during its brief ground contact, the algorithm enables the robot to reach higher speeds without falling.
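The force-control idea can be illustrated with a simple point-mass model (a hedged sketch, not MIT's actual controller; the mass and timing figures below are invented for illustration): for a steady gait, the vertical impulse a leg delivers during its brief ground contact must cancel gravity over the whole stride, so the required stance force grows as contact time shrinks.

```python
# Point-mass sketch of force-prescribed bounding: over one full stride,
# the vertical impulse from ground contact must balance gravity, i.e.
# F_stance * t_stance = m * g * t_stride.

def stance_force(mass_kg, stride_s, stance_s, g=9.81):
    """Vertical force a leg must exert during ground contact so the
    contact impulse cancels the gravitational impulse over one stride."""
    return mass_kg * g * (stride_s / stance_s)

# Hypothetical numbers: a 33 kg robot, 0.35 s stride, 0.10 s contact.
force = stance_force(33.0, 0.35, 0.10)  # ~1,133 N, about 3.5x body weight
```

The ratio of stride time to contact time is why a short, "hefty impact" demands a large prescribed force, and why controlling that force profile precisely matters for stability.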

Expanding the Lifespan of Critical Resources Through Code Modernization
Scientific Computing (09/09/14) Doug Black

Continuing changes in high-performance computing (HPC) hardware architecture, which is steadily approaching the exascale era, make the modernization of code crucial for scientific computing. "Legacy applications' and domain libraries' algorithms, data structures, design, and implementation do not perform well--or sometimes not at all--on new high-core count architectures, such as Intel Xeon Phi and NVIDIA [graphics processing units]," says Sandia National Laboratories' H. Carter Edwards. Code modernization is as much an issue of ensuring that current applications continue to function on new architectures as it is of ensuring those architectures' potential is being exploited to the fullest. Sandia's Simon Hammond says there are many steps that must be taken to modernize existing applications, from improving code structure to increasing the use of templating techniques to improving the ability of code to vectorize by using language features. However, experts agree one of the major issues is adding thread-scalable parallel algorithms, a capability that will enable code to keep up with the trend toward multicore hardware. Experts also agree code modernization requires the efforts of dedicated teams with access to appropriate resources and a strategic plan for keeping pace with the latest hardware innovations.
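As a toy illustration of the thread-scalable pattern described here (a sketch, not Sandia code; real HPC work would use OpenMP, Kokkos, or similar rather than Python), the key restructuring is to split a loop into independent chunks whose partial results are combined afterward:

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_sum(data, workers=4):
    """Chunk the input, sum each chunk in its own worker, then combine
    the partial sums -- the shape of a thread-scalable reduction."""
    step = max(1, (len(data) + workers - 1) // workers)
    chunks = [data[i:i + step] for i in range(0, len(data), step)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(sum, chunks))

total = parallel_sum(list(range(1_000_000)))
```

(In CPython the global interpreter lock limits the speedup for pure-Python arithmetic; the point is the decomposition itself, which carries over directly to compiled, threaded code on high-core-count hardware.)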

Algorithms Reveal Forecasting Power of Tweets
Binghamton University (09/10/14) Todd R. McAdam

A team of Binghamton University system scientists led by alumnus Nathan Gnanasambandam, senior researcher at Xerox's Palo Alto Research Center (PARC), is developing technology that connects tweets and other user metadata to make highly accurate predictions about everything from where someone plans to eat to when a traffic jam is likely to begin. Using machine learning and an artificial neural network, the PARC and Binghamton researchers analyzed 500 million tweets to develop their algorithms, which they say can predict the typical social media user's behavior with better than 90-percent accuracy over a three-hour time horizon. Some people share more than others, so prediction accuracy varies from user to user, but the researchers say their algorithms return usable predictions more than 60 percent of the time when other metadata, such as credit card transactions, phone calls, and global-positioning system data, are integrated. Xerox, which is funding the research, plans to use the technology in several areas, including traffic control, where it could predict when traffic flow is likely to become heavy and result in a traffic jam. Other potential applications exist in medicine and customer service call centers.

Making Video Games More Fun...for the Audience
Melbourne Newsroom (09/10/14)

Video games need to be redesigned to include the audience experience, according to University of Melbourne professor Frank Vetere. "Advances in gaming technology open up design possibilities for game makers to produce games that are more engaging for the audience," he notes. University of Melbourne Ph.D. candidate John Downs has observed players and audience members both in a laboratory setting and in homes, and he says game makers do not really understand or cater to the experience of audience members. Downs found the design of a video game and the way the player interacts with it affect the enjoyment of others waiting their turn or tolerating the player's hobby. Games that involve more physicality, such as those played with Xbox Kinect, are more fun to watch. The use of second screens to augment the main gameplay is an innovation on the rise. Downs suggests a parent, friend, girlfriend, or boyfriend might be more interested in contributing to the game through a mobile device, which they could use to control an enemy or an obstacle, or to assist the main player in some way.

How to Estimate Energy Footprint in Highways
Technical University of Madrid (Spain) (09/10/14)

A new application from Technical University of Madrid researchers is designed to estimate the energy footprint of highways. The Highway EneRgy Assessment (HERA) application consists of a methodology and software tool developed by researchers in the Transport Research Center. HERA focuses on the operation phase of a highway's energy footprint, estimating the energy footprint of a highway's traffic flow under given traffic conditions. HERA can estimate the fuel consumption, energy consumption, and greenhouse gas emissions of a highway or a section under different conditions over a year. The researchers say new features make the application stand out from existing tools. For example, HERA can be applied to any road network through its "section to section" approach. Moreover, the application can estimate the energy footprint of toll roads under diverse payment systems, including cash, credit card, and free flow. HERA is particularly useful for evaluating alternatives and strategies focused on speed management, fleet renewal, heavy-vehicle management, road design, and toll payment systems. Finally, the HERA methodology links entry and exit data with a geographic information system to yield a geographical representation of energy consumption and carbon footprint.
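The "section to section" idea can be sketched as a simple aggregation (a hypothetical illustration, not the HERA methodology itself; the traffic volumes and consumption rates below are invented): estimate each section's energy from its traffic volume, length, and a per-vehicle-kilometer consumption rate, then sum over the network.

```python
def highway_energy_mj(sections):
    """Total energy (MJ) over highway sections, each given as a tuple
    (vehicles, length_km, mj_per_vehicle_km)."""
    return sum(vehicles * length_km * rate
               for vehicles, length_km, rate in sections)

# Hypothetical network: (daily vehicles, section length km, MJ/vehicle-km)
network = [(20000, 12.5, 2.4), (18000, 8.0, 2.6)]
energy = highway_energy_mj(network)  # ~974,400 MJ for the day
```

Per-section rates are what let such a tool reflect local conditions (speed, fleet mix, gradients) instead of a single network-wide average.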

Cloud-Computing Revolution Applies to Evolution
Rice University (09/10/14) Mike Williams

Rice University researchers are using a $1.1-million U.S. National Science Foundation grant to develop parallel-processing tools that track the evolution of genes and genomes across species. The new open source algorithms will lead to sophisticated computing techniques that researchers around the world can use through the cloud. The programs will be able to run parallel analyses on thousands of computers, returning results faster and tracing genes at scales that were not practical before. The project will expand upon Bayesian inference techniques that allow biologists to build on prior knowledge. "Analyzing data sets with 10 or 20 gene sequences can easily take hundreds of hours," says Rice professor Luay Nakhleh. "But the tree of life has millions of sequences and is built from millions of species. There's no way traditional Bayesian techniques are even going to get close to handling that." Computer farms that allow thousands of machines to cooperatively work on a problem have the potential to revolutionize bioinformatics, according to fellow Rice professor Christopher Jermaine. "We're talking about potentially taking a years- or decades-long computation and making it feasible by changing the underlying algorithm and making it amenable to distributed computing," he says.

First Graphene-Based Flexible Display Produced
University of Cambridge (09/05/14)

University of Cambridge researchers have combined graphene research with transistor and display processing to create the first graphene-based flexible display. The researchers say the breakthrough is the first step toward the wider implementation of graphene and graphene-like materials into flexible electronics. The new prototype is an active matrix electrophoretic display made of flexible plastic instead of glass. The display's pixel electronics include a solution-processed graphene electrode that replaced the sputtered metal electrode layer. The new 150-pixel-per-inch backplane was generated at low temperatures using Plastic Logic's Organic Thin Film Transistor technology. The graphene electrode was deposited from solution and patterned with micron-scale features to complete the backplane. The backplane was integrated with an electrophoretic imaging film to create an ultra-low power and durable display. "This is a significant step forward to enable fully wearable and flexible devices," says Cambridge professor Andrea Ferrari. "This cements the Cambridge graphene-technology cluster and shows how an effective academic-industrial partnership is key to help move graphene from the lab to the factory floor." Ferrari says future demonstrations of the technology could incorporate liquid crystal and organic light-emitting diode technology to achieve full color and video functionality.

Nanotechnology Aids in Cooling Electrons Without External Sources
University of Texas at Arlington (09/10/14) Herb Booth

University of Texas at Arlington (UT Arlington) researchers have developed a method for cooling electrons to -228 degrees Celsius without external means and at room temperature, a breakthrough that could lead to very low power electronic devices. The new method involves passing electrons through a quantum well to cool them and keep them from heating. "Obtaining cold electrons at room temperature has enormous technical benefits," says UT Arlington professor Seong Jin Koh. "For example, the requirement of using liquid helium or liquid nitrogen for cooling electrons in various electron systems can be lifted." Electrons are thermally excited at room temperature, but if that excitation could be controlled, their temperature could be lowered without external cooling. The researchers used a nanoscale structure consisting of a sequential array of a source electrode, a quantum well, a tunneling barrier, a quantum dot, another tunneling barrier, and a drain electrode to suppress electron excitation and keep electrons cold. Cold electrons could enable a new type of transistor that operates at extremely low energy consumption. "When implemented in transistors, these research findings could potentially reduce energy consumption of electronic devices by more than 10 times compared to the present technology," says the U.S. National Science Foundation's Usha Varshney.

Essex Scientists Give Insight Into Future of Ultra-HDTV Live Stream Technology
University of Essex (09/09/14)

University of Essex scientists will showcase their pioneering ultra-high definition TV (UHDTV) research at the International Broadcasting Convention in Amsterdam. The researchers already used the new technology earlier this year to stream the university's graduation ceremonies live across the globe via the Internet. The team managed to live stream 4K UHDTV--four times the current HD resolution--by adapting off-the-shelf video-compression equipment. The researchers were able to compress the ultra-high definition image so it could be live streamed at 8 Mbps via ordinary broadband connections without loss of quality and in real time, avoiding the frustrations of waiting for the stream to buffer. "This type of live streaming involves a huge amount of raw data, equivalent to about 63,000 phone calls being made all at once," says Essex professor Stuart Walker. "It was a major challenge to be able to compress this signal to a size which could be accessed by even the most basic broadband connection around the world." Walker says recipients would have needed a special 4K TV; the team's next challenge is to make live 8K images affordable.
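The scale of the compression can be checked with back-of-the-envelope arithmetic (a sketch with assumed parameters: 3840x2160 resolution, 24-bit color, 30 frames per second; the article does not state the frame rate or bit depth): uncompressed 4K video runs to roughly 6 gigabits per second, so squeezing it into an 8 Mbps stream implies a compression ratio of several hundred to one.

```python
def raw_bitrate_mbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed video bitrate in megabits per second."""
    return width * height * fps * bits_per_pixel / 1e6

raw = raw_bitrate_mbps(3840, 2160, 30)  # ~5,972 Mbps uncompressed
ratio = raw / 8                         # ~746:1 to fit an 8 Mbps stream
```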

National Trustworthy Software Initiative to Be Based at WMG's Cyber Security Center
University of Warwick (09/09/14) Peter Dunn

A new program at the University of Warwick will focus on enhancing the cybersecurity of everyday technologies and tools by helping to ensure the underlying software is more trustworthy. Warwick Manufacturing Group's (WMG) Cyber Security Center will host the two-year program. A team from the Trustworthy Software Initiative (TSI), which is funded by the U.K. government, will work alongside researchers in the WMG Cyber Security Center. Cyber Security Center director and professor Tim Watson notes the trend to embed software in everyday items is increasing. "However, there are concerns about the quality of the software that underpins all of this, and we have not seen significant improvements," he says. Untrustworthy software is the root cause of many cybersecurity problems due to vulnerabilities linked to safety, reliability, availability, resiliency, and security, says TSI Knowledge Transfer director Tony Dyhouse. "There is a pressing need to address the quality and robustness of our software--to establish its trustworthiness," he says. Dyhouse also notes TSI has "made a significant start in the U.K. by documenting, for the first time, the overall principles for effective software trustworthiness in PAS 754:2014 Software trustworthiness--Governance and management--Specification."

Abstract News © Copyright 2014 INFORMATION, INC.
Powered by Information, Inc.

To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe