Association for Computing Machinery
Welcome to the November 14, 2012 edition of ACM TechNews, providing timely information for IT professionals three times a week.

ACM TechNews mobile apps are available for Android phones and tablets, as well as for iPhones and iPads.

HEADLINES AT A GLANCE


Cray Bumps IBM From Top500 Supercomputer Top Spot
IDG News Service (11/12/12) Joab Jackson

Oak Ridge National Laboratory's Titan supercomputer was named the world's fastest supercomputer in the latest edition of the Top500 list. Titan, a Cray XK7, took the top spot from Lawrence Livermore National Laboratory's Sequoia supercomputer, an IBM BlueGene/Q, which came in second. On the Linpack benchmark, Titan achieved 17.59 petaflops, compared to Sequoia's 16.32 petaflops. Also in the top five were the RIKEN Advanced Institute for Computational Science's K computer at 10.5 petaflops, the U.S. Department of Energy's Argonne National Laboratory's Mira, a BlueGene/Q system, at 8.16 petaflops, and Germany's Forschungszentrum Juelich's Juqueen, also an IBM BlueGene/Q system, at 4.14 petaflops. The most recent Top500 list had 23 systems demonstrating petaflop performance, just four and a half years after Los Alamos National Laboratory's Roadrunner became the first petaflop-scale system. The latest list also reveals other trends in supercomputing. This edition lists 62 systems using accelerator and co-processor technology such as NVIDIA graphics processing units, up from 58 six months ago. The United States hosts 251 of the top 500 systems, while Asia and Europe host 123 and 105 systems, respectively.


Exploring Credits for Free Online Courses
Washington Post (11/13/12) Nick Anderson

The American Council on Education (ACE), the Bill & Melinda Gates Foundation, and Coursera have launched an initiative to determine if free online courses are worthy of academic credit and how they might be used to help more people pursue college degrees. The announcement is the latest sign of the emerging influence of massive open online courses (MOOCs). As part of the initiative, Coursera will pay ACE a fee to evaluate the credit worthiness of a selection of its courses. A recommendation from ACE that the courses are worthy of credit would be a key step toward helping students obtain transfer credit from other schools. If that path to credit becomes a reality, "it is going to push more people into college and make them more successful," says Stanford University professor and Coursera co-founder Daphne Koller. The Gates Foundation is awarding ACE more than $895,000 in grants to coordinate research on MOOCs and convene university presidents for an innovation lab to discuss strategies to capitalize on the potential of MOOCs. ACE also is in discussions with edX, a nonprofit MOOC venture led by the Massachusetts Institute of Technology and Harvard University, about analyzing its courses.


Speeding Algorithms by Shrinking Data
MIT News (11/13/12) Kimberly Allen

Massachusetts Institute of Technology (MIT) researchers have developed a technique to represent data so that it takes up much less space in memory but can still be processed in conventional ways. The researchers say the approach will lead to faster computations, and could be more generally applicable than other big-data techniques because it can work with existing algorithms. The researchers tested the technique on two-dimensional location data generated by global positioning system receivers in cars. The algorithm approximates the data with straight line segments that connect the points at which a car turns. The most important aspect of the algorithm is that it can compress data on the fly, using a combination of linear approximations and random samples, says MIT's Dan Feldman. Although some information is lost during compression, the researchers were able to provide mathematical guarantees that the error introduced will stay beneath a low threshold. The researchers are now investigating how the algorithm can be applied to video data analysis, in which each line segment represents a scene and the junctures between line segments represent cuts.
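
The article describes the approach only at a high level. As a rough illustration of the underlying idea of trading a small, bounded error for a much smaller representation, the sketch below greedily merges consecutive GPS points into straight line segments while the worst-case deviation stays under a threshold; the function names, the greedy strategy, and the error bound are illustrative assumptions, not the MIT researchers' algorithm.

```python
# Illustrative sketch only: greedy piecewise-linear compression of a 2-D
# trajectory, keeping the maximum point-to-segment error below a threshold.
# This is a simple stand-in for the general idea of replacing many points
# with a few line segments plus an error guarantee.
import math

def point_segment_distance(p, a, b):
    """Distance from point p to the segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def compress_trajectory(points, max_error=10.0):
    """Greedily extend each segment until some point deviates by > max_error."""
    if len(points) < 3:
        return list(points)
    kept = [points[0]]
    start = 0
    for i in range(2, len(points)):
        # Check every point between the segment start and the candidate endpoint.
        worst = max(point_segment_distance(points[j], points[start], points[i])
                    for j in range(start + 1, i))
        if worst > max_error:
            kept.append(points[i - 1])   # close the segment at the previous point
            start = i - 1
    kept.append(points[-1])
    return kept

# Example: a long, slightly noisy path with one sharp turn collapses to a
# handful of segment endpoints, with every discarded point within the bound.
path = [(x, 0.1 * (x % 3)) for x in range(50)] + [(49, y) for y in range(1, 30)]
print(len(path), "->", len(compress_trajectory(path, max_error=1.0)))
```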


White House Tells Tech Sector to Dream Big
CIO (11/12/12) Kenneth Corbin

The White House Office of Science and Technology Policy (OSTP) is challenging industry members, government agencies, academics, and philanthropic organizations to develop ways to use new technology to solve some of the U.S.'s most pressing problems. OSTP's Tom Kalil recently cited several promising projects, including Google's efforts to develop driverless cars, IBM's work in artificial intelligence, and the sequencing of the human genome. The administration's efforts to promote technology to address the country's challenges follow previous outreach efforts to court the business and startup communities and build collaborative partnerships with the private sector. Last January the White House convened business and philanthropic leaders to unveil the Startup America Partnership, a campaign to help incubate and launch innovative startup businesses. The administration also supported the passage of the Jumpstart Our Business Startups Act, which aims to ease the path for startups to secure capital through mechanisms such as crowdfunding. In addition, Kalil's office wants to elevate the cultural status of successful entrepreneurs by encouraging filmmakers and others in the entertainment industry to focus on scientists and inventors who are developing transformative technologies and businesses.


New NCSA Team to Focus on Big Science and Engineering Data Challenges
NCSA News (11/08/12) Trish Barker

The U.S. National Center for Supercomputing Applications' (NCSA's) new Data Cyberinfrastructure Directorate will combine NCSA projects, personnel, and capabilities to focus on data-driven science. "Science and engineering are being revolutionized by the increasingly large amounts and diverse types of data flowing from new technologies, such as digital cameras in astronomy, highly automated sequencers in biology, and the detailed simulations enabled by the new generation of petascale computers, including NCSA's Blue Waters," says NCSA's Thom Dunning. The Data Cyberinfrastructure team will fill the gap between raw data and the research and educational uses of the information derived from that data, says NCSA's Rob Pennington, who leads the effort. He says data-driven science builds on advanced information systems to analyze raw data, making the resulting conclusions accessible, searchable, and usable to the wider community of scientists and engineers. "Many disciplines and projects face the same or very similar issues, so it is more productive for the researchers if we leverage solutions across multiple domains rather than each community separately grappling with its data challenges in isolation," Pennington says. The Data Cyberinfrastructure Directorate initially will focus on five areas of science and engineering: astronomy, biomedicine, sustainability, industry, and geographic information systems.


The True State of Artificial Intelligence
Monash University (11/09/12)

Monash University researcher Kevin Korb recently discussed what stage artificial intelligence (AI) research has reached. "The goal of AI as a discipline is to produce AI as an artifact, and the motivations for that are many and diverse," Korb says. He says the debates surrounding AI have led to three prominent conclusions. The first conclusion is that AI is, was, and always will be brain dead, according to Korb. He cites philosopher Hubert Dreyfus' argument that traditional AI, which uses rules, symbols, and data structures, cannot possibly simulate human intelligence. Korb says the second possibility is that AI requires only another 10 or 20 years to put human brain simulation within researchers' grasp, at which point the evolution of humanity will be overtaken and absorbed by the evolution of our artifacts. Korb says the final conclusion is that if an AI is ever to be developed, it will require a long-term, collective effort of many researchers over multiple generations.


Federal Agencies, Private Firms Fiercely Compete in Hiring Cyber Experts
Washington Post (11/13/12) Ellen Nakashima

U.S. federal agencies are intensely vying with the private sector for people with cyberskills as cyberthreats escalate, and among the factors agencies must contend with is the higher salary that private-sector cybersecurity jobs tend to offer. The U.S. Department of Homeland Security (DHS) currently has a 400-person cyber workforce, and officials say it wants to build an even bigger federal cyber squad in addition to the 1,500 contractors who already work for the agency. DHS Secretary Janet Napolitano organized a cyber task force whose goals include having the government reserve the most technically intensive jobs for government employees rather than outsource them to contractors, and training military veterans for mission-critical cybersecurity operations. The U.S. National Security Agency (NSA) anticipates hiring more than 1,200 people this year, with about 45 percent in the science, math, and cyber disciplines. NSA spokeswoman Vanee Vines says the agency is especially seeking cyberprofessionals in computer science, computer engineering, electrical engineering, and math. Meanwhile, the U.S. Federal Bureau of Investigation (FBI) also is fortifying its cyber capabilities to probe and halt the theft of information, money, and intellectual property. The bureau plans to hire 50 computer scientists, up from 20 today, says the FBI's Richard A. McFeely.


Crowdsourcing Feature Lets iPhone Users Determine Best Time to Cross U.S. Border
UCSD News (CA) (11/08/12) Doug Ramsey

University of California, San Diego researchers have developed the Best Time to Cross the Border mobile app, which offers a real-time, eyewitness account of how long commuters have to wait at border crossings. The crowdsourced data is combined with wait-time data from U.S. Customs and Border Protection to improve the accuracy of the reported wait times. Crowdsourcing information from a wide range of people crossing the border adds an extra dimension to the survey. "Crowdsourcing should eliminate the problems that the border patrol has been having with regards to the accuracy of the wait times," says California Institute for Telecommunications and Information Technology (Calit2) researcher Ganz Chockalingam. The app builds on other services that Calit2 provides, such as the California Wireless Traffic Report, which facilitates access to information on commuting times in the Bay Area, Los Angeles, Orange County, and San Diego County. "With a little help from our app, you can make an informed decision about when and where you want to cross the border if time is a factor," Chockalingam says. The app also uses historical graphs to show users the best times to cross.
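
The article does not say how the crowdsourced reports and the official figures are combined. One simple, purely hypothetical way to blend them is a recency-weighted average of user reports that falls back toward the official Customs and Border Protection estimate when few fresh reports exist; the function, weights, and half-life below are assumptions for illustration, not the app's actual method.

```python
# Hypothetical sketch: blending crowdsourced wait-time reports with an
# official estimate. The weighting scheme is an illustrative assumption.
import time

def estimate_wait(crowd_reports, official_minutes, half_life_s=1800.0, now=None):
    """crowd_reports: list of (timestamp_seconds, reported_wait_minutes)."""
    now = time.time() if now is None else now
    weighted_sum = total_weight = 0.0
    for ts, minutes in crowd_reports:
        age = max(0.0, now - ts)
        w = 0.5 ** (age / half_life_s)       # older reports count less
        weighted_sum += w * minutes
        total_weight += w
    if total_weight == 0.0:
        return official_minutes              # no usable crowd data
    crowd_estimate = weighted_sum / total_weight
    # Trust the crowd more as the effective number of fresh reports grows.
    crowd_confidence = total_weight / (total_weight + 2.0)
    return crowd_confidence * crowd_estimate + (1 - crowd_confidence) * official_minutes

now = time.time()
reports = [(now - 300, 45), (now - 900, 60), (now - 3600, 20)]
print(round(estimate_wait(reports, official_minutes=35, now=now)), "minutes")
```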


A New Chip to Bring 3-D Gesture Control to Smartphones
Technology Review (11/13/12) Jessica Leber

Microchip Technology has developed a mobile device controller that uses electrical fields to make three-dimensional measurements. The low-power chip makes it possible to interact with mobile devices and other consumer electronics using hand gestures. Gesture-recognition technology has advanced as scientists attempt to create more natural user interfaces that go beyond touchscreens, keyboards, and mice. Microchip says its controller uses 90 percent less power than camera-based gesture systems, and it never has to be turned off. The system works by transmitting an electrical signal and then calculating the three-coordinate position of a hand based on the disturbances the hand creates in the field. The controller comes with the ability to recognize 10 predefined gestures, including wake-up on approach, position tracking, and various hand flicks. The gesture library was built by the chipmaker using algorithms that learned how different people make the same movements. The system cannot yet distinguish between an open hand and a closed fist, or track simultaneous movements of different fingers, and it only works within a six-inch range. "That's the biggest drawback," says University of Washington researcher Sidhant Gupta. "But I think, still, it's a pretty big win, especially when compared to a camera system."
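
As a rough, hypothetical illustration of how a position can be recovered from field disturbances (not Microchip's actual algorithm), the sketch below models a handful of receive electrodes at known positions and estimates the hand's horizontal position as a signal-weighted centroid, with height inferred from overall signal strength; the electrode layout, signal model, and constants are assumptions.

```python
# Hypothetical sketch of electric-field position sensing: a hand disturbs the
# field over several receive electrodes, and its (x, y) position is estimated
# as a signal-weighted centroid, with height (z) inferred from total signal.
# Electrode layout and calibration constants are illustrative assumptions.

# Electrode positions (cm) on a phone-sized sensing surface.
ELECTRODES = {"left": (0.0, 5.0), "right": (6.0, 5.0),
              "top": (3.0, 10.0), "bottom": (3.0, 0.0), "center": (3.0, 5.0)}

def estimate_position(disturbance):
    """disturbance: dict of electrode name -> signal change (arbitrary units)."""
    total = sum(disturbance.values())
    if total <= 0:
        return None                       # no hand within sensing range
    x = sum(disturbance[n] * ELECTRODES[n][0] for n in ELECTRODES) / total
    y = sum(disturbance[n] * ELECTRODES[n][1] for n in ELECTRODES) / total
    z = 15.0 / (1.0 + total)              # stronger disturbance -> closer hand
    return (round(x, 1), round(y, 1), round(z, 1))

# A hand hovering near the right edge disturbs the right electrode most.
print(estimate_position({"left": 0.2, "right": 1.5, "top": 0.5,
                         "bottom": 0.4, "center": 0.9}))
```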


Technology Will Help Humans Overcome Societal Challenges
Gulf Times (Qatar) (11/13/12) Noimot Olayiwola

Raj Reddy, who holds the Moza Bint Nasser University Professorship of Computer Science and Robotics at Carnegie Mellon University, says computing technology will transform the way humans live, learn, work, and communicate. "It is very important to provide people with access to education and healthcare and we should also begin the process to eliminate the over one million auto-related deaths that occur every year," Reddy says. He notes many academic institutions are developing programs to help spread computing technology. For example, Carnegie Mellon is developing an automated tutoring system for children with learning disabilities, while the Rajiv Gandhi University of Knowledge Technologies provides free online lectures to about four percent of schools in rural communities. Reddy also notes that the Khan Academy has developed computing technology for India's emergency services. "We have seen an exponential change in the information technology world in the last 40 years so much so that the [information technology] processing power increases one thousand times every 15 years and the disk memory as well as the fiber bandwidth each multiplies by a thousand every 10 years," Reddy says. However, he notes it has been more difficult to find computing solutions for shortages in food, clothing, and shelter.
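
For perspective, a thousandfold increase every 15 years corresponds to a doubling time of roughly 18 months, and a thousandfold increase every 10 years to roughly a year; the short calculation below converts Reddy's figures into doubling times.

```python
# Converting "1,000x every N years" growth figures into doubling times.
import math

def doubling_time(factor, years):
    """Years needed to double, given growth by `factor` over `years`."""
    return years * math.log(2) / math.log(factor)

print(round(doubling_time(1000, 15), 2))   # processing power: ~1.5 years
print(round(doubling_time(1000, 10), 2))   # disk and fiber bandwidth: ~1.0 year
```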


Female Developers and Athletes Take the Leading Role at espnW Hack Day
Wired News (11/13/12) Alexandra Chang

ESPN's espnW hack day at Stanford University concentrated on women producing sports-related computer applications for women. "We've come a long way, but there's still a long way to go in terms of getting the numbers of women engaged in technology," says Facebook software engineer and espnW hackathon judge Sophia Chung. One participant, visual designer Terry Rodriguez-Hong, aimed to put together a team to develop a social-networking app for helping former athletes connect and encourage each other to reach their fitness objectives. Meanwhile, a five-woman team outlined a concept for a calendaring app to help sports teams keep track of their games and events and interact with fans through social media. The winning app was iSports, developed by Carnegie Mellon University engineering students and designed to pull athlete videos from YouTube and pertinent facts from ESPN. Mightybell founder Gina Bianchini says the goal of the hackathon is not just attracting more women into the technology field, but building a broader overall community. "Events serve as the heartbeat of any community and any movement," she notes. "I think about it less as we should have events for women developers and more as if you want to build a community of creative thinkers building new and interesting things."


ICTs Allow Electrical Consumption to Be Reduced by One Third
Carlos III University of Madrid (Spain) (11/12/12)

The ENERsip project, which consists of 10 partners from five European countries, has developed an information and communications technology platform that reduces residential electrical consumption by 30 percent and integrates micro-generating installations using renewable energy. The platform is based on reducing the consumption of electricity in homes and adjusting the consumption and generation of electricity in districts. The system "gives the users information regarding their consumption, allowing them to identify the appliances that use the most energy; it then suggests possible solutions, attempting to modify certain behaviors and promoting good practices that allow consumers to reduce their electricity bill," says Carlos III University of Madrid professor Jose Ignacio Moreno. The ENERsip platform monitors appliances using networks of sensors and actuators and wirelessly controls them through Web applications. The system also carries out automatic actions that adjust consumption in homes within a district so that they use renewable energy generated by sources within the same district. The researchers tested the system in computer simulations and validated the platform in a pilot project.
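
The article describes the platform only in broad strokes. As a hypothetical sketch of the district-level matching idea, the code below shifts a home's flexible appliance loads into the hours with the largest surplus of locally generated renewable energy; the data, function name, and greedy strategy are illustrative assumptions, not the ENERsip design.

```python
# Hypothetical sketch of district-level consumption/generation matching:
# flexible appliance loads are shifted to the hours with the largest surplus
# of locally generated renewable energy. Not the actual ENERsip algorithm.

def schedule_flexible_loads(generation_kwh, baseline_kwh, flexible_loads_kwh):
    """Assign each flexible load (e.g., a washing-machine cycle) to the hour
    with the most remaining local surplus. Inputs are per-hour lists of kWh."""
    surplus = [g - b for g, b in zip(generation_kwh, baseline_kwh)]
    schedule = []
    for load in sorted(flexible_loads_kwh, reverse=True):   # biggest loads first
        hour = max(range(len(surplus)), key=lambda h: surplus[h])
        surplus[hour] -= load
        schedule.append((hour, load))
    return schedule

# 24 hours of district solar generation vs. baseline household consumption (kWh).
generation = [0]*7 + [1, 2, 3, 4, 5, 5, 4, 3, 2, 1] + [0]*7
baseline = [0.8] * 24
print(schedule_flexible_loads(generation, baseline, [2.0, 1.5, 1.0]))
```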


Predicting Presidents, Storms, and Life by Computer
Associated Press (11/10/12) Seth Borenstein

Better and more accessible data and rapidly growing computer power have helped make computer models more precise, as reflected by the remarkably accurate predictions of the development of Hurricane Sandy and the outcome of the U.S. presidential election. Computer model predictions founded on historical evidence are "one of the more positive trends we're going to see this century," says Tom Mitchell of Carnegie Mellon University's Machine Learning Department. The predictive power of computer models rests on three elements--computer power, mathematical formulas designed to mirror real-world cause and effect, and present conditions rendered as numbers that can be used in the formulas. Experts feed current-condition data into formulas that anticipate a specific outcome when specific factors are combined, and the systems then repeatedly run those what-if simulations, with slight variations changing the final outcomes. Running these scenarios tens of thousands of times produces an entire range of results, and the end product is a breakdown of future events into probabilities. Statistician Nate Silver says his correct prediction of how all 50 states would vote for president is a triumph for computer modeling's use in the field of politics.
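
As a toy illustration of that repeat-with-slight-variations approach (not any forecaster's actual model), the sketch below simulates a made-up three-state election tens of thousands of times, perturbing each state's polling margin with random error, and reports the share of simulations one candidate wins; all polling numbers and error sizes are invented.

```python
# Toy Monte Carlo illustration of turning "current conditions plus slight
# variations, repeated many times" into a probability. States, margins, and
# error sizes are invented for illustration.
import random

# Hypothetical polling margins (candidate A minus candidate B, in points)
# and electoral votes for three made-up states.
STATES = {"StateX": (+2.0, 20), "StateY": (-1.0, 18), "StateZ": (+0.5, 15)}
POLL_ERROR_STDDEV = 3.0   # assumed polling error, in points
NEEDED = 27               # electoral votes needed to win on this toy map

def simulate_once(rng):
    votes = 0
    for margin, ev in STATES.values():
        if rng.gauss(margin, POLL_ERROR_STDDEV) > 0:   # perturb the margin
            votes += ev
    return votes >= NEEDED

def win_probability(trials=50_000, seed=42):
    rng = random.Random(seed)
    wins = sum(simulate_once(rng) for _ in range(trials))
    return wins / trials

print(f"Candidate A win probability: {win_probability():.1%}")
```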


Abstract News © Copyright 2012 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe