Association for Computing Machinery
Welcome to the April 18, 2012 edition of ACM TechNews, providing timely information for IT professionals three times a week.

ACM TechNews mobile apps are available for Android phones and tablets and for iPhones and iPads.

HEADLINES AT A GLANCE


Online Education Venture Lures Cash Infusion and Deals With 5 Top Universities
New York Times (04/18/12) John Markoff

Stanford University computer science professors Andrew Ng and Daphne Koller recently received $16 million in venture capital funding and partnered with five universities to launch Coursera, a free online learning system. Ng and Koller say Coursera is a Web portal for distributing a broad array of interactive courses in the humanities, social sciences, physical sciences, and engineering. The five universities working with Coursera are Stanford, the University of California, Berkeley, the University of Michigan, the University of Pennsylvania, and Princeton University. "When we offer a professor the opportunity to reach 100,000 students, they find it remarkably appealing," Koller says. Ng and Koller were motivated to launch Coursera by the opportunity to reach hundreds of thousands of students instead of just hundreds. "We decided the best way to change education was to use the technology we have developed during the past three years," Ng says. Coursera breaks lectures into segments as short as 10 minutes and offers quick online quizzes as part of each segment. It also offers an online feature that enables students to get support from the global student community. During testing, the researchers found that questions were typically answered in less than 22 minutes.


Data Centers in Va. and Elsewhere Have Major Carbon Footprint, Report Says
Washington Post (04/17/12) Juliet Eilperin

Data centers and mobile telecommunications networks use more than 623 billion kilowatt-hours of electricity annually, and a 2008 study found that the information technology (IT) sector accounted for 2 percent of the world's greenhouse gas emissions. If the industry were a country, it would rank fifth in the world in electricity demand. In addition, all but one of the major U.S. IT companies still rely on fossil fuels to power more than half of their cloud operations, according to a new Greenpeace report. "Data centers and the cloud would be an environmental win if we build them in the right way, and connect them in the right way," says Greenpeace's Gary Cook. "If we just connect them to traditional sources of fossil fuel energy, that becomes a real train wreck." IT firms are becoming increasingly important customers for U.S. utility companies, and as that growth continues they are well positioned to lobby politicians and company executives to boost renewable energy supplies. Some are already using this leverage by siting data centers near renewable sources, investing directly in renewable energy, or pushing for legislative changes at the state level.


Programming Project Comes to Primary Schools
BBC News (04/17/12)

A volunteer project in the United Kingdom is writing session plans for teaching the basics of computer programming to children aged 10 to 11. Called Code Club, the project will give schoolchildren an opportunity to engage in hands-on tasks such as making games and eventually controlling robots. "The idea is to build things that are really exciting," says Clare Sutcliffe, who developed the idea for Code Club along with Linda Sandvik. Several schools have volunteered to test the session plans. The sessions will use the Massachusetts Institute of Technology's Scratch tool, which enables youngsters to program by dragging and dropping code elements instead of typing them. The first 12 sessions should be free for participating schools to run; the only extra step would be downloading and installing Scratch on their computers. Volunteers, rather than teachers, would run the clubs. Sutcliffe and Sandvik hope to have Code Clubs in 25 percent of British primary schools by 2014. Code Club would "slot neatly alongside" planned changes to the national curriculum that will emphasize programming, Sutcliffe notes.


Using His Software Skills With Freedom, Not a Big Payout, in Mind
New York Times (04/17/12) Jim Dwyer

A group of computer scientists, led by college student Nadim Kobeissi, is developing Cryptocat, which uses encryption technology to create an Internet chat room where people can be free from surveillance. "The whole point of Cryptocat is that you click a link and you’re chatting with someone over an encrypted chat room," says Kobeissi, who is working with Guardian Project developers on the program. Cryptocat disguises the content of chat messages so that they look like gibberish to anyone who does not have the encryption key. As many as 10 users can simultaneously speak privately to one another in a Cryptocat chat room, which separates the platform from other encryption chat services. "Cryptocat is an enabling, positive technology, and it’s an alternative," says Jacob Appelbaum, a developer with the Tor project, which routes Web traffic in ways that help disguise sites that people have visited. Critics say the development of powerful tools that prevent companies and government from analyzing personal data could create safe havens for terrorists and online sexual predators. "Evil people have been evil forever," Kobeissi says. "I don’t think they’re going to stop being evil or become more evil because of Cryptocat."
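Cryptocat's actual protocol is not detailed in the article, but the underlying idea, symmetric encryption that renders messages unreadable without the shared key, can be sketched in a few lines of Python. This is a minimal illustration using the cryptography package's Fernet recipe, not Cryptocat's implementation, and the sample message is made up:

    # Illustration only: symmetric encryption of a chat message,
    # not Cryptocat's actual protocol.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()        # shared secret held only by chat participants
    cipher = Fernet(key)

    token = cipher.encrypt(b"meet at noon")  # gibberish to anyone without the key
    print(token)                             # e.g., b'gAAAAAB...'
    print(cipher.decrypt(token))             # b'meet at noon', recovered with the key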


How to Perfect Realtime Crowdsourcing
Technology Review (04/17/12)

One strategy for incorporating human-like intelligence in common computing applications is to employ people through some type of crowdsourcing process, although latency issues need to be resolved. Massachusetts Institute of Technology researcher Michael Bernstein and colleagues are focused on this problem, and they say they have found a way to shrink crowd reaction time to 500 milliseconds. Their process involves "precruiting" a crowd and keeping it on standby until a task opens up. The researchers use queuing theory to determine how to optimize the precruitment process according to how frequently tasks arrive, how long they take, and other variables. To keep workers wired for action, they display a "loading" bar on the worker's screen when no task is at hand; if no task materializes within 10 seconds of the loading bar appearing, the worker is paid and released. Bernstein suggests that real-time crowdsourcing could be used to direct cameras, control robots, and generate instant opinion polls, but crowdsourcing platforms will first need to change their business model. Bernstein and colleagues suggest they could build a retainer into their system design to produce a base of instantaneously available workers.
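The summary does not reproduce the researchers' queuing model, but the basic retainer arithmetic can be sketched: if each precruited worker on standby responds within the window with some probability, the pool size needed for a target reliability follows directly. The numbers below are made-up placeholders, not figures from the paper:

    # Hedged sketch of retainer-pool sizing, not Bernstein et al.'s actual model.
    # Assumption: each standby worker independently responds in time with probability q.
    import math

    def pool_size(q, target):
        """Smallest k with P(at least one response) = 1 - (1 - q)**k >= target."""
        return math.ceil(math.log(1 - target) / math.log(1 - q))

    # Placeholder numbers: 60% per-worker response rate, 99% target reliability.
    print(pool_size(q=0.6, target=0.99))  # -> 6 workers kept on retainer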


Can Social Media Detect the Changes in Public Mood?
University of Bristol News (04/17/12)

Researchers at the University of Bristol's Intelligent Systems Laboratory recently used a word-counting method to analyze the mood of Twitter users in the United Kingdom. The researchers saw a significant increase in negative mood, anger, and fear coinciding with the announcement of spending cuts and with the riots of summer 2011. "Social media allows for the easy gathering of large amounts of data generated by the public while communicating with each other," says Bristol professor Nello Cristianini. The study aimed to determine whether the effects of social events could be seen in Twitter content. The researchers first confirmed that word counting provides a reasonable approach to sentiment analysis. Using lists of words correlated with joy, fear, anger, and sadness, they found that periodic events such as Christmas evoke the same response in the population every year. The research focused on a visible change-point in October 2010, when the government announced cuts to public spending, and the same method flagged the period when riots broke out in several cities in 2011. The researchers found that an increase in public anger on Twitter preceded the riots.
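The word-counting approach is simple enough to sketch: score each message by the fraction of its words that appear in per-emotion word lists. The tiny lists below are illustrative stand-ins for the far larger lexicons the study used:

    # Minimal sketch of lexicon-based mood scoring; the word lists are placeholders.
    EMOTION_WORDS = {
        "joy":     {"happy", "merry", "celebrate", "delighted"},
        "anger":   {"furious", "outrage", "riot", "rage"},
        "fear":    {"afraid", "worried", "threat", "panic"},
        "sadness": {"sad", "grief", "mourn", "loss"},
    }

    def mood_scores(tweet):
        """Fraction of a tweet's words falling in each emotion list."""
        words = tweet.lower().split()
        return {emotion: sum(w in lexicon for w in words) / len(words)
                for emotion, lexicon in EMOTION_WORDS.items()}

    print(mood_scores("Outrage over the cuts as riot spreads"))
    # anger scores 2/7; joy, fear, and sadness score 0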


Multi-Agency Earth System Models (EaSM) Proposals Due
CCC Blog (04/16/12) Erwin Gianchandani

Through its Science, Engineering, and Education for Sustainability initiative, the U.S. National Science Foundation, in collaboration with the U.S. Departments of Agriculture and Energy, is soliciting proposals for its Earth System Models (EaSM) program. EaSM's purpose is to cultivate interdisciplinary projects geared toward the "development and application of next-generation Earth system models" that facilitate a better understanding of climate change, its likely effects on the world, and how people can plan for its consequences. Each model would include "coupled and interactive representations of such things as ocean and atmospheric currents, human activities, agricultural working lands and forests, urban environments, biogeochemistry, atmospheric chemistry, the water cycle, and land ice," according to the solicitation. The initiative seeks to draw scientists from the geosciences, social sciences, agricultural and biological sciences, mathematics and statistics, physics, and chemistry. Full proposals are due May 11. Among the solicitation's long-term goals is enhancing and extending current Earth system modeling capabilities to deliver comprehensive, reliable global and regional forecasts of decadal climate variability and change, grounded in an advanced understanding of the coupled physical, chemical, biological, and human processes that drive the climate system.


The Intangible Assets of the Internet of Things
EE Times (04/16/12) R. Colin Johnson

As Internet Protocol addresses are assigned to virtually every electronic appliance, the challenge will become extracting value from the big data of the Internet of Things (IoT). IBM researchers envision a worldwide electronic nervous system in which trillions of sensors monitor the status of everything of interest to humans and stream the resulting exabytes of data to cloud-based cluster supercomputers, whose analytics software, modeled on the human mind, extracts the ultimate value from the data. "It's not about making individual widgets, but rather about constructing a complete sensory system--a vertical solution that includes the sensors, networking, storage, servers, software, and analytics," says Hewlett-Packard's Stan Williams. The IoT's engineering challenges center on difficult problems in security, standardization, network integration, ultralow-power devices, energy harvesting, and perceived network reliability. Preventing data overload is the prime engineering challenge for networking the IoT, notes Freescale Semiconductor's Ian Davidson. Gartner's Mark Hung says the unifying theme in consumer IoT applications will be control, monitoring, and diagnosis in the service of a seamless user experience. "The Internet of Things will use a set-and-forget model that makes people more efficient, makes their lives safer, and just generally makes everything more convenient," says Ember Corp. CEO Bob LeFort.


Research Lab Extends Host-Based Cyber Sensor Project to Open Source
Network World (04/16/12) Ellen Messmer

The public has an opportunity to participate in the development of a U.S. Department of Energy lab's security sensor through the open source Hone Project. The Pacific Northwest National Laboratory (PNNL) is open-sourcing the software to encourage feedback and participation. "We'd love to have other people use this," says PNNL scientist Glenn Fink, who invented Hone, a cyber-sensor currently available for Linux kernels 2.6.32 and later. Windows 7 and XP versions are in development, and a Mac OS X version is planned. The host-based sensor represents a potential breakthrough in identifying suspicious communications between monitored computers and network activity, whether from the Internet or the internal network. Hone can identify relationships between programs and network activities, and might be able to identify cyberattacks accurately as well as adapt to limit how processes can communicate with the network. Fink says Hone could potentially be used to monitor wireless networks. PNNL uses Hone, which includes a visualization display, as part of a research project to understand how attackers break into computer systems.
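Hone does its correlation inside the kernel, but the kind of process-to-network mapping it produces can be roughly approximated in user space. The following Python sketch uses the psutil library, which is not part of Hone, purely as an illustration:

    # Rough user-space approximation of correlating processes with network activity.
    # Hone works in the kernel; polling with psutil is only an illustration.
    import psutil

    for conn in psutil.net_connections(kind="inet"):
        if conn.raddr and conn.pid:  # connections with a remote end and a known owner
            try:
                name = psutil.Process(conn.pid).name()
            except psutil.NoSuchProcess:
                continue  # process exited between the two calls
            print(f"{name} (pid {conn.pid}) -> {conn.raddr.ip}:{conn.raddr.port}")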


Crunch Commuter Data to Track Changing Communities
New Scientist (04/16/12) Jacob Aron

University of Cambridge researchers compared commuter data with official measures of social deprivation and found that a community's prosperity is reflected in the movements of its residents. London's rail commuters carry cards with anonymized identification numbers, which enabled the researchers to infer where each commuter lived and worked by picking the two stations that person visited most. The researchers then compared the number of visitors an area received with the Index of Multiple Deprivation, a composite measure based on census data that weighs factors such as income, health, and crime levels in a particular community. They found that more deprived areas were visited more often and that people from deprived areas did the most traveling. The results could enable policy-makers to respond more rapidly to changing social dynamics within a city than by relying on census data alone, says Cambridge researcher Daniele Quercia. The researchers also developed UrbanOption, an online game that tests which places stick in people's minds: it shows users a random Google Street View image of a location and asks them to match it to the nearest station or borough.
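The home/work inference step lends itself to a short sketch: for each anonymized card, tally station visits and keep the two most frequent. The tap records below are synthetic, not the researchers' data:

    # Sketch of the two-most-visited-stations heuristic on synthetic tap records.
    from collections import Counter, defaultdict

    taps = [("card1", "Brixton"), ("card1", "Bank"), ("card1", "Brixton"),
            ("card1", "Bank"), ("card1", "Euston"),
            ("card2", "Hackney"), ("card2", "Hackney"), ("card2", "Oxford Circus")]

    visits = defaultdict(Counter)
    for card, station in taps:
        visits[card][station] += 1

    for card, counts in visits.items():
        top_two = [station for station, _ in counts.most_common(2)]
        print(card, "->", top_two)  # assumed home and work stations
    # card1 -> ['Brixton', 'Bank']
    # card2 -> ['Hackney', 'Oxford Circus']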


Cyber Security Exercise Puts Laboratories to the Test
Los Alamos National Laboratory News (04/12/12)

The Los Alamos National Laboratory (LANL) recently hosted a simulation of a breached network, with sensitive data spewing across the Internet. Dubbed Eventide, the information security exercise required more than 100 participants to assess what was happening and how to respond as their systems became progressively compromised, sensitive data appeared on hostile Web sites, and unseen adversaries revealed their plans. LANL's Dale Leschnitzer coordinated Eventide, which brought together cyber and information technology leaders from 20 U.S. government sites, including the Federal Bureau of Investigation, the Department of Energy and its Cyber Forensics Laboratory, and the National Nuclear Security Administration, to develop recommendations on the resources they need from the new federal Joint Cyber Coordination Center (JC3). JC3 is charged with improving the national response to threats, leveraging resources, and sharing information to meet the U.S.'s information security commitments. Eventide served as an incubator to help JC3 develop a practical path forward. "We've had a trial by fire and it's toughened our teams," says LANL's Tom Harper. "Now we can strengthen and optimize our joint defenses to ensure we're a national resource ready to develop responses and templates to assist government and industry."


Sounds of Silence Proving a Hit
ANU News (04/11/12)

Researchers at the Australian National University (ANU) say they have tuned very sensitive light detectors to listen to the vacuum, producing the world's fastest random number generator. The vacuum is a region of space in which virtual subatomic particles spontaneously appear and disappear, giving rise to random noise. "While it has always been thought to be an annoyance that engineers and scientists would like to circumvent, we instead exploited this vacuum noise and used it to generate random numbers," says ANU professor Ping Koy Lam. Most random number generators are based on algorithms, but if the input conditions to the algorithm are known, the numbers are not truly random. "Vacuum noise is one of the ultimate sources of randomness because it is intrinsically broadband and its unpredictability is guaranteed by quantum theory," notes ANU's Thomas Symul. The team has linked the table-top laser experiment directly to the Internet, and anyone who downloads live random numbers from the ANU Web site gets a unique sequence different from that of all other users. The team is working to miniaturize the technology down to the size of a thumb drive.
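The researchers' contrast between algorithmic and physical randomness is easy to demonstrate: a seeded generator replays its entire output to anyone who knows the seed, whereas vacuum noise has no seed to learn. A small Python illustration, which has nothing to do with the ANU apparatus itself:

    # Why algorithmic generators are not truly random:
    # the same seed ("input conditions") reproduces the whole sequence.
    import random

    a = random.Random(42)
    b = random.Random(42)
    print([a.randint(0, 9) for _ in range(5)])  # a list fixed entirely by the seed
    print([b.randint(0, 9) for _ in range(5)])  # the identical list

    # The closest everyday analogue to a physical source is OS entropy:
    import os
    print(os.urandom(4).hex())  # different on every run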


The Processors of Petascale
HPC Wire (04/10/12) Michael Feldman

Historical trends suggest that the processor architecture of future exascale systems will be variegated, based on the diversification of the current crop of petascale systems. Of the 20 currently operational petascale supercomputers, eight employ x86 central processing units (CPUs) exclusively, seven use a combination of x86 CPUs and graphics processing units (GPUs), two use Fujitsu's SPARC64 CPUs, one uses the Chinese ShenWei SW1600 processor, one employs IBM's PowerPC 450 CPU, and the last pairs the PowerXCell 8i with AMD x86 CPUs. The latest petascale system is the Oakleaf-FX, a Fujitsu machine based on its PRIMEHPC FX10 servers and the first supercomputer to use Fujitsu's newest SPARC64 IXfx CPUs. The variety of these petascale machines implies that, if historical trends recur, the first exascale systems will be driven by exotic processors, with more commodity-based processors eventually taking over. The biggest current limitation for large supercomputers is power consumption, so other processor architectures such as ARM could gain prominence. The Chinese are developing their own series of high-performance computing processors, which could change the game in the supercomputing industry before the end of the decade.


Abstract News © Copyright 2012 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe