Association for Computing Machinery
Welcome to the September 18, 2013 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets, as well as for iPhones and iPads.

HEADLINES AT A GLANCE


MIT Announces Two MOOC Sequences as edX Strategy Begins to Take Shape
Inside Higher Ed (09/18/13) Carl Straumsheim

The Massachusetts Institute of Technology (MIT) is planning to package some of its online courses into more cohesive sequences, known as the "XSeries," while edX is preparing to launch a certificate-of-completion program using identity verification. The two initiatives may provide a look at what the future holds for the massive open online course (MOOC) provider. The XSeries adds a new layer of structure to MITx, the university's section of the edX platform. Each XSeries sequence will feature topics found in two to four face-to-face courses, and the first of seven courses in the Foundations of Computer Science XSeries will be offered this fall. edX's verified certificates are intended for students who enroll in online courses to further their careers. "Students have been asking for certificates that have more verification, more meaning behind them that they can add to their resumes," says edX's Dan O'Connell. A spokesperson for MIT says the new XSeries is part of an experiment to find new ways to offer programs. "Personally, I think it's pretty obvious we're headed into a new era of education," says MIT senior lecturer Christopher J. Terman. "I would be surprised if in 10 years the lay of the land wasn't really a lot different."


Computer Science Major Ranks No. 8 for Salary Potential
Network World (09/17/13) Ann Bednarz

Science, technology, engineering, and math (STEM) degrees are among the college degrees with the highest earning potential, according to PayScale's College Salary Report, which provides the median starting and mid-career pay for alumni of more than 1,000 schools in 129 majors. Engineers claimed seven of the top 10 highest-paying degrees, with petroleum engineering at the top of the list with a median starting salary of $103,000 and a mid-career median salary of $160,000. Eight computer-related majors are among the top 30, including, in order of rank, computer engineering, computer science, software engineering, management information systems, electrical engineering technology, computer information systems, information systems, and information technology. In addition, PayScale ranked colleges by alumni pay in each major, and California schools took all five top spots in computer science.


Experts See Potential Perils in Brazil Push to Break With U.S.-Centric Internet Over NSA Spying
Associated Press (09/17/13) Bradley Brooks; Frank Bajak

Brazil is planning to remove itself from the U.S.-centric Internet in the aftermath of the U.S. National Security Agency's (NSA) online spying program, a move that could be a dangerous first step toward fracturing a global network built with minimal interference by governments. The Brazilian government's reaction to NSA's spying program could set the Internet on a course of Balkanization, according to Internet security and policy experts. "The global backlash is only beginning and will get far more severe in coming months," says Open Technology Institute director Sascha Meinrath. "This notion of national privacy sovereignty is going to be an increasingly salient issue around the globe." Meinrath says the danger of mandating the kind of geographic isolation Brazil is considering is that it could render popular software applications and services inoperable and endanger the Internet's open, interconnected structure. Several countries advocating greater "cyber-sovereignty" recently pushed for control at an International Telecommunication Union meeting, with Western democracies led by the United States and the European Union in opposition.


UK Enters Global Online University Race
BBC News (09/17/13) Sean Coughlan

Twenty-one United Kingdom universities have launched FutureLearn, the country's biggest online university project. The massive open online course (MOOC) program is expected to offer 20 short courses, with eight starting this year. During the project's initial experimental phase, students taking the MOOCs will answer multiple-choice questions but will not receive any formal qualifications. FutureLearn will offer courses designed to work across all types of online platforms, so that a student could begin a course on a laptop at home and then continue on a mobile phone while commuting, according to FutureLearn CEO Simon Nelson. In addition, although online learning can be a solitary experience, FutureLearn will try to create a supportive online community with a very strong social architecture. "Learning never stops and as the economy's demand for higher skills rises, universities should be in the vanguard when it comes to providing new opportunities," says Reading University vice-chancellor David Bell. "Making courses accessible online, on mobiles and tablets, means that people can fit their studying around their lives, rather than their lives around study." FutureLearn's participating universities include Birmingham, Bristol, Leeds, Nottingham, Reading, Sheffield, Southampton, and Warwick, as well as Trinity College Dublin and Monash University in Australia.


DOE: Federal Spending Necessary for Exascale Supercomputer Development
FierceGovernmentIT (09/15/13) David Perera

Federal agencies must spend money on the development of an exascale supercomputer if a viable machine is to be built by 2022, according to a report from the U.S. Energy Department to Congress. The report notes that an exaflop computer would consume more than 1 gigawatt of power, "roughly half the output of Hoover Dam," if the approach to exascale supercomputing follows the one taken for petascale machines. The Energy Department achieved petascale-level performance by networking clusters of commodity processors and memory. Even at the normal pace of technological improvement, and without the benefit of commercially risky investments, an exascale machine would require more than 200 megawatts of power at an estimated cost of $200 million to $300 million annually. Developing an exascale system is expected to cost $1 billion to $1.4 billion. Supercomputers currently need 10 times more energy to bring two numbers from memory into the processor than to execute the subsequent operation itself, and the Energy Department estimates that by 2020 that ratio could rise to 50. The department also would need to address memory and data movement issues, as well as determine how to cope with runtime errors in an exascale machine.
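
For a sense of where the gigawatt and 200-megawatt figures come from, the power budget is essentially sustained operations per second multiplied by energy per operation, with data movement dominating that energy. The minimal back-of-the-envelope sketch below illustrates the arithmetic in Python; the picojoule-per-operation values are illustrative assumptions, not numbers from the DOE report.

    EXAFLOP = 1e18  # sustained operations per second for an exascale machine

    def power_megawatts(pj_per_op, mem_to_compute_ratio):
        # Total power if every arithmetic operation also pays a data-movement
        # cost that is mem_to_compute_ratio times the operation's own energy.
        joules_per_op = pj_per_op * 1e-12 * (1 + mem_to_compute_ratio)
        return EXAFLOP * joules_per_op / 1e6  # watts -> megawatts

    # Hypothetical 10 pJ arithmetic operation:
    print(power_megawatts(10, 10))  # ~10x data-movement penalty today -> 110 MW
    print(power_megawatts(10, 50))  # projected ~50x penalty by 2020 -> 510 MW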


Google's Coder Tool Turns Raspberry Pi Into a Mini Web Server
ZDNet (09/13/13) Liam Tung

Coder, a new open source development tool from Google, is designed to make it easier to use Raspberry Pi computers to build Web applications. Coder converts inexpensive Raspberry Pi computers into personal Web servers through a stripped-back Web-based development environment. Coder is targeted at the education market, and Google says the tool offers a simple platform that teachers and others can use to demonstrate how to build for the Web through browser-based projects written in HTML, CSS, and JavaScript. Coder can be downloaded from the Web to a Mac or PC, and users also will need a 4 GB SD card to transfer the Coder SD image to the Raspberry Pi. The tool can run on a standard wired Ethernet connection, but running it on a Wi-Fi connection will require a mini Wi-Fi module for the Pi that currently costs about $12. All projects will be stored on the Pi device.


Academics Launch Fake Social Network to Get an Inside Look at Chinese Censorship
Technology Review (09/12/13) Tom Simonite

Harvard University professor Gary King this year launched a social media site to gain firsthand experience of online censorship in China. King discovered that online censorship is a thriving market in China, with companies competing to provide the best censoring technology and services. King contracted with a major Chinese provider of Web software to help run his site, and was able to ask customer service representatives about how and when to use the censoring tools. The toolkit the researchers received enabled new posts to be automatically held back for human review based on specific keywords, and to be scrutinized based on their length and on where on the site they appeared. Furthermore, individuals could be singled out for greater censorship based on reputation, IP address, and how recently they had posted. Censorship firm representatives told King that a website should employ two or three censors for every 50,000 users to satisfy the government, leading King to estimate that there are about 50,000 to 75,000 censors working at China's Internet companies. The Harvard team recruited people in China to post 1,200 updates to 100 different social sites, and found that just over 40 percent of the posts were immediately held back by automated censorship tools.
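
The automated pre-screening described here amounts to simple rule matching: hold a post for a human censor if it contains a flagged keyword or trips another heuristic such as length. The sketch below illustrates that kind of filter; the keywords, threshold, and function names are invented for illustration and do not reflect the actual toolkit King used.

    # Toy keyword-based pre-screening: posts matching flagged terms, or tripping
    # a length heuristic, are held for human review.  All rules are invented.
    FLAGGED_KEYWORDS = {"protest", "strike"}  # hypothetical watch list
    MAX_AUTO_PUBLISH_LENGTH = 500             # hypothetical length threshold

    def review_decision(post_text):
        # Return "hold" if the post should wait for a human censor.
        words = post_text.lower().split()
        if any(word in FLAGGED_KEYWORDS for word in words):
            return "hold"
        if len(post_text) > MAX_AUTO_PUBLISH_LENGTH:
            return "hold"
        return "publish"

    print(review_decision("meeting friends for dinner tonight"))  # publish
    print(review_decision("join the protest downtown tomorrow"))  # hold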


Carnegie Mellon Researchers Say Twitter Analysis Can Help Gamblers Beat the Spread on NFL Games
Carnegie Mellon News (PA) (09/12/13) Byron Spice

Carnegie Mellon University (CMU) researchers are studying how to use analyses of Twitter feeds to predict the outcomes of National Football League (NFL) games. The study encompassed three NFL seasons from 2010 to 2012, and involved the use of automated tools to sort through a stream of tweets that averaged 42 million messages a day in 2012. Out of those, the researchers pulled out messages with hashtags associated with individual NFL teams that were sent at least 12 hours after the start of the team's previous game and at least one hour before the start of its upcoming game. The goal was to see what might be learned from the collective wisdom or sentiments of fans, as reflected by their tweets, according to CMU professor Christopher Dyer. The researchers found that their method was 55 percent accurate in predicting the winner against the point spread. Such an advantage may be sufficient to reap a profit, Dyer notes. However, he points out that the analysis did not work well for the first four weeks of the season, nor was it useful in the last few weeks, when many teams begin changing their strategies in preparation for the post-season. In the future, Dyer says improvements might be possible with a more sophisticated analysis of tweet content.
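
The tweet-selection step described above is essentially a time-windowed hashtag filter: keep only messages carrying a team's hashtag that were sent at least 12 hours after the team's previous game and at least one hour before its next one. The sketch below illustrates that filtering; the field names and sample data are assumptions for illustration and are not drawn from the CMU pipeline.

    from datetime import datetime, timedelta

    # Toy time-windowed hashtag filter: keep tweets tagged for a team and sent
    # at least 12 hours after its previous game and at least 1 hour before its
    # next game.  Field names and sample data are illustrative assumptions.
    def in_window(tweet_time, prev_kickoff, next_kickoff):
        return (tweet_time >= prev_kickoff + timedelta(hours=12) and
                tweet_time <= next_kickoff - timedelta(hours=1))

    def team_tweets(tweets, hashtag, prev_kickoff, next_kickoff):
        return [t for t in tweets
                if hashtag in t["text"].lower()
                and in_window(t["time"], prev_kickoff, next_kickoff)]

    tweets = [
        {"text": "#steelers looked sharp today", "time": datetime(2012, 10, 8, 9, 0)},
        {"text": "#steelers ready for sunday!", "time": datetime(2012, 10, 13, 18, 0)},
    ]
    prev_game = datetime(2012, 10, 7, 13, 0)
    next_game = datetime(2012, 10, 14, 13, 0)
    print(len(team_tweets(tweets, "#steelers", prev_game, next_game)))  # 2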


Music Site Chatbot Wins AI Loebner Contest
BBC News (09/16/13) Mark Ward

The Loebner contest has awarded its top prize to a chatbot called Mitsuku, which was judged the most convincingly human artificial intelligence (AI) program among the contestants. The contest's four finalists participated in a series of rounds in which they chatted via text with judges. Mitsuku creator Steve Worswick began programming chatbots in an effort to engage visitors to his dance music website. In 2004, Worswick was commissioned by a games company to write Mitsuku, and he improved on the program's abilities through its many conversations with website visitors. The process helped Worswick program Mitsuku to answer questions that require not only canned responses, but also common sense and an understanding of the world--areas in which AI significantly lags humans. "The difficulty is trying to teach these things about the world because they have no sensory input," Worswick says. The annual contest, launched by U.S. businessman Hugh Loebner, offers a real-world test of a question posed by Alan Turing in the 1950s, in which he suggested that a computer could be considered to be thinking if its responses to questions were indistinguishable from those of a human.


Robots Take Over
University of Miami (09/11/13) Annette Gallagher

University of Miami researchers recently conducted a study documenting the appearance of an "ultrafast machine ecology" of interacting robots in the global financial market. The findings suggest that for time scales of less than one second, the financial world makes a sudden transition into a cyberenvironment inhabited by packs of aggressive trading algorithms. "Our findings show that, in this new world of ultrafast robot algorithms, the behavior of the market undergoes a fundamental and abrupt transition to another world where conventional market theories no longer apply," says Miami professor Neil Johnson. The industry's push for faster systems that can outpace competitors has led to the development of algorithms that can operate faster than the response time for humans. The researchers assembled and analyzed a high-throughput millisecond-resolution price stream of multiple stocks and exchanges. From January 2006 through February 2011, the researchers found 18,520 extreme events lasting less than 1.5 seconds, including both spikes and crashes. They developed a model to understand the behavior and concluded that the events were the result of ultrafast computer trading and not attributable to other factors, such as regulations or mistaken trades. "What we see with the new ultrafast computer algorithms is predatory trading," Johnson says. "In this case, the predator acts before the prey even knows it's there."
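
Counting "extreme events" of this kind comes down to scanning a timestamped price stream for large moves that complete in under 1.5 seconds. The toy detector below illustrates the idea; the 0.5 percent move threshold and the sample ticks are illustrative assumptions, not the study's actual definition.

    # Toy detector for sub-1.5-second price excursions ("spikes" and "crashes")
    # in a stream of (timestamp_seconds, price) ticks.  The 0.5 percent move
    # threshold and the sample data are illustrative assumptions.
    def extreme_events(ticks, max_duration=1.5, min_move=0.005):
        events = []
        for i, (t0, p0) in enumerate(ticks):
            for t1, p1 in ticks[i + 1:]:
                if t1 - t0 >= max_duration:
                    break  # window closed without a large enough move
                if abs(p1 - p0) / p0 >= min_move:
                    events.append(("spike" if p1 > p0 else "crash", t0, t1))
                    break  # record the first qualifying move from this tick
        return events

    ticks = [(0.0, 100.00), (0.4, 100.20), (0.9, 100.70), (2.5, 100.65)]
    print(extreme_events(ticks))  # [('spike', 0.0, 0.9)]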


Detecting Program-Tampering in the Cloud
MIT News (09/11/13) Larry Hardesty

Massachusetts Institute of Technology (MIT) researchers have developed a system that can quickly verify that a program running on the cloud is executing properly and that no malicious code is interfering with the program. The system also protects the data used by applications running in the cloud, cryptographically ensuring that the user will not learn anything other than the immediate results of the requested computation. The system currently only works with programs written in the C programming language, but the researchers say adapting it to other languages should be straightforward. MIT's Alessandro Chiesa notes that since the system protects both the integrity of programs running in the cloud and the data they use, it is a good complement to the cryptographic technique known as homomorphic encryption, which protects the data transmitted by the users of cloud applications. The system implements a variation of a zero-knowledge proof, a type of mathematical game that enables one of the game's players to prove to the other that he or she knows a secret key without actually divulging it.
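
The "mathematical game" can be illustrated with a classic interactive proof of knowledge, Schnorr's identification protocol, in which a prover convinces a verifier that she knows the discrete logarithm of a public value without revealing it. The toy below uses tiny, insecure parameters purely for illustration and is not the MIT system, which layers far more machinery on top of such ideas.

    import random

    # Textbook Schnorr identification: the prover shows she knows x such that
    # y = g^x mod p without revealing x.  The parameters are tiny and insecure,
    # purely for illustration; this is not the MIT system.
    p = 2027   # small safe prime: p = 2q + 1
    q = 1013   # prime order of the subgroup generated by g
    g = 4      # generator of the order-q subgroup

    x = random.randrange(1, q)   # prover's secret
    y = pow(g, x, p)             # public value

    # One round of the interactive game.
    r = random.randrange(1, q)   # prover's random nonce
    t = pow(g, r, p)             # commitment sent to the verifier
    c = random.randrange(1, q)   # verifier's random challenge
    s = (r + c * x) % q          # prover's response

    # Verifier checks g^s == t * y^c (mod p) and learns nothing about x itself.
    print(pow(g, s, p) == (t * pow(y, c, p)) % p)  # True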


LSU CCT's Researchers Develop Melete, Among First Interactive Supercomputers
Louisiana State University (09/09/13)

Researchers at Louisiana State University's (LSU) Center for Computation & Technology (CCT) have developed Melete, a supercomputing system that integrates an interaction-oriented compute cluster with tangible interfaces to support collaborative research and the classroom. "Melete features several interactive face nodes in addition to the head node," says LSU's Chris Branton, who has been leading the development of software infrastructure for the project. "These are a combination of dynamic screens, passive printed visuals, addressable [light-emitting diodes], and other interactive elements," Branton says. "They are planned to be placed in labs, meeting spaces, and classrooms both at CCT and elsewhere on the LSU campus to give interactive control of the machine to authorized users." The researchers say Melete should offer benefits to researchers in computational biology, materials, mathematics, engineering, and the arts. "This area of research is just a few years old, so our software is under rapid development, and it is a tremendous advantage to use Melete with our new Mathematica codes," notes LSU professor Les Butler. The researchers also are developing simpler access to the Melete system for scientists who are unfamiliar with command-line interfaces.


The Masters of Uncertainty
HPC Wire (09/13/13) Nicole Hemsoth

The California Institute of Technology's Houman Owhadi and Clint Scovel recently spoke with HPC Wire about Bayesian methods and the role of uncertainty in supercomputing. Bayesian inference allows researchers to estimate the probability of outcomes of interest by combining a model of uncertainty with prior data. Uncertainty quantification is especially useful with high-performance computing, as more advanced computers enable researchers to compute the Bayesian posterior, the conditional probability assigned to an uncertain quantity after known evidence is taken into account, which previously could not be calculated. Fields such as risk analysis and climate modeling can particularly benefit from uncertainty quantification. For example, Boeing must certify that new airplane models have a probability of a catastrophic event of less than 10 to the power of minus nine per hour of flight. To perform safety assessments, Boeing cannot fly a billion airplanes to determine how many crash, Owhadi says, so the company takes limited data and processes it in an optimal way to predict risk. Owhadi notes that he and other researchers are developing an algorithmic framework that enables optimal processing of information. "We're saying we want to formulate this problem that we're trying to solve and we're going to use our computing capability--in particular, high-performance computing--to solve these problems," Scovel says.
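
A Bayesian posterior is simply the probability distribution that remains after observed evidence is folded into a prior. The minimal worked example below uses a conjugate Beta-binomial update on an unknown failure rate; the numbers are invented for illustration and have no connection to Boeing's certification data or to Owhadi and Scovel's framework.

    # Minimal Bayesian-posterior example: a Beta(a, b) prior over an unknown
    # failure rate, updated with observed trials (conjugate Beta-binomial).
    # All numbers are invented for illustration.
    def posterior_params(prior_a, prior_b, failures, trials):
        return prior_a + failures, prior_b + (trials - failures)

    def beta_mean(a, b):
        return a / (a + b)

    a0, b0 = 1, 99  # hypothetical prior belief: roughly a 1% failure rate
    a1, b1 = posterior_params(a0, b0, failures=0, trials=500)
    print(beta_mean(a0, b0))  # prior mean: 0.01
    print(beta_mean(a1, b1))  # posterior mean after 500 failure-free trials: ~0.0017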


Abstract News © Copyright 2013 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.