Association for Computing Machinery
Welcome to the July 22, 2013 edition of ACM TechNews, providing timely information for IT professionals three times a week.

ACM TechNews mobile apps are available for Android phones and tablets, and for iPhones and iPads.

HEADLINES AT A GLANCE


A Faster Internet--Designed by Computers?
MIT News (07/19/13) Larry Hardesty

Massachusetts Institute of Technology (MIT) researchers have developed a computer system called Remy that automatically generates transmission control protocol (TCP) congestion-control algorithms achieving transmission rates two to three times higher than those of algorithms designed by humans. TCP regulates the rate of computer transmissions to minimize network congestion, and human engineers have worked to improve TCP congestion-control algorithms for the past 25 years. Remy is a machine-learning system that tests candidate algorithms and refines variations on the ones that perform best. Users specify certain network characteristics, such as whether the number of users changes and by how much, along with a traffic profile and the metrics by which network performance should be evaluated. The researchers also created an algorithm that concentrates Remy's analysis on scenarios in which small variations in network conditions lead to large variations in performance, and spends less time on predictable network behavior. "Traditionally, TCP has relatively simple endpoint rules but complex behavior when you actually use it," says MIT graduate student Keith Winstein. "With Remy, the opposite is true. We think that's better, because computers are good at dealing with complexity. It's the behavior you want to be simple."
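
As a rough illustration of the approach (not Remy's actual algorithm, simulator, or objective function), the Python sketch below scores a small space of candidate congestion-control rules against randomized network scenarios and keeps the best performer; the scenario parameters, the AIMD-style rule space, and the throughput-versus-delay score are all invented for illustration.

```python
import math
import random

# Hypothetical scenario generator: Remy's inputs include a traffic profile and
# assumptions about how many senders share a link; these ranges are invented.
def random_scenario(rng):
    return {
        "senders": rng.randint(1, 8),          # users sharing the bottleneck
        "capacity": rng.uniform(5.0, 50.0),    # packets per tick the link carries
        "base_delay": rng.uniform(1.0, 10.0),  # propagation delay in ticks
    }

# Toy evaluation of an AIMD-style rule (additive increase, multiplicative decrease).
def simulate(rule, scenario, ticks=200):
    inc, dec = rule
    window, delivered, queue = 1.0, 0.0, 0.0
    share = scenario["capacity"] / scenario["senders"]
    for _ in range(ticks):
        if window > share:                      # congestion: queue builds, back off
            queue += window - share
            delivered += share
            window = max(1.0, window * dec)
        else:                                   # link underused: probe upward
            delivered += window
            queue = max(0.0, queue - (share - window))
            window += inc
    throughput = delivered / ticks
    delay = scenario["base_delay"] + queue / scenario["capacity"]
    # Throughput-versus-delay objective, loosely in the spirit described above.
    return math.log(throughput + 1e-9) - 0.5 * math.log(delay + 1e-9)

def search_best_rule(n_candidates=200, n_scenarios=30, seed=0):
    rng = random.Random(seed)
    scenarios = [random_scenario(rng) for _ in range(n_scenarios)]
    best_rule, best_score = None, float("-inf")
    for _ in range(n_candidates):
        rule = (rng.uniform(0.1, 5.0), rng.uniform(0.3, 0.9))
        score = sum(simulate(rule, s) for s in scenarios) / n_scenarios
        if score > best_score:
            best_rule, best_score = rule, score
    return best_rule, best_score

if __name__ == "__main__":
    rule, score = search_best_rule()
    print("best (increase, decrease):", rule, "average score:", round(score, 3))
```

The real system explores a vastly larger rule space with a detailed network model, but the overall structure is the same: generate candidate rules, evaluate them against a traffic profile, and keep the winners.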


Stanford Expert Says Internet's Backbone Can Readily Be Made More Sustainable
Stanford Report (CA) (07/19/13) Mark Golden

The U.S. Department of Energy recently announced that it wants to establish minimum energy efficiency standards for all computers and servers sold in the United States, after a new Stanford University study found that large server farms can cut electricity use and greenhouse gas emissions by 88 percent with off-the-shelf equipment and proven energy management practices. The carbon emissions generated by large data centers are related to the computing efficiency of IT equipment, the amount of electricity the data center's building uses for things other than computing, and how much of the center's electricity comes from renewable or low-carbon sources. "Of these three, improving the efficiency of the IT devices is overwhelmingly the most important," says Stanford professor Jonathan Koomey. He says data centers can avoid wasting electricity by using faster computers that pay for themselves fairly quickly, and by using flash memory on the motherboard instead of hard disks. "Once you fix the institutional problems, then the company can move quickly, because the needed equipment is off-the-shelf and the energy management practices are well understood," Koomey notes. Of the potential 88-percent reduction in greenhouse gas emissions, IT device efficiency accounts for about 80 percent, and facility energy management for about 8 percent.


Congressional Panels Dump on STEM Reshuffling Plan
Science (07/19/13) Jeffrey Mervis

Several congressional panels have rejected the Obama administration's plan to significantly realign the federal government's $3-billion annual budget for science, technology, engineering, and mathematics (STEM) education programs spread across 13 agencies. The Senate Appropriations Committee joined other congressional panels in questioning the White House's proposed STEM realignment as it approved a 2014 spending bill covering the National Aeronautics and Space Administration (NASA), the Commerce Department, and the National Science Foundation (NSF). The committee said the realignment proposal had not received adequate input from the education community, congressional authorizing committees, or the federal agencies involved, and that the administration's suggestion to designate lead STEM agencies has not been proven viable. In addition, the House of Representatives science committee approved a bill to reauthorize NASA programs that rejects the administration's proposal to remove most of the agency's STEM education programs and consolidate the remainder elsewhere. The White House's plan called for eliminating redundant and ineffective programs and consolidating STEM authority in the Department of Education, NSF, and the Smithsonian Institution, which together would lead federal efforts in K-12 education, undergraduate and graduate training, and informal science education.


DOE Appeal: Breaking Exaflop Barrier Will Require More Funding
Federal Computer Week (07/17/13) Frank Konkel

The Department of Energy (DOE) will likely need a significant increase in funding if the United States wants to be the first to break the exaflop barrier (an exaflop is a quintillion, or 10^18, calculations per second). China is setting aside funds for an exascale supercomputer, and Japan recently invested $1 billion in its effort. Both countries hope to build one by 2020, and the European Union, Russia, and a handful of large companies also are working to be the first to break the barrier. DOE has stated that 2020 is its goal for building an exascale supercomputing system, but developing the technology could require at least an additional $400 million a year in funding, according to Argonne National Laboratory's Rick Stevens. "At that funding level, we think it's feasible, not guaranteed, but feasible, to deploy a system by 2020," he said, testifying before the Energy subcommittee of the House Science, Space, and Technology Committee in late May. However, he noted that current funding levels would not allow the United States to reach the exascale mark until about 2025. DOE's Office of Science and the National Nuclear Security Administration have made fiscal 2014 budget requests of $450 million and $400 million, respectively, for advanced computing programs.


Conductivity Gain for Stretchable Electronics
The Engineer (United Kingdom) (07/19/13)

University of Michigan researchers have found that networks of spherical nanoparticles embedded in elastic materials could make the best bendable conductors, potentially enabling flexible displays and implantable devices. "We found that nanoparticles aligned into chain form when stretching," says Michigan's Yoonseob Kim. "That can make excellent conducting pathways." The researchers took electron microscope images of the materials at various tensions and found that, under strain, the nanoparticles could filter through the gaps in the polyurethane, connecting in chains as they would in a solution. "As we stretch, they rearrange themselves to maintain the conductivity, and this is the reason why we got the amazing combination of stretchability and electrical conductivity," says Michigan professor Nicolas Kotov. The researchers made two versions of the material, either by building it up in alternating layers or by filtering a liquid containing polyurethane and nanoparticle clumps to leave behind a mixed layer. The researchers see their stretchable conductors being used as electrodes in applications such as brain implants. "The stretchability is essential during implantation process and long-term operation of the implant when strain on the material can be particularly large," Kotov says.


How Computer Analysis Uncovered J. K. Rowling's Secret Novel
Popular Science (07/17/13) Francie Diep

Duquesne University professor Patrick Juola used computer analysis to determine that Harry Potter creator J.K. Rowling wrote "The Cuckoo's Calling." The novel, purportedly written by first-time author Robert Galbraith, had been the subject of speculation over its authorship. After the computer analysis was presented to Rowling, she admitted to writing the novel. To conduct his analysis, Juola compared digital copies of "The Cuckoo's Calling" with other Rowling novels, as well as with works by three other authors. The analysis looked for four writing habits that are considered strong indicators of authorship to determine which author matched most closely. The program analyzed each book's distribution of word lengths, its 100 most common words, pairs of words frequently used together, and groups of four characters, called four-grams, that include letters, spaces, and punctuation. Rowling came up most frequently in the tests as the likely author, although the analysis is not considered conclusive. This type of analysis could provide significant evidence of authorship of historical texts, contested court documents, and other written works.
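
As an illustrative sketch only (not Juola's actual software or procedure), the Python code below extracts the four feature families described above from plain-text samples and picks the candidate author whose profile is closest by a simple frequency-difference distance; the sample texts and the distance measure are placeholders.

```python
from collections import Counter

def features(text):
    """Extract the four stylometric feature families described above."""
    words = text.lower().split()
    word_lengths = Counter(len(w) for w in words)             # word-length distribution
    common_words = Counter(words)                             # most-frequent-word profile
    word_pairs = Counter(zip(words, words[1:]))               # adjacent word pairs
    char_4grams = Counter(text[i:i + 4] for i in range(len(text) - 3))  # character 4-grams
    return [word_lengths, common_words, word_pairs, char_4grams]

def distance(f1, f2, top=100):
    """Sum the frequency differences over the top items of each feature family."""
    total = 0.0
    for c1, c2 in zip(f1, f2):
        keys = {k for k, _ in c1.most_common(top)} | {k for k, _ in c2.most_common(top)}
        n1, n2 = sum(c1.values()) or 1, sum(c2.values()) or 1
        total += sum(abs(c1[k] / n1 - c2[k] / n2) for k in keys)
    return total

def likely_author(unknown_text, candidates):
    """candidates: dict mapping author name -> sample text by that author."""
    unknown = features(unknown_text)
    return min(candidates, key=lambda name: distance(unknown, features(candidates[name])))

if __name__ == "__main__":
    # Placeholder snippets; a real comparison needs book-length samples.
    samples = {"Author A": "the owl flew over the castle at night and the wind rose",
               "Author B": "quarterly earnings rose sharply amid strong consumer demand"}
    print(likely_author("the owl circled the castle tower after dark", samples))
```

On snippets this short the comparison is meaningless; the method only becomes informative with book-length samples, and even then, as the article notes, it yields supporting evidence rather than proof.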


Combining Computer Science, Statistics Creates Machines That Can Learn
UChicago News (IL) (07/16/13) Rob Mitchum

University of Chicago professor John Lafferty focuses on theories and algorithms that enhance machine learning. "Computer science is becoming more focused on data rather than computation, and modern statistics requires more computational sophistication to work with large data sets," he observes. "Machine learning draws on and pushes forward both of these disciplines." Lafferty's work targets the development of computer programs that can extract knowledge from large volumes of numbers, text, audio, or video with little or no human intervention, and make predictions and decisions about events that have not been coded into their instructions. Lafferty studies modern data sets, such as those in genomics, that are frequently "short and wide," with few subjects but tens of thousands of variables; the aim is to separate the relevant factors from the irrelevant ones so that statistical analysis remains feasible. He also studies semi-supervised learning, a machine-learning technique in which a computer learns from a small set of examples categorized by a human and then exploits much larger amounts of unlabeled data on its own. Lafferty says research universities offer the opportunity to study machine learning broadly. "The potential impact is very large, and the ideas that we're developing will be applied in ways that we can't even anticipate," he says.
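
As one concrete flavor of the ideas mentioned above, the hedged Python sketch below implements simple self-training, a basic form of semi-supervised learning, on a synthetic "short and wide" data set; it is not Lafferty's own method, and the data, confidence threshold, and choice of scikit-learn's LogisticRegression are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def self_train(X_labeled, y_labeled, X_unlabeled, threshold=0.95, rounds=5):
    """Self-training: fit on labeled data, then absorb high-confidence predictions
    on unlabeled data as pseudo-labels and refit. A simple flavor of
    semi-supervised learning, not the specific methods studied by Lafferty."""
    X, y, pool = X_labeled.copy(), y_labeled.copy(), X_unlabeled.copy()
    model = LogisticRegression(max_iter=1000)
    for _ in range(rounds):
        model.fit(X, y)
        if len(pool) == 0:
            break
        proba = model.predict_proba(pool)
        confident = proba.max(axis=1) >= threshold
        if not confident.any():
            break
        X = np.vstack([X, pool[confident]])
        y = np.concatenate([y, model.classes_[proba[confident].argmax(axis=1)]])
        pool = pool[~confident]
    return model

if __name__ == "__main__":
    # Synthetic "short and wide" data: 20 samples, 200 features, one informative feature.
    rng = np.random.default_rng(0)
    X_all = rng.normal(size=(20, 200))
    X_all[:10, 0] += 2.0                      # class 1 cluster
    X_all[10:, 0] -= 2.0                      # class 0 cluster
    y_all = np.array([1] * 10 + [0] * 10)
    labeled = [0, 1, 2, 10, 11, 12]           # only six human-labeled examples
    unlabeled = [i for i in range(20) if i not in labeled]
    model = self_train(X_all[labeled], y_all[labeled], X_all[unlabeled])
    print("accuracy on all 20 samples:", model.score(X_all, y_all))
```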


'Huge Opportunities' for Harvesting Data Shown
University of Manchester (07/17/13) Mike Addelman

University of Manchester researchers have conducted a study on harvesting information from digital and administrative data. The study found that huge quantities of useful data are being created as a byproduct of people's online activity, including their use of Twitter and Facebook, but much of it is owned by commercial companies or public agencies. "These new forms of data are a huge opportunity for understanding more about the big questions that face us all, such as health inequalities, discrimination, and environmental change," says Manchester professor Mark Elliot. He notes that the line between researcher and researched is becoming blurred, and warns there is a risk that the reliability of social research will diminish if companies and organizations restrict access to data sources. "This is because to carry out proper research, we need to validate, replicate, and peer-review research effectively," Elliot says. "At the moment, the data world can seem a bit like the Wild West." The study's interviews and survey of leading academics also suggest that privacy could become an issue, given concerns about data sharing and links between companies and government agencies.


Cello Could Be Music to the Ears of C Developers
InfoWorld (07/18/13)

A University of Edinburgh student plans to release, in a couple of months, a development tool that facilitates higher-level programming in the C language. Daniel Holden has developed Cello as a GNU99 C library. Cello features capabilities such as interfaces for structured design, exceptions for error handling, and constructors/destructors to aid memory management. A duck-typing capability supports generic functions, and syntactic sugar boosts readability. Holden says the technology will be available under a BSD-3 license and is geared toward C power users. He notes Cello's high-level structure was "inspired" by Haskell, while the syntax and semantics were inspired by Objective-C and Python. "It's a bit of an experiment, really," says Holden, who notes that only a few bugs in Cello still need to be worked out. "C's often used in embedded systems and systems very close to the metal and the processor, but it's actually a very simple and elegant language and surprisingly powerful," he says.


Computing Toxic Chemicals
EurekAlert (07/18/13)

University of Kansas researchers have tested a statistical algorithm designed to predict whether a chemical might cause toxicity problems. The researchers say they built on the principle known as quantitative structure-activity relationships (QSAR), but instead of searching for molecular features that would benefit medicine, their algorithm looks for the atomic groups, and the types of bonds that hold them together, that are associated with toxicity. The researchers say their statistical approach combines "Random Forest" feature selection with "Naive Bayes" statistical analysis to achieve a level of prediction well beyond random guessing. About 10,000 industrial chemicals need toxicity profiling, but the researchers' computational method could enable industry and regulators to focus first on those believed to be the most toxic. The technique also could lessen the need for animal testing of compounds. The team is now working to improve the speed and precision of the algorithm.
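
As a purely illustrative sketch (not the Kansas team's actual pipeline, descriptors, or data), the Python code below shows how Random Forest feature selection can be combined with Naive Bayes classification using scikit-learn, on synthetic binary "substructure present/absent" features standing in for real chemical descriptors.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import BernoulliNB

# Synthetic stand-in for molecular fingerprints: binary features marking the
# presence or absence of atomic groups and bond types; labels mark "toxic" or not.
rng = np.random.default_rng(42)
X = rng.integers(0, 2, size=(500, 300))
# Invented rule: toxicity driven by a handful of substructures, plus a little noise.
signal = (X[:, 0] & X[:, 1]) | X[:, 2]
y = (signal.astype(bool) | (rng.random(500) < 0.05)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Step 1: a Random Forest ranks features; keep the most informative substructures.
forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
top = np.argsort(forest.feature_importances_)[::-1][:20]

# Step 2: a Naive Bayes model classifies using only the selected features.
nb = BernoulliNB().fit(X_train[:, top], y_train)
print("held-out accuracy:", round(nb.score(X_test[:, top], y_test), 3))
```

The division of labor mirrors the description above: the Random Forest identifies which substructures matter most, and the Naive Bayes model then delivers fast probabilistic predictions from that reduced feature set.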


Computer Smart as a 4-Year-Old
UIC News Center (07/15/13) Jeanne Galatzer-Levy

University of Illinois at Chicago (UIC) researchers have IQ-tested ConceptNet4, considered one of the best available artificial intelligence systems, and found that it is about as smart as the average four-year-old. The researchers tested ConceptNet4 on verbal portions of the Wechsler Preschool and Primary Scale of Intelligence Test, a standard IQ assessment for young children. They found that ConceptNet4 has the average IQ of a young child, although its scores were very uneven across different portions of the test, unlike those of most children. "If a child had scores that varied this much, it might be a symptom that something was wrong," says UIC professor Robert Sloan. The researchers found that ConceptNet4 did very well on the vocabulary and similarities portions of the test, but very poorly on the comprehension portion. Sloan says one of the hardest problems in building an artificial intelligence is devising a computer program that can make sound and prudent judgments based on a simple perception of the situation or facts, which he notes is the dictionary definition of common sense.


Sebastian Thrun on the Future of Learning
Technology Review (07/19/13) Rachel Metz

In an interview, Udacity CEO and Stanford University professor Sebastian Thrun says massive open online courses (MOOCs) are still in the early phases, but are making continuous improvements that will enable them to fit the needs of today's students. Even as Udacity partner San Jose State University recently suspended their joint effort until next spring due to low student passing rates, Thrun says the online education startup is improving the support and value that it provides to students. He notes that recent pilot tests have shown completion rates of 85 percent, as opposed to the 4 percent or 5 percent that is common for MOOCs. Udacity has begun offering college credits and degrees to motivate students to persevere in courses, and has increased student support services through help lines and mentors. In the future, Thrun says artificial intelligence will play a larger role in online education, but human involvement will still be necessary. For example, artificial intelligence can be applied to student profiling to identify at-risk students and pinpoint the areas in which a student requires help. Computer programs also can assist with grading work that has distinct right and wrong answers, but people should provide the qualified feedback that students need with work such as essays, Thrun says.


Abstract News © Copyright 2013 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe