Welcome to the September 13, 2013 edition of ACM TechNews, providing timely information for IT professionals three times a week.
Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets, iPhones, and iPads.
HEADLINES AT A GLANCE
Here's an Intel Chip That Uses Wine--Yes, Wine--for Power
IDG News Service (09/12/13) Agam Shah
Intel researchers have developed a system that uses wine to power a microprocessor. In a recent demonstration of the new technology, Intel researcher Genevieve Bell poured red wine into a glass containing circuitry on two metal boards. Once the red wine hit the metal, the microprocessor on a circuit board powered up and ran a graphics program on a computer with an e-ink display. The demonstration aimed to show Intel's progress in developing low-power chips. Researchers at Intel's "New Devices" group also are experimenting with a range of other products, including embedded devices with sensors, smartwatches, and eyewear. Future computing devices will be able to understand human behavior through data gathered by embedded sensors and other wearable technology, according to Bell. For example, data gathered from sensors and mobile devices could help a remote food kiosk anticipate what a customer would likely order. "Mobility isn't just about the devices, but the places we visit, and also what we will do while we're there," Bell says.
The Man Who Would Build a Computer the Size of the Entire Internet
Wired News (09/09/13) Cade Metz
Solomon Hykes has started an open source software project called Docker that aims to use the Internet as one enormous computer, providing a standard method of moving software applications from one machine to another. Docker packages software applications into their own shipping containers, so they can load and run on any machine running the Linux operating system. Software in Docker containers can spread across the machines in a user's own data centers, onto popular cloud services, and back again. Although Docker is only a few months old, it has garnered attention from major Silicon Valley firms, because it takes proven technologies and repackages them for easier use. Docker divides software into cells of code, and a Docker container holds everything the application needs to run, including software libraries and other application-related code. Docker applications do not depend heavily on operating system code, with the operating system mainly offering hooks into the Docker containers. The result is a computer or group of computers that function like an organism, in which cells operate independently as well as cooperatively. Hykes says the software has been downloaded 60,000 times in the five months that it has been available, and 80,000 people are visiting the project's website monthly.
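The packaging idea the article describes can be sketched with a minimal Dockerfile (the base image, package, and file names here are illustrative, not taken from the article): the file declares everything the application needs, so the resulting container can load and run on any Linux machine with Docker installed.

```dockerfile
# Build on a base Linux image; the container carries its own
# libraries and application code, independent of the host OS.
FROM ubuntu:12.04

# Install the application's dependencies inside the container.
RUN apt-get update && apt-get install -y python

# Copy the application in and declare how to run it.
ADD app.py /srv/app.py
CMD ["python", "/srv/app.py"]
```

Building this once produces an image that behaves the same in a private data center or on a cloud service, which is the portability Hykes is aiming at.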
Artificial-Intelligence Research Revives Its Old Ambitions
MIT News (09/09/13) Larry Hardesty
Artificial intelligence (AI) research has advanced significantly, but much work remains before computers can truly replicate or surpass human intelligence. Although machine learning has enabled computers to perform tasks such as interpreting spoken language, AI has proven more complicated than anticipated. "These recent achievements have, ironically, underscored the limitations of computer science and artificial intelligence," says the Massachusetts Institute of Technology's (MIT's) Tomaso Poggio. "We do not yet understand how the brain gives rise to intelligence, nor do we know how to build machines that are as broadly intelligent as we are." However, Poggio believes new knowledge of neuroscience, cognitive science, and computer science places today's researchers in a position to try again to attain AI's original goals. Poggio will head the Center for Brains, Minds, and Machines (CBMM) at MIT, one of three new research centers funded by the U.S. National Science Foundation's Science and Technology Centers Integrative Partnerships program. CBMM will be an international, multi-institution effort focusing on the integration of intelligence, including vision, language, and motor skills; intelligence circuits, encompassing neurobiology and electrical engineering; the development of intelligence in children; and social intelligence.
A Better Way for Supercomputers to Search Large Data Sets
Government Computer News (09/09/13) Kevin McCaney
Lawrence Berkeley National Laboratory researchers have developed techniques for analyzing huge data sets by utilizing "distributed merge trees," which take better advantage of a high-performance computer's massively parallel architecture. Distributed merge tree algorithms are capable of scanning a huge data set, tagging the values a researcher is looking for, and creating a topological map of the data. Distributed merge trees separate the data sets into blocks and leverage a supercomputer's massively parallel architecture to distribute the work across its thousands of nodes, according to Berkeley Lab's Gunther Weber. He notes the algorithms also can separate important data from irrelevant data. Weber says the new technique will enable researchers to get more out of future supercomputers. "This is also an important step in making topological analysis available on massively parallel, distributed memory architectures," Weber notes.
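The core idea behind a merge tree can be illustrated with a short sketch (this single-threaded, 1-D version is an assumption for illustration; the Berkeley Lab technique partitions real multi-dimensional data sets into blocks and combines per-block trees across thousands of nodes): sweep the scalar values from high to low and use a union-find structure to record when regions of interest are born and when they merge.

```python
def merge_tree_events(values):
    """Return (event, index, value) tuples for a 1-D scalar field,
    recording component births (local maxima appearing in the sweep)
    and merges (two components joining) as the threshold descends."""
    parent = {}

    def find(i):
        # Union-find root lookup with path halving.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    events = []
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
    for i in order:
        parent[i] = i
        events.append(("birth", i, values[i]))
        for j in (i - 1, i + 1):        # neighbors on the 1-D grid
            if j in parent:             # neighbor already swept in
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[ri] = rj
                    events.append(("merge", i, values[i]))
    return events
```

Running this on [1, 5, 2, 8, 3] first registers births at the two local maxima (values 8 and 5) before any merge, which is exactly the topological summary that lets researchers tag the high values they are looking for and discard irrelevant data.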
Helping Students Crack Computer Science Code
Boston Globe (09/09/13) Callum Borchers
Massachusetts' Beaver Country Day School wants to integrate software programming into all of its subjects, and is experimenting with uses not only in math and science classes, but even in English and art. Just as people learn their first spoken languages through immersion, Beaver students will pick up coding languages similarly, without structured lessons. "A lot of what we've been talking about is using it when it's organic, and just having students and teachers keep an eye out for those moments," says Beaver math department chairman Rob MacDonald. "So if there's somebody in the room who's already got some skills who can jump in and share that with the class, that's great. If the students need some new tools to do the coding work, we can teach them on the fly." State education officials and technology groups are watching Beaver's integrated strategy as a promising way to introduce students to computer programming. Last year, only 713 Massachusetts students took an Advanced Placement exam in computer science, according to Mass Insight Education. The Massachusetts Computing Attainment Network, a coalition that includes Google and Microsoft, recently set out to help the state education department provide more computer science to interested students in all school systems.
Security Flaw Shows Tor Anonymity Network Dominated by Botnet Command and Control Traffic
Technology Review (09/11/13)
Botnet traffic dominates the Tor network, according to researchers at the University of Luxembourg. The online service is championed as a key tool for promoting free speech and protecting personal privacy because it enables users to surf the Web anonymously. However, Alex Biryukov, Ivan Pustogarov, and Ralf-Philipp Weinmann report that traffic on Tor is largely botnet traffic, and that much of the rest is adult content and traffic related to black-market and illegal goods. Earlier this year the researchers also discovered a flaw in the Tor protocol that enabled anybody in the know to track users back to their origin; Tor corrected the flaw immediately. Before the flaw became public, however, the researchers analyzed Tor traffic to see where it came from and what it contained. They collected about 39,000 unique addresses offering Tor content, and concluded that the numbers devoted to legal and not-so-legal content are about equal.
Breakthrough in Cryptography Could Result in More Secure Computing
Bristol University (09/09/13)
Researchers at the University of Bristol and Aarhus University have developed the SPDZ protocol, which they say is the fastest known protocol to implement a theoretical idea called Multi-Party Computation. This theory enables two or more parties to compute any function of their choosing on their secret inputs without revealing those inputs to one another. The researchers say the SPDZ protocol turns Multi-Party Computation from a theoretical tool into a practical reality. The SPDZ protocol enables researchers to compute complex functions in a secure manner, enabling possible applications in the finance, pharmaceutical, and chemical industries where computation often needs to be performed on secret data. "We have demonstrated our protocol to various groups and organizations across the world, and everyone is impressed by how fast we can actually perform secure computations," says Bristol professor Nigel Smart. "Only a few years ago, such a theoretical idea becoming reality was considered Alice in Wonderland style over ambitious hope."
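The basic mechanism underlying protocols like SPDZ can be sketched with additive secret sharing (a simplified, passively secure sketch; the actual SPDZ protocol additionally authenticates shares with MACs and uses offline preprocessing to handle multiplication and active adversaries): each input is split into random shares that sum to the secret, and parties can compute a sum without any party seeing another's input.

```python
import random

P = 2**61 - 1  # a prime modulus; all shares live in the field Z_P

def share(secret, n_parties):
    """Split `secret` into n additive shares that sum to it mod P.
    Any subset of fewer than n shares reveals nothing about the secret."""
    shares = [random.randrange(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

def secure_sum(secrets, n_parties=3):
    """Each party holds one share of every input; adding the shares it
    holds yields a share of the total, so the overall sum is computed
    without any single party learning an individual input."""
    all_shares = [share(s, n_parties) for s in secrets]
    local_sums = [sum(col) % P for col in zip(*all_shares)]  # per-party work
    return reconstruct(local_sums)
```

For example, three hospitals could learn the total of their secret figures via `secure_sum` while each one sees only uniformly random shares of the others' inputs.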
Beyond Peer Review: NIST and Five Journals Find a Way to Manage Errors in Research Data
NIST News (09/09/13) Laura Ost
Traditional peer review is not enough to ensure data quality amid the recent boom in scientific research findings, according to a recently completed 10-year collaboration between the U.S. National Institute of Standards and Technology (NIST) and five technical journals. The study found that the traditional peer-review process is under pressure to work too fast to fully evaluate all new experimental data. NIST researchers have developed NIST ThermoLit, a set of customized software tools and procedures for validating experimental data and eliminating errors after a paper is approved by peer review, but before a journal formally accepts the paper. "The solutions we offer, while centered on the field of thermodynamics, should be applicable in principle to other areas of science and engineering," says NIST's Michael Frenkel. Advances in measurement science have boosted data collection, but increased automation has resulted in the loss of personnel expertise and knowledge required to run manual systems, according to the report. NIST ThermoLit enables researchers to generate a literature report containing relevant references retrieved from a NIST database. If the proposed paper passes a journal's peer review, NIST generates a report highlighting any inconsistencies between the new experimental data and critically evaluated data based on past research.
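The final consistency-report step can be sketched as a simple screening rule (a hypothetical illustration, not NIST's actual algorithm or data model): compare each new measurement against the critically evaluated reference value at the same condition, and flag deviations that exceed the combined standard uncertainty by some factor.

```python
def flag_inconsistencies(new_data, reference, k=2.0):
    """Flag measurements that disagree with critically evaluated data.
    `new_data` and `reference` map a measurement condition (e.g. a
    temperature) to a (value, standard_uncertainty) pair. A point is
    flagged when its deviation from the reference exceeds k times the
    combined standard uncertainty of the two values."""
    report = []
    for cond, (value, u_new) in new_data.items():
        if cond not in reference:
            continue  # no evaluated data to compare against
        ref_value, u_ref = reference[cond]
        combined = (u_new**2 + u_ref**2) ** 0.5
        if abs(value - ref_value) > k * combined:
            report.append((cond, value, ref_value))
    return report
```

A journal editor could then return the flagged points to the authors before formally accepting the paper, which is the workflow the NIST study describes.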
Intel Foundation: Changing Attitudes Is Key in STEM Education
U.S. News & World Report (09/09/13) Allie Bidwell
As the creation of science, technology, engineering, and math (STEM) jobs continues to outpace the number of students preparing to enter those fields, the Intel Foundation is funding annual competitions and science fairs to foster STEM education innovation. Intel Foundation executive director Wendy Hawkins recently spoke with U.S. News & World Report about steps that should be taken to improve STEM education. Despite concern in the United States about students losing interest in STEM and poor test performance, other countries are just as worried and many look to the United States as an example, Hawkins notes. Instead of covering a broad range of topics on a surface level, education should focus on delving more deeply into subject matter, which STEM competitions give students the opportunity to do. Although very young children are eager to learn science, "every year they're in school, their enthusiasm for science and their vision of themselves as scientists goes down," says Hawkins, stressing the need to intervene with younger children and to support teachers in science instruction. Meanwhile, she says, underrepresented minorities "are not well-prepared by our K-12 system." Hawkins notes that one critical step toward improving STEM education is overhauling the Common Core and Next Generation Science standards.
Robotic Therapy Helps Children's Coordination
University of Leeds (09/06/13) Ben Jones
Researchers from the University of Leeds are developing a robotic device that helps children practice and improve their hand coordination. The device is a robotic arm that uses haptic technology to guide a child's hand as the child plays computer games designed to assist with writing. By pushing and pulling the pen in the required directions, the arm helps children learn the correct hand and wrist movements commonly made during handwriting and other manual tasks. "In trying to support a child with handwriting and coordination difficulties, one of the major challenges teachers and occupational therapists come up against time and again is the limited time they have to work one-to-one with each child," says Leeds' Liam Hill. "In this respect, haptic robotic technologies have huge potential efficiency benefits." A pilot test in the United Kingdom found that young children enjoyed the tasks and performed them at an acceptable level. The researchers say the system could be used by multiple children in a clinic or a classroom setting, under the supervision of a single professional.
Center for Networked Systems: Finding the Silver Lining in a Sometimes Dark 'Cloud'
UCSD News (CA) (09/06/13) Tiffany Fox
Researchers at the University of California, San Diego Center for Networked Systems (CNS) design and improve data center and wide-area networks, with some scientists working to make data widely accessible via mobile and cloud computing while others paradoxically work to make access nearly impossible. CNS researchers refer to their divergent goals as "the beauty of the decentralized system," which now prevails as a way of creating, sharing, and storing information. Principal investigators at CNS "are known for making big contributions in designing scalable, fault-tolerant networks and understanding how networks work," says CNS associate director George Porter. "They're known for developing and designing next-generation storage technologies and they're also making advances in terms of security writ large, from understanding how spam works to understanding how to make the computerized systems in cars safer and more reliable." Accessing sensitive data in the cloud, often via mobile devices, introduces numerous new security threats. Data stored in the cloud is subject to loss, but local hard drives can fail as well, and the benefits of the cloud are so significant that the trend "will only become deeper and more broad" in the coming decade, Porter says.
NSA Expands Academic Cyber Initiative
HPC Wire (09/05/13) Isaac Lopez
The U.S. National Security Agency (NSA) has added the Air Force Institute of Technology, Auburn University, Carnegie Mellon University, and Mississippi State University to its National Centers of Academic Excellence in Cyber Operations program. The academic cyber initiative is a technical program that seeks to better educate students on how they could help defend the U.S. against cyberattacks. The schools will be able to offer a special summer teaching program in which participants who undergo background checks and obtain temporary top-secret security clearances can engage in a 10-week internship. Given the recent scrutiny of NSA, officials say ethics are a key part of the curriculum. "Cyberskills are increasingly important in national defense, but it's even more important to operate as responsible citizens in the use of such skills," says NSA technical leader Steven LaFountain. Dakota State University, the Naval Postgraduate School, Northeastern University, and the University of Tulsa also are participants in the program. The goal of the Cyber Operations program is to make the U.S. national information infrastructure less vulnerable by promoting rigorous, interdisciplinary higher-education programs in information assurance, while boosting the number of qualified professionals with this expertise across disciplines.
IBM Watson: How the Jeopardy-Winning Supercomputer Was Born, and What It Wants to Do Next
TechRepublic (09/09/13) Jo Best
IBM's Watson supercomputer in 2011 won a Jeopardy match against the game's two top players. Five years before Watson's landmark victory, IBM Research executives initiated work on Watson as part of one of the company's Grand Challenges, which aim to create human-computer competition, have international appeal, and attract people to science and math fields. The Watson team began by developing DeepQA, a massively parallel software architecture that examines natural language content in both Jeopardy's clues and Watson's own stored data, while also examining the structured information it holds. Creating the component-based system, built on a series of pluggable components for searching and weighting information, required about 20 researchers and three years of work before Watson could surpass human competitors on a quiz show. Shortly after the Jeopardy show aired, IBM moved to establish a Watson business unit and the project was folded into the IBM Software group. The group set about understanding the 41 separate subsystems that went into Watson, and then consolidated and accelerated the system. For example, to prepare Watson for applications in oncology, the team worked on content adaptation, training adaptation, and functional adaptation. The next Watson offerings will be embedded into products in the IBM Smarter Planet initiative.
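The pluggable-component pattern described above can be sketched in miniature (the scorers, weights, and data fields here are invented for illustration; DeepQA's real components are far more numerous and their weights are learned, not hand-set): independent scorers each assess a candidate answer's evidence, and a weighted combination ranks the candidates.

```python
def keyword_overlap(clue, candidate):
    """Score evidence text by how many clue words it contains."""
    clue_words = set(clue.lower().split())
    cand_words = set(candidate["evidence"].lower().split())
    return len(clue_words & cand_words) / max(len(clue_words), 1)

def source_reliability(clue, candidate):
    """Score a candidate by the trustworthiness of its source."""
    return candidate.get("source_trust", 0.5)

# Pluggable scorers paired with weights; new evidence components can
# be added here without changing the ranking logic below.
SCORERS = [(keyword_overlap, 0.7), (source_reliability, 0.3)]

def rank_candidates(clue, candidates):
    def total(candidate):
        return sum(w * scorer(clue, candidate) for scorer, w in SCORERS)
    return sorted(candidates, key=total, reverse=True)
```

The same structure is what made Watson adaptable: preparing it for a new domain such as oncology amounts to swapping in domain-specific content and retraining the weights rather than rebuilding the pipeline.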
Abstract News © Copyright 2013 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.