Welcome to the August 5, 2015 edition of ACM TechNews, providing timely information for IT professionals three times a week.
Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets and for iPhones and iPads.
HEADLINES AT A GLANCE
Alan Turing Institute Gets Down to Work
ComputerWeekly.com (08/04/15) Brian McKenna
The Alan Turing Institute, founded in March via a joint-venture pact between five U.K. universities and the Engineering and Physical Sciences Research Council (EPSRC), has commenced operations and accepted 10 million pounds in research funding from the board of the Lloyd's Register Foundation. Microsoft Research U.K. scientist and University of Cambridge professor Andrew Blake has been appointed the institute's director. The institute will collaborate with Britain's Government Communications Headquarters (GCHQ) on training and research in data analysis techniques that may be applied in open access and commercial environments. In addition, the institute will work with Cray and EPSRC to leverage next-generation analytics capabilities on the Archer supercomputer. EPSRC CEO Philip Nelson predicts the initiative will "transform...Archer...into the largest data analytics platform in the world." In addition, the institute will soon begin seeking applications from postdoctoral researchers. "The vision of bringing together the mathematical and computer scientists from the country's top universities to develop the new discipline of data science, through an independent institute with strategic links to commerce and industry, is very compelling," Blake notes. GCHQ director Robert Hannigan says the association of Turing's name with an institute committed to leading the world in big data and algorithm research is "a fitting tribute."
Siebel Institute Seeks to Make the Smart Grid Smarter
The New York Times (08/04/15) Steve Lohr
The Siebel Energy Institute, which announced its first round of grants this week, has a mission to meet the challenge of enhancing the global power grid with intelligent software. The grants are designed to expedite the development of algorithms and machine-learning tools to make emergent digital energy networks more efficient, safer, and more secure. "What we're trying to do is accelerate things as we build out this cyberphysical system," says software entrepreneur Thomas Siebel. "This is big data meets the Internet of Things in the energy infrastructure, a field that is in its infancy but is going to change the world." With seed funding of $10 million, the institute has issued 24 grants of $25,000 or $50,000 to recipients developing software for equipment failure forecasting, using smartphones as monitoring sensors, and simulating power-grid security risks, among other projects. Siebel says the institute plans for its smaller grants to help foster innovative research projects that then receive major funding from government agencies. "We're taking our physical infrastructure of energy and putting a cyberheart in it," notes institute director S. Shankar Sastry. "And we need to rethink what this instrumenting of the world--all that data being generated and collected--means for privacy and how all this data is used."
How Secure Scrum Can Help You Build Better Software
IT World (08/04/15) Phil Johnson
Researchers in the Munich University of Applied Sciences' Munich IT Security Research Group report developing a new version of the Scrum development process that better supports secure software development. "One reason for security vulnerabilities lies in the agility of Web application software development processes," maintains professor Hans-Joachim Hof. "We decided to have a closer look into this issue and identified Scrum as a common agile software development process that needs a kind of framework for security." The Secure Scrum method involves identifying security issues during the initial planning of the product backlog. Secure Scrum also allows the inclusion of third-party security experts in different ways, for example by training the Scrum team, marking security concerns, and providing black-box solutions to security issues during sprints. Hof and Ph.D. student Christoph Pol note that although other Scrum variants try to better integrate security issues, Secure Scrum keeps changes to the normal Scrum process to a minimum while encouraging developers to weigh security issues across the development cycle. An empirical study testing their approach found developers who practiced Secure Scrum quickly familiarized themselves with the process, identified and implemented more security requirements, and produced code with fewer vulnerabilities than groups using either traditional Scrum or other methods.
Blue Sky Ideas-AAAI-RSS Special Workshop on the 50th Anniversary of Shakey
CCC Blog (08/03/15) Helen Wright
The Computing Community Consortium recently sponsored a Blue Sky Ideas Track Competition during a workshop at the 2015 Robotics Science and Systems (RSS) Conference. Organized by the Association for the Advancement of Artificial Intelligence (AAAI) and the IEEE Robotics & Automation Society, the workshop was part of a series of events held to mark the anniversary of Shakey, the first mobile intelligent robot able to reason about its own actions. Peter Hart, one of the original members of the Shakey team, was an invited speaker. He presented at the AAAI-RSS Special Workshop on the 50th Anniversary of Shakey: The Role of AI to Harmonize Robots and Humans. The organizers invited submissions of abstracts describing new ideas and ongoing work at the junction of artificial intelligence and robotics. The winning papers were "Acquiring Object Experiences at Scale" by researchers from Brown University, "Robobarista: Object Part-based Transfer of Manipulation Trajectories from Crowd-sourcing in 3D Pointclouds" by a team from Cornell University, and "End-to-End Training of Deep Visuomotor Policies" by a group from the University of California, Berkeley.
This Is What Controversies Look Like in the Twittersphere
Technology Review (08/03/15)
Kiran Garimella and colleagues at Aalto University in Finland say they have developed a more reliable way of spotting controversies in the Twitter stream in real time. Although other researchers have focused on pre-identified arguments, Garimella's team sought to determine whether the structure of controversial conversations differs from that of benign discussions. The researchers theorized this structure could be spotted by studying the network of connections between those involved in a topic, the structure of endorsement (who agrees with whom), and the sentiment of the discussion. They tested their theory by first studying 10 conversations associated with hashtags known to be controversial and 10 known to be benign. The researchers then mapped out the structure of each discussion by examining the networks of retweets, follows, and keywords. The team says the retweet and follow graphs reveal clusters that indicate polarization, as the networks can be partitioned into groups on either side of the issue. Visualized, the controversial conversations resemble fireworks, and the team says the method can spot controversies better than existing tools.
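The intuition behind the Aalto approach — that a controversial topic's retweet graph splits into two camps with few edges crossing between them — can be sketched with a toy polarization measure. This is a hypothetical illustration, not the team's actual method; the function name, edge data, and groupings are invented:

```python
# Toy polarization proxy: given retweet edges and a two-group partition
# of the users, a low fraction of cross-group edges suggests two camps
# that mostly retweet their own side, i.e. a controversial topic.
def cross_edge_fraction(edges, side):
    """edges: list of (u, v) retweet pairs; side: dict user -> group 0/1."""
    cross = sum(1 for u, v in edges if side[u] != side[v])
    return cross / len(edges)

# Two camps {a, b, c} and {x, y, z} with a single bridging retweet.
edges = [("a", "b"), ("b", "c"), ("x", "y"), ("y", "z"), ("c", "x")]
side = {"a": 0, "b": 0, "c": 0, "x": 1, "y": 1, "z": 1}
print(cross_edge_fraction(edges, side))  # 0.2 -> few crossings, polarized
```

In practice the partition itself would come from a graph-partitioning or community-detection step on the retweet and follow graphs, which is where most of the real work lies.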
The Computing Power Behind the Large Hadron Collider
CIO Australia (08/04/15) Rebecca Merrett
Bob Jones, project manager at CERN, which operates the Large Hadron Collider (LHC), recently attended ADMA's Advancing Analytics conference in Sydney, Australia, to discuss the LHC's massive supporting computing infrastructure. The LHC accelerates particle beams toward each other at 99.99 percent of the speed of light in order to study their collisions with high-tech digital cameras that take 40 million photographs per second. Automatic filtering of this data results in 6 gigabytes being written to permanent storage every second during an experiment, and 30 petabytes of data annually. To handle this massive volume of data, CERN employs a network of about a dozen major data centers around the world, as well as an array of smaller data centers, for a total of 170. CERN's two major data centers, located in Geneva and Budapest, are connected by the world's fastest network, which supports speeds of up to 100 gigabits per second. "What this means is we can operate that infrastructure as a single cloud deployment; it's now operated as a single OpenStack," Jones notes. The LHC's massive computing infrastructure also is active when the collider is idle, running simulations to check the quality of the collected data. Most of the data eventually is stored on magnetic tape cartridges, which have to be replaced every two years because the tapes deteriorate over time.
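As a sanity check on the figures above: writing 6 gigabytes per second while accumulating 30 petabytes per year implies the collider is actively taking data for only a fraction of the year, consistent with the LHC running experiments intermittently rather than continuously:

```python
# Back-of-the-envelope check of the article's figures:
# 6 GB/s sustained write rate vs. 30 PB collected per year.
gb_per_second = 6
pb_per_year = 30

seconds_of_data_taking = pb_per_year * 1_000_000 / gb_per_second  # 1 PB = 1e6 GB
days_of_data_taking = seconds_of_data_taking / 86_400             # seconds per day

print(round(days_of_data_taking))  # ~58 days of active data-taking per year
```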
For Sympathetic Ear, More Chinese Turn to Smartphone Program
The New York Times (07/31/15) John Markoff; Paul Mozur
The Xiaoice smartphone chatbot introduced in China by Microsoft last year has gained a major following, with people turning to it for comfort and sympathy. Xiaoice (Chinese for "Little Bing") currently exists as a text-messaging program, but the next iteration will incorporate a voice so users can converse with it. The program recalls details from previous dialogues with users, and its developers have accorded it a more compelling personality and sense of "intelligence" by mining the Chinese Internet for human exchanges. Microsoft also developed language-processing technology that picks question-answer pairs from typed conversations, giving Xiaoice a database of human, up-to-date responses from which to choose. The chatbot reflects the dramatic advances of deep learning via artificial neural networks, which are capable of recognizing patterns in speech, images, and language. Some researchers think Xiaoice's popularity can be explained by cultural factors, such as the dense population of Chinese society. Juji CEO Michelle Zhou says the Chinese engage in more face-to-face daily conversations than most Americans would in their own country, and a chatbot such as Xiaoice might offer them a sense of personal space that is otherwise lacking. Microsoft executives note they are attempting to determine Xiaoice's business potential, and they are collaborating with partners to convert the chatbot into a shopping assistant, as well as to add voice recognition to home appliances.
Illinois Funded by Google on Innovative Mobile-First, Open Web of Things Initiatives
University of Illinois at Urbana-Champaign (08/03/15)
The University of Illinois at Urbana-Champaign's Department of Computer Science has received funding from Google for two of its ongoing research projects. The first program is Mobile First, a three-year, $1-million project focused on mobile research and education. Mobile First will support professor Robin Kravets' research into heterogeneous mobile computing and distributed systems, with a focus on cross-device power management and efficient collaborative communications in crowded bandwidth-constrained environments such as concerts or sporting events. Mobile First also will enable the university to integrate Android as the primary mobile platform for three of its critical freshman and sophomore computer science classes. In addition, the university will join Google's Open Web of Things Expedition, a Google-led initiative to develop open source Internet of Things (IoT) technology. The researchers, led by Kravets, will collaborate on the project with other researchers at Google, and at Carnegie Mellon, Cornell, and Stanford universities. The Illinois researchers will focus on balancing mobile devices' interactions with IoT devices with energy consumption. "We anticipate innovations in cross-device power management and coordinated mobile-to-mobile communication that will support and enhance the experiences of users who are increasingly carrying more devices with more capabilities," Kravets says.
MONT-BLANC Successfully Tests Its Software on High-Performance ARM-Based Servers
CORDIS News (08/03/15)
MONT-BLANC project researchers report their software was successfully tested on new high-performance ARM-based server platforms. The Barcelona Supercomputer Center (BSC), with support from E4 Computer Engineering, ran the tests about a month after deploying the MONT-BLANC prototype. "We wanted to show that our [high-performance computing (HPC)] system software is able to run also on standard production machines and not only in the MONT-BLANC prototypes, and we finally managed it," says BSC's Filippo Mantovani. He notes the benchmarks showed "extremely satisfactory results and perfect stability" on the new architecture. However, Mantovani says exploring performance with commercially available ARM 64-bit platforms is necessary if MONT-BLANC is to meet its core goal of setting future global HPC standards based on energy-efficient solutions already used in embedded and mobile devices. Energy-efficient ARM processors powering mobile devices could potentially provide a solution for addressing HPC's energy consumption issues. Mantovani and colleagues believe unified software support for all available ARM platforms would boost acceptance of the technology in the server market. MONT-BLANC was originally set to end in 2015, but it has been extended by three years, and the European Union has decided to support a third phase of the project under Horizon 2020.
Georgia Tech Receives $4.2 Million for Military Research to Better Secure Data Transfer
Georgia Tech News Center (07/30/15) Tara La Bouff
Georgia Institute of Technology (Georgia Tech) researchers working on the THEIA project were awarded $4.2 million from the U.S. Defense Advanced Research Projects Agency (DARPA) and the Air Force Research Laboratory (AFRL) to improve how data is tracked between computers, Internet hosts, and browsers for better cybersecurity. The four-year project will research where data moves as it is routed from one Internet host to another and whether any malicious code is attached to data during transfer. "If we have the ability to fully track how data is processed until it reaches the intended recipient, then we can better detect and stop advanced persistent threats [APT]," says Georgia Tech professor Wenke Lee. THEIA will monitor information at the point of user interaction with a program, at the point of program processing of data input, and at the point where programs and networks interact with an operating system. "THEIA represents what could be a significant advance over state-of-the-art approaches, which typically are forced to make arbitrary trade-offs between verifying accuracy and maintaining total computational efficiency," Lee says. THEIA will record a sufficient amount of data at runtime, replay and analyze recorded events in almost real time, or analyze data completely offline.
Researchers Create First Firmware Worm That Attacks Macs
Wired (08/03/15) Kim Zetter
Researchers have discovered Mac computers can be affected by known firmware bugs, and they created a proof-of-concept worm that enables a Mac firmware attack to spread automatically without requiring a network connection. "[The attack is] really hard to detect, it's really hard to get rid of, and it's really hard to protect against something that's running inside the firmware," says worm co-developer Xeno Kovah. Firmware is susceptible to malware infection because most hardware manufacturers do not cryptographically sign the firmware embedded in their systems, or their firmware updates, and fail to include any authentication functions that would only permit installation of legitimate signed firmware. Kovah uncovered firmware defects last year that affected 80 percent of the PCs he examined, and with security engineer Trammell Hudson found that five of the six bugs also affected Mac firmware. Three of the bugs remain unpatched, which Kovah and Hudson exploited to build and implement the Thunderstrike 2 worm, which spreads by infiltrating the option ROM on peripheral devices. Booting another machine with the worm-bearing peripheral causes the machine's firmware to load the option ROM from the infected device, and the worm then writes its malware to the boot flash firmware on that machine. Kovah and Hudson say air-gapped systems lacking network connections would be particularly vulnerable to this type of attack.
Stanford Team's Brain-Controlled Prosthesis Nearly as Good as One-Finger Typing
Stanford Report (07/31/15) Tom Abate
An interdisciplinary team of Stanford University researchers has developed a technique to make brain-controlled prostheses more precise by analyzing a neuron sample and making dozens of corrective adjustments to the estimate of the brain's electrical pattern in near-real time. The researchers tested a brain-controlled cursor meant to operate a virtual keyboard designed for people with paralysis and amyotrophic lateral sclerosis. "The speed and accuracy demonstrated in this prosthesis results from years of basic neuroscience research and from combining these scientific discoveries with the principled design of mathematical control algorithms," says Stanford professor Krishna Shenoy. The corrective technique is based on a recently discovered understanding of how monkeys naturally perform arm movements. By studying the electrical patterns from 100 to 200 neuron samples, the researchers were able to understand the "brain dynamics" underlying reaching arm movements. They used this knowledge to create an algorithm that can analyze the measured electrical signals the prosthetic device obtained from the sampled neurons. The algorithm slightly changed the measured signals so the sample's dynamics were more like the baseline brain dynamics, with the goal of making the thought-controlled prosthetic more precise. "This is a fundamentally new approach that can be further refined and optimized to give brain-controlled prostheses greater performance, and therefore greater clinical viability," Shenoy says.
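The correction idea described above — nudging each measured neural sample toward what a learned baseline dynamics model predicts — can be sketched in miniature. This is a hypothetical illustration of the general principle only, not the Stanford team's algorithm; the dynamics matrix, blend weight, and all values are invented:

```python
# Sketch: blend a noisy measured neural-activity sample with the
# prediction a linear baseline-dynamics model makes from the previous
# sample, so the corrected signal better matches the brain's dynamics.
def correct(x_meas, x_prev, A, alpha=0.3):
    # prediction from baseline dynamics: x_pred = A @ x_prev
    x_pred = [sum(a * x for a, x in zip(row, x_prev)) for row in A]
    # pull the measurement part of the way toward the prediction
    return [(1 - alpha) * m + alpha * p for m, p in zip(x_meas, x_pred)]

A = [[0.9, 0.1],   # toy 2-neuron dynamics model (invented numbers)
     [0.0, 0.95]]
print(correct([1.0, 2.0], [1.0, 2.0], A))
```

The real system makes such adjustments in near-real time across 100 to 200 sampled neurons, with the dynamics model learned from recorded arm-movement data.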
Self-Building 3D Printed Bricks Hint at Future Without Assembly Lines
The Conversation (07/30/15) Eliza Berlage
Bar-Ilan University researchers have used an algorithm to show that high-frequency vibrations can cause bricks to self-assemble into a larger three-dimensional object, a breakthrough they say could eventually make factory assembly lines obsolete. The new system works by encoding assembly rules on brick faces that have magnets embedded inside them. "The bricks can then be mixed in a container and agitated, leading to properly assembled objects at high yields and zero errors," according to the researchers. The researchers were inspired by the molecular self-assembly of DNA. During testing, they showed a two-brick structure took less than a minute to self-assemble, while an 18-piece structure required more than two hours. Although the initial research is limited to building small objects, future demonstrations combining other techniques, such as embedded electronics, could make the rapid construction of larger devices viable, says University of Melbourne researcher Bernard Meade. "Perhaps furniture scale production might be possible in future--imagine flatpack IKEA--but I think it would be hard to get to something the size of a house," he notes. The researchers say the next step in developing this technology for the manufacturing and construction industries is to use both magnetic forces and adhesives to ensure the assembly stays in place.
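The assembly-rule idea above — faces bond only with their intended partners, much as complementary DNA strands pair — can be sketched as a matching rule on magnet polarities. This is a hypothetical illustration, not the researchers' encoding; the polarity strings are invented:

```python
# Sketch: each brick face carries a short pattern of magnet polarities.
# Two faces bond only when every magnet pairs north-to-south, i.e. the
# patterns are exact complements, so random agitation can only produce
# the intended (correctly matched) assemblies.
def bonds(face_a, face_b):
    return len(face_a) == len(face_b) and all(
        a != b for a, b in zip(face_a, face_b))  # 'N' must meet 'S'

print(bonds("NSN", "SNS"))  # True  -> complementary faces attract
print(bonds("NSN", "NSN"))  # False -> like poles repel, no bond
```

With enough distinct face codes, mismatched bricks repel or fail to latch, which is what lets the researchers report high yields with zero assembly errors.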
Abstract News © Copyright 2015 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.