ACM TechNews
Association for Computing Machinery
Welcome to the July 28, 2014 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets (click here) and for iPhones (click here) and iPads (click here).

HEADLINES AT A GLANCE


Researchers Develop 'BlackForest' to Collect, Correlate Threat Intelligence
Dark Reading (07/25/14) Brian Prince

The Georgia Tech Research Institute (GTRI) has launched a new cyberthreat intelligence tool dubbed BlackForest, which scrapes public Internet sources frequented by hackers and malware writers, such as forums and social media, for threat intelligence. The tool has a variety of use cases, says GTRI research scientist Christopher Smoak. One is identifying and following the activities of interesting individuals across the Internet. "Identifying someone on a forum that has previously posted credit card information as being related to someone active in IRC speaking of a future attack may lead us to conclusions about the type, scale, and potential target for such an attack," Smoak says. Another scenario in which BlackForest would be useful is real-time monitoring of denial-of-service attacks being orchestrated through social media, in which the tool could offer insight into the scale of the attack and who is coordinating it. Ryan Spanier, head of GTRI's Threat Intelligence Branch, says BlackForest also could be used by companies and organizations to determine whether any of their data has been leaked onto the Internet, potentially alerting them to data breaches they did not know had occurred. GTRI already has made two other security systems available: Apiary, which helps government and business share information about malware attacks; and Phalanx, which defends against spear-phishing attacks.


Collecting Just the Right Data
MIT News (07/25/14) Larry Hardesty

A great deal of artificial intelligence research focuses on using large data sets to generate predictions. Great success has been realized in relatively simple settings with well-defined data, such as generating customer recommendations based on shopping history. However, for complex systems such as the weather or geological data analysis, there is either too little data or too little time to process it all. MIT graduate student Dan Levine and his advisor Jonathan How are working on a new technique for sidestepping these issues by identifying the subsets of the total data set that are most likely to yield reliable predictions. Their technique centers on a refinement to probabilistic graphical models, which represent the relationships between data points, as in a network diagram or family tree. Levine and How's technique helps to navigate probabilistic graphical models even in the presence of loops--situations in which the relationships between data points turn back on themselves, making it difficult to determine how each data point influences the others. A potential use for the new technique is predicting the development of weather patterns in real time.
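The idea of picking the most informative subset of data can be illustrated with a small sketch. The code below is not Levine and How's algorithm; it is a minimal greedy stand-in that, assuming jointly Gaussian variables with a known covariance matrix, repeatedly selects the measurement that most reduces the posterior variance of a target variable (the function name and data are hypothetical):

```python
def greedy_select(cov, target, k):
    """Greedily choose up to k measurement indices that most shrink the
    posterior variance of `target`, assuming all variables are jointly
    Gaussian with covariance matrix `cov` (a symmetric list of lists)."""
    cov = [row[:] for row in cov]                    # work on a copy
    n, chosen = len(cov), []
    for _ in range(k):
        best, best_gain = None, 0.0
        for i in range(n):
            if i == target or i in chosen or cov[i][i] <= 1e-12:
                continue
            gain = cov[target][i] ** 2 / cov[i][i]   # variance reduction
            if gain > best_gain:
                best, best_gain = i, gain
        if best is None:
            break
        chosen.append(best)
        col = [cov[j][best] for j in range(n)]       # snapshot the column
        for a in range(n):                           # Gaussian conditioning:
            for b in range(n):                       # rank-1 covariance downdate
                cov[a][b] -= col[a] * col[b] / col[best]
    return chosen, cov[target][target]               # residual uncertainty
```

For instance, given one sensor strongly correlated with the target (0.8) and one weakly correlated (0.1), the greedy step picks the strong sensor and cuts the target's variance from 1.0 to 0.36.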


NASA Looking for Out-of-This-World Mars Communications Services
Network World (07/25/14) Michael Cooney

The U.S. National Aeronautics and Space Administration (NASA) recently issued a Request for Information exploring options to buy commercial communications services to support users on Mars. "In this model, the commercial provider would own and operate relay orbiter(s), and NASA would contract to purchase services over some period of time," according to NASA. NASA recently demonstrated optical communications between the Moon and Earth with a download rate of 622 megabits per second and an error-free data upload rate of 20 megabits per second. NASA's current Mars relay infrastructure is aging, and a communications gap could open in the 2020s, which is why NASA wants to explore alternative models to sustain and develop the infrastructure. Mars landers and rovers are constrained in mass, volume, and power, all of which severely restrict the data rates and volumes that can be communicated on the direct link between Mars and Earth. To address the limited direct-to-Earth bandwidth, the Mars Exploration Program has developed a strategy of including a proximity-link telecommunications relay payload on each of its Mars science orbiters. The relay payloads establish very high-rate, energy-efficient links with landers and rovers on the surface.


New Tools Help Neuroscientists Analyze Big Data
Howard Hughes Medical Institute News (07/27/14) Jim Keeley; Robert Gutnikoff

Researchers at the Howard Hughes Medical Institute and the University of California, Berkeley are using Thunder, a distributed computing system, to analyze massive neuroscience data sets. Thunder accelerates the analysis of data sets that are too large and complex for individual workstations. The researchers used Thunder to find patterns in high-resolution images collected from the brains of active zebrafish and mice with multiple imaging techniques. "For a lot of these data sets, a single machine is just not going to cut it," says Howard Hughes Medical Institute researcher Jeremy Freeman. Thunder builds on Spark, a large-scale computing tool developed at Berkeley whose in-memory data caching eliminates the bottleneck of reloading a complete data set at every step after the first. The researchers made Spark suitable for analyzing a broad range of neuroscience data by developing standardized representations of data amenable to distributed computing. "We started with our questions about the biology, then came up with the analyses and developed the tools," Freeman says. Using Thunder, the researchers can analyze brain images in minutes, interacting with and revising analyses without the lengthy delays of previous methods.


The Birth of Topological Spintronics
Penn State News (07/23/14) Barbara K. Kennedy

Researchers at Pennsylvania State University (PSU) and Cornell University have discovered a new material combination they say could lead to a more efficient approach to computer memory and logic. The researchers focused on "spin-torque" devices, which combine a standard magnetic material with a novel material known as a topological insulator. The results show the new method can be 10 times more efficient for controlling magnetic memory or logic than any other combination of materials measured to date. "This is a really exciting development for the field because it is the first promising indication that we actually may be able to build a practical technology with these topological insulator materials, which many condensed-matter physicists have been studying with spintronics applications as the motivation," says PSU professor Nitin Samarth. The researchers fashioned thin-film materials into devices and carried out the spin-torque measurements. "Our experiment takes advantage of the very special surface of bismuth selenide--a material that is a topological insulator--which inherently supports the flow of electrons with an oriented spin," Samarth notes. "Our collaborators at Cornell found that, at normal room temperatures, we can use these spin-oriented electrons to very efficiently control the direction of the magnetic polarity in the adjacent material."


Microsoft's Station Q Delves Into Quantum Computing
eWeek (07/24/14) Pedro Hernandez

The purpose of Microsoft's quantum computing lab, Station Q, is to explore the convergence of computer science and quantum physics so a quantum computer can eventually be constructed. With the threat of Moore's Law reaching its physical limits looming over the tech industry, quantum computing is seen as a way to transcend those limitations. Station Q director Michael Freedman and fellow researchers are developing quantum computing techniques, and their initial challenge is that "fussy" quantum bits (qubits) lose coherence under even the slightest interference. "Scaling enough qubits to be useful, doing so in a stable way, and keeping them from falling apart--these are some of the fundamental challenges of the field," Microsoft notes. Station Q is using topological mathematics as a foundation for building a stable quantum information platform, with Microsoft describing topological mathematics as "the study of geometric forms that remain unchanged when bent or stretched." Microsoft Research director Peter Lee says Station Q's ideas are becoming influential. "I think you could argue that the topological approach has become mainstream," he says. "Physicists don't think we're crazy people anymore."


Transistor Successor Set to Bring on 'The Machine' Age Soon
Scientific American (07/14) Wendy Grossman

Researchers at Hewlett-Packard's (HP) Information and Quantum Systems laboratory are developing memristors, which combine the best characteristics of dynamic and flash memory, in response to the diminishing returns of the transistor technology used in today's computer chips. Unlike transistors, memristors can occupy a range of states between on and off. Researchers believe memristor-based computer chips will be more energy-efficient and better able to handle the flood of data expected as the countless embedded devices making up the Internet of Things come online. HP hit upon the memristor idea, originally proposed in the 1970s, while researching the future of ever-smaller transistors. The first functioning memristor array was built in 2012 by HRL Labs. HRL and HP are developing memristor technology so chips using it can be made with the existing complementary metal-oxide semiconductor manufacturing process. Many expected memristor technology to develop faster than it has, but HP now expects to begin putting memristors into production in 2015, making the technology available first as dual in-line memory modules in 2016.
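The "range of states between on and off" can be sketched with the simple linear-drift memristor model: the device's resistance interpolates between an on-resistance and an off-resistance according to an internal state that moves with the charge passed through it. The parameters and time step below are illustrative textbook-style values, not HP's actual device figures:

```python
import math

def simulate_memristor(steps=20000, dt=5e-5):
    """Linear-drift memristor sketch: resistance interpolates between
    R_on and R_off according to internal state w, which drifts with the
    charge that flows through the device (illustrative parameters)."""
    D, R_on, R_off, mu = 1e-8, 100.0, 16e3, 1e-14   # m, ohm, ohm, m^2/(V*s)
    w = 0.1 * D                                      # initial internal state
    resistances = []
    for n in range(steps):
        v = math.sin(2 * math.pi * 1.0 * n * dt)     # 1-Hz sinusoidal drive
        m = R_on * (w / D) + R_off * (1 - w / D)     # current memristance
        i = v / m                                    # Ohm's law
        w = min(D, max(0.0, w + mu * (R_on / D) * i * dt))  # state drift
        resistances.append(m)
    return resistances
```

Driving the device through one voltage cycle sweeps its resistance through a continuum of intermediate values rather than flipping it between two fixed states, which is the property the article highlights.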


UNSW Robots Win World Football RoboCup
IT News (07/25/14) Allie Coyne

A University of New South Wales (UNSW) team won the grand final of the 2014 RoboCup Standard Platform League robot soccer tournament, held in Brazil. The university's rUNSWift team defeated Nao-Team HTWK from Leipzig University of Applied Sciences. Each RoboCup team competes with identical robots, which must operate autonomously, without external human or computer control. A team consists of a maximum of five players, plus a robot "coach" that sits on the sideline observing the game and sending "tactical and strategic" advice to the players. One player can be designated as a goalkeeper and is the only player allowed to use its hands. The victory marks the first time a UNSW team has won the RoboCup championship. About 20 teams from around the world participated in this year's RoboCup. The UNSW team consisted of undergraduate students from the university's School of Computer Science and Engineering, who use the challenge as part of their fourth-year theses.


Wireless Home Automation Systems Reveal More Than You Would Think About User Behavior
Saarland University (07/24/14)

Saarland University researchers are studying ways to make home automation systems more secure, having found that modern smart home wireless automation systems can pose a security risk. "Many of the systems do not provide adequate security against unwanted third-party access and therefore threaten the privacy of the inhabitants," says Saarland professor Christoph Sorge. As part of the study, the researchers took on the role of a malicious attacker. "Using a simple mini-PC, no bigger in size than a packet of cigarettes, we eavesdropped on the wireless home automation systems [HASs] of two volunteers and were thus able to determine just how much information a conventional wireless HAS reveals about its user," Sorge says. The study showed that unencrypted systems expose large quantities of data to anyone who wants it, with no prior knowledge of the system required. "The results indicate that even when encrypted communication is used, the number of messages exchanged is enough to provide information on absence times," Sorge says. He notes potential attacks can be directed against the functionality of the system or the privacy of the inhabitants. Sorge says enhanced data encryption and concealment technologies would help protect HAS users' privacy, and his group is collaborating with researchers at the University of Paderborn to develop them.
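The absence-time inference Sorge describes can be sketched in a few lines. The snippet below is a hypothetical illustration, not the Saarland group's tooling: given per-hour counts of (possibly encrypted) home-automation messages, it flags stretches of low traffic as likely absence windows, with the threshold an assumed parameter:

```python
def absence_windows(hourly_counts, threshold=5):
    """Return (start_hour, end_hour) runs where message volume stays
    below `threshold`--a crude proxy for nobody being home."""
    windows, start = [], None
    for hour, count in enumerate(hourly_counts):
        if count < threshold and start is None:
            start = hour                        # quiet period begins
        elif count >= threshold and start is not None:
            windows.append((start, hour - 1))   # quiet period ends
            start = None
    if start is not None:                       # quiet through the last hour
        windows.append((start, len(hourly_counts) - 1))
    return windows
```

A day of counts such as [30, 2, 1, 0, 25, 40] yields the window (1, 3), which is exactly the kind of absence-time leak the study warns that encryption alone does not prevent, since message counts remain visible.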


Art By Algorithm: Computer Evolves New Artworks
New Scientist (07/24/14) Aviva Rutkin

Nagoya University researcher Yasuhiro Suzuki and colleagues have developed software that creates digital artworks using algorithms that mimic natural selection. Suzuki says the researchers developed the software after studying how artistic methods are passed down through generations. "Paintings that have remained to the present were painted by scaling, rotating, and combining motifs that had already existed," he says. To use the software, an artist first indicates the desired style of art and selects a picture from a few preloaded images to feed into an algorithm. The algorithm then mutates the image in different ways, and the resulting images are removed or kept depending on how closely they adhere to the user's initial stylistic choices. The process repeats until the user stops it to select an image they like. The researchers tested the program with different sets of preferences and starting images, letting it run for up to 4,800 generations at a time. "Where algorithms do help is in the crafting of a system or situation which can produce interesting results," says artist Kurt Ralske of the School of the Museum of Fine Arts in Boston. The researchers also are exploring whether they can evolve pieces that resemble paintings by renowned artists.
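The mutate-and-keep loop described above is, at heart, an evolutionary hill climb. The toy sketch below is not Suzuki's software: it evolves a hypothetical grayscale "image" (a flat list of pixel values) toward an assumed stylistic preference, here encoded as a fitness function that favors mid-gray brightness:

```python
import random

def evolve(image, fitness, generations=4800, seed=0):
    """Each generation perturbs one random pixel; the mutant replaces
    the parent only if it scores better under the user's preference."""
    rng = random.Random(seed)
    best = list(image)
    for _ in range(generations):
        mutant = list(best)
        i = rng.randrange(len(mutant))
        mutant[i] = max(0, min(255, mutant[i] + rng.randint(-32, 32)))
        if fitness(mutant) > fitness(best):   # selection step
            best = mutant
    return best

# Assumed "style": prefer images whose mean brightness is near mid-gray.
def prefer_mid_gray(img):
    return -abs(sum(img) / len(img) - 128)
```

Starting from an all-black image and running the 4,800 generations mentioned in the article steadily improves the fitness score, mirroring the mutate, cull, repeat cycle the researchers describe.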


K Computer Runs Largest Ever Ensemble Simulation of Global Weather
RIKEN (07/23/14)

Researchers at Japan's RIKEN Advanced Institute for Computational Science have successfully run 10,240 parallel simulations of global weather on the 10-petaflops K computer--the largest ensemble ever executed--using data assimilation to reduce the range of uncertainties. The three-week computation was made possible by an eight-fold improvement in the efficiency of the Local Ensemble Transform Kalman Filter, achieved with the EigenExa high-performance eigenvalue solver software. By analyzing the 10,240 equally likely estimates of atmospheric states, the researchers found that remote observations, even those more than 10,000 kilometers away, can have an immediate effect on the final state estimate. The outcome suggests further research is needed on innovative techniques that can better leverage faraway observations, which could lead to improved weather forecasts. Several projects financed by Japan Science and Technology Agency CREST (Core Research for Evolutionary Science and Technology) programs contributed to this milestone, including one focused on advancing big-data assimilation technology for revolutionizing short-range severe weather prediction; a project to integrate big data and high-performance computing for yottabyte processing; and an effort to develop an Eigen-supercomputing engine using a post-petascale hierarchical model.
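Ensemble data assimilation of this kind can be illustrated with a minimal ensemble Kalman filter update for a single scalar state variable. This sketch is not RIKEN's Local Ensemble Transform Kalman Filter, which operates on high-dimensional states with localization; it is the basic stochastic-EnKF step, with all values illustrative:

```python
import random

def enkf_update(ensemble, obs, obs_var, seed=0):
    """Stochastic ensemble Kalman filter update for a scalar state:
    each member is nudged toward a perturbed copy of the observation,
    weighted by the Kalman gain K = P / (P + R)."""
    rng = random.Random(seed)
    n = len(ensemble)
    mean = sum(ensemble) / n
    p = sum((x - mean) ** 2 for x in ensemble) / (n - 1)  # forecast variance P
    k = p / (p + obs_var)                                 # Kalman gain
    # Perturbing the observation per member keeps the analysis spread
    # statistically consistent with the observation error R.
    return [x + k * (obs + rng.gauss(0.0, obs_var ** 0.5) - x)
            for x in ensemble]
```

With a very accurate observation (tiny R) the gain approaches 1 and the whole ensemble collapses onto the observed value; with a noisy observation the forecast ensemble dominates. The full ETKF used on the K computer replaces this scalar gain with matrix operations, which is where the EigenExa eigenvalue solver comes in.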


The Need for Speed
Topics (07/22/14) Caroline Perry

Harvard University researchers are using high-performance computers to achieve new breakthroughs in science and engineering. For example, Harvard professor Alan Aspuru-Guzik is using Harvard's Orgoglio computing cluster of parallel graphics processing units to determine the electronic properties of individual molecules with a very high level of accuracy in less than a day. The researchers also have analyzed thousands of kinds of quinones to find the best candidates to store energy in a new type of organic flow battery, which could fundamentally transform the electrical grid by providing an affordable and continuous supply of power from wind and solar farms. Harvard researchers also are working on the Connectome Project, which aims to create a complete wiring diagram of all the neurons in a healthy human brain, empowering neuroscientists to study the causes and mechanisms of thought, behavior, memory, aging, and mental illness. In addition, the Multiscale Hemodynamics Project is using advanced computational techniques and visual simulations to understand how red blood cells and other particles flow through the human circulatory system. Harvard researchers also contributed to the completion of the Murchison Widefield Array, an advanced radio telescope that generates wide-field images continuously in real time and captures 19 gigabytes of data a second, totaling a petabyte a day.


Abstract News © Copyright 2014 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: technews@hq.acm.org
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe