Welcome to the September 14, 2015 edition of ACM TechNews, providing timely information for IT professionals three times a week.
Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets, as well as for iPhones and iPads.
HEADLINES AT A GLANCE
Xerox PARC's New Chip Will Self Destruct in 10 Seconds
IDG News Service (09/10/15) Martyn Williams
Engineers at Xerox PARC have developed a chip that can self-destruct upon command, making it potentially suitable for high-security applications. The chip was developed as part of the U.S. Defense Advanced Research Projects Agency's (DARPA) vanishing programmable resources project, and designed to store data such as encryption keys. On command, the chip shatters into thousands of tiny pieces. During a demonstration last week at a DARPA event in St. Louis, the self-destruct circuit was triggered by a photo-diode, which switched on the circuit when a bright light fell on it. For this demonstration the light was provided by a laser, but the trigger could be anything from a mechanical switch to a radio signal. Xerox engineers based the chip on Gorilla Glass, a Corning-manufactured tough glass used in the displays of numerous smartphones. "We take the glass and we ion-exchange temper it to build in stress," says senior Xerox PARC scientist Gregory Whiting. The destruction of the chip could ensure destruction of an encryption key as part of a routine process or if the key were to fall into the wrong hands.
U of Michigan Project Combines Modeling and Machine Learning
HPC Wire (09/10/15) Tiffany Trader
The University of Michigan (U-M) has received a $2.42 million grant from the U.S. National Science Foundation for ConFlux, a new computing resource that will enable supercomputer simulations to interface with large datasets while running. In addition, U-M will provide $1.04 million toward the project, which will begin with the construction of the new Center for Data-Driven Computational Physics, supporting the fields of aerodynamics, climate science, cosmology, materials science, and cardiovascular research. The university says ConFlux will enhance traditional physics-based computer models with big data techniques and specialized supercomputing nodes matched to the needs of data-intensive operations. "ConFlux will be a unique facility specifically designed for physical modeling using massive volumes of data," says U-M professor Barzan Mozafari. ConFlux will use machine-learning algorithms to create more reliable models trained with a combination of scaled-down models and observational and experimental data. The center will focus on five major areas of study: cardiovascular disease; turbulence; clouds, rainfall, and climate; dark matter and dark energy; and material property prediction.
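As a schematic illustration of the data-driven modeling described above (purely illustrative; ConFlux's actual workflows, models, and datasets are far larger, and the toy model and scikit-learn regressor below are assumptions), a machine-learning model might be trained on inexpensive, scaled-down simulation outputs together with a handful of experimental measurements and then used in place of expensive model evaluations:

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    def cheap_simulation(x):
        """Stand-in for a scaled-down physics model (hypothetical)."""
        return np.sin(x) + 0.1 * x

    # Outputs from the inexpensive model plus noisy "experimental" observations.
    x_sim = np.linspace(0, 10, 200)
    y_sim = cheap_simulation(x_sim)
    rng = np.random.default_rng(0)
    x_exp = rng.uniform(0, 10, 30)
    y_exp = cheap_simulation(x_exp) + rng.normal(0, 0.05, 30)

    X = np.concatenate([x_sim, x_exp]).reshape(-1, 1)
    y = np.concatenate([y_sim, y_exp])

    # Data-driven surrogate trained on both sources of data.
    surrogate = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
    print(surrogate.predict([[3.7]]))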
Artificial Intelligence Computer Program Transforms Typed Text Into Handwriting
Techly (09/14/15) Chloe Olewitz
A new computer program developed by the University of Toronto's Alex Graves applies his work on recurrent neural networks (RNNs) to convert typed text into natural, organic-looking handwriting. The program also lets the user adjust a bias setting to shape the legibility and style of the handwriting it generates. Graves' program and similar long short-term memory RNNs build complex sequences from individual data points based on probability predictions. The RNN samples from its own output, feeding each data point back into the algorithm as a fresh input. Many neural networks that rely on simpler trial-and-error learning tend to reproduce the same output patterns once a particular choice, or string of choices, has been rated as successful. RNNs, by contrast, seldom generate the same solution twice, because they synthesize data in an intricate input-output loop that analyzes data and produces results point by point. Graves suggests RNNs are superior to other artificial intelligence models because they are better able to model complex, multivariate data.
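As a minimal sketch of that autoregressive sampling loop (an illustration only, not Graves' handwriting network; step_fn, the integer token representation, and the bias formula are assumptions), each sampled output is fed back in as the next input, and a larger bias sharpens the output distribution toward more legible, less varied results:

    import numpy as np

    def softmax(z):
        z = z - z.max()
        e = np.exp(z)
        return e / e.sum()

    def sample_sequence(step_fn, start_token, length, bias=0.0):
        """Autoregressive sampling: each output is fed back as the next input.
        step_fn(x, state) -> (logits, new_state) stands in for one RNN step."""
        x, state, out = start_token, None, []
        for _ in range(length):
            logits, state = step_fn(x, state)
            # A larger bias sharpens the distribution: more legible, less varied output.
            probs = softmax(np.asarray(logits, dtype=float) * (1.0 + bias))
            x = int(np.random.choice(len(probs), p=probs))
            out.append(x)
        return out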
First New Cache-Coherence Mechanism in 30 Years
MIT News (09/09/15) Larry Hardesty
Massachusetts Institute of Technology (MIT) researchers have developed Tardis, which they say is the first fundamentally new approach to cache coherence in more than 30 years. With existing directory-based techniques, the memory allotted to the directory that keeps cores' caches consistent grows in direct proportion to the number of cores, but with Tardis it grows with the logarithm of the number of cores. In a 128-core chip, the new technique would require only one-third as much memory as conventional methods, according to the researchers. With a 256-core chip, the space savings rise to 80 percent, and with a 1,000-core chip to 96 percent. The researchers note the key to Tardis was finding a simple and efficient means of enforcing a global logical-time ordering. "[W]e just assign time stamps to each operation, and we make sure that all the operations follow that time stamp order," says MIT graduate student Xiangyao Yu. In the new system, each core has its own counter, and each data item in memory has an associated counter as well. When a program launches, all the counters are set to zero, and when a core reads a piece of data, it advances the data item's counter to a set value. As long as the core's internal counter does not exceed that value, its copy of the data is valid.
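A rough sketch of the timestamp-and-lease idea (an illustration only, not MIT's Tardis implementation; the data layout and lease length are assumptions): each cached copy carries a lease expiration, reads renew the lease, and a write simply advances the writer's logical clock past any outstanding leases instead of invalidating other cores' copies.

    class Core:
        """Toy lease-based timestamps; memory maps addr -> (value, write_ts, read_lease_ts)."""
        def __init__(self):
            self.clock = 0              # the core's own logical timestamp
            self.cache = {}             # addr -> (value, lease_expiry)

        def read(self, memory, addr, lease=10):
            value, expiry = self.cache.get(addr, (None, -1))
            if self.clock > expiry:     # cached copy's lease has expired: re-fetch and renew
                value, wts, rts = memory[addr]
                self.clock = max(self.clock, wts)
                expiry = max(rts, self.clock + lease)
                memory[addr] = (value, wts, expiry)       # record the extended lease
                self.cache[addr] = (value, expiry)
            return value

        def write(self, memory, addr, value):
            _, _, rts = memory[addr]
            self.clock = max(self.clock, rts) + 1         # jump past outstanding read leases
            memory[addr] = (value, self.clock, self.clock)
            self.cache[addr] = (value, self.clock)

Because writes only need timestamps, no per-core sharer list or broadcast invalidation is required, which is where the memory savings come from.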
New Photonic Chips Could Transform How Online Data Is Sent and Stored
CORDIS News (09/11/15)
The European Union (EU)-funded IRIS project has developed new photonic silicon chips offering higher bandwidth, with data center operators among the key potential end users. The project aims to overcome the limitations of the interconnection network, a major roadblock for data center capacity. The new chips use silicon as a transmission medium, enabling optical interconnection so massive volumes of data can be sent and received concurrently. The IRIS project promises to cut power consumption and boost capacity at the same time. The project coordinators also think IRIS will be essential to developing new functions that should enable fifth-generation (5G) mobile network technology products. Such technology, expected to be rolled out around 2020 and to remain in service for about 15 years, is intended to deliver the capacity needed to accommodate the anticipated growth in wireless communication and data exchange. One 5G projection says it will encompass more business-to-business services, and the IRIS project dovetails with this expectation. 5G development is the focus of significant European investment because the EU wants to ensure it becomes a 5G leader, while also positioning European companies to capitalize on demand for new applications and functions.
Student Conducts Significant Computer Research
The Campus (09/10/15) Tyler Stigall
Allegheny College computer science major Cody Kinneer has published a pair of papers about his research into methods of evaluating software performance. Kinneer has developed ExpOse, a tool that evaluates the performance limitations of database schema-testing software. Schema-testing software automatically runs different types of data entries against a database schema to see whether it admits inappropriate entries, but gauging the performance requirements of such software has been difficult. Kinneer's ExpOse program is designed to make that process easier. Allegheny professor Gregory Kapfhammer has been working with Kinneer on the project since the last academic year and says its results are significant. "The system that Cody has developed is something that I'm going to use when I teach the second-level computer science class at Allegheny," he says, adding it has applications ranging from introductory computer science education to research and industry. Kinneer has created a GitHub site for ExpOse and presented his findings this summer at the 2015 International Conference on Software Engineering and Knowledge Engineering in Pittsburgh. Kinneer's program is a variation of the work being done by University of Sheffield researchers Phil McMinn and Chris Wright on a project called Schema Analyst, which is still ongoing.
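To make the idea of schema testing concrete, here is a generic sketch (not the ExpOse or Schema Analyst tools themselves; the table and constraints are invented) in which deliberately invalid rows are inserted to confirm a schema's constraints reject them:

    import sqlite3

    # A toy schema with the kinds of constraints schema tests exercise.
    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE users (
        id    INTEGER PRIMARY KEY,
        email TEXT NOT NULL UNIQUE,
        age   INTEGER CHECK (age >= 0)
    )""")

    bad_rows = [
        (1, None, 30),             # violates NOT NULL on email
        (2, "a@example.com", -5),  # violates CHECK (age >= 0)
    ]
    for row in bad_rows:
        try:
            conn.execute("INSERT INTO users VALUES (?, ?, ?)", row)
            print("schema accepted inappropriate data:", row)
        except sqlite3.IntegrityError as exc:
            print("schema correctly rejected", row, "->", exc)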
Wikipedia Page Views a Potential Key to Open Source Web Trends Data
The Stack (UK) (09/09/15) Martin Anderson
Researchers from three universities in Japan have developed a way to use Wikipedia's page view data to gain insight into Web trends. The researchers sought to gauge interest in the term "Anne Hathaway" after a movie featuring the actress aired on TV in Japan in December 2014. They found the average correlation coefficient between Wikipedia page views and Google Trends was 0.72 for keywords ranked 1 to 1,000, and 0.74 for keywords ranked 1,001 to 2,000. The researchers mapped each keyword to a page path, calculated daily and monthly page views on a per-keyword basis, and compared these to public server logs. They found hidden, "low-volume" swells can be the earliest indicators of larger interest gathering, and leveraging this more granular low-level information could yield more useful predictive mechanisms from publicly available data sources, especially for those who do not have direct access to Google's datasets on Web searches. The researchers note that in order to obtain a set of all trend keywords for a specific date, they would need "to burden Google servers by querying all possible keywords that may have been popular on that day. As such, there is a need for a source of open data that can simulate search logs."
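A minimal sketch of the kind of comparison described (the seven-day series below are hypothetical; the study mapped thousands of keywords to page paths): compute the correlation between a keyword's daily Wikipedia page views and its search-trend series.

    import numpy as np

    # Hypothetical daily series for one keyword over the same seven-day window.
    wiki_page_views = np.array([120, 135, 150, 900, 620, 300, 180], dtype=float)
    search_trend    = np.array([ 10,  12,  14,  95,  70,  33,  20], dtype=float)

    # Pearson correlation coefficient between the two time series.
    r = np.corrcoef(wiki_page_views, search_trend)[0, 1]
    print(f"correlation: {r:.2f}")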
New Tool Reduces Smartphone Battery Drain From Faulty Apps
Purdue University News (09/10/15) Emil Venere
A new software tool that can reduce smartphone battery power drain by 16 percent is now available for free on GitHub. Researchers at Purdue University developed the HUSH system after conducting the first large-scale study of smartphones in everyday use by consumers. In examining the use of 2,000 Samsung Galaxy S3 and S4 phones served by 191 mobile operators in 61 countries, the team found apps drain 28.9 percent of battery power while the screen is off. Phone hardware should enter the sleep state when the screen is off, but apps wake the phone up periodically to do useful things and should then let it go back to sleep, says Purdue professor Y. Charlie Hu. "They are not letting the phone go back to sleep because of software bugs and, specifically, due to the incorrect use of Android power control application programming interfaces called wakelocks," Hu says. The researchers presented their findings last week at ACM's MobiCom 2015 conference in Paris. "We presented the first study a few years back showing wakelock bugs could cause significant energy drain," Hu says. "But this is the first study showing that wakelock bugs appear prevalent on real users' phones."
Watch Out: If You've Got a Smart Watch, Hackers Could Get Your Data
University of Illinois News Bureau (09/09/15) David Robertson
University of Illinois at Urbana-Champaign (UIUC) researchers recently concluded the Motion Leaks through Smartwatch Sensors (MoLe) project, which found smartwatches are vulnerable to hackers. The researchers designed an app that was able to guess what a user was typing through data "leaks" produced by the motion sensors on smartwatches. The research has privacy implications, as an app that looks harmless could gather data from email messages, search queries, and other confidential documents. Although smartwatches can offer valuable insights into human health and context, "the core challenge is in characterizing what can or cannot be inferred from sensor data and the MoLe project is one example along this direction," says UIUC professor Romit Roy Choudhury. The app uses the watch's accelerometer and gyroscope to track the micro-motion of keystrokes as a wearer types on a keyboard. A possible defense against these motion leaks would be to lower the sample rate of the sensors in the watch, notes UIUC Ph.D. student He Wang. The researchers' current system cannot detect special characters such as numbers, punctuation, and symbols that could appear in passwords, but the researchers say hackers could develop techniques for those characters in the future.
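The published attack is more sophisticated, but the general idea of inferring typing from motion data can be sketched as segmenting the accelerometer and gyroscope stream into per-keystroke windows and matching each window against previously recorded reference motions (an illustration only; the feature set and nearest-neighbor matcher here are assumptions, not the MoLe system):

    import numpy as np

    def window_features(window):
        """Crude features for one keystroke window of shape (n_samples, 6): accel + gyro axes."""
        return np.concatenate([window.mean(axis=0), window.std(axis=0)])

    def guess_key(window, reference_features):
        """Nearest-neighbor match against per-key reference motion features."""
        feats = window_features(window)
        distances = {key: np.linalg.norm(feats - ref)
                     for key, ref in reference_features.items()}
        return min(distances, key=distances.get)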
The Website That Visualizes Human Activity in Cities Across the World
Technology Review (09/08/15)
Daniel Kondor and colleagues at the Massachusetts Institute of Technology's SENSEable City Laboratory and Ericsson have developed an online tool that uses mobile phone data to visualize human activity. The researchers say ManyCities organizes and presents data in intuitive ways to reveal trends and special events. The researchers gathered mobile phone data from base stations across Los Angeles, New York City, London, and Hong Kong between April 2013 and January 2014. The data includes the number of calls placed, the number of text messages sent, the amount of data downloaded and uploaded, and the number of data requests at 15-minute intervals. No sensitive customer information is used, but the data gives "enough detail about the typical usage patterns on the scale of small neighborhoods," the researchers say. For example, ManyCities analyzed the data and found text message activity peaks in the morning in Hong Kong, in the evening in New York, and at midday in London. The researchers also were able to compare usage patterns across neighborhoods and cities, identify activity clusters in different parts of each city, and track how those patterns vary over time. The researchers have made ManyCities available online for anybody to try.
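As an illustration of how such interval data can surface daily patterns (the records below are hypothetical; the actual dataset aggregates calls, texts, and data traffic per base station area in 15-minute intervals), activity counts can be binned and each city's busiest period reported:

    import pandas as pd

    # Hypothetical per-event records: timestamp, city, number of text messages.
    df = pd.DataFrame({
        "timestamp": pd.to_datetime(["2013-04-01 08:10", "2013-04-01 08:20",
                                     "2013-04-01 19:05", "2013-04-01 12:30"]),
        "city": ["Hong Kong", "Hong Kong", "New York", "London"],
        "texts": [40, 55, 70, 30],
    })

    # Aggregate into 15-minute bins per city and report each city's busiest bin.
    binned = (df.set_index("timestamp")
                .groupby("city")["texts"]
                .resample("15min").sum())
    print(binned.groupby(level="city").idxmax())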
U-M Launching $100 Million Data Science Initiative
University Record (MI) (09/08/15) David Lampe
The University of Michigan (U-M) has announced a new Data Science Initiative (DSI) that will invest $100 million in data science projects across the university over the next five years. U-M plans to establish the Michigan Institute for Data Science to lead the university's research and educational activities in big data. The university plans to hire 35 additional faculty members to support the DSI and provide new educational opportunities for students pursuing careers in data science. The DSI also will direct funds to support interdisciplinary research initiatives, pursue the development of new methodological approaches to big data, expand the university's research computing capacity, and strengthen its data management, storage, analytics, and training resources. Provost Martha Pollack says the DSI also will launch "challenge initiatives" in four interdisciplinary areas: transportation, health sciences, learning analytics, and social science research. The transportation project will support the data needs of U-M's ongoing research into connected vehicles, while the health science initiative will focus on cancer prevention and personalized treatments. The learning analytics project will focus on using big data to develop personalized learning strategies, and the social science effort will focus on using big data techniques to analyze social media postings.
Physicists Catch a Magnetic Wave That Offers Promise for More Energy-Efficient Computing
NYU News (09/08/15) James Devitt
Scientists at New York University (NYU), Stanford University, and the SLAC National Accelerator Laboratory have taken pictures of a theorized but previously undetected magnetic wave. The scientists say the research could lead to a more energy-efficient means of transferring data in consumer electronics. NYU professor Andrew Kent, the study's senior author, says the discovery was enabled by a "specialized x-ray method that can focus on particular magnetic elements with very high spatial resolution." He says the findings demonstrate that "small magnetic waves--known as spin-waves--can add up to a large one in a magnet, a wave that can maintain its shape as it moves." The scientists used x-ray microscopy at the Stanford Synchrotron Radiation Lightsource to image the behavior of specific magnetic atoms in materials, a technique that offers extremely high spatial and temporal resolution. The researchers' goal was to create a condition in magnetic materials in which the sought-after solitons should exist by injecting an electrical current into a magnetic material to excite spin-waves. During testing, they observed an abrupt onset of magnetic waves with a well-defined spatial profile, which matched the predicted form of a solitary magnetic wave, or magnetic soliton.
Abstract News © Copyright 2015 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe