Association for Computing Machinery
Welcome to the May 6, 2015 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets, and for iPhones and iPads.

HEADLINES AT A GLANCE


WPI's Team Gears Up for Final Battle of the Bots
Computerworld (05/05/15) Sharon Gaudin

The Worcester Polytechnic Institute (WPI) robotics team is putting the finishing touches on Warner, the robot that is set to compete against 24 other systems from around the world in the U.S. Defense Advanced Research Projects Agency's Robotics Challenge finals next month. The challenge is designed to get roboticists working on semi-autonomous robots that could potentially be used in disaster situations. In the last challenge before the finals, the robots were required to perform eight tasks, one at a time, within 30 minutes. In the finals, each team's robot will have an hour to work through a simulated disaster, stringing all of the tasks together in one overall scenario. The final challenge will ask each robot to drive a car, climb stairs, use a drill, and turn a valve. The WPI team wrote about 1 million lines of code to run its humanoid robot, and has already published several papers addressing the robot's balance and vision. "We need to focus on getting as many tasks done in an hour as we can," says Matt DeDonato, the WPI team's technical project manager. "At this point, our goal is to do them all, but it may make sense to skip some."


The Path to a Wearable Future Lies in Academia
Reuters (05/04/15) Jeremy Wagstaff

Many of the innovative products now available to consumers, such as the smartphone and the Apple Watch, are based on technologies that were developed in academic laboratories years ago. For example, several of the features of the Apple Watch, such as sending doodles and "touches" to other users, were demonstrated in a device called the Connexus, which was developed by researcher Eric Paulos more than a decade ago. Aaron Quigley, chair of Human Computer Interaction at the University of St. Andrews in Scotland, says academics interested in wearable technology developed much of the technology behind today's commercial products 20 years ago. Today, those same academics are working on other futuristic technologies. Paul Strohmeier of Ontario's Human Media Lab, for example, is developing a flexible screen that can wrap around a person's wrist and adapt its display by tracking the user's eyes. Other futuristic interfaces being developed by academics include drawing finger gestures in the air and projecting buttons onto the skin. Still other researchers are developing ways of transmitting tactile sensations through the air or sending aromas to someone's smartphone. "Most 'breakthroughs' today are merely implementations of ideas that were unimplementable in that particular time," says Carnegie Mellon University's Ashwin Ashok. "It took a while for industry to catch up, but now they are almost on par with academic research."


Internet Pioneer Vint Cerf Calls for Rapid Web Security Enhancements
eWeek (05/05/15) Wayne Rash

The need for security was a recurring theme of the remarks made by Internet pioneer and former ACM president Vint Cerf at the National Press Club on Monday. Cerf discussed encryption at length, saying all Internet traffic should be encrypted, as should individual devices. He said passwords should be supplemented with two-factor authentication where possible, and he criticized recent efforts by the U.S. Federal Bureau of Investigation and Justice Department to mandate back doors in encryption methods. However, Cerf, co-recipient in 2004 of the ACM A.M. Turing Award, also acknowledged law enforcement has legitimate needs to access encrypted information, and he said new legislation is needed to resolve the issue. Cerf also approved of recent efforts by the U.S. Federal Communications Commission to regulate Internet providers, but noted a long-term fix in the form of new legislation is still required. On the transition to IPv6, Cerf said adoption of the protocol needs to accelerate to meet the needs of a rapidly growing Internet. In addition, he called for the ubiquitous adoption of techniques and technologies such as BCP-38 and DNSSEC to help improve the security of Internet infrastructure. Finally, Cerf renewed his frequent calls for a free and open Internet, criticizing countries that seek to curtail their citizens' ability to access it.


First Evolutionary History of 50 Years of Music Charts Using Big Data Analysis of Sounds
Queen Mary, University of London (05/05/15)

Evolutionary biologists and computer scientists at Queen Mary University of London (QMUL) and Imperial College London used signal-processing and text-mining techniques to analyze the musical properties of 17,000 songs from the U.S. Billboard Hot 100 charts between 1960 and 2010. The researchers studied the trends in style, the diversity of the charts, and the timing of musical revolutions. The study found the greatest musical revolution occurred in 1991, when hip-hop arrived in the charts, and not 1964, as is the commonly held belief. In addition, the researchers found there is no evidence for a general trend toward homogenization in the modern music charts, and that 1986 was the least diverse year for the charts. The researchers identified these trends using a system that automatically grouped the songs by patterns of chord changes and tone in order to statistically identify patterns with an unprecedented degree of consistency. "We can actually go beyond what music experts tell us, or what we know ourselves about them, by looking directly into the songs, measuring their makeup, and understanding how they have changed," says QMUL researcher Matthias Mauch.
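
The grouping step can be pictured with a small sketch: represent each song by counts of its chord changes and fit a topic model over those counts. This is only an illustration with invented chord data and scikit-learn; the study's actual feature extraction and model are more elaborate.

    # Minimal sketch: group songs by patterns of chord changes
    # (hypothetical data; not the QMUL study's actual pipeline).
    from collections import Counter
    from sklearn.feature_extraction import DictVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    songs = {
        "song_a": ["C", "F", "G", "C", "F", "G"],
        "song_b": ["Am", "F", "C", "G", "Am", "F"],
        "song_c": ["C", "G", "Am", "F", "C", "G"],
    }

    def change_counts(chords):
        # Count chord-to-chord transitions (bigrams), i.e., the
        # "patterns of chord changes" songs are grouped by.
        return Counter(zip(chords, chords[1:]))

    vec = DictVectorizer()
    X = vec.fit_transform(
        [{f"{a}->{b}": n for (a, b), n in change_counts(c).items()}
         for c in songs.values()])

    # Each topic is a recurring bundle of chord changes; a song is a
    # mixture of topics, and topic prevalence can be tracked by year.
    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    for name, mix in zip(songs, lda.fit_transform(X)):
        print(name, mix.round(2))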


New Gold Standard Established for Open and Reproducible Research
University of Cambridge (05/04/15) Sarah Collins

University of Cambridge computer science researchers have shared more than 200 GB of data and 20,000 lines of code with the hope that this type of openness will be adopted by other fields, increasing the reliability of research results. The field of computer science has embraced open access more than other disciplines, but as more and more organizations publish their results, the reliability of research results has come into question. "Due to commercial sensitivities, corporations are reluctant to make their code and data sets available when they publish in peer-reviewed journals," says Matthew Grosvenor, a Ph.D. student from the university's Computer Laboratory. "But without the code or data sets, the results are irrelevant--we can't know whether an experiment is the same if we try to recreate it." In releasing their data, the Cambridge researchers have gone several steps beyond typical open access standards. All of the experimental figures and tables in the final version of their paper, which describes a new method of boosting data centers' efficiency, are clickable. "We think that this is the way forward for all scientific publications and so we've put our money where our mouth is and done it," Grosvenor says.


Computer Scientists Combine Computer Vision and Brain Computer Interface for Faster Mine Detection
UCSD News (CA) (05/04/15) Ioana Patringenaru

University of California, San Diego researchers have developed a new method that combines computer-vision algorithms with a brain-computer interface to speed the detection of underwater mines in sonar images of the ocean floor. The researchers collected a dataset of 450 sonar images containing 150 inert, bright-orange mines placed in test fields, and trained the computer-vision algorithms on a separate set of 975 images of mine-like objects. As part of the study, the researchers showed six volunteers a complete dataset before it had been analyzed by the computer-vision algorithms. They then ran the image dataset through the mine-detection algorithms, which flagged the images most likely to include mines, and showed the results to subjects outfitted with an electroencephalography (EEG) system programmed to detect brain activity. Subjects reacted to an image thought to contain a mine much faster when the images had already been processed by the algorithms. The algorithms are a series of classifiers working in succession to improve speed and accuracy. Each classifier stage is designed to retain 99.5 percent of the true positives while passing along only 50 percent of the false positives, so the number of false positives falls with each pass.
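
The arithmetic behind that design goal is easy to sketch: if every stage keeps 99.5 percent of true positives and passes only half of the false positives, chaining stages drives false alarms down geometrically while barely touching detections. A back-of-the-envelope illustration in Python (the stage counts are hypothetical):

    def cascade_rates(stages, tpr=0.995, fpr=0.5):
        # Assuming each stage acts independently, overall detection and
        # false-positive rates compound geometrically across stages.
        return tpr ** stages, fpr ** stages

    for k in (1, 3, 5, 8):
        detect, false_pos = cascade_rates(k)
        print(f"{k} stage(s): {detect:.1%} of mines kept, "
              f"{false_pos:.2%} of false positives remain")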


New Methods for Realistic Surface Rendering in Computer Games
Vienna University of Technology (05/04/15)

Researchers at the Vienna University of Technology's (TU Wien) Institute of Computer Graphics and Algorithms have developed a mathematical technique that makes rendered surfaces appear much more realistic by accounting for the light scattering that occurs below the surface. The technique, called the separable subsurface scattering (SSSS) method, calculates the scattering of a single beam of light below the surface of an object. "With this result we can create a simple filter profile, which can then be applied to the images again and again," says TU Wien researcher Christian Freude. A traditional computer image is then created and modified with the SSSS method, improving the appearance of the surfaces. "The final version of our method only takes half a millisecond per image in full-HD resolution, on standard commodity hardware," adds TU Wien researcher Karoly Zsolnai. The researchers say their method reduces the modification of a two-dimensional image to two one-dimensional calculations, which saves computing time but still yields very convincing results. "This reduction of the dimensionality has been achieved using various mathematical methods, ranging from exact integration and numerical optimization to user-driven color-profile modeling," says TU Wien researcher Thomas Auzinger.
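
The dimensionality reduction can be sketched in a few lines: a full two-dimensional filter over the image is replaced by one horizontal and one vertical one-dimensional pass. This toy NumPy version uses an invented diffusion profile and a random image, and omits everything that makes the published SSSS method production-ready:

    # Minimal sketch of a separable two-pass filter (assumed profile
    # and image; not TU Wien's actual implementation).
    import numpy as np

    def separable_filter(image, profile):
        # Pass 1: filter each row; pass 2: filter each column of the
        # result. Two O(k) passes stand in for one O(k^2) 2D filter.
        horiz = np.apply_along_axis(
            lambda row: np.convolve(row, profile, mode="same"), 1, image)
        return np.apply_along_axis(
            lambda col: np.convolve(col, profile, mode="same"), 0, horiz)

    # Hypothetical diffusion profile: weight falls off with distance,
    # mimicking light that scatters below the surface before exiting.
    profile = np.array([0.05, 0.1, 0.2, 0.3, 0.2, 0.1, 0.05])
    frame = np.random.rand(64, 64)      # stand-in for a rendered image
    softened = separable_filter(frame, profile)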


Data Fusion Heralds City Attractiveness Ranking
Technology Review (05/01/15)

The Massachusetts Institute of Technology's Stanislav Sobolevsky and several colleagues used big data analysis to develop a method of ranking a city's attractiveness to visitors. The researchers' study focused on cities in Spain and drew on three large data sets: about 17 million anonymized credit and debit card transactions carried out by foreign visitors to Spanish cities in 2011, 3.5 million photos and videos taken in Spain by foreign visitors and posted to Flickr between 2005 and 2014, and 700,000 geotagged tweets posted from Spain by foreign visitors during 2012. The researchers, who defined a city's attractiveness as the total number of pictures, tweets, and transactions that took place there, say they learned several things from these rankings. First, they found a city's attractiveness scaled superlinearly with its size, meaning it grew faster than the population itself, so larger cities attracted disproportionately more photos, tweets, and transactions. However, there were deviations from this pattern. For example, some cities known to be popular destinations for retired tourists from Europe saw high numbers of card transactions, but few photos or tweets. The study was not able to clearly determine what was bringing foreign visitors to a given city, whether tourism, business, or some other reason.
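
Superlinear scaling means attractiveness A grows as roughly c * P^beta with beta > 1, where P is population. A quick way to estimate the exponent is a log-log regression; the figures below are invented for illustration, not the study's data:

    import numpy as np

    # Hypothetical populations and combined counts of photos, tweets,
    # and card transactions for five cities.
    population = np.array([50_000, 200_000, 800_000, 3_000_000, 6_000_000])
    attractiveness = np.array([1_200, 7_000, 40_000, 220_000, 520_000])

    # Fit log A = beta * log P + log c; beta > 1 indicates superlinear
    # scaling, i.e., big cities attract disproportionately more.
    beta, log_c = np.polyfit(np.log(population), np.log(attractiveness), 1)
    print(f"estimated scaling exponent beta = {beta:.2f}")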


UC3M Creates Tool for Monitoring Brands on Twitter
Carlos III University of Madrid (Spain) (04/30/15)

A monitoring tool developed at Carlos III University of Madrid (UC3M) will enable brands to test their strategies on social media. The extractor detects different terms in tweets and creates a specific corpus for analyzing them, according to researcher Angel Garcia Crespo. The tool is designed to consider four indicators: popularity, engagement, reach, and the quantity and variety of the content in tweets. Brands will be able to determine what people are talking about on Twitter, what users think, and who is being talked about the most on the social network. Experts have validated the initial results of the system, and UC3M has already presented a ranking based on an analysis of the social media use of 30 makes of cars. "Moreover, we found that among the 10 most powerful makes on social networks according to the ranking, eight of them are also the highest-selling," says UC3M Institute for Business Development's Nora Lado. The results generally suggest brands that make a greater effort obtain greater engagement. The researchers note this information is very important, as success on social networks does not rely so much on the number of followers as on the ability to generate interactions and create a solid community around the brand.
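
As a hedged sketch of what those four indicators might look like when computed from raw tweets (all field names and example values here are hypothetical, not the UC3M tool's actual schema):

    tweets = [
        {"brand": "make_a", "retweets": 12, "likes": 40, "author_followers": 2_000},
        {"brand": "make_a", "retweets": 3,  "likes": 9,  "author_followers": 500},
        {"brand": "make_b", "retweets": 1,  "likes": 2,  "author_followers": 150},
    ]

    def brand_indicators(tweets, brand):
        mine = [t for t in tweets if t["brand"] == brand]
        return {
            "quantity": len(mine),                              # content volume
            "popularity": sum(t["likes"] for t in mine),        # endorsements
            "engagement": sum(t["retweets"] + t["likes"] for t in mine),
            "reach": sum(t["author_followers"] for t in mine),  # potential audience
        }

    print(brand_indicators(tweets, "make_a"))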


6 Big Data Projects to Aid Disaster Response
Government Computer News (05/04/15) Mark Pomerleau

The U.S. National Science Foundation (NSF) and the Japan Science and Technology Agency, which want to support efforts to transform disaster management with big data and analytics, have announced a program to jointly fund six U.S.-Japan research projects. Researchers at Johns Hopkins University and the University of Tokyo will develop new olfactory search algorithms that use sensors to identify sources of pollutants or other agents released into the air or sea. A team from Temple University and Japan's University of Aizu will design smartphone-based ad hoc emergency networks that can evolve as a disaster unfolds. Researchers from Arizona State University and Japan's National Institute of Informatics will explore resilient networks, social media mining, and information dissemination during disasters. The other projects will focus on a human-centered situation-awareness platform for disaster response and recovery; data-driven critical information exchange in disaster-affected public-private networks; and efficient and scalable collection, analytics, and processing of big data for disaster applications. If big data and analytics are to be used, updated methods to "analyze large, noisy, and heterogeneous data in order to facilitate timely decision-making in the face of shifting demands" are necessary, according to NSF.


A Mouse That Beats the Gamers at Super-Quick Motions
Swiss Federal Institute of Technology in Lausanne (04/29/15)

A new computer mouse developed by Arash Salarian at the Swiss Federal Institute of Technology in Lausanne looks and responds like a conventional mouse but offers nearly unlimited tracking speed. The supermouse is based on an algorithm that combines an optical sensor with a system based on accelerometers and gyroscopes. The algorithm enables seamless switching between the optical and inertial modes, with low-speed tracking handled by the optical sensor and high-speed tracking by the inertial sensors. A 32-bit microcontroller calculates the trajectory every millisecond and allows the inertial system to take over beyond a certain speed limit. The mouse is five times faster than mice that rely on optical sensors alone, and Salarian says "it is very likely that it supports an even greater speed. The limit we measured only corresponds to the upper limit of test instruments." Salarian designed the mouse with gamers in mind, and blogs and specialized sites report it makes the action much more fluid.
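
The switching logic can be sketched in a few lines: trust the optical displacement while it is reliable, and integrate the inertial readings once the speed exceeds the optical sensor's limit. The threshold, units, and structure below are invented for illustration; the real firmware runs on a 32-bit microcontroller at a 1 ms update rate, as described above.

    OPTICAL_LIMIT = 2.0  # hypothetical top speed the optical sensor tracks, m/s
    DT = 0.001           # one update per millisecond, as described above

    def fuse_step(optical_delta, accel, velocity):
        """Return (new_velocity, position_delta) for one 1 ms update."""
        if abs(optical_delta) / DT < OPTICAL_LIMIT:
            # Optical mode: use the measured displacement directly and
            # re-anchor the velocity estimate to it (no integration drift).
            return optical_delta / DT, optical_delta
        # Inertial mode: integrate acceleration to carry tracking past
        # the optical sensor's speed limit.
        velocity += accel * DT
        return velocity, velocity * DT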


Smart Phones Do Not Keep Secrets
UNM Newsroom (05/01/15) Carolyn Gonzales

University of New Mexico architecture professor Alex Webb has been using data and analytics for years to inform his designs. He says he seeks to gather environmental data, urban behavioral data, and other data sets and use evolutionary intelligence and algorithmic models to integrate them into architectural designs. One of Webb's current projects is what he calls the Big Data Wall, a wall onto which data harvested from cellphones in the area is projected. Webb says the goal of the project is to show people just how easy it can be to obtain information about them, such as their social media and online shopping activity. In the case of the Big Data Wall, the data is gathered using an app that participants voluntarily download. Another of Webb's recent projects involved mining a variety of data, including geolocated tweets, to help determine the optimal location for a new public art project in Albuquerque. Using the data, Webb and his students were able to identify the Kirtland area as being "ripe for an influx of public art." Webb also has used environmental data to influence the design of buildings, including a Mars rover testing facility.


Crawling Toward a Wiser Web
American Scientist (05/15) Vol. 103, No. 3, P. 184 Brian Hayes

When search engines respond to a query, they do so not by searching the live Web, but by searching a local replica of it called a crawl. Carrying out and storing a crawl of the Web has historically required access to significant storage and computing resources, but the Common Crawl Foundation, established in 2007, is trying to change that. Through its Common Crawl, the foundation seeks to give the public direct access to crawls of the Web. The latest Common Crawl was conducted in January 2015 and covered 1.8 billion Web pages. The Common Crawl is hosted on Amazon Web Services, and anyone with sufficient storage capacity can download the entire 139 terabytes for their own use. The Common Crawl website features more than 30 publications detailing studies that make use of Common Crawl data. Linguists have been among the most enthusiastic adopters, using the crawl to collect millions of examples of the same text rendered in different languages, which can serve as raw material for automatic translation. Other researchers have used the Common Crawl to analyze duplication of content on the Web, the frequency with which various numbers appear online, and the interconnectedness of the Web.
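
Common Crawl distributes its crawls as WARC archives, so working with the data locally can look roughly like the sketch below. It assumes the warcio Python package and a previously downloaded archive; the filename is hypothetical.

    from warcio.archiveiterator import ArchiveIterator

    with open("segment.warc.gz", "rb") as stream:  # hypothetical local file
        for record in ArchiveIterator(stream):
            if record.rec_type == "response":      # an archived HTTP response
                url = record.rec_headers.get_header("WARC-Target-URI")
                body = record.content_stream().read()
                print(url, len(body), "bytes")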


Abstract News © Copyright 2015 INFORMATION, INC.
Powered by Information, Inc.

