Association for Computing Machinery
Welcome to the September 28, 2015 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets, iPhones, and iPads.

HEADLINES AT A GLANCE


Gender Gap Widens in Security Field Long Dominated by Men
Reuters (09/28/15) Jim Finkle

The number of female information security professionals has declined over the last two years, according to an ISC2 survey of nearly 14,000 cybersecurity professionals. In ISC2's survey two years ago, 11 percent of cybersecurity professionals were women, but women made up only 10 percent of those polled for the latest survey. "It is certainly alarming to see it go down to 10 percent," says ISC2's Elise Yacobellis. The survey also found pay inequalities between men and women in the field: although 47 percent of men reported annual salaries of at least $120,000, only 41 percent of women reported salaries at that level. Sixty-two percent of respondents also said their organizations did not have enough security professionals, and Yacobellis says correcting the gender imbalance could help address this labor shortage. Joyce Brocaglia, founder of the Executive Women's Forum for information security professionals, says although many companies claim they want to hire more women in information security positions, "I don't often see them making the investments they need to ensure that."


Scientists Stop and Search Malware Hidden in Shortened URLs on Twitter
Engineering & Physical Sciences Research Council (09/25/15)

Cardiff University researchers have developed a technique for detecting tweets containing malicious links. The researchers note hackers frequently exploit public excitement around events such as sports tournaments by crafting tweets that appear to relate to the event but instead link to malicious websites hosting "drive-by download" attacks. Determining which links are malicious is difficult because of the URL-shortening services Twitter users ubiquitously rely on. The researchers studied tweets containing URLs collected during the 2015 Super Bowl and the cricket world cup finals, and monitored the interactions between the linked URLs and a user's device to identify the signatures of a malicious attack. These include the creation of processes and modification of registry fields, as well as certain patterns of processor use and network adapter status. The researchers used this data to train a machine classifier to recognize predictive signals that identify a malicious URL. During testing, they identified potentially malicious tweets with 83-percent accuracy within five seconds of a user clicking a link, and with 98-percent accuracy within 30 seconds. The researchers plan to stress-test their tool during next summer's European Football Championships. They presented the study at the 2015 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining in August.
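The pipeline the study describes — observe machine-level behavior after a link is clicked, then train a classifier on those features — can be sketched in miniature. This is a hedged illustration, not the study's actual model: the feature encoding, toy training data, and the simple perceptron learner below are all assumptions chosen for brevity.

```python
# Minimal sketch: train a linear classifier on post-click behavioural
# features. Feature vector (an assumption, loosely following the article):
# [new processes spawned, registry writes, CPU spike flag, adapter change].

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """samples: list of feature vectors; labels: 1 = malicious, 0 = benign."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Toy, made-up observations of clicked links.
train_x = [
    [5, 3, 1, 1],  # malicious: spawns processes, edits the registry
    [4, 2, 1, 0],  # malicious
    [0, 0, 0, 0],  # benign: page renders, nothing else happens
    [1, 0, 0, 0],  # benign
]
train_y = [1, 1, 0, 0]

w, b = train_perceptron(train_x, train_y)
```

The real system used a richer classifier and live behavioral traces; the point here is only the shape of the approach — behavioral features in, malicious/benign decision out.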


Research Sheds New Light on Big Data Breaches
Government Technology (09/23/15) Eyragon Eidam

Researchers at the University of New Mexico (UNM) and Lawrence Berkeley National Laboratory have conducted a study examining recent trends in data breaches. They examined 10 years' worth of data from the Privacy Rights Clearinghouse and found the number of large breaches has decreased slightly since 2005. The study points to a "heavy-tailed distribution," indicating the vast majority of breaches are small, with a few large breaches skewing the average, say UNM researchers Ben Edwards and Stephanie Forrest. The researchers also note the number of accidental and negligent data breaches held steady over the same period. "We tested models in which there was no trend in the size or frequency of breaches, ones that had a linear trend, and models in which the size fluctuated several times over the course of the last decade," they say. "Comparing these models, we found that neither data breach frequency nor size have increased over time." Edwards and Forrest recommend a balanced view of recent events. Under the "Red Queen" hypothesis, both hackers and companies are improving their methods, maintaining a rough balance. "If true, rather than an increase or decrease in breaches, we may have stasis, with neither attackers nor defenders gaining an advantage, even though both are 'running' very hard," the researchers say.
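The "heavy-tailed distribution" point is easy to demonstrate numerically: when a few enormous breaches sit in the tail, the mean breach size lands far above the median, so averages mislead. The simulation below is illustrative only — the lognormal shape and its parameters are assumptions, not the study's fitted model.

```python
# Heavy tails in action: simulated breach sizes (records exposed) drawn
# from a lognormal distribution; parameters are arbitrary, chosen only
# to exhibit the skew the researchers describe.
import random
import statistics

random.seed(42)
breach_sizes = [random.lognormvariate(8, 2.5) for _ in range(10_000)]

mean_size = statistics.mean(breach_sizes)
median_size = statistics.median(breach_sizes)

# The typical (median) breach is far smaller than the mean suggests.
print(f"mean:   {mean_size:,.0f} records")
print(f"median: {median_size:,.0f} records")
print(f"mean is {mean_size / median_size:.1f}x the median")
```

This is why the researchers compare distributional models over time rather than tracking a simple average, which a single mega-breach can dominate.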


Election Polling: When 'Who Will Win?' Beats 'Who Do You Like?'
The Wall Street Journal (09/25/15) Neil King Jr.

Recent poll numbers for U.S. presidential candidates are based on gauges of voter preference, but statisticians and political scientists working with predictive markets rely more on a different question: "Who do you think will actually win?"  Microsoft Research economist David Rothschild says asking for voters' expectations of the actual winner "grabs a much larger slice of people's experience and knowledge, including a whole range of idiosyncratic facts," which otherwise cannot be measured.  Microsoft's PredictWise project aggregates several Internet betting exchanges and prediction markets to create a running tabulation of the likelihood of something happening.  PredictWise puts the chance that Jeb Bush will be the Republican presidential nominee at 37 percent, even though Donald Trump currently leads the running average of national preference polls with 28 percent.  In a 2012 study, Rothschild and economist Justin Wolfers argued voter expectation is the more accurate data point, and a 2013 report on U.S. presidential elections found voter-expectation polls failed to pick the winner "in only 8 percent of 214 surveys conducted from 1932 to 2012."  Last year, Rothschild calculated the expectation question is 10 times more potent than the preference survey.
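The aggregation step can be sketched simply: a contract price on a betting exchange approximates a probability, so one can average prices across markets and renormalize so the candidates sum to one. This is a generic sketch of that kind of aggregation, not PredictWise's proprietary method, and the market prices below are made-up illustrative numbers.

```python
# Combine contract prices from several markets into one probability
# estimate per candidate. Prices and candidates are illustrative only.

def aggregate(markets):
    """markets: list of {candidate: price} dicts from different exchanges."""
    candidates = markets[0].keys()
    avg = {c: sum(m[c] for m in markets) / len(markets) for c in candidates}
    total = sum(avg.values())        # strip the bookmaker's overround
    return {c: avg[c] / total for c in candidates}

markets = [
    {"Bush": 0.38, "Trump": 0.25, "Rubio": 0.20},
    {"Bush": 0.36, "Trump": 0.27, "Rubio": 0.22},
]
probs = aggregate(markets)
```

Real aggregators weight markets by liquidity and correct for known biases such as longshot overpricing; simple averaging is only the starting point.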


Google AI Beats Humans at More Classic Arcade Games Than Ever Before
Tech Republic (09/23/15) Nick Heath

Google's DeepMind artificial intelligence (AI) can now defeat human players in more 1980s arcade games than before, thanks to refinements to its reinforcement-learning software. The improvements let it beat people in 31 games, up from 23 in an earlier iteration, and have brought DeepMind close to the performance of a human expert in games such as Up and Down, Zaxxon, and Asterix. The system is not coached on how to win the 49 Atari games it is exposed to; instead, it plays each game repeatedly over seven days, gradually improving its performance. The AI employs a deep neural network whose linked layers of nodes feed data up to top-level neurons that make the final decision on what DeepMind should do. The system's Deep Q-network is fed raw pixels from each game and infers factors such as the distance between objects on screen. By studying the score achieved in each game, it builds a model of which actions lead to the best results. The updated DeepMind makes fewer mistakes by reducing the odds of overestimating the positive outcome of a specific action.
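The learning rule underneath the Deep Q-network is Q-learning: after each action, nudge the value estimate for that state-action pair toward the observed reward plus the discounted value of the best next action. DeepMind approximates those values with a deep network over pixels; the sketch below shows the same update in its simplest tabular form, on a made-up five-cell corridor rather than an Atari game.

```python
# Tabular Q-learning on a toy corridor: cells 0..4, action 0 = left,
# action 1 = right; reaching cell 4 earns a reward of 1. The environment
# and hyperparameters are illustrative assumptions, not DeepMind's setup.
import random

random.seed(0)
n_states, actions = 5, [0, 1]
Q = [[0.0, 0.0] for _ in range(n_states)]
alpha, gamma, epsilon = 0.5, 0.9, 0.1    # learning rate, discount, exploration

for _ in range(500):                      # episodes
    s = 0
    while s != n_states - 1:
        # Epsilon-greedy: mostly exploit the best-known action, sometimes explore.
        a = random.choice(actions) if random.random() < epsilon \
            else max(actions, key=lambda a: Q[s][a])
        s2 = max(0, s - 1) if a == 0 else s + 1
        r = 1.0 if s2 == n_states - 1 else 0.0
        # Core update: move Q toward reward plus discounted future value.
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2

policy = [max(actions, key=lambda a: Q[s][a]) for s in range(n_states)]
```

After training, the greedy policy heads right from every non-terminal cell — the tabular analogue of DeepMind learning which joystick action maximizes future score.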


Computer Scientist Seeks Stronger Security Shroud for the Cloud
UT Dallas News Center (09/24/15) Chaz Lilly

University of Texas at Dallas computer scientist Zhiqiang Lin will explore virtual machine introspection further thanks to a U.S. National Science Foundation (NSF) Faculty Early Career Development Award. Lin has developed a novel technique that enables one computer in a virtual network to monitor another for intrusions or viruses. With $500,000 in funding provided over five years by NSF, Lin will develop principles and techniques to automatically bridge the semantic gap between the data structures a virtual machine understands and the raw bytes viewable from the outside, which provide little contextual information. Lin says the semantic gap is a challenge because it demands a great deal of manual effort. "It is consequently tedious and time-consuming to enable virtual security for different operating systems and even different versions of the same system," he says. Lin plans to reuse compiled code that already understands the semantics of the virtual machine's data structures. The existing code can process data from the virtual machine and permit an analysis of the activities occurring inside. Lin says his research will be helpful for protecting cloud applications.
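The "semantic gap" is concrete: the same bytes are opaque when read out of guest memory from the outside, and meaningful only once interpreted with the guest's structure layout. A toy illustration — the process-descriptor layout here (pid, state, 8-byte name) is hypothetical, not any real kernel's, and real introspection must first locate these structures in memory.

```python
# The same bytes, with and without the guest's structure layout.
import struct

LAYOUT = "<iH8s"   # little-endian: int32 pid, uint16 state, 8-byte name

def describe(raw_bytes):
    """Bridge the gap: interpret raw guest-memory bytes via a known layout."""
    pid, state, name = struct.unpack(LAYOUT, raw_bytes)
    return {"pid": pid, "state": state, "name": name.rstrip(b"\x00").decode()}

# What an external monitor sees: opaque bytes read out of guest memory.
raw = struct.pack(LAYOUT, 1337, 1, b"sshd")
print(raw)            # opaque: b'9\x05\x00\x00\x01\x00sshd\x00\x00\x00\x00'
print(describe(raw))  # meaningful: {'pid': 1337, 'state': 1, 'name': 'sshd'}
```

Lin's approach sidesteps hand-writing such layouts for every OS version by reusing compiled guest code that already knows how to walk its own structures.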


Immersive 3D Training Experience Unveiled With Potential to Transform Chemical Engineering Industry
Loughborough University (09/23/15) Charlotte Hester

The chemical industry could use immersive environments to train its technicians and staff on safety-critical tasks, such as safe startup and emergency shutdown procedures.  A Loughborough University computer scientist has collaborated with BP and immersive technology specialist Igloo Vision to build an immersive training experience called the Igloo.  The team brought a virtual production plant to life by integrating it with a fully dynamic process simulator and a control room simulator.  The realistic plant environment has a 360-degree projection screen that enables trainees to learn as they go, and creates realistic sound effects that mimic the real-life operation of the plant.  The dynamic process simulator produces the responses expected from the real plant in different safety-critical operation scenarios. "The Igloo presents a fantastic opportunity to bring about a step-change for the chemical engineering industry; it is perfect for training and allows technicians to try out operations which they don't do that often, such as an emergency shutdown," says Loughborough University professor Paul Chung. "A tool like this could also be used to certify that a technician has been trained properly." The team has tested the concept at BP's chemical manufacturing site in Hull, Britain.


Study Suggests London Underground May Be 'Too Fast'
BBC News (09/23/15) Jonathan Webb

A computer model of the London Underground predicts trains that travel too fast will compound overall congestion levels when key locations outside the city center, where travelers switch to another mode of transport, become bottlenecks.  In comparison, the New York City subway system, which is more centralized, gives rise to downtown congestion that would be improved by faster trains.  The researchers found London's system would best operate with trains traveling about 1.2 times faster than the average speed on the roads, making optimum train speed about 13 mph, compared to the current average of 21 mph.  The study reflects the increasing interconnectedness of transport networks and a trend toward multimodal trips, which Marc Barthelemy, a statistical physicist at the CEA research center in Saclay, France, says are unavoidable in large urban centers.  Once models of the London and New York road and subway networks were complete, the researchers linked them based on the proximity between streets and subway stations.  "We create these connections, and then we make an assumption, which is: when someone wants to go from A to B, they look for the quickest path--whatever the mode," Barthelemy notes.  The study is theoretical, as the model includes no passenger data from the transport system itself.
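The modeling assumption Barthelemy states — travelers take the quickest path "whatever the mode" — amounts to shortest-path search over a combined road and subway graph joined by transfer links. A minimal sketch with Dijkstra's algorithm; the tiny network, travel times, and transfer penalties below are made-up illustrative values, not the study's data.

```python
# Quickest path across a multimodal graph: road nodes and tube nodes,
# joined by transfer edges at stations. Edge weights are minutes.
import heapq

def quickest(graph, start, goal):
    """Dijkstra's shortest path over a weighted adjacency-list graph."""
    dist, heap = {start: 0.0}, [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            return d
        if d > dist.get(node, float("inf")):
            continue                      # stale queue entry
        for nxt, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return float("inf")

graph = {
    "A_road": [("B_road", 30), ("A_tube", 5)],  # 5 min to enter the station
    "B_road": [],
    "A_tube": [("B_tube", 10)],                 # fast in-tunnel leg
    "B_tube": [("B_road", 5)],                  # 5 min to exit back to street
}
```

Here the tube route wins (5 + 10 + 5 = 20 minutes versus 30 by road); in the study's model, changing train speed shifts which paths win and therefore where passengers pile up, which is how overly fast trains can push congestion to outlying interchange bottlenecks.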


GitHub Open Sources a Tool That Teaches Students to Code
Wired (09/22/15) Cade Metz

GitHub, which has long been a fixture of the coding world, increasingly is becoming an integral part of coding education. GitHub's John Britton says there are hundreds of thousands of students currently enrolled in GitHub's various education programs, and there are more than 3,000 teachers using GitHub as an education tool. This growing popularity among students and educators has inspired some of them to find ways to make GitHub work better in an educational context. For example, Mark Tareshawty, a senior computer science major and teaching assistant (TA) at Ohio State University, says working as a TA opened him up to the promise GitHub holds as an educational tool, but he found it still had some drawbacks. In particular, he thought sharing assignments on GitHub could be easier. So, with funding from GitHub's Summer of Code program, he developed Classroom for GitHub. The tool enables teachers to invite students onto GitHub to share coding assignments with a single URL, which automatically sets them up to view, modify, and collaborate on code. GitHub released the open source tool earlier this week as part of its GitHub Education service, which provides free code repositories teachers and students can use to post and collaborate on code.


Improving Simulation Tools
CORDIS News (09/22/15)

The Numerical Simulation in Technical Sciences (NUMSIM) project, a four-year initiative funded by the European Union and involving researchers from Europe and Latin America, developed improved simulation tools. NUMSIM simulated corrosion protection, developing meshless methods for optimizing cathodic protection systems. The researchers say they advanced existing boundary element method (BEM) models for cathodic protection. The effort also explored topics such as computer graphics, adaptive coupling, and high-performance computing. Work included advanced BEM models based on the optimization of problems involving multiple materials, plus isogeometric boundary element formulations for use in future models. The researchers also developed software to validate the modeling outcomes, as well as techniques for determining where one modeling method becomes more suitable than another. They also implemented models, algorithms, and other computer-graphics elements that support interactive technical software. The results are applicable to crack propagation in solids, tunnel excavation, and the identification of non-linearities. NUMSIM also developed parallel computing tools for use in large engineering problems.


Diagnostic Apps for ADHD, Dementia?
Jewish Voice (09/23/15) Abigail Klein Leichman

IBM Israel researchers won the top prize at the Brain Inspired Technology for Education (BITE) Hackathon for a prototype application that screens for early indications of attention deficit and hyperactivity disorder (ADHD). The smartphone app asks users to draw a rectangle 10 times while accelerometers and sensors in their phone or wearable device track the movement. The researchers note this process helps identify and classify individuals who have difficulty initiating and maintaining continuous motor activity. The researchers first established a baseline using electroencephalogram headband tests on 17 volunteers, four of whom have ADHD. When the volunteers then tried the app, the team found preliminary evidence of a possible relationship between mental and motoric characteristics within 25 seconds of a subject drawing the rectangles. "The IBM researchers believe this type of analytics can help identify different types of ADHD, not just a single classification of those with difficulties in concentration," says IBM Research. IBM Israel researchers also developed a prototype smartphone technology to detect dementia at an early stage by analyzing voice and speech patterns. Users are asked to perform cognitive tasks such as counting backwards, describing a picture, or identifying words that start with a certain letter. The technology has shown an 85-percent accuracy rate in preliminary trials.


Larry Smarr: Building the Big Data Superhighway
EnterpriseTech (09/25/15) Alison Diana

In an interview, California Institute for Telecommunications and Information Technology founding director Larry Smarr details the Pacific Rim Platform, an effort to create a high-speed virtual big-data superhighway that serves as a separate Internet infrastructure. The program seeks to transmit data at speeds of 10 Gbps to 100 Gbps by leveraging fiber-optic networks. Smarr says the U.S. National Science Foundation-funded Pacific Rim Platform will build on previous investments such as the Corporation for Education Network Initiatives in California (CENIC) network. He thinks construction of the platform is easier on the West Coast because of the existence of CENIC, as well as the abundance of research campuses with a long tradition of collaboration. Smarr also credits the pioneering efforts of the U.S. Energy Department's Energy Sciences network, which showed the feasibility of linking multiple demilitarized zones at various Energy Department labs via a 100 Gbps backbone, as a tech-transfer model for the Pacific Rim Platform. He notes today's Internet cannot support big data, creating the need for an independent high-performance cyberinfrastructure. Smarr cites project management as the effort's most formidable challenge. Among the technologies researchers are deploying across participating campuses are FLASH I/O Network Appliances, which address the problem of terminating a 10 Gbps optical fiber. Smarr says later stages of the project will migrate to software-defined networks.


Abstract News © Copyright 2015 INFORMATION, INC.

