Association for Computing Machinery
Welcome to the September 23, 2013 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets, iPhones, and iPads.

HEADLINES AT A GLANCE


Better Models for Studying the Flow of Information in Networks
Umea University (Sweden) (09/19/13)

Umea University's Atieh Mirshahvalad has developed a group formation model for social systems. "This model shows that social groups are formed as a result of feedback between social ties, e.g., friendship, and individuals' intrinsic interest," Mirshahvalad says. She notes the model assesses the significance of communities in sparse networks, which are inherently sensitive. "I show the result of our method on the sparse network of the European Court of Justice case law, for example, to detect significant areas of law," Mirshahvalad says. In addition, she focused on the simple spreading models that researchers often use to understand how information flows through a network. Mirshahvalad found that no idea is independent of other ideas, and waves of new information or technology necessarily interact with one another as they move through society. To account for these interactions, Mirshahvalad says she used a simple model in which different information waves interact with each other based on their novelty and analyzed the global effects of these interactions.
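For readers who want a feel for how interacting information waves can be modeled, the sketch below is a minimal, hypothetical illustration, not Mirshahvalad's actual model: two ideas spread over a random graph, and a node's chance of adopting an idea decays with the idea's age, so a newer wave can overtake an older one. The graph, parameters, and decay rule are all invented for the example.

```python
# Minimal sketch (not the paper's actual model): two "information waves" spread
# over a random graph; a node adopts a newer idea from a neighbor with a
# probability that decays as the idea ages, so the waves interact through novelty.
import random

random.seed(1)

N, p_edge = 200, 0.03
neighbors = {i: set() for i in range(N)}
for i in range(N):
    for j in range(i + 1, N):
        if random.random() < p_edge:
            neighbors[i].add(j)
            neighbors[j].add(i)

state = {i: None for i in range(N)}   # state[node] = idea id currently held, or None
births = {0: 0, 1: 5}                 # idea 0 starts at t=0, idea 1 enters at t=5
state[0] = 0
decay = 0.15                          # how fast novelty (and adoption probability) fades

for t in range(1, 60):
    if t == births[1]:
        state[1] = 1                  # the second wave enters at a different node
    updates = {}
    for node, idea in state.items():
        if idea is None:
            continue
        novelty = max(0.0, 1.0 - decay * (t - births[idea]))
        for nb in neighbors[node]:
            held = state[nb]
            # adopt if the neighbor holds nothing, or switch to a strictly newer idea
            newer = held is None or births[idea] > births[held]
            if newer and random.random() < novelty:
                updates[nb] = idea
    state.update(updates)

counts = {k: sum(1 for v in state.values() if v == k) for k in (0, 1, None)}
print("final adoption counts:", counts)
```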


Linux Kernel Luminaries Talk Enterprise, Embedded and Why They're Coming Together
Network World (09/18/13) Jon Gold

Although developing the Linux kernel is a difficult and complicated process, it also is one that is moving ahead with some speed. One of the central issues at the LinuxCon North America conference was a discussion of present-day kernel development. The final keynote address discussed the proliferation of Linux-based mobile devices, making the embedded development branch much more important than it has been. However, although kernel development may still be somewhat divided, the overlap between kernel and embedded code is becoming more pronounced, according to Linux stable branch maintainer Greg Kroah-Hartman. In addition, the kernel is still likely to be important to future embedded and mobile development, says Linux creator Linus Torvalds. "The reason Linux runs really well on cellphones is that cellphones grew up," he notes. "They're already thousands of times more powerful than the original machine that Linux came to be on." One aspect of kernel development that could help in bringing in new innovation is the diverse nature of the work. "The kernel, in many respects, has more opportunities for new people to come in than any other open source project," Torvalds says.


EC to Present ICT Education Overhauls to Create Future Facebook and Google
V3.co.uk (09/19/13) Dan Worth

The European Commission is outlining plans to improve information and communications technology (ICT) education in Europe to better prepare students to work in technology-related fields. European commissioner Neelie Kroes says a shortage of skilled teachers and equipment is making ICT education inadequate, thereby limiting Europe's innovation potential. "The fact is, ICT enables a whole new way of learning," Kroes says. "Information is no longer locked up; there is an open world out there for all to explore. Open resources that enable a million different ways to learn. …If we enable that there's a huge opportunity." Another important change, according to Kroes, is her plan to end roaming charges to ensure that young people are not cut off by expensive fees. With these changes, Europe could lead the next wave of technology innovation and produce global tech heavyweights, she says. "Why shouldn't the next Facebook, the next Google, the next Kickstarter be European?" Kroes asks. "I think they can. We have the tools, we have the technology, we definitely have the talent. And in a connected continent there is no limit to our ambitions."


A Computer Scientist's Approach to Medicine
MIT News (09/18/13) Abby Abazorius

Massachusetts Institute of Technology computer scientist Stephanie Seneff applies her natural language processing research to solving biological problems. Seneff began using computer science in human biology by comparing the side effects of statin drugs to other drugs, which led her to focus on the relationship between nutrition and health. Seneff then modified a natural language processing algorithm that she and her colleagues had developed to study the interaction between health and environmental factors such as drugs and toxic chemicals. The algorithm analyzes research papers and testimonials of drug side effects for notable statistical frequencies of key symptom-related words and biological terms. Seneff next correlates keywords that appear together in the same paragraph or the same drug side effect report, and applies the results to research literature to link symptoms documented in the side effect reports, biologically-related words in research literature, and environmental toxins or medications used by the selected population. Seneff's research has linked herbicides to problems with the body's gut bacteria, essential amino acid loss, and interference with necessary enzymes, which could result in conditions such as obesity, Alzheimer's, and autism.
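As a rough illustration of the general technique described above, and not Seneff's actual algorithm, the sketch below counts how often symptom terms and biological terms appear in the same paragraph and ranks the pairs against a chance baseline; the term lists, documents, and scoring rule are invented for the example.

```python
# Illustrative sketch only (not Seneff's algorithm): count how often symptom-related
# terms and biological terms co-occur in the same paragraph, then rank term pairs
# by how far their co-occurrence exceeds what chance would predict.
from collections import Counter
from itertools import product
import math

symptom_terms = {"fatigue", "nausea", "memory loss"}
biology_terms = {"gut bacteria", "amino acid", "enzyme"}

def paragraph_counts(paragraphs):
    pair_counts, term_counts = Counter(), Counter()
    for para in paragraphs:
        text = para.lower()
        present_s = {t for t in symptom_terms if t in text}
        present_b = {t for t in biology_terms if t in text}
        term_counts.update(present_s | present_b)
        pair_counts.update(product(present_s, present_b))
    return pair_counts, term_counts

def ranked_associations(paragraphs):
    pair_counts, term_counts = paragraph_counts(paragraphs)
    n = len(paragraphs)
    scores = {}
    for (s, b), c in pair_counts.items():
        expected = term_counts[s] * term_counts[b] / n      # chance co-occurrence
        scores[(s, b)] = math.log((c + 1) / (expected + 1)) # smoothed PMI-style score
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

docs = [
    "Patients reported fatigue and memory loss; enzyme activity was reduced.",
    "Changes in gut bacteria were associated with fatigue in this cohort.",
    "Amino acid levels were normal.",
]
print(ranked_associations(docs))
```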


Cybersecurity an Occupation, Not a Profession, Says Report
FierceGovernmentIT (09/18/13) David Perera

A new study concludes that cybersecurity is still an emerging field, and threats change too quickly for the federal government to undertake its professionalization. The report from a National Academy of Sciences panel commissioned by the U.S. Department of Homeland Security stated that professionalization means the formation of standards governing ethics and education as well as an expectation that workers have certification to prove minimum levels of knowledge. Cybersecurity comprises "a variety of contexts, roles, and occupations and is too broad and diverse to be treated as a single occupation or profession," the report said. The task force allowed for only one possible cybersecurity sub-field: digital forensics examiners. "The work is comparatively narrowly defined by procedures and law, the relevant domain of expertise appears to be sufficiently narrow, and the appropriate professionalization mechanism is clear," the report said, noting that such conditions are absent elsewhere in cybersecurity.


Suspect NIST Crypto Standard Long Thought to Have a Back Door
Government Computer News (09/17/13) Kevin McCaney

The U.S. National Institute of Standards and Technology is strongly advising against using the Dual Elliptic Curve Deterministic Random Bit Generator (Dual_EC_DRBG), an elliptic curve-based standard for generating random numbers. Adopted in 2006, Dual_EC_DRBG was long suspected by cryptographers to have a back door, but The New York Times recently reported that the U.S. National Security Agency installed a back door during the standard's development. ProPublica reported that in 2006 researchers in the Netherlands published a paper saying the algorithm was insecure and could be hacked from an ordinary PC. The following year, cryptographer Bruce Schneier wrote about the algorithm's slow speed and a small bias in its output, favoring some numbers over others and making them more predictable. He also pointed to a paper by Dan Shumow and Niels Ferguson, presented at the CRYPTO 2007 conference, showing that a second, secret set of numbers could be related to the numbers used to generate the elliptic curve, and noting that the algorithm could be exploited by anyone who knew that second set of numbers. Such issues, coupled with its slow speed, would appear to make it unlikely that Dual_EC_DRBG is widely used.
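To see why the Shumow-Ferguson observation matters, the toy sketch below mirrors the algebra of the suspected trapdoor using plain modular arithmetic. The real Dual_EC_DRBG works over elliptic curve points and truncates its output bits, and this stand-in is trivially breakable even without the trapdoor, so it only illustrates the structural relationship; all the constants are invented.

```python
# Toy analogue only: the real Dual_EC_DRBG uses elliptic curve points and truncates
# the output's top bits; this sketch uses plain modular arithmetic and is trivially
# breakable even without the trapdoor. It only shows the algebraic shape of the
# concern: if the two public constants are secretly related by P = d*Q, then d
# converts one output into the generator's next internal state.

p = 2_147_483_647          # prime modulus standing in for the curve's group
d = 123_456_789            # the hypothetical trapdoor value
Q = 48_271                 # public constant
P = (d * Q) % p            # public constant, secretly chosen as d*Q

def drbg_step(state):
    """One step of the toy generator: update the state with P, emit an output with Q."""
    new_state = (state * P) % p    # stands in for x(state * P) on the curve
    output = (new_state * Q) % p   # stands in for x(new_state * Q), untruncated here
    return new_state, output

# The victim produces two outputs from a secret seed.
secret_seed = 987_654_321
s1, out1 = drbg_step(secret_seed)
s2, out2 = drbg_step(s1)

# An attacker who knows d turns the first output into the next internal state:
# d * (s1 * Q) = s1 * (d * Q) = s1 * P, which is exactly s2.
recovered_state = (d * out1) % p
predicted_out2 = (recovered_state * Q) % p

print(recovered_state == s2, predicted_out2 == out2)   # True True
```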


Gold in the Data, but a Shortage of Miners
Federal Computer Week (09/17/13) Mark Rockwell

As the government and private industry increasingly strive to mine huge quantities of data for valuable information, a shortage of big data experts is likely to persist. "I don't see the gap narrowing. Universities aren't producing enough. We have 80 grads per year," says North Carolina State University director of advanced analytics Michael Rappa. "We could be producing 800 per year and still not meet demand. With each class, the demand goes up." Although the term data analyst can apply to workers ranging from scientists to data entry personnel, the master of science in analytics degree integrates math, analytics, business, and science to enable students to optimize information use. While over two dozen universities have launched data science programs in the past three years, demand is significantly outpacing supply. Only approximately 300 data specialists graduate each year from U.S. universities, says Northwestern University program director in analytics Diego Klabjan. A McKinsey Global Institute study predicts a shortfall of between 140,000 and 190,000 data analytics experts in the United States by 2018. Federal agencies are likely to have difficulty competing with private industry to hire data specialists, due to a lack of flexibility in pay and benefits and the time required to hire new workers.


Inspired by Ruby on Rails, Grails to Go Beyond Web App Dev
InfoWorld (09/17/13) Paul Krill

Grails has been available since 2007, and has been used to deploy Netflix to the Amazon Web Services cloud, as well as for the Hudki search engine for real estate and used cars. In an interview, Grails project lead Graeme Rocher of Pivotal discusses where the open source Web development framework is headed. Inspired by Ruby on Rails, Grails leverages the Java Virtual Machine and the Groovy language. The headline feature for Grails 2.3 is comprehensive REST support on the server side, but there are new application programming interfaces for supporting asynchronous programming as well, Rocher says. There will be a minor 2.4 release, which supports Spring 4.0 and Groovy 2.2, before the arrival of version 3.0 next year. "With Grails 3, we're looking at extending the reach of Grails to target other [deployment] destinations [besides Java application servers], so things like batch processing applications, Hadoop, event-driven systems," Rocher says. The project will introduce the concept of an application profile, or a set of plug-ins that allows users to target a particular deployment environment.


Sweet! Tiny, Far-Flying Robot Based on Honeybee
Computerworld (09/17/13) Sharon Gaudin

University of Queensland researchers are developing a flying robot based on the honeybee. The honeybee is a fuel-efficient flyer that uses its eyes, antennae, and abdomen to direct itself and to fly using as little energy as possible. "The bees are living proof that it's possible to engineer airborne vehicles that are agile, navigationally competent, weigh less than 100 milligrams, and can fly around the world using the energy given by an ounce of honey," says Queensland professor Mandyam Srinivasan. "Honeybees often have to travel very long distances with only a small amount of nectar, so they have to be as fuel-efficient as possible." Honeybees use vision to judge their air speed and then move their abdomens to make their bodies more fuel-efficient. The honeybee also uses its antennae to judge the speed of the air flowing past its body. "A better understanding of how honeybees fly takes us one step further towards perfecting the flying machines," Srinivasan says.
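As a purely hypothetical illustration of the general principle, and not the Queensland team's controller, the sketch below uses a noisy visual estimate of airspeed to adjust a streamlining variable that stands in for the bee's abdomen position, reducing a toy drag term; every quantity in it is invented.

```python
# Purely illustrative toy (not the researchers' control law): estimate airspeed from
# a vision-like, noisy measurement, then streamline the "abdomen" in proportion to
# that estimate so drag, which grows with speed, is reduced.
import random

random.seed(0)

def visual_speed_estimate(true_speed):
    """Noisy speed estimate, standing in for a visual measurement."""
    return true_speed * (1.0 + random.gauss(0, 0.05))

def drag(speed, streamlining):
    """Toy drag model: quadratic in speed, reduced by streamlining in [0, 1]."""
    frontal_area = 1.0 - 0.4 * streamlining
    return 0.5 * frontal_area * speed ** 2

speed, streamlining, gain = 2.0, 0.0, 0.3
for step in range(20):
    est = visual_speed_estimate(speed)
    # move streamlining toward a level proportional to the estimated speed
    target = min(1.0, gain * est)
    streamlining += 0.5 * (target - streamlining)
    print(f"step {step:2d}: est speed {est:.2f}, streamlining {streamlining:.2f}, "
          f"drag {drag(speed, streamlining):.2f}")
```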


On the Road to Fault-Tolerant Quantum Computing
Berkeley Lab News Center (09/16/13) Lynn Yarris

Researchers at the Lawrence Berkeley National Laboratory and Tsinghua University have demonstrated, for the first time, high-temperature superconductivity in the surface of a topological insulator. They say the breakthrough could lead to the creation of a prerequisite for fault-tolerant quantum computing, a quasiparticle known as the "Majorana zero mode." "We have shown that by interfacing a topological insulator, bismuth selenide, with a high temperature superconductor, BSCCO (bismuth strontium calcium copper oxide), it is possible to induce superconductivity in the topological surface state," says Berkeley Lab's Alexei Federov. Although the researchers have not yet identified a Majorana zero mode in their bismuth selenide/BSCCO heterostructures, they believe the material system is fertile ground for finding one. "Our studies reveal a large superconducting pairing gap on the topological surface states of thin films of the bismuth selenide topological insulator when grown on BSCCO, [which] suggests that Majorana zero modes are likely to exist, bound to magnetic vortices in this material, but we will have to do other types of measurements to find it," Federov says.


Researchers Use Machine Learning to Boil Down the Stories That Wearable Cameras Are Telling
University of Texas at Austin (09/12/13)

Computer scientists at the University of Texas at Austin are using machine learning to create tools that will help manage the large quantities of video data generated by wearable camera technology. "The amount of what we call 'egocentric' video, which is video that is shot from the perspective of a person who is moving around, is about to explode," says UT Austin professor Kristen Grauman. "We’re going to need better methods for summarizing and sifting through this data." The new method, called story-driven video summarization, takes lengthy videos and condenses them into short clips that capture the main story. Using machine learning, the team taught the system to score the significance of objects based on factors such as whether the user touched them and the frequency with which they appeared in the center of the frame. The system then compiles high-scoring frames and looks for earlier frames that impact later frames, using a method adapted from a Carnegie Mellon University technique. Beyond everyday video users, the researchers say video summarization has potential applications for niche markets such as law enforcement and people with memory loss.
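The sketch below is a simplified, hypothetical illustration of the kind of scoring the article describes, not the UT Austin system itself: each frame is scored by whether its objects were touched by the wearer and how close they sat to the center of the frame, and the highest-scoring frames are kept in temporal order as the summary. The data structures and weights are invented for the example.

```python
# Simplified sketch of the general idea (not the UT Austin method): score each frame
# by importance cues for the objects it contains, then keep the highest-scoring
# frames, in temporal order, as a short summary.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    touched: bool       # was the wearer interacting with the object?
    center_dist: float  # 0 = dead center of the frame, 1 = at the edge

def frame_score(detections):
    """Higher when objects are handled by the wearer and appear near the center."""
    score = 0.0
    for det in detections:
        score += (2.0 if det.touched else 1.0) * (1.0 - det.center_dist)
    return score

def summarize(frames, keep=3):
    """frames: list of (timestamp, [Detection, ...]); returns top-scoring timestamps."""
    scored = sorted(frames, key=lambda f: frame_score(f[1]), reverse=True)[:keep]
    return sorted(ts for ts, _ in scored)   # restore temporal order

video = [
    (0.0, [Detection("mug", touched=True, center_dist=0.1)]),
    (1.5, [Detection("door", touched=False, center_dist=0.8)]),
    (3.0, [Detection("keys", touched=True, center_dist=0.2),
           Detection("phone", touched=False, center_dist=0.4)]),
    (4.5, []),
    (6.0, [Detection("laptop", touched=True, center_dist=0.3)]),
]
print(summarize(video))  # timestamps of the frames chosen for the summary
```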


Supercomputer Boosted With Graphic Processors
ETH Life (09/12/13) Simone Ulmer

The Swiss National Supercomputing Center (CSCS) is developing Piz Daint, a new supercomputer system that will provide the necessary compute performance while consuming less power than the CSCS' current flagship supercomputing system, Monte Rosa. Piz Daint is being extended with graphics processing units (GPUs). As part of the extension, one of the two conventional processors (CPUs) located on each compute node is being replaced by a GPU. The new supercomputer also derives much of its overall performance and efficiency from a novel interconnecting network between compute nodes. "Given the ever-growing demands of computer models, we can only contain energy consumption in supercomputing with a radical change in computer architecture," says CSCS director Thomas Schulthess. Initial testing shows that a climate simulation on Piz Daint runs more than three times faster and reaches a solution while consuming seven times less energy than on Monte Rosa. Although accelerator processors often remain unused in real simulations because they cannot be used effectively with legacy computer codes and algorithms, several computing centers have recently implemented them successfully in supercomputers, according to Schulthess.


Abstract News © Copyright 2013 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe