Welcome to the March 2, 2015 edition of ACM TechNews, providing timely information for IT professionals three times a week.
Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets and for iPhones and iPads.
HEADLINES AT A GLANCE
Could IBM's Brain-Inspired Chip Change the Way Computers Are Built?
The Washington Post (03/02/15) Amrita Jayakumar
A team from IBM last week traveled to Washington, D.C., to demonstrate new computer chips they have developed that are inspired by the structure of the brain. The chips grew out of funding from the U.S. Defense Advanced Research Projects Agency and partnerships with several national laboratories. The goal was to create chips that consume little power but have a powerful ability to identify patterns, two weaknesses of current computer architectures. Dharmendra Modha, head of brain-inspired computing at IBM, who received the ACM Gordon Bell Prize in 2009 for his work on cortical simulations, says the company wants to create a "brain in a box" that consumes less than 1 kilowatt of power. The chips could have many applications, ranging from performing biological security checks and monitoring computer networks for malicious activity to piloting autonomous spacecraft and consumer products. The chips also could be used in supercomputers, in particular the new Summit and Sierra supercomputers IBM is developing for the U.S. Energy Department. However, such applications are still a long way off. The chips were invented last year, and the most advanced "brain in a box" IBM has produced with them so far is only as intelligent as the brain of a bee.
U.K. Research Aims for Pervasive Mobile Robotics
The Engineer (United Kingdom) (03/02/15) Julia Pierce
Oxford University researchers are developing situation-aware mobile robotic systems for use in applications such as transport, logistics, space, defense, agriculture, and infrastructure management. The project, led by Oxford professor Paul Newman, aims to create the world's leading research program in mobile autonomy. The researchers plan to develop robots that can interpret data from sensors such as cameras, radars, and lasers in order to form a map of their surroundings. However, the group also wants to reduce the reliance on expensive sensors by developing technologies that lower the cost of autonomy and enable less-expensive sensors to be used. "Surveying is an important part of our work," Newman says. "Many of our algorithms rely on detailed and accurate surveys of the environment to function." The research program also will include several separate Flagship projects covering applications such as personal transport, inspection, and logistics. For example, the researchers will examine autonomous driving and advanced driver-assist technologies, as well as inspection and mapping from moving platforms.
When Driver Error Becomes Programming Error
Inside Science (02/26/15) Joel N. Shurkin
If automated automobiles become practical and widely adopted, car accidents will result from programming errors rather than driver errors, making the assignment of responsibility in litigation a challenge. At a recent meeting of the American Association for the Advancement of Science, Stanford University researchers announced the production of an automated vehicle that can compete with champion amateur drivers on a racetrack. The car uses global-positioning systems, computer-driven controls, and programmed rules to drive and navigate. Stanford professor Chris Gerdes says its computerized thinking process raises important concerns. For example, because such a car is programmed to obey all traffic rules and not violate laws, there may be limitations to its usefulness. One example is a programmed vehicle's inability to cross double lines to get around an illegally parked car, because such a maneuver would technically break the rules. University of South Carolina in Columbia professor Bryant Walker Smith thinks that with the advent of automated cars, the onus of liability will shift more to manufacturers than to consumers, with the costs ultimately passed on to consumers. He also notes that if the cars improve safety, the number of accident-related lawsuits is likely to decline.
UK Researchers Are Building Robotic Pants
Computerworld (02/26/15) Sharon Gaudin
Scientists at the University of Bristol have developed robotic pants with built-in artificial muscles designed to aid the elderly or people with disabilities. The soft robotic clothing gives users added strength and balance to prevent falls and let them move around more easily. The material also can give users bionic strength, helping them stand up, climb stairs, and walk more steadily. The team combined soft robotics with three-dimensional printing and nanotechnology to enable the exoskeleton to work in coordination with the user's own muscles. Bristol researcher Jonathan Rossiter says the technology has the potential to address many of the issues associated with existing devices used by people with mobility problems, as well as to reduce healthcare costs. He says the robotics might even replace crutches, stair lifts, and wheelchairs. "This is the first time soft robotics technologies have been used to address the many rehabilitation and healthcare needs in one single type of wearable device," Rossiter notes.
New Notre Dame Paper Focuses on Degree Centrality in Networks
Notre Dame News (02/25/15) William G. Gilroy
University of Notre Dame researchers have developed the Node Prominence Profile (NPP), a methodology that accurately predicts the future degree centrality of nodes by incorporating both the macroscopic and microscopic properties of a social network. "Our method can be effectively used in a variety of applications that rely on inferring a node's importance in the future; for example, predicting future important customers [to support advertisement strategy] or identifying individuals on the fringes who rise to be central in an adversary's network [early elimination of important targets in a terrorist network]," says Notre Dame professor Nitesh Chawla. The researchers found NPP can significantly outperform current state-of-the-art methods in predicting node degree centrality. "Our method, NPP, is helpful in the prediction of an individual's future importance because it reconciles the trade-offs between two important principles that drive the evolution of social networks--preferential attachment and triadic closure," Chawla says. Although the researchers focused on applying NPP to social networks, they think it also can be applied in biological, disease, protein interaction, food, and transportation networks.
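Degree centrality itself, the quantity NPP forecasts, is simple to compute: a node's degree divided by the maximum possible degree. The following sketch illustrates the measure only, not the NPP prediction method; the edges and node names are invented for illustration.

```python
# Minimal sketch of normalized degree centrality -- the quantity NPP
# predicts -- not the NPP method itself. The toy network is invented.
from collections import defaultdict

def degree_centrality(edges):
    """Return each node's degree divided by (n - 1), the standard
    normalization so values fall in [0, 1]."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    n = len(adj)
    return {node: len(neigh) / (n - 1) for node, neigh in adj.items()}

# A toy four-node network in which node "a" touches every other node.
edges = [("a", "b"), ("a", "c"), ("a", "d"), ("b", "c")]
centrality = degree_centrality(edges)
print(centrality)  # "a" scores 1.0; the fringe node "d" scores 1/3
```

NPP's task is harder than this snapshot computation: it predicts how these scores will change as the network evolves under preferential attachment and triadic closure.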
QR Codes Engineered Into Cybersecurity Protection
University of Connecticut (02/26/15) Colin Poitras
University of Connecticut researchers led by professor Bahram Javidi want to use quick response (QR) codes to protect national security. They are using advanced three-dimensional optical imaging and extremely low-light photon counting encryption to transform a conventional QR code into a high-end cybersecurity application that can be used to protect the integrity of computer microchips. The researchers found they were able to compress information about a chip's functionality, capacity, and part number directly into the QR code so it can be obtained by the reader without accessing the Internet; Javidi says this is an important cybersecurity breakthrough because linking to the Internet greatly increases vulnerability to hacking or corruption. The researchers also applied an optical-imaging mask that scrambles the QR code design into a random mass of black-and-white pixels. Another layer of security is then added through a random phase photon-based encryption, which converts the snowy image into a darkened image with just a few random dots of pixelated light.
MIT News (02/27/15) Julia Sklar
Massachusetts Institute of Technology (MIT) senior Sheldon Trotman enjoys developing apps and computer programs that solve inefficiencies in the real world. For example, Trotman founded a group of technology consultants who worked on design, development, and marketing, offering their services to clients with a business model for an undeveloped product. Trotman also developed an idea for making use of latent processing power in mobile phones. He used this idea to win an Intel-sponsored hackathon. "I just see inefficiencies everywhere, and I want to fix them," Trotman says. He also designed an app that lets restaurant managers and employees pick up and drop shifts in real time, avoiding the hassle of making phone calls and sending emails looking for last-minute replacements. In the future, Trotman plans to work on streamlining the process of investing. His first research experience at MIT was in the Humans and Automation Laboratory, where he assisted in simulating human interactions with autonomous vehicles. "I'm definitely still stuck on artificial intelligence," Trotman says. "It's just another way to make people more effective at what they're already good at doing: unstructured problem solving."
Linguists Tackle Computational Analysis of Grammar
UChicago News (IL) (02/25/15) Benjamin Recchie
University of Chicago (UC) researchers are studying natural language morphology in an attempt to develop computers that are better at understanding human language. The researchers are using the Research Computing Center's (RCC) Midway supercomputing cluster to analyze corpora, which are standard bodies of written language that can contain billions of words taken from many different genres of writing. "A typical scenario for us is that, given some raw data, we have some intuition about certain patterns in the data, and we collaborate with RCC to create visualization tools to display data in a way that enables us to explore these patterns," says UC researcher Jackson Lee. The visualization shows which words occur most often before and after a given word in a natural language corpus. "The construction of this visualization tool grew out of the observation that overall word distribution patterns are sensitive to the specific distribution of individual words, and we need a tool to 'see' what the grammar of a given word really looks like," Lee says. He notes a better understanding of natural language morphology can lead to better designed human-machine interfaces and a better way to search large databases.
Megan Smith Wants to 'Debug' Tech's Diversity Problem
NextGov.com (02/23/15) Jack Moore
The lack of representation of women and minorities in the U.S. technology sector is a problem Obama administration chief technology officer Megan Smith wants to solve. For example, women make up less than a third of the workforce at major U.S. tech companies, according to a Bloomberg analysis last summer. Meanwhile, federal statistics estimate women hold only about 30 percent of information technology jobs government-wide, even though they account for about 44 percent of the federal employee pool. Still, Smith is optimistic change is looming, especially in Silicon Valley, where the problem is often most pronounced. "Of all the industries, [tech is] an industry that is data-driven and it is innovative--and it moves fast once it sees what the problem is," she says. "And I think there's a waking up going on." Smith says studies show a more diverse workforce leads to better products, companies, and financial performance, and it is the government's function to convene the various stakeholders to work out a solution. Code for Progress' Aliya Rahman sees a need for policy setting to encourage more diversity efforts, such as standardized ways of measuring certain skill sets "that essentially handles risk mitigation both for the job seeker and the employer."
Open Source Tool Maps Global Humanitarian Responses
Government Computer News (02/24/15) Mark Pomerleau
The U.S. State Department touted the one-year anniversary of the MapGive open data platform during Open Data Day on Feb. 21. The department says MapGive's mapping and satellite-imagery capabilities have enhanced humanitarian preparedness and response. Volunteers used MapGive to create base-map data for first responders ahead of a typhoon in the Philippines in 2013, and the effort aided preparations for another typhoon a year later. The State Department describes MapGive's Imagery to the Crowd (IttC) methodology as the platform's key element, enabling high-resolution satellite imagery to be published in ways that can be easily integrated into OpenStreetMap, a collaborative project to create a free and editable map of the world. The IttC methodology enables easier viewing and mapping for volunteers with varying degrees of expertise. Moreover, the department calls MapGive the flagship of its Open Government Plan, serving as a "nexus of expertise" for its domestic bureaus and diplomatic posts, as well as other agencies. The State Department also plans to continue to develop the platform as a tool for digital diplomacy.
Data Mining Indian Recipes Reveals New Food Pairing Phenomenon
Technology Review (02/25/15)
Indian Institute of Technology (IIT) researchers created a flavor network in which food ingredients are linked if they appear together in the same recipe. The researchers downloaded more than 2,500 recipes, containing 194 ingredients, from the online cooking database TarlaDalal.com to build the network. Using big-data techniques such as clustering analysis, the researchers wanted to determine whether ingredients sharing flavor compounds occur in the same recipe more often than if the ingredients were chosen at random. The researchers found Indian cuisine is characterized by strong negative food pairing; they also found specific ingredients dramatically affect food pairing. For example, the presence of ingredients such as cayenne pepper, green bell pepper, coriander, garam masala, tamarind, ginger, and cinnamon strongly biases the flavor-sharing pattern of Indian cuisine toward negative pairing. "Our study reveals that spices occupy a unique position in the ingredient composition of Indian cuisine and play a major role in defining its characteristic profile," says IIT researcher Anupam Jain. "Our study could potentially lead to methods for creating novel Indian signature recipes, healthy recipe alterations, and recipe recommender systems," the researchers say.
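The basic construction the researchers describe, linking ingredients that co-occur in a recipe and scoring each pair by its shared flavor compounds, can be sketched as follows. The recipes and compound sets below are invented for illustration; the actual study used more than 2,500 TarlaDalal.com recipes and real flavor-compound data.

```python
# Toy sketch of a flavor network: ingredients are linked when they
# co-occur in a recipe, and each linked pair is scored by how many
# flavor compounds it shares. All data here is hypothetical.
from itertools import combinations

recipes = [
    {"ginger", "garam masala", "tamarind"},
    {"ginger", "cinnamon", "coriander"},
    {"coriander", "tamarind", "cayenne pepper"},
]
# Hypothetical flavor-compound sets per ingredient.
compounds = {
    "ginger": {"zingerone", "gingerol", "eugenol"},
    "garam masala": {"cuminaldehyde"},
    "tamarind": {"furfural"},
    "cinnamon": {"cinnamaldehyde", "eugenol"},
    "coriander": {"linalool"},
    "cayenne pepper": {"capsaicin"},
}

def flavor_network(recipes, compounds):
    """Map each co-occurring ingredient pair to its shared-compound
    count. A low average across pairs indicates negative food pairing:
    co-occurring ingredients tend NOT to share flavor compounds."""
    net = {}
    for recipe in recipes:
        for a, b in combinations(sorted(recipe), 2):
            net[(a, b)] = len(compounds[a] & compounds[b])
    return net

net = flavor_network(recipes, compounds)
avg = sum(net.values()) / len(net)
print(f"{len(net)} ingredient pairs; mean shared compounds = {avg:.2f}")
```

Comparing this mean against the mean for randomly chosen ingredient pairs is what distinguishes positive from negative food pairing in the study.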
Jennifer Granick on New Cybersecurity Bills and Priorities to Achieve a More Secure Internet
Stanford Lawyer (02/26/15)
In an interview, Jennifer Granick, director of civil liberties at the Stanford Center for Internet and Society, discusses three cybersecurity bills recently proposed by the White House and her views on how Stanford and other universities can promote better security standards. Granick says she is dubious of the idea in current and proposed legislation that threat information-sharing laws need to include indemnification from privacy laws in order to encourage companies to participate. She says there is broad consensus that information sharing can be achieved without violating privacy laws. The government should instead focus on finding other ways of incentivizing such sharing, even though Granick is skeptical a government-led information-sharing effort will have much success because government spying activities have hurt the government's credibility in the eyes of many companies. Granick sees a proposed federal breach notification law as promising, but believes it should be strengthened to set a stringent national standard. She also strongly opposes a bill that would update and enhance penalties outlined by the Computer Fraud and Abuse Act, saying the new penalties are unlikely to deter criminals and instead will have negative impacts on security researchers and average Internet users. Finally, she says Stanford and other universities can help the cause of cybersecurity by advocating for improvements to the security of critical infrastructure and by encouraging better security practices across government and private industry.
Abstract News © Copyright 2015 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.