Association for Computing Machinery
Welcome to the December 18, 2015 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets and for iPhones and iPads.

HEADLINES AT A GLANCE


At UN, China Tries to Influence Fight Over Internet Control
The New York Times (12/16/15) Dan Levin; Paul Mozur

China has largely spearheaded a push to include multilateral, or state, control in the final version of the United Nations (UN) General Assembly's policy document on Internet governance presented on Wednesday. The document is nonbinding for member states, but it delineates policies and grants authority to UN entities in which China has substantial clout. The Ten-Year Review of the World Summit on the Information Society also establishes an international precedent that governments can argue legitimizes their own agendas. The first summit's outcome document set agendas for closing the digital divide between developed and developing nations. However, with social media and hacking ascending over the last decade, China and other countries have urged renegotiating earlier Internet governance pacts in a process that would exclude all nongovernment stakeholders. Although many of its proposed insertions and deletions of language were left out of the latest document, China appears mollified that the document acknowledges "a leading role" for governments in cybersecurity issues relating to national security and refers to the UN Charter, which enshrines precepts of state sovereignty and nonintervention by the UN in domestic matters. Meanwhile, China is hosting its World Internet Conference, a global lobbying initiative to push the concept that each country should have the unrestricted right to regulate cyberinfrastructure and activities, including censorship, within its territory.


GAO Finds Gender Disparities in Federal STEM Grants
FierceGovernmentIT (12/17/15) Chris Nehls

The U.S. Government Accountability Office (GAO) has uncovered evidence of problems among several federal agencies in reporting on gender disparities in the awarding of research grants in science, technology, engineering, and math (STEM) disciplines. GAO examined six agencies responsible for funding 90 percent of STEM grants over the last three years, and found indications of gender disparities in STEM grant-making at the U.S. Department of Defense (DOD) and the Department of Energy (DOE). The agency reports that a dearth of linked proposal and award data at some DOD and DOE offices, as well as at the U.S. National Aeronautics and Space Administration, prevented investigators from analyzing whether other programs financed projects regardless of applicants' gender. The goal of the study was to gain insight into whether bias in federal grant-making in STEM fields may be contributing to broad gender disparities among university faculty in related departments and to gender imbalances in STEM degree programs. Three female U.S. House Representatives asked GAO to determine whether STEM grants to universities comply with Title IX gender-discrimination protections. GAO urges the U.S. Justice Department to improve Title IX information sharing across STEM grant-making agencies, and recommends that DOD and the Department of Health and Human Services perform internal compliance reviews.


Universities Expand Curriculum to Meet Data Scientist Demand
InformationWeek (12/16/15) Lisa Morgan

Market forces and technology innovations are fueling a growing demand for data scientists that universities are attempting to meet. "Computational thinking is another core part of the curriculum for a well-educated individual whether or not they become a programmer so they can understand the nature of what's involved and apply critical thinking to data analysis and data analytics," notes Lehigh University's Dan Lopresti. Examples of such initiatives include the University of Oxford's continuing-education course merging data science and the Internet of Things, so students can understand how the two interoperate and subsequently make predictions. Meanwhile, Lehigh University is responding to demand with its interdisciplinary Data X initiative, which focuses mainly on computer science skills, data science, data analytics, data mining, and machine learning. Its initial areas of concentration include consumer analytics, digital media, and connected health. Miami University's Farmer School of Business has broadened its information systems and data analytics offerings, setting up a center for analytics and data science. The institution also offers an analytics major, a joint program between the Farmer School and the college of arts and sciences. Another school promoting data science is Drake University, which established a cross-college program combining business and arts and sciences courses by leveraging its expertise in business, computer science, and math.


Not Tor, MIT's Vuvuzela Messaging System Uses 'Noise' to Ensure Privacy
Network World (12/17/15) Tim Greene

Massachusetts Institute of Technology (MIT) researchers' experimental Vuvuzela messaging system offers more privacy than The Onion Router (Tor) by rendering text messages sent through it untraceable. MIT Ph.D. student David Lazar says Vuvuzela resists traffic-analysis attacks, which Tor cannot. The researchers say the system functions no matter how many parties are using it to communicate, and it employs encryption and a set of servers to conceal whether or not parties are participating in text-based dialogues. "Vuvuzela prevents an adversary from learning which pairs of users are communicating, as long as just one out of [the] servers is not compromised, even for users who continue to use Vuvuzela for years," they note. Vuvuzela can support millions of users on commodity servers deployed by a single group of users. Instead of anonymizing users, Vuvuzela prevents outside observers from distinguishing between people sending messages, receiving messages, or doing neither, according to Lazar. The system injects noise into the client-server traffic that cannot be distinguished from actual messages, and all communications are triple-wrapped in encryption by three servers. "Vuvuzela guarantees privacy as long as one of the servers is uncompromised, so using more servers increases security at the cost of increased message latency," Lazar notes.
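
For illustration, the "triple-wrapped" layering can be sketched in a few lines of Python using the third-party cryptography package's Fernet cipher. This is only a generic onion-style example, not the actual Vuvuzela protocol, and it omits the cover-traffic noise that provides Vuvuzela's privacy guarantee.

    # Illustrative onion-style layering: encrypt once per server, peel one layer per hop.
    # NOT the Vuvuzela protocol; the keys and message are made up for this sketch.
    from cryptography.fernet import Fernet

    server_keys = [Fernet.generate_key() for _ in range(3)]  # one key per server

    def wrap(message: bytes, keys) -> bytes:
        # Encrypt for the last server first, so the first server peels the outermost layer.
        for key in reversed(keys):
            message = Fernet(key).encrypt(message)
        return message

    def unwrap_one_layer(blob: bytes, key: bytes) -> bytes:
        # Each server removes exactly one layer; only the final server sees the plaintext.
        return Fernet(key).decrypt(blob)

    blob = wrap(b"hello", server_keys)
    for key in server_keys:              # simulate the message passing through all three servers
        blob = unwrap_one_layer(blob, key)
    assert blob == b"hello"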


Deep-Learning Algorithm Predicts Photos' Memorability at "Near-Human" Levels
MIT News (12/16/15) Adam Conner-Simons

Researchers at the Massachusetts Institute of Technology's Computer Science and Artificial Intelligence Laboratory (CSAIL) say they have developed an algorithm that can predict how memorable or forgettable a given image is almost as accurately as humans. The MemNet algorithm builds on previous work by CSAIL researchers to develop a similar algorithm for facial memorability. The researchers fed the algorithm, running on a deep-learning network, tens of thousands of images from several different datasets that had been given a "memorability score" by human subjects. When they pitted MemNet against human subjects by asking both to predict how memorable people would find a never-before-seen image, the algorithm performed 30 percent better than existing algorithms and came within a few percentage points of average human performance. The researchers plan to use MemNet to build an app that can tweak images to make them more memorable. "It's like having an instant focus group that tells you how likely it is that someone will remember a visual message," says lead author Aditya Khosla, who presented the research this week at the International Conference on Computer Vision (ICCV) in Santiago, Chile. Other potential applications for the technology include developing more effective teaching resources and more compelling advertisements.
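
As a rough illustration of the approach, the following PyTorch sketch regresses a scalar memorability score from an image. The network size, input resolution, dummy data, and loss are illustrative assumptions; this does not reproduce the actual MemNet architecture or its trained weights.

    # A tiny convolutional regressor mapping an image to a score in [0, 1],
    # trained against human-provided memorability ratings (dummy tensors here).
    import torch
    import torch.nn as nn

    model = nn.Sequential(
        nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(32, 1), nn.Sigmoid(),   # output resembles a normalized memorability score
    )

    images = torch.rand(8, 3, 224, 224)   # a batch of images (random stand-ins)
    human_scores = torch.rand(8, 1)       # crowd-sourced memorability scores (random stand-ins)

    loss = nn.MSELoss()(model(images), human_scores)
    loss.backward()                        # one illustrative training step (optimizer omitted)
    print(float(loss))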


How Someday Robots May Run to the Rescue--Literally
National Science Foundation (12/14/15) Bobbie Mixon

A research team in Ann Arbor, MI, is working to revolutionize the control algorithms that govern walking robots. Jessy Grizzle, a robotics engineer at the University of Michigan, wants bipedal robots to be able to step over and around obstacles and across uneven surfaces without tipping over, to transition quickly between different walking tasks, and to walk much faster. Grizzle says robots need a keen sense of balance to walk rapidly over rough terrain without relying too heavily on precise measurements of the ground profile. With support from the U.S. National Science Foundation, Grizzle aims to radically update the feedback control algorithms that enable robots to stand, walk, and step over obstacles. He also plans to develop mathematical designs for robot balance that can quickly be transferred from one walking robot to another. Last summer, a Georgia Institute of Technology researcher used Grizzle's method and reported it helped enable a humanoid robot to walk 10 times more efficiently than previous walking robots. The technology could be used by robots conducting search-and-rescue missions in dangerous environments.
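
For readers unfamiliar with feedback control, the short Python sketch below shows the general idea with a proportional-derivative law balancing an inverted pendulum. The gains and dynamics are toy assumptions, and this is not the method developed by Grizzle's group for bipedal walking.

    # Generic feedback-control illustration: a PD law pushes back against the lean angle.
    # Gains, pendulum parameters, and time step are arbitrary assumptions for this sketch.
    import math

    g, length, dt = 9.81, 1.0, 0.01      # gravity, pendulum length, integration step
    kp, kd = 40.0, 10.0                  # hand-tuned proportional and derivative gains

    theta, omega = 0.3, 0.0              # initial lean angle (rad) and angular velocity
    for _ in range(500):
        torque = -kp * theta - kd * omega                    # feedback law
        alpha = (g / length) * math.sin(theta) + torque      # simplified unit-mass dynamics
        omega += alpha * dt
        theta += omega * dt

    print(f"final lean angle: {theta:.4f} rad")              # converges close to upright (0)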


'Fog' Computing Harnesses Personal Devices to Speed Wireless Networks
Princeton University (12/16/15) Joseph Sullivan

Researchers from Princeton University's Edge Lab announced the formation of a new nonprofit group, the Open Fog Consortium, at the recent Internet of Things World Forum in Dubai. The group, which includes scientists and business leaders from ARM, Cisco, Dell, Intel, and Microsoft, is dedicated to creating the basic infrastructure for what is known as fog computing. Like cloud computing, fog computing pools distributed computing resources, but it harnesses those of mobile and Internet-connected devices. The Open Fog Consortium is particularly interested in fog networking, in which fog computing methods offload much of the demand placed on communications networks onto the devices connecting to those networks. The consortium has opened its membership to companies and universities worldwide and expects to deliver reports detailing best practices for the new technologies it develops. Mung Chiang, director of the Edge Lab and co-founder of the Open Fog Consortium, says the consortium will help address common problems experienced by edge networks, including connection loss, bandwidth bottlenecks, and high latency. Denny Strigl, former CEO of Verizon Wireless, says the new consortium "will make it faster and easier to develop technologies, products, and services that can be deployed by as many users as possible."


Now AI Machines Are Learning to Understand Stories
Technology Review (12/14/15)

Artificial intelligence (AI) machines have displayed rapid progress in face and speech recognition, thanks in part to the creation of the vast databases needed to train neural networks. Makarand Tapaswi at the Karlsruhe Institute of Technology in Germany and colleagues are now turning their attention to more complex reasoning tasks for AI machines, such as understanding movies. The team has created a database about films that should serve as a test arena for deep-learning machines and their ability to reason about stories. Because the ability to answer questions about a story is an important indicator of whether it has been understood, the researchers plan to create multiple-choice quizzes about movies, each consisting of a set of questions along with several feasible answers, only one of which is correct. The researchers first accumulated plot synopses from Wikipedia for about 300 movies, and then aligned each synopsis with the movie itself. Movies clearly show information that can answer certain questions, but they do not always contain the information needed to answer questions about why things happen, for which additional knowledge of the world is sometimes required. Deep neural nets need large databases to help them learn, and the researchers also will find out whether this database is big enough to train modern AI machines.
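
The multiple-choice setup can be pictured with the small Python sketch below, which pairs a question with several candidate answers and scores them with a naive word-overlap baseline against the plot synopsis. The data structure, example movie, and baseline are illustrative assumptions, not the researchers' dataset or models.

    # One quiz item: a question, several feasible answers, exactly one correct.
    # The overlap baseline is a toy stand-in for the deep-learning systems to be tested.
    import re
    from dataclasses import dataclass

    @dataclass
    class MovieQuestion:
        question: str
        answers: list    # several feasible answers
        correct: int     # index of the single correct answer

    def words(text: str) -> set:
        return set(re.findall(r"[a-z']+", text.lower()))

    def overlap_baseline(synopsis: str, item: MovieQuestion) -> int:
        """Pick the candidate answer sharing the most words with the plot synopsis."""
        plot = words(synopsis)
        scores = [len(plot & words(a)) for a in item.answers]
        return scores.index(max(scores))

    synopsis = "A farm boy joins the rebels, destroys the battle station, and saves the galaxy."
    item = MovieQuestion(
        question="Why does the hero attack the battle station?",
        answers=["To save the galaxy from the Empire",
                 "To win a podrace",
                 "To find buried treasure"],
        correct=0,
    )
    print(overlap_baseline(synopsis, item) == item.correct)   # True for this toy example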


U.K. and Singapore Collaborating to Address Cyber Threats
Engineering & Physical Sciences Research Council (12/16/15)

Britain's Engineering & Physical Sciences Research Council (EPSRC) and Singapore's National Research Foundation on Wednesday announced the results of a joint research call to fund six new projects over the next three years. The projects will develop new solutions designed to enhance the resilience of systems and infrastructure to cyberattacks. The grant call seeks to strengthen knowledge and capabilities in cybersecurity and foster closer collaboration in cybersecurity research between the two countries. Six proposals were chosen, covering research areas in intrusions, data analytics, human factors, and sector-specific applications. The University of Oxford and the National University of Singapore will examine security and privacy in smart grid systems, while the University of Kent and the National University of Singapore will study vulnerability discovery using abduction and interpolation. The University of Surrey and Singapore Management University will work on computational modeling and automatic non-intrusive detection of human behavior-based insecurity. Imperial College and the National University of Singapore will focus on machine learning, robust optimization, and verification. Imperial College and Singapore University of Technology and Design will work on security by design for interconnected critical infrastructure, and the University of Southampton and Nanyang Technological University will develop cybersecurity solutions for smart traffic control systems.


Improved Robotic Testing Systems
University of Stavanger (12/14/15) Leiv Gunnar Lie

University of Stavanger researchers say they have developed new mathematical models that provide better and less expensive testing of robotic systems. The researchers, led by Stavanger Ph.D. candidate Morten Mossige, say the models could reduce development costs and improve the quality of robotic systems, while increasing the robots' "uptime." The first part of the research aims to develop automatic testing methods for robotic systems; automatic testing can identify design flaws during the development and modification of control solutions for robots. The second part aims to develop models for generating schedules for a series of tests, which calculate the sequence in which the tests should be performed and the computer on which each test should run. The models also account for the fact that some tests cannot be performed simultaneously, even if they are conducted on different machines. Both models are based on constraint programming, are designed to operate fully automatically on a test server, and take into consideration the length of time needed to identify a good solution.
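
The scheduling constraints described above can be illustrated with an off-the-shelf constraint-programming solver such as Google OR-Tools' CP-SAT. The test names, durations, machine assignments, and exclusions in the sketch below are invented for illustration and do not reproduce the researchers' own models.

    # Schedule tests on two machines, forbidding overlap on a shared machine and
    # forbidding two named tests from running at the same time even on different machines.
    from ortools.sat.python import cp_model

    durations = {"gripper": 4, "welding": 6, "calibration": 3, "conveyor": 5}   # minutes (assumed)
    machine = {"gripper": 0, "welding": 0, "calibration": 1, "conveyor": 1}     # fixed assignment (assumed)
    mutually_exclusive = [("welding", "conveyor")]    # cannot run simultaneously, even on different machines

    model = cp_model.CpModel()
    horizon = sum(durations.values())
    start, end, interval = {}, {}, {}
    for t, d in durations.items():
        start[t] = model.NewIntVar(0, horizon, f"start_{t}")
        end[t] = model.NewIntVar(0, horizon, f"end_{t}")
        interval[t] = model.NewIntervalVar(start[t], d, end[t], f"iv_{t}")

    for m in set(machine.values()):    # tests sharing a machine cannot overlap
        model.AddNoOverlap([interval[t] for t in durations if machine[t] == m])
    for a, b in mutually_exclusive:    # cross-machine exclusions
        model.AddNoOverlap([interval[a], interval[b]])

    makespan = model.NewIntVar(0, horizon, "makespan")
    model.AddMaxEquality(makespan, [end[t] for t in durations])
    model.Minimize(makespan)

    solver = cp_model.CpSolver()
    if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
        for t in durations:
            print(t, "starts at", solver.Value(start[t]), "on machine", machine[t])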


U of W Research Could Revolutionize the Way We Drive
Windsor Star (Canada) (12/15/15) Grace Macaluso

University of Windsor (UWindsor) researchers are developing technology designed to reduce roadway collisions. The technology connects to a vehicle's navigational system and collects data about acceleration, speed, steering angle, velocity, and spatial location, according to UWindsor professor Arunita Jaekel. The researchers are developing algorithms that turn this information into safety warnings, alerting drivers to hazards. "Hopefully, we can build the research group up and position ourselves to be a major player in" the development of autonomous vehicles, Jaekel says. The researchers are working with industry partners such as Google, Apple, Arada Systems, and several automakers. In addition, the U.S. and Canadian governments are considering mandating vehicle-to-vehicle communication. "Before you can have self-driving cars, the first step is to have vehicles communicate with each other even when there is a human driver in the vehicle," Jaekel says. The researchers believe this technology will reduce the risk of traffic accidents. "If the on-board device alerts them to the braking before they see it's needed, everyone slows down together and this drastically reduces the risk of multiple collisions," says UWindsor researcher Jordan Willis.
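
As a simple illustration of turning shared vehicle data into an alert, the Python sketch below computes a time-to-collision warning from the gap and speeds of two vehicles. The data fields and threshold are assumptions for this sketch, not the algorithms under development at UWindsor.

    # Warn when the closing speed would erase the gap within a few seconds.
    def collision_warning(gap_m: float, follower_speed_mps: float,
                          leader_speed_mps: float, threshold_s: float = 3.0) -> bool:
        """Return True when the estimated time to collision falls below the threshold."""
        closing_speed = follower_speed_mps - leader_speed_mps
        if closing_speed <= 0:           # not closing in; no warning needed
            return False
        time_to_collision = gap_m / closing_speed
        return time_to_collision < threshold_s

    # Example: the leading vehicle brakes hard and broadcasts its new speed over V2V.
    print(collision_warning(gap_m=25.0, follower_speed_mps=27.0, leader_speed_mps=15.0))  # True -> alert driver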


Data Visualization Expert to Build the Top System in the Nation at the University of Hawaii
UH News (HI) (12/14/15)

The University of Hawaii at Manoa (UH Manoa) says it will build the best U.S. data-visualization system with the help of a grant from the U.S. National Science Foundation (NSF). NSF will provide $600,000 and UH Manoa will provide $257,000 to develop the new Cyber-enabled Collaboration Analysis Navigation and Observation Environment (CyberCANOE). The UH system already has two smaller CyberCANOE systems, located at UH Manoa and UH West Oahu, which will serve as the template for the new, larger system. CyberCANOE consists of a ring of flat-screen displays that surround the user to create the sense of existing in a virtual space. It was designed by UH Manoa professor Jason Leigh, who is also founder and director of the university's Laboratory for Advanced Visualization and Applications (LAVA), and it is based on similar systems Leigh developed during his time at the University of Illinois at Chicago. Construction of the new CyberCANOE system is expected to take three years. Once complete, nearly 1,000 UH researchers, fellows, and students from a wide range of disciplines will be able to use the CyberCANOE for their large-scale data-visualization needs.


Social Media News Consumers at Higher Risk of 'Information Bubbles,' IU Study Says
IU Bloomington Newsroom (12/14/15) Kevin Fryling

An Indiana University (IU) study suggests people who seek out news and information from social media are at higher risk of becoming trapped in a "collective social bubble" than those who get their news and information from search engines. The study is based on a method the IU researchers developed that assigns each user a score based on the distribution of websites their news clicks lead to, computed across millions of websites. A user who clicks primarily on links leading to one website or a small number of websites receives a lower score than a user whose clicks lead to a large number of different sites. The researchers used this method to analyze an anonymized database of some 100,000 Web searches by users at IU, a dataset of about 18 million clicks by more than half a million users of the AOL search engine in 2006, and 1.3 billion public posts containing links on Twitter. They found users clicking on news stories in social media received lower scores than those who sought out news stories through search engines. "Our analysis shows that people collectively access information from a significantly narrower range of sources on social media compared to search engines," says lead researcher Dimitar Nikolov.
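
The scoring idea can be illustrated with a short Python sketch that measures the diversity of the websites a user's clicks lead to. Shannon entropy is used here as an assumed stand-in; the article does not specify the exact formula the IU researchers use.

    # Higher score when clicks spread across many sites, lower when concentrated on a few.
    import math
    from collections import Counter

    def diversity_score(clicked_domains) -> float:
        """Shannon entropy of the distribution of domains a user's clicks lead to."""
        counts = Counter(clicked_domains)
        total = sum(counts.values())
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    social_user = ["siteA.com"] * 8 + ["siteB.com"] * 2                   # clicks concentrated on one source
    search_user = ["siteA.com", "siteB.com", "siteC.com", "siteD.com",
                   "siteE.com", "siteF.com", "siteG.com", "siteH.com",
                   "siteI.com", "siteJ.com"]                              # clicks spread over ten sources

    print(round(diversity_score(social_user), 2))   # ~0.72 (low diversity)
    print(round(diversity_score(search_user), 2))   # ~3.32 (high diversity)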


Abstract News © Copyright 2015 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]