Association for Computing Machinery
Welcome to the February 6, 2015 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets and for iPhones and iPads.


FCC Plans Strong Hand to Regulate the Internet
The New York Times (02/04/15) Steve Lohr

U.S. Federal Communications Commission (FCC) chairman Tom Wheeler has spent the last year developing new rules for ensuring net neutrality, and on Wednesday he detailed his proposal to regulate consumer Internet service as a public utility. Provisions to safeguard consumer privacy and guarantee Internet service to remote and disabled users were unveiled as well. Wheeler's plan would give the FCC enforcement authority to oversee the interconnect market for the first time. FCC officials say the open Internet order would legally empower the agency to ensure no Web content is obstructed and the Internet is not fragmented into pay-to-play fast lanes for customers who can afford them and slow lanes for those who cannot. In addition, Wheeler intends to place mobile data service under the order and its Title II authority. Cable and telecom firms have pledged to fight the expected approval of the FCC regulations in court. Republicans in Congress are floating draft legislation that would uphold the ban on content blocking and fast/slow network lanes, but bar the FCC from issuing utility-style rules. Backers of the Title II model say a strong regulatory scheme will ensure business innovation and diversity of expression are sustained.

A.I. Is Getting Smarter
Computerworld (02/05/15) Sharon Gaudin

Researchers gathered at the Association for the Advancement of Artificial Intelligence's AAAI-15 conference last week believe artificial intelligence (AI) research is on an upswing after decades of ups and downs. Lynne Parker of the University of Tennessee and the U.S. National Science Foundation says AI in previous eras was hampered by unrealistic expectations, and over time interest in the field dimmed as ambitious predictions failed to materialize. However, the researchers say the field is now experiencing a major renaissance. Research labs at universities and other institutions, along with major investments from the private sector, are helping to drive innovations in several AI fields, such as natural language processing, speech and object recognition, computer vision, machine translation, and neural networks. These advances are occurring alongside innovations in robotics, which will lead to robots that move more fluidly and naturally. Parker says the sort of household robots many people envision are still a decade or more off, and one of the major hurdles to developing them and other advanced systems will be achieving a holistic approach to AI that integrates innovations in all of these different areas. Another major obstacle will be legal issues surrounding AI, most immediately in regard to autonomous vehicles.

Crowdsourcing America's Cybersecurity Is an Idea So Crazy It Might Just Work
The Washington Post (02/05/15) Dominic Basulto

Crowdsourced cybersecurity is a concept gaining ground, and its practical application would involve free, transparent sharing between the public and private sectors of computer code used to identify cyberthreats. One example initiated last December was the U.S. Army Research Lab's addition of the free Dshell forensic analysis framework to the popular GitHub code sharing website. The Lab's William Glodek says the shared code would "help facilitate the transition of knowledge and understanding to our partners in academia and industry who face the same problems." Glodek also wants to "give back to the cyber community, while increasing collaboration between the Army, the Department of Defense, and external partners to improve our ability to detect and understand cyberattacks." Such efforts could be complemented with more recruitment of white hat hackers into the government's cybersecurity programs, while Silicon Valley could play a key role in the crowdsourcing of threat intelligence. Successful cybersecurity crowdsourcing will need to overcome pitfalls such as the risk that such openness might lead to enemy infiltration of government cyberdefense systems. Another issue is people's distrust of the intelligence community as fallout from the U.S. National Security Agency surveillance scandal.

Western Spy Agencies Secretly Rely on Hackers for Intel and Expertise
The Intercept (02/04/15) Glenn Greenwald

Documents provided by former U.S. National Security Agency contractor Edward Snowden detail the efforts of the U.S., Canadian, and U.K. intelligence agencies to spy on the espionage activities of hacker groups and to monitor chatter in the hacker and security spaces. One document details the efforts of Communications Security Establishment Canada and Menwith Hill Station to exploit "a target-rich data set being stolen by hackers." The document dubs the hackers' email-stealing intrusion set INTOLERANT and says the intelligence agency suspects the group is backed by a state actor. INTOLERANT spied on diplomatic corps, human-rights activists, and journalists, largely in China, India, and Afghanistan, and the intelligence agency spied on INTOLERANT, gathering up the data it was stealing. Another document outlines the U.K. Government Communications Headquarters (GCHQ) program LOVELY HORSE, the purpose of which is to monitor and index public discussion among hackers on Twitter and other social media. Another GCHQ document details efforts to monitor and collect information from open source sites, including blogs, websites, chat venues, and Twitter, as well as IRC chat rooms and Pastebin pages. One note in the documents about hacker collective Anonymous reveals the intelligence agencies consider the group to be only a minor threat.

MAPPING Explores Internet's Impact on Society
CORDIS News (02/02/15)

European researchers are working on the MAPPING project, a study that aims to build an understanding of the many aspects of recent developments on the Internet and their consequences. "A major challenge is how to enable information discovery [on the one hand] and also enable information protection [on the other]," says British Business Federation Authority CEO Patrick Curry. "The answer, if there is one, is 'it depends,'" Curry says. "It depends on lots of things, but first we need to understand some more basics on what the Internet is rather than how it is used." As part of the MAPPING project, the researchers will monitor innovation policies, business models, and the legal framework surrounding the Digital Agenda for Europe. They also will suggest improvements, and eventually will create an action plan with policy guidelines that takes into consideration the interests and opinions of all stakeholders. The work relies on the Policy Watch Internet governance initiative and two further efforts: finding the right balance between privacy, personality, and business models, and issuing recommendations regarding intellectual property rights protection in the European Union. MAPPING's conclusions and recommendations will be compiled in a Road Map, which could be important for the future of Internet technologies in Europe.

European Project Launches the World's First Real-Time 'Mixed Reality' Ski Race
University of Southampton (United Kingdom) (02/02/15)

The European Commission is funding an interactive mixed reality downhill ski race as part of the 3D LIVE project. The research involves a professional skier racing down a real mountain against two virtual reality gamers. The racers will use a three-dimensional (3D) tele-immersion platform and wear new Oculus Rift virtual reality technology that enables them to compete against each other by racing down the same ski slope at the same time. "By using advanced sensor and gaming technologies to create and manipulate 3D information in real time, the platform can deliver truly interactive experiences closely linked to real world activities," says University of Southampton researcher Michael Boniface. The technology could lead to new types of live games that combine digital and real interaction in many different competitive sports, according to project coordinator Marco Conte of Italy's Collaborative Engineering. He also notes 3D LIVE project researchers have developed similar experiences for golf and jogging.

Computational Linguistics Reveals How Wikipedia Articles Are Biased Against Women
Technology Review (02/02/15)

For several years Wikipedia has been criticized for the ways that women involved with it are marginalized or minimized. A new study from researchers at the Leibniz Institute for the Social Sciences, ETH Zurich, and the University of Koblenz-Landau seeks greater insight into the issue by analyzing women's representation in the articles of six different language versions of Wikipedia. The researchers first compared the proportion of articles about men and women to those of other databases of notable people, including the Massachusetts Institute of Technology's Pantheon, and found Wikipedia had a slightly more even proportion of articles about men and women than the other databases. However, articles about women on Wikipedia are more likely to link to articles about men than to articles about women, and articles about women are more likely to use words highlighting the gender of their subject. In other words, an article about a notable woman will contain more instances of words such as "woman" or "female" than an article about a man will contain words such as "man" or "male." The researchers say this is evidence of implicit bias among Wikipedia's writers, and in particular of a perception of maleness as the null gender.
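The lexical-asymmetry measure described above can be approximated with a simple token count. The word lists and sample text below are illustrative stand-ins, not the study's actual lexicon or corpus:

```python
from collections import Counter
import re

# Hypothetical marker lists; the study's real lexicon is larger.
FEMALE_MARKERS = {"woman", "women", "female", "she", "her"}
MALE_MARKERS = {"man", "men", "male", "he", "his", "him"}

def gender_marker_rate(text, markers):
    """Fraction of tokens in `text` that belong to `markers`."""
    tokens = re.findall(r"[a-z']+", text.lower())
    if not tokens:
        return 0.0
    counts = Counter(tokens)
    return sum(counts[w] for w in markers) / len(tokens)

bio = "She was the first woman to chair the department."
print(gender_marker_rate(bio, FEMALE_MARKERS))
```

Comparing this rate across large samples of male- and female-subject biographies would reproduce, in miniature, the asymmetry the researchers report.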

Building Trustworthy Big Data Algorithms
Northwestern University Newscenter (01/29/15) Emily Ayshford

Northwestern University researchers recently tested latent Dirichlet allocation, one of the leading big data algorithms for finding related topics within unstructured text, and found it was neither as accurate nor as reproducible as a leading topic modeling algorithm should be. The researchers therefore developed a new topic modeling algorithm they say has shown very high accuracy and reproducibility in tests. The algorithm, called TopicMapping, begins by preprocessing data to replace words with their stems. It then builds a network of connected words and identifies "communities" of related words. The researchers found TopicMapping was able to perfectly separate the test documents according to language and was able to reproduce its results. Northwestern professor Luis Amaral says the results show the need for more testing of big data algorithms and more research into making them more accurate and reproducible. "Companies that make products must show that their products work," Amaral says. "They must be certified. There is no such case for algorithms. We have a lot of uninformed consumers of big data algorithms that are using tools that haven't been tested for reproducibility and accuracy."
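The pipeline described above (stemming, word network, community detection) can be sketched in miniature. This is not the actual TopicMapping code: the crude suffix-stripping stands in for a real stemmer, and connected components stand in for the algorithm's proper community-detection step:

```python
from collections import defaultdict
from itertools import combinations

def crude_stem(word):
    # Stand-in for a real stemmer (e.g. Porter): strip common suffixes.
    for suffix in ("ing", "ed", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def word_communities(documents):
    """Build a co-occurrence network of stemmed words and return its
    connected components as crude 'communities' of related words."""
    graph = defaultdict(set)
    for doc in documents:
        stems = {crude_stem(w) for w in doc.lower().split()}
        for a, b in combinations(sorted(stems), 2):
            graph[a].add(b)
            graph[b].add(a)
    # Extract connected components with an iterative depth-first search.
    seen, communities = set(), []
    for node in graph:
        if node in seen:
            continue
        stack, component = [node], set()
        while stack:
            n = stack.pop()
            if n in component:
                continue
            component.add(n)
            stack.extend(graph[n] - component)
        seen |= component
        communities.append(component)
    return communities

docs = ["genes encode proteins", "proteins fold",
        "stocks traded higher", "traded stocks fell"]
print(word_communities(docs))  # two word communities: biology and finance
```

Even this toy version separates the biology documents from the finance documents, which is the intuition behind TopicMapping's language-separation result.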

Stanford Researchers Use Big Data to Identify Patients at Risk of High-Cholesterol Disorder
Stanford University (01/29/15) Tracie White

Stanford University researchers have launched a project designed to identify hospital patients who may have a genetic disease that causes a deadly buildup of cholesterol in their arteries. The project uses big data and software that can learn to recognize patterns in electronic medical records and identify patients at risk of familial hypercholesterolemia (FH), which often goes undiagnosed until a heart attack strikes. The project is part of a larger initiative called Flag, Identify, Network, Deliver FH, which aims to use innovative technologies to identify individuals with the disorder who are undiagnosed, untreated, or undertreated. For the project, researchers will teach a program how to recognize a pattern in the electronic records of Stanford patients diagnosed with FH. The program then will be directed to analyze Stanford patient records for signs of the pattern, and the researchers will report their findings to the patients' personal physicians, who can encourage screening and therapy. "These techniques have not been widely applied in medicine, but we believe that they offer the potential to transform healthcare, particularly with the increased reliance on electronic health records," says Stanford professor Joshua Knowles. If the project is successful at Stanford, it will be tested at other academic medical centers.
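The flag-and-refer workflow described above can be illustrated with a rule-based pre-screen. The field names, thresholds, and scoring are invented for illustration; the Stanford project learns its pattern from records of diagnosed FH patients rather than using hand-written rules:

```python
# Hypothetical rule-of-thumb pre-screen for familial hypercholesterolemia
# (FH) risk in an electronic record. All fields and cutoffs are
# illustrative assumptions, not the project's actual learned criteria.
def flag_possible_fh(record):
    """Return True if a patient record shows an FH-like pattern."""
    score = 0
    if record.get("ldl_mg_dl", 0) >= 190:        # very high LDL cholesterol
        score += 2
    if record.get("family_history_early_chd"):   # early heart disease in family
        score += 1
    if record.get("statin_before_age_40"):       # early lipid-lowering therapy
        score += 1
    return score >= 2

patients = [
    {"id": 1, "ldl_mg_dl": 210, "family_history_early_chd": True},
    {"id": 2, "ldl_mg_dl": 130},
]
# Flagged patients would be reported to their personal physicians.
flagged = [p["id"] for p in patients if flag_possible_fh(p)]
print(flagged)  # -> [1]
```

A learned model replaces the hand-set weights with ones fit to the records of confirmed FH cases, but the output is the same kind of flag handed to a physician.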

Uber and CMU Collaborating on Robotically Driven Cars in Lawrenceville
Pittsburgh Post-Gazette (02/02/15) Dan Majors

Uber and Carnegie Mellon University (CMU) are jointly creating a robotics research lab and technology center at the RIDC Chocolate Factory in Pittsburgh with the goal of developing new aspects of mapping, vehicle safety, and technology with an eye toward autonomous taxi fleet development. "This is yet another case where collaboration between the city and its universities is creating opportunities for job growth and community development," says Pittsburgh Mayor Bill Peduto. The partnership will include funding from Uber for faculty chairs and graduate fellowships and will enable Uber to tap CMU faculty, staff, and students, both on campus and at its National Robotics Engineering Center. "We are very proud of our long history of being first in the world in robotics, and this really cements what's going on here with Pittsburgh as a robotics city with this great new partnership coming into play," says CMU School of Computer Science dean Andrew Moore. Uber and CMU will hold an event in Pittsburgh to formally kick off the partnership in the coming weeks. "This is kind of a very synergistic thing," notes Uber's Jeff Holden. "Robotics is at a really interesting cusp. It's basically exploding right now as a major technology area, and the industry is starting to build around that."

Forecasting the Flu Better
UCSD News (CA) (01/29/15) Inga Kiderra

A research team at the University of California, San Diego (UCSD) says it has refined and improved the predictions of Google Flu Trends (GFT). The researchers report using social network analysis and combining the power of GFT's big data with traditional flu-monitoring data from the U.S. Centers for Disease Control and Prevention (CDC). "Our innovation is to construct a network of ties between different U.S. health regions based on information from the CDC," says UCSD doctoral student Michael Davidson. The team considered which places in previous years reported the flu at about the same time. "That told us which regions of the country have the strongest ties, or connections, and gave us the analytic power to improve Google's predictions," Davidson says. The researchers say they can predict the spread of flu a week into the future with the same accuracy that GFT achieves for current infection levels. "We hope our method will be implemented by epidemiologists and data scientists, to better target prevention and treatment efforts, especially during epidemics," Davidson says.
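The network idea described above can be sketched as follows: predict a region's flu activity next week from currently reported activity in the regions it is most strongly tied to. The tie strengths and activity numbers here are made up; the study derives its ties from historical CDC reporting patterns:

```python
# Hypothetical tie strengths between U.S. health regions, standing in
# for ties derived from which regions reported flu at about the same time.
ties = {
    "Northeast": {"Midwest": 0.8, "South": 0.4},
    "Midwest": {"Northeast": 0.8, "West": 0.6},
}

# Current flu activity levels (arbitrary units), standing in for GFT data.
current_activity = {"Midwest": 3.0, "South": 1.5, "Northeast": 2.0, "West": 2.5}

def predict_next_week(region, ties, current):
    """Tie-strength-weighted average of neighbors' current flu activity."""
    neighbors = ties[region]
    total_weight = sum(neighbors.values())
    return sum(w * current[r] for r, w in neighbors.items()) / total_weight

print(predict_next_week("Northeast", ties, current_activity))  # -> 2.5
```

The weighting means a region strongly tied to places with rising activity gets a higher forecast, which is the analytic leverage the CDC-derived network adds to GFT's raw signal.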

New App Would Monitor Mental Health Through 'Selfie' Videos, Social Media
University of Rochester NewsCenter (01/29/15) Leonor Sierra

Researchers at the University of Rochester led by professor Jiebo Luo are developing technology to transform computers and smartphones into personal mental health monitoring devices. The team's program would analyze selfie videos as the person engages with social media. The system would scrutinize the video data for clues such as heart rate, blinking rate, eye pupil radius, and head movement rate. At the same time, the program would analyze what users post on Twitter, what they read, how fast they scroll, their keystroke rate, and their mouse click rate. The team trained the system by sending social media messages to induce emotions in a test group, and then having the technology gauge the reactions of the participants. Luo says the program is "unobtrusive; it does not require the user to explicitly state what he or she is feeling, input any extra information, or wear any special gear." The system currently rates emotions as positive, neutral, or negative, but Luo says the researchers want to give it a greater degree of sensitivity to define them further, such as sadness or anger for negative emotions.
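The final step described above, mapping video- and interaction-derived signals to a coarse emotion label, can be sketched as a scoring function. The feature names, weights, and thresholds here are invented for illustration; the Rochester system learns its mapping from the induced-emotion training data:

```python
# Toy classifier over hypothetical features; weights and cutoffs are
# assumptions, not the actual trained model.
def classify_emotion(features):
    """Return 'positive', 'neutral', or 'negative' from a feature dict."""
    score = 0.0
    score -= 0.02 * max(0, features.get("heart_rate_bpm", 70) - 70)  # arousal
    score -= 0.5 * features.get("blink_rate_hz", 0.3)                # stress cue
    score += 0.3 * features.get("message_sentiment", 0.0)            # in [-1, 1]
    if score > 0.1:
        return "positive"
    if score < -0.1:
        return "negative"
    return "neutral"

signals = {"heart_rate_bpm": 95, "blink_rate_hz": 0.6, "message_sentiment": -0.4}
print(classify_emotion(signals))  # -> negative
```

Extending the label set beyond three classes, as the researchers intend, would mean replacing the two thresholds with a finer-grained classifier.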

The Next Decade in Tech: Three Defining Forces to Watch
Tech Republic (01/29/15) Jason Hiner

The world has routinely been shaped by unexpected sea changes in technology, such as the integrated circuit or the rise of Google, and Tech Republic believes there are three trends in technology forming today that will profoundly influence the broader world. The first is the Internet of Things, or the proliferation of Internet-connected devices, some of which are likely to be so exotic they would seem outlandish today, such as an Internet-connected tattoo whose color could be controlled by a smartphone. Another major trend is micro-personalization, or the collision of big data with user interfaces, in which vast volumes of personal data are used to craft deeply personalized interfaces. Depending on how this trend develops, it could either reinforce or help to correct troubling aspects of society, such as political polarization. Finally, there is clean tech, the advent of new technologies that will help to enable the shift to cleaner, renewable sources of electricity. These technologies will include smart grids and new storage technologies, and are anticipated to be a vibrant sector of the tech world for decades to come.

Abstract News © Copyright 2015 INFORMATION, INC.
Powered by Information, Inc.

To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe