Welcome to the February 19, 2014 edition of ACM TechNews, providing timely information for IT professionals three times a week.
ACM TechNews mobile apps are available for Android phones and tablets (click here) and for iPhones (click here) and iPads (click here).
HEADLINES AT A GLANCE
Revamped Computer Science Classes Attracting More Girls
San Francisco Chronicle (02/18/14) Kristen V. Brown
A growing number of universities are overhauling their computer science courses to attract more women, and the efforts appear to be working. Last spring, a University of California, Berkeley introductory computer science class enrolled more women than men for the first time. Berkeley professor Dan Garcia says he broadened the class' scope beyond programming to focus on the impact and relevance of computing in the world, and included pair exercises. Although Garcia says the course redesign was not exclusively for the purpose of drawing greater female participation, the changes removed some of the aspects of computer science that tend to repel women. As universities strive to draw more women into technology fields, they walk a fine line of trying to appeal without diminishing the technical aspects of a subject or making women feel targeted. A 2012 University of Michigan study found that gender-neutral role models were actually more effective than feminine role models in capturing the interest of middle school girls in science and math fields. Women appear to misperceive computer science, according to a 2008 ACM study, which found college-bound girls associated computing with words such as "typing," "math," and "boredom." However, many women change their notions about computer science after studying the subject, particularly when exposed to its creative aspects and ability to impact the world.
Half of Americans Want to Live in a Smart City With Driverless Cars
Computerworld (02/18/14) Sharon Gaudin
Almost 50 percent of Americans want to live in a city where all vehicles are driverless, and 33 percent think that might happen in the next 10 years, according to an Intel survey of 12,000 people in eight countries. The survey also found that 40 percent of respondents think driverless vehicles would reduce the number of traffic accidents, while 38 percent said they would decrease traffic congestion, and 34 percent said they would reduce carbon emissions. Intel's Steve Brown says he is surprised by the survey's results. "They're probably overly optimistic, but it's nice to see that they're excited about the idea and think it will happen soon," Brown says. "I think it tells us that people are excited about a future that has some intelligence in it to make the world more convenient, more efficient, and safer." The survey also found that 54 percent of respondents would be willing to let an intelligent system determine what route everyone on the road would take to their destinations if it meant overall commute time would be reduced by 30 percent. In addition, 50 percent of Americans said they would allow the government to put a sensor on their cars to help them with intelligent parking.
Smarter Caching Strategies Boost Multicore Chip Performance
MIT News (02/19/14) Larry Hardesty
Researchers at the Massachusetts Institute of Technology (MIT) and the University of Connecticut say they have developed a set of caching strategies for massively multicore chips that significantly improves chip performance while reducing energy consumption. The researchers say the new strategy results in average gains of 15 percent in execution speed and energy savings of 25 percent. The caches on multicore chips are typically arranged in a hierarchy. Each core has its own private cache, which may itself have several levels, while all of the cores share the last-level cache (LLC). "One scenario where an application does not exhibit good spatiotemporal locality is where the working set exceeds the private-cache capacity," says MIT's George Kurian. The researchers say their design solves that problem. When an application's working set exceeds the private-cache capacity, the MIT researchers' chip would split it up between the private cache and the LLC. Data stored in either place would stay there, no matter how recently it had been requested. However, if two cores working on the same data are constantly communicating in order to keep their cached copies consistent, the chip would store the shared data at a single location in the LLC. The cores would then take turns accessing the data, instead of clogging the network with updates.
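The placement policy described above can be sketched in a few lines of Python. This is an illustrative model only, not the researchers' hardware logic: the capacity figure and function names are invented, and the real decisions are made by on-chip cache controllers.

```python
# Hypothetical sketch of the caching policy described above: split an
# oversized working set between private cache and LLC, and keep one LLC
# copy of write-contended shared data. All names/figures are illustrative.

PRIVATE_CAPACITY = 256  # cache lines per core (illustrative figure)

def place_block(working_set_size, block_index, shared_and_contended):
    """Decide where a cache block should live.

    - Shared, write-contended data gets a single copy in the LLC, so
      cores take turns instead of exchanging coherence updates.
    - If the working set fits in the private cache, keep it all private.
    - Otherwise split: the first PRIVATE_CAPACITY blocks stay private,
      the rest are pinned in the LLC, and neither part is evicted merely
      because it was not requested recently.
    """
    if shared_and_contended:
        return "llc-single-copy"
    if working_set_size <= PRIVATE_CAPACITY:
        return "private"
    return "private" if block_index < PRIVATE_CAPACITY else "llc-pinned"
```

The key departure from a conventional recency-based hierarchy is the last rule: once a block is assigned a home, it stays there rather than migrating on every access.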
At Newark Airport, the Lights Are on, and They’re Watching You
The New York Times (02/17/14) Diane Cardwell
Lighting systems that can monitor the public are among a wave of data-collection technologies that bring tremendous benefits but also pose privacy concerns. The Port Authority of New York and New Jersey, for example, is using such a lighting system at its Newark Liberty International Airport. The technology uses light-emitting diode fixtures, sensors, video cameras, and a wireless network that sends data to software that analyzes the activity. The software detects long lines, recognizes license plates, flags suspicious activity, and sends alerts to staff. Originally designed to help organizations reduce energy use, these lighting systems can improve the management of energy, security, traffic, and people. Networked lighting systems are being tested by governments for traffic control, monitoring carbon dioxide, detecting when garbage cans are full, and broadcasting sound. Use of the technology is expected to grow as the systems become more advanced and less expensive. For example, malls could use the technology to send smartphone alerts to arriving customers to let them know where to find an open parking space or offer targeted deals. Although the technology has great potential, experts warn privacy implications should be carefully considered before the tools achieve widespread adoption.
New Skills Are Needed to Work on Internet of Things
The Washington Post (02/16/14) Mohana Ravindranath
The market for the Internet of Things could create demand for IT specialists who can engineer new products and process the data they collect. A 2011 McKinsey report estimates the U.S. faces a shortage of up to 190,000 people with deep data analytics skills, as well as a shortfall of 1.5 million managers and analysts. In response to this worker shortage, General Electric (GE) has been training data specialists for the past few years, says GE's Marco Annunziata. Until the global IT workforce produces enough workers who specialize in data science and software or hardware engineering, "we need to start developing them, to some extent," Annunziata says. "We will have more and more need for people who are a combination of data scientists and operation managers--people who have both an understanding of how to use data, how to use analytics, and also an understanding of their own business lines." Although Cisco also is looking for similar hires, the company targets candidates who can collaborate with people in other industries. Meanwhile, some universities, such as the University of California, Berkeley, have developed data-science programs to prepare students to work on Internet of Things projects. Carnegie Mellon University, the Massachusetts Institute of Technology, and Columbia University also have developed similar programs.
Flood Hack Event Looks to Help Relief Efforts
BBC News (02/17/14) Dave Lee
A recent Flood Hack event in London called on about 100 major technology company engineers and independent developers to use newly available government data to create useful resources and digital services. The event resulted in calls for more data to be made freely available. Until this weekend gathering, flood data held by the United Kingdom's Environment Agency was only available to those willing to pay a licensing fee. "The main issue we come up with is not just about money, it's about giving up control," says journalist Michael Cross. He says Flood Hack demonstrated the data could be used to create innovative public services. The information included readings, updated every 15 minutes, from every flood sensor in the UK, effectively giving the hackers live data on the situation across the country. Twenty hacks were presented to a panel of judges at the end of the event, and notable entries were given nominal prizes. One of the stand-out entries was a method of instantly reporting damaged flood defenses, a reporting process plagued by hoaxes. Many of the hacks used FloodVolunteers.co.uk, a website that aims to create a database of available helpers that could assist those in flood-hit areas.
Hackathons on the Rise
Associated Press (02/17/14) Martha Mendoza
This year a record 1,500 hackathons are planned around the world, up from just a handful in 2010. Although Yahoo is recognized as holding the first official hackathon in 2005, Facebook CEO Mark Zuckerberg gets credit for helping broaden the definition by urging his staff to "hack" by "building something quickly or testing the boundaries of what can be done." For example, a new Facebook option that gives users more than 50 ways to identify their gender beyond male and female was conceived during a company hackathon four months ago. As hackathons have become more popular, a set of rules has come together. Teams usually comprise a small number of people, and although designs, ideas, and even mock-ups can be worked on in advance, everyone starts writing code at the same time. In addition, teams own whatever they create. Similar to the tech industry itself, hackathon participants are mostly men, but some organizers are trying to change that. The AT&T Developer Summit hackathon had an unusually high number of female participants after organizers promised an extra $10,000 to any team with a majority of women.
Data Links Quick Fix
University of Isfahan researchers say they have developed software that can fix 90 percent of broken links in the Web of data, assuming the resources are still on the website's server. The researchers say their method is based on the source point of links and a way to discover the new address of the digital entity that has become detached. The technique creates a superior and an inferior dataset, which enables users to create an exclusive data graph that can be monitored over time in order to identify changes and trap missing links as resources become detached. "The proposed algorithm uses the fact that entities preserve their structure even after movement to another location," says University of Isfahan researcher Mohammad Pourzaferani. "Therefore, the algorithm creates an exclusive graph structure for each entity." The researchers tested the algorithm on two snapshots of DBpedia containing nearly 300,000 person entities. The algorithm identified almost 5,000 entities that changed between the first and second snapshot, and successfully relocated nine out of 10 broken links.
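The core idea, that an entity keeps its graph structure even after it moves, can be illustrated with a minimal sketch. This is a simplified stand-in for the researchers' method: it uses a plain neighbor set as each entity's "exclusive graph structure," whereas the actual algorithm builds a richer per-entity subgraph; all function names here are invented.

```python
# Illustrative sketch: relocate moved entities between two dataset
# snapshots by matching structural signatures. Graphs are modeled as
# dicts mapping an entity URI to a list of neighbor URIs.

def signature(graph, entity):
    """Structural signature of an entity: the frozen set of its
    neighbors (a simplified stand-in for a full per-entity subgraph)."""
    return frozenset(graph.get(entity, ()))

def relocate_broken_links(old_graph, new_graph, broken):
    """Map each broken entity URI to its probable new URI by matching
    signatures computed in the old and new snapshots."""
    # Index new entities by signature so each lookup is O(1).
    index = {}
    for entity in new_graph:
        index.setdefault(signature(new_graph, entity), entity)
    fixed = {}
    for entity in broken:
        match = index.get(signature(old_graph, entity))
        if match is not None:
            fixed[entity] = match
    return fixed
```

For example, if `db:Alice` is renamed to `db:Alice_Smith` between snapshots but keeps the same neighbors, the matcher recovers the new URI; entities whose structure also changed would need the fuller subgraph comparison the paper describes.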
Facebook Tells the Computer Who You Love
Cornell Chronicle (02/12/14) Bill Steele
Cornell University researchers have developed algorithms that can correctly identify a person's spouse, fiancé, or other romantic partner, based on a map of Facebook friends, with about 70-percent accuracy. "We are trying to build up a sort of chemistry kit for finding different elements of a network," says Cornell professor Jon Kleinberg. He says the method works best when the couple is married, and works better the longer the relationship has lasted. However, if the algorithm does not select the person who is the relationship partner, there is a significantly increased chance that in a month or two the couple will break up, according to the researchers. The researchers tested their algorithms on anonymized data from 1.3 million randomly selected Facebook users aged 20 or older who listed their status as "married," "engaged," or "in a relationship." The algorithms combine embeddedness and dispersion, as well as the dispersion of mutual friends, to determine if two people are in a relationship. The team will present their research at the ACM Conference on Computer Supported Cooperative Work and Social Computing taking place this week in Baltimore.
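The two measures the summary names are simple to state: embeddedness counts the mutual friends of a pair, while dispersion rewards mutual friends who come from different parts of a person's social life. The sketch below shows the basic forms of both measures on a friendship graph modeled as a dict of neighbor sets; the published algorithm layers normalization and recursion on top of this, so treat it as a simplified illustration.

```python
# Illustrative sketch of embeddedness and (absolute) dispersion for a
# tie (u, v) in a friendship graph, modeled as {person: set_of_friends}.
from itertools import combinations

def embeddedness(adj, u, v):
    """Number of mutual friends of u and v."""
    return len(adj[u] & adj[v])

def dispersion(adj, u, v):
    """Count pairs of mutual friends of u and v that are 'far apart':
    not friends with each other, and sharing no mutual friend other
    than u and v. A high score suggests the tie (u, v) bridges several
    otherwise-disconnected social circles, as romantic ties tend to."""
    mutual = (adj[u] & adj[v]) - {u, v}
    score = 0
    for s, t in combinations(mutual, 2):
        if t in adj[s]:
            continue  # directly connected: not dispersed
        if (adj[s] & adj[t]) - {u, v}:
            continue  # linked through someone besides u and v
        score += 1
    return score
```

A close friend from one circle typically yields high embeddedness but low dispersion, since the mutual friends all know one another; a partner's mutual friends span family, work, and college circles, so the dispersed pairs accumulate.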
CMU Partners With Yahoo! for $10-Million Machine Learning Initiative
Pittsburgh Post-Gazette (02/12/14) Deborah M. Todd
Researchers at Carnegie Mellon University (CMU) and Yahoo! have launched Project InMind, a five-year, $10-million partnership that gives university researchers access to a mobile toolkit of Yahoo's real-time data services and its infrastructure in order to advance machine learning and personalization of smartphone apps. Project InMind aims to create customized services that can anticipate users' needs and interests on an ongoing basis, whether a user is at home playing video games or navigating the streets of a foreign country. "This feels like the next very large step in a journey towards a grand dream...where computers will work in very close partnership with humans in ways that are very natural--anticipating what people want at the right time in the right place, being very responsive to requests, being able to deal with ambiguity and uncertainty, and also being very supportive and friendly rather than--perhaps--intimidating," says Yahoo Labs' Ron Brachman. As part of the program, CMU students and faculty will use a Yahoo!-sponsored fellowship program to support research in machine learning, mobile technologies, human-computer interaction, personalization, novel interaction techniques, and natural-language processing. In addition, the mobile toolkit will function as a test space for machine-learning algorithms designed to pick up on what is useful to users at a given time.
Connected Cars: Apps, Networks, and Storage on Wheels
Government Computer News (02/12/14) Patrick Marshall
University and commercial labs are developing mobile apps designed to help make transportation safer, but there needs to be a better way to integrate the disparate stovepipes of information services within the transportation segment. Clemson University professor Kuang-Ching Wang believes software-defined networking (SDN) is the solution. His research team has examined the current bottleneck for connected-vehicle technology and concludes there is no sustainable funding model for dedicated short-range communications (DSRC) technology. At the same time, other forms of wireless connectivity are already available in many places. With SDN, programmers can upload rules for handling network traffic to the logic chips in routers and switches, which will enable data from DSRC systems to easily be integrated with smartphone sensor data collected via Wi-Fi or Bluetooth networks. Most analysts believe the integration will take place within the vehicle, and more automakers are building vehicles that are simply platforms. Meanwhile, University of Nevada, Las Vegas (UNLV) professor Pushkin Kachroo says researchers at UNLV are developing smartphone applications that monitor seatbelt use in passing vehicles, and others that collect data on road surface conditions and transmit it to transportation managers for potential action.
IBM Opens the Door to 400Gbps Internet
ZDNet (02/12/14) Nick Heath
IBM researchers have developed technology that could help achieve Internet speeds of 200 Gbps to 400 Gbps. The researchers say the technology features an ultra-fast and energy-efficient analog-to-digital converter (ADC) that could enable data centers to share information at four times the speed currently possible. They say that at these speeds, 160 GB, the equivalent of a two-hour, 4K ultra-high-definition movie or 40,000 songs, could be downloaded in only a few seconds. Although the ADC is only part of the network infrastructure needed to achieve the higher data transmission speeds, developing the ADC was a significant step, says IBM's Pier Andrea Francese. In addition, he notes the ADC can work at these speeds without boosting energy consumption to unacceptable levels. The ADC was developed as part of IBM's work to build technologies that will support the Square Kilometre Array (SKA), an array of up to 3,000 radio telescopes that will gather cosmic emissions in an attempt to see the universe a few hundred million years after the Big Bang.
How HPC Is Hacking Hadoop
HPC Wire (02/11/14) Nicole Hemsoth
The Hadoop open source framework increasingly is being used with high-performance computing (HPC) environments, particularly for data-intensive scientific computing applications. Almost all major HPC system vendors and many software vendors are offering significant Hadoop enhancements, customized distributions, and sometimes new product lines. HPC systems can be modified to work with Hadoop for the purpose of providing more streamlined data management and processing on certain problems. The San Diego Supercomputer Center's (SDSC) Glenn K. Lockwood is renowned for his work on Hadoop for large-scale systems, particularly the Gordon flash-based data-intensive computing system at SDSC. Lockwood is experimenting with Hadoop clusters on Gordon and writing Hadoop applications in Python with Hadoop Streaming. "Although traditional supercomputers and Hadoop clusters are designed to solve very different problems and are consequentially architected differently, domain scientists are becoming increasingly interested in learning how Hadoop works and how it may be useful in addressing the data-intensive problems they face," Lockwood says. Users can launch a Hadoop cluster by submitting a single pre-made job script to the batch system on Gordon with which they are already familiar, eliminating the need to learn a new cloud API or be a systems administrator, says Lockwood; this has made it significantly easier for domain scientists to experiment with Hadoop's potential role in their analyses.
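The "Hadoop applications in Python with Hadoop Streaming" approach mentioned above works by letting any executable act as a mapper or reducer: the script reads records from stdin and emits tab-separated key/value pairs on stdout. The sketch below is a generic word-count example in that style, not Lockwood's actual application; the two phases are chained locally here, whereas a real job ships them as separate scripts to the streaming jar.

```python
#!/usr/bin/env python
"""Minimal Hadoop Streaming-style job: mapper and reducer as plain
Python over stdin/stdout. A generic word-count illustration."""
import sys
from itertools import groupby

def mapper(lines):
    # Emit (word, 1) for every word; Streaming would serialize these
    # as tab-separated "word\t1" lines on stdout.
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reducer(pairs):
    # Hadoop delivers mapper output sorted by key, so equal keys are
    # adjacent; sorted() stands in for that shuffle phase here.
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    # In a real job the phases run separately, e.g.:
    #   hadoop jar hadoop-streaming.jar -mapper mapper.py -reducer reducer.py ...
    # Chained locally here for a quick sanity check.
    for word, total in reducer(mapper(sys.stdin)):
        print(f"{word}\t{total}")
```

Because the contract is just text streams, a domain scientist can develop and debug the same scripts with `cat input | ./mapper.py | sort | ./reducer.py` before submitting them to a cluster like Gordon, which is much of the approach's appeal.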
Abstract News © Copyright 2014 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.