ACM TechNews
Association for Computing Machinery
Welcome to the September 19, 2014 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets, and for iPhones and iPads.

HEADLINES AT A GLANCE


Boosting Hispanic Share of Tech Workforce Could Be Key to Closing STEM Gap
NextGov.com (09/18/14) Janie Boschma

The Obama administration has set a goal of increasing student exposure to opportunities in the science, technology, engineering, and math (STEM) fields by 50 percent and the number of STEM college graduates by 1 million in the next decade. Cecilia Munoz, director of the White House's Domestic Policy Council, says encouraging greater Hispanic participation in STEM fields is a crucial part of that goal. Hispanics accounted for 15 percent of the U.S. labor force in 2011, but only 7 percent of the STEM workforce. In addition, 17 percent of the U.S. population is Hispanic, but Hispanics earn only 10 percent of STEM bachelor's degrees. Munoz says the government sees this as "room for growth," and is trying to tap that potential with investments in higher-education programs that encourage Hispanics to pursue STEM careers. One such program is at the Georgia Institute of Technology, which boasts one of the highest rates of Hispanic STEM graduates in the country. Provost Rafael Bras says this is because the school actively supports Hispanic students pursuing STEM through recruitment, mentorships, scholarships, and other efforts. He says the school works to boost the success of Hispanic students through constant attention. "If the opportunity is not there, we can make it," Bras says.


Scientists Twist Radio Beams to Send Data
USC News (09/16/14) Robert Perkins

Scientists at the University of Southern California (USC) have twisted radio beams together to send data at high speeds without some of the problems experienced with optical systems. They reached data transmission rates of 32 Gbps across 2.5 meters of free space in a basement lab, which is fast enough to transmit more than ten 90-minute high-definition movies in one second and is 30 times faster than LTE wireless. "Not only is this a way to transmit multiple spatially collocated radio data streams through a single aperture, it is also one of the fastest data transmission via radio waves that has been demonstrated," says USC professor Alan Willner. The team passed each beam, carrying its own independent stream of data, through a spiral phase plate that twisted them into a unique orthogonal DNA-like helical shape. A receiver at the other end of the room then untwisted and recovered the different data streams. Radio offers the advantage of wider, more robust beams, which are better able to cope with obstacles between the transmitter and receiver, and it is not as affected by atmospheric turbulence. The researchers say the technology could be used in ultra-high-speed links for the wireless backhaul that connects base stations of next-generation cellular systems.
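The "orthogonal DNA-like helical shape" refers to orbital angular momentum (OAM) modes; the article does not give the formalism, but in the standard OAM description a beam with topological charge ℓ carries an azimuthal phase factor, and modes with different twist are mutually orthogonal, which is what lets each beam carry an independent data stream through the same aperture:

```latex
% Azimuthal phase of an OAM mode with topological charge \ell:
E_\ell(\phi) \propto e^{i\ell\phi},
\qquad
\int_0^{2\pi} e^{i\ell_1\phi}\,\bigl(e^{i\ell_2\phi}\bigr)^{*}\,d\phi
  = 2\pi\,\delta_{\ell_1\ell_2}.
```

Because the overlap integral vanishes unless the two twists match, a receiver that "untwists" by a given ℓ recovers only that beam's data stream.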


Three Pilot Projects Receive Grants to Improve Online Security and Privacy
NIST News (09/17/14) Jennifer Huergo

The U.S. National Institute of Standards and Technology announced it will issue nearly $3 million in grants to support three pilot projects as part of the National Strategy for Trusted Identities in Cyberspace (NSTIC) program it oversees. It is the third round of NSTIC grants since the program was launched in 2011 to support public- and private-sector collaboration toward the creation and adoption of secure, efficient, interoperable, and easy-to-use online credentials. The first grant will go to GSMA and its project to enable consumers and businesses to use mobile devices for identity and access management. The pilot project involves all four major U.S. mobile carriers and is based on the organization's Mobile Connect Initiative. The second grant will go to Confyrm, which is developing methods to mitigate the financial and reputation effects of criminals creating fake online accounts or hijacking existing accounts. The Confyrm project involves a major email provider, a major mobile operator, and multiple e-commerce sites. The final grant will go to MorphoTrust USA, which is developing a method of translating state IDs into online credentials in partnership with the North Carolina Departments of Transportation and Health and Human Services. The goal is to create a license-based credential that will enable residents to apply for state food benefits online.


For Electronics Beyond Silicon, a New Contender Emerges
Harvard University (09/16/14) Caroline Perry

Harvard University researchers have used correlated oxide to achieve reversible change in electrical resistance of eight orders of magnitude, which they note offers performance comparable to the best silicon switches. Correlated oxide also can function equally well at room temperature or a few hundred degrees above it, allowing it to be integrated into existing electronic devices and fabrication methods. The researchers say the breakthrough establishes correlated oxide as a promising semiconductor for future three-dimensional integrated circuits as well as for adaptive, tunable photonic devices. They have developed a new transistor, made primarily of an oxide called samarium nickelate, which achieves an on/off ratio that is comparable to that of state-of-the-art silicon transistors. "We have just discovered how to dope these materials, which is a foundational step in the use of any semiconductor," says Harvard professor Shriram Ramanathan. "By a certain choice of dopants--in this case, hydrogen or lithium--we can widen or narrow the band gap in this material, deterministically moving electrons in and out of their orbitals." The new orbital transistor involves having protons and electrons move in or out of the samarium nickelate when an electric field is applied. "This is a material that can challenge silicon," Ramanathan says.


Failing Students Saved by Stress-Detecting App
New Scientist (09/16/14) No. 2986 Paul Marks

Dartmouth College professor Andrew Campbell and colleagues recently conducted a 10-week experiment with 48 students to see if data gathered from their smartphones could be used to gauge their states of mind, and their likelihood to excel or struggle academically. The researchers developed the StudentLife app to track readings from smartphone sensors in such areas as physical activity levels, sleep patterns, frequency and duration of conversations, and global-positioning system location. Students who excelled interacted frequently with other people and had longer conversations, while depressed students were less social and exhibited excessive or disrupted sleep patterns. "We found for the first time that passive and automatic sensor data, obtained from phones without any action by the user, significantly correlates student depression level, stress, and loneliness with academic performance over the term," Campbell says. The researchers found over the course of the semester the students' workloads increased, stress heightened, and sleep, chat, and physical activity levels declined. The University of Cambridge's Cecilia Mascolo notes accessing such comprehensive data will be controversial. "You need to constrain this to a very specific application that will benefit people, and with the user always in control of their data," she says.
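The correlation the researchers describe can be illustrated with a minimal sketch: a Pearson correlation between one passively sensed feature and end-of-term grades. The feature values and GPAs below are made-up illustrative numbers, not StudentLife data.

```python
from statistics import mean, stdev

def pearson_r(xs, ys):
    # Pearson correlation coefficient between two equal-length series
    # (sample covariance divided by the product of sample std devs).
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

# Hypothetical per-student features: average daily conversation minutes
# sensed by the phone, and end-of-term GPA (illustrative numbers only).
conversation = [30, 45, 10, 60, 25, 50]
gpa          = [3.2, 3.6, 2.4, 3.8, 3.0, 3.5]
r = pearson_r(conversation, gpa)  # strongly positive for this toy data
```

In the study's framing, a consistently high |r| between a passive sensor feature and academic outcomes is what makes the feature a useful early-warning signal.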


Improved Tools for Online Social Innovation Platforms
CORDIS News (09/15/14)

The European Union-funded CATALYST project aims to develop and test collective intelligence tools and make them available as open source solutions to any online community, especially those concentrating on social innovation, sustainability, and citizen engagement. The CATALYST suite will provide a broad spectrum of capabilities, ranging from collective sensing through sense-making, ideation, and decision-making to collective action. Although the CATALYST package will not be ready for testing until the end of the year, the team has launched an "open call for collaboration," urging communities to test the tools. The researchers want communities from around the world to test the tools while engaging in large-scale discussions around a pre-defined societal issue. Interested communities can apply to test the CATALYST tools, which will be made freely available with the necessary level of support on how to use them and how to develop a community of participants. Among the members of the CATALYST consortium are five community partners tasked with curating large online communities on social innovation and citizen engagement platforms and two research institutes, as well as the Open University and the University of Zurich.


Reducing Traffic Congestion With Wireless System
MIT News (09/15/14) Larry Hardesty

Massachusetts Institute of Technology (MIT) researchers have developed RoadRunner, a system that employs global-positioning system-style turn-by-turn directions to steer drivers around congested streets. The researchers used data models to compare RoadRunner with a system currently being used in Singapore, which charges motorists with dashboard-mounted transponders a toll for entering congested areas. In the models, RoadRunner generated an 8-percent boost in average car speed during periods of peak congestion. "With our system, you can draw a polygon on the map and say, 'I want this entire region to be controlled,'" says MIT graduate student Jason Gao. "You could do one thing for a month and test it out and then change it without having to dig up roads or rebuild gantries." The researchers also tested the system on 10 cars in Cambridge, MA, which was sufficient to assess the efficiency of the communications system and of the vehicle-routing algorithm. The Cambridge test produced reliable data about the system's performance for use in simulations. RoadRunner assigns each region a maximum number of cars and routes additional cars around the region via turn-by-turn voice prompts. "RoadRunner offers the possibility of decentralizing as many decisions as possible at the lower level, without excluding that global decisions be made at the upper level," says Jean Bergounioux, secretary general of ATEC ITS France, a French industrial research consortium.
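The core idea described above, capping the number of cars inside a drawn region and rerouting the overflow, can be sketched as token-based admission control. This is an assumed simplification for illustration, not MIT's actual RoadRunner protocol (which hands tokens between cars wirelessly); the region names and capacities are hypothetical.

```python
class Region:
    """A controlled area with a fixed pool of entry tokens."""
    def __init__(self, name, capacity):
        self.name = name
        self.tokens = capacity  # max cars allowed inside at once

    def try_enter(self):
        # A car may enter only if a token is free.
        if self.tokens > 0:
            self.tokens -= 1
            return True
        return False

    def leave(self):
        # Leaving a region returns its token to the pool.
        self.tokens += 1

def route(car_path, regions):
    """Turn a planned path into directions: enter a region when a token
    is available, otherwise detour around it."""
    directions = []
    for region_name in car_path:
        if regions[region_name].try_enter():
            directions.append(f"enter {region_name}")
        else:
            directions.append(f"detour around {region_name}")
    return directions

# Toy example: a "downtown" polygon capped at 2 cars; three cars arrive.
regions = {"downtown": Region("downtown", 2)}
plans = [route(["downtown"], regions) for _ in range(3)]
```

The first two cars acquire tokens and enter; the third is rerouted, which is how the system holds each region at its configured maximum.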


Cutting the Cloud Computing Carbon Cost
EurekAlert (09/12/14)

Researchers in Algeria have developed an algorithm to optimize cloud computing systems for energy use and reduce their carbon footprint. The algorithm is designed to control the virtual machines (VMs) running on computers in a cloud environment so the energy use of core central-processing units (CPUs) and memory can be reduced as much as possible without affecting overall performance. The approach is divided into two phases, with the first being the selection of VMs using the modified minimization of migration algorithm. The solution is based on upper and lower physical resources thresholds, according to University of Oran researchers Jouhra Dad and Ghalem Belalem. The second phase is the allocation of the migrated VMs, which uses the modified multidimensional knapsack problem. The researchers' study shows virtualization of processes and live migration of VMs within the cloud service using their algorithm of selection and allocation enables consolidation of different tools and applications to consume less CPU capacity and memory. The algorithm lowers energy demand on servers by enabling several VMs to run on a single remote computer accessible to users without compromising performance.
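The two-phase shape of the approach, threshold-based VM selection followed by knapsack-style reallocation, can be sketched as follows. The thresholds, VM sizes, and greedy best-fit placement are illustrative assumptions for a one-dimensional (CPU-only) case, not the published algorithm.

```python
UPPER = 0.8  # assumed upper utilization threshold per host

def select_vms_to_migrate(hosts):
    """Phase 1 (sketch): on overloaded hosts, pick the smallest VMs
    whose removal brings CPU use back under the upper threshold."""
    migrate = []
    for host in hosts:
        used = sum(vm["cpu"] for vm in host["vms"])
        for vm in sorted(host["vms"], key=lambda v: v["cpu"]):
            if used <= UPPER * host["capacity"]:
                break
            host["vms"].remove(vm)
            used -= vm["cpu"]
            migrate.append(vm)
    return migrate

def allocate(migrated, hosts):
    """Phase 2 (sketch): greedy best-fit stand-in for the modified
    knapsack: pack each VM onto the fullest host with room, so lightly
    used hosts can eventually be powered down."""
    for vm in sorted(migrated, key=lambda v: -v["cpu"]):
        candidates = [h for h in hosts
                      if sum(v["cpu"] for v in h["vms"]) + vm["cpu"]
                      <= UPPER * h["capacity"]]
        if candidates:
            target = max(candidates,
                         key=lambda h: sum(v["cpu"] for v in h["vms"]))
            target["vms"].append(vm)

# Host 0 is overloaded (95% of capacity); host 1 is nearly idle.
hosts = [
    {"capacity": 100, "vms": [{"cpu": 50}, {"cpu": 40}, {"cpu": 5}]},
    {"capacity": 100, "vms": [{"cpu": 10}]},
]
moved = select_vms_to_migrate(hosts)
allocate(moved, hosts)
```

After both phases every host sits below the upper threshold, which is the consolidation property the researchers report lowers energy demand.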


Teaching Computers the Nuances of Human Conversation
University of California Santa Cruz Newscenter (09/11/14) Melissae Fellet

University of California, Santa Cruz professor Marilyn Walker says natural-language processing has become very good at recognizing human speech, but it still struggles to understand its subtle, yet crucial, nuances. In particular, it cannot yet pick up on the ways that word choice and sentence construction reveal a speaker's personality and the social relationships between people. Walker says developing software that can do that would be a major step along the road to creating companion robots and other naturalistic technology. One of the projects being worked on by Walker and her colleagues analyzes thousands of conversations from online debate forums. Using crowdsourced summaries of these conversations--which touch on a variety of topics from politics to pets to technology--and analyzing them linguistically, the researchers hope to tease out information about how sarcasm is conveyed in text, how people present and support arguments, and other ways people communicate. The goal is to create software that will be able to summarize online conversations and readily identify what position any given participant is taking. Walker says such a tool could make debates less polarized by enabling people to identify and understand opposing arguments.


Mapping Big Data
SDSU News (09/11/14) Michael Price

Researchers at San Diego State University (SDSU) are applying geographic mapping principles to large sets of unstructured data in hopes of gaining insights into the relationships between them. The mapping techniques being developed at SDSU's Center for Information Convergence and Strategy (CICS) focus on large sets of data that are in mutually unintelligible formats. The data can include everything from journal articles to blog posts, and CICS' mapping is meant to make the parallels and connections between them readily and quickly apparent. An example is a map the researchers made by feeding about 2 million medical journal articles into their algorithm. The map it produced enabled CICS researchers to discover that articles discussing obesity also touched on exercise, blood pressure, insulin, aging, heart rate, and school lunches, among other topics. "How do you read a million papers? You can't. But this gives you a quick topographic overview of a new field," says SDSU professor Andre Skupin, co-director of CICS. He says CICS will serve as a resource for SDSU researchers and those in the community, but notes government agencies and public interest groups also have expressed interest.
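The obesity example above rests on spotting which topics appear together across many documents. A minimal stand-in for that step is a term co-occurrence count, which yields the edge weights a topic map could be drawn from; the CICS pipeline itself is not described in the article, and the toy documents below are invented.

```python
from collections import Counter
from itertools import combinations

# Tiny stand-in corpus: each string is one document's topic terms.
docs = [
    "obesity exercise insulin",
    "obesity blood-pressure aging",
    "exercise heart-rate",
    "obesity school-lunches exercise",
]

# Count how often each pair of terms appears in the same document.
cooccur = Counter()
for doc in docs:
    terms = sorted(set(doc.split()))
    for a, b in combinations(terms, 2):
        cooccur[(a, b)] += 1

# The heaviest edge links the terms most often mentioned together.
strongest = cooccur.most_common(1)[0]
```

Run over millions of articles, counts like these are what let a map-style layout place "obesity" near "exercise" without anyone reading the papers.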


Multiscreen Social TV Would Enrich Traditional Viewing Experience
Phys.Org (09/11/14) Lisa Zyga

Tablet computers and smartphones could function as a second screen and help facilitate a more socially interactive experience for people watching TV. A team of researchers led by Nanyang Technological University's Yonggang Wen has proposed synchronizing such a second screen with the TV and integrating geolocation social media to create a social TV system. The researchers say their approach is easy to implement because users would take a screen shot of the TV to capture a code and then carry out a quick flip gesture to trigger the transmission process. The method takes less than five seconds, compared to 40 seconds for similar technology that would require users to visit a Web page and log on. Users would be able to view a wide variety of features on the tablet or smartphone to accompany a TV show. A "social sense subsystem" would crawl social media associated with a show and provide related social discussions from Twitter, Facebook, Google+, and other sources, along with geosocial and group-watching functionalities. Crawlers that seek out keywords, key users, and manually selected known accounts identify pertinent content. The approach would require an efficient method for distributing large amounts of user-generated content in order to minimize bandwidth consumption.


Open Source Project Promises Easy-to-Use Encryption for Email, Instant Messaging and More
IDG News Service (09/15/14) Lucian Constantin

The Pretty Easy Privacy (PEP) project aims to create free tools that simplify the encryption of online forms of communication by solving the complexity associated with the exchange and management of encryption keys. PEP researchers want to integrate the technology with existing communication tools on different desktop and mobile platforms. Most PEP software will be released under the GNU General Public License version 3 and will be free to use, but the researchers also will develop business products that will be commercialized. Although the PEP engine relies on existing open source technologies, its primary goal is to provide "no hassle" privacy through a "zero-touch" user experience. PEP automatically generates encryption keys for the user or imports them from a local PGP client. The system also can discover the keys for the user's communication partners. "The PEP engine is doing exactly what a hacker does when he or she is using PGP: create a good keypair with reliable algorithms, handle it safely, manage public keys of other people, and operate the crypto solution in the best known way to keep it safe," says PEP founder Volker Birk. The researchers note instead of a centralized infrastructure, the technology uses peer-to-peer technology to exchange data.
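The "zero-touch" behavior described above can be sketched as a simple decision rule: generate a keypair on first use, encrypt whenever the partner's public key is already known, and otherwise send in clear while attaching your own key so the next exchange can be encrypted. This is assumed illustrative logic with placeholder key material, not PEP's actual engine or API.

```python
# Toy stand-in for a PGP keyring: address -> public key material.
known_keys = {}

def ensure_own_key(address):
    # Generate (or, in a real client, import from a local PGP keyring)
    # a keypair the first time this user sends mail.
    if address not in known_keys:
        known_keys[address] = f"PUBKEY({address})"  # placeholder only
    return known_keys[address]

def send(sender, recipient, body):
    ensure_own_key(sender)
    if recipient in known_keys:
        # Partner key known: encrypt transparently, no user action.
        return {"to": recipient, "encrypted": True,
                "body": f"ENC[{body}]"}
    # No key yet: send in clear but attach our public key so the
    # partner's client can start encrypting on the next round trip.
    return {"to": recipient, "encrypted": False,
            "body": body, "attached_key": known_keys[sender]}

first = send("alice@example.org", "bob@example.org", "hi")
# Bob's reply would carry his key; simulate the client learning it:
known_keys["bob@example.org"] = "PUBKEY(bob@example.org)"
second = send("alice@example.org", "bob@example.org", "hi again")
```

The point of the sketch is the upgrade path: the first message is unencrypted, but every message after key discovery is protected without the user touching a key.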


Sensors Everywhere Could Mean Privacy Nowhere, Expert Says
Purdue University News (09/16/14) Steve Tally

Purdue University professor Eugene Spafford, executive director of the Center for Education and Research in Information Assurance and Security, says the Internet of Things will involve microprocessors and sensors placed essentially everywhere, permanently recording data on people, often without their knowledge. Spafford, immediate past chair and a current member of the U.S. Public Policy Council of ACM, says such data collection can be useful in such areas as energy use, healthcare decisions, and financial decisions. "The downside is that we give up a lot of our privacy, and, in fact, maybe all of it," he warns. Spafford notes Internet microprocessors are already present in such devices as the Nest thermostat, which learns how homeowners prefer to heat or cool their homes. He says there also are Internet-connected refrigerators that inform users when groceries need to be purchased and can create a shopping list. However, Spafford notes such information will enable companies to know when a person is at home, how many people live there, and other details. He believes it is essential that companies furnish consumer information similar to what drug companies provide with medications. "We need a notice of the level of some of these observations, and which of these observations should we be allowed to opt out of," Spafford says. "There needs to be greater transparency about what is done with the information that's collected, the accuracy of the data, and where it's going."


Abstract News © Copyright 2014 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: technews@hq.acm.org
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe