Association for Computing Machinery
Welcome to the October 23, 2013 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets, iPhones, and iPads.

HEADLINES AT A GLANCE


NASA Says First Space Internet Test 'Beyond Expectations'
Computerworld (10/22/13) Sharon Gaudin

The first tests of a laser communication system by the U.S. National Aeronautics and Space Administration (NASA) have far surpassed the agency's expectations. A NASA lunar probe recently began a month-long test of the system, which could eventually become an outer space Internet. "It's been beyond what we expected," says NASA's Don Cornwell. "We obviously expected it would work well, but this is even better...everything going better than we thought it would. We're running these systems error-free." Cornwell says the laser communications system could form the building blocks of an outer space Internet. "This is the beginning of that," he says. "I think we could have that with delay tolerant networking." NASA hopes to use similar systems for faster satellite communications and deep space communications with robots and human exploration crews in the future. Two-way laser communications systems can deliver six times more data with 25 percent less power than the best radio systems currently in use, and weigh half as much, Cornwell notes. "Oh, it's going to enable a lot of things," he says, "but the big benefit is you can send back more data from wherever you are."


Intel Opens Parallel Computing Centers for Exascale Push
eWeek (10/22/13) Jeffrey Burt

Intel recently opened Parallel Computing Centers around the world and sent out requests for other companies to collaborate in trying to reach exascale levels of computing by the end of the decade. Intel's Raj Hazra says reaching this goal will require new hardware and software that exploit parallelism across both individual computing nodes and the entire system, and that work at every scale, from workstations to high-end supercomputers. "Through these centers, Intel hopes to accelerate the creation of open standard, portable, scalable, parallel applications by combining computational science, hardware, programmer tools, compilers, and libraries, with domain knowledge and expertise," Hazra says. The new Parallel Computing Centers also will help get workloads ready for parallel computing, says James Reinders, Intel's director of parallel programming evangelism. "The need for neo-heterogeneous computing is enormous," Reinders says. "It combines the promise of heterogeneous to deliver better compute density, compute performance, and lower power consumption, while including the benefits of neo-heterogeneous computing to maintain programming flexibility, performance, and efficiency for developers." Hazra says the first five Parallel Computing Centers will be at Cineca in Italy, Purdue University, the Texas Advanced Computing Center at the University of Texas at Austin, the University of Tennessee, and Zuse Institut Berlin in Germany.


IBM Unveils Computer Fed by 'Electronic Blood'
BBC News (10/18/13) James Morgan

IBM has released a prototype of a computer modeled after the human brain that uses liquid both to fuel and cool the system. The new "redox flow" system uses "electronic blood" to bring power into the computer and remove heat. IBM researchers aim by 2060 to fit a one-petaflop computer onto a desktop, although such a system would currently fill half a football field. "We want to fit a supercomputer inside a sugarcube. To do that, we need a paradigm shift in electronics--we need to be motivated by our brain," says IBM Zurich lab's Bruno Michel. "The human brain is 10,000 times more dense and efficient than any computer today." The human brain's efficiency stems from its use of a single network of capillaries and blood vessels for heat and energy, Michel says. "Ninety-nine percent of a computer's volume is devoted to cooling and powering. Only 1 percent is used to process information, and we think we've built a good computer," says Michel, noting that 40 percent of the brain's volume is devoted to functional performance while only 10 percent is dedicated to energy and cooling. Michel's bionic computing architecture features a three-dimensional design, with a tall stack of chips and memory storage units integrated with processors.


Private-Sector IT Pros to See 5.6 Percent Average Pay Boost in 2014
NextGov.com (10/18/13) Brittany Ballenstedt

Base compensation for U.S. IT professionals is expected to rise an average of 5.6 percent in 2014, with tech salaries seeing the largest gains among all fields researched, according to Robert Half Technology's 2014 Salary Guide. Mobile applications developers will see average increases of 7.8 percent next year, with salaries ranging from $92,750 to $133,500, while software developers will see average increases of 7.7 percent, with salaries ranging from $80,250 to $127,250, according to the report. Cybersecurity professionals also will continue to see higher-than-average salaries in 2014, with most in the six-figure range. In addition, the report notes retaining in-demand tech professionals can be challenging. As a result, many organizations provide IT professionals with access to low-cost healthy meals and snacks, as well as on-site health services, professional development opportunities, generous family leave policies, and a green work environment. "To create a work environment that's satisfying and enhances productivity--and to stand apart from competitors--many tech firms are offering creative incentives, along with generous compensation packages," the report says.


Scientists Use Social Media--Vacation Photos From Flickr--to Study How People Use Natural Areas for Tourism
Stanford Report (CA) (10/18/13)

Crowdsourced social media can be used to measure the benefits and value of natural areas such as parks and beaches. Scientists affiliated with the Natural Capital Project at Stanford University have utilized information from 1.4 million geo-tagged images from the Flickr photo-sharing site, and the user profiles associated with them, to see where people were going and where they were coming from. They compared this information with data from on-site surveys at 836 recreational sites around the world, and determined the Flickr information can serve as a reliable indicator of how many people visit a tourist attraction each year and when they are visiting. "Information from crowdsourced social media is revolutionizing the way we study people and understand their choices," says Spencer Wood, lead author of the research. Researchers previously had to rely on local surveys and head counts to get this level of visitation information, but the use of social media is faster, more affordable, and better for looking at changes over time and space. The approach enables researchers to determine visitation rates and values for tourism and recreation without conducting local surveys. The researchers say crowdsourced social media also could lead to better management of natural areas.
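As a rough illustration of the idea (not the Natural Capital Project's actual pipeline), the sketch below counts how many distinct users photographed each site on each day and aggregates those counts by year as a proxy for visitation; the site list, the 2-kilometer radius, and the photos input are hypothetical assumptions.

# Hypothetical sketch: distinct photographer-days per site per year from
# geo-tagged photo records, as a stand-in proxy for visitation counts.
from collections import defaultdict
from datetime import date
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points, in kilometers.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def photo_user_days(photos, sites, radius_km=2.0):
    # photos: iterable of (user_id, date, lat, lon) records
    # sites:  dict of site_name -> (lat, lon) center point
    seen = defaultdict(set)          # (site, year) -> {(user, date), ...}
    for user, day, lat, lon in photos:
        for site, (slat, slon) in sites.items():
            if haversine_km(lat, lon, slat, slon) <= radius_km:
                seen[(site, day.year)].add((user, day))
    return {key: len(visits) for key, visits in seen.items()}

if __name__ == "__main__":
    sites = {"Example Beach": (36.62, -121.90)}
    photos = [
        ("alice", date(2012, 7, 4), 36.621, -121.901),
        ("bob",   date(2012, 7, 4), 36.619, -121.899),
        ("alice", date(2012, 8, 1), 36.620, -121.902),
    ]
    print(photo_user_days(photos, sites))   # {('Example Beach', 2012): 3}

Comparing such proxy counts against on-site survey totals, for example with a simple correlation, is the kind of check the researchers describe when they call the Flickr data a reliable indicator of annual visitation.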


Social Media Tested as Job-Relevant Learning Resource Rather Than Distraction to Tough Computer Science Learning
WSU News (10/21/13) Tina Hilding

Washington State University (WSU) researchers have developed a Facebook-style public forum designed to improve collaboration and learning. The platform aims to enable students to share small pieces of computer code and help each other learn. The researchers also will develop guidelines and best practices to help instructors use social programming assignments in their classes. In the future, the researchers plan to expand the application to other science, technology, engineering, and math courses. "This is important because it helps students overcome their social isolation, lets them into their peers' problem-solving activities, and brings them together with others who share their struggles," says WSU professor Chris Hundhausen. The researchers are testing the platform in introductory computer science courses at WSU and comparing the results with data from Pacific University and Mesa Community College to see how providing social support could work in different educational environments. Hundhausen also plans to test the platform's use in other science and math classes. "This project will help us understand how learning and participation proceed in a social problem-solving environment, as well as its impact on student success," he says.


Atomically Thin Device Promises New Class of Electronics
Northwestern University Newscenter (10/21/13) Sarah Ostman

Northwestern University researchers have created a p-n heterojunction diode, an interface between two types of semiconducting materials, by integrating two atomically thin materials, molybdenum disulfide and carbon nanotubes. The technology could help lead the way to a new generation of computing based on nanoscale electronics. "The p-n junction diode is among the most ubiquitous components of modern electronics," says Northwestern professor Mark Hersam. "By creating this device using atomically thin materials, we not only realize the benefits of conventional diodes but also achieve the ability to electronically tune and customize the device characteristics." The p-n junction diode forms the basis of several technologies, including solar cells, light-emitting diodes, photodetectors, computers, and lasers. The p-n heterojunction diode also is highly sensitive to light, an attribute that enabled the researchers to create an ultrafast photodetector with an electronically tunable wavelength response. "We anticipate that this work will enable new types of electronic functionality and could be applied to the growing number of emerging two-dimensional materials," Hersam says.


Managing the Deluge of 'Big Data' From Space
NASA News (10/17/13) Whitney Calvin

Researchers at the U.S. National Aeronautics and Space Administration's (NASA) Jet Propulsion Laboratory (JPL) are developing new strategies for managing large and complex data streams. "We are the keepers of the data, and the users are the astronomers and scientists who need images, mosaics, maps, and movies to find patterns and verify theories," says JPL's Eric De Jong, principal investigator for NASA's Solar System Visualization project. He says there are three aspects to managing data from space missions: storage, processing, and access. The researchers are developing software tools for better data storage, such as cloud computing techniques and automated programs for extracting data. "We don't need to reinvent the wheel," says JPL's Chris Mattmann. "We can modify open source computer codes to create faster, cheaper solutions." The researchers also are developing new ways to visualize the data. "Data are not just getting bigger but more complex," De Jong says. "We are constantly working on ways to automate the process of creating visualization products, so that scientists and engineers can easily use the data." He notes that as these new tools evolve, so will the ability to make sense of big data projects.


Forget Captcha, Try Inkblots
InformationWeek (10/17/13) Mathew J. Schwartz

Carnegie Mellon University researchers have developed Generating panOptic Turing Tests to Tell Computers and Humans Apart (Gotcha), a password mechanism based on a randomized puzzle-generation protocol that involves computer-human interaction. Gotcha works by generating an inkblot and asking the user to enter a text description of it. The site stores both the inkblot and the description; when the user returns, it displays the inkblot and asks the user to recognize their previous description from multiple potential selections. The researchers say the system "relies on the usability assumption that users can recognize the phrases that they originally used to describe each inkblot image." Gotcha could be used to prevent attackers from grabbing password files from servers and then cracking them offline, which continues to be a pervasive problem. "Any adversary who has obtained the cryptographic hash of a user's password can mount an automated brute-force attack to crack the password by comparing the cryptographic hash of the user's password with the cryptographic hashes of likely password guesses," the researchers note. They say that by using Gotchas, businesses could "mitigate the threat of offline dictionary attacks against passwords by ensuring that a password cracker must receive constant feedback from a human being while mounting an attack."
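As a simplified sketch of the enrollment-and-recognition flow described above (not the CMU researchers' implementation), the Python below seeds an inkblot, stores the user's free-text description alongside a salted password hash, and at login asks the user to pick their own phrase out of decoys; render_inkblot, the decoy phrases, and the callback hooks are hypothetical placeholders.

# Hypothetical sketch of a Gotcha-style login flow (simplified).
import hashlib
import os
import random

def render_inkblot(seed: int) -> str:
    # Stand-in for real inkblot rendering: derive a small mirrored ASCII
    # pattern deterministically from the seed.
    rng = random.Random(seed)
    rows = []
    for _ in range(5):
        half = "".join(rng.choice(" #") for _ in range(4))
        rows.append(half + half[::-1])
    return "\n".join(rows)

def enroll(password: str, describe) -> dict:
    # `describe` is a callback standing in for the human user's description.
    seed = int.from_bytes(os.urandom(4), "big")
    salt = os.urandom(16)
    pw_hash = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return {"seed": seed, "salt": salt, "pw_hash": pw_hash,
            "label": describe(render_inkblot(seed))}

def verify(record: dict, password: str, choose) -> bool:
    pw_hash = hashlib.pbkdf2_hmac("sha256", password.encode(),
                                  record["salt"], 100_000)
    if pw_hash != record["pw_hash"]:
        return False
    # Redisplay the same inkblot with decoy descriptions; only the human who
    # enrolled should recognize the phrase they originally wrote.
    options = ["a moth with torn wings", "two dancers back to back",
               record["label"]]
    random.shuffle(options)
    return choose(render_inkblot(record["seed"]), options) == record["label"]

if __name__ == "__main__":
    rec = enroll("hunter2", describe=lambda blot: "a crab holding balloons")
    ok = verify(rec, "hunter2",
                choose=lambda blot, opts: "a crab holding balloons")
    print("login accepted:", ok)

In the researchers' scheme it is this human-recognition step that forces an offline password cracker to involve a person in every guess; the sketch above only mirrors the user-facing flow the article describes.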


Software Uses Cyborg Swarm to Map Unknown Environs
NCSU News (10/16/13) Matt Shipman

North Carolina State University researchers have developed software that could be used to map unknown environments based on the movement of a swarm of insect cyborgs. The process involves attaching electronic sensors to insects, such as cockroaches, releasing them into a hard-to-reach area, and controlling them remotely. For example, the biobots could be dispatched into a collapsed building where global positioning system technology does not work. The biobots would be allowed to spread out, and when radio signals indicate they are getting too close to one another, a command can be sent telling them to keep moving until they find more walls or other unbroken surfaces. The software then uses an algorithm to translate the biobot sensor data into a rough map of the area, which would be helpful to first responders. The location of radioactive or chemical threats could be determined by equipping the insects with relevant sensors. Having tested the approach in computer simulations, the team is now testing the software with robots and will then test it with biobots.
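The article does not detail NC State's mapping algorithm, but as a hedged sketch of the general idea, the Python below drops reported wall-contact points from the biobots into a coarse occupancy grid and prints a rough outline of the space; the coordinates, cell size, and wall-hit feed are all assumed inputs.

# Hypothetical sketch: rasterize biobot wall-contact reports into a rough map.
def rough_map(wall_hits, cell=0.5, width=12, height=8):
    # wall_hits: iterable of (x, y) positions, in meters, where a biobot
    # stopped against a wall or other unbroken surface.
    grid = [["." for _ in range(width)] for _ in range(height)]
    for x, y in wall_hits:
        col, row = int(x / cell), int(y / cell)
        if 0 <= row < height and 0 <= col < width:
            grid[row][col] = "#"          # mark the cell as a likely wall
    return "\n".join("".join(row) for row in grid)

if __name__ == "__main__":
    # Hits clustered along two walls of a hypothetical collapsed room.
    hits = [(x * 0.5, 0.1) for x in range(12)] + [(0.1, y * 0.5) for y in range(8)]
    print(rough_map(hits))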


Free Broadband Via Gaps in Spectrum Gets Biggest Trial
New Scientist (10/17/13) Paul Marks

European Union researchers are trying to help white space radio technology fulfill its potential. The need to ensure that it does not interfere with TV signals is the main reason white space technology development has been delayed. Until now, white space trials have been small in scale and unable to prove how viable the technology is. However, over the next six months, European organizations will launch the first mass citywide and rural trials. For example, in Glasgow, researchers will test both fixed and mobile reception of wireless Internet signals over white space frequencies. University of Strathclyde researchers will use sensors spread across the city to test the technology's use in "smart city" deployments. "TV white space will be an absolutely key enabling technology for intelligent living," says Intel Europe's Martin Curley. "It is essentially free to use, it is underutilized, and you get great broadband performance from it." Bell Labs and Intel are developing white space microchips that will fit in a smartphone to offer free local wireless services.


Cybersecurity Jobs: Young Adults Show Little Interest in the Field
eWeek (10/20/13) Sean Michael Kerner

Few young adults are interested in cybersecurity careers, according to a study from Raytheon and Zogby Analytics. Based on a survey of 1,000 U.S. adults ages 18 to 26, just 24 percent of respondents expressed a desire to pursue a cybersecurity career. By contrast, 40 percent of respondents were interested in careers in the entertainment business, while 32 percent were interested in being an app designer/developer. The study found that men gravitated toward careers in cybersecurity more than women did. Lack of awareness is one possible explanation for the disinterest in cybersecurity, as 82 percent of respondents noted that their high school guidance counselors never mentioned the possibility of a career in the field. The survey also asked which workplace incentives help determine what kinds of jobs young adults want to pursue. Interesting work was cited by 69 percent of respondents, while 65 percent said competitive pay is critical. Analysts say the lack of new talent in the pipeline could be fueling the increase in cyberattacks. The International Information Systems Security Certification Consortium reported earlier this year that existing IT security professionals feel swamped due to staffing shortages.


What Makes a Data Visualization Memorable?
Harvard University (10/16/13) Caroline Perry

Harvard University and Massachusetts Institute of Technology (MIT) researchers have offered a new take on the debate over data visualization, stating that the same design elements that garner so much criticism, such as too much text, excessive ornamentation, gaudy colors, and clip art, also can make a visualization more memorable. "We like to believe our memories are unique, that they're like the soul of a person, but in certain situations it's as if we have the same algorithm in our heads that is going to be sensitive to a particular type of image," says MIT's Aude Oliva. The researchers collected more than 5,000 charts and graphics from scientific papers, design blogs, newspapers, and government reports and manually categorized them by a wide range of attributes. Displaying them for brief glimpses, the researchers tested the influence of features such as color, density, and content themes on users' ability to recognize which ones they had seen before. "A visualization will be instantly and overwhelmingly more memorable if it incorporates an image of a human-recognizable object--if it includes a photograph, people, cartoons, logos--any component that is not just an abstract data visualization," says Harvard professor Hanspeter Pfister. Unusual types of charts also were found to be more memorable.


Abstract News © Copyright 2013 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]