Welcome to the July 3, 2013 edition of ACM TechNews, providing timely information for IT professionals three times a week.
Please note: In observance of the Independence Day holiday, TechNews will not be published on Friday, July 5. Publication will resume Monday, July 8.
ACM TechNews mobile apps are available for Android phones and tablets, iPhones, and iPads.
HEADLINES AT A GLANCE
Colleges Work to Retain Women in STEM Majors
U.S. News & World Report (07/01/13) Kelsey Sheehy
Higher education institutions are working to encourage women to pursue majors in science, technology, engineering, and math (STEM) through efforts such as increased mentoring and all-female residence halls. Women represent only 25 percent of STEM degree holders, and many colleges are launching outreach programs to boost female enrollment. "We're really trying to build that community so if they are the only woman in a class or on a project team, they don't feel like the only one," says Tricia Berry, director of the Women in Engineering program at the University of Texas at Austin, where women account for only 10 to 15 percent of enrollment in some engineering majors. UT-Austin and Virginia Tech are among the schools designing learning communities that place new students in residence halls with more experienced female engineering mentors. Role models also are important for encouraging women to enter STEM fields. Michigan Technological University electrical engineering Ph.D. candidate Kaitlyn Bunker says she was inspired by both her mechanical engineer father and biographies that she read of award recipients at a Society of Women Engineers conference. "I realized these women were working on cutting-edge technologies and a lot of them have Ph.D.s," Bunker says. "That made me decide to go for one and to one day be one of these women."
Scientists Undertake Effort to Launch Video Data-Sharing Library for Developmental Science
New York University (07/02/13) James Devitt
Researchers at New York University (NYU) and Pennsylvania State University (PSU) have created Databrary, a Web-based video data library sponsored by the U.S. National Science Foundation and the National Institutes of Health. Databrary aims to encourage widespread data sharing in the developmental and behavioral sciences, where video is commonly employed but rarely shared. The library includes tools that enable researchers to store and openly share videos and related information about their studies. "By creating tools for open video data-sharing, we expect to increase scientific transparency, deepen insights, and better exploit prior investments in developmental and behavioral research," says NYU professor Karen Adolph. The researchers say video data sharing is a groundbreaking approach to big data efforts in scientific behavioral research. "Video can be combined with other data sources like brain imaging, eye movements, and heart rate to give a more complete and integrated picture of the brain, body, and behavior," says PSU professor Rick Gilmore. The researchers note that the Databrary project is the first large-scale, open data-sharing system that enables behavioral scientists to share and re-use research video files. The project also involves enhancing an existing open source software tool that researchers can use to score, explore, and analyze video recordings.
Man Invents New Language for Turning Graphics Chips Into Supercomputers
Wired News (07/02/13) Klint Finley
Indiana University Ph.D. candidate Eric Holk has developed Harlan, a programming language for creating applications that run on graphics processing units (GPUs). "GPU programming still requires the programmer to manage a lot of low-level details that often distract them from the core of what they’re trying to do," Holk says. "We wanted a system that could manage these details for the programmer, letting them be more productive and still getting good performance from the GPU." Computer calculations usually are run on the central processing unit (CPU), which handles a single thread of computations at one time. In contrast, the GPU processes multiple threads simultaneously, but with each thread executing more slowly. However, programs can be written to leverage parallelism to enable multiple threads to run more quickly, in a manner similar to a supercomputer. As GPUs become increasingly powerful and flexible, they are being used for a range of applications, such as modeling physical systems and improving smartphones. Harlan provides programming abstractions associated with higher-level programming languages. For programmers working on less-sophisticated applications, Holk is working on a GPU version of Mozilla's Rust programming language.
Teaching a Computer to Play 'Concentration' Advances Security, Understanding of the Human Mind
NCSU News (07/01/13) Matt Shipman
North Carolina State University (NCSU) researchers have developed software, built on the ACT-R cognitive architecture, that plays the game Concentration. The researchers say the software could lead to better computer security and a greater understanding of how the human mind works. In the game, multiple matching pairs of cards are placed face down in random order, and players flip over two cards, one at a time, to find the matching pairs. If a player flips over two cards that do not match, the cards are placed back face down. The researchers were able to either accelerate the model's decision-making, which led it to play more quickly but make more mistakes, or allow it to take its time, which led to longer games with fewer mistakes. As part of the study, 179 human participants each played Concentration 20 times: 10 times for accuracy and 10 times for speed. The researchers modified the model's parameters to determine which set of variables resulted in gameplay that most closely resembled that of the human study participants. "Ultimately, this sort of information could one day be used to develop tools to help software designers identify how their design decisions affect the end users of their products," says NCSU professor David Roberts.
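The speed-versus-accuracy trade-off the researchers tuned can be mimicked with a far simpler memory model than ACT-R's. The sketch below (all parameters hypothetical, not the study's model) plays Concentration with a per-turn forgetting probability standing in for hasty play:

```python
import random

def play_concentration(n_pairs, forget_prob, rng):
    """Play one game with a decaying memory: every seen card is
    remembered, but each memory is independently lost with probability
    forget_prob per turn. Higher forget_prob mimics fast, error-prone
    play. Returns the number of card flips needed to clear the board."""
    deck = list(range(n_pairs)) * 2
    rng.shuffle(deck)
    memory = {}                       # position -> recalled face value
    remaining = set(range(len(deck)))
    flips = 0
    while remaining:
        # Forget each remembered card independently.
        memory = {p: v for p, v in memory.items() if rng.random() > forget_prob}
        # If memory already holds a known pair, take it.
        by_face = {}
        for p, v in memory.items():
            by_face.setdefault(v, []).append(p)
        pair = next((ps for ps in by_face.values() if len(ps) >= 2), None)
        if pair:
            a, b = pair[:2]
        else:
            a = rng.choice(sorted(remaining))
            memory[a] = deck[a]
            # Second flip: use a remembered match if one exists.
            match = next((p for p, v in memory.items()
                          if v == deck[a] and p != a), None)
            b = match if match is not None else rng.choice(sorted(remaining - {a}))
            memory[b] = deck[b]
        flips += 2
        if deck[a] == deck[b]:
            remaining -= {a, b}
            memory.pop(a, None); memory.pop(b, None)
    return flips

rng = random.Random(0)
careful = play_concentration(8, forget_prob=0.05, rng=rng)  # "accuracy" play
hasty = play_concentration(8, forget_prob=0.60, rng=rng)    # "speed" play
```

Fitting a parameter like forget_prob to human play records is, in miniature, the kind of calibration the study performed against its 179 participants.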
Supercomputing Needs to Deliver 'Real-Life Value,' Senator Says
Computerworld (07/01/13) Patrick Thibodeau
The United States, China, Europe, Japan, and possibly Russia are all competing to be the first to develop an exascale computing system. U.S. Sen. Dick Durbin (D-IL) is co-sponsoring legislation in the U.S. Senate to spend $580 million on exascale development over the next three years. There is a similar bill in the House, sponsored by Rep. Randy Hultgren (R-IL), but it does not include a specific exascale funding recommendation. In order to win support for the legislation, high-performance computing systems will have to deliver results that will get notice, according to Durbin. He recently attended the dedication ceremony for Argonne National Laboratory's Mira supercomputer, an IBM Blue Gene/Q system that is the world's fifth-fastest computer. "We need to translate this computer into real-life value, [and] we need to incentivize those in public life who are willing to invest in the research," Durbin says. Mira is being used to design better concrete with a reduced carbon footprint, and to improve batteries, especially those used in automobiles, says Argonne's Rick Stevens. Durbin notes that other countries "are dreaming of getting ahead of the U.S. when it comes to basic research."
Pruning the Power Grid
MIT News (07/01/13) Jennifer Chu
Massachusetts Institute of Technology (MIT) researchers have developed an algorithm that identifies the most dangerous pairs of failures among the millions of possible failures in a power grid. The algorithm pares down all of the possible combinations to the sets most likely to cause widespread damage. The researchers tested their algorithm on data from a mid-sized power grid model consisting of 3,000 components, and within 10 minutes the algorithm had labeled 99 percent of the failures as relatively safe. The remaining 1 percent represented pairs of failures that would likely result in large blackouts if left unmonitored. The speed of the new algorithm is unmatched by similar conventional alternatives, says MIT professor Konstantin Turitsyn. "This algorithm can be used to update what are the events--in real time--that are the most dangerous," he says. Turitsyn says the algorithm identifies spheres of influence around a power failure. If two failures are relatively close, spheres of influence can overlap, intensifying the response and raising the probability of a catastrophic cascade. "This algorithm, if massively deployed, could be used to anticipate events like the 2003 blackout by systematically discovering weaknesses in the power grid," says Columbia University professor Daniel Bienstock.
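The sphere-of-influence screening can be sketched geometrically. The toy layout and radius below are hypothetical (the real algorithm reasons about the grid's electrical structure, not plain distance), but the pruning logic is the same: only pairs whose influence regions overlap are kept for expensive cascade analysis.

```python
import itertools
import math
import random

# Hypothetical toy layout: component id -> (x, y) location, with a
# fixed "sphere of influence" radius around each failure.
rng = random.Random(42)
components = {i: (rng.random(), rng.random()) for i in range(50)}
radius = 0.1

def spheres_overlap(p, q, r=radius):
    # Two spheres of radius r overlap when centers are closer than 2r.
    return math.hypot(p[0] - q[0], p[1] - q[1]) < 2 * r

# Screen all pairs; non-overlapping pairs are labeled relatively safe.
dangerous = [(i, j) for i, j in itertools.combinations(components, 2)
             if spheres_overlap(components[i], components[j])]
total = len(components) * (len(components) - 1) // 2
print(f"kept {len(dangerous)} of {total} pairs for detailed analysis")
```

Screening pairs this way is what makes the problem tractable: the number of pairs grows quadratically with grid size, but most pairs can be dismissed without simulating a cascade.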
Move Over Messi, Here Come the Robots
Associated Press (06/30/13) Toby Sterling
RoboCup is a soccer tournament in which more than 1,000 soccer-playing robots from 40 countries compete against each other. The tournament's ultimate goal is to defeat the human World Cup winners by 2050, creating technology along the way that will have applications beyond the realm of sports. To achieve that goal, the tournament organizers have created multiple competition classes, such as small robots, large robots, humanoid robots, and virtual robots, with plans to merge their techniques into one team of androids that can defeat the humans. In all of the divisions, there is no human interference allowed once the game starts except for substitutions, when humans are allowed to remove a bot that has broken down. The bots use different kicks for passing and shooting, and they communicate their position to each other via wireless Internet connections. One method several teams are using in this year's competition is path planning, in which the ball is passed toward open space as a robot moves to intercept it. Meanwhile, all contestants in the standard platform division use the same humanoid robot, made by Aldebaran Robotics, equipped with glowing eyes that change color to show emotion.
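Passing into open space only works if the receiving robot can compute whether, and when, it can reach the moving ball. A standard closed-form intercept calculation (a sketch, not any team's actual code) solves for the earliest time at which a robot moving at top speed can meet a constant-velocity ball:

```python
import math

def intercept_time(ball_p, ball_v, robot_p, robot_speed):
    """Earliest t >= 0 at which a robot moving at robot_speed can reach
    a ball with position ball_p and constant velocity ball_v: solve
    |ball_p + ball_v*t - robot_p| = robot_speed * t (a quadratic in t).
    Returns None if interception is impossible."""
    bx, by = ball_p[0] - robot_p[0], ball_p[1] - robot_p[1]
    vx, vy = ball_v
    a = vx * vx + vy * vy - robot_speed ** 2
    b = 2 * (bx * vx + by * vy)
    c = bx * bx + by * by
    if abs(a) < 1e-9:                 # ball speed equals robot speed
        return -c / b if b < 0 else None
    disc = b * b - 4 * a * c
    if disc < 0:
        return None
    roots = [(-b - math.sqrt(disc)) / (2 * a),
             (-b + math.sqrt(disc)) / (2 * a)]
    ts = [t for t in roots if t >= 0]
    return min(ts) if ts else None

# Ball at origin rolling right at 1 m/s; robot 2 m ahead, twice as fast.
print(intercept_time((0, 0), (1, 0), (2, 0), 2.0))  # 0.666... seconds
```

The passer runs the same calculation from the teammate's point of view to pick a target point the teammate can reach before any opponent.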
Large-Scale Quantum Chip Validated
USC News (06/28/13) Robert Perkins
University of Southern California (USC) scientists say the D-Wave processor housed at the USC-Lockheed Martin Quantum Computing Center appears to be operating as a quantum processor. The scientists verified the quantum effects of the first commercial quantum optimization processor in a demonstration that involved a small subset of the chip's 128 qubits. "Using a specific test problem involving eight qubits, we have verified that the D-Wave processor performs optimization calculations [that is, finds lowest-energy solutions] using a procedure that is consistent with quantum annealing and is inconsistent with the predictions of classical annealing," says the center's Daniel Lidar. Performing largely as hoped, the chip showed the potential for quantum optimization on a larger-than-ever scale. "Our work seems to show that, from a purely physical point of view, quantum effects play a functional role in information processing in the D-Wave processor," notes USC researcher Sergio Boixo. Lockheed Martin purchased the quantum processor from D-Wave nearly two years ago, and placed it at USC's engineering school for further testing.
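The "lowest-energy solutions" in question are configurations of an Ising-type optimization problem. For context on what the chip was compared against, here is the classical annealing baseline on a toy 8-spin problem (the couplings and cooling schedule are made up; this illustrates the optimization task, not the D-Wave hardware or the USC test):

```python
import math
import random

rng = random.Random(1)
n = 8
# Random +/-1 couplings between every pair of the 8 spins.
J = {(i, j): rng.choice([-1, 1]) for i in range(n) for j in range(i + 1, n)}

def energy(s):
    # Ising energy: sum of J_ij * s_i * s_j over coupled pairs.
    return sum(J[i, j] * s[i] * s[j] for (i, j) in J)

# Classical simulated annealing: accept uphill moves with probability
# exp(-dE/T), lowering the temperature T as the search proceeds.
s = [rng.choice([-1, 1]) for _ in range(n)]
T = 2.0
for step in range(5000):
    i = rng.randrange(n)
    flipped = s[:]
    flipped[i] = -flipped[i]
    dE = energy(flipped) - energy(s)
    if dE <= 0 or rng.random() < math.exp(-dE / T):
        s = flipped
    T = max(0.01, T * 0.999)

# With only 8 spins, brute force over all 2^8 states gives the true minimum.
best = min(energy([1 if (m >> k) & 1 else -1 for k in range(n)])
           for m in range(2 ** n))
print(energy(s), best)
```

The USC test worked at this same 8-qubit scale, but looked at the statistical signature of how solutions are reached, which differs between quantum annealing and the classical process above.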
Training Intelligent Systems to Think on Their Own
Lehigh University (06/28/13) Kurt Pfitzer
Lehigh University professor Hector Munoz-Avila has won a three-year U.S. National Science Foundation grant to develop autonomous agents that dynamically identify and select their own goals. Computing devices and software programs already use algorithms to make intelligent decisions, notes Munoz-Avila, who focuses on the incipient field of goal-driven autonomy (GDA) technology. In the near future, Munoz-Avila believes that algorithm-powered agents will analyze complex problems, establish the most effective intermediate goals, and take steps toward long-term solutions. During this process, agents will adjust to unexpected situations and learn from their mistakes without human intervention. "For a long time, scientists have told agents which goals to achieve," Munoz-Avila says. "What we want to do now is to develop agents that autonomously select their own goals and accomplish them." GDA agents could have useful applications in areas such as military planning, robotics, security, and electrical grid control systems. Munoz-Avila's project will focus on enabling GDA agents to acquire more knowledge of their domains and to generalize their success in other applications. He says the importance of GDA agents having the ability to diagnose environmental discrepancies and act intelligently is growing as autonomous computing devices and software proliferate in today's society.
Breakthrough in Internet Bandwidth: New Fiber Optic Technology Could Ease Internet Congestion, Video Streaming
Science Daily (06/27/13)
Boston University professor Siddharth Ramachandran recently discovered a method of improving the stability of optical vortices, which could lead to significantly increased Internet bandwidth. Ramachandran designed an optical fiber that can generate optical vortices, which are laser light beams in which light spins instead of traveling in a straight line along the beam path, allowing greater data flow. Prior to Ramachandran's discovery, optical vortices, also known as orbital angular momentum (OAM) beams, were thought to be unstable in fiber. Ramachandran and his University of Southern California colleague Alan Willner released a paper proving the stability of the beams in optical fiber, and describing this development's potential to increase Internet bandwidth. "Our discovery, of design classes in which they are stable, has profound implications for a variety of scientific and technological fields that have exploited the unique properties of OAM-carrying light, including the use of such beams for enhancing data capacity in fibers," Ramachandran says. The researchers demonstrated that they could transmit data through a one-kilometer fiber in 10 different colors for each OAM mode, which leads to a transmission capacity of 1.6 terabits per second.
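The 1.6 Tb/s figure decomposes as modes x wavelengths x per-channel rate. The article supplies only the 10 wavelengths ("colors") per OAM mode; the mode count and per-channel rate below are illustrative assumptions chosen to reproduce the reported total:

```python
# Back-of-envelope check of the reported 1.6 Tb/s capacity.
modes = 4                  # assumed number of OAM modes (not in article)
wavelengths_per_mode = 10  # from the article
gbps_per_channel = 40      # assumed per-channel data rate (not in article)

total_gbps = modes * wavelengths_per_mode * gbps_per_channel
print(total_gbps / 1000, "Tb/s")  # -> 1.6 Tb/s
```

The multiplicative structure is the point: OAM modes add an independent axis of channels on top of wavelength-division multiplexing, so capacity scales with the product of the two.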
DARPA Challenge: Build Virtual Robots
InformationWeek (06/27/13) Patience Wait
The U.S. Defense Advanced Research Projects Agency (DARPA) has selected nine teams to move to the second round of the DARPA Robotics Challenge (DRC). In an attempt to explore how virtual robots might respond to disasters and emergencies, DARPA has charged participants with coding software that can direct a virtual robot through a series of qualifying tasks in a virtual environment that presents obstacles in a simulated suburban setting. "The robot must find a way to get into a utility vehicle and drive it, then how to locomote over muddy and uneven terrain, then how to pick up a fire hose and attach it, then turn it on," says DRC program manager Gill Pratt. He notes that teams had access to the DRC Simulator, which was developed by the Open Source Robotics Foundation. "This simulator runs in real time and does sophisticated tasks," Pratt says. "This is the first time we're able to run a robot at tasks in a simulated environment." The next round of trials is set for December, and the top teams will receive funding to prepare for the DRC finals in December 2014. The robots will face an end-to-end disaster scenario, and the prize will be $2 million.
Vulnerability Severity Scores Make for Poor Patching Priority, Researchers Find
Dark Reading (06/25/13) Robert Lemos
A vulnerability's ranking on the Common Vulnerability Scoring System (CVSS) does not offer a useful guide for patching priority, according to two Italian researchers who plan to discuss their findings at Black Hat Security Briefings later this year. University of Trento professor Fabio Massacci and Ph.D. student Luca Allodi compared the CVSS scores of vulnerabilities drawn from the National Vulnerability Database with data from the Exploit Database and from Symantec about which vulnerabilities are being targeted in the wild. The researchers found that a CVSS score did not correlate well with the odds that a vulnerability was being exploited. Vulnerabilities for which exploits are being sold in the underground were more likely to be attacked in the wild than those that simply exist as proofs of concept in the Exploit Database. They also found that the more complex an exploit, the less likely the vulnerability would be exploited. "If your vulnerability is in an exploit kit, then patch it," Allodi says. "And if it is easy to exploit, then patch." Allodi says more complex vulnerabilities should be addressed based on the possible damage they might cause.
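The prioritization rule the researchers suggest can be written down directly: patch first what is traded in exploit kits, then what is easy to exploit, and only then rank by potential damage. A sketch (the field names and weighting are hypothetical, not the researchers' model):

```python
def patch_priority(vuln):
    """Lower value = patch sooner. Exploit-kit presence dominates,
    then exploit complexity, then fall back on CVSS-style severity."""
    if vuln.get("in_exploit_kit"):
        return 0                                  # actively weaponized
    if vuln.get("exploit_complexity") == "low":
        return 1                                  # easy to exploit
    return 2 + (10 - vuln.get("cvss", 0)) / 10    # then by severity

vulns = [
    {"id": "A", "cvss": 9.8, "in_exploit_kit": False, "exploit_complexity": "high"},
    {"id": "B", "cvss": 6.5, "in_exploit_kit": True,  "exploit_complexity": "low"},
    {"id": "C", "cvss": 7.1, "in_exploit_kit": False, "exploit_complexity": "low"},
]
order = [v["id"] for v in sorted(vulns, key=patch_priority)]
print(order)  # -> ['B', 'C', 'A']
```

Note how the highest-CVSS vulnerability ("A") lands last: under this rule, observed attacker behavior outranks theoretical severity.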
Microscopy Technique Could Help Computer Industry Develop 3-D Components
NIST News (06/25/13) Chad Boutin
Through-focus Scanning Optical Microscopy (TSOM) could become a crucial tool for improving computer chips. For decades, computer chips have resembled city maps in that their components are essentially flat, but designers are now looking to follow city planners and build upwards, with a new generation of chips that feature three-dimensional (3D) structures that stack components atop one another. The move would enable designers to pack more components onto chips, but it would require a whole new dimension of measurement capability to ensure they are made to the right shapes and sizes. "Now, we will need to measure all sides of a three-dimensional structure that has more nooks and crannies than many modern buildings," says the U.S. National Institute of Standards and Technology's (NIST) Ravikiran Attota. Developed at NIST several years ago, TSOM uses a conventional optical microscope and collects two-dimensional images at different focal positions, forming a 3D data space. Software then extracts brightness profiles from the multiple out-of-focus images and uses the differences between them to construct the TSOM image. "Our simulation studies show that TSOM might measure features as small as 10 nm or smaller, which would be enough for the semiconductor industry for another decade," Attota says.
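The pipeline described above -- stack images taken at several focus positions, then compare brightness profiles against a reference -- can be sketched with simulated 1-D profiles (the toy blur model and the 5% height change are invented for illustration; real TSOM uses measured optical images):

```python
import math

def profile(focus, height):
    # Toy model: a bright feature at x = 8 whose blur width grows as
    # the focus position moves away from best focus (focus = 0), and
    # whose brightness scales with the (hypothetical) feature height.
    width = 1.0 + abs(focus)
    return [height * math.exp(-((x - 8) / width) ** 2) for x in range(17)]

focus_positions = [-2, -1, 0, 1, 2]
reference = [profile(f, height=1.00) for f in focus_positions]  # known feature
measured = [profile(f, height=1.05) for f in focus_positions]   # 5% taller

# TSOM-style difference image: rows = focus position, columns = x.
# The through-focus difference pattern encodes the shape change.
tsom = [[m - r for m, r in zip(m_row, r_row)]
        for m_row, r_row in zip(measured, reference)]
peak = max(abs(v) for row in tsom for v in row)
print(f"peak difference signal: {peak:.3f}")  # largest at the feature center
```

A sub-resolution change in the feature thus shows up as a characteristic pattern across the through-focus stack, which is how TSOM extracts nanometer-scale dimensions from an ordinary optical microscope.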
Abstract News © Copyright 2013 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.