Association for Computing Machinery
Welcome to the April 3, 2013 edition of ACM TechNews, providing timely information for IT professionals three times a week.

ACM TechNews mobile apps are available for Android phones and tablets, and for iPhones and iPads.


Obama to Unveil Initiative to Map the Human Brain
New York Times (04/02/13) John Markoff; James Gorman

President Barack Obama is launching the Brain Research Through Advancing Innovative Neurotechnologies (BRAIN) research initiative, which aims to record and map brain circuits to advance neuroscience and improve treatments for Alzheimer’s disease, epilepsy, and traumatic brain injuries. The Obama administration has designated BRAIN as a grand challenge of the 21st century and committed $100 million in initial funding to the project in 2014. The U.S. National Institutes of Health (NIH), the Defense Advanced Research Projects Agency, and the National Science Foundation will work on the project, and an NIH working group led by Rockefeller University's Cori Bargmann and Stanford University's William Newsome will develop a plan, time frame, specific goals, and cost estimates. New technology will be created to simultaneously record as many as hundreds of thousands of neurons, and novel theoretical approaches, mathematics, and computer science will be necessary to handle the resulting data volume, Newsome says. In addition, Obama will require a study of the ethical impact of the anticipated neuroscience advances in question. Proponents believe the initiative's impact on science and technology could be equal to that of the Human Genome Project and the launch of the Sputnik satellite in the 1950s.

Stanford to Collaborate With edX to Develop a Free, Open Source Online Learning Platform
Stanford Report (CA) (04/03/13)

Beginning in June, Stanford University will collaborate with edX, the nonprofit online learning enterprise founded by Harvard University and the Massachusetts Institute of Technology, to advance the development of edX's open source learning platform, which provides free and open online learning tools. As part of the collaboration, Stanford will integrate features of its existing Class2Go open source online learning platform into the edX platform. "This collaboration brings together two leaders in online education in a common effort to ensure that the world's universities have the strongest possible not-for-profit, open source platform available to them," says Stanford's John Mitchell. He notes that Stanford will continue its partnerships with Coursera, Venture Lab, and other providers. However, "we will focus our development efforts on a single, open source platform which makes the most efficient use of our time and resources," Mitchell says. EdX also will support the community of developers contributing to the enhancement of the platform by providing an environment for developer collaboration. "We believe the edX platform--the Linux of learning--will benefit from all the world's institutions and communities," says edX president Anant Agarwal.

Visa Demand Jumps
Wall Street Journal (04/02/13) Miriam Jordan; Sara Murray; Amir Efrati

U.S. employers are expected to quickly reach the cap on the annual allocation of skilled foreign worker visas, known as H-1Bs, amid signs of an economic rebound. Demand for the program has increased over the past several years as companies ramped up hiring, and this year the visa limit is expected to be reached within a week of applications becoming available. Attorney Mark Koestler cites a misconception that only technology giants use the visas, as there is ample evidence that they also are used widely by many small companies. For years, high-tech companies have urged Congress to raise the limit on visas for skilled foreigners, but critics claim the visa program displaces eligible U.S. workers and that companies hire foreigners because they accept less pay and fewer benefits. Economic Policy Institute analyst Daniel Costa says the speed at which the visa quota is being depleted does not necessarily reflect actual demand for workers, as demand will always exist for employees "who can be legally underpaid compared to similarly situated American workers." However, firms using H-1Bs contend the United States is not producing enough science, technology, engineering, and math professionals to encourage growth and innovation.

New Hybrid Memory Cube Spec to Boost DRAM Bandwidth by 15X
Computerworld (04/02/13) Lucas Mearian

The Hybrid Memory Cube Consortium recently announced the final specifications for 3D dynamic random-access memory (DRAM), which is designed to boost performance for networking and high-performance computing applications. Hybrid Memory Cube technology stacks multiple volatile memory dies on top of a DRAM controller. The DRAM is connected to the controller via Vertical Interconnect Access technology, a method of passing an electrical wire vertically through a silicon wafer. "We took the logic portion of the DRAM functionality out of it and dropped that into the logic chip that sits at the base of that 3D stack," says Micron's Mike Black. "That logic process allows us to take advantage of higher performance, not only interacting up through the DRAM on top of it, but also across a channel to a host processor in a high-performance, efficient manner." The logic layer serves as both the host interface connection and the memory controller for the DRAM sitting on top of it, Black says. Moreover, Hybrid Memory Cube technology reduces the tasks that a DRAM must perform so that it only drives the through-silicon vias, which are connected to much lower loads over shorter distances, notes analyst Jim Handy.

End of the Line for Roadrunner Supercomputer
Associated Press (04/01/13) Susan Montoya Bryan

The Roadrunner supercomputer, housed at Los Alamos National Laboratory (LANL), will be decommissioned this Sunday because it has been replaced by smaller, faster, more efficient, and less-expensive systems. In 2008, Roadrunner became the first system to break the petaflop barrier by processing just over a quadrillion mathematical calculations per second. The supercomputer has been used over the last five years to model viruses and unseen parts of the universe, to better understand lasers, and for nuclear weapons work. Los Alamos is currently using a supercomputer called Cielo, which is slightly faster than Roadrunner, takes up less space, and costs about $54 million, compared to Roadrunner's $121 million cost. In the next 10 to 20 years, it's expected that the world's supercomputers will be capable of breaking the exascale barrier, or one quintillion calculations per second, notes LANL's Kevin Roark. Roadrunner is still among the 25 fastest supercomputers in the world. "Roadrunner got everyone thinking in new ways about how to build and use a supercomputer," says LANL's Gary Grider. "Specialized processors are being included in new ways on new systems and being used in novel ways. Our demonstration with Roadrunner caused everyone to pay attention."

It’s in the Algorithm: Extremely Tight Races in Major League Baseball Chase This Year
Network World (03/29/13) Michael Cooney

New Jersey Institute of Technology (NJIT) researchers have used a mathematical analysis to compute the number of regular season games each Major League Baseball team should win in 2013. The model predicts the probability of a team with given hitters, bench, starting pitchers, lineup, relievers, and home field advantage winning a game against another team. "The numbers indicate that only one game might separate the first and second place teams in both the National League's East and West divisions, with the Atlanta Braves (94 wins) edging out the Washington Nationals (93 wins) in the East and the Los Angeles Dodgers (88 wins) coming in just ahead of the San Francisco Giants (87 wins) in the West," says NJIT professor Bruce Bukiet. The American League also should have some tight playoff races. "While the Detroit Tigers should have the best record in baseball (102 wins) and run away with the Central division, with the next best team (the Chicago White Sox) more than 20 wins behind, the other two divisions could end up in ties," Bukiet says. Last season, Bukiet's model picked six of the 10 post-season teams.
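The article does not disclose the details of Bukiet's model, but the general idea of converting per-game win probabilities into season win totals can be sketched with the well-known log5 formula (Bill James), which estimates the chance that a team with winning percentage p_a beats a team with winning percentage p_b. This is an illustrative stand-in, not Bukiet's actual Markov-style lineup model:

```python
def log5_win_prob(p_a, p_b):
    """Log5 estimate of the probability that team A beats team B,
    given each team's overall winning percentage."""
    return (p_a * (1 - p_b)) / (p_a * (1 - p_b) + p_b * (1 - p_a))

def expected_wins(p_win_per_game, games=162):
    """Expected season win total over a 162-game MLB schedule."""
    return p_win_per_game * games

# A .600 team facing a .500 opponent is still a .600 favorite:
# log5_win_prob(0.6, 0.5) == 0.6
```

Summing such per-matchup probabilities over a full schedule yields projections like the 94-win Braves versus 93-win Nationals figures quoted above.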

Paving the Road for Women in Computer Science at BYU
The Universe (04/01/13) Gabriel Meyr

Brigham Young University (BYU) is trying to make its computer science program more accessible to women. Junior Christine Kendall is working with clubs and faculty to encourage women to study technical fields. “I feel like because of culture or society, men, in general, have more confidence in technical areas," Kendall says. "I think it’s really easy to feel like an outsider and that you don’t belong, especially when you hear all your classmates talking about their personal projects and what they’ve done on the side.” Kendall and computer science professor Jay McCarthy met with faculty about allowing introductory students to sort themselves by experience level, because a gender gap in experience exists even at the basic level, Kendall says. Although introductory curriculum will remain the same, different sections are likely to be offered for students with varying levels of experience. Kendall borrowed this approach from Harvey Mudd College, which increased its ratio of women in computer science from 14 to 46 percent over three years by offering tiered introductory courses. Harvey Mudd also sent women to the annual Grace Hopper Celebration of Women in Computing, and Kendall and McCarthy are seeking corporate sponsorships to send BYU women to the conference.

Digital Shrinks Find Depressed Faces and Body Language
New Scientist (03/28/13) Niall Firth

Automatic systems that analyze gestures and facial expressions might improve the challenging task of diagnosing depression. One such system, SimSensei, is a digital avatar that interviews people to determine their state of mind using facial-recognition technology and depth-sensing cameras integrated with Microsoft Kinect to capture and analyze body language. University of Southern California researchers identified characteristic movements that signal possible depression by interviewing non-depressed volunteers and those who had been diagnosed with depression or post-traumatic stress disorder. Focusing a high-definition webcam on a subject's face and tracking body movements with Kinect, the team noted that depressed people are more likely to fidget and drop their gaze, and smile less than average. Another automatic depression diagnosis system is underway at the University of Canberra, which is working with the Black Dog Institute to develop a machine-vision system that looks for distinctive facial expressions, slower-than-usual blinking, and specific upper-body movements. Meanwhile, University of Pittsburgh researchers are studying changes in facial expression as a person receives depression treatment. These new systems will be tested in October, when global researchers gather at the ACM Multimedia conference in Barcelona, Spain, to participate in a contest to discover the most accurate depression diagnosis system.

Swarming Robots Could Be the Servants of the Future
University of Sheffield (03/28/13) Paul Mannion

The Sheffield Centre for Robotics is involved in a robot swarms project that could benefit the medical field, industry, and the military. Researchers are working to program a group of 40 robots. In a demonstration, the swarm carried out simple fetching and carrying tasks by gathering around an object and working together to push it across a surface. The robots also can form a single cluster after being scattered across a room, and organize themselves by order of priority. University of Sheffield researchers developed relatively simple programming to control the robots. For example, when asked to group together, each robot only needs to work out if there is another robot in front of it. The key is to limit the amount of information needed to perform tasks, says Roderich Gross from Sheffield's Natural Robotics Lab. "That's important because it means the robot may not need any memory, and possibly not even a processing unit, so this technology could work for nanoscale robots, for example in medical applications," Gross notes.
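The minimal-information idea Gross describes can be illustrated with a toy one-dimensional aggregation rule: each robot senses only the relative position of its nearest neighbor and moves a bounded step toward it, with no global map or shared memory. This is a simplified sketch of the aggregation principle, not the Sheffield team's actual controller:

```python
def aggregate(positions, step=1.0):
    """One synchronous update of a toy swarm on a line.

    Each robot moves toward its nearest neighbor by at most `step`.
    The only information used is local relative position -- no global
    coordinates, no memory of past states.
    """
    new = []
    for i, p in enumerate(positions):
        others = [q for j, q in enumerate(positions) if j != i]
        target = min(others, key=lambda q: abs(q - p))
        delta = target - p
        move = max(-step, min(step, delta))  # clamp to the step size
        new.append(p + move)
    return new

# Two robots 10 units apart close in symmetrically:
# aggregate([0.0, 10.0]) -> [1.0, 9.0]
```

Iterating this rule shrinks the swarm's spread until the robots cluster, mirroring the demonstration in which scattered robots form a single group.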

University of Arkansas Researchers Study How to Link Visual Identification Technology With RFID
RFID Journal (03/28/13) Claire Swedberg

The University of Arkansas' RFID Research Center and the Center for Advanced Spatial Technologies (CAST) are collaborating to research the retail applications of emerging visual identification technologies (VIT) and possible uses of VIT systems to complement radio frequency identification (RFID) technology. VIT uses cost-effective 2D and 3D optical-imaging technologies to recognize objects by color, shape, and size without the need for bar codes or product numbers. This could enable VIT-based systems in stores to quickly identify products on shelves, add those goods to inventory lists, ensure that products are correctly placed, and remove products from inventory after checkout. The team will study how RFID read data obtained from ultrahigh-frequency readers could be used with 3D VIT-based data pertaining to the objects' positions within a store to improve item-level inventory. The researchers also hope to discover ways to integrate an RFID system with VIT technology to improve inventory control on store shelves and at sales terminals. CAST has created software that analyzes data collected by optical-imaging hardware, which could be used for geospatial location and mapping, as well as for object shape analysis, to help users with cameras identify items and their location in relation to other objects.

Scientists Use AI Cobweb Analysis to Determine Spider Species
(03/28/13) Nancy Owano

Pattern recognition can be used to determine which species of spider has spun a cobweb. The University of Las Palmas de Gran Canaria's Carlos Travieso and colleagues used an artificial intelligence (AI) recognition system with special software to analyze images provided by a spider expert. The researchers applied pattern-recognition techniques to the pictures until they were confident they could reliably identify the type of web, achieving an accuracy rate of 99.6 percent. Their method used principal component analysis (PCA), independent component analysis, the Discrete Cosine Transform, the Wavelet Transform, and discriminative common vectors as feature extractors, with PCA providing the best performance. Beyond identifying spider species, the approach could help researchers understand and measure biodiversity. The team plans to apply the AI recognition system to other species of animals and insects.
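The PCA-based pipeline the team found most effective can be sketched as follows: flatten each web image into a vector, project the training set onto its top principal components, and classify a new image by its nearest neighbor in that reduced space. This is a generic illustration of PCA feature extraction, not the researchers' exact system:

```python
import numpy as np

def pca_features(X, k):
    """Project rows of X (flattened images) onto the top-k principal
    components. Returns the projected features, the components, and
    the training mean needed to project new samples."""
    mean = X.mean(axis=0)
    Xc = X - mean
    # SVD of the centered data: rows of Vt are the principal directions
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]
    return Xc @ components.T, components, mean

def classify(x, train_feats, labels, components, mean):
    """Nearest-neighbor classification in the PCA feature space."""
    z = (x - mean) @ components.T
    i = np.argmin(np.linalg.norm(train_feats - z, axis=1))
    return labels[i]
```

In practice the feature vectors would come from preprocessed web photographs, and the nearest-neighbor step could be replaced by any classifier; the dimensionality reduction is what makes the comparison tractable.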

Robot Ants Successfully Mimic Real Colony Behavior
EurekAlert (03/28/13)

Robotic ants have replicated the movement of individual ants from their nests to different food sources. Researchers at the New Jersey Institute of Technology and the Research Center on Animal Cognition used a swarm of sugar cube-sized robots that could leave light trails for detection, with two light sensors mimicking the role of the ants' antennae. The research focused primarily on how Argentine ants, when part of a colony, behave and coordinate themselves in both symmetrical and asymmetrical labyrinthine pathways. In nature, ants leave chemical pheromone trails. The researchers' model revealed that the robots did not need to be programmed to identify and compute the geometry of the network bifurcations. "This research suggests that efficient navigation and foraging can be achieved with minimal cognitive abilities in ants," says lead author Simon Garnier. "It also shows that the geometry of transport networks plays a critical role in the flow of information and material in ant as well as in human societies."

Penn Engineers Enable ‘Bulk’ Silicon to Emit Visible Light for the First Time
Penn News (03/26/13) Evan Lerner

University of Pennsylvania researchers have enabled "bulk" silicon to emit broad-spectrum visible light, a breakthrough they say could lead to devices that have both electronic and photonic components. Silicon is normally a poor emitter of light because it turns added energy into heat, which makes integrating electronic and photonic circuits a challenge. Semiconductors usually must cool down after excitation before heating up again and releasing the remaining energy as light. However, the nanowires developed by the Pennsylvania researchers, combined with plasmonic nanocavities, can eliminate the cool-down period, opening the possibility of producing light from semiconductors. "If we can make the carriers recombine immediately, then we can produce light in silicon," says Pennsylvania professor Ritesh Agarwal. He says the research could translate into a broad bandwidth for possible operation in a photonic or optoelectronic device. “If you can make the silicon emit light itself, you don’t have to have an external light source on the chip,” Agarwal notes. “We could excite the silicon electrically and get the same effect, and we can make it work with wires from 20 to 100 nanometers in diameter, so it’s very compatible in terms of length scale with current electronics.”

Abstract News © Copyright 2013 INFORMATION, INC.
Powered by Information, Inc.

To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe