Association for Computing Machinery
Welcome to the October 30, 2013 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets, and for iPhones and iPads.


First-of-Its-Kind Online Master's Draws Wave of Applicants
Wall Street Journal (10/30/13) Douglas Belkin

Since implementing a new low-cost online master's program in computer science based on massive open online course (MOOC) technology, Georgia Tech has received almost twice as many applications for the program in the past three weeks as its residential program receives in a year. Unlike most MOOCs, the course is not offered for free: applicants pay about $6,600 to participate, compared with approximately $44,000 for residential students. The number of U.S.-resident applicants for the Georgia Tech program also was 14 times higher than for the residential class. College of Computing dean Zvi Galil estimates that 79 percent of the online master's applicants were U.S. citizens, versus 9 percent of the residential applicants. The trend indicates strong demand among adult students for an education they can pursue while staying at home, holding a job, or raising a family, says Udacity CEO Sebastian Thrun. Udacity created the program in partnership with Georgia Tech and AT&T, and also announced the Open Education Alliance, which lets students earn a free certificate based on courses developed with the company's partners. "I think this is symptomatic of a lot of what we're going to be seeing in the future," says the Cornell Higher Education Research Institute's Ronald Ehrenberg.

Software Beats CAPTCHA, the Web's 'Are You Human?' Test
New Scientist (10/28/13) MacGregor Campbell

California-based startup Vicarious says it has developed software that can successfully crack any text-based Completely Automated Public Turing test to tell Computers and Humans Apart (CAPTCHA) program, defeating Google's reCAPTCHA program 90 percent of the time. The software employs virtual neurons arranged in a network modeled after the human brain. The network begins with nodes that detect real-world input, such as whether a specific pixel in an image is black or white, and the next nodal layer fires only if the nodes identify a specific pixel configuration. A third layer fires only if its nodes recognize pixel arrangements cohering into whole or partial shapes, and the process repeats on between three and eight nodal levels, with signals passing between up to 8 million nodes. The network eventually makes a best guess on the letters contained in the image. The strength of each neural link is determined by training the network with solved CAPTCHAs and videos of letters in motion, enabling the system to create its own representation of particular letters. Vicarious CEO Scott Phoenix hopes the company's innovation will lead to more human-like artificial intelligence, and he says the company plans to apply the tool toward beating more Turing tests.
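Vicarious has not published its network, but the layer-by-layer behavior described above can be sketched with a toy example (the image, patterns, and "shape" below are invented for illustration): each layer fires only when the specific pattern it watches for appears in the layer beneath it.

```python
# Toy three-layer detector hierarchy -- not Vicarious's actual network,
# just an illustration of layers firing on patterns in the layer below.

# Layer 1: pixel nodes "fire" on the black pixels of a 3x3 binary image.
image = [
    [1, 0, 1],
    [0, 1, 0],
    [1, 0, 1],
]

# Layer 2: each node fires only on a specific pixel configuration --
# here, a complete diagonal stroke.
main_diag = all(image[i][i] == 1 for i in range(3))
anti_diag = all(image[i][2 - i] == 1 for i in range(3))

# Layer 3: a shape node fires only when the strokes it depends on
# cohere into a whole letter -- two crossing diagonals form an "X".
letter_x = main_diag and anti_diag
print("X" if letter_x else "?")  # prints "X"
```

A real system would learn these intermediate patterns from solved CAPTCHAs rather than hard-code them, and would hedge among candidate letters instead of returning a single boolean.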

'Li-Fi' Via LED Light Bulb Data Speed Breakthrough
BBC News (10/28/13)

Several U.K. institutions are collaborating on the ultra-parallel visible light communications (Li-Fi) project. The researchers have used a micro light-emitting diode (LED) bulb to transmit data at 3.5 Gbit/s via each of the three primary colors--red, green, and blue--that make up white light, which means data transfer speeds of more than 10 Gbit/s are possible. The tiny micro-LEDs allow many streams of light to be beamed in parallel, each stream multiplying the amount of data that can be transmitted at any one time. The researchers used Orthogonal Frequency-Division Multiplexing (OFDM) to enable micro-LED bulbs to handle millions of changes in light intensity per second, effectively behaving like an extremely fast switch. The technique enables large chunks of binary data to be transmitted at high speed. Li-Fi will be less expensive and more energy-efficient than existing wireless radio systems because of the ubiquity of LED bulbs and the fact that lighting infrastructure is already in place. Another advantage is that evenly spaced LED transmitters could provide much more localized and consistent Internet connectivity throughout buildings, says the University of Edinburgh's Harald Haas.
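The aggregate figure follows directly from the per-color rate; a quick check of the arithmetic:

```python
# One micro-LED data stream per primary color, each running at the
# demonstrated per-channel rate.
per_color_gbps = 3.5
colors = 3  # red, green, blue
aggregate_gbps = per_color_gbps * colors
print(aggregate_gbps)  # 10.5 -- hence "more than 10 Gbit/s"
```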

'Mundane' Classes Put Thousands Off Computer Science
Engineering and Technology Magazine (10/28/13) Edd Gent

Reading University is launching what is believed to be the United Kingdom's first free online university programming course, as the school's vice-chancellor Sir David Bell warns that failing to address technology training will harm the UK's tech companies. Fewer undergraduates today have computer skills and an interest in programming than those who grew up with the first generation of home computers in the 1980s and early 1990s, Bell says. "The UK will suffer for its lost generation of computer programmers," he says. "We risk the UK's high-tech firms struggling to remain competitive in a highly globalized industry." Bell, who supports reforms that would replace the Information and Communications Technology (ICT) curriculum with computer science, claims, "Ministers and the wider school system got it completely wrong in teaching mundane office skills in ICT, instead of learning how to build software, and we've now got to play catch-up." Reading's "Begin Programming" massive open online course filled 10,000 places in less than 24 hours when registration opened last month. The seven-week program teaches the basics of Java and asks students to build a game from scratch. The course aims to give programming exposure to teenagers interested in pursuing computer programming degrees, as well as to adults.

Making Robots More Like Us
New York Times (10/28/13) John Markoff

Robots are moving beyond the industrial phase, in which they were kept separate from humans, and are beginning to operate freely and independently in the world alongside people. Universities and industrial laboratories increasingly are designing robots to work with, rather than to replace, humans. France, for example, awarded $13.8 million to French robot maker Aldebaran to create the five-foot humanoid robot Romeo to care for older people and provide home assistance. The human form is important to collaborative robots for both social and technical reasons: people prefer machines that resemble themselves, and the environment is filled with devices, such as handles, that are designed for human operation. Advances in computer vision, processing power and storage, low-cost sensors, and algorithms enable robots to plan and move in disordered environments. Carnegie Mellon University professor Manuela M. Veloso has created mobile robots called CoBots that can deliver mail, lead visitors to appointments, and fetch coffee. Veloso says her robots have "symbiotic autonomy" because they depend on humans and are programmed to wait and ask for human assistance--to operate elevators, for example--since they lack arms. Robots also are being designed to move with greater agility and without threatening humans.

National Robotics Initiative Grant Will Provide Surgical Robots With a New Level of Machine Intelligence
Research News @ Vanderbilt (10/25/13) David Salisbury

Researchers at Vanderbilt University, Carnegie Mellon University, and Johns Hopkins University (JHU) are collaborating on the Complementary Situational Awareness for Human-Robot Partnerships project, which "advances our shared vision of human surgeons, computers, and robots working together to make surgery safer, less invasive, and more effective," says JHU professor Russell Taylor. One project goal is to restore the type of awareness surgeons have during open surgery, which they have lost with the advent of minimally invasive surgery because they must work through small incisions in a patient's skin. The researchers want to create a system that acquires data from several different types of sensors as an operation is underway, and integrates them with pre-operative information to produce dynamic, real-time maps that track the position of the robot probe and show how the tissue in its vicinity responds to its movements. The researchers developed methods that enable snake-like surgical robots to explore the shapes and variations in stiffness of internal organs and tissues. Their method relies on simultaneous localization and mapping (SLAM), the technique that enables mobile robots to navigate in unexplored areas. "We will design the robot to be aware of what it is touching and then use this information to assist the surgeon in carrying out surgical tasks safely," says Vanderbilt professor Nabil Simaan.

Researchers Integrate Social Science in Cybersecurity Project
The Tartan (10/28/13)

Carnegie Mellon University (CMU) researchers are helping to develop methods for computers to make security decisions in cyberspace by investigating psychological and human-factor issues. The researchers have developed techniques that enable computers to distinguish between real and false cyberattacks, a capability that could lead to computer systems that respond with human-like decision making, without direct human intervention. The research is part of the five-year, $23.2 million Models for Enabling Continuous Reconfigurability of Secure Missions project, which uses human behavior models to enable computers to predict the motivations of users, defenders, and attackers; the models also are used to detect attacks, measure and manage risk, and alter the environment to optimize results. Humans are integral to maintaining cybersecurity, notes CMU professor Lorrie Cranor. "Their behavior and cognitive and psychological biases have to be integrated as much as any other component of the system that one is trying to secure," she says. The Army Research Laboratory, Pennsylvania State University, Indiana University, the University of California, Davis, and the University of California, Riverside also are participating in the project.

Can Quantum Cryptography Work in the Real World?
Government Computer News (10/28/13) William Jackson

Battelle Memorial Institute researchers say they have developed the first production system for quantum distribution of cryptographic keys, and they plan to create a 400-mile link enabling quantum-key distribution (QKD) between Columbus, Ohio, and Washington, D.C., by 2015. Although practical QKD systems have been around for about 10 years, limitations in their range and scalability have confined their use in the United States to research, says Battelle's Don Hayford, even as European banks and government agencies have been using QKD for several years. The U.S. National Institute of Standards and Technology, which has been doing quantum cryptography research for years, does not yet consider QKD a viable option for production systems. "We use AES 256-bit encryption as part of our overall QKD solution," Hayford says. The system also uses an existing gigabit metro-area fiber-optic ring, to which Battelle added only about a mile of new fiber. Battelle is collaborating with ID Quantique to develop repeaters, known as trusted nodes, that broaden the range and enable multiple links, which are expected to facilitate extension of the QKD network to Battelle's D.C. offices.
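The article does not describe Battelle's protocol in detail. As a rough illustration of how QKD and AES-256 can combine, the sketch below simulates the basis-sifting step of a generic BB84-style exchange in plain Python (the classical simulation, the fixed seed, and the SHA-256 key-derivation step are all illustrative assumptions, not Battelle's design), condensing the sifted bits into a 256-bit key of the kind AES-256 would consume.

```python
import random
import hashlib

random.seed(7)  # deterministic for illustration only
n = 4096

# Alice picks random bits and random measurement bases (0 = +, 1 = x).
alice_bits = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.randint(0, 1) for _ in range(n)]

# Bob measures each photon in a random basis; when his basis matches
# Alice's he recovers her bit, otherwise his result is random noise.
bob_bases = [random.randint(0, 1) for _ in range(n)]
bob_bits = [
    a_bit if a_basis == b_basis else random.randint(0, 1)
    for a_bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases)
]

# Sifting: both sides publicly compare bases and keep only positions
# where the bases agreed (about half of them on average).
sifted = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)
          if ab == bb]

# Condense the shared sifted bits into a 256-bit symmetric key.
key = hashlib.sha256(bytes(sifted)).digest()
print(len(key) * 8)  # 256-bit key, ready for AES-256
```

A real deployment would also sacrifice a subset of sifted bits to estimate the error rate (and so detect eavesdropping) and run error correction and privacy amplification before deriving the final key.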

Researchers Turn to Technology to Discover a Novel Way of Mapping Landscapes
UC Magazine (10/28/13) Dawn Fuller

University of Cincinnati doctoral student Jacek Niesterowicz and professor Tomasz Stepinski have developed and applied machine vision-based algorithms to improve existing methods for mapping landscapes. The researchers say their approach is the first to use machine vision to build a new map of landscape types. They focused on a very large map of land cover/land use, the National Land Cover Database 2006. Using the new techniques, the researchers were able to discover and differentiate 15 distinctive landscape types, including distinguishing forests dominated by different plant species. Niesterowicz says the information gleaned by auto-mapping of landscape types could be useful in a wide variety of fields, ranging from geographic research to land management, urban planning, and conservation. "The good thing about this method is that it doesn't need to be restricted to land cover or other physical variables--it can be applied as well to socio-economic data, such as U.S. Census data, for example," Niesterowicz says. Stepinski notes their approach offers a new way to conduct geographic research. "By leveraging technology developed in the field of computer science, it's possible to make geography searchable by content," he says.

Spintronics Goes Through Blue Period
Live Science (10/27/13) Jesse Emspak

Researchers at Harvard University and the London Center for Nanotechnology have developed a method involving ink dye that could lead to future computing devices and new ways of exploring quantum mechanics. The researchers vaporized a sample of copper phthalocyanine and condensed the vapor onto a substrate in an ultra-thin layer. They then put the dye in a magnetic resonance spectrometer, which creates a magnetic field that aligns the electrons' spins, placing them in an up or down state. Although spin experiments have been conducted before, the new experiment kept the spins aligned for a longer time: at minus 450 degrees Fahrenheit, the spins stayed parallel to the field for 59 milliseconds, and the superposed state lasted 2.6 milliseconds. "It was a longer period than we had any right to expect," says London Center for Nanotechnology director Gabriel Aeppli. However, it is still unclear how long a quantum bit would have to maintain its superposed state in an actual quantum computer, according to Harvard's Marc Warner. "A theorist might say we need a qubit with a lifetime of minutes, but in practice it is never entirely clear what is and isn't possible in a particular system," Warner says.

I Am Woman, Watch Me Hack
New York Times (10/22/13) Catherine Rampell

The ranks of women in computer science are low and dwindling, and many industry observers blame the field's gender disparity on a public image problem. Women earned 29 percent of bachelor's degrees in computer and information sciences in 1990-91, but the figure had dropped to 18 percent 20 years later. Currently, only 0.4 percent of female college freshmen say they plan to major in computer science, and women account for just a quarter of all Americans in computer-related occupations. Many young people do not encounter computer scientists in their daily lives and do not understand the field. By contrast, many observers believe that TV shows such as "CSI" and "Bones," which feature women in forensics as leading characters, have helped turn forensic science into a primarily female field in recent years. Few characters in computer science or engineering occupations appear in today's family films, children's shows, and prime-time programs, and among those that do, the ratio of men to women is 14.25 to 1 in family films and 5.4 to 1 in prime time, according to the Geena Davis Institute on Gender in Media. The Labor Department projects 1.4 million computer-related job openings this decade, and experts say the women of tomorrow could help fill these jobs domestically while earning high salaries in flexible positions.

Lessons From the First Major Computer Virus
Intel Free Press (10/25/13)

The Morris Worm, the first major computer virus, brought down about 10 percent of the computers connected to the Internet in 1988, raising widespread security concerns about networked computers for the first time, says Purdue University professor Eugene Spafford in an interview coinciding with the worm's 25th anniversary. Cornell graduate student Robert Tappan Morris released the worm, which he said was intended to estimate the Internet's size but instead replicated itself out of control, mounting what amounted to a denial-of-service attack. Spafford helped analyze the Morris Worm after two of the eight computers in his lab were affected. "Prior to the Worm, we were kind of part of a closed community where a lot of people assumed good intent and responsible behavior," Spafford says. "The majority were academics who never really thought about the idea that others might have bad intent." The World Wide Web did not yet exist, and only 50,000 to 60,000 machines were connected to the Internet, many of them shared machines devoted to research and small business. Spafford, who also serves as chair of USACM, says viruses were inevitable and that security is "really risk reduction and not elimination," noting that many more avenues of attack are available today than in the past.

Abstract News © Copyright 2013 INFORMATION, INC.
Powered by Information, Inc.

To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe