Association for Computing Machinery
Welcome to the May 22, 2013 edition of ACM TechNews, providing timely information for IT professionals three times a week.

ACM TechNews mobile apps are available for Android phones and tablets, as well as for iPhones and iPads.


MOOC Provider edX More Than Doubles Its University Partners
Chronicle of Higher Education (05/21/13) Jeffrey R. Young

EdX announced that 15 additional universities have agreed to offer free massive open online courses, bringing the total membership to 27 institutions. The new partners include five universities in the United States, six in Asia, three in Europe, and one in Australia. EdX, a nonprofit provider of MOOCs founded by Harvard University and the Massachusetts Institute of Technology, aims to help colleges use technology to rethink campus education and deliver online courses. "What we hope to get out of our partnership with edX is actively learning from and building upon each other’s educational innovations," says Kyoto University professor Toru Iiyoshi. Several professors have recently raised questions about the implications of free online courses, especially as colleges run pilot projects in which they ask students to watch video lectures from edX professors. "It’s a good thing that people are debating and discussing all the issues of this transformational technology," says edX president Anant Agarwal. "The way we look at it is this is increasing choice." Agarwal notes there currently are more than 900,000 people enrolled in edX programs.

Ecologists Warn of Overreliance on Unvetted Computer Source Code by Researchers
Phys.Org (05/17/13) Bob Yirka

Microsoft Research's Lucas Joppa and a group of fellow scientists warn that there is an overreliance on source code that has not been properly vetted, which can lead to flawed research results. The scientists say researchers are relying on existing software to perform their research, despite the fact that no one has peer reviewed the software itself. The main problem is that the software is generally written by the researchers themselves, instead of by trained software engineers. Another problem the scientists note is that there can be disagreement between the equations researchers have worked out and the way those equations are implemented in software. The scientists suggest making source code open source and requiring it to be peer reviewed before journals will accept research articles based on its use. They also suggest that journals could help by publishing more articles educating researchers about the problem and how to handle it.
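One practical form of the vetting the scientists call for is a regression test that pins an implementation to the published equation it claims to encode. The sketch below is illustrative only (the model, function names, and parameters are not from the article): it checks a logistic-growth implementation against two values the closed-form solution dictates.

```python
import math

def logistic_growth(n0, r, k, t):
    """Population at time t under the logistic model
    dN/dt = r*N*(1 - N/K), via the closed-form solution."""
    return k / (1 + ((k - n0) / n0) * math.exp(-r * t))

# Regression tests pinning the code to the equation: at t=0 the
# population must equal n0, and for large t it must approach the
# carrying capacity k.
assert abs(logistic_growth(10, 0.5, 1000, 0) - 10) < 1e-9
assert abs(logistic_growth(10, 0.5, 1000, 100) - 1000) < 1e-6
```

A mismatch between such a test and the published equation is exactly the kind of silent divergence the scientists warn about.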

Is This Virtual Worm the First Sign of the Singularity?
The Atlantic (05/17/13) Alexis C. Madrigal

A new open source project called OpenWorm aims to create a life-like simulation of the roundworm Caenorhabditis elegans, which would be the first of its kind in executable biology. OpenWorm began in 2010 when software engineer Giovanni Idili sent a tweet to the Twitter account for the Whole Brain Catalog, a project aiming to make mouse brain data more usable, setting a New Year's resolution to simulate the whole C. elegans brain of 302 neurons. Stephen Larson, one of the Brain Catalog's founders, replied with an offer to help and the effort spread, coordinated via Google Hangouts by scientists worldwide. The physical laboratory environment for C. elegans, a petri dish with agar, is standardized and easy to model on a computer. For the worm body, the team contacted California Institute of Technology's Christian Grove, who donated a three-dimensional atlas of the worm to serve as a model. Although the team is making progress, the challenges of simulating a brain are immense. For example, the group is using the Hodgkin-Huxley model of how neurons work, but the question remains as to whether the behavioral realism of the organism would improve in a meaningful way with more detailed simulations of the neurons.
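The Hodgkin-Huxley model describes each neuron's membrane voltage with a small system of differential equations. As a rough illustration of what simulating even a single such neuron involves (standard squid-axon parameters and simple forward-Euler integration; this is not OpenWorm's actual code), a minimal sketch:

```python
import math

# Voltage-dependent gating rate functions (standard Hodgkin-Huxley forms).
def alpha_m(v): return 0.1 * (v + 40) / (1 - math.exp(-(v + 40) / 10))
def beta_m(v):  return 4 * math.exp(-(v + 65) / 18)
def alpha_h(v): return 0.07 * math.exp(-(v + 65) / 20)
def beta_h(v):  return 1 / (1 + math.exp(-(v + 35) / 10))
def alpha_n(v): return 0.01 * (v + 55) / (1 - math.exp(-(v + 55) / 10))
def beta_n(v):  return 0.125 * math.exp(-(v + 65) / 80)

def simulate(i_ext=10.0, t_max=50.0, dt=0.01):
    """Forward-Euler integration of one Hodgkin-Huxley neuron.
    Units: mV, ms, uA/cm^2; classic squid-axon parameters."""
    c_m, g_na, g_k, g_l = 1.0, 120.0, 36.0, 0.3
    e_na, e_k, e_l = 50.0, -77.0, -54.387
    v, m, h, n = -65.0, 0.05, 0.6, 0.32   # resting state
    trace = []
    for _ in range(int(t_max / dt)):
        i_na = g_na * m**3 * h * (v - e_na)   # sodium current
        i_k = g_k * n**4 * (v - e_k)          # potassium current
        i_l = g_l * (v - e_l)                 # leak current
        v += dt * (i_ext - i_na - i_k - i_l) / c_m
        m += dt * (alpha_m(v) * (1 - m) - beta_m(v) * m)
        h += dt * (alpha_h(v) * (1 - h) - beta_h(v) * h)
        n += dt * (alpha_n(v) * (1 - n) - beta_n(v) * n)
        trace.append(v)
    return trace

trace = simulate()
# A sustained 10 uA/cm^2 stimulus makes the membrane fire action
# potentials, so the voltage crosses 0 mV from its -65 mV rest.
assert max(trace) > 0
```

Multiplied by 302 interconnected neurons, plus muscles and a body in a simulated dish, the scale of the OpenWorm undertaking becomes clear.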

UAB Research Finds New Channels to Trigger Mobile Malware
UAB News (05/16/13) Meghan Davis

University of Alabama at Birmingham (UAB) researchers have discovered new hard-to-detect methods for triggering mobile device malware that could eventually lead to targeted attacks launched by large groups of infected mobile devices in the same geographical area. These attacks could be triggered by music, lighting, or vibration. The UAB researchers used music to set off malware hidden in mobile devices from 55 feet away in a crowded hallway. "We showed that these sensory channels can be used to send short messages that may eventually be used to trigger a mass-signal attack," says UAB professor Nitesh Saxena. "While traditional networking communication used to send such triggers can be detected relatively easily, there does not seem to be a good way to detect such covert channels currently." The researchers also successfully activated malware, at various distances, using music videos, lighting from a TV, vibrations from a subwoofer, and magnetic fields. "We need to create defenses before these attacks become widespread, so it is better that we find out these techniques first and stay one step ahead," says UAB researcher Shams Zawoad.
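The article does not describe UAB's actual signaling scheme, but the idea of an audio covert channel can be illustrated with a receiver that listens for a specific trigger tone in microphone input. The sketch below uses the Goertzel algorithm to measure power at one frequency; the 1,700 Hz trigger and the detection threshold are purely illustrative assumptions:

```python
import math

def goertzel_power(samples, sample_rate, target_hz):
    """Power at a single frequency bin, computed with the Goertzel
    algorithm (cheaper than a full FFT when only one bin is needed)."""
    n = len(samples)
    k = round(n * target_hz / sample_rate)
    coeff = 2 * math.cos(2 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev**2 + s_prev2**2 - coeff * s_prev * s_prev2

rate = 8000
t = [i / rate for i in range(4000)]
tone = [math.sin(2 * math.pi * 1700 * x) for x in t]  # trigger present
silence = [0.0] * 4000                                # trigger absent

# The hidden trigger frequency stands out sharply against silence.
assert goertzel_power(tone, rate, 1700) > goertzel_power(silence, rate, 1700)
```

Because the trigger rides on ordinary ambient sound rather than a network link, it would bypass the traffic monitoring that catches conventional command-and-control messages, which is the detection gap Saxena describes.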

Robots Learn to Take a Proper Handoff by Following Digitized Human Examples
Disney Research (05/20/13) Jennifer Liu

Researchers at Disney Research, Carnegie Mellon University, and the Karlsruhe Institute of Technology have developed a method that allows a humanoid robot to receive an object handed to it by a person with something similar to natural, human-like motion. The researchers used motion-capture data with two people to create a database of human motion. The robot quickly searches the database and recognizes what the person is doing to make a reasonable estimate of where the person is likely to extend their hand. However, the researchers say it is not enough to develop a technique that enables the robot to efficiently find and grasp the object. "If a robot just sticks out its hand blindly, or uses motions that look more robotic than human, a person might feel uneasy working with that robot or might question whether it is up to the task," says Disney's Katsu Yamane. The researchers developed a hierarchical data structure that enables a robot to access a library of human-to-human passing motions with the speed necessary for robot-human interaction. Yamane says additional work is necessary to expand the database for a wider variety of passing motions and distances.
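The database lookup can be pictured as a nearest-neighbor search: match the first few observed frames of the giver's hand against recorded trajectories, then reach toward the closest example's final hand position. A toy sketch with a hypothetical two-entry database (the real system uses a hierarchical structure over motion-capture data, not a flat list):

```python
import math

# Hypothetical motion database: each entry is a recorded giver-hand
# trajectory as a list of 3D points (meters). Data is illustrative.
database = [
    [(0.0, 0.0, 1.0), (0.1, 0.0, 1.0), (0.2, 0.0, 1.1), (0.4, 0.0, 1.2)],
    [(0.0, 0.0, 1.0), (0.0, 0.1, 1.0), (0.0, 0.3, 1.1), (0.0, 0.5, 1.1)],
]

def predict_handoff(observed):
    """Match the partially observed trajectory against the database and
    return the final hand position of the closest recorded example."""
    def cost(example):
        # Sum of per-frame distances over the frames seen so far.
        return sum(math.dist(p, q) for p, q in zip(observed, example))
    best = min(database, key=cost)
    return best[-1]

# Two observed frames moving along +x match the first recording, so the
# robot extends toward that example's final position.
assert predict_handoff([(0.0, 0.0, 1.0), (0.1, 0.0, 1.0)]) == (0.4, 0.0, 1.2)
```

The hierarchical index matters precisely because this search must finish within a fraction of a second for the robot's reach to feel natural rather than hesitant.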

3D Modeling Technology Offers Groundbreaking Solution for Engineers
University of Sheffield (05/16/13)

Software developed at the University of Sheffield will make it easier for engineers to develop real-world safety assessments of structures and foundations. The software is designed to directly identify three-dimensional (3D) collapse mechanisms and provide margin of safety information. The software uses the same basic approach as a method for directly identifying two-dimensional (2D) collapse mechanisms, which Sheffield researchers developed in 2007. "The software we have developed means that engineers should in the future be able to model real-world geometries much more easily than before, obviating the need to idealize a complex 3D problem as a much simpler 2D problem," says Sheffield professor Matthew Gilbert. "This should lead to more reliable assessment of margin of safety and, ultimately, save companies time and money on projects."

Security Risks Found in Sensors for Heart Devices, Consumer Electronics
University of Michigan News Service (05/16/13) Nicole Casal Moore

An international research team demonstrated the ability to forge an erratic heartbeat with radio frequency electromagnetic waves, exposing a vulnerability in the sensors used in medical devices, Bluetooth microphones, and computers in Web-based phone calls. "We found that these analog devices generally trust what they receive from their sensors, and that path is weak and could be exploited," says University of Michigan researcher Denis Foo Kune. The researchers tested cardiac defibrillators and pacemakers in open air to determine which radio waveforms could cause interference. They then exposed the medical devices to those waveforms in a saline bath and a patient simulator. The researchers found that in both cases, an attacker would need to be within five centimeters to cause interference. "The problem is that emerging medical sensors worn on the body, rather than implanted, could be more susceptible to this type of interference," says Michigan professor Kevin Fu. The researchers propose that software could ping the cardiac tissue to determine whether the previous pulse came from the heart or from interference. "This type of interference can be prevented with shields and filters like those seen today in military-grade equipment," notes Korea Advanced Institute of Science and Technology professor Yongdae Kim.

Glasgow Scientists Create Single-Pixel Camera for 3D Images
BBC News (05/16/13)

A system using detectors that have a single pixel to sense light offers a low-cost way to create three-dimensional (3D) images. The detectors can see frequencies beyond visible light, giving the system the ability to sense wavelengths far beyond the capabilities of digital cameras, which have imaging sensors that use millions of pixels. The single-pixel detectors are much less expensive than current systems, according to the scientists who developed the technology at the University of Glasgow. They say their approach could lead to new uses for 3D imaging in medicine and geography, and the system could be a valuable tool in a wide range of industries. "A series of projected patterns and the reflected intensities are used in a computer algorithm to produce a 2D image," says Glasgow professor Miles Padgett, who led the team. A 3D image is created by combining images from detectors in four different locations using the "shape from shade" technique. Conventional 3D imaging systems, by contrast, rely on multiple camera viewpoints that must be precisely calibrated. "Our single-pixel system creates images with a similar degree of accuracy without the need for such detailed calibration," Padgett notes.
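The measurement principle Padgett describes — project known patterns, record one total-intensity reading per pattern, then reconstruct the image computationally — can be sketched in a few lines. The correlation-based reconstruction below is a generic "ghost imaging" illustration, not the Glasgow team's algorithm:

```python
import random

random.seed(0)
N_PIX = 16                 # a tiny 4x4 scene
scene = [0.0] * N_PIX
scene[5] = 1.0             # one bright pixel at index 5

# Measurement: for each random binary illumination pattern, the
# single-pixel detector records one number, the total reflected light.
patterns = [[random.randint(0, 1) for _ in range(N_PIX)] for _ in range(2000)]
readings = [sum(p[i] * scene[i] for i in range(N_PIX)) for p in patterns]

# Reconstruction by correlation: pixels that are lit whenever the
# reading is high must belong to the bright part of the scene.
mean_r = sum(readings) / len(readings)
image = [
    sum((r - mean_r) * p[i] for r, p in zip(readings, patterns)) / len(readings)
    for i in range(N_PIX)
]

# The brightest reconstructed pixel matches the scene's bright pixel.
assert image.index(max(image)) == 5
```

Because all spatial information comes from the projected patterns, the lone detector can be swapped for one sensitive to infrared or other bands where megapixel sensors are expensive or unavailable, which is the system's key economic advantage.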

Parcels Find Their Way to You Via the Crowd
New Scientist (05/16/13) Hal Hodson

Microsoft researchers have developed a concept for a crowd-powered delivery system called TwedEx, which would deliver packages to consumers without requiring them to deviate from normal routes. TwedEx would deliver packages via a chain of people based on an algorithm and aggregated Twitter location data to determine the fastest route while also lowering costs, says Microsoft Research's Eric Horvitz. He notes that TwedEx is based on existing user routes and only requires that senders write the recipient's unique identifier on the package, such as their Twitter handle. TwedEx forecasts users' average movements from past Twitter data to determine which people to hand a package to at intermediate locations based on the package's final destination. Users inform the network that they have a package, and the system deduces an optimal route and provides each person in the chain with information about where, when, and to whom the next handoff should occur. Simulations of the system have been highly successful, with packages typically making it across the country in only five hours, Horvitz says. The most likely initial scenario for TwedEx would be for the distribution of vaccines in developing countries, says TwedEx contributor Adam Sadilek.
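Finding the fastest chain of carriers is, at heart, a shortest-path problem over a graph of predicted meetings between people. A minimal sketch using Dijkstra's algorithm, with a hypothetical handoff graph (the Twitter handles and travel times are invented for illustration; the real system forecasts meetings from aggregated location data):

```python
import heapq

# Predicted meetings: an edge (a -> b, hours) means carrier a is
# expected to encounter carrier b after that many hours.
meetings = {
    "@sender":   [("@commuter", 1.0), ("@cyclist", 0.5)],
    "@commuter": [("@recipient", 3.0)],
    "@cyclist":  [("@pilot", 1.0)],
    "@pilot":    [("@recipient", 2.0)],
}

def fastest_route(start, goal):
    """Dijkstra over the handoff graph, minimizing total delivery time."""
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        t, node, path = heapq.heappop(queue)
        if node == goal:
            return t, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, dt in meetings.get(node, []):
            if nxt not in seen:
                heapq.heappush(queue, (t + dt, nxt, path + [nxt]))
    return None

hours, path = fastest_route("@sender", "@recipient")
# The three-handoff chain (3.5 h) beats the direct commuter (4.0 h).
assert path == ["@sender", "@cyclist", "@pilot", "@recipient"]
```

In practice the edge weights would be probabilistic forecasts rather than fixed times, which is why the system leans on large volumes of historical Twitter location data.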

New Insights Into How Materials Transfer Heat Could Lead to Improved Electronics
University of Toronto (05/16/13)

Researchers at the University of Toronto and Carnegie Mellon University (CMU) say they have developed new insights into how materials transfer heat, a breakthrough that could lead to smaller, more powerful electronic devices. The researchers developed a tool to measure the thermal and vibrational properties of solids. They studied materials in which heat is transferred by atomic vibrations in packets called phonons. "In an analogy to light, phonons come in a spectrum of colors, and we have developed a new tool to measure how different color phonons contribute to the thermal conductivity of solids," says CMU's Jonathan Malen. The researchers say the tool will give both industry and academia a better understanding of how an electronic device's ability to dissipate heat shrinks with its size, and how materials can be organized at the nanoscale to change their thermal conductivity. For example, the researchers demonstrated that as silicon microprocessors continue to get smaller, their operating temperatures will be further challenged by lower thermal conductivity. "Our modeling work provides an in-depth look at how individual phonons impact thermal conductivity," says Toronto researcher Dan Sellan.

Stacking 2-D Materials Produces Surprising Results
MIT News (05/16/13) David L. Chandler

Massachusetts Institute of Technology researchers have made progress on the longstanding challenge of developing a band gap property in graphene, which is essential for using the material to make transistors and other electronic devices. Graphene is a carbon-based material with a structure only one atom thick that has intrigued scientists with its unique electronic properties, strength, and minimal weight. By placing a sheet of graphene on top of another one-atom-thick material called hexagonal boron nitride, the researchers created the band gap needed to develop transistors and other semiconductor devices. Combining graphene with boron nitride, an effective insulator that blocks electrons, results in a high-quality semiconductor. Furthermore, the researchers found that they could adjust the properties of the semiconductor by rotating one sheet relative to the other, allowing for a spectrum of materials with varied electronic characteristics. However, the researchers say additional work is needed to increase the band gap, because it is currently too small for practical electronic devices. "The ability to induce a zero-field band gap in graphene may one day allow its use as a switch in transistor applications, providing a viable and inexpensive alternative to silicon electronics," says Rutgers University professor Eva Andrei.

A Master's-Level Computer Science Degree, Delivered Via MOOCs
ZDNet (05/15/13) Joe McKendrick

The Georgia Institute of Technology's College of Computing plans to offer the first online Master of Science degree in computer science that can be earned via a massive open online course (MOOC) format. The degree will be delivered through the Udacity MOOC platform, and AT&T will provide financial support. Students enrolled in the program will pay a fraction of the cost of traditional on-campus master's programs; total tuition is expected to be less than $7,000. Although the Master of Science computer science courses will be available for free on the Udacity site, only students granted admission to Georgia Tech will receive credit. There will be separate credentials for students who successfully complete courses but do not qualify for full graduate standing. A pilot program will begin in late 2013, and initial enrollment will be limited to a few hundred students.

Exhaustive Computer Research Project Shows Shift in English Language
University of Illinois at Urbana-Champaign (05/15/13) Dusty Rhodes

Through a collaborative venture with the University of Illinois' Institute for Computing in Humanities, Arts, and Social Science, and the HathiTrust Research Center, Illinois English professor Ted Underwood analyzed more than 4,200 digitized books to conclude that the English language experienced dramatic shifts between the 18th and 19th centuries. The effort involved the development of data-mining programs that tallied the words and sorted the genres of the volumes. Underwood used data from Google Books to find the 10,000 most frequently used words, and then employed a Web scraper to trace each word's etymology, sort the words by date of entry into the English language, and split them into pre- and post-Norman Conquest clusters. Mapping the ratio of formal to informal words used in books each year from 1700 to 1899 verified Underwood's belief that a Latinate diction set in around 1800, but genre-sorting the books revealed a major shift from formal to less-formal English around 1775. The shift coincides with the emergence of literature as a distinct style of writing, leading Underwood to surmise that "our concept of literature as fictive and aesthetic really emerges in the late 18th century."
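Underwood's formal-versus-informal measure can be approximated by scoring each word by its date of entry into English relative to the Norman Conquest (1066). A toy sketch with a hand-made etymology table (the dates here are rough illustrations; the real study scraped etymologies for the 10,000 most frequent Google Books words):

```python
# Hypothetical etymology table: word -> approximate year of entry into
# English. Pre-1066 words are Germanic stock; later entries are largely
# French/Latin borrowings associated with formal diction.
entry_year = {
    "king": 900, "house": 800, "water": 700, "strong": 900,
    "nation": 1300, "elegant": 1500, "constitution": 1400,
    "literature": 1375,
}

def latinate_ratio(tokens):
    """Fraction of recognized words that entered English after the
    Norman Conquest -- a rough proxy for formal, Latinate diction."""
    known = [w for w in tokens if w in entry_year]
    if not known:
        return 0.0
    post = sum(1 for w in known if entry_year[w] > 1066)
    return post / len(known)

informal = "the king and his house by the water grew strong".split()
formal = "the constitution of a nation shapes its elegant literature".split()

# The Latinate sentence scores higher than the Germanic one.
assert latinate_ratio(informal) < latinate_ratio(formal)
```

Computing this ratio per book per year, and then per genre, is what let Underwood separate the overall Latinate drift around 1800 from the earlier shift toward informality in literary writing around 1775.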

Abstract News © Copyright 2013 INFORMATION, INC.
Powered by Information, Inc.

To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe