Welcome to the August 24, 2011 edition of ACM TechNews, providing timely information for IT professionals three times a week.
Also, please download our new ACM TechNews iPhone App and our new ACM TechNews iPad App from the iTunes Store.
HEADLINES AT A GLANCE
The New Big Data
Technology Review (08/22/11) Erica Naone
Industrial research labs and academic groups will present the latest big data techniques at the 17th ACM Conference on Knowledge Discovery and Data Mining (KDD). Data mining was a concern of only the scientific community until the explosive growth of the Internet, says Usama Fayyad, executive chair of ACM's Special Interest Group on Knowledge Discovery and Data Mining and former chief data officer at Yahoo! Businesses amassing large volumes of data about customers online began to understand the power of data mining and started to invest in the field. One goal of the conference is to bring new data mining techniques to the attention of businesses so they can apply them more quickly, says conference chair Chid Apte. The KDD organizers also hope to give academics a better sense of the most pressing challenges facing business. "People are able to come up with a lot more predictive models, and more importantly score them [to determine how well they work]," Fayyad says. "It takes analysis to a level that's truly beyond human brain comprehension."
Data Are Traveling by Light
Researchers at the Fraunhofer Institute for Telecommunications' Heinrich Hertz Institute (HHI) have developed a new transfer technology for video data as part of the European Union-sponsored OMEGA project. The researchers were able to transfer data at 100 Mbit/s without any losses, using light-emitting diodes (LEDs) in the ceiling. "This means that we transferred four videos in HD quality to four different laptops at the same time," says HHI's Anagnostis Paraskevopoulos. The researchers use a modulator to switch the LEDs on and off very rapidly, transferring the data as 1s and 0s while lighting the room at the same time. The system uses a simple photodiode on the laptop as a receiver. "The diode catches the light, electronics decode the information and translate it into electrical impulses, meaning the language of the computer," says HHI's Klaus-Dieter Langer. Visible light communication is best suited as an additional option for data transfer where radio transmission networks are undesirable or unavailable, according to the researchers. The researchers are currently working to achieve higher bit rates. "Using red-blue-green-white light LEDs, we were able to transmit 800 Mbit/s in the lab," Langer says.
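The on-off principle the article describes can be sketched as simple on-off keying: each data bit maps to one LED state, and the photodiode side reverses the mapping. This is only an illustration of the basic idea; the actual HHI system uses far more sophisticated modulation to reach its data rates, and all names here are invented.

```cpp
#include <cstdint>
#include <vector>

// Toy on-off-keying model: each bit of a byte becomes one LED state
// (1 = on, 0 = off), most significant bit first.
std::vector<bool> modulate(const std::vector<uint8_t>& data) {
    std::vector<bool> led_states;
    for (uint8_t byte : data)
        for (int bit = 7; bit >= 0; --bit)
            led_states.push_back((byte >> bit) & 1);
    return led_states;
}

// The photodiode side: reassemble bytes from the observed light levels.
std::vector<uint8_t> demodulate(const std::vector<bool>& led_states) {
    std::vector<uint8_t> data;
    for (std::size_t i = 0; i + 8 <= led_states.size(); i += 8) {
        uint8_t byte = 0;
        for (int bit = 0; bit < 8; ++bit)
            byte = static_cast<uint8_t>((byte << 1) | (led_states[i + bit] ? 1 : 0));
        data.push_back(byte);
    }
    return data;
}
```

Round-tripping any byte sequence through `modulate` and `demodulate` returns the original data, which is the "without any losses" property the researchers report.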
Researchers Identify First Flaws in the Advanced Encryption Standard
Katholieke Universiteit Leuven (08/17/11)
Katholieke Universiteit Leuven and Microsoft researchers have discovered a weakness in the AES algorithm after a long-term cryptanalysis project. The AES algorithm is used by millions of people worldwide to protect their Internet banking, wireless communications, and hard disk data. AES also is used in more than 1,700 U.S. National Institute of Standards and Technology (NIST)-validated products, and has been validated by NIST, ISO, and IEEE, in addition to being approved by the U.S. National Security Agency for protecting top secret information. The new attack, which applies to all versions of AES even when it is used with a single key, shows that finding the key of AES is four times easier than previously thought. The new attack should not be viewed as a significant practical threat, because the number of steps needed to find an AES-128 key is still an eight followed by 37 zeroes. However, this is the first significant flaw found in the widely used AES algorithm.
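The article's numbers can be checked with a little arithmetic: a brute-force search of AES-128's keyspace takes 2^128 steps, and a four-fold improvement brings that down to 2^126, which is about 8.5 x 10^37 - the "eight followed by 37 zeroes".

```cpp
#include <cmath>

// Brute-force cost of AES-128: one step per candidate 128-bit key.
long double aes128_bruteforce_steps() {
    return powl(2.0L, 128.0L);
}

// The new attack is about four times faster: 2^128 / 4 = 2^126 steps,
// roughly 8.5e37 -- still far beyond any feasible computation.
long double aes128_attack_steps() {
    return powl(2.0L, 128.0L) / 4.0L;
}
```

Even at this reduced cost, checking a trillion keys per second would take on the order of 10^18 years, which is why the researchers say the attack is not a practical threat.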
Kids Learn Computer Programming at Hack the Future
San Francisco Chronicle (08/21/11) Michael Cabanatuan
Hack the Future is an effort to extend the creative tradition of hacking, which involves programmers getting together to create code, share ideas, and learn from each other, to school-age kids. The most recent Hack the Future camp included about 75 kids in grades five through 12, mentored by about 25 tech-industry workers. The hackathon included different stations where the kids were encouraged to work on their own, but were able to get advice or lessons when needed. The stations included activities such as learning how to control a robot, create video games, or solder computer components together. "This whole thing is inspired by a lot of young engineers wanting to share their knowledge," says Pong creator Al Alcorn, who served as a mentor. One of the most popular stations was run by Primer Labs CEO Alex Peake, who teaches young students how to create their own video games. "Video gaming is the entry point into computer science, and computer science is the entry point to all science," Peake says. Another station allowed students to learn how to control a 5-foot-tall robot from Willow Garage.
Music in the Air
Monash University (08/22/11)
Monash University researchers have developed Nodal, new generative software for composing music, interactive real-time improvisation, and musical experimentation. "Nodal uses a unique visual representation which allows the composer to edit and interact with the music generation process as the composition plays," says Monash professor Jon McCormack. Nodal is based on a user-created network consisting of nodes, which represent musical events, and edges, which represent the connections between those events. Clusters of graphical nodes represent musical structure, such as a piece of music. Within the network the composer creates, virtual players generate musical patterns by moving over the geometric structure, which defines pitch, rhythm, and sequence. "It is great for improvisation, quickly creating new musical ideas and thinking about musical structure in unconventional ways," McCormack says. The system has a built-in synthesizer and is compatible with any MIDI hardware or software instrument, making it well suited for educational environments. Though Nodal was initially developed as a new method for writing music, the technology also can be used to explore broader musical phenomena found in nature.
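The nodes-and-edges idea can be sketched in a few lines: each node holds a musical event, edges connect nodes, and a virtual player walks the network emitting notes as it goes. This is a minimal illustration of the concept only; the structure and names here are invented and are not Nodal's actual representation, which also derives rhythm and timing from the geometry.

```cpp
#include <vector>

// A node holds one musical event (here just a MIDI pitch) and the
// indices of the nodes it connects to.
struct Node {
    int midi_pitch;            // e.g., 60 = middle C
    std::vector<int> edges;    // outgoing connections
};

// A virtual player walks the network from `start`, always following the
// first edge, producing a melody of `steps` notes.
std::vector<int> play(const std::vector<Node>& network, int start, int steps) {
    std::vector<int> melody;
    int current = start;
    for (int i = 0; i < steps && !network[current].edges.empty(); ++i) {
        melody.push_back(network[current].midi_pitch);
        current = network[current].edges.front();
    }
    return melody;
}
```

A three-node cycle (C, E, G) played for four steps yields a repeating arpeggio, which hints at how small networks generate ongoing musical patterns.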
Better 'Photon Loops' May Be Key to Computer and Physics Advances
National Institute of Standards and Technology (08/22/11) Chad Boutin
University of Maryland, Harvard University, and U.S. National Institute of Standards and Technology (NIST) researchers are close to developing a method for steering photons accurately through microchips, which could lead to more efficient information processors and eventually quantum computing. Though it is easy for scientists to send photons hundreds of miles through fiber-optic cables, it has been much harder to design a system that can send photons just a few nanometers across computer chips. "We run into problems when trying to use photons in microcircuits because of slight defects in the materials chips are made from," says NIST researcher Jacob Taylor. The researchers used multiple rows of resonators to build alternate pathways into the delay devices, which allow the photons to easily find their way around the defects. "We hope these devices will allow us to sidestep some of the problems with observing the physics directly, instead allowing us to explore them by analogy," says NIST researcher Mohammad Hafezi.
Computational Method Predicts New Uses for Existing Medicines
National Institutes of Health (08/17/11) Alisa Machalek
A recent National Institutes of Health (NIH)-funded computational study analyzing genomic and drug data has been able to predict new uses for existing medicines. "If we can find ways to repurpose drugs that are already approved, we could improve treatments and save both time and money," says NIH's Rochelle M. Long. The researchers, led by Stanford University's Atul J. Butte, focused on 100 diseases and 164 drugs, creating a computer program to search through the thousands of possible drug-disease combinations to find drugs and diseases whose gene expression patterns essentially canceled each other out. The researchers noticed that diseases with similar molecular processes clustered together in the analysis, as did drugs with similar effects. By studying unexpected members of these clusters, it could be possible to learn how certain diseases progress and how some drugs work at the molecular level, according to the researchers.
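The matching idea behind the study can be sketched as follows: a disease and a drug are each summarized by a gene-expression signature, and a drug whose signature is strongly anti-correlated with the disease's - up where the disease is down, and vice versa - is a repurposing candidate. The data and scoring below are invented for illustration and are not the study's actual pipeline.

```cpp
#include <cmath>
#include <vector>

// Pearson correlation between two expression signatures; a value near -1
// means the drug's expression changes "cancel out" the disease's.
double pearson(const std::vector<double>& x, const std::vector<double>& y) {
    double n = static_cast<double>(x.size());
    double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
    for (std::size_t i = 0; i < x.size(); ++i) {
        sx += x[i]; sy += y[i];
        sxx += x[i] * x[i]; syy += y[i] * y[i]; sxy += x[i] * y[i];
    }
    double cov = sxy - sx * sy / n;
    return cov / std::sqrt((sxx - sx * sx / n) * (syy - sy * sy / n));
}
```

Scanning all drug-disease pairs for the most negative correlations is what lets such a program sift through thousands of combinations automatically.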
Could A Crypto-Computer in Your Pocket Replace All Passwords?
Forbes (08/17/11) Andy Greenberg
Cambridge University researcher Frank Stajano recently presented a paper on the Pico, a tiny computer that can be carried around and functions as the authenticator for potentially thousands of different services or devices. In addition to never having to remember passwords, Pico users would be immune from phishing attacks, choosing weak passwords, or even having a password stolen. "The user has a trustworthy device ... that acts as a memory prosthesis and takes on the burden of remembering authentication credentials, transforming them from 'something you know' to 'something you have,'" Stajano says. According to him, a Pico would be a small computing device with a radio and a camera, using public key cryptography to generate and store thousands of public and private key pairs, one for every app or gadget the user needs to unlock. The Pico’s camera would read a visual code on a login screen or device to identify it, and then send out a message over its radio to a remote login server, encrypting a message to it that only the service would be able to decrypt with a secret key. The system would not only confirm the identity of the user, but also the service or device the user wants to access.
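The core of the scheme - one independent credential per service, minted and stored on the device so the user remembers nothing - can be sketched as below. This is an illustration only: the class and method names are invented, not from Stajano's paper, and the "signature" is a toy stand-in (`std::hash`) for the public-key signatures a real Pico would compute.

```cpp
#include <functional>
#include <map>
#include <string>

// Toy model of a Pico device: a fresh secret per service, and a
// challenge-response login that never reuses credentials across services.
class PicoDevice {
    std::map<std::string, std::string> secrets_;  // per-service secrets
public:
    // Enroll with a service: mint a secret used only for that service.
    void pair(const std::string& service_id) {
        secrets_[service_id] = "secret-for-" + service_id;  // toy key generation
    }
    // Answer a login challenge; a real Pico would sign it with the
    // service-specific private key instead of hashing.
    std::size_t respond(const std::string& service_id, const std::string& challenge) {
        return std::hash<std::string>{}(secrets_[service_id] + challenge);
    }
};
```

Because each service gets its own credential, a phished or breached service learns nothing usable against any other - the property the article highlights.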
Feng Awarded AMD Research Faculty Fellowship for Work in Heterogeneous Computing
Virginia Tech News (08/17/11) Steven Mackay
Virginia Tech professor Wu Feng recently received an AMD Research Faculty Fellowship from Advanced Micro Devices, which supports his research in heterogeneous computing, graphics processing units (GPUs), and accelerated processing units (APUs), which combine central processing units and GPUs. These heterogeneous computing environments are expected to significantly speed up many general-purpose computation tasks normally executed by a computer's central processing unit. "Due in large part to this fellowship, we now routinely re-purpose the graphics processing unit, which traditionally has been used to drive video displays, to perform massively accelerated general-purpose computation," Feng says. Feng also is developing a toolbox of automated computer language translators that would allow an end user to move application code written in one language into a different language. The fellowship also will support research to train a workforce that can take advantage of heterogeneous computing's potential to accelerate the way computers process information, according to Feng.
Computers Will be Able to Tell Social Traits From the Face
Autonomous University of Barcelona and Princeton University researchers, led by Barcelona's Mario Rojas, have developed machine-learning software that predicts how faces will be judged on certain social traits, in some cases with more than 90 percent accuracy. The researchers studied to what extent facial trait judgments are learnable from the point of view of computer science, specifically regarding nine traits (attractive, competent, trustworthy, dominant, mean, frightening, extroverted, threatening, and likable). The researchers trained and tested the software on a set of synthetic facial images from a previous study. The dominant, threatening, and mean traits were found to be predictable with accuracies between 91 and 96 percent. The study also attempted to determine what information is computationally useful for the prediction task.
New Tool Allows First Responders to Visualize Post-Event Disaster Environments
Sandia National Laboratories (08/17/11) Mike Janes
During the U.S. Federal Emergency Management Agency's National Level Exercise 2011 (NLE-11), emergency preparedness officials and first responders used iPads and the Sandia National Laboratories-developed Standard Unified Modeling, Mapping, and Integration Toolkit (SUMMIT), which allowed them to view and modify accurate models of building damage and other post-event disaster conditions. "The SUMMIT software tool ... will be a phenomenal training aid for all responders within our county," says Craighead County, Ark., emergency management director David Moore. "By having a graphical view of damaged areas, it's much easier to comprehend what's going on in the exercise and thus make smarter, firmer decisions." The SUMMIT software provided NLE-11 participants with an enhanced, three-dimensional virtual view of damage in the field, creating a new level of realism and a common operating picture for players in future exercises. SUMMIT improves the cycle of activities that emergency response teams perform, such as pre-event planning and equipping, training and exercises, and evaluation and improvement. SUMMIT's architecture will help emergency preparedness professionals at the federal, regional, and local levels tap into existing models to ensure consistency, accuracy, and robustness of exercise scenarios.
Virtual Touch Helps Keyhole Surgeons to 'Feel' Tumours
BBC News (08/17/11)
Leeds University researchers have developed tactile feedback technology that combines computer virtualization with a hand-held haptic device that simulates pressure on a surgeon's hand when human tissue is touched remotely, giving doctors a virtual sense of feeling tumors during an operation. The new system could enable surgeons to handle a tumor remotely to determine if it is malignant or benign. The system works by varying feedback pressure on the user's hand as the density of the tissue being examined changes. "You can actually feel the response forces you would have felt on your hand," says Leeds researcher Earle Jamieson. During testing, the researchers simulated tumors in a human liver using a soft block of silicone embedded with ball bearings, which the user was able to locate using the haptic feedback system. "Three or four surgeons tried an early version of our system, and thought it was potentially very useful," Jamieson says.
C++ Upgrade Wins Unanimous Approval
InfoWorld (08/16/11) Paul Krill
The ISO C++ standards committee has unanimously approved the C++ language upgrade known as C++11, which will let developers create parallel algorithms for higher performance, says committee chair Herb Sutter. C++ is used to build the compilers and runtimes for almost all computing languages, all major Web browsers, and all major operating systems. C++11 features lambda functions, which are crucial to parallel algorithms and revolutionize the use of the existing Standard Template Library. "What's already there in the C++98 standard library will immediately become even easier to use," Sutter says. The new language also features types for portable, lock-free programming, auto and decltype type deduction, and smart pointers. The next step for the C++ standard is for compilers to conform to it in the next few years. "While that happens, the standards committee will continue to build on the great new language features already in C++11 by continuing to add to C++'s standard library so that a larger portable library [that does not require new language features beyond those already in C++11] will be available 'in the box' with every C++ implementation," Sutter says.
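A small taste of the features mentioned above - lambdas passed straight into Standard Template Library algorithms, and smart pointers that manage memory automatically (the function names here are illustrative, not from the standard):

```cpp
#include <algorithm>
#include <memory>
#include <vector>

// A lambda used as a predicate for std::count_if. Before C++11 this
// required writing a separate named functor class.
int count_over(const std::vector<int>& values, int threshold) {
    return static_cast<int>(std::count_if(values.begin(), values.end(),
        [threshold](int v) { return v > threshold; }));
}

// A C++11 smart pointer: the vector is freed automatically when the last
// shared_ptr to it goes away -- no manual delete.
std::shared_ptr<std::vector<int>> make_values() {
    return std::make_shared<std::vector<int>>(
        std::initializer_list<int>{1, 5, 9, 3});
}
```

The lambda captures `threshold` by value, illustrating why Sutter says the existing C++98 standard library "will immediately become even easier to use": algorithms that once needed boilerplate functors now take inline predicates.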
Abstract News © Copyright 2011 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.