Association for Computing Machinery
Welcome to the June 8, 2012 edition of ACM TechNews, providing timely information for IT professionals three times a week.

ACM TechNews mobile apps are available for Android phones and tablets, and for iPhones and iPads.

HEADLINES AT A GLANCE


'Big Data' From Social Media, Elsewhere Online Redefines Trend-Watching
Washington Post (06/07/12) Ariana Eunjung Cha

Trend analysis and forecasting are being rethought thanks to the explosion of data generated by social media and other services, with IBM estimating that about 2.5 quintillion bytes are created each day. Mining this data to extract trend insights through advanced computing and analytics "enables us to watch changes in society in real time and make decisions in a way we haven't been able to ever before," says Harvard University professor Gary King. For example, Barack Obama's reelection campaign employs scores of people using computers to monitor sentiment about candidates via Twitter. Big data's transformative power is being felt most profoundly in the finance industry, where such information is fed into computers that drive high-frequency trading algorithms. However, there is considerable pushback against big data mining by organizations worried that the way the data is used undermines personal privacy, a situation compounded by the lack of a definitive regulatory framework. Although the size of the data sets is such that they are routinely analyzed in aggregate, privacy proponents are concerned that information can still be linked to individuals. Information asymmetry, in which certain parties have an unfair advantage because they have better information than others, is another cause for worry.


Multi-Million Pound New National Supercomputer to Perform Astronomical Feats
University of Leicester (United Kingdom) (06/07/12) Ather Mirza

The University of Leicester recently was selected as one of four sites to host new national high-performance computing (HPC) facilities for theoretical astrophysics and particle physics research. The university is supporting the project by investing in a major upgrade of its data center to host the new facility. "We will now be able to carry out the largest and most detailed simulations of planets, stars, and galaxies that have ever been performed and answer questions that we could not even have asked just a few years ago," says Leicester researcher Mark Wilkinson. The supercomputer will be part of the Science & Technology Facilities Council DiRAC consortium, which provides HPC facilities for United Kingdom-based research institutes. "Leicester has a well-established reputation at the forefront of theoretical astrophysics worldwide and this will secure our position as a major international research center for computational astrophysics," says Leicester professor Andrew King. The facility also will be used to train the next generation of researchers in the use of the latest supercomputer technology.


Human Memory, Computer Memory, and Memento
IEEE Spectrum (06/05/12) Steven Cherry

University of Michigan professor John Laird describes the state, operator, and result (Soar) cognitive architecture as an artificial intelligence system that functions like a brain to solve problems. He says all matching rules fire in parallel in Soar, while selecting the next operator is the locus of decision making. "What we're trying to do in Soar is combine lots of rules at the same time, so when it's in a given situation, many rules will match, and instead of picking one, it will fire all of them, and instead of those doing actions, say, in the world, instead what they're doing, the first phase of that, is proposing separate actions," Laird notes. He says more memories have been added to Soar so that it can not only analyze rules to ascertain the next course of action, but also access these other memories that supply additional data about what to do next. Laird points out that a key element of Soar's problem-solving capability is a framework for episodic memory, a vital component of human-level cognition. He speculates that at some point the same type of task-specific or domain-specific knowledge included in IBM's Watson supercomputer will need to be incorporated into Soar.
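
The propose-then-decide cycle Laird describes can be made concrete with a small sketch. The toy Python program below is purely illustrative, not Soar's actual code: every rule whose condition matches fires in parallel to propose an operator, and a separate decision phase then selects one operator to apply.

    # Illustrative sketch of Soar-style decision making (not Soar's code):
    # matching rules fire in parallel to propose operators; a separate
    # decision phase selects the next operator -- the locus of decision making.

    def elaboration_phase(rules, state):
        """Fire every rule whose condition matches; collect proposals."""
        return [rule["propose"](state) for rule in rules
                if rule["condition"](state)]

    def decision_phase(proposals, preference):
        """Select one operator from all parallel proposals."""
        return max(proposals, key=preference) if proposals else None

    # Hypothetical toy domain: move toward a goal position on a line.
    rules = [
        {"condition": lambda s: s["pos"] < s["goal"],
         "propose": lambda s: "move_right"},
        {"condition": lambda s: s["pos"] > s["goal"],
         "propose": lambda s: "move_left"},
    ]

    state = {"pos": 2, "goal": 5}
    proposals = elaboration_phase(rules, state)  # all matches fire at once
    operator = decision_phase(proposals, lambda op: 1)  # then pick one
    print(proposals, "->", operator)  # ['move_right'] -> move_right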


Translation Tools Could Save Less-Used Languages
Technology Review (06/06/12) Tom Simonite

Researchers at Google and Microsoft are developing new translation technologies aimed at obscure languages. Google recently launched experimental "alpha" support for five Indian languages by giving its software direct lessons in grammar: the system was trained on a large collection of sentences in which the grammatical parts had been labeled. Microsoft also is trying to help languages not in common use online, so that they are not sidelined and do not become extinct, says Microsoft Research's Kristin Tolle. Microsoft's researchers recently launched Translation Hub, a Web site that lets anyone create their own translation software. "Allowing anyone to create their own translation model can help communities save their languages," Tolle says. She notes that machine-translation systems have been developed for approximately 100 of the world's 7,000 languages. Users of Translation Hub upload source materials in the two languages to be translated between; machine-learning algorithms then use that material to translate new text between those languages. Translation Hub also is designed to facilitate the translation of specialist technical terms, which many online translation tools do not handle well.
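
Translation Hub's internals are not public, but the core idea of learning translations from uploaded parallel text can be sketched in a few lines. The toy Python below simply counts word co-occurrences across hypothetical sentence pairs; real systems use far richer statistical models.

    # Toy sketch of learning a word-translation table from parallel text.
    # (Illustrative only; Translation Hub's actual algorithms are not public.)
    from collections import defaultdict

    parallel = [  # hypothetical tiny parallel corpus
        ("the house", "das haus"),
        ("the cat", "die katze"),
        ("a house", "ein haus"),
    ]

    cooccur = defaultdict(lambda: defaultdict(int))
    for src, tgt in parallel:
        for s in src.split():
            for t in tgt.split():
                cooccur[s][t] += 1  # count pairings across sentence pairs

    def translate_word(word):
        """Pick the target word that most often co-occurs with `word`."""
        candidates = cooccur.get(word)
        return max(candidates, key=candidates.get) if candidates else word

    print(" ".join(translate_word(w) for w in "the house".split()))  # das haus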


Internet Co-Creator Vint Cerf Welcomes IPv6 Elbow Room
CNet (06/05/12) Stephen Shankland

Google vice president and ACM president-elect Vint Cerf is eager for the improvements that the transition to Internet Protocol version 6 (IPv6), whose address space is large enough to assign a network address to 340 undecillion devices, will enable. One such improvement is a more direct architecture that is not obscured by the address-sharing of network address translation. Cerf also expects that machine-to-machine communication will be facilitated by IPv6. He notes that beginning around 1988, the number of hosts on the Internet grew about 100 percent per year--and the number of users nearly half as fast--but in the last 12 years the compounded annual growth rate for users has been 15.5 percent. "Generally, though, it has been hard to make predictions," Cerf acknowledges. "The growth of mobiles has been dramatic as has the growth of smartphones connected to and making use of the Internet." Cerf also foresees the coming of an 'Internet of things' with a high degree of two-way interaction, driven by the creation and movement of increasingly large volumes of data. "Shared databases and instrument data-gathering will produce information that needs to be pushed into the Net," Cerf says.
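
The "340 undecillion" figure is simply the size of IPv6's 128-bit address space; a quick calculation confirms it and shows the jump from IPv4's 32 bits.

    # The address-space jump from IPv4 (32-bit) to IPv6 (128-bit):
    ipv4 = 2 ** 32
    ipv6 = 2 ** 128
    print(f"IPv4: {ipv4:,} addresses")    # ~4.3 billion
    print(f"IPv6: {ipv6:.3e} addresses")  # ~3.403e+38, i.e. 340 undecillion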


'Siri, Kill That Guy': Drones Might Get Voice Controls
Wired News (06/05/12) David Axe

Future U.S. Air Force drone operators could talk to a drone and receive a verbal response, similar to a Siri-style two-way voice exchange. Moreover, next-generation controls could include smarter, easier-to-interpret computer displays and tactile feedback, akin to a vibrating Xbox controller, that shakes the drone operator's virtual cockpit if the robot detects incoming enemy fire. The current interface consists of computer screens, keyboards, and joysticks for steering robots, with input limited to keystrokes and mouse and joystick movements transmitted via satellite. The Air Force Research Laboratory's (AFRL's) Mike Patzek says new man-machine interfaces could replace this desktop-type environment in the next decade or so. How the interfaces evolve will depend on the progress of the Air Force's research and its funding, but there is no dispute that flying robots will have a key role in U.S. air power in the years to come. "The fundamental issue is that the [robotic] systems are going to be more capable and have more automation," says AFRL's Mark Draper. "The trick is, how do you keep the human who is located in a different location understanding what that system is doing, monitoring and intervening when he or she needs to?"


NASA Starts Software Dev for Deep Space Rocket
InformationWeek (06/05/12) Patience Wait

The U.S. National Aeronautics and Space Administration's (NASA's) Marshall Space Flight Center is ready to start developing flight software for its Space Launch System (SLS), a heavy-lift rocket that will take humans on missions beyond the Earth's orbit by 2017. Boeing has delivered three testbed computers, and each will serve as a standalone version of the flight computers on board SLS. NASA says the real-time, embedded computers are capable of executing programs on a fixed schedule measured in milliseconds. Each testbed computer has three redundant processors that interpret data separately and "vote" to ensure all agree on a response to be sent. The flight computers compare answers and then send commands for execution. "The triple redundant processors make each computer reliable in the harsh radiation environment," says Boeing's Dane Richardson. "Similarly, the three computers working in concert make the vehicle reliable." NASA will use the testbed systems to develop the application software and advanced modeling and simulation of space flight conditions.
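
The voting scheme Richardson describes is classic triple modular redundancy, which a short sketch can illustrate. The toy Python below is not NASA's flight code, just the majority-vote idea: three processors compute independently, and a vote masks any single faulty result.

    # Sketch of triple-modular-redundancy voting.
    from collections import Counter

    def vote(results):
        """Return the majority answer, or None if all three disagree."""
        answer, count = Counter(results).most_common(1)[0]
        return answer if count >= 2 else None

    print(vote([42, 42, 42]))  # 42 -- all processors agree
    print(vote([42, 42, 7]))   # 42 -- one radiation-induced error outvoted
    print(vote([1, 2, 3]))     # None -- no majority; flag a fault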


Nuclear Weapon Simulations Show Performance in Molecular Detail
Purdue University News (06/05/12) Emil Venere

High-performance computing researchers at Purdue University and the U.S. National Nuclear Security Administration's Lawrence Livermore National Laboratory are perfecting simulations that show a nuclear weapon's performance in precise molecular detail. The simulations might require 100,000 machines, a scale that is essential for accurately capturing molecular-scale reactions that unfold over milliseconds. Purdue professor Saurabh Bagchi notes that "due to natural faults in the execution environment, there is a high likelihood that some processing element will have an error during the application's execution, resulting in corrupted memory or failed communication between machines." However, the researchers have developed automated methods to detect a glitch as soon as it occurs. "You want the system to automatically pinpoint when and in what machine the error took place and also the part of the code that was involved," Bagchi says. The researchers created an automated method for clustering the large number of processes into a smaller number of equivalence classes with similar characteristics, which makes it possible to quickly spot and pinpoint problems. "The recent breakthrough was to be able to scale up the clustering so that it works with a large supercomputer," Bagchi says.
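
The equivalence-class idea can be illustrated with a toy sketch: give each process a behavioral signature, group identical signatures, and treat a singleton class as suspect. The signatures below are hypothetical, and the researchers' actual clustering is far more sophisticated.

    # Toy sketch of fault detection via behavioral equivalence classes.
    from collections import defaultdict

    # Hypothetical per-process signatures: (function reached, messages sent).
    signatures = {f"rank{i}": ("send_halo", 128) for i in range(8)}
    signatures["rank5"] = ("send_halo", 3)  # a corrupted process diverges

    classes = defaultdict(list)
    for proc, sig in signatures.items():
        classes[sig].append(proc)  # equivalence class = identical signature

    for sig, members in classes.items():
        if len(members) == 1:  # a singleton class is a likely fault
            print("suspect process:", members[0], "signature:", sig)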


Concordia's 3D Innovation Revolutionizes Visual Art
Concordia University (06/04/12) Clea Desjardins

Researchers from Concordia University and the Emily Carr University of Art + Design are developing three-dimensional (3D) drawing tools for a software system that enables animation artists to literally draw in thin air. Concordia's Leila Sujir and Sudhir Mudur and Emily Carr's Maria Lantin are researching intuitive free-hand drawing as well as creating low-cost, high-performance motion-tracking solutions. The researchers are using Sandde, a stereoscopic animation drawing device developed by Janro Imaging Laboratory (JIL). JIL is working to develop the user interface and hardware interface components for the Sandde animation software in collaboration with Concordia professors and graduate students. The researchers say Sandde has the potential to revolutionize video game design, performance art, and Hollywood 3D feature animations by facilitating the free movement of artists as they add 3D layers to their work. They say their year-long project will bring physicality back to the process of creating digital content.


SDSC Supercharges its 'Data Oasis' Storage System
UCSD News (CA) (06/05/12) Jan Zverina

The University of California, San Diego's San Diego Supercomputer Center (SDSC) recently launched its Data Oasis parallel file system, which has four petabytes of capacity, to handle the needs of its new Gordon supercomputer, as well as its Trestles and Triton high-performance computing systems. "We view Data Oasis as a solution for coping with the data deluge going on in the scientific community, by providing a high-performance, scalable storage system that many of today's researchers need," says SDSC director Michael Norman. "We are entering the era of data-intensive computing, and that means positioning SDSC as a leading resource in the management of 'big data' challenges." The storage system delivers a sustained 100 GB/s, a rate necessary to support Gordon's data-intensive computing power. "Big data is not just about sheer size, it's also about the speed of moving data where it needs to be, and the integrated infrastructure and software tools to effectively do research using those data," Norman says. "The capability of Data Oasis allows researchers to analyze data at a much faster rate than other systems, which in turn helps extract knowledge and discovery from these datasets."
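
Those two figures imply a useful back-of-the-envelope number: even at the quoted sustained rate, streaming the system's full capacity takes hours, which is why sheer bandwidth matters as much as size.

    # Back-of-the-envelope: time to stream 4 PB at a sustained 100 GB/s.
    capacity_bytes = 4e15        # 4 petabytes (decimal units)
    rate_bytes_per_s = 100e9     # 100 GB/s sustained
    seconds = capacity_bytes / rate_bytes_per_s
    print(f"{seconds:,.0f} s is about {seconds / 3600:.1f} hours")  # ~11.1 h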


Oh, That’s Near Enough
The Economist (06/02/12)

The limits of Moore's Law, the observation that the number of transistors that can fit on a chip doubles roughly every two years, may finally be in sight. Transistors now measure as little as 22 nanometers across, and as they shrink, keeping them cool and error-free becomes more difficult. Researchers around the world are therefore designing smaller, faster, and more energy-efficient "sloppy chips" that tolerate errors in their operation. An international team of researchers at Rice University, the Swiss Center for Electronics and Microtechnology (CSEM), and Nanyang Technological University found that by reducing the operating voltage, sloppy chips could deliver performance equivalent to ordinary chips while using 25 percent of the energy. Another technique, known as pruning, involves wiring chips so that more power is delivered to the most important areas, while areas that compute non-essential data are given less power or removed altogether. Tests at CSEM found that pruned circuits were twice as fast, consumed half as much energy, and were half the size of conventional circuits. CSEM also is developing an error-prone chip for audio-visual processing in mobile phones that dispatches different processing tasks to the appropriate circuitry. Another approach to managing errors, called asymmetric reliability, uses error-prone circuits for number crunching to save power and run faster, says Stanford University's Subhasish Mitra.
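
The energy-versus-accuracy trade can be illustrated with a toy "pruned" adder that ignores its lowest-order bits, analogous to removing circuitry that computes non-essential data. This sketch is illustrative only, not the researchers' actual hardware designs.

    # Toy "sloppy" adder: drop the low-order bits to trade a small error
    # for less work per operation.
    def pruned_add(a, b, dropped_bits=4):
        """Add two ints while ignoring the low `dropped_bits` bits."""
        mask = ~((1 << dropped_bits) - 1)
        return (a & mask) + (b & mask)

    exact = 1000 + 777
    sloppy = pruned_add(1000, 777)
    print(exact, sloppy, f"error {(exact - sloppy) / exact:.2%}")  # ~0.96%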


Big Health Benefits From a Little Small Talk, Researchers Find
University of Queensland (05/29/12)

University of Queensland researchers have developed Discursis, an automated computer visualization technique that represents conversations visually and helps medical practitioners understand their structure, information content, and inter-speaker relationships. The researchers used Discursis to visually map and analyze conversational behavior in medical consultations. The research suggests that visualization techniques such as Discursis can be a valuable tool for improving doctor-patient interactions, according to Queensland researcher and study co-author Daniel Angus. He says effective communication between healthcare professionals and patients has been shown to lead to better patient health outcomes. Discursis automatically builds an internal language model from a transcript, mines the transcript for its conceptual content, and generates an interactive visual account of the conversation. "We have managed to visualize and, through this, analyze doctor-patient interactions through detailed verbal positioning by speakers," Angus says. The system can identify effective doctor-patient communication as well as poor communication. "Using a checklist style of consultation can hinder patient outcomes, and doctors who spend too much time on 'off-topic' banter can build good rapport, but do so at the expense of developing the health narrative," Angus says.
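
Discursis's concept model is proprietary, but the underlying recurrence idea can be sketched simply: score each conversational turn against earlier turns by shared vocabulary, so recurring topics stand out. The toy Python below uses plain word overlap, a much cruder measure than Discursis's conceptual analysis, on a hypothetical consultation fragment.

    # Toy conceptual-recurrence sketch: compare each turn of a conversation
    # with every earlier turn by Jaccard similarity of their word sets.
    turns = [  # hypothetical consultation fragment
        ("doctor", "how is the chest pain today"),
        ("patient", "the pain is worse when I walk"),
        ("doctor", "does walking always trigger the pain"),
    ]

    def overlap(a, b):
        wa, wb = set(a.split()), set(b.split())
        return len(wa & wb) / len(wa | wb)

    for i, (spk_i, text_i) in enumerate(turns):
        for j in range(i):
            spk_j, text_j = turns[j]
            print(f"turn {i} ({spk_i}) vs turn {j} ({spk_j}): "
                  f"{overlap(text_i, text_j):.2f}")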


Computer Program Spots Fake Product Reviews
InnovationNewsDaily (06/01/12)

Stony Brook University computer scientists have developed software that complements Cornell University research on identifying fake online product reviews. The Stony Brook approach requires only data on how reviews are distributed, which makes it easier to apply to many different sites. The team analyzed reviews on TripAdvisor and Amazon, graphed their distribution across different star ratings, and found that the presence of fake reviews led to unusual graph shapes. The researchers note that the highest-rated products appear to have a disproportionate number of people who logged on only once to give a rave review. The team developed three mathematical methods to ferret out fakers, but its best method was only 72 percent accurate, compared with the almost 90 percent rate of the program developed last year by Cornell researchers. The Cornell method involved hiring people to write fake reviews and creating a database the program could scan for patterns.
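
The distribution-shape signal can be illustrated with a toy check: compare the share of five-star ratings from accounts that reviewed only once against the overall share. The data and threshold below are hypothetical, and the Stony Brook methods model the full rating distribution rather than this single feature.

    # Toy skew check inspired by the distribution-shape idea.
    reviews = [  # (stars, reviews written by that account) -- hypothetical
        (5, 1), (5, 1), (5, 1), (5, 1), (4, 12), (3, 7), (5, 2), (2, 30),
    ]

    singles = [stars for stars, n in reviews if n == 1]
    rave_share = sum(s == 5 for s in singles) / len(singles)
    overall_share = sum(s == 5 for s, _ in reviews) / len(reviews)

    print(f"5-star share among one-off accounts: {rave_share:.0%}")
    print(f"5-star share overall: {overall_share:.0%}")
    if rave_share > overall_share + 0.25:  # hypothetical threshold
        print("rating-distribution skew suggests possible fake reviews")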


Abstract News © Copyright 2012 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe