Association for Computing Machinery
Welcome to the June 22, 2012 edition of ACM TechNews, providing timely information for IT professionals three times a week.

ACM TechNews mobile apps are available for Android phones and tablets, and for iPhones and iPads.


Next Cameras Come Into View
Wall Street Journal (06/21/12) Gautam Naik

Duke University researchers have developed Aware-2, an experimental digital camera that can generate a still or video image with a billion pixels, or about five times as much detail as can be seen by a person with 20/20 vision. Aware-2 captures more than 30 times as much picture data as the best existing consumer digital cameras, and enables users to zoom in on portions of the image in fine detail. However, the researchers say it might take years to fine-tune the technology for use in consumer devices. The system currently weighs about 100 pounds and is about the size of two stacked microwave ovens, and it takes about 18 seconds to take a picture and store the information on a disk. The researchers, led by Duke's David Brady, say the challenges are to reduce power consumption and shrink the device's electronics. Nevertheless, Columbia University researcher Shree Nayar says Aware-2 could be the first step toward making gigapixel cameras for general use. The device relies on a spherical lens equipped with almost 100 microcameras, each with a 14-megapixel sensor. The camera captures nearly 100 separate images with each picture, and software merges them into a single composite image.
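
The merging step can be pictured as mosaic accumulation: each microcamera's tile is placed at a known offset and overlapping pixels are averaged. The sketch below is a one-dimensional toy illustration with invented pixel values, not the Aware-2 team's actual compositing software, which must also correct for the spherical lens geometry.

```python
def composite(tiles, offsets, width):
    """Merge overlapping 1-D image strips into one line of a mosaic,
    averaging wherever strips overlap (a toy stand-in for 2-D stitching)."""
    acc = [0.0] * width
    weight = [0] * width
    for strip, start in zip(tiles, offsets):
        for i, value in enumerate(strip):
            acc[start + i] += value
            weight[start + i] += 1
    return [a / w if w else 0.0 for a, w in zip(acc, weight)]

# Two 4-pixel strips overlapping by 2 pixels in a 6-pixel line
mosaic = composite([[10.0] * 4, [30.0] * 4], [0, 2], width=6)
print(mosaic)   # the two overlap pixels average to 20.0
```

The averaging hides seams between adjacent microcameras; a production pipeline would use feathered or multi-band blending instead of a flat average.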

NIST’s BIG DATA Workshop: Too Much Data, Not Enough Solutions
CCC Blog (06/21/12) Kenneth Hines

Argonne National Laboratory Computation Institute director Ian Foster gave a keynote speech at the U.S. National Institute of Standards and Technology's recent BIG DATA Workshop, an event that brought together experts from academia, industry, and government to study key topics in support of the federal government's Big Data R&D Initiative. Foster says researchers and institutions can meet the needs of big data by accelerating discovery and innovation worldwide through a research information technology (IT) as a service program. In addition, he says IT professionals can leverage the cloud to provide millions of researchers with access to powerful tools, enable a massive shortening of cycle times in the research process, and reduce research IT costs. Meanwhile, the U.S. National Science Foundation's (NSF's) Howard Wactlar says a paradigm shift is currently taking place, from hypothesis-driven research to data-driven research. Foster and others at the conference note that big data is not a new phenomenon, as NSF held a workshop on the topic in 1997. However, the urgency to develop tools and techniques to work with big data has increased as the amount and complexity of big data has ballooned.

Exascale Computing: The View From Argonne
HPC Wire (06/21/12)

In an interview, Argonne National Laboratory leaders Rick Stevens, Michael Papka, and Marc Snir discuss the challenges and advantages of developing exascale supercomputing systems. Snir stresses that building an exascale system by stitching together many petascale computers is impossible, and argues that exascale is needed to provide complex models to match hypothesis to evidence in increasingly complex systems. "As we transition to the exascale era the hierarchy of systems will largely remain intact, so the advances needed for exascale will influence petascale resources and so on down through the computing space," Papka says. Snir anticipates a 10-year window for exascale system deployment at best, and he notes that Argonne "is heavily involved in exascale research, from architecture, through operating systems, runtime, storage, languages and libraries, to algorithms and application codes." Papka says the U.S. Department of Energy exascale initiative opted for a development approach emphasizing co-design to ensure that the delivered exascale resources fulfill the requirements of the domain researchers and their applications. Stevens agrees that "we will not reach exascale in the near term without an aggressive co-design process that makes visible to the whole team the costs and benefits of each set of decisions on the architecture, software stack, and algorithms."

Facebook Study Reveals What Makes Someone a Leader
New Scientist (06/21/12) Jacob Aron

New York University (NYU) researchers recently studied what makes certain Facebook users more influential than others. Knowing what makes someone influential could help advertisers spread their products through social media, or help promote HIV testing in Africa. The researchers, led by NYU's Sinan Aral and Dylan Walker, studied influence by watching how the use of a film-rating application spread through Facebook users. Starting with a seed group of 7,730 users, the researchers designed the app to randomly send messages to the app users' friends, encouraging them to install the app. Analyzing the results combined with users' Facebook profile data revealed insights into which people are the most influential. The study found that men are 49 percent more influential than women, but women are 12 percent less susceptible to influence than men, and they exert 46 percent more influence over men than over other women. In addition, users older than 31 are 51 percent better at convincing their friends than those under 18. The research also found that single individuals are 113 percent more influential than those in a relationship and 128 percent more than those who define their relationship status as "it's complicated."

'Brave' Features Hair-Raising Animations
Inside Science (06/20/12) Emilie Lorditch

For Pixar's newest film Brave, the animators had to create a wild mane of curls for the heroine, Merida. The animators started creating the hair with a series of springs on a computer. In order to give the hair volume, the springs were entered on the computer screen in layers, with each layer varying the length, size, and flexibility of each curl. "There is this weird paradox where a 'spring' of hair needs to remain stiff in order to hold its curl but it also has to remain soft in its movement," says Pixar simulation supervisor Claudia Chung. The animators used a technique called "core curve and points" to represent the hair because its structure resembles a beaded necklace. When the character turns her head, the curls move along the curve, keeping their shape and flexibility. Another issue was determining how curly hair reacts to light and water. Cornell University professor Steve Marschner says lighting red, curly hair is particularly challenging. "It took us almost three years to get the final look for her hair, and we spent two months working on the scene where Merida removes her hood and you see the full volume of her hair," Chung says.
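
The stiff-but-soft paradox Chung describes maps onto a damped spring: stiffness pulls a displaced curl point back toward its rest shape, while damping keeps the motion soft. The one-dimensional semi-implicit Euler step below is a generic textbook sketch offered for illustration, not Pixar's simulator.

```python
def step_spring(pos, vel, rest, k, damping, dt):
    """One semi-implicit Euler step of a damped spring: stiffness k holds
    the curl's shape, damping softens the motion."""
    force = -k * (pos - rest) - damping * vel
    vel += force * dt        # update velocity first (keeps the step stable)
    pos += vel * dt          # then move the point with the new velocity
    return pos, vel

pos, vel = 1.0, 0.0          # a curl point displaced from its rest shape
for _ in range(1000):        # simulate 10 seconds at dt = 0.01
    pos, vel = step_spring(pos, vel, rest=0.0, k=50.0, damping=4.0, dt=0.01)
print(pos)                   # has decayed back near its rest position 0
```

Raising `k` makes the curl hold its shape more rigidly; raising `damping` makes the motion settle faster but look less lively, which is exactly the trade-off the animators tuned.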

All Things Big and Small: The Brain’s Discerning Taste for Size
MIT News (06/20/12) Abby Abazorius

Massachusetts Institute of Technology (MIT) researchers have found that the brain organizes objects based on their physical size, with a specific region of the brain dedicated to large objects and another dedicated to small objects. The researchers say their findings could have a major impact on the field of robotics and lead to a greater understanding of how the brain organizes and maps information. The researchers took three-dimensional scans of brain activity during experiments in which participants were asked to visualize items of different sizes. The researchers evaluated the scans and found there are distinct regions of the brain that respond to large objects and small objects. Their findings could lead to new techniques for how robots handle different-sized objects. "Paying attention to the physical size of objects may dramatically constrain the number of objects a robot has to consider when trying to identify what it is seeing," says MIT's Aude Oliva. The researchers want to gain a better understanding of the brain's visual processes, which could lead to the development of machines and interfaces that can see and understand the world like humans do.
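
Oliva's point about size constraining recognition can be illustrated with a simple candidate filter; the object catalog and tolerance below are invented for the example, not drawn from the MIT study.

```python
def candidates(objects, observed_size, tol=0.25):
    """Keep only objects whose typical size is within a relative tolerance
    of the size observed, pruning the recognition search space."""
    return [name for name, size in objects.items()
            if abs(size - observed_size) / observed_size <= tol]

# Hypothetical object catalog: name -> typical size in meters
known = {"paperclip": 0.03, "mug": 0.10, "chair": 0.9, "sofa": 2.0}
found = candidates(known, observed_size=0.95)
print(found)   # only the chair is plausibly this size
```

Even this crude pre-filter cuts the catalog from four candidates to one before any expensive visual matching runs, which is the efficiency argument the quote makes.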

$27 Million Award Bolsters Research Computing Grid
University of Wisconsin-Madison (06/20/12) Chris Barncard

The U.S. Department of Energy's (DOE's) Office of Science and the U.S. National Science Foundation (NSF) recently committed up to $27 million to the Open Science Grid (OSG), a nine-member partnership to advance distributed high-throughput computing capabilities at more than 80 sites. "The commitment from the two agencies will take the capabilities and culture we've developed to more campuses throughout the United States," says OSG researcher and University of Wisconsin-Madison professor Miron Livny. "It is about advancing the state of the art to support education and research in more science domains and improve our ability to handle more data." The Office of Science will contribute up to $8.2 million for distributed computing efforts based at DOE national laboratories. The NSF will contribute the remaining balance of the funding, which will be used to promote distributed computing resources at U.S. universities. "Our close partnerships allow us to build on existing experience in working with and processing big data and the advanced networks needed to transport the massive datasets of the future," says OSG researcher Michael Ernst.

Researchers Set New Cryptanalysis World Record for Pairing-Based Cryptography
IDG News Service (06/19/12) Lucian Constantin

Researchers from Fujitsu Laboratories, the National Institute of Information and Communications Technology (NICT), and Kyushu University have set a cryptanalysis world record for pairing-based cryptography (PBC) by using new technologies such as optimization techniques, a new two-dimensional search algorithm, and parallel programming. The team cracked a 278-digit-long key used in a PBC system in 148.2 days by using 21 computers with a total of 252 cores. The researchers note that breaking a PBC key of this length had previously been estimated to take several hundred thousand years. A team from NICT and Hakodate Future University set the previous record in 2009 by cracking a 204-digit-long key, but the 278-digit-long key was considered to be hundreds of times more difficult. PBC can be used for identity-based encryption, keyword-searchable encryption, and other applications for which public-key cryptography is unsuitable. The research could encourage standards organizations and governments to determine a role for PBC in next-generation cryptography standards and the appropriate key length to use with this type of encryption.

Study: Wikipedia Perpetuates Political Bias
Washington Post (06/18/12) Suzy Khimm

Researchers at Northwestern University and the University of Southern California say the collective intelligence used to create Wikipedia articles generally produces biased information. The researchers analyzed a decade's worth of Wikipedia articles on U.S. politics and found that only a few of them were politically neutral. Large numbers of contributors did help make articles less biased, but the researchers found that most articles receive little attention and change only slightly from their original slant. Although Wikipedia is less biased and partisan than when it first launched, most of the content has not benefited from the true wisdom of the crowd, according to the researchers. However, the researchers based their conclusions on a very technical index for measuring political slant, which was developed to measure media bias. The index is based on how many times 1,000 phrases were used by Republicans and Democrats in the 2005 Congressional Record. Although the index determined that Democrats were more likely to use "civil rights," while Republicans were more likely to use "trade deficit," those tendencies might not still hold true in 2012.
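
The slant index works, roughly, by counting how often phrases favored by each party appear in a text. The toy version below uses only the two example phrases mentioned in the article and a made-up scoring formula; the actual index is a weighted statistical measure built from 1,000 phrases.

```python
def slant(text, rep_phrases, dem_phrases):
    """Crude slant score in [-1, 1]: +1 if only Republican-leaning phrases
    appear, -1 if only Democratic-leaning phrases appear, 0 if balanced."""
    text = text.lower()
    r = sum(text.count(p) for p in rep_phrases)
    d = sum(text.count(p) for p in dem_phrases)
    total = r + d
    return (r - d) / total if total else 0.0

rep = ["trade deficit"]    # example phrases cited in the article
dem = ["civil rights"]
article = ("The debate touched on civil rights and the trade deficit, "
           "then returned to civil rights.")
score = slant(article, rep, dem)
print(score)   # more Democratic-leaning phrases, so the score is negative
```

The example also shows the method's fragility the article warns about: the score depends entirely on which phrases are on each list, and those associations can drift between 2005 and 2012.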

In This Online University, Students Do the Teaching as Well as the Learning
Chronicle of Higher Education (06/18/12) Katherine Mangan

Although free online courses have been enticing students with the opportunity to learn from world-class professors at prestigious colleges, Peer 2 Peer University (P2PU) is questioning whether instructors are even necessary. The institution, where anyone can set up a course, is experimenting with ways that students can develop and use open courseware that is free on the Web. P2PU, which began offering courses in 2009, has about 33,000 registered users, says executive director Jan Philipp Schmidt. Although P2PU has no accreditation, students can earn informal alternatives to diplomas, known as badges, to show what they have learned. P2PU's learning style reflects an approach that many classroom instructors have been taking for years as they have stepped away from the lectern to guide students working in small groups. "Everyone recognizes that education is about more than just content, and that it includes interactions with other learners and educators," says P2PU's Stephen E. Carson. P2PU's mentor program enables students who have completed a challenge to help other students who are struggling. "The people who come to P2PU are attracted by the opportunity to take learning into their own hands and to create their own university," Schmidt says.

University of Waterloo Engineers Unveil Two-Way Wireless Breakthrough
University of Waterloo (06/14/12) Carol Truemner

University of Waterloo engineering researchers have developed technology that enables wireless signals to be simultaneously sent and received on a single radio channel frequency. "This means wireless companies can increase the bandwidth of voice and data services by at least a factor of two by sending and receiving at the same time, and potentially by a much higher factor through better adaptive transmission and user management in existing networks," says Waterloo professor Amir K. Khandani. He says two-way wireless technology could lead to huge improvements in voice and data services, and boost wireless networks in terms of quality of service and efficiency. Moreover, the breakthrough could result in ultra-secure transmission. "The cost in hardware and signal-processing complexities and antenna size is very low and virtually the same as current one-way systems," Khandani notes. New applications for two-way technology include methods for interference management, security enhancements, and a unique wireless connectivity method called media-based communication, which embeds data in the transmission medium rather than in the transmitted radio-frequency signal.
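
Sending and receiving on one frequency works only if a radio can subtract the known leaked copy of its own transmission from what it hears. The numeric sketch below illustrates that self-interference cancellation idea in its simplest possible form; the leak model and values are assumptions, not Waterloo's actual signal-processing chain.

```python
def cancel(received, own_tx, leak_gain):
    """Subtract the known leaked copy of our own transmission from the
    received samples, leaving only the remote party's signal."""
    return [r - leak_gain * t for r, t in zip(received, own_tx)]

own_tx = [1.0, -1.0, 1.0, 1.0]   # samples we are transmitting
remote = [0.5, 0.5, -0.5, 0.5]   # samples the other party sent
leak = 0.8                        # self-interference coupling gain

# What the antenna actually hears: remote signal plus our own leakage
received = [s + leak * t for s, t in zip(remote, own_tx)]
cleaned = cancel(received, own_tx, leak)
print(cleaned)   # recovers the remote signal
```

The hard engineering problem is that the real leak is enormously stronger than the remote signal and its gain must be estimated continuously; the subtraction itself is the easy part.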

Mexican Jumping Beans May Influence Robot Design
(06/18/12) Lisa Zyga

The locomotion of the Mexican jumping bean could provide clues for designing future micro-robots with low intelligence and power needs. Georgia Institute of Technology engineers studied the motion of Mexican jumping beans and developed an algorithm based on their rolling, jumping, and flipping behavior. They then used the algorithm to program robots to move in a controlled direction. "Mexican jumping beans have many interesting abilities that are useful for robotics," says Georgia Tech's David Hu. He notes that they move within an armored shell but are able to sense their surroundings to find shadows and hide, and that they are almost spherical but have a notch that enables them to move uphill and across small obstacles. "All these abilities might be implemented in a much smaller robot," Hu says. For example, the research could lead to the design of solar-powered mechanical jumping beans that would be used as inexpensive sensors for detecting temperature gradients. Mechanical beans also could be programmed to respond to other gradients, such as chemical or light gradients, making them useful for sensing and surveillance applications.
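
A shadow-seeking behavior of the kind Hu describes can be modeled as a biased random walk that keeps only hops onto cooler ground. This sketch is an illustrative assumption for a one-dimensional temperature field, not the Georgia Tech team's published algorithm.

```python
import random

def seek_shade(temp, start, steps=200, seed=1):
    """Biased random walk on a 1-D temperature field: hop left or right at
    random, but keep only hops that land somewhere no hotter, the way a
    jumping bean gradually works its way into shadow."""
    rng = random.Random(seed)
    x = start
    for _ in range(steps):
        nx = x + rng.choice([-1, 1])
        if 0 <= nx < len(temp) and temp[nx] <= temp[x]:
            x = nx
    return x

# Temperature falls toward index 0, which represents the shadow
field = [i * 0.5 for i in range(20)]
print(seek_shade(field, start=15))   # the walker ends at the coolest cell
```

Note how little "intelligence" the rule needs: one temperature comparison per hop, with no map and no planning, which is why this style of locomotion suits low-power micro-robots.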

P2P Comes to the Rescue of Internet Video
Europe's Newsroom (06/13/12)

VTT Technical Research Center of Finland researchers, working with a consortium of 20 industrial partners on the P2P-Next project, have developed NextShare, an open source peer-to-peer (P2P) video-streaming platform. The researchers designed, implemented, and tested algorithms and protocols to use P2P architecture to stream video. "The key difference with P2P applications for file sharing is that video data can't be broken into different packets and sent in any order, it has to be sent in sequence and maintain a certain level of quality of service," says P2P-Next project coordinator Jari Ahola. The researchers initially based the system on the BitTorrent protocol, and then developed their own open source protocol called Swift. The researchers tested the technology by building Swarmplayer, a Web browser plug-in that enables Internet users to access Internet video via the Firefox browser. The researchers also developed a set-top box to demonstrate how the technology could be incorporated into consumer electronic devices. The box has "social networking features so, for example, users can view Twitter comment feeds about what they are watching as they watch it," Ahola says. The researchers found that the P2P approach cuts bandwidth demands by at least 65 percent compared with the unicast streaming approach.
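
Ahola's point about in-sequence delivery shows up concretely in piece selection: a streaming peer requests the earliest missing piece inside its playback window, rather than the rarest piece in the swarm as bulk file sharing does. The picker below is a minimal hypothetical sketch, not NextShare's actual Swift protocol logic.

```python
def next_piece(have, playback_pos, window):
    """Pick the next video piece to request: the earliest missing piece
    inside the playback window, so data arrives in sequence (unlike the
    rarest-first selection used for bulk file sharing)."""
    for i in range(playback_pos, playback_pos + window):
        if i not in have:
            return i
    return None   # window fully buffered; the peer could prefetch ahead

have = {0, 1, 3}                                   # pieces already downloaded
print(next_piece(have, playback_pos=0, window=6))  # piece 2 is the first gap
```

Prioritizing the first gap keeps playback from stalling, at the cost of the swarm-health benefits of rarest-first, which is one of the trade-offs a streaming P2P protocol has to balance.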

Abstract News © Copyright 2012 INFORMATION, INC.
Powered by Information, Inc.

To submit feedback about ACM TechNews, contact: [email protected]