Association for Computing Machinery
Welcome to the August 26, 2009 edition of ACM TechNews, providing timely information for IT professionals three times a week.

HEADLINES AT A GLANCE


Researchers Report Progress on Parallel Path
EE Times (08/24/09) Merritt, Rick

Researchers at the University of California, Berkeley, the University of Illinois, and Stanford University, with funding from Intel, Microsoft, and other companies, have been developing ideas about the future of chips with dozens or even hundreds of cores and coding the prototype parallel software that will be needed to capitalize on those chips' power. The researchers hope to demonstrate "reasonably complete" parallel software stacks running on simulators or prototype hardware within two years, and to provide enough solid work to create a road map to commercial use. "We believe within four years we can show companies like Intel and Microsoft how they can make their offerings better support parallel programming," says Marc Snir, co-director of the Universal Parallel Computing Research Center at the University of Illinois. "There is no silver bullet, but we hope we can make developing parallel software as easy as developing today's software." Berkeley Parallel Lab's David Patterson, a former president of ACM, says the industry expects processors with 64 cores to be available by 2015, forcing the need for parallel software. Although past efforts have not produced a broadly useful parallel programming model, Patterson believes the current industry-wide push will be more successful. "I think we've made significant strides," says Stanford University Pervasive Parallelism Lab director Kunle Olukotun. "We had this vision and have started to fill in the pieces and it feels like the vision [is] really coming to pass."
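
The kind of abstraction the researchers are after can be illustrated with a generic data-parallel map. The sketch below is a minimal Python example under that assumption; it is not drawn from any of the three labs' actual software stacks.

    # Minimal illustration of a data-parallel pattern: the same independent
    # task is applied across a collection, and a pool of worker processes
    # (by default one per available core) splits the work automatically.
    from multiprocessing import Pool

    def work(x):
        # Stand-in for an independent, CPU-bound computation.
        return x * x

    if __name__ == "__main__":
        with Pool() as pool:
            results = pool.map(work, range(100000))
        print(sum(results))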


Moral Machines
AlphaGalileo (08/25/09)

Scientists at the Universidade Nova De Lisboa, in Portugal, and the Universitas Indonesia, in Indonesia, are researching artificial intelligence and the application of computational logic to create machines capable of making decisions. The researchers are investigating prospective logic as a way to program morality into a computer. They say prospective logic can model a moral dilemma and determine the logical outcomes of all possible decisions, potentially allowing for machine ethics. Machine ethics would enable designers to develop fully autonomous machines that can be programmed to make judgments based on a human moral foundation. For example, the researchers say that machine ethics could help psychologists and cognitive scientists find a new way of understanding moral reasoning in people, or extract fundamental principles from complex situations to help people decide what is right and wrong. The researchers have developed a system capable of working through the "trolley problem," an ethical dilemma proposed by British philosopher Philippa Foot in the 1960s in which a runaway trolley is about to hit five people tied to the track, but the subject can hit a switch that will send the trolley onto another track where only one person is tied down. The prospective logic program can consider each possible outcome based on different scenarios and demonstrate logically what the consequences of its decisions might be. The next step would be to give each outcome a moral weight so a prototype might be developed to make the best judgment as to whether to flip the switch.
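
As a rough illustration of the approach (a hedged sketch, not the researchers' actual prospective logic program, and with moral weights invented for the example), the dilemma can be encoded as a set of decisions, each with derivable consequences and a score:

    # Hypothetical encoding of the trolley dilemma: enumerate the possible
    # decisions, record the consequences of each, and attach an illustrative
    # moral weight so the program can pick the least-bad action.
    DECISIONS = {
        "do_nothing":  {"deaths": 5, "intervened": False},
        "flip_switch": {"deaths": 1, "intervened": True},
    }

    def moral_weight(outcome):
        # Invented scoring: each death is heavily penalized, and actively
        # intervening carries a small extra penalty.
        return -10 * outcome["deaths"] - (1 if outcome["intervened"] else 0)

    best = max(DECISIONS, key=lambda name: moral_weight(DECISIONS[name]))
    print(best)  # -> "flip_switch" under these illustrative weights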


Online Social Networks Leak Personal Information to Third-Party Tracking Sites
Worcester Polytechnic Institute (08/24/09) Dorsey, Michael

A Worcester Polytechnic Institute (WPI) study by professor Craig Wills found that the practices of many popular social networking sites can make personal information shared by users on their pages available to companies that track Web user browsing habits. The study, presented at the Workshop on Online Social Networks, part of ACM's recent SIGCOMM 2009 conference, described the method that tracking sites could use to directly link browsing habits to specific individuals. Wills says users are given a unique identifier when they sign up with a social networking site, and when social networking sites pass information to tracking sites about user activities, they often include the identifier, giving the tracking site a profile of Web browsing activities and the ability to link that profile to a user's personal information. Wills says this is a particularly troubling practice for two reasons. "First, users put a lot of information about themselves on social networking sites. Second, a lot of that information can be seen by other users, by default." A unique identifier could give a tracking site access to a user's name, physical address, email address, gender, birth date, education, and employment information. Wills says he does not know what, if anything, tracking sites do with unique identifiers given to them by social networking sites, and while the Web sites provide users with tools to protect themselves, the best way to prevent privacy leaks would be for social networking sites to stop making unique identifiers visible.
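
A simplified sketch of the mechanism Wills describes: when a profile page pulls in a third-party tracking script or pixel, the browser's Referer header can carry the page's URL, which often embeds the user's unique identifier. The URL format and identifier below are hypothetical, not any particular site's.

    # Hypothetical example of how a tracker could recover a profile identifier
    # from the Referer header sent along with a request for its tracking pixel.
    from urllib.parse import urlparse, parse_qs

    def extract_profile_id(referer):
        query = parse_qs(urlparse(referer).query)
        return query.get("id", [None])[0]

    referer = "http://socialnetwork.example/profile.php?id=24601"
    print(extract_profile_id(referer))  # -> "24601", which the tracker can tie
                                        # to its existing browsing-history cookie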


Fujitsu Aims for 10-Petaflop Supercomputer
IDG News Service (08/25/09) Niccolai, James

Fujitsu is building a 10-petaflop supercomputer based on its upcoming Sparc64 VIIIfx processor. The new chip has eight processor cores, each running at 2 GHz and accessing 5MB of L2 cache memory. An update to the four-core Sparc64 VII released two years ago, it can reach 128 gigaflops while drawing 58 watts of power. The Sparc64 VIIIfx uses the same Sparc V9 instruction set as other Sparc processors, but adds a set of supercomputing extensions known as HPC-ACE. Fujitsu hopes to have the supercomputer, which would be nearly 10 times more powerful than today's fastest system, ready for Japan's Institute of Physical and Chemical Research by early 2011. IBM is developing its own eight-core server processor, and plans to use the upcoming Power7 processor in a petascale supercomputer at the University of Illinois at Urbana-Champaign.
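
A quick back-of-envelope check using only the figures quoted above: 128 gigaflops from eight cores at 2 GHz implies eight floating-point operations per core per clock cycle.

    # Arithmetic implied by the quoted specifications.
    cores, clock_ghz, peak_gflops, watts = 8, 2.0, 128.0, 58.0
    print(peak_gflops / (cores * clock_ghz))  # 8.0 flops per core per cycle
    print(round(peak_gflops / watts, 1))      # ~2.2 gigaflops per watt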


Research Trove: Patients' Online Data
New York Times (08/25/09) P. D1; Arnquist, Sarah

Almost since the inception of the Web, patients have used it to share experiences and learn about diseases and therapies. But now proponents say that online communities could potentially transform medical research and enable patients to contribute, make queries, and help spearhead new discoveries. "Patients have been a tremendously underutilized resource," says Amy Farber, who founded the LAM Treatment Alliance to raise funds and link a worldwide network of scientists to research lymphangioleiomyomatosis (LAM), which she has. Frank Moss, director of the Massachusetts Institute of Technology (MIT) Media Laboratory, says that patients' everyday experiences in living with a disease are an immense source of unexploited information that, when aggregated, could support new theories and research pathways. "We're really turning patients into scientists and changing the balance of power between clinicians and scientists and patients," he notes. A number of private companies are gathering patient data and genetic information online to use in enlisting patients for clinical trials, performing research internally, or to sell to drug and biotechnology firms. Advocates of this crowdsourcing model say it democratizes research and puts patients in charge of their data, and that it is much more expedient and less expensive than traditional research. Johns Hopkins School of Medicine professor James Potash cautions that self-reported data creates significant research dilemmas because there is no way to validate the information through face-to-face interaction between doctor and patient. An MIT doctoral student is developing the LAMsight project, a Web site that lets LAM patients report information about their health and then converts those reports into databases that can be tapped for observations about the disease.


If You’re Not Seeing Data, You’re Not Seeing
Wired News (08/25/09) Chen, Brian X.

Developers are creating augmented reality games and applications for a variety of smartphones, enabling the phone's screen to display both the physical world and additional information, such as public transportation schedules, the price of houses, or Twitter messages posted nearby. "Augmented reality is the ultimate interface to a computer because our lives are becoming more mobile," says University of California, Santa Barbara (UCSB) professor Tobias Hollerer, head of UCSB's augmented reality program. "We're getting more and more away from a desktop, but the information the computer possesses is applicable in the physical world." Hollerer is developing Anywhere Augmentation, technology that would let users point a device at a city the system has never seen before, download information about the surroundings, and display it in real time. Augmented reality devices will require superb battery life, computational power, cameras, and tracking sensors, and augmented reality software will need to include sophisticated artificial intelligence and three-dimensional modeling applications. Hollerer says creating a good augmented reality device using currently available technology would cost nearly $100,000. "We're doing as much as we can with the current technology," says Ogmento co-founder Brian Sezler. "This industry is just getting started, and as processing speeds speed up, and as more creative individuals get involved, our belief is this is going to become a platform that becomes massively adopted and immersed in the next few years."


Magic Ink Offers Full-Colour Printing in an Instant
New Scientist (08/25/09) Barras, Colin

Engineers at Seoul National University in South Korea have developed M-Ink, technology capable of producing full-color prints in a fraction of a second. M-Ink can be used to produce any color in the visible spectrum and could result in a new technique for fast and inexpensive full-color printing, says lead researcher Sunghoon Kwon. M-Ink contains magnetic nanoparticles 100 to 200 nanometers across, a solvation liquid, and resin. The nanoparticles disperse throughout the resin, giving the ink a brown appearance, but when an external magnetic field is applied, the nanoparticles instantly realign to fit the magnetic field lines, creating chain-like structures. The regularly spaced nanoparticle chains interfere with incoming light, so the light reflected from the surface is a specific color. Adjusting the magnetic field strength changes the spacing of the field lines, which changes the color. Multiple electromagnets can be used to create curves in the image. The solvation liquid creates a repulsive force between the magnetic nanoparticles to ensure that they do not clump together in the ink. Once the desired color is created, the nanoparticles can be fixed in place by exposing the ink to ultraviolet (UV) light, which cures the resin. Full-color images are created by using maskless lithography to expose only desired areas to the UV light and repeating the process with different magnetic fields and UV light patterns.
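
The color selection is essentially that of a tunable photonic crystal. Reading the description as first-order Bragg reflection at normal incidence (our interpretation, not a formula given in the article), the reflected wavelength scales with the chain spacing d and the resin's effective refractive index n:

    \lambda_{\text{reflected}} \approx 2\,n\,d

With illustrative values of n of about 1.4 and d of about 200 nanometers, that gives a reflected wavelength of roughly 560 nanometers (green); tightening the spacing under a stronger field shifts the reflection toward blue.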


UC San Diego-Based Playpower Project Receives $180K MacArthur Digital Media and Learning Grant
UCSD News (08/20/09) Fox, Tiffany

The University of California, San Diego's Playpower project is an effort to use radically affordable computing technology to improve education access in the developing world. The Playpower project, which received a one-year, $180,000 grant from the John D. and Catherine T. MacArthur Foundation, was created to promote computer-aided learning through the use of $12 TV-computers (TVCs). TVCs use an ordinary television as a monitor and come with a keyboard and 25-year-old videogame processor technology. "With this type of cheap, accessible technology, a child will be able to boot up a system and start learning how to program," says Jeremy Douglass, the project's co-principal investigator. "It gives them access to an entirely new realm, not only in terms of computer literacy and career skills, but also a sense of the way the modern world works." The grant will fund research and development in three areas: software and hardware development kits, an online development community, and a series of international workshops intended to educate user communities about TVCs and how to use them as a teaching tool. Derek Lomas, another co-principal investigator for the project, says the team will spend the next few months working with game-design companies, neuroscientists, and other communities to prototype and test learning games capable of being used in a variety of cultural contexts. "Our software kit will lower the technical barrier enough for people to create code frameworks that can be modified, but we also want to make sure there's good learning game design that can transfer around the world," Lomas says.


The Evolution of Retweeting
Technology Review (08/26/09) Grifantini, Kristina

Twitter has announced that it will establish a set format for retweeting, or reposting another user's message, within the micro-blogging service. Retweeting serves as a way to quickly spread ideas and comments to new groups of users, but it is far less consistent than the quoting methods used on other discussion forums, and many variations exist; the most common is to prefix the copied message with "RT" and the original poster's user name. Citing the user name also takes up space within the 140-character limit, so some users paraphrase or omit part of the original text, occasionally creating incorrect quotes. Twitter recently announced that it will implement a button that lets users automatically repost another user's tweet, making it quicker and easier to retweet accurately. The button does not allow users to alter the original post, and will add the image and name of the quoted user to the tweet. Microsoft Research social media scientist Danah Boyd says the retweet button will not fit the needs of many users interested in retweeting, who do not want to simply relay the message but to add their own comments and open the discussion to new users. However, Boyd says the button will open up retweeting to a wide audience of users who have never retweeted before. Web developer Dan Zarrella says retweeting is an elegant viral mechanism. "The scale and data you can extract from [retweets] has never been possible with [other] viral or word-of-mouth communications," he says.
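
For illustration, the manual convention and the trimming problem it causes can be sketched in a few lines; the format shown is the common informal one described above, not Twitter's new built-in feature, and the user name is made up.

    # Manual "RT @username:" retweet, trimmed to fit the 140-character limit --
    # the trimming is why hand-made retweets sometimes misquote the original.
    def manual_retweet(username, text, limit=140):
        prefix = "RT @{}: ".format(username)
        room = limit - len(prefix)
        return prefix + (text if len(text) <= room else text[:room - 3] + "...")

    print(manual_retweet("technewseditor", "Twitter to add a built-in retweet "
                         "button that preserves the original text and author."))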


Managing Disasters With High-Tech Imaging Could Save Lives
Rochester Institute of Technology (08/19/09) Gawlowicz, Susan

The Information Products Laboratory for Emergency Response, a partnership between the Rochester Institute of Technology (RIT) and the University at Buffalo (UB), aims to improve disaster mitigation planning and real-time response and recovery efforts. The laboratory will unite university researchers, private-sector providers, and emergency response decision-makers in an effort that will focus on technology, policy, and business development. "The economic benefit of this initiative will be seen in the growth of disaster-related information products and workers who know how to use them," says RIT's Donald Boyd, the lead scientist on the project. Central to the lab's mission will be matching the needs of emergency responders with information products that combine remote sensing imagery with geographical information systems. Geospatial analysis technology can be used to show emergency responders what is happening during a disaster, where events are taking place, and how the crises may evolve over time. The first group of information products will focus on flood and fire mapping, and the following year additional products will be developed based on user needs. Lab researchers also will hold a series of workshops to unite disaster management experts and technology and service providers. "We could potentially go from fires to gas leak detection, environmental disasters, terrorist attacks," says RIT's Don McKeown. "A lot of the tools you have in your remote-sensing tool box are the same; it doesn't matter what the application is." The project also will fund an undergraduate student researcher in imaging science and computer science, as well as a doctoral candidate.


Telstra to Give Australian R&D a Boost
Computerworld Australia (08/26/09) Swan, Georgina

Telstra plans to offer greater support to outside researchers in Australia who are pursuing projects that are relevant to its telecommunications business. The company has introduced the External Research and Development Program in an effort to provide research groups with in-kind support, such as access to internal experts or facilitated meetings with suppliers. "We are inviting all groups involved in research and development--whether research institutions, universities, startups, established companies, or equipment vendors--to let us know about their projects," says Telstra chief technology officer Hugh Bradlow. Telstra will accept summaries on tech projects such as cloud computing, smart infrastructure, or environmentally-focused and urban development projects via its Web site. Groups involved in the selected projects will have an opportunity to present their work to a panel of top executives during the upcoming Telstra External R&D day. Telstra is especially interested in the innovation and value of the idea. The projects will determine the level of support and funding provided by Telstra. "We look forward to exploring with the research and development community how technology will be shaped in the future," Bradlow says.


New Iowa State Supercomputer, Cystorm, Unleashes 28.16 Trillion Calculations Per Second
Iowa State University News Service (08/18/09) Krapfl, Mike

Iowa State University's new supercomputer, dubbed Cystorm, is a Sun Microsystems machine that uses 3,200 processor cores to reach a peak performance of 28.16 trillion calculations per second. That peak is nearly five times that of Iowa State's CyBlue, an IBM Blue Gene/L supercomputer with 2,048 processors that can handle 5.7 trillion calculations per second. On a more realistic test of sustained performance, Cystorm reaches 15.44 trillion calculations per second compared with 4.7 trillion for CyBlue, making it 3.3 times more powerful than the older supercomputer, which has been on campus since early 2006. "Cystorm is going to be very good for data-intensive research projects," says professor Srinivas Aluru, who heads the Cystorm project. "The capabilities of Cystorm will help Iowa State researchers do new, pioneering research in their fields." For example, the supercomputer will help computer engineers build a software infrastructure for identifying relevant information sources for decision-making. Cystorm also will be used for research into materials science, power systems, and systems biology. Iowa State made the purchase with the help of a $719,000 grant from the National Science Foundation.
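
A quick check of the ratios quoted above, using only the article's own figures:

    # Ratios implied by the quoted figures.
    print(round(28.16e12 / 3200 / 1e9, 1))  # ~8.8 gigaflops peak per core
    print(round(28.16 / 5.7, 1))            # ~4.9x CyBlue's peak
    print(round(15.44 / 4.7, 1))            # ~3.3x CyBlue's sustained rate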


Abstract News © Copyright 2009 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]




Unsubscribe
Change your Email Address for TechNews (log into myACM)