ACM TechNews (HTML) Read the TechNews Online at: http://technews.acm.org

ACM TechNews
July 11, 2008


Welcome to the July 11, 2008 edition of ACM TechNews, providing timely information for IT professionals three times a week.


HEADLINES AT A GLANCE:

Women Break to Front of Tech
USA Today (07/11/08) P. 1B; Swartz, John

The glass ceiling seems to finally be shattering in the technology industry. "It's gratifying to see more women in prominent roles in tech," says Trend Micro CEO Eva Chen, one of the tech industry's top female executives. A wave of new female CEOs is changing the face of the once male-dominated tech industry, with women benefiting from more startups, better funding, and the low cost of starting a Web 2.0 company. Even some established tech companies such as Google are proving to be welcoming places for aspiring female executives. "Computer sciences and the Internet have made technology tangible and put a face on them as careers for everyone, women included," says Google vice president of search and user experience Marissa Mayer. Although the number of female CEOs at well-known tech companies is difficult to determine, estimates easily place more female CEOs in the tech industry than on the Fortune 500 list, which has only 12 female CEOs and only one from the tech industry, Xerox's Anne Mulcahy. There are more opportunities today than there were 10 years ago because women are pursuing engineering degrees and careers, and they are better suited for fast-paced environments in the online world, says Teresa Phillips, founder and CEO of Graspr, which hopes to become the YouTube of how-to clips. Furthermore, marketers are deploying technology intended to reach women, which makes those companies more interested in hiring women. Ruckus Wireless CEO Selina Lo says women are gaining in importance as technology companies seek employees with more "right brain" skills.


Study: Electronic Voting Increased Counting Errors in France
IDG News Service (07/09/08) Sayer, Peter

Polling stations using electronic-voting systems in four recent French elections suffered from more voting discrepancies than polling stations that used traditional paper ballots, concludes a new study. University of Nantes researcher Chantal Enguehard examined the discrepancies between the number of electors who signed the electoral register to confirm that they voted and the number of votes subsequently counted for each polling station. The study compared discrepancies from the 6,427 electronic-voting stations and the 14,624 paper-ballot voting stations used in both rounds of the 2007 presidential election and two subsequent elections. There were discrepancies between the number of signatures and the number of votes at about 19.8 percent of electronic-voting stations, compared with only 5.3 percent of paper-ballot stations, and the discrepancies at electronic stations were also larger. Enguehard says it is unlikely that voters' unfamiliarity with the machines is the cause, for two reasons: the ratio of discrepancies between electronic and traditional stations got worse, not better, over time, and there was no correlation between the bureaus with discrepancies and the bureaus that received the most complaints about difficulties with the voting machines.
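As a rough illustration of the comparison the study describes, the sketch below, using entirely hypothetical counts, computes the share of polling stations whose register signatures and counted votes disagree.

```python
# Minimal sketch, with hypothetical counts, of the per-station check the study
# describes: compare register signatures with votes counted, and compute the
# share of stations showing any discrepancy.

def discrepancy_rate(stations):
    """stations: list of (signatures, votes_counted) tuples, one per polling station."""
    mismatched = sum(1 for signatures, votes in stations if signatures != votes)
    return mismatched / len(stations)

paper_stations = [(412, 412), (388, 388), (501, 501), (307, 306)]      # hypothetical
machine_stations = [(523, 521), (610, 610), (454, 457), (389, 388)]    # hypothetical
print(f"paper:      {discrepancy_rate(paper_stations):.1%}")    # 25.0%
print(f"electronic: {discrepancy_rate(machine_stations):.1%}")  # 75.0%
```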


Senate Grapples With Web Privacy Issues
Washington Post (07/10/08) P. D3; Whoriskey, Peter

Despite support from leading technology companies and frequent consumer complaints, Congress has been unable to pass Internet privacy legislation. Following a two-hour Senate committee hearing on July 9 on Internet advertising and privacy, Sen. Byron L. Dorgan (D-N.D.), who led the discussion, said the hearing primarily served to emphasize how little legislators understand about the subject. The hearing was called in response to fears that the massive volume of information that Internet companies are collecting on users is violating their privacy. The practice of assembling profiles on users to determine personal preferences and activities has been going on for years, but as Web sites have increasingly been united in large ad networks, the various profiles kept by smaller sites have been combined to create more detailed and widespread user profiles. Over the past year, some Internet service providers (ISPs) have been experimenting with a practice that would provide even more detailed profiles, using a technology called deep packet inspection, which allows them to examine streams of data coming from a user's Internet connection. Critics of deep packet inspection compare the practice to wiretapping. At the hearing, representatives from companies that provide deep packet inspection services to ISPs assured the panel that they were doing their best to preserve privacy. Experts say that before Congress can pass privacy legislation it must first decide what constitutes personally identifiable information, whether a person's Internet address should be considered private, whether people should be informed about data collection practices, and whether users should be allowed to see the profiles compiled about them.


Bluffing Could Be Common in Prediction Markets, Study Shows
University of Michigan News Service (07/10/08)

University of Michigan professor Rahul Sami and doctoral student Stanko Dimitrov have produced a new mathematical model that suggests bluffing in prediction markets is a profitable strategy more often than previously believed. The analysis questions the incentives such markets create for revealing information and making accurate predictions. Prediction markets have been shown to be more accurate than polls at predicting events, though dishonest tactics such as bluffing can distort their accuracy. Sami says their work is the first to demonstrate that strategies involving deception of future traders are a real possibility in a variety of information conditions. The researchers' solution to bluffing is to penalize later trades by charging participants to change their bets. Charging people to change their bets would give them more incentive to be honest from the start, the researchers say. "If you're running a prediction market, the whole point is to make predictions and you want your predictions to be reflecting the actual information the participants have," Sami says. "What bluffing does is worsen the predictions with the wrong information. It defeats the purpose." Their research will be presented at the ACM Conference on Electronic Commerce, which takes place July 11 in Chicago.
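The abstract does not spell out the market mechanism, so the sketch below assumes a standard logarithmic market scoring rule (LMSR) and bolts on an illustrative surcharge that grows for later trades, simply to show how "charging participants to change their bets" raises the cost of late price moves. The fee schedule and parameters are hypothetical stand-ins, not the authors' actual proposal.

```python
import math

# LMSR cost function with an illustrative late-trade surcharge (hypothetical).

def lmsr_cost(quantities, b=100.0):
    """Standard LMSR cost function C(q) = b * ln(sum_i exp(q_i / b))."""
    return b * math.log(sum(math.exp(q / b) for q in quantities))

def trade_cost(quantities, outcome, shares, trade_index, fee_rate=0.01, b=100.0):
    """Cost to buy `shares` of `outcome`, plus a fee that grows with trade_index."""
    new_q = list(quantities)
    new_q[outcome] += shares
    base = lmsr_cost(new_q, b) - lmsr_cost(quantities, b)
    fee = fee_rate * trade_index * abs(shares)  # later trades pay more to move the price
    return base + fee, new_q

q = [0.0, 0.0]                                   # two-outcome market, no shares outstanding
cost, q = trade_cost(q, outcome=0, shares=10, trade_index=1)
print(f"early trade costs {cost:.2f}")
cost, q = trade_cost(q, outcome=0, shares=10, trade_index=5)
print(f"same trade later costs {cost:.2f}")
```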


The New Face of R&D: What's Cooking at IBM, HP and Microsoft
Computerworld (07/10/08) Anthes, Gary

The research and development agendas of Microsoft, IBM, and Hewlett-Packard include considerable investment in collaboration with other companies, customers, and universities, which Henry Chesbrough of the University of California, Berkeley's Center for Open Innovation says nurtures an openness that can expedite the migration of ideas into the marketplace. Following HP's hiring of Prith Banerjee, engineering dean at the University of Illinois at Chicago, as the new director of HP Labs, the company announced that the lab would refocus its R&D efforts on "big bet" projects in the areas of information explosion, dynamic cloud services, content transformation, intelligent infrastructure, and sustainability, with individual projects concentrating on exascale computing, social computing, quantum computing, and green computing. Banerjee is confident that this shift in focus will yield more fully developed research prototypes for product divisions, allowing products to be brought to market faster and at less expense. Meanwhile, IBM Research's John Kelly has announced that IBM will explore the "high-risk" basic research areas of nanotechnology, integrated systems and chip architecture, cloud computing and Internet-scale data centers, and the use of advanced math and computer science to manage business integrity, at a cost of more than $100 million over three years. He says IBM will boost collaboration with government agencies, universities, and other companies, and establish small regional joint ventures, or "collaboratories," with universities, foreign governments, or commercial partners to move technology quickly into the marketplace by leveraging local skills, funding, and sales channels. Microsoft Research focuses on the intersection of computer science with other disciplines, and stands out from many IT companies with its emphasis on first doing good computer science and then considering its commercial possibilities, says director Richard Rashid. "We are increasingly engaged where computer science is making a big difference in the way other sciences are done," he says.


MIT Reports Finer Lines for Microchips
MIT News (07/08/08) Chandler, David

MIT researchers have advanced nanoscale lithography by creating finer patterns and lines over larger areas than other current methods. The new technique could lead to next-generation computer memory and integrated-circuit chips, advanced solar cells, and other devices. The team created lines about 25 nanometers wide separated by 25-nanometer spaces; the most advanced commercially available computer chips have a minimum feature size of 65 nanometers. The technique could be economically beneficial because it works without the chemically amplified resists, immersion lithography techniques, and expensive lithography tools considered essential for working at such a small scale with optical lithography. The new method could allow for the commercialization of many nanotechnology inventions that have been designed but are unavailable because of the absence of a viable manufacturing method. The researchers created the patterns with interference lithography (IL), using a tool designed to perform a particularly high-precision variant of IL known as scanning-beam interference lithography. The new technique uses 100 MHz sound waves, controlled through custom high-speed electronics, to diffract and frequency-shift the laser light, enabling rapid patterning of large areas with unprecedented control over feature geometry.


UC San Diego Unveils World's Highest-Resolution Scientific Display System
University of California, San Diego (07/09/08) Ramsey, Doug

The University of California, San Diego's (UCSD) California Institute for Telecommunications and Information Technology (Calit2) has created the Highly Interactive Parallelized Display Space (HIPerSpace), the world's highest-resolution display system for scientific visualization. HIPerSpace has nearly 287 million pixels of screen resolution, 10 percent more than the second-largest display in the world, recently constructed at the NASA Ames Research Center, and 30 percent more than UCSD's first HIPerSpace display, built in 2006. The original HIPerSpace display was moved to a larger location and expanded by 66 million pixels to become the new HIPerSpace display. "The higher resolution display takes us more than half way to our ultimate goal of building a half-billion-pixel tiled display system to give researchers an unprecedented ability to look broadly at large data sets while also zooming in to the tiniest details," says HIPerSpace principal investigator Falko Kuester. HIPerSpace is an ultra-scale visualization environment built on a multi-tile paradigm, featuring 70 high-resolution Dell 30-inch displays arranged in 14 columns of five displays each. The 31.8-foot-wide by 7.5-foot-tall HIPerSpace is already being used by a variety of research groups at UCSD, allowing them to view the largest data sets while simultaneously focusing on the smallest elements.
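A quick back-of-the-envelope check of the quoted resolution, assuming each 30-inch Dell panel runs at its native 2560 x 1600 pixels (an assumption not stated in the article), lands right at the "nearly 287 million" figure.

```python
# Pixel-count sanity check for a 14 x 5 tiled wall of 2560 x 1600 panels (assumed native resolution).
columns, rows = 14, 5
panel_w, panel_h = 2560, 1600
total_pixels = columns * rows * panel_w * panel_h
print(f"{total_pixels:,} pixels")   # 286,720,000 -- "nearly 287 million"
```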


Sneeze-Sensing Software Gives Avatars a Good Laugh
New Scientist (07/11/08) Barras, Colin

Researchers in the United Kingdom have developed software that is capable of automatically recognizing laughter, sobbing, sneezing, and yawning, and then generating an appropriate facial expression in animated characters. Darren Cosker at the University of Bath and Cathy Holt at the University of Cardiff used optical motion capture to record the facial expressions of people making such "non-linguistic" sounds, and then recorded their voices. The software is capable of matching the vocalizations to the facial expressions. Cosker and Holt also worked with James Edge, a researcher at the University of Surrey, to animate a standardized facial model. The researchers say there is some variation in non-linguistic sounds, which leads to a level of ambiguity in audio. "One person's laugh can sound similar to another person's crying," Cosker says. "In terms of classifying actions on the basis of audio alone, we still need to do more work." Nonetheless, their software could ultimately lead to better Web-based avatars and computer-animated movies.


Artificial DNA Can Power Future Comps
Times of India (07/07/08)

Researchers in Japan have developed a DNA molecule made mostly from artificial parts. Artificial versions of DNA offer enormous information storage benefits, and some scientists have previously created DNA molecules with a few artificial parts. However, University of Toyama researchers led by Masahiko Inouye have stitched together four new basic building blocks inside the sugar-based framework of a DNA molecule. The artificial bases have resulted in very stable, double-stranded structures resembling natural DNA. Moreover, the structures are right-handed, and some formed triple-stranded structures. "The unique chemistry of these structures and their high stability offer unprecedented possibilities for developing new biotech materials and applications," according to the Toyama team. The development could also help clear the way for nano-sized computers.


Mapping Infectious Diseases
Technology Review (07/10/08) Singer, Emily

HealthMap is a public-health surveillance system that uses real-time information from a variety of Internet sources to create a map of diseases. A series of text-processing algorithms analyzes information gathered from the Internet, picks out the diseases being reported and the locations of the events, and tries to determine each story's relevance. For example, the software must distinguish a news item on new cases of tuberculosis from an article about a TB vaccination campaign. HealthMap cofounder John Brownstein, a professor at the Informatics Program at Children's Hospital Boston, says the algorithms classify stories correctly about 95 percent of the time. The data is plotted on a world map, with different colors indicating the most recent reports. Harvard physician Larry Madoff says being able to see things in a spatial representation helps medical professionals realize when a disease is spreading, when there is a cluster of cases, and when different cases may be related. The automated approach is also very fast, even reporting cases before the World Health Organization does, Brownstein says. The site is fully automated and updates every hour. World travelers can check the map for new outbreaks, but researchers say the map will likely benefit poorer nations the most, given their lack of public health monitoring infrastructure and the prevalence of infectious disease.
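To make the filtering step concrete, here is a minimal, rule-based sketch of the kind of decision described above: pull a disease and a location out of a headline and flag items that look like vaccination-campaign coverage rather than new cases. The word lists and rules are purely illustrative; HealthMap's actual algorithms are more sophisticated.

```python
# Toy headline classifier: extract disease and location, flag likely outbreak reports.
DISEASES = {"tuberculosis", "tb", "cholera", "dengue", "measles"}        # illustrative
LOCATIONS = {"nairobi", "dhaka", "lima", "manila", "jakarta"}            # illustrative
NON_OUTBREAK_CUES = {"vaccination", "vaccine", "campaign", "awareness"}  # illustrative

def classify(headline):
    words = {w.strip(".,").lower() for w in headline.split()}
    disease = next((w for w in words if w in DISEASES), None)
    location = next((w for w in words if w in LOCATIONS), None)
    relevant = disease is not None and not (words & NON_OUTBREAK_CUES)
    return {"disease": disease, "location": location, "outbreak_report": relevant}

print(classify("New tuberculosis cases reported in Dhaka"))
print(classify("TB vaccination campaign launched in Dhaka"))
```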


First European Initiative on Grid Computing, Biomedical Informatics and Nanoinformatics
AlphaGalileo (07/07/08)

Biomedical informatics, grid technologies, and nanoinformatics will be the focus of a new partnership between research organizations in Europe and Latin America. The European Commission is funding ACTION-Grid, with hopes of deepening its knowledge of these areas of research and supporting further collaboration on projects with experts in Latin America, the Western Balkans, and North Africa. Over the next 18 months, the European Union will spend about 1 million euros to support the information exchange, produce a white paper on how biomedical informatics, grid technologies, and nanoinformatics could impact medicine, and recommend potential research initiatives. For nanomedicine, the project has implications for implantable devices, nanosurgery, modeling and simulation using informatics approaches, and databases of nanoparticles, among other areas. The Biomedical Informatics Group at the Universidad Politecnica de Madrid (UPM) is coordinating ACTION-Grid. The consortium also includes the Institute of Health Carlos III, Hospital Italiano de Buenos Aires in Argentina, Universidad de Talca in Chile, FORTH in Greece, HealthGrid in France, and the University of Zagreb Medical School in Croatia.


New Logic: The Attraction of Magnetic Computation
ICT Results (07/07/08)

The EU-funded MAGLOG project sought to adapt magnetoelectronic technology, which exploits the magnetic properties of electrons as well as their charge, for memory, data storage, and logic operations. "The main goal of MAGLOG was to show that magnetic logic gates could be produced on a conventional complementary metal-oxide-semiconductor [CMOS] platform," says MAGLOG project coordinator Guenter Reiss. "For successful commercialization, it is critical that this novel method of data processing can be integrated into conventional chip technologies." One successful production approach lithographically etches structures within ferromagnetic material to create zones where the material's magnetic orientation can switch between states in response to input signals. Another approach employs magnetic tunneling junctions assembled from alternating layers of ferromagnetic materials and insulators, resulting in a programmable logic gate. This property could enable chip designers to fabricate generic chips that are then tailored by programming their logic gates. Magnetic logic can also improve efficiency: magnetoelectronic components generally consume less power than conventional microprocessor elements, and their non-volatility can lower chip power consumption even further.
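Programmable magnetic logic is often explained in terms of a majority gate in which one input is reserved as a "program" line that selects the gate's function. The sketch below is a plain software model of that behavior, offered under that assumption for illustration; it does not represent the MAGLOG devices or any particular hardware interface.

```python
# Software model of a programmable 3-input majority gate: holding the program
# input at 0 makes the gate behave as AND; holding it at 1 makes it behave as OR.

def majority(a: int, b: int, program: int) -> int:
    return 1 if (a + b + program) >= 2 else 0

for a in (0, 1):
    for b in (0, 1):
        print(f"a={a} b={b}  AND={majority(a, b, 0)}  OR={majority(a, b, 1)}")
```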


IBM Looks to Tap Massive Data Streams
HPC Wire (07/03/08) West, John E.

IBM says the solution to capitalizing on the massive amount of data being generated every day may lie in stream computing. "In traditional computing the machine dictates the pace at which things get done," says IBM Research's Nagui Halim. "In stream computing, the machine's job is to figure out what's going on in the real world in real time." For example, Halim says the financial services industry generates five million data items per second, and some opportunities, such as information asymmetries that can be exploited, last only a few seconds. Consequently, real-time systems are needed to consume, analyze, and react to the millions of pieces of data that are created in only a few milliseconds. Similar demands exist in real-time monitoring of complex industrial processes such as chip manufacturing, credit card fraud detection, commercial flight tracking, and various other monitoring tasks. Companies have become experienced in building solutions to handle such massive amounts of information, but so far efforts have focused on solving specific problems in specific businesses. Halim wants to take what has been learned from various point solutions and build a generalized infrastructure and body of knowledge that will accelerate the adoption of stream computing by research groups and individuals. IBM is designing the stream infrastructure from the ground up to be useful to nonexperts, a significant change from much of the software written for supercomputers. Halim is working on a complete solution that involves hardware, operating systems, compilers, middleware, and tools.
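To make the contrast with store-then-query computing concrete, here is a minimal, generic sketch of stream-style processing: each item is examined as it arrives and can trigger an immediate reaction. The burst-detection rule and the data are invented for illustration and have no connection to IBM's actual software.

```python
from collections import deque

# Consume a stream of (card_id, amount) events in arrival order and flag any card
# that appears at least `threshold` times within the last `window` events.

def detect_bursts(events, window=5, threshold=3):
    recent = deque(maxlen=window)
    for card_id, amount in events:        # process items one at a time, as they arrive
        recent.append(card_id)
        if recent.count(card_id) >= threshold:
            yield card_id, amount         # react immediately instead of querying later

stream = [("A", 20), ("B", 5), ("A", 300), ("A", 250), ("C", 9), ("A", 400)]
for card, amount in detect_bursts(stream):
    print(f"possible fraud on card {card}: amount {amount}")
```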


The Secret to Designing User-Friendly Interfaces for Desktop Software
Computer Weekly (07/04/08) Clark, Lindsay

Northwestern University professor Donald Norman says that interface designers may be obsessed with how good their programs look, but it is how users feel after using a program that determines the usability of a device. For example, he says it is the system that powers iTunes, and not the iPod's circular touch-and-scroll interface, that makes the device so user friendly. "What people miss about the iPod is that it's not about the device," Norman says. "Apple did a magnificent job of the entire system, from licensing the music to the iTunes Web site." Norman says people care about getting the job done and feeling happy when they are finished. Features on Amazon.com, such as the variety of emails sent after a purchase, the ability to cancel an order within a few hours, and being able to track where an order is shipping from and when it will arrive, are more important than traditional usability, Norman says. Overall usability depends not only on an organization's ability to design applications or Web sites, but on whether it can create the supporting infrastructure to make the experience pleasant and useful. Norman says accomplishing such usability requires breaking down departmental boundaries and allowing multidisciplinary teams to consider the user experience throughout the process.


KU Researcher Calls for Approval of Wireless Gadgets That Use 'White Space'
University of Kansas News (07/02/08)

A new generation of personal electronic devices could make use of unoccupied "white space" in the television spectrum, according to research at the University of Kansas. KU professor Joseph Evans, director of the university's Information and Telecommunication Technology Center, says the TV bands are located right in the middle of the spectrum below 1 GHz, which is described as "beachfront property." Evans and colleagues recently studied unlicensed devices that use white space and found that they could operate in the television band with no significant effect on DTV receivers in the area. Evans says devices using the TV band could lead to more interoperable public safety communication, reduced broadband costs, and simplified implementation of wireless technology in rural areas. "I've become a believer that white space technology is feasible," he says. "I do believe it is fair and prudent that the engineering details be carefully worked through--we're still some distance from being able to field those types of devices." This fall the FCC will decide whether to allow the use of devices that scan TV frequencies for white space and use the unoccupied bands for transmission, and in February 2009 a new swath of white space will open up as TV stations transition from analog to digital broadcasting. The KU research has informed the public debate over whether the FCC should approve such technology, with Evans presenting evidence in 2007 that white space devices do not generate interference for TV viewers when operated under suitable rules.


Research Institute in Pensacola Shapes the Future
Orlando Sentinel (FL) (07/02/08) Smith, Wes

The Institute for Human and Machine Cognition (IHMC) in Pensacola, Fla., uses state-of-the-art technology to enhance human performance through artificial intelligence, robotics, and "sensory-substitution." The nonprofit research center has attracted top research scientists from around the world, and has $26 million in contracts, mostly with the military and NASA. The center also collaborates on projects with the University of Central Florida's (UCF) Institute for Simulation & Training, says UCF's Terry L. Hickey. Hickey, a neuroscientist, says the center's researchers are good at applying scientific knowledge in useful ways. A new satellite office will open next year in Ocala, Fla., which will give Orlando-area scientists more opportunities to work on making cutting-edge concepts a reality, says IHMC board member Beverly J. Seay. "If we don't have unique people like them thinking 10 or 20 years out, we will lose our technology edge," Seay says. The Pensacola center includes nearly 100 cognitive psychologists, neuroscientists, physicians, philosophers, engineers, social scientists, and others, such as former Steely Dan and Doobie Brothers guitarist Jeff "Skunk" Baxter, a part-time "senior thinker." Baxter is an expert in terrorism, missile defense, and chemical and biological warfare. Part of the center's mission is to develop concepts that can be turned into products for businesses, such as the "human-oriented" aircraft cockpit system that will be licensed and sold commercially.


Artificial Intelligence Tied to Search Future
InfoWorld (07/10/08) Krill, Paul

Speaking at "The New AI: New Paradigms for Using Computers Workshop" at the IBM Almaden Research Center in San Jose, Calif., University of Washington Turing Center director Oren Etzioni said that artificial intelligence (AI) could potentially enhance Internet searches, but there are obstacles to overcome first. Etzioni said that within the next five years next-generation search systems will emerge based on technologies such as Open IE (Information Extraction), which involves techniques for mapping sentences to logical expressions. Other solutions for enhanced searches are also emerging, including semantic tractability, which allows for simple sentences and even conversations with possible double meanings to be understood by computers, Etzioni said. Etzioni also highlighted the KnowItAll project, which focuses on extracting high-quality information from text on the Web, and TextRunner, which deals with open information extraction and is intended to serve as a foundation for a massive knowledge base. The event also showcased several projects in AI and machine learning, including using AI to identify interesting assertions, data visualizations, and continuous interfaces; and examining obstacles to software developer adoption of statistical machine learning.


New Ways to Connect Data, Computers, and People
Chronicle of Higher Education (07/07/08) Foster, Andrea L.

Astrophysicist Edward Seidel will take over the National Science Foundation's Office of Cyberinfrastructure (CI) in September. The CI office awards competitive grants to researchers conducting revolutionary work in computer science, and oversees national advances in supercomputing, high-speed networking, data storage, and software development. Seidel says developing a CI-savvy work force might be the most important long-term investment that needs to be made, noting that the nation faces a critical shortage of computationally skilled researchers and support staff. Increasing the number of researchers who understand the importance of CI is just as important as increasing budgets and upgrading to new equipment, Seidel says. He says that all areas of research, education, and industry are being transformed by advances in CI, and that future advances will require assembling teams with different kinds of expertise to attack complex problems in a variety of subjects. Universities need to hire more faculty who will use CI to advance their disciplines, develop local training courses in computational science and the use of CI, and participate in national training events.


To submit feedback about ACM TechNews, contact: [email protected]

To be removed from future issues of TechNews, please submit the email address at which you receive TechNews alerts at:
http://optout.acm.org/listserv_index.cfm?ln=technews

To re-subscribe in the future, enter your email address at:
http://signup.acm.org/listserv_index.cfm?ln=technews

As an alternative, log in at myacm.acm.org with your ACM Web Account username and password, and follow the "Listservs" link to unsubscribe or to change the email where we should send future issues.


News Abstracts © 2008 Information, Inc.


© 2008 ACM, Inc. All rights reserved. ACM Privacy Policy.