Welcome to the February 6, 2012 edition of ACM TechNews, providing timely information for IT professionals three times a week.
ACM TechNews mobile apps are available for Android phones and tablets, and for iPhones and iPads.
HEADLINES AT A GLANCE
Oscars Vote Vulnerable to Cyber Attack Under New Online System, Experts Warn
Guardian (United Kingdom) (02/02/12) Andrew Gumbel
The Academy of Motion Picture Arts and Sciences recently announced that, starting next year, members will be able to vote for the Oscars using electronic ballots instead of the existing vote-by-mail system. However, many computer scientists believe the system will be vulnerable to attacks that could compromise the votes. "Everybody would like there to be secure Internet voting, but some very smart people have looked at the problem and can't figure out how to do it," says Stanford University professor David Dill. Researchers have identified multiple potential vulnerabilities in online voting systems, such as denial-of-service attacks, malware, and penetration of the server's security wall. The Academy is not the first organization to experiment with Internet voting; several U.S. states have already adopted electronic-voting systems to help military personnel and other U.S. citizens living overseas submit their votes. However, serious problems have been exposed, such as during a local election in Washington, D.C., in October 2010, when University of Michigan researchers took control of the server software, found out which citizens voted for specific candidates, and even changed some votes.
In Interaction Design, Human Understanding Is Key
Silicon Republic (02/04/12) Laura O'Brien
At the recent Interaction 12 conference in Dublin, designers were encouraged to develop a greater level of human understanding when creating user interfaces. Sprout founder Dirk Knemeyer says that interaction designers must become experts in human understanding to develop their craft. He says that since technology has become fully integrated into society, interaction designers should have some understanding of psychology, sociology, neuroscience, endocrinology, and economics to develop meaningful interfaces for the future. "What we know now is that even if you think you have a goal, it's likely that it's going to shift and change as you find your way to it because right now, the user is just going to muddle their way through a situation that's emerged in their life," says Macquarium's Andrew Hinton. In addition, traditional user testing may not get designers the answers they need, especially as the Web gets more social, says Dana Chisnell, co-author of the Handbook of Usability Testing. "We're not sure how to do user research that's going to tell us about what we really need to know about relationships," Chisnell says. In the future, products will be designed not at scale for mass audiences and demographics, but for individuals, Knemeyer predicts.
Embodiment, Computation and the Nature of Artificial Intelligence
Technology Review (02/06/12)
Many artificial intelligence researchers have adopted the idea that true intelligence requires a body, a notion known as embodiment, and a growing group of researchers, led by the University of Zurich's Rolf Pfeifer, argues that intelligence makes no sense outside of the environment in which it operates. Pfeifer and Zurich's Matej Hoffmann not only want to redefine artificial intelligence, they want to change the nature of computing itself. The researchers recently published a paper outlining several case studies that examine the nature of embodiment in different physical systems, such as the distribution of light-sensing cells in a fly's eye. The fly's computation is the result of simple motion-detection circuitry in the brain, the morphology or distribution of cells in the body, and the nature of flight in a three-dimensional universe. The researchers say that this and other low-level cognitive functions, such as locomotion, are actually simple forms of computation involving the brain-body-environment triumvirate, which is why the definition of computation needs to be expanded to include the influence of the environment.
Researchers Move Graphene Electronics Into 3D
University of Manchester (02/03/12) Daniel Cochlin
University of Manchester researchers have developed a potentially practical method for using graphene as the basic material for computer chips instead of silicon. The researchers suggest using graphene vertically as an electrode from which electrons tunnel through a dielectric into another metal, a device known as a tunneling diode. The researchers exploited a unique property of graphene: an external voltage can strongly change the energy of tunneling electrons. This new method resulted in a vertical field-effect tunneling transistor in which graphene is a critical ingredient. "I believe they can be improved much further, scaled down to nanometer sizes and work at sub-THz frequencies," says Manchester researcher Leonid Ponomarenko. The Manchester team made the transistors by combining graphene with atomic planes of boron nitride and molybdenum disulfide. "Tunneling transistor is just one example of the inexhaustible collection of layered structures and novel devices which can now be created by such assembly," says Manchester professor Konstantin Novoselov.
Google Supports Female Students in STEM Subjects
SmartPlanet (02/06/12) Charlie Osborne
Google's Mind the Gap! program is aimed at encouraging female students to enter the engineering profession and help equalize the ratio of men and women in scientific and technology-driven industries, says Google's Michal Segalov. The program is a collaborative effort with the Israeli National Center for Computer Science Teachers, and includes monthly school visits for girls to the Google office in Israel and annual technology conferences at academic institutions. Google hopes that by exposing young women to careers in science, technology, engineering, and math fields, they will learn more about the opportunities available to them, changing the stereotype that those professions are male-oriented. Since the program began in 2008, more than 2,500 girls have visited the Google offices, and about 40 percent of those elected to pursue computer science as their high school major after attending.
Microsoft Researchers Say Anonymized Data Isn't So Anonymous
Network World (02/02/12) Tim Greene
Data routinely gathered in Web logs, such as Internet Protocol (IP) addresses, cookies, operating systems, browser types, and user-agent strings, can threaten online privacy because it can be used to identify the activity of individual machines, according to Microsoft researchers. However, they say an analysis of such data, even when anonymized, can help detect malicious activity and improve overall Internet security. The researchers found that HTTP user-agent information can tag a host with 92.8 percent accuracy when more than one user ID was linked to a single host, such as with a family that shares a single computer. The researchers also found that even anonymized data can leak information. "[C]oarse-grained IP prefixes achieve similar host-tracking accuracy to that of precise IP address information when they are combined with hashed [user-agent] strings," the researchers say. They aimed to determine how much identifying information common identifiers reveal, to understand the patterns of aggregated activities, and to explore their implications. "Our analysis suggests that users who do not wish to be tracked should do much more than clear cookies," the researchers note.
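The tracking risk the researchers describe can be illustrated with a short sketch. This is a hypothetical illustration, not Microsoft's actual methodology: the function names, the SHA-256 hash, and the /24 prefix choice are all assumptions. The point it demonstrates is that even after a user-agent string is hashed and an IP address is coarsened to a network prefix, the combination still yields a fairly stable identifier for a host.

```python
import hashlib

def ip_prefix(ip: str, prefix_len: int = 24) -> str:
    """Coarsen an IPv4 address to its network prefix (here a /24),
    discarding the host-specific final octet(s)."""
    kept = prefix_len // 8
    octets = ip.split(".")
    return ".".join(octets[:kept]) + ".0" * (4 - kept) + f"/{prefix_len}"

def host_fingerprint(ip: str, user_agent: str) -> str:
    """Combine a coarse IP prefix with a hashed user-agent string.
    Each field alone looks 'anonymized', but together they can still
    track a host across requests."""
    ua_hash = hashlib.sha256(user_agent.encode()).hexdigest()[:16]
    return f"{ip_prefix(ip)}|{ua_hash}"

# Two requests from the same /24 network with the same browser produce
# the same fingerprint, even though the exact IP differs.
fp1 = host_fingerprint("192.168.1.7", "Mozilla/5.0 (Windows NT 6.1)")
fp2 = host_fingerprint("192.168.1.42", "Mozilla/5.0 (Windows NT 6.1)")
```

Clearing cookies does nothing to change either component, which is consistent with the researchers' conclusion that users "should do much more than clear cookies."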
PRACE to Establish Six Advanced HPC Training Centers
The Partnership for Advanced Computing in Europe (PRACE) has selected the Barcelona Supercomputing Center, the CSC-IT Center for Science, the University of Edinburgh, Cineca, Maison de la Simulation, and the Gauss Center for Supercomputing as PRACE Advanced Training Centers (PATCs). PATCs will provide training and education activities to the European research community on utilizing PRACE's computational infrastructure. PRACE ultimately wants the PATCs to serve as hubs and key drivers of European high-performance computing (HPC) education. "The establishment of the PRACE Advanced Training Centers is one of the most visible achievements of PRACE to the European researchers," says CSC's Pekka Manninen. "The PATC network enables us to synergize European HPC training activities for the benefit of the whole of Europe." PRACE has initially selected six of its member sites as PATCs, but it will reassess the choice of centers every two years, so the sites may change over time.
Tailor-Made Search Tools for the Web
Fraunhofer Institute of Intelligent Analysis and Information Systems (IAIS) researchers are working to bring semantic search to smartphones in the form of a new app. The Eat and Drink app is designed to scour the Web for reviews of restaurants, bars, and cafes, and provide recommendations for users. "There's no need to read through lengthy restaurant reviews, instead the app provides a summary of the special features and main aspects of a particular establishment," says Melanie Knapp, who developed the app with her team. "'Eat and Drink' provides information as to why a particular rating is positive or negative." Users launch an area or keyword search, and Eat and Drink displays the results in the form of tags. The researchers say the app semantically analyzes and processes unstructured text, down to the sentence level, using learning and pattern-recognition methods to deliver results that are much more refined and far less generic. They say the underlying technology also could be used to develop apps and programs for other sectors. For example, news organizations are interested in Quote, a semantic search engine for finding quotations of public figures.
Industry-Funded Software Research Goes Open Source
Campus Technology (02/01/12) David Raths
Several large companies that fund software research on university campuses are engaged in open source research in the hope of drawing a thriving developer community. An example is the Intel Science and Technology Centers (ISTCs) launched at Stanford University, Carnegie Mellon University, and the University of California, Berkeley. "The preferred [intellectual property (IP)] policy is to conduct open research wherein ISTC researchers, whether from academia or Intel, agree to not file patents and to publish all patentable inventions," Intel says. "All significant software developed in the course of conducting research will be released under an open source license." Each ISTC can support 10 to 15 faculty members and as many as 30 students. Consultant Melba Kurman notes that many companies are beginning to consider longer development timeframes, and she thinks open source is a solid solution in instances where patents are not vital. Kurman also lists other potential advantages of the open source licensing model, including its fit with a university's nonprofit, tax-exempt status, the avoidance of publication delays caused by patent applications, no need to haggle over IP terms between university and company researchers, and allowances for the research sponsor to bring in more companies to sponsor open source consortia.
Are the Days of Hands-Off Internet Policies Numbered?
Government Computer News (02/01/12) William Jackson
The Internet will be a major topic of discussion when world leaders meet in December at the World Conference on International Telecommunications to review and revise the International Telecommunications Regulations, which were put into place in 1988 and at the time largely dealt with systems that linked telephones and peripheral devices such as fax machines. "There is an increasing amount of attention on how the Internet should be governed and what the role of government and international bodies should be," notes former Bush administration United Nations (UN) representative David Gross. Some developing countries, including India, would like international bodies such as the UN's International Telecommunications Union to regulate the Internet, while nations such as China and Russia want a greater degree of control over the Web. Gross says the United States favors an open, cooperative approach. "The U.S. position has been the expectation, willingness, and desire that the entire world would come along for the ride," he says. The International Telecommunications Regulations have acted as international law since 1988, but a major issue at the December conference will be if and how the regulations should be applied to the Internet.
Harnessing the Predictive Power of Virtual Communities
University of Ljubljana researchers say they have developed an algorithm that can detect virtual communities better than existing state-of-the-art algorithms. The propagation-based algorithm can extract both link-density and link-pattern communities without any prior knowledge of the number of communities. Classical communities are defined by their internal level of link density, while link-pattern communities are characterized by internal patterns of similar connectedness between their nodes. Ljubljana's Lovro Subelj and Marko Bajec tested the algorithm on 10 real-life networks, including social, information, and biological networks, and concluded that real-life networks appear to be composed of link-pattern communities that are interwoven and overlap with classical link-density communities. They hope to create a generic model to understand the conditions, such as low levels of clustering, under which link-pattern communities emerge rather than link-density communities. The researchers say the model could be used to predict future friendships in online social networks, analyze interactions in biological systems that are hard to observe, and detect duplicated code in software systems.
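To give a flavor of the "propagation" idea, here is a minimal sketch of classical label propagation for detecting link-density communities. This is not Subelj and Bajec's algorithm (theirs also extracts link-pattern communities); the graph, tie-breaking rule, and iteration order are all assumptions chosen to keep the example deterministic.

```python
from collections import Counter

def label_propagation(adj, max_iter=100):
    """Asynchronous label propagation: each node starts with a unique
    label and repeatedly adopts the most common label among its
    neighbors (ties broken by the larger label) until labels stabilize.
    Densely linked groups quickly converge to a shared label."""
    labels = {node: node for node in adj}
    for _ in range(max_iter):
        changed = False
        for node in sorted(adj):  # fixed order keeps the run deterministic
            counts = Counter(labels[nbr] for nbr in adj[node])
            best = max(counts, key=lambda lab: (counts[lab], lab))
            if labels[node] != best:
                labels[node] = best
                changed = True
        if not changed:
            break
    return labels

# Two 4-cliques joined by a single bridge edge (3-4): two clear
# link-density communities.
adj = {
    0: [1, 2, 3], 1: [0, 2, 3], 2: [0, 1, 3], 3: [0, 1, 2, 4],
    4: [3, 5, 6, 7], 5: [4, 6, 7], 6: [4, 5, 7], 7: [4, 5, 6],
}
communities = label_propagation(adj)
```

Link-pattern communities would require comparing whole neighborhood patterns rather than just counting neighbor labels, which is where the Ljubljana work goes beyond this classical scheme.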
Green IT: In Search of an Energy Yardstick
Computerworld (01/30/12) Mary Brandel
The most widely used metric for measuring data centers' energy consumption is the Green Grid's Power Usage Effectiveness (PUE) measure, but it does not reveal the amount of energy used per unit of work. The Green Grid and other industry groups are currently working on metrics that analyze the measurement of productive energy consumption. Many companies are using combinations of the available metrics, as well as developing their own metrics, to describe data center efficiency and productivity. "While not perfect, PUE does a good job of achieving a snapshot of how much electricity is powering what the data center is there to do," says Forrester Research's Doug Washburn. Other metrics being used include Carbon Usage Effectiveness (CUE), which addresses data-center-specific carbon emissions and is calculated by dividing a data center's total carbon dioxide emissions by its IT equipment energy. Server Compute Efficiency and Data Center Compute Efficiency measure what proportion of work is useful. Data Center Energy Productivity aims to quantify the ratio of useful work produced by a data center to total energy consumed by it. "If you use one metric without the other, you can be fooling yourself," says the Green Grid's Katherine Winkler.
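PUE and CUE both reduce to simple ratios, which a short worked example makes concrete. The figures below are hypothetical: the facility and IT energy totals and the 0.5 kg CO2/kWh grid emission factor are assumed numbers, not data from the article.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by
    IT equipment energy. 1.0 would mean every watt goes to IT gear;
    real data centers run higher because of cooling, lighting, etc."""
    return total_facility_kwh / it_equipment_kwh

def cue(total_co2_kg: float, it_equipment_kwh: float) -> float:
    """Carbon Usage Effectiveness: total CO2 emissions attributable to
    the data center (kg) divided by IT equipment energy (kWh)."""
    return total_co2_kg / it_equipment_kwh

# Hypothetical month: the facility draws 1.8 GWh in total, of which
# the IT equipment consumes 1.2 GWh; the grid emits 0.5 kg CO2/kWh.
facility_kwh = 1_800_000
it_kwh = 1_200_000
emissions_kg = 0.5 * facility_kwh

print(pue(facility_kwh, it_kwh))       # 1.5
print(cue(emissions_kg, it_kwh))       # 0.75
```

Neither ratio says anything about useful work per kilowatt-hour, which is exactly the gap the compute-efficiency and energy-productivity metrics in the article try to fill.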
Abstract News © Copyright 2012 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.