Association for Computing Machinery
Welcome to the April 13, 2012 edition of ACM TechNews, providing timely information for IT professionals three times a week.

ACM TechNews mobile apps are available for Android phones and tablets and for iPhones and iPads.

HEADLINES AT A GLANCE


AFOSR Seeking "Transformational Computing"
CCC Blog (04/12/12) Erwin Gianchandani

The U.S. Air Force Office of Scientific Research (AFOSR) has launched a basic research initiative that aims to bring together the computational hardware, software, aerospace sciences, physics, and applied mathematics communities to create a new capability for designing high-performance computing platforms that support the development of Air Force systems. The initiative aims to address rising power consumption and the slowing of Moore's law. "There is a clear need for a fundamental basic research program wherein the hardware and algorithms are placed on an equal footing to develop specialized, heterogeneous, and very high-performance systems to answer the computational objectives of the Air Force and the Department of Defense," AFOSR's announcement says. The goal is to develop the fundamental research needed to enable highly focused, potentially heterogeneous computational platforms that offer orders-of-magnitude better performance for single application areas. "This effort will be focused on fundamental and interdisciplinary research in computational hardware, computational software, and mathematical models, and is especially interested in work that characterizes the relationship between hardware and algorithms," the announcement says.


EU Investigates Internet's Spread to More Devices
BBC News (04/12/12)

The European Commission (EC) expects rapid growth in the number of household appliances and other devices connected to the Internet by 2020, and as a result is launching a consultation on controls over the way information is gathered, stored, and processed. The typical person currently has at least two devices connected to the Internet. However, by 2015 that number is expected to grow to seven Web-connected devices per person, for a total of 25 billion worldwide, and more than 50 billion by 2020. The EC notes that previous technological advances have led to new legislation, such as the European Union's (EU's) Privacy and Electronic Communications Directive, which requires Web sites to obtain users' permission before placing tracking cookies in their browsers. "Technologies like these need to be carefully designed if they are to enhance our private lives, not endanger them," says EU representative Emma Draper. "Sharing highly sensitive personal data--like medical information--to a network of wireless devices automatically creates certain risks and vulnerabilities, so security and privacy need to be built in at the earliest stages of the development process."


Simulating Tomorrow's Chips
MIT News (04/13/12) Larry Hardesty

Researchers at the Massachusetts Institute of Technology's (MIT's) Computer Science and Artificial Intelligence Laboratory have developed Arete, a method for improving the efficiency of hardware simulations of multicore chips. The researchers say Arete guarantees the simulator will not fall into a deadlock state in which cores get stuck waiting for each other to give up system resources. Arete also could make it easier for designers to develop simulations and for outside observers to understand what those simulations are meant to accomplish. The method involves a circuit design that enables the ratio between real clock cycles and simulated cycles to fluctuate as needed, which allows for faster simulations and more economical use of the field-programmable gate array's (FPGA's) circuitry. "What we’re proposing is, instead of having this in your head, let’s start with a specification," says MIT graduate student Asif Khan. The researchers' high-level language, known as StructuralSpec, builds on the BlueSpec hardware design language developed at MIT in the late 1990s. StructuralSpec users provide a high-level specification of a multicore model, and the program produces the code that implements that model on an FPGA.
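The decoupling of host clock cycles from simulated cycles can be illustrated with a toy software analogue. The sketch below is hypothetical and captures only the basic accounting, not Arete's actual FPGA circuitry or the StructuralSpec language: each simulated cycle completes only when every modeled core has finished it, so the ratio of host cycles to simulated cycles fluctuates from cycle to cycle.

```typescript
// Toy software analogue of decoupling host cycles from simulated cycles.
// (Hypothetical illustration only; Arete realizes this idea in FPGA hardware.)

interface CoreModel {
  name: string;
  // How many host cycles this core needs to finish the current simulated
  // cycle (a complex operation may take several host cycles to emulate).
  hostCyclesForNextStep(): number;
}

function simulate(cores: CoreModel[], simulatedCycles: number): void {
  let hostCycles = 0;
  for (let cycle = 0; cycle < simulatedCycles; cycle++) {
    // A simulated cycle completes only when every core has finished it,
    // so the host-to-simulated cycle ratio varies cycle by cycle.
    const needed = cores.map((c) => c.hostCyclesForNextStep());
    hostCycles += Math.max(...needed);
  }
  console.log(
    `${simulatedCycles} simulated cycles took ${hostCycles} host cycles ` +
    `(ratio ${(hostCycles / simulatedCycles).toFixed(2)})`
  );
}

// Example: two hypothetical cores whose per-cycle cost fluctuates.
const exampleCores: CoreModel[] = [
  { name: "core0", hostCyclesForNextStep: () => 1 + Math.floor(Math.random() * 3) },
  { name: "core1", hostCyclesForNextStep: () => 1 + Math.floor(Math.random() * 5) },
];
simulate(exampleCores, 1000);
```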


Secrets of App Store Revealed by Artificial Life Forms
New Scientist (04/11/12) Paul Marks

University College London researchers Soo Ling Lim and Peter Bentley have developed a simulation of the Apple App Store to study how it works. The researchers note that Apple's online marketplace of more than 500,000 apps is a self-regulating ecosystem that does not tolerate copycats. The simulated App Store, known as AppEco, uses software agents, each obeying its own behavioral rules, to mimic apps, developers, and consumers. The simulation models four types of developers: innovators, optimizers, milkers, and copycats. The researchers ran a series of simulations, each time starting with all four categories of developers contributing an equal number of apps. If they forced the proportion of apps from each group to stay constant, the copycats quickly made the most money, but their advantage soon disappeared as the ecosystem suffered from a lack of novel products. In another simulation, consumers' choices dictated which apps thrived and which did not. Under those conditions, optimizers sold the most apps, followed by innovators, milkers, and copycats. "Surprisingly, it naturally suppresses the copycat 'bad guys' without even needing the App Store owners to start imposing rules," Bentley says.
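The article does not describe AppEco's internal rules, but the general shape of such an agent-based simulation can be sketched. Everything below is hypothetical and illustrative: the developer behaviors, the "novelty" score, and all parameters are invented, and consumers simply pick apps with probability proportional to novelty, standing in for the consumer-driven selection described above.

```typescript
// A minimal agent-based sketch in the spirit of AppEco (hypothetical rules).

type DeveloperKind = "innovator" | "optimizer" | "milker" | "copycat";
const KINDS: DeveloperKind[] = ["innovator", "optimizer", "milker", "copycat"];

interface App { kind: DeveloperKind; novelty: number; downloads: number; }

// Invented behavioral rules for each developer type.
function buildApp(kind: DeveloperKind, store: App[]): App {
  if (kind === "copycat" && store.length > 0) {
    // Copycats clone the current best seller, at reduced novelty.
    const top = store.reduce((a, b) => (a.downloads >= b.downloads ? a : b));
    return { kind, novelty: top.novelty * 0.5, downloads: 0 };
  }
  const base = { innovator: 0.9, optimizer: 0.7, milker: 0.2, copycat: 0.3 }[kind];
  return { kind, novelty: base * Math.random(), downloads: 0 };
}

// Consumers pick apps with probability proportional to novelty, so the
// ecosystem, not a fixed quota, decides which developer types thrive.
function consumerChooses(store: App[]): void {
  const total = store.reduce((sum, a) => sum + a.novelty, 0);
  let r = Math.random() * total;
  for (const app of store) {
    r -= app.novelty;
    if (r <= 0) { app.downloads++; return; }
  }
}

const store: App[] = [];
for (let day = 0; day < 365; day++) {
  for (const kind of KINDS) store.push(buildApp(kind, store)); // one app per type per day
  for (let c = 0; c < 200; c++) consumerChooses(store);        // 200 downloads per day
}

const totals = Object.fromEntries(KINDS.map((k) => [k, 0])) as Record<DeveloperKind, number>;
for (const app of store) totals[app.kind] += app.downloads;
console.log(totals); // tally of downloads per developer type
```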


UChicago to Host North American Programming Contest April 14-15
UChicago News (04/10/12) Steve Koppes

The University of Chicago is hosting an invitational programming contest on April 14-15 to serve as a tune-up for the 22 North American teams that qualified for the 2012 World Finals of the International Collegiate Programming Contest (ICPC), which ACM will hold on May 17 in Warsaw, Poland. The Chicago contest will be the first to include all World Finals teams from the ICPC North American Super-Region. "Since 2000, the top team in the World Finals has always been Russian, Chinese, or Polish, so the idea is that an event like this will help to collectively raise the profile of all the North American teams," says University of Chicago programming team coach Borja Sotomayor. Just 110 teams qualified for the World Finals, out of more than 8,000 teams from more than 2,000 universities in 88 countries. Harvard, Stanford, Carnegie Mellon, and the University of Illinois at Urbana-Champaign are among the institutions sending teams to the invitational. "Hosting the invitational this weekend marks a major step forward in the development of our undergraduate hacker community, and the computer science department is very proud to be able to provide space and support for it," says University of Chicago department chairman John Goldsmith.


Microsoft Calls on Elite Universities to Join Schools in IT Course Revamp
V3.co.uk (04/10/12) Rosalie Marshall

United Kingdom universities should overhaul their information technology (IT) programs to better prepare students for careers in the IT industry, says Microsoft's Stephen Uden. The call for changes to IT studies comes at a time when the government is preparing to reform the computer science curriculum at the school level. "I am really pleased that the education secretary decided to drop the IT GCSE, and that the government is working to replace it with a computer science qualification to prepare people for the future," Uden says. "There needs to be more industry-focused content in what universities are teaching, particularly around the disciplines of cloud computing and security." He notes that the industry badly needs certain skills, and he is encouraging universities to teach Java and other technologies. Uden also says the top-ranked universities in the United Kingdom tend to offer courses that are least relevant for the IT workplace.


ICT Fostering Inter-Family Relationships, Naturally
CORDIS News (04/11/12)

The Together Anywhere, Together Anytime (TA2) research project is developing tools designed to help families communicate and interact as naturally as possible even if they are thousands of miles apart. There is a lack of "applications that enable groups of people in different places to communicate and interact in a natural way," says BT researcher Doug Williams. TA2 combines artificial intelligence with ambient intelligence, multimedia tools, and audio and video capturing, encoding, processing, and transmission to enable near-natural interaction and communication between dispersed groups of people. The researchers say the system goes beyond standard videoconferencing by enabling different groups of people in multiple locations to interact and use shared applications. "Using the system is a bit like watching a movie or a TV talk show except the people you are watching are interacting with you and others in real time," Williams says. The TA2 system also enables dispersed families to play games, and is equipped with ambient intelligence that includes sensors and colored lights to let participants know when other group members are available. "A lot of the components of the system are already available commercially, although we adapted and improved many of them to suit our purposes," Williams says.


IBM, Universities Launch Supercomputing Scheme
ZDNet (04/11/12) Charlie Osborne

Canada's federal government and Ontario's provincial government have partnered with IBM on a project to improve the nation's computing infrastructure. Seven Canadian universities will participate in the $210 million initiative to upgrade and develop supercomputing and cloud platforms. High-performance computing will enable researchers to store and analyze vast quantities of data, which would be helpful for addressing challenges in healthcare, sustainability, urban construction, and energy use. The current priorities of the Ontario-based project, which will form the IBM Canada Research and Development Center, include smart grid technology, weather modeling, computing platforms, and aging infrastructure. "No matter the industry, science and technology are driving the bottom line," says Gary Goodyear, Canada's minister of state for science and technology. "Canada has what it takes to be an innovation leader." Officials believe the project has the potential to spur innovation in technological fields. IBM will contribute as much as $175 million to the project by 2014 in addition to the federal and provincial funds. Ontario's government intends to invest $15 million to $20 million.


WebRTC Puts Video Chats All in the Browser
PhysOrg.com (04/10/12) Nancy Owano

The Web Real Time Communication (WebRTC) standard is expected to gain widespread use once it is mature, according to technology observers. The draft standard is designed to support streaming audio and video communication directly on the Web, eliminating the need for plug-ins. The underlying technology for WebRTC comes from Global IP Solutions, which Google acquired in 2010. Google opened the source code under a BSD license with the intention of spurring its standardization. WebRTC equips Web browsers with RTC capabilities via JavaScript application programming interfaces (APIs), and the goal is to allow "rich, high quality, RTC applications to be developed in the browser via simple JavaScript APIs and HTML5," according to the WebRTC Web site. Google, Mozilla, and Opera support the initiative, and Mozilla created a stir when it recently demonstrated a browser-based video chat application built on WebRTC for Firefox. However, the standards group still has a lot of work to do: reports indicate the standard is going through major revisions and is being drafted in the W3C's WebRTC working group.
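As an illustration of what "RTC in the browser via simple JavaScript APIs" looks like in practice, the sketch below uses the standardized API names that later stabilized (navigator.mediaDevices.getUserMedia, RTCPeerConnection); the 2012-era implementations used vendor-prefixed variants. The STUN server URL and the signaling callback are placeholders, since signaling is left to the application.

```typescript
// Minimal browser-side WebRTC sketch (caller side). Signaling, i.e. exchanging
// the offer/answer and ICE candidates between peers, is application-defined
// and represented here only by a placeholder callback.

async function startCall(sendToRemotePeer: (msg: string) => void) {
  // Capture local audio and video without any plug-in.
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });

  // "stun:stun.example.org" is a placeholder STUN server.
  const pc = new RTCPeerConnection({ iceServers: [{ urls: "stun:stun.example.org" }] });
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));

  // Trickle ICE candidates to the other peer over your own signaling channel.
  pc.onicecandidate = (e) => {
    if (e.candidate) sendToRemotePeer(JSON.stringify({ candidate: e.candidate }));
  };

  // Create and send the session description (the "offer").
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendToRemotePeer(JSON.stringify({ offer }));

  return pc;
}
```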


Iran Building a Private, Isolated Internet, but Can It Shut Out the World?
Government Computer News (04/10/12) Kevin McCaney

Reports that Iran is planning to engineer a "Halal" Internet free of objectionable material and Western influence, and possibly launch it this summer, have fueled much media speculation. A Wall Street Journal report in March indicated that the new network's goals include cracking down on dissent, censoring content, and keeping outside forces from fomenting protest. The report said Iran's Supreme Leader Ayatollah Ali Khamenei established the Supreme Council of Cyberspace, composed of the chiefs of the country's intelligence, militia, and security ministries, the Iranian Revolutionary Guard Corps, and media organizations, to control Iranian cyberspace with the authority to enact laws. One of the council's members said in announcing the group that security is a factor, noting that "we are worried about a portion of cyberspace that is used for exchanging information and conducting espionage." Last October, Iran reportedly took action to move its domain name hosting in-house in readiness for the new national network. However, a Wall Street Journal report in December cited Iran experts' views that the country's complete severance from the rest of cyberspace is unlikely, and that a more probable approach would be for Iran to operate dual networks, with outside access for the privileged.


Transactional Memory: An Idea Ahead of Its Time
Brown University (04/09/12) Richard Lewis

About 20 years ago, Brown University researchers were studying theoretical transactional memory technologies, which attempt to seamlessly and concurrently handle shared revisions to information. Now those theories have become a reality. Intel recently announced that transactional memory will be included in its mainstream Haswell hardware architecture by next year, and IBM has adopted transactional memory in the Blue Gene/Q supercomputer. The problem that transactional memory aimed to solve is that core processors were changing in fundamental ways, says Brown professor Maurice Herlihy. Herlihy developed a system of requests and permissions in which operations are begun and logged, but wholesale changes, or transactions, are not made before the system checks to be sure no other thread has made conflicting changes to the data involved in the pending transaction. If no conflicting changes have been made, the transaction is committed, but if there is a conflict, the transaction is aborted and the threads start anew. Intel says its transactional memory is "hardware [that] can determine dynamically whether threads need to serialize through lock-protected critical sections, and perform serialization only when required."
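The begin/log/validate/commit-or-abort cycle described above can be sketched as a toy software analogue. This is not Intel's or IBM's hardware mechanism and not Herlihy's actual system; it is a minimal, hypothetical illustration in which a transaction logs the versions of the data it reads, buffers its writes, and commits only if nothing it read has since changed, otherwise retrying from scratch.

```typescript
// Toy software analogue of transactional memory's optimistic commit protocol.

class VersionedCell<T> {
  constructor(public value: T, public version = 0) {}
}

class Transaction {
  private readVersions = new Map<VersionedCell<any>, number>();
  private writes = new Map<VersionedCell<any>, any>();

  read<T>(cell: VersionedCell<T>): T {
    if (this.writes.has(cell)) return this.writes.get(cell);
    this.readVersions.set(cell, cell.version); // log what we observed
    return cell.value;
  }

  write<T>(cell: VersionedCell<T>, value: T): void {
    this.writes.set(cell, value); // buffer the change, don't publish yet
  }

  // Commit only if nothing we read was changed by another transaction.
  commit(): boolean {
    for (const [cell, version] of this.readVersions) {
      if (cell.version !== version) return false; // conflict: abort
    }
    for (const [cell, value] of this.writes) {
      cell.value = value;
      cell.version++; // publish the buffered writes
    }
    return true;
  }
}

// Retry loop: an aborted transaction simply starts anew.
function atomically(body: (tx: Transaction) => void): void {
  for (;;) {
    const tx = new Transaction();
    body(tx);
    if (tx.commit()) return;
  }
}

// Example: transfer between two shared accounts.
const a = new VersionedCell(100), b = new VersionedCell(0);
atomically((tx) => {
  tx.write(a, tx.read(a) - 30);
  tx.write(b, tx.read(b) + 30);
});
console.log(a.value, b.value); // 70 30
```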


Asimov's Robots Live on Twenty Years After His Death
Inside Science (04/06/12) Alan S. Brown

In the two decades since the passing of science fiction author Isaac Asimov, his concept of robots programmed to meet certain safety standards has become a touchstone for artificial intelligence (AI) researchers. He proposed three laws to prevent robots from harming humans, but they contain contradictions that have encouraged others to propose new rules. In a 2004 essay, Michael Anissimov of the Singularity Institute for Artificial Intelligence noted that "it's not so straightforward to convert a set of statements into a mind that follows or believes in those statements." Anissimov believes that programmers must develop "friendly AI" that loves humans rather than try to program machines with rules. Three years ago, Texas A&M scientist Robin Murphy and Ohio State Cognitive Systems Engineering Lab director David Woods proposed three laws to govern autonomous robots, the first based on the assumption that since humans deploy robots, human-robot systems must subscribe to high safety and ethical standards. The second law states that robots must follow appropriate commands, but only from a limited number of people. The third asserts that robots must protect themselves only following the transfer of control of their operations to humans.


Bits of Reality
Science News (04/07/12) Vol. 181, No. 7, P. 26 Tom Siegfried

Information derived from quantum computing systems could reveal subtle insights about the intersection between mathematics and the physical world. "We hope to be able to verify that these extraordinary computational resources in quantum systems really are part of the way nature behaves," says California Institute of Technology physicist John Preskill. "We could do so by solving a problem that we think is hard classically ... with a quantum computer, where we can easily verify with a classical computer that the quantum computer got the right answer." To solve certain hard problems that standard supercomputers cannot accommodate, such as finding the prime factors of very large numbers, quantum computers must process bits of quantum information, known as qubits. Quantum machines would only be workable for problems that can be posed as an algorithm amenable to the way quantum weirdness can eliminate wrong answers, allowing only the right answer to prevail. In 2011, the Perimeter Institute for Theoretical Physics' Giulio Chiribella and colleagues demonstrated how to derive quantum mechanics from a set of five axioms plus one postulate, all expressed in information-theoretic terms. The foundation of their system is axioms such as causality, the notion that signals from the future cannot affect the present.
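Preskill's point about easy classical verification can be made concrete with factoring: finding the prime factors of a very large number is believed to be classically hard (a quantum computer running Shor's algorithm could do it efficiently), yet checking a proposed factorization takes only a multiplication. A minimal sketch, with invented example numbers:

```typescript
// Classical verification of a claimed factorization is trivial: multiply the
// claimed factors and compare. (Checking that each factor is actually prime is
// also classically efficient, e.g. via Miller-Rabin, but is omitted here.)

function verifyFactorization(n: bigint, factors: bigint[]): boolean {
  // Every claimed factor must be nontrivial, and their product must equal n.
  if (factors.length === 0 || factors.some((f) => f <= 1n)) return false;
  return factors.reduce((product, f) => product * f, 1n) === n;
}

console.log(verifyFactorization(15n, [3n, 5n])); // true
console.log(verifyFactorization(15n, [3n, 4n])); // false
```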


Abstract News © Copyright 2012 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe