Association for Computing Machinery
Welcome to the August 14, 2013 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets and for iPhones and iPads.

HEADLINES AT A GLANCE


Wireless Devices Go Battery-Free With New Communication Technique
UW News (WA) (08/13/13) Michelle Ma

University of Washington engineers have developed a wireless communication method called ambient backscatter that enables devices to interact without batteries or wires, in a development that could advance the Internet of things. The system enables devices to exchange information using existing TV and cellular transmissions. The team developed a small, battery-free device with an antenna that picks up and reflects a TV signal, which then is picked up by other similar devices. "We can repurpose wireless signals that are already around us into both a source of power and a communication medium," says lead researcher Shyam Gollakota. "It's hopefully going to have applications in a number of areas, including wearable computing, smart homes, and self-sustaining sensor networks." The team won the best-paper award for their work at the ACM SIGCOMM 2013 conference, which began Aug. 12 in Hong Kong. "Our devices form a network out of thin air," says paper co-author Joshua Smith. "You can reflect these signals slightly to create a Morse code of communication between battery-free devices." The researchers note that the system will enable smart sensors to be located inside nearly any structure and to communicate with one another.
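
The underlying trick can be illustrated with a tiny simulation. The sketch below is illustrative only, with made-up power levels rather than the UW hardware's actual parameters: a tag toggles its antenna between reflecting and absorbing an ambient TV signal, and a receiver decodes the resulting power fluctuations.

```python
# Toy on-off-keying model of ambient backscatter (illustrative values).
import random

SAMPLES_PER_BIT = 100
AMBIENT_POWER = 1.0    # nominal ambient TV signal power (made up)
REFLECT_BOOST = 0.05   # extra power seen while the tag reflects
NOISE = 0.01           # receiver noise level (made up)

def transmit(bits):
    """Received power samples as the tag reflects (1) or absorbs (0)."""
    samples = []
    for bit in bits:
        level = AMBIENT_POWER + (REFLECT_BOOST if bit == "1" else 0.0)
        samples += [level + random.gauss(0, NOISE)
                    for _ in range(SAMPLES_PER_BIT)]
    return samples

def decode(samples):
    """Average each bit period and threshold against the midpoint."""
    threshold = AMBIENT_POWER + REFLECT_BOOST / 2
    bits = ""
    for i in range(0, len(samples), SAMPLES_PER_BIT):
        chunk = samples[i:i + SAMPLES_PER_BIT]
        bits += "1" if sum(chunk) / len(chunk) > threshold else "0"
    return bits

message = "10110010"
assert decode(transmit(message)) == message
```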


How to Share Scientific Data
The New York Times (08/13/13) John Markoff

An inundation of digital scientific data has given rise to a debate over who should access it, how it can be stored, and who will pay to do so. A recent paper published in the journal Science by Google chief Internet evangelist and ACM president Vint Cerf and Rensselaer Polytechnic Institute computer scientist Francine Berman recommended sharing the costs of making the data widely available. The paper notes that managing the data would entail a major cultural shift within the government, corporate institutions, and individual researchers. Cerf says storing and sharing digital information is becoming a "crucial issue" for both public and private institutions. The debate could intensify next week when U.S. government agencies are expected to submit proposals for how they would "support increased public access to the results of research funded by the federal government." The challenge will likely be compounded by U.S. Office of Science and Technology Policy director John P. Holdren's edict that such plans be executed using "resources within the existing agency budget." The lack of new money makes addressing the cost issues difficult, according to the National Science Foundation's Alan Blatecky. Cerf and Berman suggest that private companies and corporate and academic labs should invest in new computer data centers and storage systems to prevent the loss of vital research data.


Hackers Called Into Civic Duty
The Wall Street Journal (08/13/13) Ben Kesling

U.S. cities strapped for cash are tapping civic-minded hackers to inexpensively improve their online services by using city data to build apps for tracking diverse activities. For example, last year Chicago Mayor Rahm Emanuel signed an executive order mandating that the city make available all data not guarded by privacy statutes, and Code for America reports that Chicago now has almost 950 publicly accessible data sets. Among the hacker-developed products used by Chicago is an app that maps house fires and determines whether the house has any residents who might require recovery and resettlement assistance from the local Red Cross. Other examples of hacker-generated civic apps are New York City's free Embark app, which tracks train locations in real time and plots out routes for subway riders, and San Francisco's free iPhone app, which helps people make use of public parks and open spaces by supplying parking data and class schedules and displaying photos of the green spaces. Hackers also are helping South Bend, IN, mine volumes of unstructured data to extract useful information, while Code for America aims to place programming volunteers in cities that want to put civic data to better use. "It's a year-long commitment, almost like Peace Corps for geeks," says Code for America's Christopher Whitaker.
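
Most of these apps sit on top of open-data portals with simple web APIs. As a hedged sketch, here is how a script might pull records from Chicago's Socrata-based portal in Python; the data-set ID below is a hypothetical placeholder, so browse data.cityofchicago.org for real resource IDs.

```python
# Illustrative only: fetch a few records from a city open-data portal.
# "xxxx-xxxx" is a placeholder, not a real Chicago data-set ID.
import requests

DATASET_ID = "xxxx-xxxx"  # hypothetical; find real IDs on the portal
url = f"https://data.cityofchicago.org/resource/{DATASET_ID}.json"

resp = requests.get(url, params={"$limit": 5})  # SODA API row limit
resp.raise_for_status()
for record in resp.json():
    print(record)
```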


Denser, Faster Memory Challenges Both DRAM and Flash
Technology Review (08/14/13) Tom Simonite

University of Michigan researchers have developed crossbar memory technology, a type of memory chip that could boost the speed and storage capabilities of future smartphones and other computing devices. The researchers say crossbar technology can store data 40 times more densely than the most compact memory available today, and it is also faster and more efficient. The technology's ability to store a lot of data in a small space could see it replace the flash memory chips that are the basis of memory cards, some hard drives, and the internal storage of mobile devices. "It will be much denser and faster than flash because it is not based on moving electrons around or on transistors," says Michigan professor Wei Lu. The current version of the technology can store 1 terabyte of data on a single 200-square-millimeter chip. Data is written by applying a specific control voltage to a particular crossbar junction, and read out by testing the conductivity of each junction.
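
The write-by-voltage, read-by-conductance scheme can be mimicked in a few lines of Python. This is a toy model with an assumed switching threshold, not the Michigan design.

```python
# Toy resistive-crossbar model: each row/column junction holds a
# high- or low-resistance state standing in for one bit.
WRITE_THRESHOLD = 1.0  # assumed switching voltage, arbitrary units

class Crossbar:
    def __init__(self, rows, cols):
        self.cell = [[0] * cols for _ in range(rows)]  # 0 = high resistance

    def write(self, row, col, bit, voltage):
        # Only a voltage at or above the threshold switches the junction.
        if abs(voltage) >= WRITE_THRESHOLD:
            self.cell[row][col] = bit

    def read(self, row, col):
        # Reading senses conductivity: low resistance reads as bit 1.
        return self.cell[row][col]

xbar = Crossbar(4, 4)
xbar.write(2, 3, 1, voltage=1.2)
xbar.write(0, 0, 1, voltage=0.4)  # below threshold: cell unchanged
assert xbar.read(2, 3) == 1 and xbar.read(0, 0) == 0
```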


Intelligence Chief Clapper to Set Up U.S. Surveillance Review Group
IDG News Service (08/13/13) John Ribeiro

The Obama administration recently launched a review of whether the United States makes optimal use of advancements in technology to protect its national security while preventing unauthorized disclosure and maintaining public trust. Director of National Intelligence James R. Clapper has been instructed to form the Review Group on Intelligence and Communications Technologies, which will brief the president on its interim findings within 60 days of its establishment. The review group "will assess whether, in light of advancements in communications technologies, the United States employs its technical collection capabilities in a manner that optimally protects our national security and advances our foreign policy while appropriately accounting for other policy considerations, such as the risk of unauthorized disclosure and our need to maintain the public trust," according to a White House memorandum. The Obama administration will appoint an independent board to review the country's surveillance programs, and also will add an advocate to defend privacy in the Foreign Intelligence Surveillance Court when agencies ask the court for new surveillance orders.


NSA Leaks Make Plan for Cyberdefense Unlikely
The New York Times (08/12/13) David E. Sanger

Although the U.S. National Security Agency (NSA) has lobbied inside the government to deploy advanced cyberdefense systems designed to intercept cyberattacks before they can cripple critical infrastructure, the plan has virtually no chance of moving forward given the backlash against NSA over the recent disclosures about its surveillance program, according to administration officials. NSA recently reported that it "touches about 1.6 percent" of all the traffic carried on the Internet each day. However, NSA director Gen. Keith B. Alexander's plan would put the agency in the position of examining a far larger percentage of the world's information. As part of the plan, the government would tap the data pipes that feed the largest Internet service providers in the United States. The system would give NSA a real-time way to track computer servers worldwide and monitor email traffic for data that indicates potential malware attacks or attempts to steal information from U.S. companies. "It's defense at network speed," Alexander reportedly told a Washington security-research group recently, "because you have only milliseconds." The U.S. government currently is working with major defense contractors on a model for a national defense against cyberattacks, and NSA has been assembling teams with both offensive and defensive cyber capabilities.


Meshnet Activists Rebuilding the Internet From Scratch
New Scientist (08/08/13) Hal Hodson

Activists across the United States are constructing meshnets--decentralized, user-owned wireless networks that enable secure communication without surveillance--partly in response to the Internet's lack of privacy and neutrality. The Seattle Meshnet project, for example, comprises nodes, each made up of a radio transceiver and a computer that relays messages from other parts of the network. In the event data cannot be channeled through one route, the meshnet finds another. U.S. meshnet projects are underpinned by Hyperboria, a virtual, peer-to-peer meshnet that runs through the existing Internet. Once physical meshnet nodes are established, existing Hyperboria links can be routed through them. Hyperboria's peer-to-peer connections facilitate full encryption of every link in the chain of communication. The Seattle Meshnet recently completed a crowdfunding effort for meshbox routers that come preloaded with the cjdns software that is required to join the Hyperboria meshnet, and which keeps all communications encrypted. Meanwhile, a much larger meshnet in Catalonia, Spain, consists of more than 21,000 wireless nodes, enabling user communications as well as support of Web servers, videoconferencing, and online radio broadcasts that would remain operational if the rest of the country's Internet were blacked out.
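
The find-another-route behavior is ordinary graph search. Below is a minimal sketch in Python over an invented five-node topology; it illustrates the idea, not cjdns's actual routing protocol.

```python
# Breadth-first rerouting over a small mesh; if a link fails,
# the search simply finds a path that avoids it.
from collections import deque

links = {
    "A": {"B", "C"}, "B": {"A", "D"},
    "C": {"A", "D"}, "D": {"B", "C", "E"}, "E": {"D"},
}

def route(src, dst, down=frozenset()):
    """Return a path from src to dst that avoids failed links in `down`."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == dst:
            return path
        for nxt in links[node]:
            if nxt not in seen and frozenset((node, nxt)) not in down:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # destination unreachable

print(route("A", "E"))                                # e.g. A-B-D-E
print(route("A", "E", down={frozenset(("B", "D"))}))  # reroutes via C
```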


Nice Threads! Computer Scientists Develop New Model to Simulate Cloth on a Computer With Unprecedented Accuracy
UCSD News (CA) (08/07/13) Ioana Patringenaru

University of California, San Diego (UCSD) researchers have developed a model to simulate the way cloth and light interact. The researchers say the model can be used in animated movies and in video games to make cloth look more realistic. "Not only is our model easy to use, it is also more powerful than existing models," says Google's Iman Sadeghi, who helped develop the model while a Ph.D. student at UCSD. The model is based on an approach that simulates the interaction of light with cloth by simulating how each thread scatters light, and factoring in the weaving pattern. "It essentially treats the fabric as a mesh of interwoven microcylinders, which scatter light the same way as hair, but are oriented at 90 degrees from each other," Sadeghi says. The model also can act as a framework to visualize what new fabrics would look like, and it can simulate any combination of weaving pattern and thread types, notes Canfield Scientific's Oleg Bisker, who helped develop the model while a student at UCSD. The researchers aim to demonstrate the model's ability to handle different types of thread and an unlimited variety of weaving patterns.
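
The quoted intuition can be sketched with a standard hair-shading trick. The snippet below is a simplified Kajiya-Kay-style specular term applied to two perpendicular thread directions, offered only as an illustration of the microcylinder idea, not the UCSD model itself.

```python
# Toy cloth highlight: warp and weft threads are treated as tiny
# cylinders at 90 degrees, and their scattering is blended.
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cylinder_specular(tangent, light, view, shininess=40.0):
    """Kajiya-Kay-style term: brightest when the half vector is
    perpendicular to the thread direction."""
    half = normalize(tuple(l + v for l, v in zip(light, view)))
    t_dot_h = dot(tangent, half)
    sin_th = math.sqrt(max(0.0, 1.0 - t_dot_h * t_dot_h))
    return sin_th ** shininess

warp, weft = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)  # threads at 90 degrees
light = normalize((0.3, 0.2, 0.9))
view = normalize((0.0, 0.0, 1.0))

mix = 0.5  # assumed fraction of visible surface covered by warp threads
intensity = (mix * cylinder_specular(warp, light, view)
             + (1 - mix) * cylinder_specular(weft, light, view))
print(f"specular intensity: {intensity:.4f}")
```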


Some Like IT Cold: Intelligence Agencies Push for Low-Power Exascale
Green Computing Report (08/09/13) Tiffany Trader

The U.S. intelligence community is investing in superconductive computing research to enable more efficient, low-power exascale computing. The Intelligence Advanced Research Projects Activity (IARPA) launched its Cryogenic Computing Complexity Program earlier this year "to establish superconducting computing as a long-term solution to the power-space-cooling problem and as a successor to end-of-roadmap complementary metal-oxide-semiconductor (CMOS) for high performance computing." The best technology currently available would still produce an exascale system requiring several hundred megawatts to operate, while the transistor is approaching its scaling limits. The first stage of IARPA's five-year project will focus on developing the technologies needed to demonstrate superconducting computing's value. The project's second stage will involve integrating those technologies into a "small-scale computer based on superconducting logic and cryogenic memory that is energy-efficient, scalable, and able to solve interesting problems." The government believes that "superconducting computing offers an attractive low-power alternative to CMOS with many potential advantages." Studies show that superconducting technology creates a foundation for 1-petaflop systems that run at only 25 kW and 100-petaflop systems that operate at 200 kW.
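
The quoted figures imply a large efficiency gap, which a quick back-of-the-envelope check makes concrete (the exascale wattage is the article's "several hundred megawatts," taken here as an assumed 300 MW).

```python
# Energy efficiency implied by the article's numbers, in GFLOPS/watt.
PFLOP = 1e15

systems = {
    "CMOS exascale (assumed 300 MW)":   (1000 * PFLOP, 300e6),
    "Superconducting 1 PF @ 25 kW":     (1 * PFLOP,    25e3),
    "Superconducting 100 PF @ 200 kW":  (100 * PFLOP,  200e3),
}

for name, (flops, watts) in systems.items():
    print(f"{name}: {flops / watts / 1e9:.1f} GFLOPS per watt")
# Roughly 3.3 GFLOPS/W for the CMOS case versus 40 and 500 GFLOPS/W
# for the superconducting designs.
```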


UCLA Researcher Invents New Tools to Manage 'Information Overload' Threatening Neuroscience
UCLA Newsroom (CA) (08/08/13) Elaine Schmidt

University of California, Los Angeles (UCLA) researchers have developed a research map tool to enable neuroscientists to quickly scan existing neuroscience research and plan their next study. "Information overload is the elephant in the room that most neuroscientists pretend to ignore," says UCLA professor Alcino Silva. "Without a way to organize the literature, we risk missing key discoveries and duplicating earlier experiments. Research maps will enable neuroscientists to quickly clarify what ground has already been covered and to fully grasp its meaning for future studies." The maps provide simplified, interactive summaries of research findings, and a Web-based app helps scientists expand and interact with their field's map. "We founded research maps on a crowd-sourcing strategy in which individual scientists add papers that interest them to a growing map of their fields," Silva says. "Each map is interactive and searchable; scientists see as much of the map as they query, much like an online search." Scientists can focus on areas that interest them and track published findings to determine what might be missing and, therefore, which experiments might be worth pursuing. The researchers aim to automate the map-creation process so new papers will automatically be added to the collection as they are published.
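
At its core a research map is a searchable graph of papers and findings. Here is a minimal sketch of that structure in Python, with invented fields and entries; the actual UCLA tool's schema is not described in the article.

```python
# Toy "research map": papers as nodes, relationships as edges,
# with a keyword query over each paper's recorded findings.
from collections import defaultdict

class ResearchMap:
    def __init__(self):
        self.papers = {}               # paper id -> metadata
        self.links = defaultdict(set)  # paper id -> related paper ids

    def add_paper(self, pid, title, findings):
        self.papers[pid] = {"title": title, "findings": findings}

    def connect(self, a, b):
        self.links[a].add(b)
        self.links[b].add(a)

    def query(self, keyword):
        """Return titles of papers whose findings mention the keyword."""
        return [p["title"] for p in self.papers.values()
                if keyword.lower() in p["findings"].lower()]

rmap = ResearchMap()
rmap.add_paper(1, "CREB and memory", "CREB activation enhances memory")
rmap.add_paper(2, "Replication study", "CREB effect replicated in mice")
rmap.connect(1, 2)
print(rmap.query("creb"))  # ['CREB and memory', 'Replication study']
```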


Use Digital Signal Processing Engineering to Prevent a Flash Crash, Says NJIT Prof
New Jersey Institute of Technology (08/08/13) Sheryl Weinstein

New Jersey Institute of Technology professor Ali Akansu thinks digital signal processing (DSP) technology could prevent another flash crash. "There are DSP engineering theory and methods that may help quantitative finance and algorithmic trading so that no one ever sees such a crash again," Akansu says. He notes that recent advances in high-performance computing and DSP, as well as low-latency networking and storage technologies, have transformed the financial industry. Akansu says the advances have led to new research opportunities in a variety of fields, including mathematical finance and DSP engineering. Speaking at a recent conference, Akansu analyzed and explained the 2010 market flash crash. He praises the virtues of high-performance DSP technologies such as field-programmable gate arrays and graphics processing units for real-time portfolio risk measurement and control, and stresses the importance of a better-maintained limit order book, made possible by those same technologies. "I believe that by using new technology, we can develop fairer and more robust markets," he says.
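
For readers unfamiliar with the limit order book Akansu refers to, the sketch below shows the basic structure: resting buy and sell orders at price levels, matched when a new order crosses the spread. It is illustrative only; production matching engines also handle time priority, order IDs, and cancellations.

```python
# Minimal limit order book with price-priority matching.
import heapq

class LimitOrderBook:
    def __init__(self):
        self.bids = []  # max-heap via negated prices: (-price, qty)
        self.asks = []  # min-heap: (price, qty)

    def add(self, side, price, qty):
        if side == "buy":
            heapq.heappush(self.bids, (-price, qty))
        else:
            heapq.heappush(self.asks, (price, qty))
        self.match()

    def match(self):
        # Trade while the best bid meets or beats the best ask.
        while self.bids and self.asks and -self.bids[0][0] >= self.asks[0][0]:
            bid_p, bid_q = heapq.heappop(self.bids)
            ask_p, ask_q = heapq.heappop(self.asks)
            traded = min(bid_q, ask_q)
            print(f"trade: {traded} @ {ask_p}")
            if bid_q > traded:
                heapq.heappush(self.bids, (bid_p, bid_q - traded))
            if ask_q > traded:
                heapq.heappush(self.asks, (ask_p, ask_q - traded))

book = LimitOrderBook()
book.add("sell", 100.5, 10)
book.add("buy", 100.5, 4)  # crosses the spread: trade 4 @ 100.5
```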


Breaking Through the Fault-Testing Bottleneck in Chip Production
CORDIS News (08/08/13)

A consortium of universities and technology companies funded by the European Commission has developed a way to improve fault testing and verification in integrated circuits, in a development that could transform the semiconductor industry. The Diagnosis, Error-Modelling and Correction for Reliable Systems Design (DIAMOND) project aimed to develop new models and technology to test, detect, verify, and repair IC errors. The researchers note that chip design projects typically cost about 60 million euros, but automating the error detection and correction process could lower the cost to 15 million euros. The team began by creating a holistic model for various fault types, allowing the same localization engines to be used for design errors, soft errors, and defects. The team then developed more efficient automated localization and correction methods, emphasizing system-level approaches where previous research fell short. Finally, the team created post-silicon in-situ debug approaches that extend the life of silicon chips by localizing and isolating faulty regions. The team also developed an open source system-level design error localization and correction system, which can correct 60 percent of benchmark designs, compared to 16 percent with previous systems. By eliminating bottlenecks in testing and verification, the new methods could reduce chip development time, speed innovation, and decrease the cost of electronic devices.


IT Helps Researchers See Their Data Like Never Before
Missouri S&T News (08/07/13) Andrew Careaga

Missouri University of Science and Technology researchers have developed Visualizing Four Dimensions in Rolla (V4DiR), a system that enables scientists to view data in three dimensions (3D) over various time spans. Missouri S&T's Mark Bookout demonstrated V4DiR's capabilities by showing maps-in-motion of natural disasters, such as all of the world's earthquake occurrences from 1920 through 2012, and tornado activity in the United States since 1950. The researchers also have used V4DiR with a 3D immersion of a New Mexico mine that was created using LIDAR scans. "One of the most powerful things to me is to watch how someone who sees it for the first time reacts to the visualization," Bookout says. The researchers also note the potential of using V4DiR to visualize data related to research on how children with autism behave during various times of the day. "V4DiR has the potential to enhance any sort of research," Bookout says. "It allows us to use our natural pattern-recognition capabilities to isolate interesting groupings of information."


The Making of Facebook’s Graph Search
IEEE Spectrum (08/06/13) Tekla S. Perry

Facebook recently released its Graph Search engine, which sorts through a user's network of connections to find links among their acquaintances and other information stored in Facebook databases, such as the establishments they frequent and the businesses in which they are interested. Graph Search, which debuted in a controlled release in January, could help Facebook grow from a possibly expendable tool for entertainment and communication into a search tool that could challenge Google's search dominance. "If you're getting useful information about contractors and vacation planning and other advice, you start to rely on it in a way people rely on Google," says Opus Research analyst Greg Sterling. Creating Graph Search was a challenge that began in April 2011, when CEO Mark Zuckerberg approached computer scientist Lars Rasmussen about developing a structured search engine over all the data that people share on Facebook. Rasmussen, who became the engineering director of Graph Search, started the project by creating a JavaScript prototype that could answer questions such as, "Who are my friends who live in New York?" Users initially had difficulty with the prototype, and data volume presented another challenge, with Facebook adding more than 9 billion new pieces of data a day.
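
The prototype's example question is a classic structured graph query. A toy version in Python over an invented social graph shows the shape of the computation; Facebook's real index operates at an entirely different scale and design.

```python
# Toy structured query: "my friends who live in New York".
friends = {"me": {"ana", "bo", "cy"}}
lives_in = {"ana": "New York", "bo": "Boston", "cy": "New York"}

def friends_in(user, city):
    """Intersect a user's friend list with a city-of-residence filter."""
    return {f for f in friends.get(user, set())
            if lives_in.get(f) == city}

print(friends_in("me", "New York"))  # {'ana', 'cy'}
```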


Abstract News © Copyright 2013 INFORMATION, INC.


To submit feedback about ACM TechNews, contact: [email protected]