ACM TechNews
Association for Computing Machinery
Welcome to the April 23, 2014 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Updated versions of the ACM TechNews mobile apps are available for Android phones and tablets (click here) and for iPhones (click here) and iPads (click here).

HEADLINES AT A GLANCE


Future of the Internet Debated at Net Mundial in Brazil
BBC News (04/23/14) Leo Kelion

The future governance structure of the Internet will be debated at the two-day Net Mundial event in Brazil following an announcement by the United States that it will cede its oversight of Net address distribution in favor of a global multi-stakeholder community by September 2015. "The Internet is a collective construction and its governance process must also be built that way," argues Brazil Secretary for Information Technology Policy Virgilio Almeida. The goal of Net Mundial is to reach agreement on common principles and emphasize specific issues that could form the foundation of future Internet governance discussions. China, Russia, Tajikistan, and Uzbekistan say it should be stipulated in the draft outcome document that follow-up deliberations should transpire "within the United Nations framework." However, the U.S., Australia, and several European countries contend such responsibility should instead go to a body that is not swayed by any governments. "The document should focus on promoting cooperation to deal with cybercrime and cybersecurity instead of advancing controversial treaties or international agreements," says the U.S. State Department. Privacy advocates also voice concern the issue of government Web surveillance could be sidelined.


NIST to Drop Crypto Algorithm From Guidance
GovInfoSecurity.com (04/21/14) Eric Chabrow

The U.S. National Institute of Standards and Technology (NIST) has issued a draft of amended guidance that drops a cryptographic algorithm the U.S. National Security Agency (NSA) is thought to have used to bypass encryption protecting much of global commerce, banking systems, medical records, and online communications. The action comes after the cryptographic community expressed outrage over NSA's reported exploitation of the Dual Elliptic Curve Deterministic Random Bit Generator (Dual_EC_DRBG) to circumvent encryption by using guidance-specified parameters. Cryptographers warn the weakness could enable attackers to predict the secret cryptographic keys that form the foundation for the guarantees provided in the special publication. NIST recommends users and implementers switch to one of the three other sanctioned DRBGs specified in the guidance. Cryptography expert Bruce Schneier says this is the right action for NIST to take to repair its credibility, even though it was unaware of NSA's algorithm tampering. NIST notes cryptographers spotted this potential vulnerability during the guidance's development, and the problem was initially ameliorated by providing mechanisms to generate alternative parameters that would not be prone to the weakness.
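One of the sanctioned alternatives in the guidance (NIST SP 800-90A) is HMAC_DRBG. The sketch below is a minimal, illustrative HMAC_DRBG over SHA-256 showing the instantiate/update/generate structure; it omits entropy sourcing, reseed counters, and request limits, and is not a production implementation.

```python
import hmac
import hashlib

class HmacDrbg:
    """Minimal HMAC_DRBG (SHA-256) sketch after NIST SP 800-90A.
    Illustrative only: no reseeding, entropy checks, or output limits."""

    def __init__(self, seed_material: bytes):
        # Instantiate: K = 0x00..00, V = 0x01..01, then mix in the seed.
        self.K = b"\x00" * 32
        self.V = b"\x01" * 32
        self._update(seed_material)

    def _hmac(self, key: bytes, data: bytes) -> bytes:
        return hmac.new(key, data, hashlib.sha256).digest()

    def _update(self, provided_data: bytes = b"") -> None:
        # HMAC_DRBG_Update: refresh K and V, folding in any provided data.
        self.K = self._hmac(self.K, self.V + b"\x00" + provided_data)
        self.V = self._hmac(self.K, self.V)
        if provided_data:
            self.K = self._hmac(self.K, self.V + b"\x01" + provided_data)
            self.V = self._hmac(self.K, self.V)

    def generate(self, n_bytes: int) -> bytes:
        # Generate: iterate V = HMAC(K, V) until enough output, then update.
        out = b""
        while len(out) < n_bytes:
            self.V = self._hmac(self.K, self.V)
            out += self.V
        self._update()
        return out[:n_bytes]
```

Given the same seed material the generator is fully deterministic, which is exactly why a generator whose internal parameters may hide a trapdoor is dangerous: anyone who can predict the state can predict every key derived from it.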


Super-High Frequencies Could One Day Deliver Your Mobile Video
IDG News Service (04/21/14) Stephen Lawson

New York University professor Ted Rappaport is developing millimeter-wave wireless technology, which employs frequencies many times higher than those normally used to carry cellular data. Rappaport notes the technology leverages frequencies that are rarely used and could someday relieve a shortage of available wireless spectrum. If a future spectrum shortage occurs, it will probably hit big cities, where thousands of people often use cellular networks simultaneously in the same area. Millimeter-wave technology could help delay the problem by making it easier for a carrier to set up small cells for those densely populated areas. The smaller cells can work alongside the regular cells and deliver more service over the same spectrum. Millimeter-wave radios would put the additional frequencies to work in smartphones and small cells, shooting narrow beams of data between them even as mobile users change locations. In addition, millimeter waves reduce interference because the beams are so narrow there is little chance one will run into another. Rappaport notes the short wavelengths allow compact antenna arrays that can be precisely aimed and rapidly shifted. The antennas could point in different directions based on the charge applied to a given part of the array, which would enable them to keep sending signals to each other as a user travels.
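The electronic steering described works by applying a progressive phase shift across the elements of an array; for a uniform linear array with element spacing d and wavelength λ, steering the beam to angle θ off broadside requires a per-element shift of -2πd·sin(θ)/λ. A small sketch (the 28 GHz band and 16-element array are illustrative choices, not figures from the article):

```python
import math

C = 3e8  # speed of light, m/s

def steering_phases(n_elements, spacing_m, freq_hz, angle_deg):
    """Per-element phase shifts (radians) that steer a uniform linear
    array's main beam toward angle_deg off broadside."""
    wavelength = C / freq_hz
    delta = -2 * math.pi * spacing_m * math.sin(math.radians(angle_deg)) / wavelength
    return [i * delta for i in range(n_elements)]

def array_gain(phases, spacing_m, freq_hz, look_deg):
    """Normalized array-factor magnitude in direction look_deg
    (1.0 = all elements adding perfectly in phase)."""
    wavelength = C / freq_hz
    theta = math.radians(look_deg)
    re = im = 0.0
    for i, p in enumerate(phases):
        phase = 2 * math.pi * i * spacing_m * math.sin(theta) / wavelength + p
        re += math.cos(phase)
        im += math.sin(phase)
    return math.hypot(re, im) / len(phases)

# 28 GHz (a millimeter-wave band), half-wavelength spacing, 16 elements
f = 28e9
d = (C / f) / 2
ph = steering_phases(16, d, f, 30.0)
```

Changing the phase list retargets the beam with no moving parts, which is how an array could keep tracking a user in motion.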


Asia Student Supercomputer Challenge Emphasizes Applications, Scalability
HPC Wire (04/21/14) Tiffany Trader

The 2014 Asia Student Supercomputer Challenge (ASC14), which is taking place this week at Sun Yat-sen University's Guangzhou Supercomputer Center, aims to promote both fundamental and practical technology, improve students' ability to tackle actual problems, motivate student interest in supercomputing, and encourage students to meet with international peers. ASC14, organized by the HPC Advisory Council, the International Supercomputing Conference, and China HSS Lab, held preliminary contests earlier this year to select a group of 16 finalist teams. As part of the competition, each team must build its own mini supercomputer with a 3,000-watt power budget and configure it to run the LINPACK benchmark in addition to three scientific applications. The competition committee also will announce a secret application after the competition has started, which the teams will have to configure to scale and run on the Tianhe-2 supercomputer. The team that obtains the highest scalability and performance optimization of supercomputer applications will be awarded a prize. ASC14 differs from other Student Cluster Competitions in that all the teams must assemble clusters from stock parts provided by the challenge organizer.


Simplifying Exascale Application Development
Pacific Northwest National Laboratory (04/18/14)

Scientists in Pacific Northwest National Laboratory's (PNNL) Data Intensive Scientific Computing group are developing formal design processes based on Concurrent Collections (CnC), a programming model that integrates task and data parallelism to simplify exascale application development. An example of this process at work is the transformation of the Livermore Unstructured Lagrangian Explicit Shock Hydrodynamics (LULESH) proxy application, which simulates hydrodynamics, into a complete CnC specification. The resulting specification can be deployed and executed using a paradigm that exploits the massive parallelism and power-conserving properties of future exascale systems. Creating a CnC specification starts with manually depicting the dataflow between software components and formalizing opportunities for analysis and optimization of parallelism, energy efficiency, data movement, and fault handling. By converting a sketch description of the LULESH CnC model into a formal graph prior to writing code, PNNL researchers carried out static analysis, applied optimization techniques, and identified bugs, lowering the costs associated with development and testing. "In addition to providing a natural and obvious pathway for application development, we identified communications and optimization issues that could be addressed with added clarity before the computation steps were even implemented," says PNNL's John Feo.
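The value of writing the step/data graph down before any step code exists can be sketched with a tiny stand-in (not the CnC toolchain itself): model each step by the data items it consumes and produces, then check statically that every input has a producer and that the dependencies form no cycle. The step names below are hypothetical, hydrodynamics-flavored examples.

```python
from collections import defaultdict, deque

def check_dataflow(steps):
    """steps: {step_name: (consumed_items, produced_items)}.
    Returns a valid execution order, or raises ValueError if an item
    is never produced or the dependency graph has a cycle."""
    producers = {}
    for name, (_, produced) in steps.items():
        for item in produced:
            producers[item] = name

    deps = defaultdict(set)  # step -> steps it depends on
    for name, (consumed, _) in steps.items():
        for item in consumed:
            if item not in producers:
                raise ValueError(f"item {item!r} consumed by {name!r} is never produced")
            if producers[item] != name:
                deps[name].add(producers[item])

    # Kahn's algorithm: repeatedly peel off steps with no unmet dependencies.
    indegree = {name: len(deps[name]) for name in steps}
    dependents = defaultdict(list)
    for name, ds in deps.items():
        for d in ds:
            dependents[d].append(name)
    ready = deque(n for n, k in indegree.items() if k == 0)
    order = []
    while ready:
        n = ready.popleft()
        order.append(n)
        for m in dependents[n]:
            indegree[m] -= 1
            if indegree[m] == 0:
                ready.append(m)
    if len(order) != len(steps):
        raise ValueError("dependency cycle detected")
    return order

# Hypothetical step graph: init produces the mesh, forces consume it, etc.
steps = {
    "init":   ((), ("mesh",)),
    "forces": (("mesh",), ("force",)),
    "update": (("force",), ("mesh_next",)),
}
```

A cycle or a missing producer surfaces here as a raised error, before a single computation step is implemented, which is the kind of early bug-finding the PNNL group describes.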


'Photonic Transistor' Switches Light Signals Instead of Electronic Signals
A*STAR Research (04/23/14)

A*STAR Data Storage Institute researchers say they are developing a practical "photonic transistor" for optical interconnects that can control light signals similarly to electronic transistors. The researchers note the most recent photonic transistor design is based on common semiconductor technology and offers high switching gain, low switching power, and high operating speed. The new design enables a switching gain of greater than or equal to two, which means the output signal is at least double the strength of the input signal. As a result, the transistor can be cascaded, meaning the output signal from one photonic transistor is sufficiently strong that it can be split to feed several others. In addition, the design consumes 10 to 20 times less power than conventional all-optical switching technologies and can operate at very fast speeds, according to A*STAR's Vivek Krishnamurthy. The researchers now are working to experimentally realize their optical transistor. "Once we experimentally verify the prototype, we could further integrate it into large-scale optical switching systems for optical interconnects," Krishnamurthy says.


First Steps Towards 'Experimental Literature 2.0'
Swiss Federal Institute of Technology in Lausanne (04/17/14) Emmanuel Barraud

Researchers at the Swiss Federal Institute of Technology in Lausanne's Laboratory of Digital Humanities (DH Lab) have developed an application designed to reorder literary works by changing their chapter arrangement. The researchers used Swiss writer Daniel de Roulet's saga "The Human Simulation" to test the program. Released as a free, French-language smartphone app, the 10-volume saga can be read dynamically, in chapter orders other than the published one. "Each chapter constitutes a narrative unit with enough elements in common with the other chapters of the saga so that it is possible to read them in a different order than that of the publication," says de Roulet. The DH Lab's Cyril Bornet says the writing process was formalized and the text synthesized via algorithmic textual analysis. The lab's storytelling research aims to go beyond reconstructing existing literary works by also accounting for readers' changing habits, which currently favor short plotlines over long narrative structures. Bornet says the research forms part of his thesis to "analyze, through the use of computer tools, how digital media influence the way literary works are written."


Code Camp Empowers High School Girls With Computer Science Education
Stanford Daily (04/22/14) Catherine Zaw

The Stanford University Computer Science Department's Girls Teach Girls To Code (GTGTC) program recently hosted more than 200 high school girls on campus for a "Code Camp" designed to introduce them to the various real-life applications of computer science. GTGTC also provides smaller-scale events throughout the academic year, including company tours and visits to the Computer History Museum. GTGTC's popularity has led to plans for expansion, including trying to host Code Camp biannually as opposed to just once a year. "In the far future, we're hoping that this is an organization with chapters in different colleges," says GTGTC founder and coordinator Heidi Wang. "You can imagine if it was offered to girls across the nation and the world, how many girls we could reach and the impact we could make." The program aims to show high school girls that computer science is fun and flexible, says GTGTC coordinator Jessie Duan. "It's good to learn at a young age that you can be in computer science, because even if you don't end up pursuing that, you still feel like you have the capacity to learn other fields that are similar," adds GTGTC program mentor Marissa Mass.


DARPA Developing the Ultimate Auto-Pilot Software
Network World (04/18/14) Michael Cooney

The U.S. Defense Advanced Research Projects Agency (DARPA) is planning a new program called Aircrew Labor In-Cockpit Automation System (ALIAS), which would build on advances in aircraft automation systems and remotely piloted aircraft automation. "Our goal is to design and develop a full-time automated assistant that could be rapidly adapted to help operate diverse aircraft through an easy-to-use operator interface," says DARPA program manager Daniel Patt. "These capabilities could help transform the role of pilot from a systems operator to a mission supervisor directing intermeshed, trusted, reliable systems at a high level." As an automation system, ALIAS would carry out a planned mission from takeoff to landing, including persistent state monitoring and rapid procedure recall. DARPA says easy-to-use touch and voice interfaces could facilitate supervisor-ALIAS engagement. Three major technical thrust areas associated with the program include minimally invasive interfaces from ALIAS to existing aircraft, knowledge acquisition on aircraft operations, and human-machine interfaces, according to DARPA.


Researchers Use Twitter to Predict Crime
Agence France-Presse (04/20/14)

Tweets can be useful for predicting 19 of 25 kinds of crimes, especially offenses such as stalking, thefts, and certain kinds of assault, if the correct analysis is applied, according to researchers at the University of Virginia. Matthew Gerber from the Predictive Technology Lab and colleagues analyzed tweets from Chicago tagged to certain neighborhoods alongside the city's crime database, and were then able to make useful predictions about areas where certain crimes were likely to occur. Gerber says the team's algorithm learns the pattern and produces a prediction. "This approach allows the analyst to rapidly visualize and identify areas with historically high crime concentrations," according to the research paper on the method. "Future crimes often occur in the vicinity of past crimes, making hot-spot maps a valuable crime prediction tool." Gerber says the results are surprising, considering people rarely tweet about crimes directly, and he notes even tweets with no direct link to crime may contain helpful information.
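The hot-spot mapping the paper cites ("future crimes often occur in the vicinity of past crimes") is typically built with kernel density estimation over past incident locations. The stand-in sketch below shows only that density step, not the researchers' full model, which also folds in tweet topics; the coordinates are made-up illustration data.

```python
import math

def hotspot_density(incidents, x, y, bandwidth=0.5):
    """Gaussian kernel density estimate at (x, y) from past incident
    coordinates -- a higher value means a 'hotter' location."""
    total = 0.0
    for ix, iy in incidents:
        d2 = (x - ix) ** 2 + (y - iy) ** 2
        total += math.exp(-d2 / (2 * bandwidth ** 2))
    # Normalize so the surface integrates to 1 over the plane.
    return total / (len(incidents) * 2 * math.pi * bandwidth ** 2)

# Hypothetical past incidents: a tight cluster plus one outlier.
past = [(1.0, 1.0), (1.2, 0.9), (0.9, 1.1), (5.0, 5.0)]
```

Evaluating the density on a grid of map cells and flagging the highest-scoring cells yields the familiar hot-spot map an analyst would review.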


Wikipedia Searches and Sick Tweets Predict Flu Cases
New Scientist (04/17/14) Aviva Rutkin

A new algorithm mines data from Wikipedia to track flu cases across the United States. The program monitors certain entries a sick person would look up, such as "flu season" and "fever," and hourly downloads publicly available information on how many people nationwide accessed those pages. In comparing their data with figures from the U.S. Centers for Disease Control, the researchers found they could accurately predict the number of cases in the country up to two weeks in advance, with an error of just 0.27 percent. In addition, tweets about sickness, mentions of activities one needs to be healthy to do, and changes in Twitter use could be useful for monitoring a specific group of people, says Pennsylvania State University's Todd Bodnar. His team at the Center for Infectious Disease Dynamics analyzed the Twitter feeds of 104 students, and its new algorithm was able to identify with 99 percent accuracy whether a student had suffered from flu during a given month.
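The two-week lead reported comes from page-view counts correlating with case counts shifted forward in time. A minimal sketch of that lagged-correlation idea, using made-up weekly numbers rather than the study's data:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def best_lag(views, cases, max_lag):
    """Lag (in samples) at which page views best predict later cases,
    with the correlation achieved at that lag."""
    scored = [(lag, pearson(views[:len(views) - lag], cases[lag:]))
              for lag in range(max_lag + 1)]
    return max(scored, key=lambda t: t[1])

# Synthetic weekly series in which cases trail page views by 2 samples.
views = [1, 2, 5, 9, 7, 4, 2, 1, 1, 2]
cases = [0, 0, 1, 2, 5, 9, 7, 4, 2, 1]
```

Once the best lag is known, today's page-view level becomes an early estimate of the case count that lag into the future.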


A Face to Remember
Scienceline (04/17/14) Sarah Lewin

Massachusetts Institute of Technology (MIT) researchers have developed a way to change profile photos to highlight a person's most memorable facial features. The researchers, led by MIT graduate student Aditya Khosla, say the technology could change everything from Facebook profile pictures to political campaigns and advertising strategies. They found that although people could not guess which pictures would be memorable in an online test, certain images were consistently remembered better than others. The researchers analyzed 2,222 photos and determined their memorability scores to develop software that can predict the memorability of photos it has never seen before. The program identifies a wide range of tiny details that contribute to memorability but would be difficult to identify and define individually. The program makes a chain of small, random changes to a photo, creating several photos with only slight differences from the original. The researchers tested the newly changed images on volunteers using an online memory game, and found that 75 percent of the images they altered to be more or less memorable produced the desired reaction in the participants. The researchers aim to speed up the algorithm and reduce the errors and artifacts introduced in the process.
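The "chain of small, random changes" kept when they raise a learned memorability score is essentially greedy hill climbing. A generic sketch with a toy stand-in score (the real system scores facial features with its trained memorability model, not this function):

```python
import random

def hill_climb(image, score, perturb, steps=2000, seed=0):
    """Greedily keep random small changes that raise score(image)."""
    rng = random.Random(seed)
    best, best_s = image, score(image)
    for _ in range(steps):
        cand = perturb(best, rng)
        s = score(cand)
        if s > best_s:  # keep only changes that improve the score
            best, best_s = cand, s
    return best, best_s

# Stand-in "image": a list of feature values. Toy memorability score
# peaks (at 0.0) when every feature equals 1.0.
def toy_score(img):
    return -sum((v - 1.0) ** 2 for v in img)

def toy_perturb(img, rng):
    # Nudge one randomly chosen feature by a small random amount.
    i = rng.randrange(len(img))
    out = list(img)
    out[i] += rng.uniform(-0.1, 0.1)
    return out
```

Because only improving changes are kept, the result's score never falls below the original's; the researchers' remaining work on artifacts corresponds to constraining the perturbations so the edited photo stays natural.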


Brain-Derived Computing Beyond Von Neumann
Scientific Computing (04/18/14) Nages Sieslack

In an interview, Heidelberg University professor Karlheinz Meier discusses the emerging field of neuromorphic computing and Europe's ambitious Human Brain Project (HBP). "Neuromorphic computers are systems with the same massive parallelism as the brain and the same functions on the microscopic and the macroscopic level," Meier says. "Communication between cells is carried out by stereotypic action potentials that propagate through the network asynchronously and in continuous time." Meier says neuromorphic systems' most valuable quality is their ability to self-configure based on their input data. He also says the success of HBP hinges on collaboration between neurobiologists, theoretical neuroscientists, mathematicians, physicists, and engineers. Meier says HBP will follow two complementary strategies for assembling neuromorphic computers, and the project's goal is the construction and operation of six technology platforms that aggregate neuroscience data and apply it to brain models on an exascale system, from which researchers will attempt to derive very-large-scale neuromorphic computers. "By the end of the 30-month ramp-up phase, HBP plans to operate a physical model system with 4 million neurons and 1 billion synapses in Heidelberg, and a system of 0.5 million ARM cores in Manchester, UK," Meier notes.


Abstract News © Copyright 2014 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: technews@hq.acm.org
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe