Association for Computing Machinery
Welcome to the March 22, 2013 edition of ACM TechNews, providing timely information for IT professionals three times a week.

ACM TechNews mobile apps are available for Android phones and tablets, for iPhones, and for iPads.

HEADLINES AT A GLANCE


IBM Moves Toward Post-Silicon Transistor
IDG News Service (03/21/13) Joab Jackson

IBM researchers have developed a method for creating transistors that could be fashioned into virtual circuitry that mimics how the human brain operates. The transistors are made from strongly correlated materials that have characteristics favorable for building more powerful and energy-efficient circuitry. Although the materials should act like conductors, they are actually insulators. "They don't obey conventional band theory," and under certain conditions they can change their conductive states, says IBM researcher Stuart Parkin. In IBM's approach, electrons are introduced through contact with an ionic liquid consisting of large, irregularly shaped molecules. When a voltage is applied to the liquid, and the liquid is placed on an oxide material, the material can change from a conductor to an insulator. Parkin says this approach should be more energy-efficient than standard silicon transistors because the resulting transistors would not need to be constantly refreshed with a power source to maintain their state. "The scaling of conventional-based transistors is nearing an end after a fantastic run of 50 years," he says. "We need to consider alternative devices and materials that will have to operate entirely differently."


A Strange Computer Promises Great Speed
New York Times (03/21/13) Quentin Hardy

Lockheed Martin is preparing to become the first company to incorporate a quantum computer in its business operations. The company bought an early version of a D-Wave Systems quantum computer two years ago, and Lockheed says the technology is now ready to help the firm solve some science and business problems millions of times faster than can be done with conventional computers. The company will use the quantum computer to create and test complex radar, space, and aircraft systems, says Lockheed's Ray Johnson. "This is a revolution not unlike the early days of computing," Johnson says. "It is a transformation in the way computers are thought about." The technology also could be used to determine the behavior of proteins in the human genome, a bigger and tougher problem than sequencing the genome. Quantum researchers "are taking a step out of the theoretical domain and into the applied," says Microsoft researcher Peter Lee. In Lockheed's system, a quantum computing processor made from a lattice of tiny superconducting wires is chilled close to absolute zero. It is then programmed by loading a set of mathematical equations into the lattice. The approach, known as adiabatic quantum computing, has been shown to have promise in applications such as calculating protein folding.


Hackers Can Be Battlefield Targets, Says NATO Report
The Hill (03/21/13) Carlo Munoz

The North Atlantic Treaty Organization (NATO) is releasing a cyberwarfare handbook, the Tallinn Manual, stating that civilian hackers who carry out cyberattacks during coordinated military campaigns can be targeted as combatants. With the goal of applying existing battlefield rules to the Internet, NATO's think tank, the Cooperative Cyber Defence Centre of Excellence, commissioned the report, with input from the U.S. Cyber Command and the International Committee of the Red Cross. The handbook limits the types of networks that NATO members and their allies can strike, with civilian targets such as hospitals, dams, and nuclear power stations disallowed. The handbook provides for "proportionate countermeasures" in response to a strike, with lethal force against hackers allowed only in response to cyberattacks that result in death or significant property damage. "You can only use force when you reach the level of armed conflict," says the handbook's lead author, U.S. Naval War College professor Michael Schmitt. "Everyone talks about cyberspace as though it's the wild west. We discovered that there's plenty of law that applies to cyberspace."


Gender Gap in Tech Salaries Is All Gone, Dice Reports
InfoWorld (03/20/13) Katherine Noyes

Women now make as much as men in the information technology (IT) field, according to the latest salary survey from Dice. The survey found that since 2009, the average salaries of men and women technology professionals have been equal, as long as they have the same level of experience and education and parallel job titles. However, overall, men out-earned women in the 2012-2013 Dice Salary Survey, earning an annual income of $95,929, compared with $87,527 for women. Of the top five tech positions by gender, project manager was the only title shared by both men and women. Project manager was the top position for women, followed by business analyst, other IT, quality assurance tester, and technical recruiter. For men, software engineer held the top spot, followed by systems administrator, project manager, IT management, and applications developer. The survey also found that 58 percent of women were satisfied with their salaries, which is comparable to the 56 percent among their male counterparts. "When it comes to technology employment, it's a skills-driven marketplace," says Dice's Tom Silver. "The ability to apply that know-how to a given problem remains the core of employment--why tech professionals get hired and how they are compensated."


Georgia Tech Computer System Predicts NCAA Basketball Champion
Georgia Tech News (03/20/13) Jason Maderer

Georgia Tech’s Logistic Regression/Markov Chain (LRMC) computerized college basketball ranking system predicts the University of Florida will be the winner of this year's NCAA men's college basketball tournament. LRMC predicts that Florida, Louisville, Indiana, and Gonzaga are the teams most likely to advance to the Final Four, with Florida and Gonzaga playing for the championship. During the season, LRMC uses scoreboard data to create a weekly ranking of all 347 Division I NCAA teams. The computer system analyzes every game and factors in the margin of victory and where each game is played. "Our system combines the aspects of performance and strength of schedule by rewarding game performance differently according to the quality of each opponent," says Georgia Tech professor Joel Sokol. Last year, the LRMC team presented a paper showing that LRMC has been the most accurate predictive ranking system over the last 10 years, outperforming more than 80 others, including the NCAA’s Ratings Performance Index (RPI). "Compared to something like RPI, LRMC is able to predict which team is better by taking the margins of victories and losses into account," Sokol says.
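The ranking idea described above can be sketched as a Markov chain over teams whose stationary distribution gives each team's strength. The sketch below is a deliberately simplified illustration, not Sokol's actual fitted LRMC model: the game data is hypothetical, and the margin-to-probability map is a crude linear stand-in for LRMC's logistic-regression fit.

```python
# Toy scoreboard data: (winner, loser, margin of victory). Hypothetical
# games for illustration only; not real NCAA results.
games = [("A", "B", 12), ("A", "C", 3), ("B", "C", 7), ("C", "D", 20), ("B", "D", 5)]
teams = sorted({t for g in games for t in g[:2]})
idx = {t: i for i, t in enumerate(teams)}
n = len(teams)

# Markov chain over teams: from the loser of each game, move to the winner
# with probability p (growing with the margin); from the winner, move to the
# loser with probability 1 - p. LRMC fits this margin-to-probability map by
# logistic regression; here we use a crude capped linear map.
P = [[0.0] * n for _ in range(n)]
counts = [0] * n
for w, l, m in games:
    p = 0.5 + 0.45 * min(m, 20) / 20
    iw, il = idx[w], idx[l]
    P[il][iw] += p
    P[il][il] += 1 - p
    P[iw][il] += 1 - p
    P[iw][iw] += p
    counts[il] += 1
    counts[iw] += 1
for i in range(n):
    for j in range(n):
        P[i][j] /= counts[i]      # normalize each row into a distribution

# Power iteration: the stationary distribution is the ranking strength.
r = [1.0 / n] * n
for _ in range(1000):
    r = [sum(r[i] * P[i][j] for i in range(n)) for j in range(n)]
ranking = [t for _, t in sorted(zip(r, teams), reverse=True)]
print(ranking)
```

On this toy data, team A (undefeated) ranks first and team D (winless) last; margin of victory, not just win-loss record, shapes the ordering in between.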


HTTPS Security Encryption Flaws Found
InformationWeek (03/19/13) Mathew J. Schwartz

Security researchers have uncovered weaknesses that could be exploited in certain types of encrypted Web communications. The weaknesses lie in the RC4 encryption algorithm, which is frequently used to secure SSL/TLS communications within secure Web pages. The flaws were recently disclosed by University of Illinois at Chicago professor Dan Bernstein at the Fast Software Encryption conference. His presentation was based on research conducted with colleagues at the University of London and the Eindhoven University of Technology. The researchers found that RC4's output is not sufficiently random, so an attacker could recover some plaintext from a communication secured using TLS and RC4. Such attacks could occur "from statistical flaws in the keystream generated by the RC4 algorithm, which become apparent in TLS ciphertexts when the same plaintext is repeatedly encrypted at a fixed location across many TLS sessions," the researchers said. They also noted that about "50 percent of all TLS traffic is currently protected using the RC4 algorithm. It has become increasingly popular because of recent attacks on CBC-mode encryption on TLS, and is now recommended by many commentators."
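The "statistical flaws in the keystream" are easy to observe empirically. The sketch below implements the standard RC4 key-scheduling and output-generation steps and demonstrates one well-known single-byte bias (the second keystream byte equals zero roughly twice as often as chance); it is an illustration of the class of biases involved, not a reconstruction of the researchers' full attack.

```python
import os

def rc4_keystream(key: bytes, n: int) -> bytes:
    # Key-scheduling algorithm (KSA)
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA)
    i = j = 0
    out = bytearray()
    for _ in range(n):
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(S[(S[i] + S[j]) % 256])
    return bytes(out)

# Across many sessions (fresh random keys), the second keystream byte is 0
# about twice as often as a uniform byte would be (~1/128 vs. 1/256). A
# plaintext byte repeatedly encrypted at that fixed position therefore
# leaks statistically, which is the shape of the attack described above.
trials = 20000
zeros = sum(rc4_keystream(os.urandom(16), 2)[1] == 0 for _ in range(trials))
print(zeros, trials / 256)   # observed zero count vs. uniform expectation
```

The observed count lands near 2 x trials/256, far above what a truly random keystream would produce.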


Can Control Theory Make Software Better?
MIT News (03/19/13) Larry Hardesty

Researchers at the Massachusetts Institute of Technology and Georgia Tech recently demonstrated the application of control theory principles to formal verification, in a method that could benefit approximate computation. The researchers adapted the Lyapunov function used in control theory, which has a standard graph that slopes everywhere toward its minimum value. If the Lyapunov function represents a physical system's dynamics and the minimum value depicts a stable state, then the curve of the graph guarantees the system will move toward increased stability. The researchers imagine a computer program as a set of rules for navigating a space defined by the program's variables and the program instructions' memory locations, with execution problems such as overloading the memory being regions in the space. Formal verification then requires proof that the program's variables will never reach danger zones. The researchers believe their method will be effective at least for control-system software and that it has possible applications in approximate computation. “This is the first work that is formalizing the notion of robustness for software," says University of Pennsylvania's George Pappas. "It’s a paradigm shift...to a view where you try to see how robust your software is to changes in the input or the internal state of computation.”
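The Lyapunov idea can be made concrete with a toy example. Below, a hypothetical one-line "program" halves its state each iteration, and a candidate Lyapunov function V(x) = x^2 slopes everywhere toward its minimum; checking that V strictly decreases across each step certifies the execution is driven toward the stable state. The update rule and the sampling-based check are illustrative assumptions; the actual MIT/Georgia Tech method proves such conditions formally over real program semantics.

```python
import random

# Program under analysis: one loop iteration halves x (a hypothetical toy
# update rule standing in for a real program's dynamics).
def step(x):
    return 0.5 * x

# Candidate Lyapunov function: slopes everywhere toward its minimum at 0.
def V(x):
    return x * x

# If V(step(x)) < V(x) for every reachable nonzero state, execution is
# driven toward the stable region and can never wander into a "danger
# zone". A verifier would prove this symbolically; we just spot-check it.
samples = (random.uniform(-100, 100) for _ in range(10000))
ok = all(V(step(x)) < V(x) for x in samples if x != 0)
print(ok)
```

Here V(step(x)) = 0.25 x^2 < x^2 for all nonzero x, so the check passes; for a genuine guarantee the inequality must be established for all states, not merely sampled ones.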


NSF Supports U.S. Participation in the Launch of International Effort Aimed at Making Data Easier to Share Among Researchers
National Science Foundation (03/18/13)

The U.S. government is helping to launch the new international Research Data Alliance (RDA), an interdisciplinary organization that will seek to accelerate data-driven innovation through research data sharing and exchange. The official launch was the first plenary session, March 18-20, in Sweden. Computational scientists and experts from multiple disciplines will look to push the discussion forward on removing barriers to sharing research data and stimulating more interaction and development within the data community. Rensselaer Polytechnic Institute professor Francine Berman and Indiana University professor Beth A. Plale will lead the U.S. involvement in RDA. The National Science Foundation (NSF) is supporting U.S. participation through a $2.5-million grant that promotes the coordination and development of infrastructure for data sharing. RDA currently is working to build community interest, garner international recognition for the importance of its goals, and grow its membership. "The establishment of RDA promises to break through inertia by 'just doing it'--that is, RDA supports mechanisms that enable data researchers and scientists to quickly adopt best practices and share and exchange data," says NSF's Alan Blatecky.


Quantum Computer Could Solve Prime Number Mystery
New Scientist (03/18/13) Jacob Aron

The University of Barcelona's Jose Latorre and the Autonomous University of Madrid's German Sierra say they have devised the first quantum algorithm to count prime numbers. The researchers say the algorithm could be used with an 80-qubit computer to break the record for counting primes. Such a computer would be bigger than any that exist today, but would be much smaller than the 1,000-qubit computer needed for Shor's quantum algorithm to beat the record for factorization. "The idea is simple, elegant, and surprisingly powerful," says the Massachusetts Institute of Technology's Seth Lloyd. "It would be a good thing to do with a small quantum computer, if you had one." The Clay Mathematics Institute offers $1 million in prize money to solve a puzzle related to prime numbers, and the researchers' quantum algorithm could be used to win the competition.


Where Siri Has Trouble Hearing, a Crowd of Humans Could Help
Technology Review (03/18/13) Jessica Leber

University of Rochester researchers have developed Scribe, a rapid-fire crowdsourcing program that they say could provide new ways to enhance voice-recognition applications. Scribe's algorithms direct human workers to type out fragments of what they hear in a speech. The program can direct workers to unique but overlapping sections of a speech and then give them a few seconds to recover before asking them to type again. Scribe uses natural-language processing algorithms to string together the typed-out fragments into a complete transcript, and can produce a transcript of captions with a delay as short as three seconds using as few as three words. The researchers tested Scribe with workers from Amazon's Mechanical Turk. The crowdsourced work appears to be only slightly less accurate than that of a professional stenographer, according to Rochester professor Jeffrey Bigham. "What Scribe is starting to show is the ability to work together as part of a crowd to do very difficult performance tasks better than a person can do alone," Bigham says.
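Stitching overlapping fragments into one transcript can be sketched with a greedy suffix-prefix merge. The sketch below is a simplified single-pass illustration using hypothetical fragments; Scribe's actual merging relies on natural-language processing over many workers' noisy, overlapping input.

```python
# Greedy fragment stitching: find the longest word-level suffix of one
# fragment that is also a prefix of the next, and join them there.
def merge(a: str, b: str) -> str:
    aw, bw = a.split(), b.split()
    for k in range(min(len(aw), len(bw)), 0, -1):
        if aw[-k:] == bw[:k]:
            return " ".join(aw + bw[k:])
    return " ".join(aw + bw)   # no overlap found: just concatenate

# Hypothetical overlapping fragments typed by three different workers.
frags = ["the quick brown fox", "brown fox jumps over", "jumps over the lazy dog"]
out = frags[0]
for f in frags[1:]:
    out = merge(out, f)
print(out)
```

The deliberate overlap between workers' assigned sections is what makes this stitching possible: each seam is recovered from the words two fragments share.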


Laser-Like Photons Signal Major Step Towards Quantum ‘Internet’
University of Cambridge (03/19/13)

In a discovery that brings the quantum Internet closer to reality, Cambridge University researchers have found a new way to generate single photons with tailored properties from solid-state devices that are identical in quality to lasers. “Our results in particular suggest that multiple distant qubits in a distributed quantum network can share a highly coherent and programmable photonic interconnect that is liberated from the detrimental properties of the chips," says research leader Mete Atature. "Consequently, the ability to generate quantum entanglement and perform quantum teleportation between distant quantum-dot spin qubits with very high fidelity is now only a matter of time." The researchers created a semiconductor Schottky diode device with individually addressable quantum dots, which served as a photon source. Quantum dot transitions were used to generate single photons through the group's resonance fluorescence technique. In quantifying the likeness of the photons to lasers with respect to coherence and waveform, the group found they were identical. Protocols already have been proposed for quantum computing and communication that depend on this photon-generation scheme, and the discovery can be applied to other single photon sources. “We are at the dawn of quantum-enabled technologies, and quantum computing is one of many thrilling possibilities,” Atature says.


Record Simulations Conducted on Lawrence Livermore Supercomputer
Lawrence Livermore National Laboratory (03/19/13) Breanna Bishop

Lawrence Livermore National Laboratory (LLNL) researchers say they have conducted unprecedented simulations using all 1,572,864 cores of the Sequoia supercomputer. Sequoia is based on IBM's BlueGene/Q architecture, and is the first machine to exceed 1 million computational cores as well as the world's second-fastest supercomputer, operating at 16.3 petaflops. The researchers say the simulations are the largest particle-in-cell code simulations, by number of cores, ever performed. High-performance computers such as Sequoia enable researchers to follow the simultaneous evolution of tens of billions to trillions of individual particles in highly complex systems. LLNL's Frederico Fiuza conducted the simulations to examine the interaction of ultra-powerful lasers with dense plasmas in a proposed method to generate fusion energy in a lab setting. The method, known as fast ignition, involves lasers capable of delivering more than a petawatt of power in a fraction of a billionth of a second to heat compressed deuterium and tritium fuel to temperatures exceeding 50 million degrees Celsius. "This means that a simulation that would take an entire year to perform on a medium-size cluster of 4,000 cores can be performed in a single day," Fiuza says.
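Fiuza's year-to-a-day comparison checks out arithmetically, assuming ideal linear scaling from 4,000 cores to Sequoia's full core count:

```python
sequoia_cores = 1_572_864   # all of Sequoia's cores
cluster_cores = 4_000       # the medium-size cluster in Fiuza's comparison

speedup = sequoia_cores / cluster_cores   # ~393x under ideal linear scaling
days = 365 / speedup                      # one year of cluster time on Sequoia
print(round(speedup), round(days, 2))
```

One year of cluster work collapses to just under a day, matching the quote; real codes rarely scale perfectly linearly, so the demonstration of efficient scaling to all 1.57 million cores is the notable result.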


Encryption Chip Fights Off Sneak Attacks
IEEE Spectrum (03/18/13) Yu-Tzu Chiu

National Chiao Tung University (NCTU) researchers have created a processor that could improve the security of cloud-based services by preventing side-channel attacks. Side-channel attacks analyze computation time, power consumption, and electromagnetic emissions to steal cryptographic keys or information about cloud server operations. Cloud computing uses various virtual machines that share physical resources such as processors on a single computer. Malware on a virtual machine could track resource behavior to determine the cryptographic keys used by another virtual machine on the same physical server. The NCTU team focused on processors that perform elliptic-curve cryptography, which is gaining popularity because the approach requires much shorter cryptographic keys than alternative public-key cryptography methods. The NCTU chip makes side-channel attacks so lengthy as to be infeasible, with researchers estimating it would take five or six years for a side-channel attacker to acquire information. The chip uses a heterogeneous dual-processing-element architecture, balancing computation time with power consumption depending on the operation type. The chip also features a digitally controlled oscillator to increase its ability to generate random numbers. The design stops hackers from detecting telltale changes in computation length that could reveal cryptographic keys by making computation time for different types of work appear equal.
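The control-flow principle behind equalizing computation time can be illustrated in a few lines. The sketch below uses modular exponentiation as a stand-in for elliptic-curve scalar multiplication: a naive square-and-multiply does extra work only when a secret bit is 1 (a timing leak), while the "multiply-always" variant performs identical operations every iteration. This is an illustration of the general technique, not the NCTU chip's design, and Python big-integer arithmetic is not itself constant-time; real defenses require hardware-level care like the chip described above.

```python
def modexp_leaky(base, exp, mod):
    r = 1
    for bit in bin(exp)[2:]:
        r = r * r % mod
        if bit == "1":            # data-dependent branch -> timing leak
            r = r * base % mod
    return r

def modexp_always(base, exp, mod):
    r = [1, 1]
    for bit in bin(exp)[2:]:
        r[0] = r[0] * r[0] % mod  # same operations for every key bit:
        r[1] = r[0] * base % mod  # the multiply happens unconditionally,
        r[0] = r[int(bit)]        # and a select picks the needed value
    return r[0]

print(modexp_always(7, 123456789, 1000003) == pow(7, 123456789, 1000003))
```

Both functions compute the same result, but only the second has per-bit work that is independent of the secret exponent, which is the property side-channel attackers exploit when it is absent.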


Abstract News © Copyright 2013 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe