Association for Computing Machinery
Welcome to the June 3, 2011 edition of ACM TechNews, providing timely information for IT professionals three times a week.


HEADLINES AT A GLANCE


Caltech Researchers Build Largest Biochemical Circuit Out of Small Synthetic DNA Molecules
California Institute of Technology (06/02/11) Marcus Woo

California Institute of Technology (Caltech) researchers say they have built the most complex biochemical circuit ever created out of DNA-based devices. "We're trying to borrow the ideas that have had huge success in the electronic world, such as abstract representations of computing operations, programming languages, and compilers, and apply them to the biomolecular world," says Caltech's Lulu Qian. Qian, who worked with Caltech professor Erik Winfree, used a new kind of DNA-based component to build the largest artificial biochemical circuit ever made. The researchers' approach involves standardized components, which enables the system to be more complex but still work reliably. "We want to make better and better biochemical circuits that can do more sophisticated tasks, driving molecular devices to act on their environment," Qian says. The researchers used pieces of DNA to make logic gates, which receive and produce molecules as signals. The molecular signals travel between specific gates, connecting the circuit as if they were wires. The largest circuit the researchers made contains 74 different DNA molecules and can compute the square root of any number up to 15 and round down the answer to the nearest integer. "Like Moore's Law for silicon electronics, which says that computers are growing exponentially smaller and more powerful every year, molecular systems developed with DNA nanotechnology have been doubling in size roughly every three years," Winfree says.
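
For illustration only, the following sketch (ordinary software, not a model of the DNA logic gates themselves) computes the same integer function the article attributes to the 74-molecule circuit: the square root of a 4-bit number, rounded down to the nearest integer.

    # Illustration only -- not the DNA implementation described above.
    import math

    def floor_sqrt_4bit(n):
        if not 0 <= n <= 15:
            raise ValueError("the circuit handles 4-bit inputs only")
        return math.isqrt(n)  # round down to the nearest integer

    for n in range(16):
        print(f"{n:2d} -> {floor_sqrt_4bit(n)}")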


E-Mail Fraud Hides Behind Friendly Face
New York Times (06/02/11) Matt Richtel; Verne G. Kopytoff; John Markoff

Spear phishing scams are rapidly spreading, one of the latest examples being a recently disclosed attempt to steal hundreds of accounts on Google's Gmail system and monitor the accounts of many prominent users. Security experts say that these efforts, which use fake emails personalized for the intended victim, are far more sophisticated than standard phishing campaigns, and thus are harder for recipients to spot. Symantec says that spear phishers' most popular targets are government agencies and senior managers and executives. To deter such attacks, Google encourages people to employ a two-step verification process that sends a special code to their mobile phone when they log into Gmail. Meanwhile, the U.S. Defense Department asks staff to use a digital signature on their emails to authenticate their identity. However, West Point computer security expert Lt. Col. Gregory Conti warns that spear phishers have the advantage of momentum, given that their forgeries can be realistic and very subtle. He says that one reason the problem is getting harder to counter is that fraudsters are using social networks to amass personal information about potential targets.


Computer Science Grads Fielding 'Multiple Job Offers'
Network World (06/01/11) Ann Bednarz

There are about two or three job openings for every computer science graduate in 2011, according to Dice.com. Today, tech unemployment is about 4 percent, while the national average is about 9 percent, according to the U.S. Bureau of Labor Statistics. Meanwhile, Moody's Analytics is predicting that about 138,000 technology jobs will be added by the end of this year. Currently, Dice.com has more than 80,000 technology and engineering jobs listed, a 60 percent increase from two years ago. The Computing Research Association (CRA) says the number of undergraduate degrees earned in U.S. computer science departments has been rising for two years, and rose 9 percent last year. CRA also reports that the total number of new computer science enrollees increased for the third straight year. However, Dice.com's Alice Hill says the number of computer science grads is still small compared to the need. Dice.com found that 18 states and Washington, D.C. have fewer graduates than job openings, giving new graduates leverage that they have not had in recent years. "If you're a computer science degree holder, this is probably the best year to negotiate hard and maximize that first starting salary," Hill says.


Phase Change Memory-Based 'Moneta' System Points to the Future of Computer Storage
UCSD News (CA) (06/02/11) Catherine Hockmuth

Researchers at the University of California, San Diego (UCSD) have developed Moneta, a phase-change memory (PCM) solid state storage device that is thousands of times faster than conventional hard drives and up to seven times faster than current solid-state drives (SSDs). Moneta uses PCM, which stores data in the crystal structure of a metal alloy called a chalcogenide. SSDs have no moving parts, which makes them more energy efficient than conventional hard disk drives. "Phase-change memory-based solid state storage devices will allow us to sift through all of this data, make sense of it, and extract useful information much faster," says UCSD professor Steven Swanson. The PCM memory chips store data by switching the alloy between a crystalline and amorphous state, using heat and an electrical current. The chips read the data with a smaller current to determine which state the chalcogenide is in. Swanson says the technology could be ready for market in a few years as improvements are made to the underlying phase-change system. "Moneta gives us a window into the future of what computer storage systems are going to look like, and gives us the opportunity now to rethink how we design computer systems in response," he says.
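
As a toy illustration of the mechanism described above (a sketch only, not UCSD's Moneta code), the snippet below models a single PCM cell: a write pulse flips the chalcogenide between its amorphous and crystalline states, and a smaller read current senses which state it is in.

    # Toy model of one phase-change memory cell -- not the Moneta hardware or driver.
    class PCMCell:
        AMORPHOUS, CRYSTALLINE = "amorphous", "crystalline"  # high / low resistance

        def __init__(self):
            self.state = PCMCell.AMORPHOUS

        def write(self, bit):
            # heat plus a large current switches the alloy's structure (simplified)
            self.state = PCMCell.CRYSTALLINE if bit else PCMCell.AMORPHOUS

        def read(self):
            # a smaller current measures resistance without changing the state
            return 1 if self.state == PCMCell.CRYSTALLINE else 0

    cell = PCMCell()
    cell.write(1)
    assert cell.read() == 1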


Mind-Controlled Computing for the Disabled
Israel21c (06/02/11) Karin Kloosterman

Researchers at Ben-Gurion University of the Negev have developed MinDesktop, a graphical user interface (GUI) designed to help the physically challenged use their thoughts to send emails, surf the Web, turn on media players, and communicate with the outside world. "One application is helping disabled people with diseases like [amyotrophic lateral sclerosis] and other muscular problems, or for somebody who is using his hands for some other operation and cannot use a keyboard or mouse," says Ben-Gurion's Rami Puzis. During testing, able-bodied subjects were able to learn a new action and type a 12-character email in about seven minutes using only their thoughts. The breakthrough is in the way the researchers hierarchically organized the system's commands in a manner that is simple for a human to learn and use. "We are using three actions that the software and the headset can give us--two is not enough--and when there are many actions you define [in the system], it becomes noisy and harder to control," Puzis says.
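
The hierarchy idea can be pictured with a small sketch. The example below is hypothetical (the action names and menu entries are invented, and it is not the MinDesktop implementation): with only three reliable actions, here called LEFT, RIGHT, and SELECT, any command in a menu hierarchy can be reached by a short sequence of actions.

    # Hypothetical sketch of three-action hierarchical selection -- not MinDesktop code.
    menu = {
        "Email": {"Compose": None, "Inbox": None},
        "Web":   {"Open page": None, "Bookmarks": None},
        "Media": {"Play": None, "Pause": None},
    }

    def navigate(tree, actions):
        """LEFT/RIGHT move the cursor; SELECT descends or picks a leaf command."""
        options, cursor = list(tree), 0
        for act in actions:
            if act == "LEFT":
                cursor = (cursor - 1) % len(options)
            elif act == "RIGHT":
                cursor = (cursor + 1) % len(options)
            elif act == "SELECT":
                chosen = options[cursor]
                if tree[chosen] is None:
                    return chosen                 # reached a leaf command
                tree, options, cursor = tree[chosen], list(tree[chosen]), 0
        return options[cursor]

    print(navigate(menu, ["RIGHT", "SELECT", "RIGHT", "SELECT"]))  # -> Bookmarks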


China Joins Democratic Governments to Discuss Cyber Security Collaboration
V3.co.uk (06/01/11) Rosalie Marshall

Cybersecurity leaders from the United States, Britain, China, France, and India recently attended the Worldwide Cybersecurity Summit and discussed how the world should cooperate on tackling the challenge of cyberterrorism. Among the themes discussed at the summit were the need for greater international collaboration, what form this collaboration should take, and striking the right balance between online freedoms and regulation. A debate between India and France focused on whether the world needs more international regulation to supply a framework for states to effectively crack down on cybercriminals, or whether national legislation is sufficient. France argued that national laws should be enough, but that nation states should collaborate to battle cyberterrorists by measuring the risk of cyberthreats and establishing international security standards for businesses. Meanwhile, India contended that a global legal regime is required to address cyberterrorism, given the borderless nature of cyberspace. India's Latha Reddy cited the need for international collaboration between countries on definitions relating to cybersecurity, best practices, and a global regime to safeguard the financial sector. Chinese officials also raised the need for international cooperation on cybersecurity. The U.S.'s Shawn Henry discussed what the United States is doing to address cybercrime by deploying its resources globally. "The key to all this is partnership," Henry said.


Making the Case for Security
Technology Review (06/02/11) Brian Krebs

Purdue University professor Eugene H. Spafford says in an interview that security must be taken more seriously at the highest organizational levels if incidents such as the recent Sony PlayStation network breach are to be avoided. "Many [information technology] organizations have grown up from the level of system administrators who started at the bottom of the organizational hierarchy," Spafford notes. "These typically are people with computer science and technical training, but they don't speak business... As a result, they are not able to present the business case for security and privacy issues." Spafford argues that without sufficient, continuous, and proactive investment in security, a costly breach is inevitable. He cites several converging factors behind the magnitude of such breaches, including the sheer number of systems and amount of data reachable over the Internet, the growing population of Web-savvy Internet users, an ever-larger pool of targets and exploiters, and breaches occurring faster than law enforcement resources and coping strategies can keep pace. Spafford warns that hacks against many nonfinancial services are especially troubling, given that such organizations store information about users and their habits, yet do not possess as strong a sense of stewardship over that data as they should.


The Million-Dollar Puzzle That Could Change the World
New Scientist (06/01/11) Jacob Aron

The single biggest problem in computer science, for which the Clay Mathematics Institute is offering a $1 million prize to whoever solves it, is determining whether P equals NP, a question of whether computation has a fundamental, innate limitation that goes beyond hardware. The complexity class NP, or non-deterministic polynomial time, comprises problems whose solutions may be difficult to find but can be verified in polynomial time. Every problem in the set P also lies in the set NP, but the core of the P=NP? question is whether the reverse also holds. If P turns out not to be equal to NP, it would demonstrate that some problems are so inherently complex that crunching through them is intractable, and proving this would offer insight into the performance of the latest computing hardware, which divides computations across multiple parallel processors, says the University of Massachusetts, Amherst's Neil Immerman. The field of cryptography also could be affected by such a proof, as even the toughest codes would be cracked by a polynomial-time algorithm for solving NP problems. On the other hand, finding a polynomial-time algorithm for any NP-complete problem would enable every NP problem to be solved in polynomial time, in effect providing a universal method of solution. This could support the creation of algorithms that execute near-perfect speech recognition and language translation, and that facilitate computerized visual information processing equal to that of humans.
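
The "easy to verify, hard to find" asymmetry at the heart of the question can be made concrete with a small, generic example (not drawn from the article) using subset sum, a standard NP-complete problem: checking a proposed certificate takes polynomial time, while the obvious search examines exponentially many candidates.

    # Generic illustration of the P-vs-NP asymmetry via subset sum.
    from itertools import combinations

    def verify(numbers, target, certificate):
        """Polynomial-time check: does the claimed set of indices hit the target?"""
        return sum(numbers[i] for i in certificate) == target

    def brute_force(numbers, target):
        """Exponential-time search: try every subset of indices (up to 2^n of them)."""
        for r in range(len(numbers) + 1):
            for subset in combinations(range(len(numbers)), r):
                if sum(numbers[i] for i in subset) == target:
                    return subset
        return None

    nums, goal = [3, 34, 4, 12, 5, 2], 9
    witness = brute_force(nums, goal)            # slow in general
    print(witness, verify(nums, goal, witness))  # the answer is fast to confirm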


Cybersecurity Research Consortium: New Tech on the Way
IDG News Service (06/01/11) Grant Gross

The Northrop Grumman Cybersecurity Research Consortium expects to deploy new technology in about a year. Organized 18 months ago, the consortium is a five-year partnership between the defense contractor and the Massachusetts Institute of Technology (MIT), Carnegie Mellon University, and Purdue University to improve mobile and cloud computing and to lower the cost of recovery from cyberattacks. Computer science researchers are examining how to use low-cost processors to perform specialized encryption tasks in a cloud environment, as well as digital watermarking technology as a way to establish the integrity of data, says Northrop Grumman's Robert Brammer. "These projects have shown the ability in a lab environment to withstand various types of cloud cyberattacks," Brammer says. "We're beginning to test these techniques on larger scale [environments]." Infrastructure security is another area of focus, and one project seeks to optimize the configuration and location of security sensors on computer networks. MIT is looking for ways to recover from attacks by returning a computer to a recently clean state. The project seeks to automate system restoration by recording a computer's history and rolling back any changes caused by an attack.
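
The record-and-roll-back idea attributed to the MIT project can be sketched in a few lines. The toy store below is only an illustration of the general technique, not the MIT system: every change is logged with its prior value so that anything done after a clean checkpoint can be undone in reverse order.

    # Hedged sketch of checkpoint-and-rollback -- not the MIT restoration system.
    class RollbackStore:
        def __init__(self):
            self.state = {}
            self.log = []                       # (key, previous value) history

        def set(self, key, value):
            self.log.append((key, self.state.get(key)))
            self.state[key] = value

        def checkpoint(self):
            return len(self.log)                # remember a clean point in history

        def rollback(self, point):
            while len(self.log) > point:        # undo changes, newest first
                key, old = self.log.pop()
                if old is None:
                    self.state.pop(key, None)
                else:
                    self.state[key] = old

    store = RollbackStore()
    store.set("/etc/hosts", "clean contents")
    clean = store.checkpoint()
    store.set("/etc/hosts", "tampered contents")   # simulated attacker change
    store.rollback(clean)
    assert store.state["/etc/hosts"] == "clean contents"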


Quantum Knowledge Cools Computers
ETH Zurich (06/01/11) Simone Ulmer

Researchers at ETH Zurich and the National University of Singapore have found that, under certain conditions, deleting data generates cold instead of heat when the memory is known "more than completely," as is the case during quantum-mechanical entanglement, which carries more information than a classical copy of the data. Landauer's Principle states that energy is always released as heat when data is deleted. "According to Landauer's Principle, if a certain number of computing operations per second is exceeded, the heat generated can no longer be dissipated," which will put supercomputers at a critical limit in the next 10 to 20 years, says ETH Zurich professor Renato Renner. However, the new study shows that when the deleted data is entangled with a quantum memory, the deletion operation becomes a reversible process; Landauer's Principle, restated in terms of entropy, still holds, but the heat it demands can vanish or even turn negative. The researchers proved this mathematically by combining the entropy concepts of information theory and thermodynamics. "We have now shown that the notion of entropy actually describes the same thing in both cases," Renner says. The results show that in a quantum computer the relevant entropy would be negative, meaning that heat would be withdrawn from the environment and the machine would cool down.
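
For reference, the classical bound in Landauer's Principle puts the minimum heat released per erased bit at k_B * T * ln 2. The short calculation below evaluates that textbook figure at room temperature; it is background for the article, not part of the ETH analysis.

    # Textbook Landauer bound at room temperature (background only).
    import math

    k_B = 1.380649e-23            # Boltzmann constant, J/K
    T = 300.0                     # room temperature, K

    landauer_limit = k_B * T * math.log(2)
    print(f"Minimum heat per erased bit at {T:.0f} K: {landauer_limit:.2e} J")
    # Roughly 2.9e-21 J; the ETH result says this heat can vanish or even become
    # negative when the deleted data is entangled with a quantum memory.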


Researchers Map, Measure Brain's Neural Connections
Brown University (06/01/11) Richard Lewis

Brown University researchers have developed software for studying the neural circuitry in human brains. The software creates two-dimensional (2D) neural maps that can be viewed through Web-based digital map interfaces and combined with three-dimensional (3D) images. "In short, we have developed a new way to make 2D diagrams that illustrate 3D connectivity in human brains," says Brown professor David Laidlaw. "You can see everything here that you can't really see with the bigger (3D) images." The images are created using a medical imaging protocol that measures the water diffusion around nerves and myelin in the brain. The 2D maps can identify spots where the myelin could be compromised. Users also can export the 2D images to the Web using Google Maps. "The advantage of using this mode of distribution is that users don't have to download a large dataset, put it in the right format, and then use a complicated software to try and look at it, but can simply load a Web page," says Brown's Radu Jianu.


Coming to a Network Near You: Faster Wi-Fi
CNet (05/31/11) Stephen Shankland

The current fastest Wi-Fi standard is 802.11n, with a maximum speed of 450 Mbps, but two new versions are in development: 802.11ac at 1 Gbps and 802.11ad at 7 Gbps, speeds sufficient to open up the wireless streaming video market within the next two years. 802.11ac uses the 5 GHz frequency spectrum exclusively, while 802.11ad uses the 60 GHz spectrum, where there is substantially more space to pack different communication channels side by side. Analyst Jagdish Rebello says that although hundreds of megahertz are available at 60 GHz, "the flip side is signals do not propagate far at all." Therefore, it appears that 802.11ac will be the heir to mainstream network access technology used in homes, businesses, and public Wi-Fi hot spots, according to the Wi-Fi Alliance's Kelly Davis-Felner. Like 802.11ac, 802.11ad carries the promise of video, but it also has sufficient capacity for uncompressed high-definition video. The Wireless Gigabit Alliance envisions the technology as a wireless alternative to the DisplayPort connector used to plug monitors into computers. A major market push for the new Wi-Fi standards will be linking people using different systems in a multiplayer game, although analysts question whether the standards will address the latency issue. However, the most pressing issue for 802.11ac and 802.11ad may be getting all stakeholders to agree on standards.


The Largest Telescope Ever Built Will Rely on Citizens to Analyze Its Reams of Data
Popular Science (05/27/11) Clay Dillow

To prove that they can handle the Square Kilometer Array (SKA), the world's biggest and most sensitive radio telescope, researchers from Australia and New Zealand plan to launch a huge cloud computing initiative that will replicate the data flow required to run the giant telescope. The SKA will consist of 3,000 radio dishes spread as far as 2,000 miles from the central core. The telescope will generate such a massive volume of data that SKA could need data links with a capacity greater than that of the current Internet. To support the SKA, Australia is investing $80 million into the Pawsey Center supercomputing hub, which will be petaflop-capable and the third fastest supercomputer in the world when it goes online in 2013. However, even that capacity may be insufficient, so researchers plan to use cloud computing to distribute the data across thousands of computers and mainframes at universities and institutes around the world. The researchers say the plan would solve the problem of constantly adding computing capacity and it would engage the public and the global academic community.


Abstract News © Copyright 2011 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]