Association for Computing Machinery
Welcome to the November 4, 2011 edition of ACM TechNews, providing timely information for IT professionals three times a week.

ACM TechNews mobile apps are available for Android phones and tablets, iPhones, and iPads.


Japan Pushes World's Fastest Computer Past 10 Petaflop Barrier
Wired News (11/02/11) Cade Metz

Researchers at Fujitsu and RIKEN announced that Japan's K Computer is capable of 10.51 quadrillion floating point operations per second, breaking the 10 petaflop barrier for the first time. The K Computer will be named the world's fastest supercomputer in the next Top 500 list, which will be released Nov. 14, 2011, says University of Tennessee professor Jack Dongarra, who oversees the list. Fujitsu built the cluster using its SPARC64 VIIIfx processors, which are specifically designed for high-performance computing. The K Computer includes 864 server racks and more than 88,000 interconnected central processing units. According to the Linpack benchmark, the K Computer sustains about 93 percent of its theoretical peak speed. IBM and Cray are working with the U.S. Department of Energy to build a 20 petaflop machine, which should go live sometime in 2012. The next major milestone for the supercomputing industry is an exascale computer, which would be about 100 times faster than the K Computer. "We'll see hardware capable of producing exascale calculations in 2017 or 2018," says Cray's Margaret Williams.
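The figures quoted above fit together with simple arithmetic. A quick sketch using only the numbers reported in the article (the implied peak is derived from the 93 percent efficiency figure, not stated directly):

```python
# Back-of-envelope check of the K Computer figures from the article.
sustained_pflops = 10.51      # measured Linpack result, in petaflops
efficiency = 0.93             # "about 93 percent" of theoretical peak

implied_peak = sustained_pflops / efficiency
print(f"Implied peak: {implied_peak:.2f} petaflops")

exaflop_in_pflops = 1000.0    # 1 exaflop = 1,000 petaflops
print(f"Exascale speedup over K: {exaflop_in_pflops / sustained_pflops:.0f}x")
```

The second figure shows where the article's "100 times faster" claim comes from: an exaflop machine would be roughly 95 times faster than the K Computer's sustained Linpack result.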

Students Trade Hacking for U.S. Scholarships
Bloomberg (11/03/11) Heather Perlberg

The United States is boosting its cybersecurity efforts by offering scholarships to software developers in exchange for a commitment to work for the government. The government is currently proposing to increase the budget for its Scholarship for Service program by 67 percent to $25 million in 2012, according to the U.S. National Science Foundation, which administers the program. Since the program began in 2001, about 1,500 graduates have joined 140 agencies, including the National Security Agency, the Central Intelligence Agency, and the Department of Homeland Security. The scholarship program sponsors about 300 students, covering full tuition for up to two years, lodging, books, and a stipend of up to $30,000. The vast majority of the students fulfill their two-year work obligation to the government, with just a few being bought out by private companies such as Microsoft. Although the government's goal is to retain as many skilled professionals as possible, the U.S. benefits from having more cybersecurity experts, regardless of which sector they end up working for. The government is training people who know the technology and can teach it and carry it wherever they go, says the Department of Health and Human Services' Patrick Kelly.

IBM Open-Sources 'Internet of Things' Protocol
Read Write Web (11/02/11) Scott M. Fulton III

IBM is collaborating with Eurotech to donate a complete protocol for asynchronous inter-device communication to the Eclipse Foundation. The Message Queuing Telemetry Transport (MQTT) protocol could potentially enable an Internet of things in which digitally empowered devices send messages to one another. In its final draft proposal, IBM notes that MQTT could enable a wide range of machine-to-machine messaging solutions. For example, IBM says the protocol could enable public and private transit systems "to monitor ... critical alerts, adjusting their routes and even notifying commuters and customers of alternative routes, transportation, lodging, and meals. Social networks could subscribe, allowing residents and commuters to interact, adapt, and even provide feedback and status to the city." The draft proposal also contends that current HTTP-based Web services protocols are inadequate and require adaptation in the context of machine-to-machine communication. "Open source messaging components ... will have to work equally well across the constrained networks and embedded platforms that are inherent to [the] physical world of machine-to-machine systems," the draft says. "This will enable a paradigm shift from legacy point-to-point protocols and the limitations of protocols like [simple object access protocol] or HTTP into more loosely coupled yet determinable models."
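Central to MQTT's loosely coupled model is its hierarchical topic scheme: publishers send to topics such as `city/transit/route42/alerts`, and subscribers use the protocol's `+` (single-level) and `#` (multi-level) wildcards to receive whole families of messages. A minimal sketch of that topic-filter matching (the function and the transit topic names are illustrative, not part of IBM's donated code):

```python
def topic_matches(filter_str: str, topic: str) -> bool:
    """Check an MQTT-style topic against a subscription filter.

    '+' matches exactly one topic level; '#' (valid only as the
    final filter level) matches the remainder of the topic.
    """
    f_levels = filter_str.split("/")
    t_levels = topic.split("/")
    for i, f in enumerate(f_levels):
        if f == "#":                       # multi-level wildcard
            return True
        if i >= len(t_levels):
            return False
        if f != "+" and f != t_levels[i]:  # literal level must match
            return False
    return len(f_levels) == len(t_levels)

# A transit system might publish alerts per route while a social
# network subscribes broadly (topic names are hypothetical):
print(topic_matches("city/transit/+/alerts", "city/transit/route42/alerts"))  # True
print(topic_matches("city/transit/#", "city/transit/route42/delays/major"))   # True
print(topic_matches("city/transit/+/alerts", "city/transit/route42/delays"))  # False
```

Because subscribers match on topic patterns rather than on publisher addresses, devices never need to know about one another directly, which is the "loosely coupled yet determinable" property the draft proposal describes.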

3D Chips: The Next Electronics Revolution
Computerworld (11/02/11) Lamont Wood

The semiconductor industry is moving toward three-dimensional (3D) chip design, stacking dies and moving data from one layer to another. These designs connect the stacked dies with through-silicon vias (TSVs), making the interconnects between devices microns rather than millimeters long, which cuts signal latency by orders of magnitude and reduces power consumption. In fact, consultant Herbert Reiter says developing 3D chips that consume 50 percent less power than two-dimensional (2D) chips is an achievable goal. Sematech's Sitaram Arkalgud notes that layering memory on top of chips is widely viewed as another potential application for 3D technology. He says the first generation of volume-production 3D devices should appear in 2013. However, although 3D chips may not reach the market until 2013, a variant of the technology, known as 2.5D, already is in production. The 2.5D technology has certain characteristics that make it more attractive than 3D, which still has issues with heat dissipation and interference caused by the copper in the TSVs, says Xilinx's Liam Madden.

Major Breakthrough Improves Software Reliability and Security
Columbia University (11/02/11)

Columbia University researchers have developed Peregrine, software designed to improve the reliability and security of multithreaded computer programs. "Our main finding in developing Peregrine is that we can make threads deterministic in an efficient and stable way: Peregrine can compute a plan for allowing when and where a thread can 'change lanes' and can then place barriers between the lanes, allowing threads to change lanes only at fixed locations, following a fixed order," says Columbia professor Junfeng Yang. "Once Peregrine computes a good plan without collisions for one group of threads, it can reuse the plan on subsequent groups to avoid the cost of computing a new plan for each new group." The researchers say the program gets at the root cause of software problems, enabling Peregrine to address all of the issues that are caused by nondeterminism. They note that Peregrine can handle data races or bugs, is very fast, and works with current hardware and programming languages.
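Yang's "lanes and barriers" analogy describes schedule-replay determinism: compute one valid interleaving, then force every run to follow it. A minimal sketch of that general idea (an illustration only, not Peregrine's implementation; all names are invented), using semaphores as the "barriers" that pass the turn from thread to thread:

```python
# Sketch: enforcing a precomputed thread schedule so that every run
# interleaves identically, regardless of how the OS schedules threads.
import threading

def run_with_schedule(schedule, work):
    """Run work(tid) once per appearance of tid in `schedule`,
    forcing exactly that interleaving. Returns the execution log."""
    gates = {tid: threading.Semaphore(0) for tid in set(schedule)}
    log, lock, pos = [], threading.Lock(), [0]

    def worker(tid, steps):
        for _ in range(steps):
            gates[tid].acquire()                       # wait for our turn
            with lock:
                log.append(work(tid))
                pos[0] += 1
                if pos[0] < len(schedule):
                    gates[schedule[pos[0]]].release()  # hand turn to next thread

    steps = {tid: schedule.count(tid) for tid in set(schedule)}
    threads = [threading.Thread(target=worker, args=(tid, n))
               for tid, n in steps.items()]
    for t in threads:
        t.start()
    gates[schedule[0]].release()                       # open the first gate
    for t in threads:
        t.join()
    return log

# The same plan can be reused run after run; the log never varies.
plan = ["A", "B", "A", "B"]
print(run_with_schedule(plan, lambda tid: f"step-{tid}"))
# ['step-A', 'step-B', 'step-A', 'step-B'] on every run
```

This captures the reuse point from the article: once a collision-free plan exists, replaying it is cheap, and the output is the same on every execution.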

GPUs, Low-Power Pave Path to Exascale Supercomputing
EE Times (11/03/11) Sylvie Barak

Although supercomputing is advancing, the path to exascale computing is blocked by power efficiency, cost, and data security limitations. Nevertheless, reaching exascale computing is a concrete goal for the industry, one that could be reached by 2018, says Gartner Research's Carl Claunch. "Most supercomputer buyers are being constrained more by budget than by anything else," Claunch says. He notes that despite the great strides in computer modeling and elasticity in high-performance computing (HPC), the drive for faster, smaller, and less expensive systems remains strong. The United States, China, and Japan have been the most aggressive in the pursuit of exascale systems, investing hundreds of millions of dollars into research. But despite that investment, many potential supercomputer clients are still being constrained by power restrictions or even floor space limitations, according to Claunch. In addition, "[graphics processing units (GPUs)] are a very important part of heterogeneous computing in terms of alleviating the bottlenecks," says Supermicro's Don Clegg. Cray's Margaret Williams sees an increasing role for GPUs in future supercomputing systems.

Security Expert Warns of Cyber World War
Fox News (11/01/11)

Cybersecurity expert Eugene Kaspersky warns that the threat of a cyberterrorist attack with "catastrophic consequences" is a real and present danger made all the more probable in a world that is nearly in a state of cyberwar. "There is already cyberespionage, cybercrime, and hacktivism--soon we will be facing cyberterrorism," he says. U.K. Prime Minister David Cameron, speaking at the recent London Cyber Conference, also stressed the potency of the cyberwar threat. "Every day we see attempts on an industrial scale to steal government secrets--information of interest to nation states, not just commercial organizations," he said. "Highly sophisticated techniques are being employed." Cameron promised that the response to these threats will be as strong as the response to any other threat to national security. The United States and Britain used the conference to outline principles that they hope will form the basis of international cooperation in Web governance, wherein states would work collaboratively on issues such as security and copyright protection without imposing additional strictures on users. U.K. Foreign Secretary William Hague noted that, whatever disagreements arise, the rapid development of the Internet means that discussions of its future and governance must migrate to an international platform.

Screen-Spy Program Can Read Texts and Emails
New Scientist (11/02/11) Melissae Fellet

Sneaky snoops could steal private text messages or sensitive email from mobile devices used in a public space, up to 60 meters away, according to researchers from the University of North Carolina at Chapel Hill. To prove that such an attack is realistic, the team developed iSpy, software that can pick up text messages remotely, using only known techniques. Exploiting the magnified keys feature of smartphones, the software analyzes video footage and identifies letters based on how they pop up in larger bubbles on small touchscreens when pressed. The program assigns an accuracy probability to each detected letter, and correctly identifies them more than 90 percent of the time. The software then identifies words, both individually and in the context of the message being sent; to capture passwords, it collects letters and does not perform any word recognition. The program can identify messages from video taken with an ordinary mobile phone camera from three meters away, and from video taken with a digital single-lens reflex camera from 12 meters. "We were surprised at how well that worked," says researcher Jan-Michael Frahm.
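The distinction between messages and passwords matters because per-letter errors compound with length. A back-of-envelope calculation (the 90 percent per-letter figure is from the article; treating errors as independent is an assumption of this sketch):

```python
# Why iSpy needs word context for long inputs: at 90% per-letter
# accuracy with independent errors, the chance of capturing an
# entire string exactly falls off quickly with its length.
p_letter = 0.9
for length in (4, 8, 12):
    p_exact = p_letter ** length
    print(f"{length:2d} letters: {p_exact:.0%} chance of an exact capture")
```

For ordinary text, a language model can repair most of those per-letter misses from context; for passwords, where no word recognition applies, every letter has to be read correctly on its own.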

Researchers Defeat CAPTCHA on Popular Websites
IDG News Service (11/01/11) Lucian Constantin

Stanford University researchers have developed an automated tool that can decipher Completely Automated Public Turing tests to tell Computers and Humans Apart (CAPTCHAs), which are used by many Web sites as an anti-spam test. The Stanford team, led by researchers Elie Bursztein, Matthieu Martin, and John C. Mitchell, developed various methods of cleaning up purposefully introduced background noise and breaking text strings into individual characters for easier recognition. Some of the CAPTCHA-breaking algorithms are based on tools used by robots to orient themselves in new environments. The researchers created Decaptcha, which was run against CAPTCHAs used by 15 high-profile Web sites. The only tested site that could not be broken was Google. The researchers also developed several recommendations to improve CAPTCHA security, including randomizing the length of the text string, randomizing the character size, applying a wave-like effect to the output, and adding collapsing characters or lines in the background.

New Algorithm Could Substantially Speed Up MRI Scans
MIT News (11/01/11) Helen Knight

Massachusetts Institute of Technology (MIT) researchers have developed an algorithm that could reduce the time needed to complete a magnetic resonance imaging (MRI) scan from 45 minutes to 15. The algorithm, developed by MIT professors Elfar Adalsteinsson and Vivek Goyal, uses information gained from the MRI's first contrast scan to produce subsequent images. The algorithm uses the first scan to predict the likely position of the boundaries between different types of tissue in the subsequent contrast scans. "Given the data from one contrast, it gives you a certain likelihood that a particular edge, say the periphery of the brain or the edges that confine different compartments inside the brain, will be in the same place," Adalsteinsson says. For each pixel, the algorithm calculates what new information it needs to construct the image and what information it can take from the previous scan, such as the edges of different types of tissue, says MIT graduate student Berkin Bilgic. The researchers are working to improve the algorithm by accelerating the time it takes to process the raw image data into a final scan that can be analyzed.
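The key assumption Adalsteinsson describes is that tissue boundaries sit in the same place across contrasts even though intensities differ. A toy one-dimensional illustration of that prior (this is not the MIT algorithm, just the shared-edges idea; the data and threshold are invented):

```python
# Toy sketch: edge locations detected in a first contrast scan serve
# as a prior for where tissue boundaries sit in a second contrast of
# the same anatomy, even though the intensities differ.
import numpy as np

scan1 = np.array([0, 0, 0, 5, 5, 5, 2, 2, 2], dtype=float)  # first contrast
edges = np.flatnonzero(np.abs(np.diff(scan1)) > 1.0)         # boundary prior

# A second contrast of the same anatomy: different intensities,
# same boundary positions.
scan2 = np.array([1, 1, 1, 9, 9, 9, 4, 4, 4], dtype=float)
edges2 = np.flatnonzero(np.abs(np.diff(scan2)) > 1.0)
print(np.array_equal(edges, edges2))  # True: the boundaries coincide
```

Because the boundary positions carry over, a later scan only needs to acquire the information the first scan cannot predict, which is where the speedup comes from.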

US Energy Agency Demos Blazingly Fast Network
Network World (11/01/11) Carolyn Duffy Marsan

The U.S. Department of Energy recently demonstrated a 100 Gbps Ethernet upgrade to its Energy Sciences Network (ESnet) that will enable researchers to create more complex, real-world simulations in climate change, particle physics, astronomy, and other scientific fields. ESnet's Steve Cotter says the network was built using federal stimulus grants to achieve two primary goals. "One was accelerating the deployment of 100G so that the equipment manufacturers didn't shelve the technology on fears that there wouldn't be demand," Cotter says. "The other reason they gave us the money was to build a next-generation network test bed and to fund network research." The 100G network will connect the U.S.'s Lawrence Berkeley National Laboratory, the Argonne National Laboratory, and the Oak Ridge National Laboratory using dark fiber. "We believe we are the first network at this scale to collect that information and make it available to researchers so we can begin to look at the energy consumption of networks," Cotter says.

A New Method for the Compression of Complex Signals Is Presented
Universidad Carlos III de Madrid (10/31/11)

Researchers at Universidad Carlos III de Madrid (UC3M) and the University of Southern California (USC) have developed a compression method that improves the compacting of video signals, which could be used to study brain function by analyzing the electrical signals from brain scans. The researchers say their method presents a new type of transform for the compact representation of video sequences. "Our object of interest is the video and our problem is to compress it, that is, to represent it in the most compact manner possible," says UC3M's Eduardo Martinez Enrique. The researchers say the compression method can compact energy more efficiently than previous methods. It will enable researchers to reduce the amount of binary data needed to transmit a video via streaming or digital terrestrial TV. Other possible applications for the technology include noise reduction in video, data compression in sensor networks, and the study and interpretation of brain behavior. "Our transform extends this concept to video sequences, because it can follow the most suitable directions throughout a sequence of images, also taking into account the temporal dimension," Enrique says.

NCWIT Trying to Increase Number of Women in Technology Field
Campus Technology (10/25/11) Tim Sohn

The National Center for Women & Information Technology (NCWIT) has partnered with academia and industry to improve the recruitment and retention of women in information technology (IT) and expects an additional 1,000 women to graduate with IT-related degrees next year. The program, called NCWIT Pacesetters, provides a peer group for discussing experiences, research, and results; puts on the annual NCWIT Summit on Women in IT; and offers case studies on educational reform, recruitment, and retention. Other offerings include social science research on workforce participation, education, and innovation; outreach campaigns on IT careers and education for women; and an online inventory of resources. The Pacesetters program has boosted the percentage of women computing graduates at the University of Virginia from 10 percent to 25 percent, and led to twice as many female engineer interns at Google. Meanwhile, the University of California, Santa Cruz says the program helped increase the number of female majors in computer science by 40 percent.

Abstract News © Copyright 2011 INFORMATION, INC.
Powered by Information, Inc.

To submit feedback about ACM TechNews, contact: [email protected]
Current ACM Members: Unsubscribe/Change your email subscription by logging in at myACM.
Non-Members: Unsubscribe