Association for Computing Machinery
Welcome to the October 23, 2009 edition of ACM TechNews, providing timely information for IT professionals three times a week.

HEADLINES AT A GLANCE


FCC to Draft Net Neutrality Rules, Taking Step Toward Web Regulation
Washington Post (10/23/09) P. 18A; Kang, Cecilia

The U.S. Federal Communications Commission (FCC) unanimously voted to begin drafting rules to support Internet neutrality. "It's hard to imagine anything more important to the future of the success of our economy than a healthy and vibrant Internet, and there is no question that the openness of the Internet is the secret sauce to its success," said FCC chairman Julius Genachowski following the vote. He also noted that the FCC has a core agenda to drive investment, innovation, and opportunity in 21st century communications. In addition to providing their customers with consistent Web traffic, Internet service providers (ISPs) would need to disclose their network traffic management practices to demonstrate that no wrongdoing is taking place. The U.S. government previously took a hands-off stance on Web regulation, but it changed its position as worries mounted that ISPs could begin to favor their own products and services. Crafting the rules will be a long, contentious process as wireless carriers, telecommunications firms, and other industry players weigh in on the benefits and drawbacks. The FCC's Democratic commissioners are vocal advocates of net neutrality rules, while the agency's Republican commissioners argue that such regulations are unnecessary. Critics are concerned that Web regulation will have an adverse effect on innovation in the Internet sector. However, Genachowski disagrees that the rules would hamper investment in broadband networks. "I reject the notion that we must choose between open Internet rules and investment by service providers in their networks," he said during the meeting.


ACM Software Competition Pushes Students to Create Smarter Software
Campus Technology (10/22/09) Schaffhauser, Dian

ACM's International Collegiate Programming Contest (ICPC), sponsored by IBM, challenges students to solve real-world problems using open technology and advanced computing methods in a very short time period. Last year's competition attracted tens of thousands of students on 7,100 teams from universities in about 90 countries. "The world faces many daunting problems such as pandemic diseases, climate change, water pollution, food safety, finite energy resources, as well as issues with urban management and mass transportation," says IBM's Doug Heintzman, ICPC's sponsorship executive. "At IBM, we believe we have a responsibility to help develop the next generation of technology leaders, help them to understand and tackle these complex business issues." ICPC executive director Bill Poucher, a professor at Baylor University, says the contest gives students the opportunity to demonstrate their talents and present themselves to top recruiters. "The contest is also a forum for advancing technology in an effort to better accommodate the growing needs of the future," Poucher says. "At the same time, the competition is a chance for students of similar interests to exchange ideas and peer educate." Following the regional contests currently underway, finalists will attend the World Finals, which will take place in February 2010 in Harbin, China, hosted by Harbin Engineering University.


Where the Virtual World and Reality Meet
EuroNews (10/21/09)

Researchers in Barcelona are developing virtual reality spaces that incorporate touch-sensitive tiles and immersive animations. Pompeu Fabra University professor Paul Verschure says his research team has built an experience-induction machine as part of the PRESENCCIA project to understand how humans can exist in physical and virtual environments simultaneously. One of the project's major challenges was creating a credible virtual environment, which required the researchers to understand how people's brains construct a vision of the world. "Imagine what we see is sort of rapidly jumping about--that would not be a believable experience for us," Verschure says. "So that means one thing we have really tried to engineer here also from a psychological perspective is how do I feed this continuity of expectations that our brain is generating about the world." The researchers say the ultimate goal is to advance human-computer interaction beyond the traditional keyboard, screen, and mouse. "What we're trying to do is to understand why people behave in a more or less natural way in a virtual reality," says PRESENCCIA project coordinator Mel Slater. Petar Horki, a student at Austria's Graz University of Technology, is using PRESENCCIA concepts to create a virtual reality system that uses mind control, allowing the user to simply think about an action to perform that action in the virtual world. "Actually, I'm not doing anything, I'm just imagining I'm doing a brisk foot movement, and by this imagination I can move at least in this virtual room," Horki says.


Universities Face Challenges of Embracing the World Wide Web
University of Southampton (ECS) (10/09/09) Lewis, Joyce

The Research & Development Society recently presented Dame Wendy Hall of the University of Southampton's School of Electronics and Computer Science with the Duncan Davies Medal. Awarded annually, the medal was established to honor individuals who have helped to keep the United Kingdom at the forefront of research and development. In addition to accepting the award, Hall, president of ACM, also delivered the 2009 Duncan Davies Lecture, titled "Research 2.0: The Age of Networks." Hall discussed the development of Web science, the increasing need for interdisciplinary research, and how universities will be affected. The next generation of Web technology will allow for greater opportunities for interdisciplinary research by international teams, she said. "The role of government is crucial in setting policies to create an environment in which such research can flourish but in the age of networks, universities may also have to radically change in order to facilitate such exciting and necessary developments and better train people to meet the needs of businesses in the future," Hall said. "There is a growing realization that a clear research agenda aimed at understanding the current, evolving, and potential Web is needed."


47th Design Automation Conference Announces Calls for Submissions to Technical Program
Business Wire (10/21/09)

The organizers of ACM's 47th Design Automation Conference (DAC) are now accepting contributions for the technical program of next year's gathering, which takes place June 13-18, 2010, in Anaheim, Calif. Members of the electronic design automation (EDA) community have until Oct. 26, 2009, to propose special sessions, which are geared toward emerging areas that have not been sufficiently covered in research papers. The deadline for panel and tutorial proposals is also Oct. 26. Submissions for research papers and Wild and Crazy Ideas (WACI) papers are due on Nov. 19. Research papers should focus on multicore/many-core architectures, system prototyping technology, and embedded software design and debug, while WACI papers give industry professionals and researchers an opportunity to present more unconventional technical ideas. Students have until Nov. 25 to enter their electronic system designs in the Student Design Contest, which is jointly sponsored by ISSCC and DAC. Submissions for User Track presentations, due on Jan. 18, 2010, should address the real-life issues that IC designers, application engineers, and design flow developers face. Proposals for workshops on specific design, design methodology, and design automation topics should be submitted by Jan. 29.


Vulnerability Seen in Amazon's Cloud-Computing
Technology Review (10/23/09) Talbot, David

A new study by researchers from the Massachusetts Institute of Technology (MIT) and the University of California, San Diego (UCSD) suggests that leading cloud-computing services may be vulnerable to eavesdropping and malicious attacks. The study found that it may be possible for attackers to accurately map where a target's data is physically located within the cloud and use various strategies to collect data. MIT postdoctoral researcher Eran Tromer says the vulnerabilities uncovered in the study, which only tested Amazon.com's Elastic Compute Cloud (EC2) service, are likely present in current virtualization technology and will affect other cloud providers. The attack used in the study involves first determining which physical servers a victim is using within a cloud, placing a malicious virtual machine on those servers, and then attacking the victim. The researchers demonstrated that once the malicious virtual machine is on the target's server, it can carefully monitor how access to shared resources fluctuates, potentially allowing the attacker to glimpse sensitive information about the victim. The attack capitalizes on the fact that virtual machines still have IP addresses visible to anyone within the cloud. The researchers found that nearby addresses often share the same physical hardware within the cloud, so an attacker can set up numerous virtual machines, look at their IP addresses, and determine which ones share a server with the target. It may even be possible to detect the victim's passwords through a keystroke-timing attack, Tromer says. Amazon's Kay Kinton says that Amazon has deployed safeguards that prevent attackers from using the techniques described in the study.
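The co-residency step the researchers describe can be sketched as a simple heuristic. The sketch below is purely illustrative and is not the study's code: it assumes, as the article notes, that numerically nearby cloud-internal IP addresses often map to the same physical hardware; the function name and the proximity window are hypothetical.

```python
import ipaddress

def likely_co_resident(target_ip: str, probe_ip: str, window: int = 8) -> bool:
    """Heuristic co-residency guess: treat two cloud-internal IP addresses
    as candidates for sharing physical hardware if they are numerically
    close. The window value is an illustrative assumption, not a figure
    measured in the study."""
    target = int(ipaddress.ip_address(target_ip))
    probe = int(ipaddress.ip_address(probe_ip))
    return abs(target - probe) <= window

# An attacker would launch many probe instances, keep only those whose
# addresses fall near the target's, and then confirm co-residency by
# observing contention on shared resources (e.g., cache timing).
```

In the attack described, such a cheap address check only narrows the candidate set; confirmation relied on side channels in shared hardware, which is also what makes the later resource-monitoring step possible.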


'Humanised' Computers That Know What You're Thinking
iTWire (10/23/09) Dinham, Peter

University of Surrey researchers have developed an automatic system to identify non-verbal social signals in conversations, which could enable computers to better understand meaning in speech and eventually lead to more intuitive computer interfaces. Tim Sheerman-Chase, the leader of the research effort, says the technology could allow social cues such as agreement, understanding, thinking, and questioning to be identified in videos. He says humans unconsciously use body gestures, emotions, and gaze direction to understand the meaning of spoken language, and enabling computers to understand these signals could provide an invaluable tool in the study of social situations and in advancing computer interfaces. In the study, human conversations were recorded and interesting clips from the conversations were rated by 21 annotators in a Web browser. "This provided clear examples of 'thinking' and 'not thinking,' along with positive and negative examples of the other non-verbal signals," Sheerman-Chase says. "A computer learned which parts of the face could be used to identify each social signal in video." He says the ability for computers to understand meaning in natural conversation is critical to enabling them to recognize and use humans' innate communication skills. "Although the accuracy of the system is far from perfect, it is comparable to human performance for some types of social signals," he says.


To Protect Your Privacy, Hand Over Your Data
New Scientist (10/22/09) Venkatraman, Vijaysree

A new proposal from the Massachusetts Institute of Technology's (MIT's) Human Dynamics Laboratory suggests that digital identities would be more secure if they were based on data collected from "reality mining," which studies how people behave using the digital data produced by computerized activities. MIT researcher Alex Pentland says that researchers and corporations have already realized the potential for reality mining, and argues that if people were to gain control over their own personal data mines they could use that information to prove who they are or inform smart recommendation systems. Pentland believes that allowing access to that data is safer than relying on key-like codes and numbers, which can be stolen or faked. He proposes creating a central body--supported by cell phone networks, banks, and the government--that would manage a data identity system. Banks could provide pieces of data to a third party running a check on a person's identity, and individuals could use their own data for services such as apps on a smartphone. Pentland says such a system would be far more powerful than existing recommender systems. He has been working to alleviate concerns over using personal data as an identification system, and has gotten the Harvard Law Lab and the World Economic Forum to develop and support the idea. He says 70 other industry partners have expressed interest and will be asked to test a design for the system.


Georgia Tech Wins NSF Award for Next-Gen Supercomputing
Georgia Institute of Technology (10/21/09) Wilson, Stefany

Georgia Tech has received a five-year, $12 million award from the U.S. National Science Foundation's Office of Cyberinfrastructure to develop and deploy a high-performance computing (HPC) system. Georgia Tech will lead a partnership of academic, industry, and government experts in the creation of two heterogeneous HPC systems that will expand the range of research projects scientists and engineers can undertake. The project will unite experts and resources from Georgia Tech's College of Computing, Oak Ridge National Laboratory, the University of Tennessee, the National Institute for Computational Sciences, Hewlett-Packard, and NVIDIA. "Our goal is to develop and deploy a novel, next-generation system for the computational science community that demonstrates unprecedented performance on computational science and data-intensive applications, while also addressing the new challenges of energy efficiency," says Georgia Tech professor Jeffrey Vetter. The project will be developed and deployed in two phases. An initial system delivery is planned for deployment in early 2010, with innovations in performance and power being achieved through heterogeneous processing based on graphics processing units.


ANU Plans $50m Supercomputer Spend
ZDNet Australia (10/20/09) Tindal, Suzanne

The Australian National University's (ANU's) National Computing Infrastructure (NCI) is expected to receive $50 million from the Australian government to build a new data center and supercomputer that could be available by 2012. The funding follows a recent report that strongly urged the government to upgrade Australia's computing infrastructure. The new computer primarily will be used for climate change modeling and research, although it also will be available for other projects from universities and institutions. NCI director Lindsay Botten is developing a project plan for the government that, if approved, will enable the computer to be bought and operational by late 2011 or early 2012. Botten says that approximately $30 million will go to the hardware and $20 million will be used to build the necessary data center facilities. None of the funding will go toward on-costs for the computer and data center, which would have to be paid by research institutions using the facility. ANU recently negotiated the purchase of a Sun supercomputer, which is expected to be operational by the end of the year. That computer will be capable of a peak performance of 140 teraflops.


RIT Scientists Use Supercomputers to ‘See’ Black Holes
Rochester Institute of Technology (10/19/09) Gawlowicz, Susan

Rochester Institute of Technology (RIT) scientists will use several grants to expand their supercomputing cluster and to gain time on two of the world's fastest supercomputers to advance research on black holes. RIT researchers use supercomputers to run mathematical simulations and computer visualizations of black hole features that cannot be observed directly. "It is a thrilling time to study black holes," says RIT's Manuela Campanelli. "We're nearing the point where our calculations will be used to test out one of the last unexplored aspects of Einstein's General Theory of Relativity, possibly confirming that it properly describes the strongest gravitational fields in the universe." RIT's computer lab hosts NewHorizons, a cluster of 85 nodes with four processors each, connected by an InfiniBand network capable of transferring data at 10 Gbps. The researchers recently received an $85,000 U.S. National Science Foundation (NSF) grant to increase their computing power by nearly 50 percent with an additional 20 nodes. The researchers also were awarded 3.5 million CPU hours on the Ranger supercomputer through the TeraGrid, and received an NSF grant that reserves time on Blue Waters, a supercomputer scheduled to begin production in 2011 at the University of Illinois' National Center for Supercomputing Applications. Blue Waters will consist of more than 200,000 processing cores and is expected to be the most powerful supercomputer in the world for open scientific research.


Intelligent System to Help Autistic Children Recognize Emotions
AlphaGalileo (10/19/09)

Computer scientists from Nanyang Technological University in Singapore are developing an intelligent facial expression recognition system that could help autistic children understand the emotions of other people. The system uses derivative-based filtering to locate the edges of the human face, along with a boosting classifier to recognize facial expressions. Teik-Toe Teoh, Yok-Yen Nguwi, and Siu-Yeung Cho of the Center for Computational Intelligence in the School of Computer Engineering apply Gaussian and Laplacian derivatives, and filter out non-face images using AdaBoost. The technique is capable of finding key fiducial points for feature extraction and selection, after which meaningful features are classified into the corresponding expression classes. The team says the combination of derivative filtering and a boosting classifier makes for an efficient facial expression recognition system that could be implemented as a portable device.
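The boosting step can be illustrated with a from-scratch sketch. This is not the Nanyang team's code: it is a minimal AdaBoost over one-feature decision stumps on toy feature vectors (standing in for the face-derived features the article mentions), with all names and data invented for illustration.

```python
import math

def stump_predict(x, feature, threshold, polarity):
    # A decision stump: classify +1/-1 from a single feature vs. a threshold.
    return polarity if x[feature] > threshold else -polarity

def train_adaboost(X, y, rounds=10):
    """Train AdaBoost over axis-aligned decision stumps (toy sketch)."""
    n = len(X)
    w = [1.0 / n] * n          # start with uniform sample weights
    ensemble = []              # list of (alpha, feature, threshold, polarity)
    for _ in range(rounds):
        best = None
        # Exhaustively search stumps over every feature/threshold/polarity.
        for f in range(len(X[0])):
            for t in sorted({x[f] for x in X}):
                for pol in (+1, -1):
                    err = sum(wi for xi, yi, wi in zip(X, y, w)
                              if stump_predict(xi, f, t, pol) != yi)
                    if best is None or err < best[0]:
                        best = (err, f, t, pol)
        err, f, t, pol = best
        err = max(err, 1e-10)                      # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)    # weight of this stump
        ensemble.append((alpha, f, t, pol))
        # Re-weight samples so misclassified points matter more next round.
        w = [wi * math.exp(-alpha * yi * stump_predict(xi, f, t, pol))
             for xi, yi, wi in zip(X, y, w)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    # Final decision is the sign of the weighted vote of all stumps.
    score = sum(a * stump_predict(x, f, t, p) for a, f, t, p in ensemble)
    return 1 if score >= 0 else -1
```

The exhaustive stump search is O(samples x features x thresholds) per round, which is fine for a sketch; a practical system like the one described would train over many image-derivative features and far more data.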


Researcher Paves Alternate Path for Hard Drives
EE Times (10/19/09) Merritt, Rick

Carnegie Mellon University professor Jimmy Jian-Gang Zhu is developing a prototype hard disk technology based on his microwave-assisted magnetic recording (MAMR) technique, which could potentially allow three terabits (Tbits) of data to be stored on a square inch of a spinning disk. Meanwhile, Seagate Technology is developing heat-assisted magnetic recording (HAMR) technology, which uses a laser on each drive head to heat a portion of the disk just before data is written to it, and Hitachi Global Storage Technologies is working on a way to use patterns to track bit location on media. One analyst says the industry may have to choose one of the competing technologies within the next year. Either patterned media or HAMR could create densities of 1 to 10 Tbits per square inch, and both have similar costs based on available estimates. However, Zhu believes his approach could cost less than either technology and provide similar advancements. His lab is fabricating a prototype device and plans to test its performance. In MAMR, a drive head emits a microwave field that excites the electrons in the media, building up energy that makes it easier to write data. MAMR uses a localized high-frequency AC magnetic field generated by a magnetic thin-film stack integrated with existing recording heads. Zhu says the stack contains only a few more magnetic layers than modern recording heads. He says that MAMR "represents the least disruptive approach with substantial gain of storage capacity."


Abstract News © Copyright 2009 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: technews@hq.acm.org





