ACM TechNews
Read TechNews online at: http://technews.acm.org
April 25, 2008


Welcome to the April 25, 2008 edition of ACM TechNews, providing timely information for IT professionals three times a week.


HEADLINES AT A GLANCE:

 

Imagine Cup Challenges Students to Save Earth With Technology
Agence France Presse (04/24/08)

Six teams, chosen from a field of 16,000 competitors, this week displayed software designed to help preserve the environment at the Imagine Cup, Microsoft's annual innovation challenge. Worldwide, 185,000 university students from 100 countries registered to participate in this year's Imagine Cup, and one team from each country will advance to the challenge in Paris in July. "It is our competition to generate passion in teens to use technology to change the world," says Microsoft's Chris Weber. Rochester Institute of Technology students took the U.S. championship with a low-cost combination of sensors and software that essentially lets homes monitor themselves and alert their owners to energy waste. The RIT team says the invention is an affordable, advanced form of "smart home" technology, which allows people to control their household systems through the Internet. RIT student Adam Risi says the device adds another layer to the smart home by telling owners when there is a problem that could be fixed to conserve energy. This year's Imagine Cup included categories for video games and photography. The winning video game was "Ecocism," in which players maneuver animated hovercrafts in a quest to reforest a barren Earth and fight off spider-like robots trying to destroy the trees. "The raw potential and creativity that these students show in their projects help me to feel extremely optimistic about our future," says Microsoft's Scott Davidson.


'Revolutionary' Collective Intelligence of Users Touted at Web 2.0 Expo
Computerworld (04/23/08) Havenstein, Heather

O'Reilly Media President Tim O'Reilly said at the O'Reilly Web 2.0 Expo that the advent of the Internet as a platform represents an "amazing revolution in human augmentation" comparable to the emergence of literacy. The expo's theme was Web 2.0's ability to tap the collective intelligence of many users. "The real heart of Web 2.0 is collective intelligence, which I have defined as harnessing the network effect to build applications that get better the more people use them," O'Reilly said. "Applications that are built on open, decentralized networks actually lead to new concentrations of power." O'Reilly encouraged enterprises to realize that enterprise data contains hidden meaning and useful information that can be harnessed with Web 2.0 tools. New York University professor Clay Shirky argued that people have a vast "cognitive surplus" that Web 2.0 technologies can draw upon. He calculated, for instance, that all of Wikipedia's constituent parts--lines of code, page edits, etc.--encompass 100 million hours of human cognition, and that the cognitive surplus typically consumed by the 200 billion hours of television Americans watch annually could potentially yield thousands of new Wikipedia projects a year.
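
Shirky's comparison can be checked with quick back-of-envelope arithmetic; the sketch below uses only the figures quoted above (the division and rounding are illustrative, not Shirky's own calculation):

    # Back-of-envelope check of the "cognitive surplus" comparison,
    # using the figures quoted above.
    WIKIPEDIA_HOURS = 100e6        # ~100 million hours of cumulative effort
    US_TV_HOURS_PER_YEAR = 200e9   # ~200 billion hours watched annually

    projects_per_year = US_TV_HOURS_PER_YEAR / WIKIPEDIA_HOURS
    print(f"~{projects_per_year:,.0f} Wikipedia-scale projects per year")
    # -> ~2,000, i.e. the "thousands" of projects Shirky describes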


A CluE in the Search for Data-Intensive Computing
National Science Foundation (04/23/08) Cruikshank, Dana W.

The National Science Foundation's Computer and Information Science and Engineering (CISE) directorate has released a solicitation for proposals for the new Cluster Exploratory (CluE) initiative, which was announced as part of an arrangement between Google, IBM, and NSF. NSF hopes the initiative will lead to innovations in data-intensive computing and serve as a model for future collaboration between the private sector and academic computing researchers. CluE will give NSF-funded researchers access to software and services running on a Google-IBM cluster, allocating cluster computing resources to a broad range of proposals that explore the potential of data-intensive computing to contribute to science and engineering research. "The software and services that run on these data clusters provide a brand new paradigm for highly parallel, highly reliable distributed computing, especially for processing massive amounts of data," says CISE assistant director Jeannette Wing. NSF will select the proposals and support the chosen academic researchers in conducting their work, while Google and IBM will provide additional support and cover the costs of operating the cluster.
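
The paradigm Wing describes is the two-phase, map/reduce style of processing popularized by Google; the following minimal, single-machine sketch shows only that structure, not the distributed software the cluster actually runs:

    from collections import defaultdict
    from itertools import chain

    # Single-process sketch of the map/reduce model: a map phase emits
    # (key, value) pairs and a reduce phase folds values that share a key.
    # Real frameworks distribute both phases across thousands of nodes.
    def map_phase(document):
        return [(word, 1) for word in document.split()]

    def reduce_phase(pairs):
        counts = defaultdict(int)
        for key, value in pairs:   # group by key, then fold
            counts[key] += value
        return dict(counts)

    docs = ["data intensive computing", "highly parallel data processing"]
    print(reduce_phase(chain.from_iterable(map_phase(d) for d in docs)))
    # {'data': 2, 'intensive': 1, 'computing': 1, ...}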


Earnings Gap Narrower for Women in IT
CRA Bulletin (04/25/08) Vegso, Jay

The gap between men's and women's earnings is narrower in several professional-level IT occupations than in the overall workforce, according to data from the Bureau of Labor Statistics. The median weekly earnings for women in all occupations last year were $614, or 80 percent of men's median weekly earnings. Women who were computer software engineers fared better, with median weekly earnings that were 87 percent of men's. Meanwhile, female computer and information systems managers' median weekly earnings of $1,363 amounted to 85 percent of the $1,596 earned by men in the same position. Conversely, the biggest pay gap between men and women in professional-level IT occupations was for network systems and data communications analysts, where women's median weekly earnings amounted to just 72 percent of men's.
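
The quoted percentages can be reproduced directly from the dollar figures (the overall male median is implied rather than stated in the abstract):

    # Reproducing the ratios above; the inputs are from the abstract.
    print(round(1363 / 1596, 3))   # 0.854 -> the 85% figure for IS managers
    print(round(614 / 0.80, 2))    # 767.50 -> implied overall male median ($/week)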


SDSC, UCSD Mark Earth Day With Panel Discussion on 'Green' Computing Technologies
University of California, San Diego (04/22/08) Zverina, Jan

The San Diego Supercomputer Center and UC San Diego's Rady School of Management hosted a panel of academic and industry experts on Earth Day to discuss ways to make the world's rapidly expanding computer infrastructure more environmentally friendly. The discussion focused on promising research and techniques that could improve the energy efficiency of computing systems. A recent EPA study found that servers and data centers accounted for 1.5 percent of total U.S. electricity consumption in 2006, and projected that this usage would double within five years. Panelists at the UCSD event included UCSD Jacobs School of Engineering professor Tajana Simunic Rosing, who discussed her research on power and thermal management of computer systems. Rosing said that intelligent power management, such as monitoring and taking advantage of variations in workloads, can yield large energy savings. Sun Microsystems distinguished engineer Kenny Gross highlighted software innovations at Sun that have overcome many challenges associated with traditional hardware-based instrumentation approaches. Lastly, IBM Almaden Research Center program director Winfried Wilcke discussed energy-related trends in computer technology and systems architecture, including the importance of efficient cooling in data centers and opportunities to boost power efficiency through technology improvements at the system and chip levels.
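
Rosing's specific methods are not detailed in the abstract; as a generic illustration of workload-aware power management, the hypothetical sketch below picks the lowest processor frequency that still covers the observed demand (the frequency levels and utilization target are invented for illustration):

    # Hypothetical sketch of workload-aware dynamic frequency scaling:
    # monitor recent demand and pick the lowest frequency level whose
    # capacity covers it at a target utilization. Levels and target are
    # invented for illustration, not taken from the panel.
    FREQ_LEVELS_GHZ = [1.0, 1.6, 2.2, 2.8]
    TARGET_UTILIZATION = 0.8

    def pick_frequency(demand_ghz):
        """Lowest level whose capacity covers demand at the target utilization."""
        for f in FREQ_LEVELS_GHZ:
            if demand_ghz <= f * TARGET_UTILIZATION:
                return f
        return FREQ_LEVELS_GHZ[-1]

    for demand in [0.3, 1.2, 2.5]:
        print(f"demand {demand} GHz -> run at {pick_frequency(demand)} GHz")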


FBI Releases Details of Expansive Data-Sharing Program
CongressDaily (04/23/08) Noyes, Andrew

The FBI has released more information about an initiative that promises to create a powerful high-tech clearinghouse for conducting nationwide criminal searches. The FBI disclosed that the Law Enforcement National Data Exchange (N-DEx) will serve as a secure Web site for searching and analyzing crime data from participating agencies. Agents will be able to use N-DEx to conduct criminal searches by "modus operandi," and to concentrate on identifying factors such as clothing, tattoos, associates, and cars, all from a single access point. The software for the information-sharing project will be able to analyze data to determine criminal activity hotspots and crime trends, assess the level of threat for individuals and addresses, and offer visualization and mapping capabilities. N-DEx is expected to be fully operational in 2010. N-DEx will offer capabilities similar to CopLink and the U.S. Navy's LiNX program, both of which have been "growing organically," according to the Center for Democracy and Technology's Jim Dempsey, who says the FBI is taking a "big bang" approach with N-DEx. Christopher Calabrese of the American Civil Liberties Union warns that the massive database could be an attractive target for misuse.


Interview With Donald Knuth
InformIT (04/25/08) Binstock, Andrew

Computer scientist Donald E. Knuth, winner of ACM's A.M. Turing Award in 1974, says in an interview that open-source code has yet to reach its full potential, and he anticipates that open-source programs will become totally dominant as the economy migrates from products to services and as increasing numbers of volunteers come forward to tweak the code. Knuth admits that he is unhappy about the current movement toward multicore architecture, complaining that "it looks more or less like the hardware designers have run out of ideas, and that they're trying to pass the blame for the future demise of Moore's Law to the software writers by giving us machines that work faster only on a few key benchmarks!" He acknowledges that important applications of parallelism exist, but cautions that they need dedicated code and special-purpose methods that will have to be significantly revised every several years. Knuth maintains that software produced via literate programming was "significantly better" than software developed with more traditional methodologies, and he speculates that "if people do discover nice ways to use the newfangled multithreaded machines, I would expect the discovery to come from people who routinely use literate programming." Knuth cautions that software developers should be careful about adopting trendy methods, and expresses strong reservations about extreme programming and reusable code. He says the only truly valuable thing he gets out of extreme programming is the concept of working in teams and reviewing each other's code. Knuth deems reusable code to be "mostly a menace," and says that "to me, 're-editable code' is much, much better than an untouchable black box or toolkit."


Google Taps Morgan Stanley Vet for CIO
Washington Post (04/24/08) Weisenthal, Joseph

Benjamin Fried, a programmer who rose through the ranks to run much of Morgan Stanley's computing infrastructure, will join Google as its chief information officer. Google's previous CIO, Doug Merrill, announced plans in March to depart the search giant for a new position at record label EMI. Previous interviews with Merrill noted the challenges he found running the tech department at Google, where most employees don't want to follow instructions from a centralized technology group. Fried, who serves on the editorial advisory board of the ACM's Queue magazine, will leave his post as Morgan Stanley's Managing Director of Information Technology as of May 2.


Multicore Code Booster
HPC Wire (04/25/08)

A paper that investigates how to optimize multicore supercomputer utilization using lattice Boltzmann code has earned a Best Paper Award in the applications track at the IEEE International Parallel and Distributed Processing Symposium. "The computing revolution towards massive on-chip parallelism is moving forward with relatively little concrete evidence on how best to use these technologies for real applications," wrote CRD researcher Samuel Williams in the paper. Williams and his co-authors devised a code generator that could efficiently and productively optimize a lattice Boltzmann code for improved performance on multicore supercomputers. They chose the lattice Boltzmann code used to model turbulence in magnetohydrodynamics simulations, which play an important role in physics research. The code's performance on traditional multicore systems was usually poor, and the improvement yielded by the optimization research was significantly higher than any published to date. The research also yielded insight into how to construct effective multicore applications, compilers, and other tools.
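
The paper's code generator is not reproduced here, but the underlying auto-tuning idea (generate candidate variants of a kernel, time each on the target machine, keep the fastest) can be sketched generically; the toy kernel and block sizes below are stand-ins, not the lattice Boltzmann code:

    import time

    # Stand-in auto-tuning loop: benchmark each candidate block size for a
    # simple array traversal and keep the fastest. Williams et al. applied
    # the same generate-and-measure idea to far richer code variants
    # (unrolling, vectorization, layout) of the lattice Boltzmann kernel.
    N = 1 << 20
    data = [0.0] * N

    def kernel(block):
        for start in range(0, N, block):          # process one block at a time
            for i in range(start, min(start + block, N)):
                data[i] += 1.0

    def benchmark(block, reps=3):
        best = float("inf")
        for _ in range(reps):
            t0 = time.perf_counter()
            kernel(block)
            best = min(best, time.perf_counter() - t0)
        return best

    timings = {b: benchmark(b) for b in [64, 256, 1024, 4096]}
    print("fastest block size:", min(timings, key=timings.get))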


IT Labor Shortage or Not, Gaps Remain
Baseline (04/23/08) Chickowski, Ericka

One in four CIOs say finding skilled IT professionals is their greatest staffing challenge, reveals a recent Robert Half International survey. Leaders of major companies have urged the government to increase the number of H-1B visas to help fill the gap, but others say that H-1B visas take jobs away from American workers and lower industry salaries. Hiring managers looking for specific qualifications, however, say it matters less whether there are plenty of technology workers in the workforce than whether their organizations can find the right people for available positions. Many technology recruiters and industry insiders say there may be enough IT workers applying for jobs, but there are gaps in the specific skill sets employers need to keep their IT departments running. Many of the missing skills are nontechnical skills related to management. "What we're finding and what we're hearing is that companies no longer want people just with strong technical skills," says the Computing Technology Industry Association's Steven Ostrowski. "They need to have a combination of technical and business knowledge, and an understanding of how it fits into the business interests or the business operations of the organization they work for." Beyond management skills, security skills are also in high demand, according to a recent CompTIA survey, in which 74 percent of 3,500 IT professionals and employers cited security skills as the top qualification needed. Moreover, just 57 percent of those surveyed said their existing employees had sufficient security skills. Other skills in demand include knowledge of Web 2.0 applications and the Java 2 Platform, Enterprise Edition (J2EE).


Evolution of Internet Powers Massive Particle Physics Grid
Network World (04/22/08) Brodkin, Jon

Uncovering clues about the universe's origins is one of the purposes of the Large Hadron Collider (LHC), and distributing the massive volume of data generated by particle collisions to thousands of researchers around the world is the job of the Worldwide LHC Computing Grid, which will be composed of approximately 20,000 servers. "It's using some advanced features and new technologies within the Internet to distribute the data," says Open Science Grid executive director Ruth Pordes. "It's advancing the technologies, it's advancing the [data transfer] rates, and it's advancing the usability and reliability of the infrastructure." Raw data produced by the collisions is relayed over dedicated 10 Gbps optical-fiber connections to the CERN Computer Center, and from there routed to tape storage as well as to a CPU farm that processes the information and generates "event summary data." Eleven Tier-1 sites around the world are then sent subsets of both the raw data and the summaries; each site is linked to CERN through a dedicated 10 Gbps connection, while a general-purpose research network connects the Tier-1 facilities to each other. Once reprocessed by the Tier-1 centers, the data is circulated to Tier-2 centers for analysis by physicists via general-purpose research networks. Brookhaven National Laboratory's Michael Ernst says the LHC collisions will generate 10 petabytes to 15 petabytes of data annually. Pordes says Tier-2 researchers will be able to remotely access the LHC Computing Grid's servers when running complex experiments based on LHC data.
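
Ernst's estimate implies the dedicated links are comfortably sized for the average load; a quick check using the numbers above:

    # Average rate implied by ~15 PB/year, the upper end of Ernst's estimate.
    PB_PER_YEAR = 15
    bits_per_year = PB_PER_YEAR * 1e15 * 8          # petabytes -> bits
    seconds_per_year = 365 * 24 * 3600
    print(bits_per_year / seconds_per_year / 1e9)   # ~3.8 Gbps average,
    # well within a dedicated 10 Gbps link, leaving headroom for bursts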


UW's Computing Research Prowess Brings Microsoft to Madison
University of Wisconsin-Madison (04/23/08) Mattmiller, Brian

Microsoft will open the Microsoft Jim Gray Systems Lab at the University of Wisconsin-Madison later this spring. The advanced development laboratory will be directed by UW-Madison emeritus computer sciences professor and Microsoft Technical Fellow David DeWitt. DeWitt says the development lab will establish an academic partnership with UW-Madison that directly supports graduate education in computer sciences. "Microsoft is here because we are doing some of the best database work in the world and we have produced scores of graduates who have gone on to successful careers in the industry," DeWitt says. "Our focus will be on continuing the production of talented graduate students and taking on some of the great challenges in database systems." Microsoft will support a number of graduate research assistantships in the department starting this fall, and DeWitt says the partnership will provide internships to UW-Madison students and consulting opportunities for UW-Madison faculty. Gurindar Sohi, chair of the UW-Madison computer sciences department, says the school has a history of advancing computing, starting when the department shifted from a mathematical focus to a systems research environment that produced a steady flow of practical advances in commercial computers.


Sun Looks to Free Up the Rest of Java
InfoWorld (04/22/08) Krill, Paul

Sun Microsystems is stepping up its efforts to spread Java usage by making it a completely open-source platform that can be packaged with Linux distributions. Sun is also working with Linux distributors to have them offer an updated version of OpenJDK, which constitutes the open-source Java platform. The open-sourcing of the Java platform started in November 2006, but various components, including some encryption libraries, graphics libraries, the sound engine, and some SNMP management code, could not be offered under the GNU General Public License. Sun's Rich Sands says the company has removed most of these encumbrances over the past year, but more work remains to offer the Java sound engine and SNMP code as open source. The process is expected to be completed this year, though developers may be able to proceed without components such as the sound engine. Sands says once Java is 100 percent open source it can be shipped as part of Linux. RedMonk analyst Michael Cote says having Java on Linux will help Sun spread Java as widely as possible.


High Tech High School Preps Students for IT Careers
InformationWeek (04/24/08) Jones, K.C.

The Academy of Information Technology and Engineering (AITE) in Stamford, Conn., is a public high school for students with a passion for technology, information systems, engineering, and digital electronics. AITE students are defying numerous studies that show U.S. students falling behind their peers in other countries in science, technology, engineering, and math. In addition to technology, the students are interested in current affairs, international politics, travel, and science, and much of their success is credited to the school's architects and administrators. Unlike many schools, AITE has hallways and classrooms filled with natural daylight from floor-to-ceiling windows, and every student is given a school-issued laptop that connects to the Internet through Wi-Fi. The school offers its more than 600 students over 30 IT electives, including architectural design, CAD technology, civil engineering, geographic information systems, Web production, robotics, pre-engineering, and digital electronics. Students interested in classes not offered at the school can take virtual classes online, and AP classes are offered through the University of Connecticut, Norwalk Community College, and the University of New Haven. Students apply through a lottery system, and AITE accepts students at all performance levels as long as they commit to the school's high standards. "These children now and, in future generations, are going to be digital masters," says AITE principal Paul Gross.


What Can I, Robot, Do With That?
ICT Results (04/21/08)

The MACS (multi-sensory autonomous cognitive system) project is using the cognitive theory of "affordances," developed by psychologist James J. Gibson, to teach robots how to use the objects they encounter in their environment rather than focusing on identifying what they are. Computer vision and knowledge techniques may identify an object as a chair, but a system of affordances tells the robot that the object can be used for sitting. An affordance system allows a robot to view an object of a certain height and width and know whether it is suitable for sitting, but it also lets the robot make decisions such as whether the object is light enough to carry or needs to be pushed, and whether it could serve other purposes such as holding a door open. The goal of affordance-based machine cognition is to create a robot capable of using whatever it finds in its environment to complete a specific task. "Affordance-based perception would look at whether something is graspable, or if there is an opening, rather than worrying about what an object is called," says MACS project coordinator Erich Rome. The European Union-funded project successfully created an integrated affordance-inspired robot control system, which includes a perception module, a behavior system, an execution control module, a planner, a learning module, and an affordance representation repository.
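
A toy illustration of the difference between affordance-based reasoning and object labeling might look like the following (the predicates and thresholds are hypothetical, not MACS's actual representation):

    from dataclasses import dataclass

    # Toy affordance check: instead of asking "what is this object called?",
    # ask "what can I do with it?". All thresholds are hypothetical.
    @dataclass
    class ObservedObject:
        height_m: float
        width_m: float
        weight_kg: float
        has_opening: bool = False

    def affordances(obj, robot_payload_kg=5.0):
        found = []
        if 0.35 <= obj.height_m <= 0.55 and obj.width_m >= 0.3:
            found.append("sittable")
        if obj.weight_kg <= robot_payload_kg:
            found.append("carryable")
        else:
            found.append("pushable")
        if obj.has_opening:
            found.append("graspable-through-opening")
        return found

    crate = ObservedObject(height_m=0.45, width_m=0.4, weight_kg=12.0)
    print(affordances(crate))  # ['sittable', 'pushable']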


Optimal Online Communication
Netherlands Organization for Scientific Research (NWO) (04/16/08)

Dutch researcher Peter Korteweg has developed algorithms that optimize communication to a central point in wireless networks, a critical problem for online networks. Korteweg sought to reduce communication costs, the time needed to collect data, and the processing time of messages. His analysis gives a better understanding of how the speed of communication affects the quality of an online algorithm, and his algorithms keep communication costs and message delays close to those of the best offline solution. Eindhoven University of Technology headed the project, which was funded with a grant from the Free Competition, formerly the Open Competition, of NWO Physical Sciences.
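
The release does not describe Korteweg's algorithms themselves; a classic baseline for this kind of cost-versus-delay trade-off is the "rent-or-buy" rule: buffer messages until their accumulated delay equals the fixed cost of one transmission, then send everything at once. A minimal sketch of that baseline (not Korteweg's method):

    # Minimal sketch of a classic online aggregation rule: buffer incoming
    # messages and flush when their total accumulated delay reaches the
    # fixed cost of one transmission. This is a standard baseline for
    # cost/delay trade-offs, not Korteweg's specific algorithm.
    SEND_COST = 10.0  # cost of one aggregated transmission (arbitrary units)

    def simulate(arrival_times):
        buffered, flushes = [], []
        for t in arrival_times:
            buffered.append(t)
            delay = sum(t - a for a in buffered)  # waiting time so far
            if delay >= SEND_COST:
                flushes.append((t, len(buffered)))
                buffered = []
        return flushes

    print(simulate([0, 1, 2, 3, 4, 5, 6, 7, 8]))  # -> [(4, 5)]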


USC, Retirement Homes to Collaborate on Technology for the Aging Population
The State (SC) (04/17/08) Hammond, James T.

The University of South Carolina, the Fraunhofer Institute for Software Engineering in Kaiserslautern, Germany, and the Palmetto Health System of Columbia, S.C., are collaborating on a project dedicated to developing technological solutions to the problems of aging. The Lutheran Homes of South Carolina and Still Hopes Episcopal Retirement Community have agreed to participate in the collaborative research, which will focus on three areas. First, the researchers will work to make homes safer for seniors, including developing technological systems that help monitor their needs and assist with daily activities that often become more difficult with age. Second, they will examine and promote mobility outside the home by improving transportation safety and driving responses in seniors. Third, they will study brain health to slow or prevent the onset of diseases such as Alzheimer's and Parkinson's. The products and services will be developed through USC's SeniorSmart Center, which will eventually comprise 10 to 20 researchers dedicated to helping seniors remain independent and engaged in their communities. Dieter Rombach, head of the Fraunhofer Institute for Software Engineering, says improving the quality of life is particularly important for Germany, which has the oldest average age of any industrialized country.


To Defeat a Malicious Botnet, Build a Friendly One
New Scientist (04/22/08) Inman, Mason

University of Washington computer scientists want to create swarms of good computers to neutralize hostile computers, which they say is an inexpensive way to handle botnets of any size. Current botnet countermeasures are being overwhelmed by the growing size of botnets, the researchers say, but creating swarms of good computers could neutralize distributed denial-of-service attacks. The UW system, called Phalanx, uses its own large network of computers to shield the protected server. Instead of accessing the server directly, all information passes through the herd of "mailbox" computers. The good botnet computers only pass information when the server requests it, allowing the server to work at its own pace instead of being flooded by requests. Phalanx also requires computers requesting information from the server to solve a computational puzzle, which takes a small amount of time for a normal Web user but significantly slows down a zombie computer that sends numerous requests. The researchers simulated an attack by a million-computer botnet on a server protected by a network of 7,200 mailbox computers running Phalanx. Even when the majority of mailbox computers were under attack, the server was able to run normally.
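
The article does not specify Phalanx's puzzle construction; a common choice for such computational puzzles is a hash preimage search, in which the client must find a nonce whose hash together with a server challenge begins with a set number of zero bits, cheap for one request but costly at botnet volume. A sketch assuming that construction:

    import hashlib
    import os

    # Hypothetical hash-based client puzzle in the style the article
    # describes; Phalanx's exact construction may differ. The server
    # issues a random challenge; the client must find a nonce whose hash
    # has DIFFICULTY_BITS leading zero bits. Verification is one hash.
    DIFFICULTY_BITS = 16  # tune so a solve costs a fraction of a second

    def solve(challenge: bytes) -> int:
        nonce = 0
        while True:
            digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
            if int.from_bytes(digest, "big") >> (256 - DIFFICULTY_BITS) == 0:
                return nonce
            nonce += 1

    def verify(challenge: bytes, nonce: int) -> bool:
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        return int.from_bytes(digest, "big") >> (256 - DIFFICULTY_BITS) == 0

    challenge = os.urandom(16)
    nonce = solve(challenge)         # ~2**16 hashes on average for the client
    print(verify(challenge, nonce))  # True; the server-side check is one hash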


Computational Photography
American Scientist (04/08) Vol. 96, No. 2, P. 94; Hayes, Brian

The concept of computational photography is rooted in efforts by imaging laboratories to develop cameras that not only digitize images, but also carry out extensive computations on image data. Digital photography has for the most part been focused on rendering images in a manner that closely resembles traditional chemical photography, but a computer-equipped camera could enhance images with details that other cameras miss. By capturing more information about the light field, focus and depth of field can be corrected after the fact, while motion blur can be eliminated with other methods. Recording the complete light field requires a sensor capable of measuring both the intensity and the direction of every incident light ray; while such a task is currently impossible for a single sensor chip, additional hardware can approximate the effect. One strategy for recording a light field incorporates an array of "microlenses" in front of the sensor within a camera, with each microlens focusing an image of the main lens aperture onto a region of the sensor chip, so that the sensor captures many small images that view the scene from slightly different angles. A light-field camera can facilitate shifts in point of view, while software for viewing the stored data set allows the photographer to move the plane of focus back and forth through the scene or generate a composite image with high depth of field. One method for dealing with motion blur is a "flutter shutter" camera that opens and closes the shutter repeatedly in a nonuniform pattern, capturing the information needed to correct the blur computationally. Giving photographs a non-photorealistic look is the object of an experiment involving a camera that emphasizes three-dimensional geometric structure through a diagrammatic rendering style, making parts easier to distinguish against a dynamic background.
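
The after-the-fact refocusing described above is commonly implemented by shift-and-add: shift each sub-aperture view in proportion to its offset from the lens center, then average. A sketch under that assumption (the array layout and shift scale are illustrative):

    import numpy as np

    # Sketch of synthetic refocusing from a light field, assuming it is
    # stored as L[u, v, y, x]: one (y, x) image per (u, v) position in the
    # lens aperture. Shifting each view in proportion to its aperture
    # offset and averaging moves the plane of focus; `alpha` picks the plane.
    def refocus(L, alpha):
        U, V, H, W = L.shape
        out = np.zeros((H, W))
        for u in range(U):
            for v in range(V):
                dy = int(round(alpha * (u - U // 2)))  # shift grows with
                dx = int(round(alpha * (v - V // 2)))  # distance from center
                out += np.roll(L[u, v], (dy, dx), axis=(0, 1))
        return out / (U * V)

    L = np.random.rand(5, 5, 64, 64)  # toy 5x5 grid of 64x64 views
    near = refocus(L, alpha=1.0)      # focus on one plane
    far = refocus(L, alpha=-1.0)      # refocus on another, after capture
    print(near.shape, far.shape)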


To submit feedback about ACM TechNews, contact: [email protected]

To be removed from future issues of TechNews, please enter the email address at which you receive TechNews alerts at:
http://optout.acm.org/listserv_index.cfm?ln=technews

To re-subscribe in the future, enter your email address at:
http://signup.acm.org/listserv_index.cfm?ln=technews

As an alternative, log in at myacm.acm.org with your ACM Web Account username and password, and follow the "Listservs" link to unsubscribe or to change the email where we should send future issues.


News Abstracts © 2008 Information, Inc.


© 2008 ACM, Inc. All rights reserved. ACM Privacy Policy.