Welcome to the January 30, 2009 edition of ACM TechNews, providing timely information for IT professionals three times a week.
HEADLINES AT A GLANCE
Rapid Expansion Strains Elite Indian Institutes
Chronicle of Higher Education (01/30/09) Vol. 55, No. 21, P. A20; Neelakantan, Shailaja
Indian educators are very unhappy with the government's decision to rapidly double the number of Indian Institutes of Technology (IIT), ostensibly to help India retain its best and brightest and nurture economic expansion. They say the rollout is straining the resources of the existing IITs as they struggle to absorb new students, either by adding more courses or by sending faculty to teach at remote temporary sites. Faculty members, administrators, students, alumni, and industry experts agree that both the new and existing IITs will suffer from the government's haste in constructing the new campuses; the new institutes will be unable to deliver a sound education, while the older institutes' quality and international reputation will be tarnished. Former Pan-IIT alumni association chairman Pradeep Gupta complains that the expansion "should have been done in a more planned, phased manner." Exacerbating the situation are political factors, such as local politicians vying to have a new institute built in their state. Former Chennai IIT director P.V. Indiresan says reducing the competitiveness of the admissions process will lead to a corresponding downgrade in the value of degrees, while educators already facing a crushing workload due to the faculty shortage will have even less time for research, impeding the institutes' evolution into true research universities. Academics say that a possible government-approved hike in faculty salaries at the institutes will not address the shortage, as India does not produce enough high-quality engineering doctorates. Critics also say the heavy demands of academic jobs will discourage engineering graduates from becoming academics themselves, especially when the private sector offers much better pay.
With Economic Slump, Concerns Rise Over Data Theft
IDG News Service (01/29/09) McMillan, Robert
Laid-off employees are the biggest IT security threat created by the economic recession, according to a new McAfee study, which warns that cybercrime could cost businesses worldwide more than $1 trillion. The study surveyed 1,000 IT decision makers from 800 companies in eight countries. It says laid-off employees may steal intellectual property from their former employer in order to sell the information, improve their chances of being hired by a competitor, or start a company of their own. In addition, acquisitions can leave IT workers unsure of how, or to whom, to report security problems, and existing controls may go unmonitored during an acquisition. Finally, workers who are uncertain about their own or their colleagues' job security may be more hesitant to report security problems. Ignoring these problems can be costly: McAfee CEO Dave DeWalt says companies lose an average of $4.6 million in intellectual property during a security breach and spend about $600,000 to correct the problem. "We don't have the good risk models and as a result people are taking risks," says Purdue University computer science professor and study contributor Eugene Spafford. He says the frequency of security breaches will increase as companies try to cope with the recession by cutting their information security expenses.
Intel Forms Lab in Europe
EE Times (01/29/09) Clarke, Peter
The formation of Intel Labs Europe (ILE) will enable the chipmaker to reorganize its European research efforts under a single management structure. Martin Curley, professor of technology and business innovation at the National University of Ireland and global director of IT innovation at Intel, will serve as the director of ILE, which will establish "open labs" in Leixlip, Ireland, and Munich, Germany. Intel wants ILE to host participation in European Union Framework 7 (FP7) projects and other collaborative research with companies and universities in Europe. ILE projects will likely involve visual computing, software development, enterprise solutions, green computing, advanced microprocessor research, and high-performance computing. "With the foundation of ILE, Intel is establishing a strong network of its existing labs in Europe and preparing a platform for further potential investment and advanced innovation activity," the company says in a statement. Intel labs in Europe include facilities in Braunschweig, Germany; Barcelona, Spain; Gdansk, Poland; and Cologne, Germany.
Obama Unveils Cybersecurity Agenda
NextGov.com (01/23/09) Nagesh, Gautham
U.S. President Barack Obama has laid out a number of goals for improving the security of the nation's information networks. For instance, he has promised to declare the nation's IT infrastructure a strategic asset. In addition, the president has said he would appoint a national cyber advisor who would be responsible for developing a national cyber policy and for coordinating the efforts of federal agencies to improve cybersecurity. Obama also has pledged to prevent trade secrets from being stolen online from U.S. businesses by working with the private sector to develop new security technologies that would protect this information. Finally, the Obama administration has said it would work to develop the next generation of secure computers, software, and networking for national security applications and other vital parts of the nation's cyberinfrastructure.
46th DAC Technical and Pavilion Panel Committees Developing Exciting Panel Lineup
Business Wire (01/28/09)
Design automation professionals have begun to review the proposals that will become part of the lineup for the technical and pavilion panels of the 46th Design Automation Conference (DAC). They also will work with organizers and panelists to develop the sessions for the topics. Industry veteran Greg Spirakis, chair of the panel committee, is overseeing both the technical panel and pavilion panel committees. Juan Rey of Mentor Graphics is the chair of the technical panel committee, and the other members are CoWare's Eshel Haritan, IBM's Ruchir Puri, the Semiconductor Technology Academic Research Center's Hiroyuki Yagi, Texas Instruments' NS Nagaraj, Cadence Labs' Andreas Kuehlmann, and UCLA's Jason Cong. The pavilion panel committee is chaired by Rich Goldman of Synopsys, and the other members are Tiffany Sparks of Chartered Semiconductor, David Lin of Denali Software, Jim Lipman of Sidense, Dave Kelf of Sigmatix, Yatin Trivedi of Synopsys, and Sabina Burns of Virage Logic. "This diverse group of energetic and creative volunteers is assembling panel offerings which address a wide range of design challenges and some of the most important issues facing our industry," Spirakis says. The panel lineup and the program for the 46th DAC will be announced on May 4, 2009. The conference will be held July 26-31, 2009, at the Moscone Center in San Francisco.
Researchers Cool CPUs With Nano-Size Fridges
Computerworld (01/29/09) Lai, Eric
As chips become increasingly dense, the extreme heat they generate has pushed researchers to search for ways to mitigate it. Intel, RTI International, and Arizona State University researchers have developed a micro-refrigerator that can be installed on a chip to remove heat from hot spots. The micro-refrigerator, which would allow nanoscale systems to be smaller, uses less electricity than traditional heat sinks, fans, and liquid-cooling systems, says RTI senior researcher Rama Venkatasubramanian. The micro-refrigerator is a super-thin film made from thermoelectric materials such as bismuth telluride and antimony telluride, which convert between heat and electricity. Venkatasubramanian describes the technology as using electrons to pump heat away. The researchers say they have been able to reduce heat on a simulated central processing unit by 15 degrees Celsius, and Venkatasubramanian is optimistic that using more thermally conductive materials on the chip could raise the reduction to as much as 40 degrees Celsius. The micro-refrigerator also would be very efficient because it can target hot spots on the chip and uses only 2 to 3 watts when active.
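The heat balance behind such thermoelectric coolers is standard textbook physics: Peltier pumping proportional to the drive current, minus half the element's own Joule heating, minus heat conducted back across the film. The sketch below illustrates that balance; the parameter values in the example are illustrative assumptions, not measurements of the RTI device.

```python
def peltier_cooling_w(seebeck_v_per_k, t_cold_k, current_a,
                      resistance_ohm, conductance_w_per_k, delta_t_k):
    """Net heat pumped from the cold side of a thermoelectric element (W):
    Peltier term minus half the Joule heating minus back-conduction.
    Standard textbook balance; all inputs here are assumed example values."""
    peltier = seebeck_v_per_k * t_cold_k * current_a    # heat pumped by current
    joule = 0.5 * current_a ** 2 * resistance_ohm       # half the I^2*R heating
    conduction = conductance_w_per_k * delta_t_k        # heat leaking back
    return peltier - joule - conduction

# Illustrative numbers: net cooling shrinks as the hot/cold gap widens.
net = peltier_cooling_w(0.0002, 330.0, 3.0, 0.01, 0.005, 15.0)
```

Raising the current helps only up to a point, since the Joule term grows with the square of the current while the Peltier term grows linearly.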
Technology Review (01/29/09) Sauser, Brittany
Researchers at the Rochester Institute of Technology (RIT) and Lockheed Martin are working on a sensor-based monitoring system that assesses the health of a vehicle and alerts the driver to potential problems. The system uses a network of embedded smart sensors placed near problem-prone components, with readings wirelessly transmitted to a central command center for analysis. Nabil Nasr, director of RIT's Center for Integrated Manufacturing, says the system goes beyond existing technology by predicting a vehicle's future health and failures. The project is part of a $150 million contract between Lockheed Martin and the U.S. Marine Corps, which will equip 12,000 military vehicles with the technology so a vehicle's health can be quickly assessed before it is sent on a mission. The system includes both standard sensors for temperature and vibration monitoring and customized smart sensors for monitoring the vehicle, and Nasr says the researchers have developed sophisticated software for analyzing the data the sensors produce. "The algorithms are extremely valuable because they help us build a model of predictive and condition-based maintenance, so we can predict failures before they occur, and we can make determinations about service based on the actual conditions of the equipment," says Randy Weaver of the Rochester Genesee Regional Transportation Authority, which has been testing the technology for use in public transit systems.
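One simple flavor of the condition-based maintenance Weaver describes is trend extrapolation: fit a line to a sensor's recent readings and estimate when the trend will cross a failure threshold. The sketch below is a generic illustration of that idea, not the RIT/Lockheed Martin algorithms; the function name and hourly sampling are assumptions.

```python
def hours_until_threshold(readings, threshold):
    """Fit a least-squares line through hourly (time, reading) samples and
    return the extrapolated hours until the trend crosses `threshold`,
    or None if readings are not trending toward it.  A generic sketch of
    trend-based condition monitoring, not a production algorithm."""
    n = len(readings)
    ts = list(range(n))
    mean_t = sum(ts) / n
    mean_r = sum(readings) / n
    cov = sum((t - mean_t) * (r - mean_r) for t, r in zip(ts, readings))
    var = sum((t - mean_t) ** 2 for t in ts)
    slope = cov / var                      # change in reading per hour
    if slope <= 0:
        return None                        # flat or improving: no estimate
    fitted_now = mean_r + slope * (n - 1 - mean_t)  # fitted value at last sample
    return (threshold - fitted_now) / slope
```

For example, readings climbing two degrees per hour from 70 toward a 90-degree limit yield an estimate of six hours of remaining margin.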
Google Aims to Expose Network Meddling
InformationWeek (01/28/09) Claburn, Thomas
Google recently launched the Measurement Lab (M-Lab), an open research platform for testing Internet performance. M-Lab, which also is backed by the New America Foundation's Open Technology Institute, the PlanetLab Consortium, and academic researchers, is designed to provide Internet users with network-diagnostic information that can be used to identify network performance degradation. "Transparency has always been crucial to the success of the Internet, and, by advancing network research in this area, M-Lab aims to help sustain a healthy, innovative Internet," write Google's Vint Cerf and Stephen Stuart in a blog post. By making network performance data more accessible, Google hopes to make it more difficult for Internet service providers (ISPs) to degrade or block specific protocols or applications. The M-Lab effort is part of a battle between net neutrality proponents, who want to ensure that everyone has equal access to the Internet, and ISPs, which want the ability to choose who gets access to high-speed capabilities.
Weizmann Institute Scientists Create Working Artificial Nerve Networks
Weizmann Institute of Science (01/28/09)
At the Weizmann Institute of Science, Physics of Complex Systems Department professor Elisha Moses and former research students Ofer Feinerman and Assaf Rotem have created logic gate circuits made from living nerve cells grown in a lab. The researchers say their work could lead to an interface that links the brain and artificial systems using nerve cells created for that purpose. The cells used in the circuits are brain nerve cells grown in culture. The researchers grew a model nerve network in a single direction by getting the neurons to grow along a groove etched in a glass plate. Nerve cells in the brain are connected to a vast number of other cells through axons, and must receive a minimum number of incoming signals before they relay the signal. The researchers found a threshold, about 100 axons, below which the chance of a response was questionable. The scientists then used two thin stripes of about 100 axons each to create an AND logic gate. "We have been able to enforce simplicity on an inherently complicated system. Now we can ask, 'What do nerve cells grown in culture require in order to be able to carry out complex calculations?'" Moses says. "As we find answers, we get closer to understanding the conditions needed for creating a synthetic, many-neuron 'thinking' apparatus."
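The two-stripe AND gate lends itself to a simple threshold caricature: each input stripe contributes roughly 100 active axons, one stripe alone falls short of the downstream cell's firing threshold, and both stripes together exceed it. The toy model below is a software sketch of that logic, not the biological circuit; the threshold of 150 is an assumed value between one stripe's worth of input and two.

```python
def neuron_fires(active_axons, threshold=150):
    """Toy threshold unit: the downstream cell relays a signal only when
    enough incoming axons are active.  The threshold is an assumed value
    between one stripe (~100 axons) and two (~200)."""
    return active_axons >= threshold

def and_gate(a, b, axons_per_stripe=100):
    """Two input stripes of ~100 axons each: one active stripe stays below
    the firing threshold, both together cross it, yielding logical AND."""
    active = axons_per_stripe * (int(a) + int(b))
    return neuron_fires(active)
```

In this caricature, lowering the threshold below 100 would turn the same wiring into an OR gate, which hints at why the ~100-axon reliability threshold matters for building gates at all.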
Many Task Computing: Bridging the Performance-Throughput Gap
International Science Grid This Week (01/28/09) Raicu, Ioan; Foster, Ian; Zhao, Yong
Researchers from the University of Chicago, Argonne National Laboratory, and Microsoft have conceived Many Task Computing (MTC), a methodology designed to tackle the kinds of applications not easily supported by clustered high-performance computing or high-throughput computing (HTC). MTC "involves applications with tasks that may be small or large, single or multiprocessor, compute-intensive or data-intensive," the researchers write. "The set of tasks may be static or dynamic, homogeneous or heterogeneous, and loosely- or tightly-coupled." MTC is distinguished from HTC by the timescale of task completion and by the frequently data-intensive nature of the applications: many resources are used over short intervals to perform many computational jobs, both dependent and independent. The loosely coupled applications involved in MTC are communication-intensive but are not naturally expressed in the standard Message Passing Interface (MPI). Applications that run on or generate large volumes of data cannot scale without sophisticated data management, which makes them natural complements to MTC, the researchers say. They conclude that MTC's impact on science will be profound, noting that "we have demonstrated good support for MTC on a variety of resources from clusters, grids, and supercomputers through our work on Swift, a highly scalable scripting language/engine to manage procedures composed of many loosely-coupled components, and Falkon, a novel job management system designed to handle data-intensive applications with up to billions of jobs."
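The many-task pattern itself, lots of small, loosely coupled jobs dispatched to whatever workers are free and collected as they finish, can be sketched with Python's standard concurrent.futures pool; this is a minimal stand-in for illustration, not Swift or Falkon.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_many_tasks(tasks, workers=8):
    """Dispatch a bag of small, independent (fn, args) tasks across a
    worker pool and gather results as they complete.  Completion order
    is arbitrary, as in a many-task workload."""
    results = []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(fn, *args) for fn, args in tasks]
        for fut in as_completed(futures):
            results.append(fut.result())
    return results

# Example: ten tiny independent tasks run across the pool.
powers = run_many_tasks([(pow, (2, i)) for i in range(10)])
```

Real MTC systems add what this sketch lacks: scheduling across clusters and supercomputers, dependency tracking between tasks, and the data management needed at billions-of-jobs scale.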
Multicore Chips Leave Software Trailing, Warns Gartner
ZDNet UK (01/28/09) Barker, Colin
Software does not use multicore processors as effectively as it should, and the problem is likely to persist for years to come, Gartner warns. Manufacturers are now shipping 32 processors per socket, which means that machines could host 1,024 processors within four years. Running current software on such advanced multicore machines would be akin to powering a go-kart with a Ferrari engine, says Gartner analyst Carl Claunch. The rapid growth in cores per chip also multiplies the threads that each processor must handle, and with each generation the same number of sockets yields twice as many processors. Claunch notes that it is not easy to determine the hard and soft limits on the number of processors that the software running today's servers can exploit. "The net result will be hurried migrations to new operating systems in a race to help the software keep up with the processing power available on tomorrow's servers," he says.
Fight Brews Over How to Build a Better Internet
Christian Science Monitor (01/29/09) P. 2; Arnoldy, Ben
The U.S. Internet infrastructure is slower and more expensive than that of many other developed nations, and it reaches fewer citizens. About 10 percent of U.S. households do not have access to a high-speed or broadband data connection, and only 3 percent have fiber-optic connections capable of delivering the high-speed data rates that analysts believe will become a necessity in the future. Meanwhile, countries such as Sweden are already adopting the next generation of Internet infrastructure. The economic stimulus package currently being debated by the U.S. Congress includes $6 billion to upgrade America's Internet. Although experts disagree on how that money should be used, few question the importance of improving the Internet's infrastructure, and experts say slow Internet access is holding back economic and scientific progress. The Communications Workers of America says that a $5 billion investment in broadband expansion would create 100,000 new jobs in telecommunications and information technology. Large companies favor government tax breaks to promote rapid Internet expansion, while Internet service providers in underserved areas say they need grants to pay for extending broadband to remote locations. Under the current House bill, all the spending would be allocated through project grants rather than tax credits, which might not be used for another year. Rob Atkinson, president of the Information Technology and Innovation Foundation, says the government should provide both grants and tax breaks, and that much more funding is needed. "We'd have to invest $30 billion to get to the same level [of broadband penetration] as the Swedes," he says.
Fighting Malware: An Interview With Paul Ferguson
InfoWorld (01/23/09) Grimes, Roger A.
Trend Micro senior researcher Paul Ferguson says the sheer volume of malware today is incredible, and that the real challenge is collecting data from as many points as possible and arranging the facts so that law enforcement can use the information as evidence. "The better job we can do collecting and normalizing the data up front, the easier it is to help law enforcement to get subpoenas and arrest warrants," Ferguson says. In Russia, Ukraine, and Eastern Europe, a few large organizations make the majority of the malware, though they pretend to be many small groups. Part of Ferguson's job involves correlating data to identify members of these groups through digital fingerprints. The groups generally use tried-and-true techniques: their bots and worms are very similar, and attacks often come from the same IP addresses, hosts, and DNS services. Even these large groups, however, use numerous freelance, low-level operators who provide specific skills. A major problem is that many of the larger players exploit policy holes to operate in the open in countries such as Russia, where people like Ferguson are powerless to stop them. Ferguson says much of the malware coming from China actually originates with Russian groups that use the millions of unpatched PCs in China to launch attacks. He says most of the hacking in China, aside from the few professional criminal groups focusing on corporate espionage and the state-sponsored attacks on other governments, is actually social in nature.
University Researchers Collaborate With Virginia Department of Emergency Management to Develop Statewide Hazard Mitigation Plan
Virginia Tech News (01/22/09) Fay, Patrick
Researchers at the Virginia Tech Center for Geospatial Information Technology (CGIT) are working with Virginia's Department of Emergency Management (VDEM) to revise a statewide plan that communities use to prepare for natural disasters. The Virginia Standard and Enhanced Hazard Mitigation Plan identifies and profiles natural hazards throughout Virginia and suggests strategies to reduce the impact of these events. Most of the CGIT's work focuses on the Hazard Identification and Risk Assessment (HIRA) portion of the hazard mitigation plan, which identifies natural hazards relevant to Virginia, quantifies their relative significance, and determines which parts of the state are most at risk. HIRA also identifies state-owned and/or state-operated facilities and other critical facilities at risk. "Our use of advanced geospatial information systems allows us to take all this complex data and make sense of it," says CGIT environmental geographic information systems manager Rachael Heltz-Herman. "Rather than just using a quick and easy method, our approach delves deep into each subject area to ensure that we are using the most relevant sources." CGIT's Thomas Dickerson says the center investigates and compiles data on hazards from a variety of natural hazard subject areas in an effort to develop the mitigation strategies. Some data is already in a geospatial format and can be used directly in the plan, while other necessary data must be compiled from tables of historical occurrences, Dickerson says.
Q&A: Bruce Maxwell
Colby Magazine (01/09) Vol. 97, No. 4; Clockedile, Rob
Colby College roboticist Bruce Maxwell is engaged in a multi-institution effort to build a humanoid robot called HUBO. His students' contribution will be a vision system and social interaction controller-manager. The development of a vision system for social robots has been one of Maxwell's primary areas of concentration for the last eight or nine years, and he says the system "can do a lot of things, like find faces, find people and analyze their shirt colors, track blobs, and find text." Maxwell says this past summer his team built a testbed for programs and the vision system without the need for simulation, using a miniature robot. He says the effort is funded by the National Science Foundation through its Partnerships for International Research and Education program, whose goal is the creation of collaborations between American and overseas institutions, specifically "to take American expertise in social robots and combine it with the Korean expertise in humanoid robots." Maxwell also is involved in the development of robot avatars for museums, envisioning a system in which a visitor to one museum can operate a robot remotely to get a virtual tour of another museum. Maxwell foresees applications for robots in the home, but he says making robots functional within a person's home requires overcoming various challenges, including processing power constraints.
Abstract News © Copyright 2009 INFORMATION, INC.
To submit feedback about ACM TechNews, contact: [email protected]
Change your Email Address for TechNews (log into myACM)