Association for Computing Machinery
Welcome to the February 2, 2009 edition of ACM TechNews, providing timely information for IT professionals three times a week.

HEADLINES AT A GLANCE


Layoffs Mean More Than Lost Wages for H-1B Visa Holders
Mercury News (02/01/09) Carey, Pete

As technology companies cut back, many IT professionals are facing the reality of unemployment, which is daunting enough without having to worry about leaving the country. Skilled IT workers who are in the United States on H-1B visas are supposed to leave the country as soon as they lose their jobs. Workers brought in by labor contracting firms can remain in the U.S. while not working only if the contracting firm continues to pay them. Vish Mishra, president of the Silicon Valley networking group The Indus Entrepreneur, says most workers losing their jobs have a good chance of obtaining a new one because of a shortage of technical personnel. In practice, most H-1B employees can wait a week or two before leaving, but not much longer. The current cutbacks are small compared to the number of H-1B layoffs during the dot-com crash, but concern among H-1B workers has risen to a new level. The H-1B visa has been hotly contested since its creation in 1990, with labor groups trying to limit its use in favor of American workers and businesses trying to expand the number of visas beyond the current limit of 65,000 per year. The recent economic crisis has only heightened the tension, which has led to growing concern in India. The Telegraph of India recently reported that growing opposition could prove fatal to the H-1B program.


Internet Comes Before Electricity
New York Times (02/02/09) P. B4; Nicholson, Chris

University of Michigan engineers have set up a solar-powered satellite dish in Entasopia, Kenya, that provides the small village with Internet access through several computers. In many rural areas of Africa, mobile phones have become the main mode of communication. From 2002 to 2007, the number of Kenyans using cell phones increased almost tenfold to reach about a third of the population, many of whom do not have landlines, according to the International Telecommunication Union. However, many of the cell phones are designed more for talking than for Web browsing, and wireless data networks in the area are slow and have spotty coverage. Satellite connections are faster and more reliable, which is why companies such as Google, which funded the University of Michigan project, are looking to use satellites to provide Internet access to the estimated 95 percent of Africans who currently lack it. The dish in Entasopia, which is designed to operate for months in harsh conditions with little maintenance, along with two other satellite dishes in equally remote villages, is part of a larger effort by Google to provide small communities with new tools to access the Internet. Google paid for the final design of the stations and pays the monthly fees for satellite bandwidth. Google is uncertain whether these satellite stations can pay for themselves in rural areas, due to the cost of the equipment and bandwidth. Bandwidth fees for stations such as the one in Entasopia can cost up to $700 a month, though slower connections cost less.


Every IT Woman Needs a Peer Network: Here's Why
Computerworld (02/02/09) Farnsley, Gail

Much attention has been paid to the declining number of women entering computer science-related fields, but there are still many women entering and already in those fields, and the industry needs to ensure that they have a support network in place to keep them in the industry, writes Purdue University professor Gail Farnsley. Peer networking offers many benefits and can create opportunities and a strong support network for women in IT. Peer networking includes organized meetings that can provide a career-oriented agenda and ways to make contacts in the industry. A variety of local, regional, national, and international groups also are available, Farnsley notes. She says joining formal groups enables professionals and students to meet amazing and accomplished women and to learn from some of the top female IT professionals. More informal networking groups also are available that offer an opportunity for a more open forum to discuss whatever is on members' minds. Affinity groups are networks of people at the same company who share a common interest, not necessarily the same type of job. Members of these groups can include executives, managers, programmers, human resources professionals, marketers, and assistants. These groups provide a way to get to know people at different levels and in different departments within the company, and can be particularly beneficial to those looking to move to a new assignment. Finally, one-on-one mentoring also can be very beneficial, even if the mentor and mentee are of opposite genders.


Microsoft Searches for Group Advantage
Technology Review (01/30/09) Lemos, Robert

Microsoft researchers are investigating whether using data from several members of a social group, a technique Microsoft calls "groupization," can lead to better search results. Initial findings based on experiments involving about 100 participating Microsoft employees suggest that using different types of groups could produce significantly improved results. Microsoft researchers have developed an algorithm that, on average, pinpoints at least one search result for all members of a group that they judge to be better than other results returned by conventional search algorithms. Microsoft says the new approach could help the company overcome an industry-wide plateau in the quality of search results. "Today, search engines are really challenged and are sort of at the cusp of having to know individuals better," says Microsoft Research computer scientist Jaime Teevan. The researchers are exploring how people with similar interests or attributes search for information, grouping people using explicit factors such as their age, gender, participation in certain mailing lists, and job function. In some cases, implicit groups, such as people who appeared to be performing the same task or have similar interests, were inferred. The researchers found that groups defined by demographics such as age and location have little in common for most searches, but groups of people with similar interests tend to rank similar search terms highly.
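The article does not describe Microsoft's algorithm in detail, but the core idea of re-ranking results using several group members' preferences can be sketched. The following is a minimal illustrative example, assuming each member's personalized relevance score for a result is already available; the scoring scheme and names are hypothetical, not Microsoft's.

```python
# Illustrative "groupization" sketch: re-rank search results by averaging
# each group member's personalized relevance score, so results valued by
# the group as a whole rise to the top. All scores here are made up.

def groupize(results, member_scores):
    """Return results sorted by mean personalized score across members."""
    def group_score(result):
        scores = [scores_for.get(result, 0.0) for scores_for in member_scores]
        return sum(scores) / len(scores)
    return sorted(results, key=group_score, reverse=True)

results = ["r1", "r2", "r3"]
# Two hypothetical group members with different personalized scores.
alice = {"r1": 0.2, "r2": 0.9, "r3": 0.1}
bob = {"r1": 0.3, "r2": 0.8, "r3": 0.4}
print(groupize(results, [alice, bob]))  # "r2" ranks first for this group
```

A real system would derive the per-member scores implicitly, for example from click history or task similarity, as the article's distinction between explicit and inferred groups suggests.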


'There Is Vibrant Energy in C-DAC'
Frontline (India) (02/09) Vol. 26, No. 3, Ramachandran, R.

S. Ramakrishnan, director general of India's Center for Development of Advanced Computing (C-DAC), says in an interview that the center's diversification into numerous other IT-related activities and expansion of its portfolio of institutions is a necessary strategy in the facility's effort to organically build an ecosystem surrounding its core focus on high-performance computing (HPC). Ramakrishnan says that from C-DAC's perspective, the main challenge is maintaining reasonably good volumes of technology and products. He says that "we have been working on institutional mechanisms, how to push technology and products into some suitable instruments, etc. The reason is, at the end of the day, economics matter." Ramakrishnan says research and development remains an area where C-DAC can still perform outstanding work and sustain a competitive advantage. He cites the center's pioneering work in Indian language computing as one of its most significant accomplishments. "Now that... many more institutions have got merged, [C-DAC] has become one single institution that does research [in the field], brings out the basic tools, the resources, the corpora, and makes these things widely available, and also initiating constant work on adding to the spell checkers, etc. in all the 22 [Indian] languages," Ramakrishnan says. He believes that C-DAC's HPC research and development initiative will continue to fulfill an essential role despite the presence of many commercially available HPC systems and HPC vendors in India, given the facility's strong position in both the market and the user community.


Closing the Gap Between High-Speed Data Transmission and Processing
UCSD News (01/27/09) Ramsey, Doug

University of California, San Diego (UCSD) engineers have set world-record speeds for real-time signal processing while working to develop the first terabit-scale technology for optical processing. The researchers say the technology could lead to widespread improvements in networking, computing, and defense. UCSD professor Stojan Radic and his team demonstrated the first real-time sampling of a 320 Gbps channel. Developed at the California Institute for Telecommunications and Information Technology's (Calit2's) Photonics Systems Lab, the technology is part of an advanced program on parametric optical processing being funded by the Defense Advanced Research Projects Agency. "For the first time we have been able to process signals as fast as 320 Gbps by making more than eight copies of the signal and simultaneously sampling all the copies--thereby allowing us to do real-time processing," Radic says. The new technology set records for aggregate speed and the number of copies simultaneously sampled. "Calit2 has a strong interest in very fast optical processing in order to bridge the gap between transmission and real-time processing speeds," says Calit2 director Larry Smarr. "The future of the Internet--especially for data-intensive collaborative science--is predicated on finding ways to process data on the fly, even at the highest transmission rates." Radic says the goal of the four-year project is to achieve 1 terabit-per-second processing with a single technology platform.


Data Mining Promises to Dig Up New Drugs
ICT Results (02/02/09)

European researchers have developed a robot called Eve that uses artificial intelligence, data mining, and knowledge discovery technology to analyze the results of the pharmacological experiments that it conducts. The robot can make informed decisions on how effective different chemical compounds will be at fighting diseases, potentially providing more effective treatments and a faster development process for medicines. Eve relates the chemical structure of different compounds to their pharmacological activity to learn which chemical compounds should be tested next. "Over time, Eve will learn to pick out the chemical compounds that are likely to be most effective against a certain target by analyzing data from past experiments and comparing chemical structures to their pharmacological properties," says Jozef Stefan Institute researcher Saso Dzeroski. Dzeroski says Eve should help scientists and pharmaceutical companies identify more effective compounds to treat diseases, and help them find drugs in a fraction of the time and cost of current methods. Dzeroski says Eve is the first robot-based computer system capable of originating its own experiments, physically performing them, interpreting the results, and repeating the cycle. He says that instead of choosing compounds for testing at random, Eve can pick compounds that are more likely to be effective.
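Eve's cycle of picking promising compounds by comparing chemical structures to past results is a form of active learning. The snippet below is a toy sketch of that loop, not the actual system: compounds are represented as sets of "structural features," and the next candidate is the one most similar to a known hit. All compound names and features are invented for illustration.

```python
# Toy active-learning step in the spirit of Eve's cycle: score untested
# compounds by structural similarity to known effective compounds and
# pick the most promising one to test next.

def similarity(a, b):
    """Tanimoto-style similarity between two feature sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def pick_next(untested, known_hits):
    """Choose the untested compound most similar to any known hit."""
    return max(untested, key=lambda name: max(
        similarity(untested[name], hit) for hit in known_hits))

# Hypothetical structural features for three candidate compounds.
untested = {
    "cmpd1": {"ring", "amine"},
    "cmpd2": {"ring"},
    "cmpd3": {"halide"},
}
known_hits = [{"ring", "amine", "hydroxyl"}]
print(pick_next(untested, known_hits))  # "cmpd1" shares the most features
```

In the real system the robot would physically test the chosen compound, add the result to its data, and repeat, which is what makes the cycle a closed experimental loop.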


Is Technology Producing a Decline in Critical Thinking and Analysis?
UCLA News (01/27/09) Wolpert, Stuart

University of California, Los Angeles (UCLA) professor Patricia Greenfield says that critical thinking and analysis skills decline the more people use technology, while visual skills improve. Greenfield, the director of UCLA's Children's Digital Media Center, analyzed more than 50 studies on learning and technology. She found that reading for pleasure improves thinking skills and engages the imagination in ways that visual media cannot. She says the increased use of technology in education will make evaluation methods that include visual media a better test for what students actually know, and will create students who are better at processing information. However, she cautions that most visual media does not allocate time for reflection, analysis, or imagination. "Studies show that reading develops imagination, induction, reflection, and critical thinking, as well as vocabulary," Greenfield says. "Students today have more visual literacy and less print literacy." Greenfield also analyzed a study that found that college students who watched "CNN Headline News" without the news crawl on the bottom of the screen remembered more facts from the broadcast than those who watched with the crawl. She says this study and others like it demonstrate that multi-tasking prevents people from obtaining a deeper understanding of information.


Potholes Seen on Road to Silicon Photonics
EE Times (01/28/09) Merritt, Rick

Optical interconnects for multicore processors are still a ways off, even as the need for silicon photonics continues to grow, agreed a panel of experts from Hewlett-Packard (HP), IBM, Intel, the Massachusetts Institute of Technology (MIT), and Sun Microsystems at the recent Photonics West conference. The panelists said there currently is no available light source to power on-chip optics, while the cost, power consumption, and heat production of silicon photonics technology need to be lowered before the technology is viable. HP Labs optics researcher Ray Beausoleil says the technology needs to provide a byte of bandwidth per Flop of performance. "Teams are working on silicon lasers in many companies, but even if the technology was working today it would still be five years from commercialization," says Intel's Mario Paniccia. IBM's Jeffrey Kash says Intel has taken an interesting hybrid approach by using III-V materials for the gain function in ways that could be packaged to be compatible with silicon photonics. MIT professor Eugene Fitzgerald says a more practical solution would be III-V devices on-board, provided an inexpensive platform for eliminating heat could be found. Sun engineer Ashok Krishnamoorthy says his company prefers to not have the light source on the chip, or even in the data center, because cooling is a problem.


Wireless App Development Marching On
InfoWorld (01/26/09) Krill, Paul

The demand for wireless application development remains strong despite the struggling economy, concludes a new Evans Data Wireless Development study. Evans Data surveyed more than 400 wireless developers in the commercial and corporate enterprise sectors and found that 94 percent of corporate developers expect wireless enterprise application development to either increase or stay at the same level during the next year. "I think the biggest driver on the enterprise side is the fact that many of the wireless applications that are being developed for the enterprise are improving productivity, which of course is key," says Evans Data CEO John Andrews. Andrews says applications being built include location-based services and games. The strongest expectations for growth come from the Asia-Pacific region, where only 6 percent of developers expect wireless application development to decrease. Developers said revenue potential is the most important factor when selecting a wireless platform as a deployment target, with 21 percent identifying bigger market opportunities when selecting a platform and only 15 percent naming platform openness as a deciding factor.


Game Provides Clue to Improving Remote Sensing
Duke University News & Communications (01/27/09) Merritt, Richard

Duke University researchers have developed an algorithm capable of determining the best strategy for winning a game of CLUE; the underlying mathematical model could also be used to help robotic mine sweepers find hidden explosives. Duke post-doctoral fellow Chenghui Cai says robotic sensors, like players in CLUE, take information from their surroundings to help the robot maneuver around obstacles and find its target. "The key to success, both for the CLUE player and the robots, is to not only take in the new information it discovers, but to use this new information to help guide its next move," Cai says. "This learning-adapting process continues until either the player has won the game, or the robot has found the mines." Artificial intelligence researchers call these situations "treasure hunt" problems, and have developed mathematical approaches to improving the chances of discovering the hidden treasure. Cai says the researchers found that players who implement the strategies based on the algorithm consistently outperform human players and other computer programs. Duke professor Silvia Ferrari, director of Duke's Laboratory for Intelligent Systems and Controls, says the algorithm is designed to maximize the ability to reach targets while minimizing the amount of movement.
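The "treasure hunt" idea of using each new observation to guide the next move can be illustrated with a minimal sketch, which is not the Duke algorithm: maintain a probability belief over where the mine is, and greedily sense the cell that leaves the least expected uncertainty (Shannon entropy) afterward. The grid, probabilities, and perfect-sensor assumption are all simplifications for illustration.

```python
# Toy "treasure hunt" step: pick the next cell to sense so that the
# expected remaining entropy of the belief over the mine's location is
# minimized. Assumes a single hidden mine and a perfect sensor.
import math

def entropy(belief):
    """Shannon entropy (bits) of a probability distribution over cells."""
    return -sum(p * math.log2(p) for p in belief.values() if p > 0)

def expected_entropy_after(belief, cell):
    p = belief[cell]
    if p >= 1.0:
        return 0.0
    # If the mine is found there (probability p), uncertainty drops to
    # zero; otherwise, renormalize the belief over the remaining cells.
    rest = {c: q / (1 - p) for c, q in belief.items() if c != cell}
    return (1 - p) * entropy(rest)

def best_next_cell(belief):
    return min(belief, key=lambda c: expected_entropy_after(belief, c))

belief = {"A": 0.5, "B": 0.3, "C": 0.2}
print(best_next_cell(belief))  # the most informative cell to sense next
```

A fuller model in this spirit would also penalize travel distance, matching Ferrari's point about maximizing the ability to reach targets while minimizing movement.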


Iowa Staters Advance Developmental Robotics With Goal of Teaching Robots to Learn
Iowa State University News Service (01/30/09) Krapfl, Mike

Iowa State University professor Alexander Stoytchev is leading a team of graduate students that is researching how robots can learn the same things that children learn over the first two years of their lives. Stoytchev specializes in developmental robotics, which combines robotics, artificial intelligence, developmental psychology, developmental neuroscience, and philosophy. "It's one of the newest branches of robotics," he says. "People have learned that it's unrealistic to program robots from scratch to do every task, so we're looking at human models." Stoytchev's team is working on various software programs that will enable a robot to learn and use different sets of skills. Researchers are working to help a robot learn on its own which everyday objects it can use and which it cannot, how to use objects as tools, and how to understand language. "The essential goal of developmental robotics is for robots to learn how to learn," says graduate student Matt Miller. "We want them to learn how to take a situation, adjust to it, and learn from it." The researchers believe that simple interactions with objects will help the robot learn the common sense that people develop naturally through interactions in the real world. "The robots of the future will be generalists," Stoytchev says. "They will have the ability to learn how to perform new tasks on their own without human intervention."


UMD Professor Awarded $935,000 National Institutes of Health Research Grant to Study and Prevent Adverse Drug Reactions
University of Minnesota Duluth (01/21/09) Latto, Susan Beasy

University of Minnesota Duluth professor Ted Pedersen and University of Minnesota Twin Cities professor Serguei Pakhomov have been awarded a three-year, $935,000 National Institutes of Health research grant to develop natural-language processing (NLP) techniques that search through medical records to quickly detect widespread adverse drug reactions. Pedersen says the goal of the project is to improve the quality of post-marketing surveillance for adverse drug reactions. He notes that although the Food and Drug Administration approves all drugs before making them available, people often take so many drug combinations that it is not possible to test every interaction. Additionally, pharmaceutical companies may not have conducted enough studies to identify possible adverse reactions, he says. Pedersen will use NLP to develop methods that can identify different statements that have similar underlying meanings in medical records to enable the quick identification of patients who are taking similar combinations of drugs and possibly suffering from adverse reactions.


How Low Can You Go?
Nature (01/24/09) Hand, Eric

Stanford University researchers have created holograms that pack information into subatomic spaces. The researchers used the quantum properties of electrons, instead of photons, as a source to create a quantum analog of the conventional hologram. By encoding information into an electron's quantum shape, or wave function, the researchers were able to create a holographic drawing containing 35 bits per electron. "Our results will challenge some fundamental assumptions people had about the ultimate limits of information storage," says Stanford graduate student Chris Moon. With the use of a scanning tunneling microscope, the researchers placed carbon monoxide molecules onto a layer of copper, the holographic plate. The molecules were positioned to create a speckled pattern that would result in a holographic "s." The electrons that exist naturally at the surface of the copper served as the illumination by interfering with the carbon monoxide molecules to create a quantum hologram. The researchers read the hologram by using the microscope to measure the energy state of a single electron wave function, and saw that they could read an "s" with features as small as 0.3 nanometers.


Web 3.0 Emerging
Computer (01/09) Vol. 42, No. 1, P. 88; Hendler, Jim

Web 3.0 is generally defined as Semantic Web technologies that run or are embedded within large-scale Web applications, writes Jim Hendler, assistant dean for information technology at Rensselaer Polytechnic Institute. He points out that 2008 was a good year for Web 3.0, based on the healthy level of investment in Web 3.0 projects, the focus on Web 3.0 at various conferences and events, and the migration of new technologies from academia to startups. Hendler says the past year has seen a clarification of emerging Web 3.0 applications. "Key enablers are a maturing infrastructure for integrating Web data resources and the increased use of and support for the languages developed in the World Wide Web Consortium (W3C) Semantic Web Activity," he observes. The application of Web 3.0 technologies, in combination with the Web frameworks that run the Web 2.0 applications, is becoming the benchmark of the Web 3.0 generation, Hendler says. The Resource Description Framework (RDF), which links data from multiple Web sites or databases, serves as the foundation of Web 3.0 applications. Once data is rendered in RDF, the development of multisite mashups is enabled by the use of uniform resource identifiers (URIs) for blending and mapping data from different resources. Relationships between data in different applications or in different parts of the same application can be deduced through the RDF Schema and the Web Ontology Language, facilitating the linkage of different datasets via direct assertions. Hendler writes that a key dissimilarity between Web 3.0 technologies and artificial intelligence knowledge representation applications resides in the Web naming scheme supplied by URIs combined with the inferencing in Web 3.0 applications, which supports the generation of large graphs that can support large-scale Web applications.
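The mechanics Hendler describes, merging datasets through shared URIs and deducing new facts via RDF Schema, can be sketched without a full Semantic Web stack. The example below represents RDF triples as plain tuples and applies one RDFS-style entailment rule (rdfs:subClassOf); all URIs are hypothetical examples, not real vocabularies.

```python
# Illustrative sketch of RDF-style data linking: two "sites" publish
# triples, a set union merges them via shared URIs, and a subClassOf
# rule infers new rdf:type assertions.

site_a = {("http://ex.org/alice", "rdf:type", "http://ex.org/Researcher")}
site_b = {("http://ex.org/Researcher", "rdfs:subClassOf", "http://ex.org/Person")}

# Merging datasets is just a set union: shared URIs link them.
graph = site_a | site_b

def infer_types(graph):
    """Apply one RDFS rule to a fixed point: if s has type C and
    C is a subclass of D, then s also has type D."""
    inferred = set(graph)
    changed = True
    while changed:
        changed = False
        for s, p, c in list(inferred):
            if p != "rdf:type":
                continue
            for c2, p2, d in list(inferred):
                if p2 == "rdfs:subClassOf" and c2 == c:
                    triple = (s, "rdf:type", d)
                    if triple not in inferred:
                        inferred.add(triple)
                        changed = True
    return inferred

closed = infer_types(graph)
print(("http://ex.org/alice", "rdf:type", "http://ex.org/Person") in closed)
```

Production systems use dedicated triple stores and reasoners rather than this brute-force loop, but the principle, graph merging by URI plus rule-based inference, is the same one Hendler credits for Web 3.0's large-scale graphs.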


Abstract News © Copyright 2009 INFORMATION, INC.
Powered by Information, Inc.


To submit feedback about ACM TechNews, contact: [email protected]



