ACM TechNews sponsored by Learn more about Texis, the text-oriented database providing high-performance search engine features combined with SQL operations and a development toolkit, that powers many diverse applications, including Webinator and the Thunderstone Search Appliance.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Thunderstone or ACM. To send comments, please write to [email protected].
Volume 7, Issue 839: Friday, September 9, 2005

  • "Plan by 13 Nations Urges Open Technology Standards"
    New York Times (09/09/05) P. C2; Lohr, Steve

    A group of government officials from 13 nations will present a 33-page report at the World Bank calling for the international adoption of open-information technology standards, which are seen as critical to the expansion of economic growth, innovation, and efficiency. Several countries as well as some U.S. state governments are working to lower their reliance on proprietary software makers through the use of open source software. The report indicates that government policy ought to "mandate choice, not software development models," and makes a distinction between open technology standards and open source software. Still, open source software's proliferation clearly demonstrates how the open exchange of information technology can cut costs and enable users to innovate more easily. The report came out of a three-day February conference in Silicon Valley organized by Harvard Law School's Berkman Center for Internet and Society, where Brazilian, Indian, Chinese, and other government officials discussed tech standards and economic growth. Microsoft has actively participated in Web and Internet groups that have devised standards for data-sharing between different software programs, but some countries and states are pushing for open formats for documents, spreadsheets, and presentations as alternatives to proprietary formats such as Microsoft's Office programs. The World Bank favors open standards as a tool for economic growth in developing nations.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "High-Tech Goes Into Action in Disaster Zone"
    MSNBC (09/08/05); Boyle, Alan

    Rescue and relief operations in the region devastated by Hurricane Katrina are getting a boost from robots, interactive maps, meta-search engines, and other experimental high-tech tools. Lois Clark McCoy of the National Institute for Urban Search and Rescue says the response to the catastrophe highlights the need for a dramatic overhaul of disaster relief technologies. One tool that has proved very helpful in the rescue effort was the VGTV Xtreme robot, a video-equipped device that was used by Florida Task Force 3 to search for survivors or remains in structures deemed too unstable for human searchers. McCoy's institute and Autonomechs supervised a system for enhancing satellite imagery with virtual "pushpins" that indicate where survivors are and what they need in real time, which was employed by the U.S. Coast Guard shortly after the disaster to pinpoint rescue missions. Another tool employed in the relief effort is the online "Missing Persons Center," a joint project between the San Diego Supercomputing Center, the National Institute for Urban Search and Rescue, and AudienceCentral. The Missing Persons Center allows a person to enter data about missing people, and have that data matched up with multiple databases of survivors. The San Diego Supercomputing Center is also participating in the KatrinaSafe.com meta-search project, and the center's Greg Lund said at least 30 "safe lists" for survivors and their family members have been made available by various organizations.
    Click Here to View Full Article
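
    The article does not say how the Missing Persons Center matches submitted names against the survivor databases; a minimal sketch of cross-list fuzzy name matching, with invented names and list sources, might look like this in Python:

```python
from difflib import SequenceMatcher

def match_missing_person(query_name, safe_lists, threshold=0.8):
    """Search several independently maintained 'safe lists' for entries
    that closely resemble the queried name."""
    hits = []
    for source, names in safe_lists.items():
        for name in names:
            score = SequenceMatcher(None, query_name.lower(), name.lower()).ratio()
            if score >= threshold:
                hits.append((source, name, round(score, 2)))
    return sorted(hits, key=lambda h: -h[2])  # best matches first

# Hypothetical safe lists; real lists would come from dozens of organizations.
safe_lists = {
    "shelter_a": ["Marie Thibodeaux", "John Carter"],
    "red_cross": ["Mary Thibodeaux", "J. Carter"],
}
hits = match_missing_person("Marie Thibodeaux", safe_lists)
```

    Fuzzy rather than exact matching matters here because the same person is often entered with spelling variations across lists.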

  • "Vint Cerf: Google's New Idea Man"
    Associated Press (09/08/05)

    Google has scored a major coup with its hiring of Internet pioneer Vinton Cerf as its "chief Internet evangelist." Cerf, who will continue to serve as chairman of ICANN and as a visiting scientist at NASA's Jet Propulsion Laboratory, is considered one of the Internet's founding fathers because of his role in developing the TCP/IP communications protocol in the 1970s. With Robert Kahn, he recently received ACM's 2004 A.M. Turing Award, considered "the Nobel Prize for computing," for his achievements in computer networking. Cerf's installation at Google marks the end of his stint at MCI, where his work included the design of MCI Mail and development of advanced networking technologies that include multimodal services. Cerf said Google is generating opportunities for new applications through its buildout of a wholly new computing infrastructure, and he expects to spend much of his time developing such applications. Google CEO Eric Schmidt said in a recent interview that Cerf is perhaps the most significant intellectual acquisition the company recently made. Google's recent efforts to supplement its search engine include the launch of its Web-based Gmail email service, and its release of free software to organize computer files, sort digital photos, produce maps, and carry out Internet-based phone calls and text chats.
    Click Here to View Full Article

    An archived version of the ACM 2004 A.M. Turing Award lecture "Assessing the Internet: Lessons Learned, Strategies for Evolution, and Future Possibilities" is available at:

  • "Grammar Lost Translation Machine in Researchers Fix Will"
    USC Information Sciences Institute (09/01/05); Mankin, Eric

    The effectiveness of a computer translation system created by the University of Southern California's Information Sciences Institute (ISI) hinges on the brute force correlation of massive volumes of pre-translated text from media that are published in multiple languages, but ISI machine translation specialist Daniel Marcu says the output produced by this system takes a long time to read and is not viable for commercial use. To correct this problem, Marcu and ISI colleague Kevin Knight have embarked on the Advanced Language Modeling for Machine Translation project, a $285,000 initiative to enhance the system with grammatical processing. Language users can nest and cross-nest ideas and phrases into sophisticated referential structures with infinite facility, and programmers decided long ago to use the brute force method of matching phrases rather than attempt to trace branching "trees" of connections. The clear limitations of this strategy are prompting researchers to focus computing power toward the modeling of observed grammatical rules, and Knight cites the "Penn Treebank," a database of English text whose syntax has been hand-decoded by people, as a critical development. The USC/ISI machine translation system can generate a list of 25,000 English outputs for any input sentence. "We will re-rank these lists of candidate string translations with our tree-based language model, and we plan for better translations to rise to the top of the list," Knight and colleagues declared. Knight believes it is a reasonable short-term goal to give the system the ability to perceive separate trees from the limitless strings of words.
    Click Here to View Full Article
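
    The re-ranking step Knight describes can be sketched in a few lines; the sentences, scores, and the 0.6 weight below are all invented for illustration, standing in for the real phrase-based translation model and tree-based language model:

```python
def rerank(candidates, tm_score, syntax_lm_score, weight=0.6):
    """Blend the translation-model score with a syntax-based language-model
    score and sort, so grammatical candidates rise to the top of the list."""
    def combined(sentence):
        return (1 - weight) * tm_score(sentence) + weight * syntax_lm_score(sentence)
    return sorted(candidates, key=combined, reverse=True)

# Toy stand-ins for the real models; every score here is invented.
tm = {"the house blue is": 0.9, "the blue house is": 0.8}  # phrase-matching score
lm = {"the house blue is": 0.2, "the blue house is": 0.9}  # grammar score
ranked = rerank(list(tm), tm.get, lm.get)
```

    The point of the sketch: the grammatical candidate wins even though the phrase-matching model preferred the other word order.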

  • "Women Are 'Put Off' Hi-Tech Jobs"
    BBC News (09/08/05)

    An Intellect report finds that the British technology industry must make a better effort to recruit, persuade, and retain women in the high-tech work force, which is characterized by a bias toward males and a lack of female role models. The Office of National Statistics estimates that the percentage of women in technology industries declined from 27% to 21% between 1997 and 2005, while the British Computer Society reports that 28% of U.K. organizations do not employ female technologists. The Intellect report attributes the defection of women from the IT industry to long hours, a dearth of networking opportunities, and their perception of IT as a boys' club. The Department of Trade and Industry said it would be seeking to provide more role models for women through its work with various organizations. Meanwhile, the U.K. Resource Center for Women in SET (science, engineering, and technology) seeks to encourage more women to pursue SET careers and to raise female representation on industry and academic boards to 40% within a few years, collaborating with SET experts and employers to provide support, information, training, and mentoring programs. In addition, the Athena Project and the Scientific Women's Academic Network hope to curb the loss of female researchers through a six-point charter that addresses gender bias in British universities.
    Click Here to View Full Article

  • "New Search Engine 'Revolutionary'"
    University of New South Wales (09/08/2005)

    University of New South Wales Ph.D. student Ori Allon has developed a new search engine designed to enhance searches carried out on Yahoo!, Google, or other popular services. Search engines retrieve Web pages where specific keywords are found, but these pages are not always relevant to the topic. Allon's Orion search engine finds pages where the content covers a topic with a strong relationship to the keyword, and displays results as expanded text excerpts while listing other topics related to the keyword so the user can choose the most relevant. "You gain additional pertinent information that you might not have originally conceived, thus offering an expert search without having an expert's knowledge," Allon explains. Still, he says the option to click through Web sites will not be eliminated. Andrew Stead of New South Innovations believes Orion will fulfill Microsoft founder Bill Gates' recommendation that search be expanded beyond current popular perceptions.
    Click Here to View Full Article
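
    Orion's actual ranking method is not disclosed in the article; as a loose illustration of searching by related topics rather than bare keywords, one might imagine something like this (the topic mapping, URLs, and scoring are all hypothetical):

```python
def topic_expanded_search(keyword, related_topics, pages):
    """Rank pages by how much of the keyword's related-topic set their text
    covers, rather than by the mere presence of the keyword."""
    expansion = {keyword} | set(related_topics.get(keyword, []))
    results = []
    for url, text in pages.items():
        words = set(text.lower().split())
        coverage = len(expansion & words) / len(expansion)
        if coverage > 0:
            results.append((url, round(coverage, 2)))
    return sorted(results, key=lambda r: -r[1])

related_topics = {"hurricane": ["evacuation", "levee", "storm"]}  # invented mapping
pages = {
    "a.example/news": "hurricane storm levee evacuation news",
    "b.example/fans": "hurricane fan club photos",
}
results = topic_expanded_search("hurricane", related_topics, pages)
```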

  • "Coming to Grips With Robot Learning"
    IST Results (09/08/05)

    The IST-funded ArteSImit project has yielded a visual-motor system that controls an artificial hand via learning through mimicry. The project's goal was to uncover the neurophysiological underpinnings of finger and hand movements in humans and primates, and apply them to the design of a computer-operational dynamic model of imitation learning. The system uses a camera to monitor the environment, identifies the instructor's gestures from a set of predefined movements, and implements the correct sequence of arm, hand, and finger movements using the least amount of motion. The system only needs one camera to precisely recognize finger movements thanks to a unique algorithm. "Our final objective was to implement imitational learning on this artificial hand and to suggest applications of the new methods and models to other artifacts with many degrees of freedom that cooperate with humans," says ArteSImit project coordinator Alois Knoll. He believes the resulting techniques could clear the way for new kinds of prostheses. Knoll thinks the project's findings could also be applied to security systems by facilitating a method to capture and spot a person's movements in large crowds. The JAST project will continue the research initiated by ArteSImit with its goal to develop jointly-acting autonomous systems that communicate and function intelligently on mutual operations in dynamic unstructured settings.
    Click Here to View Full Article

  • "IU Researchers Develop Technology to Better Predict Storms"
    Inside Indiana Business (09/06/2005)

    Indiana University is one of nine academic institutions participating in the National Science Foundation-funded Linked Environments for Atmospheric Discovery (LEAD) project, whose goal is to develop a high-speed computing and network infrastructure for more accurately predicting and tracking storms and other dangerous weather events. The forecasts provided by such technology would be a boon to government and public safety officials who plan disaster recovery support, and would help reduce the uncertainty administrators must often contend with when issuing evacuation orders. IU principal investigator Dennis Gannon says the LEAD infrastructure would constantly monitor streaming data from ground sensors that detect and measure various meteorological factors and events, and pool and analyze input from satellites, visual observations, the NEXRAD radar network, and other sources. IU co-principal investigator Beth Plale says this data will be compared to historic patterns via data mining. "When the conditions are right for the formation of a severe storm, the system will be able to launch hundreds of simulations at the same time," she explains. Simulations rendered obsolete by additional sensor data will be eliminated, allowing the system to concentrate on more realistic models. IU recently won $2 million from the NSF to support LEAD and other "science gateways" through its participation in the TeraGrid project.
    Click Here to View Full Article
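
    Plale's pruning step can be sketched in a few lines; the ensemble values, tolerance, and field names below are invented, not part of the LEAD system:

```python
def prune_ensemble(simulations, observation, tolerance):
    """Discard ensemble members whose forecast has drifted more than
    `tolerance` from the latest sensor observation, so computing power is
    concentrated on the realistic models."""
    return [sim for sim in simulations
            if abs(sim["forecast"] - observation) <= tolerance]

# Invented ensemble: three runs forecasting, say, surface wind speed (m/s).
ensemble = [
    {"run": 1, "forecast": 42.0},
    {"run": 2, "forecast": 55.0},
    {"run": 3, "forecast": 46.5},
]
surviving = prune_ensemble(ensemble, observation=44.0, tolerance=5.0)
```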

  • "Academics, Industry Launch Internet Innovation Symposium"
    UW News (09/06/05); Gardner, Nancy

    The upcoming Seattle Innovation Symposium will focus on how academic researchers and private-sector IT leaders can collaborate to identify important emerging technologies and rapidly develop them into the Internet's next billion-dollar market segments. The University of Washington's Business and Information Schools and the College of Engineering will host next week's two-day conference, which is funded by a National Science Foundation grant. One of the event's objectives is to determine what new innovations will fuel business in the worldwide economy and how academic and industrial researchers can team up to shape an international research agenda. "By bringing together and building a network of multidisciplined leaders in the study of innovation, we believe that we can reduce the time to transfer new innovation into economic value," said UW Business School professor Dick Nolan. The inaugural symposium will involve approximately 100 participants divided into small teams; each team will examine one emerging technology and develop a business application based on its strengths and capabilities. Dean of the UW's Information School Michael Eisenberg lamented the limited amount of funding earmarked for IT research, and said, "We must have a better shared understanding of the interplay between research at universities, venture capitalists, and industry researchers if we are to think creatively and constructively about the next market segments." Conference attendees will include representatives of Asian, European, and North American academia, as well as commercial entities such as Microsoft.
    Click Here to View Full Article

  • "Report: IT Blueprints Should Address Privacy Issues"
    Washington Technology (09/07/05); Lipowicz, Alice

    The academic Task Force on Protecting the Homeland and Preserving Freedom has released a report recommending the employment of IT tools such as data mining, link analysis, biometrics, and data integration for homeland security, provided they address privacy issues in advance. "This paper is an exhortation to technology developers: Consider privacy at the start of any system development," wrote task force member Paul Rosenzweig of the Heritage Foundation. The task force thinks new technologies in cyberspace should adhere to existing legal and policy restrictions in physical space, and suggests intrusiveness would be kept to a minimum if IT systems for homeland security are voluntary and employed for limited purposes. Rosenzweig indicated that data mining should be used to enhance information-gathering rather than as a source of information by itself. The task force advises the use of distributed rather than centralized architectures for data collection in order to lower the chances of abuse, and the employment of technologies that keep individuals anonymous while allowing them to be uniquely identified. The group also recommended that new IT technologies have built-in oversight measures that either thwart tampering or make tampering obvious, along with automatic audit functions that log all activity for review later. Furthermore, the report points to the need for a formal redress process to contend with false positive identifications.
    Click Here to View Full Article
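
    One common way to keep individuals anonymous yet uniquely identifiable, which the report does not spell out, is a keyed hash that yields a stable pseudonym; a minimal sketch:

```python
import hashlib
import hmac

def pseudonym(identity, key):
    """Derive a stable pseudonym: the same identity always maps to the same
    token (so records and audit logs can be linked), but the token reveals
    nothing about the identity without the secret key."""
    return hmac.new(key, identity.encode(), hashlib.sha256).hexdigest()[:16]
```

    Analysts can then mine and cross-reference pseudonymous records, while re-identification requires whoever holds the key, which is a natural point for the oversight and audit controls the task force recommends.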

  • "Free Access Service Allows Remote Networking"
    Washington University (St. Louis) (09/07/2005); Fitzpatrick, Tony

    Washington University has unveiled its Open Network Laboratory (ONL), providing its students with a free and remote networking service that will hopefully spur innovation and enhance the capability of the Internet for its users. The computer scientists who developed ONL created their own gigabit router, citing frustration over the opacity of commercial products. Jonathan Turner, the program's director and a professor of engineering and computer science, also views the open source project as a learning opportunity for students. The NSF-funded project centers around a group of open, extensible, high-performance routers with access available to remote users via a Remote Laboratory Interface (RLI); the RLI permits users to alter the software at work in the embedded processors, as well as the FPGA-implemented hardware. Through the RLI, users can configure the testbed, and run and monitor applications through information-collecting devices provided by the routers. Data visualization and remote displays help users gain insight into the activity of the operating environment. The routers are constructed around a scalable switch fabric and are closely related to commercial routers architecturally. The routers' ports each contain an embedded microprocessor to accommodate software plugins, and users can expand the functionality by writing their own plugins. That type of extensibility is rapidly coming to define the networking arena, and Turner is hopeful that it will help ONL become "a useful service for researchers and students, both at Washington University and around the country."
    Click Here to View Full Article

  • "Early in the Game: RPI Creates Video Game Major"
    Albany Business Review (09/04/05); D'Errico, Richard A.

    Starting next fall, a video game development major will be available to Rensselaer Polytechnic Institute students, who will be required to take courses not only in computer programming and simulation, but in psychology, calculus, and Shakespeare as well. Director of RPI's Social and Behavioral Research Laboratory Jim Watt thinks the institute's gaming major may be one of the first ever offered at a traditional U.S. college, and he estimates that there are over 700 programs, certificates, or courses offering games studies in the nation. The RPI gaming major's game-related courses include a focus on game architecture, game history and culture, and artificial intelligence for games, which Watt says were suggested by people in the gaming industry. Participating students must have a double major: a bachelor of science in games and simulation arts and sciences, and a bachelor's in computer science, communication, psychology, management, or electronic arts. Carnegie Mellon, MIT, the University of Southern California, and Georgia Tech are among the U.S. schools that offer games studies. Vicarious Visions CEO Karthik Bala says RPI's game-developing curriculum came out of a grassroots crusade by students who took courses in computer science, psychology, and the arts to ready themselves for the game industry long before the university started offering an undergraduate minor in games studies.
    Click Here to View Full Article

  • "Women in IT: How Are We Doing?"
    rabble.ca (09/02/05); Scott-Dixon, Krista

    In her book, "Doing IT: Women Working in Information Technology," Krista Scott-Dixon describes IT as a blend of both positive and negative that is alternately stifling, liberating, limiting, and vitalizing for women. "The mundane minutiae of people's daily experiences with information technologies have smoothed the cutting edge of the 'information revolution,'" she explains. "At the same time, the banality of these technologies can conceal their potential to enable dramatic changes in work practices." Scott-Dixon reports that women in IT remain a minority, generally earn less and do more uncompensated work than their male counterparts, and are still confronted with both subtle and obvious discrimination along racial, sexual, social, and age-related lines. Few women enroll in technical fields in universities, and those who do soon drop out; most women end up in IT by accident rather than by choice. But Scott-Dixon refuses to rationalize the lack of female IT workers with pat explanations such as an innate dislike of technology, natural disinterest in the field, or cognitive limitations. She illustrates her point by noting that many women she has spoken to regard IT as a stimulating and empowering field, and this observation is backed up by a Statistics Canada survey in which more than 50% of respondents said their work has become more interesting since technology was introduced.
    Click Here to View Full Article

  • "Automated Code Inspection at Early Stages Can Reduce Software Failures"
    Business Standard (India) (09/02/05); K, Arunkumar

    The market for static testing tools stands to grow considerably as a result of the focus in India on testing software at the coding stage. The Indian software industry views static testing as a way to save costs by improving productivity as compensation expenses continue to escalate. In particular, testing code as it is being developed cuts into quality expenditures such as the cost associated with preventing, finding, and fixing defective software, which accounts for as much as 20% to 40% of sales. "Many of these costs can be significantly reduced by static analysis at the coding stage itself," explains L. Narayanan, business development manager (India), Programming Research. "This also reduces the workload on manual testing." Carnegie Mellon University's Software Engineering Institute says "if you want a quality product out of test, you must put a quality product into test," and warns that only a small number of defects are removed by manual testing. Testing by Alcatel revealed that static testing produced savings in development and maintenance costs, as well as a shorter project life cycle.
    Click Here to View Full Article
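
    As a concrete, if toy, illustration of static analysis at the coding stage (not one of Programming Research's actual checks), a few lines of Python can flag a defect-prone pattern before any test is run:

```python
import ast

def find_bare_excepts(source):
    """A toy static check: report the line numbers of `except:` clauses that
    name no exception type, a pattern that silently swallows errors and that
    manual testing rarely surfaces."""
    tree = ast.parse(source)
    return [node.lineno for node in ast.walk(tree)
            if isinstance(node, ast.ExceptHandler) and node.type is None]
```

    The check inspects the code's syntax tree without ever executing it, which is what lets static tools run continuously while code is still being written.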

  • "The Changing User Interface"
    InformationWeek (09/05/05) No. 1054, P. 39; Ricadela, Aaron

    The importance of improved user interfaces becomes clear as the Web grows more diffuse, and users try to organize a burgeoning number of digital files and companies attempt Web-based brand promotion. Perhaps the most dramatic changes to the desktop look and experience will come from Microsoft's Windows Vista, which will feature the Avalon graphics software system. Avalon will bring vector graphics to Windows; make on-screen typefaces less difficult to read; add smoothness and translucency to Vista's desktop and window panes; ease user access to audiovisual capabilities in Windows; and allow software companies to create hybrid Windows/Web software through its XAML programming language. The increased mobility of personal computing will impel cellular phone makers to make PDA-hybrid smart phones with easy and rapid navigation, although panelists at the AlwaysOn Innovation Summit said features should be light while layered menus should be absent. Macromedia CEO Stephen Elop pointed to the need for consistency in the user interface across the various platforms a consumer may use for routine chores. The look of the Web is being transformed by Macromedia's Flash products and asynchronous JavaScript plus XML (Ajax) technologies: The former can upgrade the performance of refined Web visual effects (animation, graphs, video) to add more compelling content to sites, and the latter improves the snap and responsiveness of Web pages while also easing the development of new applications. Sharing large computer files through small portable devices is one of the goals of Georgia Tech professor Jeff Pierce's research group, whose Serefe system assigns instant-message addresses to a user's PC and cell phone. Another brainchild of Pierce's team is user interfaces that are consistent across various devices.
    Click Here to View Full Article

  • "Science Fiction?"
    Economist (09/03/05) Vol. 376, No. 8442, P. 59

    As businesses are slow to return to pre-dot-com-bust spending levels, electronics manufacturers are targeting the individual consumer, though their vision of the digital home fails to resonate with many; a number of technology companies are placing their bets on that sector, though, and undertaking marketing initiatives to stimulate demand. The cynical view holds that companies are excited not so much about the improvements the digital age will herald as about the equipment consumers will have to purchase to get there. A recent survey by Parks Associates found that consumers are still preoccupied with utilitarian concerns about technology, as 89% of those with a computer network at home cited sharing Internet access as their top priority; the study also found that online media sharing has a marginal influence, as online sales constituted only 2% of total music revenue in North America and Europe at the end of 2004, with only moderate growth predicted. In general, most consumers seem to feel that existing technology works well enough, and that a thorough overhaul to digitize their home is unnecessary. Some technology vendors are undeterred, though, and believe it is necessary to demonstrate the potential of their products in order to generate interest. Central to the vision of the digital home is interoperability between the file formats and codecs that define digital information and the digital rights management software that dictates the terms of its use. The incompatibility among products of different companies and different file formats, coupled with the uncertainty of which will last, has made consumers reluctant to pour money into emerging technologies. Within a labyrinthine marketplace, a coherent formula for the digital home that resonates with consumers must emerge before it ever becomes a reality.
    Click Here to View Full Article

  • "Lab Steps Up to Challenges of RFID Tech"
    EE Times (09/05/05) No. 1387, P. 12; Yoshida, Junko

    The RFID Lab at the University of Wisconsin is working to resolve a litany of challenges standing in the way of the commercialization of RFID (radio-frequency identification) technology. Difficulties involving logistics, signal strength and consistency, and everything in between are preventing RFID from realizing its cost-effective potential, but one great step forward for the technology will be the forthcoming ratification of the Electronic Product Code (EPC) Generation 2 standard, which will ensure that all tags based on the standard are readable anywhere in the world. Alfonso Gutierrez, who heads the lab, believes that beyond standardization, there are many other issues blocking RFID implementation. To lower costs, the industry must be able to mass-produce millions of RFID tags, though the technology is resistant to a one-size-fits-all approach, which Gutierrez identifies as a central obstacle to the individual tagging process that would realize the technology's full potential. The UW RFID lab is experimenting with conductive inks and applying RFID tags to substrates, which it hopes will accelerate implementation, in keeping with its specific focus on materials science and antenna design. One of the lab's most important features is its anechoic chamber, which simulates a pure environment for tag antenna design, measures the tag-chip impedance, and characterizes parameters such as polarization, gain, and radiation pattern. Eventually, the lab expects to move toward the integration of sensors into RFID tags, which could produce an affordable network capable of communicating detailed information wirelessly.
    Click Here to View Full Article

  • "Digital-Divide Efforts Are Getting More Attention"
    Internet Computing (08/05) Vol. 9, No. 4, P. 8; Goth, Greg

    Excitement is building over large-scale and small-scale efforts to bridge the digital divide between technology haves and have-nots around the world, although these initiatives have their share of skeptics. Perhaps the most ambitious project is the MIT Media Lab's program to fabricate and distribute hundreds of millions of $100 laptops, on the order of one laptop per child, over the next few years, using a highly heterogeneous connectivity architecture. Media Lab founder Nicholas Negroponte envisions the battery-powered laptops running the Linux operating system and featuring 1GB of memory and 500MHz processors, but Samuel Danofsky with the United Nations Information and Communications Technologies (U.N.-ICT) Task Force says there are doubts as to whether the MIT project's goal is realistic. The U.N. task force is split into four working groups, one of which is attempting to define cross-cultural open access. Among the concepts the group is considering is FiberAfrica, a coordinated fiber-optic infrastructure shared by all stakeholders that would enable most Africans to enjoy virtually free data connectivity within a reasonable distance. Danofsky thinks the coordination of last-mile global-access initiatives could be improved by combining efforts such as FiberAfrica with U.N. organizational resources, but others believe such an approach is too complicated. Opportunity Access founder Charles Moore claims his project to provide tech resources for Costa Rican students, social-service agencies, and others is doing well without a reliance on a larger coordinating entity. Meanwhile, software entrepreneur Donna Auguste says the decision to set up or broaden a digital-divide project must be based on whether it is sustainable in the local environment.
    Click Here to View Full Article

  • "State of the Art: How We Measure Up"
    The New Atlantis (08/05)

    High-tech luminaries such as Microsoft Chairman Bill Gates have called attention to an apparent decline in the performance of U.S. students in math and science that could threaten America's competitive edge in the global economy. Yet there is no assurance that the standard measures of math and science education throughout the world--the Program for International Student Assessment (PISA) and the Trends in International Mathematics and Science Study (TIMSS)--accurately reflect America's real educational standing. A multitude of factors (such as demographics, culture, differing school systems, curricula, and sampling accuracy) may affect the results of the tests in such a way as to compound the uncertainty of performance ratings. For example, the American educational system's broad curricular focus is likely to result in disproportionately poor performance on standardized tests, while students in countries whose curricula focus on fewer subject areas are likely to do better. Also casting doubt on the accuracy of such tests is the fact that the U.S. is still churning out new Ph.D.s in huge numbers--indeed, studies indicate the U.S. job market cannot keep up with the flood of engineering and science graduates with advanced degrees. Furthermore, no nation that has long surpassed the U.S. in educational studies can yet compete with the U.S. in terms of research productivity. The poor performance of U.S. minorities in math and science is a serious concern, suggesting that some of these groups will be unable to secure high-paying jobs as U.S. companies hire better-educated foreign professionals. Increased competition from China and other countries should also spur the U.S. to improve its science and math education.
    Click Here to View Full Article

    [ Archives ]  [ Home ]