Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Thunderstone or ACM. To send comments, please write to technews@hq.acm.org.
Volume 7, Issue 819: Friday, July 22, 2005

  • "Senators Promise 'Brain Drain' Bill"
    InternetNews.com (07/21/05); Mark, Roy

    U.S. Senators plan to present legislation to replenish the ranks of America's science and engineering graduates, which many agree are essential to maintaining the country's leadership in global innovation. "New ideas, the development of new technologies and innovation will lead to a better educated workforce, a higher standard of living in the United States, and a strengthened American economy," declared Sen. John Ensign (R-Nev.). Sen. Joseph Lieberman (D-Conn.) said the number of jobs requiring technical training is growing at a rate five times that of other occupations, while the number of students entering those fields is falling even as the American science and engineering workforce ages. Meanwhile, Sen. George Allen (R-Va.) estimated that America produces an average of 50,000 engineering graduates annually, compared to 150,000 from India and 250,000 from China. The lawmakers did not offer a detailed description of the bill they intend to propose next week, although they said it is based on the 2004 National Innovation Initiative Report from the Council on Competitiveness. The report calls for the creation of 5,000 new graduate fellowships underwritten by federal R&D agencies, an overhaul of immigration laws to permit science and engineering students from abroad to reside and find employment in the United States, and the construction of 10 "innovation hot spots" over the next five years.

  • "HP Drops 4 Research Groups in Downsizing"
    San Francisco Chronicle (07/22/05) P. C1; Pimentel, Benjamin

    Four research groups at HP Labs will be dropped as part of Hewlett-Packard's latest downsizing effort, one of which was headed by renowned computing pioneer Alan Kay. Kay won the ACM's A.M. Turing Award and the National Academy of Engineering Charles Stark Draper Prize, and was a co-designer of ARPAnet, the Internet's predecessor. HP Labs' Dave Berman says Kay's group dealt with advanced software research, but insists that HP's research agenda will remain "rich and varied" despite the loss. The other groups eliminated by the downsizing include a pair of Palo Alto labs focusing on consumer applications and emerging technologies, and a Cambridge, Mass., group whose work covered health care and medical issues. Berman cites HP Labs' continuing quantum computing and nanotechnology research as evidence of the company's dedication to R&D-supported innovation. "We are trying to refocus our research into the areas of greatest promise and our core strength and those areas which are most important to HP in the medium and long term," Berman says. Analyst Michael Dortch warns that HP could put itself at a disadvantage if it no longer has access to talent of Kay's caliber. He says, "How many times in a lifetime does a company like HP get access to a mind like Alan Kay's?"
    Click Here to View Full Article

  • "Europe Moving in R&D Slow Lane"
    BBC News (07/20/05)

    Europe's science and technology R&D investment is paltry compared to that of America and Asian countries, according to the latest statistics, and EU research commissioner Janez Potocnik warned at a recent media conference that "Europe will lose the opportunity to become a leading global knowledge-based economy" unless action is taken. The European bloc committed less than 2 percent of its GDP to R&D in 2003, compared to America's 2.59 percent and Japan's 3.15 percent. China's R&D investment is even smaller than Europe's at 1.3 percent, but the country is on track to overtake Europe, with investment growing approximately 10 percent per year. Potocnik said Europe's popularity as a center for research is waning: Statistics show that EU companies increased their R&D spending in America by 54 percent between 1997 and 2002, while U.S. companies raised their R&D spending in the EU by only 38 percent over the same period. A major underwriter of public-level research in Europe, the EU's Framework 6 program, is about to be succeeded by a Framework 7 program, which will establish a European Research Council (ERC) modeled after the U.S. National Science Foundation and National Institutes of Health. Advocates believe the ERC will help improve the quality of European scientific research by more clearly defining a funding distribution methodology that fuels competition. The council and its approach to funding will be influenced by the high-profile EU scientists who were named this week as the ERC's first members.
    Click Here to View Full Article

  • "Watching the Future Unfold"
    HPC Wire (07/22/05) Vol. 14, No. 29; Barker, Trish

    Researchers at the University of Illinois at Urbana-Champaign have developed the Land-use Evolution and Impact Assessment Model (LEAM), a computer simulation that enables urban planners, government officials, citizens, and other interested parties to visualize and measure the likely outcomes of policy decisions. The model evaluates the growth potential of a given area by dividing that area into cells and factoring in data on economics, utilities, transportation, and neighboring land use for each cell. Each factor is weighted to ascertain the likelihood of change within specific cells, as well as the most probable kind of change. The project, which is funded by the State of Illinois, the Defense Department, and the National Science Foundation, draws its computational muscle from high-performance computing clusters at the National Center for Supercomputing Applications (NCSA), and stores the massive volumes of data produced by the simulations on NCSA's 3-petabyte DiskXtender mass storage system. Former NCSA computing and communications associate director Jeff Terstriep was instrumental in making these resources accessible to the LEAM team by parallelizing the LEAM code and benchmarking it on up to 1,024 processors. As a result, simulations that once took hours to complete can now be carried out in less than half an hour on the center's IBM p690 system. LEAM simulations churn out 300 MB to 500 MB of data per computational run, and Terstriep says DiskXtender delivers three core advantages to the LEAM team: high access rates, high bandwidth, and permanence.
    Click Here to View Full Article
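
    The cell-based growth model described above can be sketched in a few lines of code. The following toy simulation is purely illustrative: the factor names, weights, and transition rule are invented for this example and are not drawn from the actual LEAM code.

```python
import random

# Toy cellular land-use model in the spirit of LEAM (illustrative only).
FACTORS = ["economy", "utilities", "transport", "neighbors"]
WEIGHTS = {"economy": 0.4, "utilities": 0.2, "transport": 0.2, "neighbors": 0.2}

def development_probability(cell):
    """Combine per-cell factor scores (each 0..1) into one development probability."""
    return sum(WEIGHTS[f] * cell[f] for f in FACTORS)

def step(grid, rng):
    """Advance the simulation one step: each undeveloped cell may develop."""
    for cell in grid:
        if not cell["developed"] and rng.random() < development_probability(cell):
            cell["developed"] = True
    return grid

rng = random.Random(42)
grid = [{f: rng.random() for f in FACTORS} | {"developed": False} for _ in range(100)]
for _ in range(10):
    step(grid, rng)
print(sum(c["developed"] for c in grid), "of", len(grid), "cells developed")
```

    The real model runs far richer per-cell rules over millions of cells, which is why parallelizing the code across NCSA's clusters mattered.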

  • "Driven to Distraction by Technology"
    CNet (07/21/05); Fried, Ina

    Digital communications technologies designed to boost worker efficiency--email, instant messaging, and so on--are having the opposite effect with the constant interruptions they present. A Hewlett-Packard study estimates that 62 percent of British adults are addicted to email, checking their messages incessantly even when doing so disrupts conferences, social gatherings, and other important events. Journalist Carl Honore reports a corporate cultural attitude that emphasizes rapid response to communications, noting that "People start to look at you with contempt or disgust if you shift away from the technology." He believes a collective effort to unplug from disruptive communications could benefit businesses, and cites the introduction of "Email free Fridays" at Veritas Software's marketing department as a case in point: By prohibiting email between employees on Fridays, the department prompted more face-to-face communication, and executive VP Jeremy Burton found the amount of email filling his in-box cut in half. IBM Almaden Research Center manager Dan Russell has dramatically reduced the distractions he once faced by not bringing his cell phone to work and reading his email just twice a day, having deactivated the settings that deliver a message the moment it arrives. Russell says humans cannot multitask well, a view supported by studies suggesting that people cannot effectively attend to multiple simultaneous messages. Microsoft Chairman Bill Gates says software needs to filter information better and present it in more useful ways, while Chris Capossela of Microsoft's Information Worker unit says improved software alone cannot eliminate digital distractions; a change of priorities is also needed.
    Click Here to View Full Article

  • "IEEE Starts Hammering Out Mesh Network Standard"
    IDG News Service (07/20/05); Lawson, Stephen

    Fifteen proposals for a wireless mesh network standard are being submitted to the IEEE 802.11s working group this week, and Nortel Networks' Bilel Jamoussi says that number will be pared down to a single draft perhaps as soon as May 2006. Mesh networks allow multiple access points to carry one another's traffic, making wired connections in wireless local area networks (WLANs) less necessary. A traditional wireless access point requires an individual wired connection to a backbone network, but a wireless mesh can serve numerous access points with a single wire, permitting Internet-destined traffic to hop from access point to access point until it reaches the wired link; the need for leased lines is reduced, and costs are lowered, according to Jamoussi. A new access point added to the mesh can be automatically configured for security, quality of service, and other characteristics. Although wireless mesh networks are already available, Jamoussi says an access point from one vendor may not necessarily interoperate with equipment from other vendors; a standard would lessen the workload for product developers, lower product prices, and allow customers to choose among mesh vendors. Nortel is a member of the Wi-Mesh Alliance, a group whose 802.11s proposal is designed to work for metropolitan, military, and consumer/small business mesh technology applications. Anuj Batra with Texas Instruments acknowledges that there is a lot of overlap between the various 802.11s proposals, which is a positive sign; TI is part of the Simple, Efficient, and Extensible Mesh (SEEMesh) group, which is submitting its own proposal to the IEEE.
    Click Here to View Full Article
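
    The hop-by-hop forwarding idea is easy to illustrate. In the sketch below, a handful of access points share one wired gateway, and traffic takes the shortest hop sequence to reach it; the topology and node names are invented for this example, and real 802.11s path selection uses radio-link metrics well beyond simple hop counts.

```python
from collections import deque

# Minimal sketch of mesh forwarding: traffic hops between access points
# until it reaches the one node that has a wired backbone link.
links = {
    "AP1": ["AP2"], "AP2": ["AP1", "AP3", "AP4"],
    "AP3": ["AP2", "gateway"], "AP4": ["AP2"], "gateway": ["AP3"],
}

def route_to_gateway(start, gateway="gateway"):
    """Breadth-first search for the shortest hop sequence to the wired link."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == gateway:
            return path
        for nxt in links[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(route_to_gateway("AP1"))  # ['AP1', 'AP2', 'AP3', 'gateway']
```

    Only the gateway needs a leased line; every other access point reaches the backbone over the air, which is the cost saving Jamoussi describes.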

  • "Sun Plans to Make All Its Software Free"
    InfoWorld (07/21/05); Krill, Paul

    Sun Microsystems President Jonathan Schwartz announced at the AlwaysOn conference on July 21 that his company intends to ultimately make all of its software freely available as a community-building strategy. Schwartz estimated that 2 million licenses for the open-source version of Sun's Solaris operating system have been downloaded since its release, and praised open source as a driver of innovation as well as cost reduction. "We've been trying to faithfully explore how to deliver our products and technologies for free," he said; besides Solaris, Java and other Sun technologies can also be downloaded at no cost. Schwartz argued that technology commoditization builds up markets by generating continuous demand. The AlwaysOn conference was also the stage for a "Future Disruptive Technologies" session where Stanford and MIT panelists pointed to a lack of interest in computer science and engineering among U.S. students. Stanford School of Engineering dean Jim Plummer noted that over 50 percent of computer science and engineering students are foreign students. MIT associate provost Alice Gast said breakthroughs in one scientific discipline can yield advantages for another: For instance, laser research is currently being applied to surgery, while quantum mechanics research may one day be used to transcend the limitations of Moore's Law.
    Click Here to View Full Article

  • "U.S. Universities, Industry in Win-Win Agreement With India to Improve Engineering Education"
    UCSD News (07/20/05); Ramsey, Doug

    American universities have entered into a partnership with Indian institutions to improve science and engineering education in India via a new satellite e-learning network. UC San Diego, UC Berkeley, Carnegie Mellon, Cornell University, Case Western Reserve University, and the State University of New York at Buffalo will encourage engineering faculty to spend some of their sabbatical at India's AMRITA University, whose e-learning center will be accessible through an Edusat satellite launched by the Indian Space Research Organization to deliver educational programming. "The U.S. universities in this agreement are first-tier engineering schools that can help offset the imbalance in the quality of professors in India's fastest growing colleges and universities," said AMRITA University Vice Chancellor Venkat Rangan. The initiative could also help counter the recent fall-off in applications to U.S. engineering schools from India and other nations, and offer U.S. faculty the opportunity to engage in research collaborations in India. Microsoft, QUALCOMM, and Cadence Design Systems will fund U.S. participation in the program; QUALCOMM will facilitate the participation of professors from UCSD and the California Institute for Telecommunications and Information Technology (Calit2), while Microsoft India will team up with AMRITA to establish the International Center of Excellence in e-learning. The U.S. schools have also promised to supply, on a non-exclusive basis, teaching materials for a digital content library AMRITA is developing for future students. "By expanding opportunities for international academic collaborations in critical fields, this partnership will not only help keep the University of California competitive--but it will help drive global innovation and economic prosperity," said Gretchen Kalonji of UC's Office of the President at the signing of the Indo-U.S. pact on July 20.
    Click Here to View Full Article

  • "Bots Now Battle Humans for Poker Supremacy"
    FOXSports.com (07/19/05); Roarke, Shawn P.

    Interest in poker-playing computer programs is rising, as is alarm among gamers, gaming companies, and programmers concerned that poker bots could become a new tool for deceit and cheating in the increasingly high-stakes arena of online poker. The fact that even the most advanced poker bots can be beaten by seasoned professional players is of little comfort to most online poker venues, whose clientele is chiefly amateur or neophyte players. University of Alberta computer science professor and poker bot developer Dr. Jonathan Schaeffer is confident that poker bots can consistently win at low-limit tables, which novice players frequent. Moreover, the bots can play at multiple tables and never get tired. Essentially, poker bot operators can rake in winnings with a minimum investment of time and resources, and this can make online poker Web sites less trustworthy for players and gamblers. Cyber World Group CEO Steve Baker says poker sites have an imperative to ban the use of poker bots or any other form of collusion. Last week, the Golden Palace Casino sponsored the first organized public poker bot contest, in which poker-playing programs vied for a $100,000 jackpot in a tournament of limit hold 'em, and Baker thinks the event yielded useful information for those trying to combat such programs. Bots such as those devised by Schaeffer are also being used by scientists to test the boundaries of artificial intelligence.
    Click Here to View Full Article

  • "Web Services Specs Meet Open Source"
    eWeek (07/19/05); Taft, Darryl K.

    The licensing agreements Microsoft, IBM, and the Apache Software Foundation will reach concerning WS-Security specifications could have far-reaching implications for the more general future of Web services applications in the open-source framework. Of the three options the Organization for the Advancement of Structured Information Standards (OASIS) allows for--reasonable and non-discriminatory (RAND), royalty-free (RF) on limited terms, and a hybrid of both--none fully embraces open source. Current policy mandates that users of WS-Security approach license holders IBM and Microsoft to broker transfer rights to the application should they aim to redistribute it, though the specification is currently royalty-free. Proponents of open source criticize OASIS for providing a patent shelter. Apache is in negotiations with Microsoft to arrive at a compatible solution that some believe may set a precedent for the future balance between open source and proprietary licensing of Web services. OASIS CEO Patrick Gannon says, "It's gratifying for us to see Apache, IBM, and Microsoft engage in a productive dialogue that hopefully will result in the widest possible adoption of the WS-Security OASIS standard," though he acknowledges the distinct challenges posed by the friction between increasingly popular open-source frameworks and intellectual property concerns. IBM and Microsoft are submitting three new Web service standards to OASIS for review, and Hewlett-Packard has recently turned three of its own Web service applications over to Apache as official open-source projects.
    Click Here to View Full Article

  • "Information Security With Colin Percival"
    O'Reilly ONLamp (07/21/2005); Lucas, Michael W.

    In a recent interview, Simon Fraser University visiting researcher Colin Percival described his research on the security threat posed by hyperthreading. He demonstrated how a hacker can exploit the technique simply by running code concurrently with the program he is trying to spy on. Percival found a fundamental vulnerability in Intel's design that allowed him to penetrate the system, raising considerable concern in the security community; Microsoft and Intel were initially reluctant to acknowledge the problem, and have been slow to develop patches. Some critics maintain that Percival's exploit is largely theoretical, though he claims it is a very real threat. Percival believes that in the future, the task of sifting through source code in search of security errors will be handled by programs instead of people. Percival's research, published in a paper entitled "Cache Missing for Fun and Profit," proved the existence of a covert channel running between threads on the same processor core, demonstrated how it could be used as a side channel, and offered guidance on how to guard against it. Percival developed his research while working on his doctoral degree and serving as a deputy security officer for FreeBSD. He has also written an open-source, downloadable security tool called FreeBSD Update that enables users to download and install security updates with little complication, addressing what he believes to be the central obstacle to the adoption of new security tools.
    Click Here to View Full Article

  • "What's the Next Big Thing on the Web? It May Be a Small, Simple Thing--Microformats"
    Knowledge@Wharton (07/26/05)

    If information presented on the Web could be understood by computers as well as people, it would spawn new forms of doing business and transform the Web into a massive, searchable, and reconfigurable database. Embedding such meaning within Web pages is a formidable challenge that could be addressed with microformats, according to Technorati technologist Tantek Celik. Microformats are envisioned as simple extensions of standard HTML or XHTML tags that link intelligent data to Web pages, thus providing a framework for sharing data about people, contact information, social network connections, and other kinds of information that HTML cannot express by itself. Celik says microformats' reliance on visible data rather than invisible metadata supports a feedback loop that allows people who abuse the system to be spotted quickly: "Because of that feedback loop, we get this much more accurate corpus of data that we can index and search, and prioritize and relevance rank," he says. Celik acknowledges a partial resemblance between microformats and Tim Berners-Lee's Semantic Web, since both initiatives are dedicated to placing as much semantic information on the Web as possible. However, the Semantic Web proposes the introduction of semantic information in a machine-readable format only, while the microformat concept focuses on presenting the information in a primarily human-readable format. Celik says the microformats.org site was established as the hub of a community built around the various microformat development efforts of Technorati and other companies, where interested parties can exchange ideas and refine the microformats. He says the next step is for participating developers and microformats to become more diverse, and for microformat adoption to increase as well.
    Click Here to View Full Article
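
    The visible-data idea can be made concrete with a small example. The markup below uses the real hCard microformat class names ("vcard", "fn", "org") layered onto ordinary XHTML; the tiny parser is a simplified illustration of how such a page doubles as machine-readable data, not a production microformat library.

```python
from html.parser import HTMLParser

# A snippet of human-readable XHTML whose class attributes also carry
# machine-readable meaning (the hCard microformat).
snippet = ('<div class="vcard"><span class="fn">Tantek Celik</span>, '
           '<span class="org">Technorati</span></div>')

class HCardParser(HTMLParser):
    """Pull hCard fields out of the visible page content."""
    def __init__(self):
        super().__init__()
        self.field = None
        self.data = {}
    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class", "")
        if cls in ("fn", "org"):          # remember which field we are inside
            self.field = cls
    def handle_data(self, data):
        if self.field:                    # capture the visible text as data
            self.data[self.field] = data
            self.field = None

parser = HCardParser()
parser.feed(snippet)
print(parser.data)  # {'fn': 'Tantek Celik', 'org': 'Technorati'}
```

    Because the extracted data is exactly what a human reader sees, fabricated or abusive entries are visible too, which is the feedback loop Celik describes.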

  • "Call for Homeland Security Cybersecurity Improvements"
    IDG News Service (07/19/05); Gross, Grant

    The U.S. Department of Homeland Security (DHS) does not have recovery plans in case of a widespread Internet attack, Government Accountability Office IT management director David Powner said yesterday, speaking before the Senate Homeland Security and Governmental Affairs Committee. Powner told lawmakers that DHS must implement an Internet recovery plan and a national cybersecurity threat assessment to better protect U.S. cybersecurity. Powner also said the GAO believes DHS must develop better relationships with state and local governments, private industry, and other federal agencies to counter cyber threats. Powner said that although DHS is making progress, "large portions of our critical infrastructure are unprepared to effectively handle a cybersecurity attack." Sen. Tom Coburn (R-Okla.) agreed with Powner and called for better coordinated cybersecurity prevention and recovery techniques. Meanwhile, DHS National Cyber Security Division acting director Andy Purdy asserted that the agency is implementing several plans to boost cybersecurity and decrease vulnerability. Sen. Thomas Carper (D-Del.) said DHS must put a higher priority on cyber security issues, cautioning that a joint physical and cyber attack could cripple response efforts. He said, "Cybersecurity plays an important role in the protection of our critical infrastructure."
    Click Here to View Full Article

  • "Standards Activists Target Scripts"
    CNet (07/18/05); Festa, Paul

    The DOM Scripting Task Force launched by the Web Standards Project (WaSP) on July 18 is tasked with encouraging compliance with the World Wide Web Consortium's Document Object Model (DOM) and similar Web standards, and with setting up scripting guidelines. The group's JavaScript manifesto says JavaScript--and, by extension, Web development--is being held back from its full potential as a result of obsolete, uninformed, and inaccessible development techniques; the task force's proposed solution is the adoption of unobtrusive DOM scripting. WaSP compared the current Web programming model to a stool supported by a tripod of structure (XHTML), presentation (Cascading Style Sheets), and behavior (DOM scripting), and the group identified "obtrusive" DOM scripting as the factor behind scripts' negative perception. Supporters warn that the current revival of Web scripting risks trading away accessibility and observance of Web standards. The presentation of content on many script-reliant pages is such that blind users or others with handicaps cannot access them, while Web authors determined to use the latest scripting methods may strangle older browsers with their code. It is WaSP's intention to have authors deliver scripts that "gracefully degrade" in older browsers, providing some, if not all, data and functionality.
    Click Here to View Full Article

  • "Fighting a Broadband Battle"
    Wall Street Journal (07/19/05) P. A4; Schatz, Amy

    Kevin Martin, the recently appointed chairman of the FCC, claims that his top priority will be to bring high-speed Internet access to the entire country. To achieve that end, he has maintained that telephone and cable companies should be under no obligation to share their lines with competitors, and he has argued that states should offer wireless Internet service in sparsely populated areas where current providers are reluctant to do so, though he would prefer those connections to be offered privately. However, many telephone companies have petitioned lawmakers to prevent local governments from establishing Internet connections for the public. By not requiring telephone and cable companies to share their connections, which the Supreme Court has given Martin the freedom to do, Internet providers will have a greater incentive to invest in their own networks and broaden their scope of service, Martin says. Although the number of U.S. households and businesses with high-speed connections jumped by 34 percent in 2004, one study ranked the United States 16th among developed nations in broadband access. High-speed Internet access is one of the few telecommunications issues embraced by the White House; President Bush last year called for "universal, affordable access" to broadband by 2007.

  • "The Trout Will Have to Wait: Barrett's Busy"
    EE Times (07/18/05) No. 1380, P. 1; Fuller, Brian; Merritt, Rick

    In a recent interview, outgoing Intel CEO Craig Barrett outlined his vision for Intel's future and reflected on its past. He describes the stewardships of Robert Noyce and Gordon Moore as an era of critical advancement, when fundamental research paved the way for important developments such as microprocessors, SRAMs, EPROMs, and DRAMs; then, under Andy Grove, Intel began to come into its own with the microprocessor as its market signature. Recently, Barrett has led the company in the direction of communications technology, where he believes it will continue to head in the future. He expresses concerns over foreign competition from China and others, such as India, Vietnam, and the Czech Republic, as well as a lack of adequate education in the United States. Barrett advocates merit pay for teachers and better methods of measuring school performance as areas in need of improvement. He cites the investment in research that led to new process technologies, extreme-ultraviolet lithography, and other developments as his major accomplishment. Going forward, new Intel CEO Paul Otellini's biggest challenge will be to maintain Intel's standard as a leader in growth and innovation. In terms of future competition, Barrett is most concerned about companies in the model of Texas Instruments and Samsung that invest heavily in technology.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Linux Goes Global"
    Computerworld (07/18/05) P. 31; Betts, Mitch; Lemon, Sumner; Nystedt, Dan

    Linux is acquiring an international reach because it makes practical sense in many instances: Linux's growing popularity in Europe, Asia, and elsewhere stems from its ability to lower IT costs and improve system performance. European Linux adopters cite the open-source operating system's flexibility and reliability; support for Linux has been strongest in the financial services sector, while its deepest area of penetration remains servers in the public sector. IDC reports that most Australian companies outside the financial services sector are deploying Linux on their servers, while the public sector is promoting the adoption of open-source software. Australian Linux deployments are generally smaller, but government support could encourage larger implementations. The Chinese government has been a major driver of Linux's spread in Asia, and demand is also healthy among Chinese companies that desire better performance and vendor support; an effort by the Taiwanese government has led to the promotion of Linux in embedded software. IDC Latin America's Ricardo Villate says the massive adoption of Linux servers by Latin American enterprises stems not just from cost efficiency, but also from the servers' ability to run critical business software. Deployments of Linux and open-source software across the African continent often hinge on the availability of basic utilities such as electricity, phone lines, or high-speed Internet access. SchoolNet Namibia founding director Joris Komen says the price of hardware is a major obstacle for more impoverished African nations, and his organization is one of several working to make a widescale deployment of open source programs and Linux a reality.
    Click Here to View Full Article

  • "Business's Digital Black Cloud"
    Economist (07/16/05) Vol. 376, No. 8435, P. 65

    The development of faster computer chips is enabling chipmakers to release increasingly powerful hardware at competitive prices, thereby disrupting the traditional business model of the business-software industry. Companies that rely on enterprise software to facilitate their business transactions may find themselves pushed onto more expensive billing plans, or toward open-source programs whose lack of licensing fees can be offset by hidden costs. The move to dual-core processing, which promises to double the performance of a single processor by placing two cores on a single chip, has caused some software firms to argue that customers paying license fees based on the number of processors running their software will be getting a free ride on half the new cores being implemented. Leading software suppliers are offering customers different licensing models in an effort to stand out from the competition: IBM has announced identical licenses for computers running single-core or dual-core versions of the Opteron or Xeon processors, while Microsoft has promised to license its server software on a per-processor basis, thus establishing a single license for any dual-core Opteron, Xeon, or Itanium server. Users, however, balk at the idea that suppliers should charge more for upgrades that come from purchasing improved hardware, while virtualization and rapid provisioning are further complicating the licensing problem. Some users are interested in a utility licensing model in which they are charged only for what they use. Meanwhile, the open-source software licensing model is gaining adherents even among proprietary software suppliers.
    Click Here to View Full Article

  • "Eternal Bits"
    IEEE Spectrum (07/05) Vol. 42, No. 7, P. 22; Smith, MacKenzie

    Preserving digital information in the face of impermanent data formats and storage media is a formidable challenge complicated by technical as well as legal factors. Researchers at the MIT Libraries are focusing on the long-term maintenance and sharing of digital content with DSpace, a digital repository that employs an open-source software application to accept digital materials and place them online in an accessible manner, as well as save them for succeeding generations. DSpace is designed to enable the migration of materials to new formats to ensure that information is readable, playable, and otherwise accessible centuries from now, as well as attach appropriate tags to files to establish legal provenance. Users can deposit archival items in the DSpace repository by submitting any files via a simple Web-based interface; the system then organizes bit streams into related sets and tags them with metadata so they can be recovered through searches later on. Items are made up of clustered, related content and its metadata, and assembled into collections of logically related material. Once items pass the review and approval process to see if they satisfy DSpace community standards, they are officially entered into the repository and made available on the DSpace Web site. DSpace boasts a modular architecture to support the creation of widely expandable multidisciplinary archives. The DSpace curator and his successors are responsible for maintaining the availability of digital content over time, and MIT and the University of Cambridge are closely collaborating on a preservation strategy for each format that the DSpace project plans to support.
    Click Here to View Full Article
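
    The repository model described above (bit streams grouped into items, items tagged with metadata, and items gathered into collections) can be sketched schematically. The class and field names below are illustrative, not DSpace's actual API.

```python
from dataclasses import dataclass, field

# Schematic of the repository data model: bit streams grouped into items,
# items tagged with searchable metadata and gathered into collections.
@dataclass
class Item:
    bitstreams: list = field(default_factory=list)   # the raw deposited files
    metadata: dict = field(default_factory=dict)     # tags used for later search

@dataclass
class Collection:
    name: str
    items: list = field(default_factory=list)

item = Item(bitstreams=["thesis.pdf", "data.csv"],
            metadata={"title": "Eternal Bits", "creator": "MIT Libraries"})
archive = Collection(name="Preservation Research", items=[item])

def search(collection, key, term):
    """Recover items later by matching a metadata field, as the repository does."""
    return [i for i in collection.items if term in i.metadata.get(key, "")]

print(len(search(archive, "title", "Eternal")))  # 1
```

    Keeping the metadata separate from the bit streams is what lets a curator migrate the files to new formats later without losing the record of what they are or where they came from.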