Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published every week on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to technews@hq.acm.org.

Volume 5, Issue 563:  Monday, October 27, 2003

  • "U.S. May Ease Entry for High-Tech Workers"
    Wall Street Journal (10/27/03) P. A2; Schroeder, Michael

    Spurred by worries among American multinationals and high-tech firms that the current H-1B visa cap of 65,000 will prevent thousands of skilled foreign professionals from entering the country in 2004, Senate Judiciary Committee Chairman Sen. Orrin Hatch (R-Utah) is pushing for expanded exemptions, which would clear the way for congressional consideration of proposed visa amendments. A Hatch representative noted that any initiative put forward would include additional safeguards for domestic workers, including the reinstatement of a $1,000 fee for every visa that would be devoted to retraining American workers. Intel is lobbying for a visa cap exemption for foreign students with graduate technical degrees: Intel Chairman Andy Grove recently noted that 50 percent of students in U.S.-based technical graduate programs are foreign-born, and argued that the most highly skilled students should be allowed to work in the United States so that the country can sustain its economic competitiveness. The current rules only allow exemptions for H-1B holders who work at universities or nonprofit research and development organizations. American Immigration Lawyers Association president-elect Paul Zulkie reports that the 2004 H-1B cap is actually closer to 35,000, and estimates that it will probably be reached by March. His organization is lobbying for the cap to be boosted to 115,000. Meanwhile, India's National Association of Software and Service Companies thinks the cap should be raised to between 120,000 and 130,000. Proponents of H-1B exemption programs will have to overcome widespread feelings that the visa program itself is a big contributor to the U.S. workforce's current troubles--anti-H-1B advocates claim, for instance, that many American companies are replacing domestic workers with foreign workers who will accept lower pay.

  • "Antispam Methods Aim to Merge"
    CNet (10/24/03); Festa, Paul

    A new subcommittee established in October by the Internet Research Task Force's Anti-Spam Research Group (ASRG) seeks to reconcile and merge competing email sender verification protocols. Proposed measures include Reverse Mail Exchange, Sender Permitted From (SPF), and the Designated Mailers Protocol, which are designed to verify the identity of an email's sender without replacing the Simple Mail Transfer Protocol. All of these schemes work by extending Domain Name System (DNS) records so that domains can publish the IP addresses of their authorized email servers, enabling receiving ISPs to instantly confirm a message's origin. Such a system would certify that email servers and individual address owners are not spamming. "We can solve spam with a technical solution, rather than by going through the Congress or by implementing micropayments," declared Meng Wong, CTO of ASRG subcommittee member Pobox.com, which supports the SPF protocol. He added that sender verification systems must operate in tandem with a reputation system that would allow recipients to identify the domains of established spammers. "Once you have reputation systems that work on the basis of domains, which spammers cannot forge, then no matter how many machines you hack into, you still have to use the spammer's domain," Wong explained. ISPs and antispam firms agree that halting the spread of spam is a difficult challenge because of the prevalence of email address spoofing.
    Click Here to View Full Article
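    The DNS-publishing scheme these proposals share can be sketched in a few lines of Python. The record contents, domain, and IP addresses below are hypothetical, and a real verifier would fetch the record with a DNS TXT query for the sender's domain rather than hard-coding it:

```python
import ipaddress

def check_spf(sender_ip, spf_record):
    """Check a sending IP against the ip4 mechanisms of a simplified,
    SPF-style record; real SPF has more mechanisms (a, mx, include)."""
    for term in spf_record.split():
        if term.startswith("ip4:"):
            if ipaddress.ip_address(sender_ip) in ipaddress.ip_network(term[4:]):
                return "pass"
    # a trailing "-all" means any unlisted sender hard-fails
    return "fail" if spf_record.rstrip().endswith("-all") else "neutral"

record = "v=spf1 ip4:192.0.2.0/24 -all"   # hypothetical record for a domain
print(check_spf("192.0.2.25", record))    # pass: IP is in the listed range
print(check_spf("198.51.100.7", record))  # fail: unlisted sender
```

    A receiving server that gets a "fail" can reject or flag the message, which is what makes forged sender domains detectable.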

  • "Smart Dust Collecting in the Enterprise"
    InternetNews.com (10/24/03); Singer, Michael

    Intel and the University of California, Berkeley, are developing wireless sensor technology intended to make ubiquitous sensor networks widespread in the enterprise. Intel-backed researchers at the Crossbow startup are developing a new operating system called TinyOS for the sensor "motes," while Berkeley researchers are working on improved mote designs that have different monitoring capabilities. Speaking at a Wireless Communications Alliance meeting earlier this month, Crossbow chief engineer Alan Broad said TinyOS supports a number of hardware platforms and sensor cards, and works using just 8 KB of memory. Intel and UC Berkeley have tested TinyOS on Maine's Great Duck Island since the spring of 2002; in that time, the network has expanded dramatically and now collects data on waterfowl living conditions and outside weather from many more points on the island. The network uses a "bucket brigade" technique to pass data throughout the network, reaching more than 1,000 feet deep in the forest in some areas. UC Berkeley Sensor and Actuator Center (BSAC) co-director Kris Pister said the privacy concerns surrounding wireless sensor network technology are outweighed by the potential benefits, and that every technology has potential for misuse. As for environmental impact, Pister said accidentally inhaling a mote would be comparable to inhaling a gnat, given their small size. Besides environmental monitoring, motes could be used in computer interface schemes where computers could sense hand movements and interpret them as commands; and in commercial warehouse operations, motes could support a system Pister likened to "FedEx tracking on steroids." BSAC researchers are also working on various mote forms: Flashy Dust is a uni-directional communication and sensing mote; Daft Dust is a bi-directional mote shaped like an upside-down bowl that measures 63 mm; and the cube-shaped, solar-powered, bi-directional Golem Dust mote measures just 11.7 mm.
    Click Here to View Full Article
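    The "bucket brigade" relaying mentioned above can be illustrated with a toy simulation; the mote names, hop count, and 100-foot radio range are assumptions for illustration, not figures from the Great Duck Island deployment:

```python
RADIO_RANGE_FT = 100  # assumed single-hop radio range

def relay(reading, chain):
    """Pass a sensor reading hop by hop along a chain of motes toward
    the base station; total reach grows with the length of the chain."""
    hops = []
    for mote in chain:
        hops.append((mote, reading))  # each mote rebroadcasts the payload
    return hops, len(chain) * RADIO_RANGE_FT

chain = [f"mote-{i}" for i in range(12)]
hops, reach_ft = relay({"temp_c": 11.5}, chain)
print(reach_ft)  # a 12-hop brigade covers well over 1,000 feet
```

    The point of the design is that no individual radio needs long range; coverage comes from the chain, which is why the network could keep growing by adding motes.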

  • "Iowa State University Ready to Hit Hackers Head-On"
    Newswise (10/24/03)

    Using a Justice Department grant of almost $500,000, Iowa State University researchers plan to build a virtual Internet to serve as a cyber-defense testbed. ISU researcher Doug Jacobson explains that with the Internet-Scale Event and Attack Generation Environment (ISEAGE), "We will be able to carry out computer attacks exactly as they happen in the real world." Among the uses Jacobson sees for ISEAGE are understanding the vulnerabilities and defensive strategies of civilian and government infrastructure; assessing the level of security among existing business and industry computer systems; security product reliability testing; and supporting academic and research initiatives. Additionally, Jacobson believes ISEAGE will be instrumental in creating new cybercrime forensics techniques to track down the culprits behind systems attacks. ISEAGE may support a computer crime investigation project established by ISU's Information Assurance Center, Iowa State's Department of Public Safety, and the Midwest Forensics Resource Center to bolster Iowa law enforcement. Jacobson says the ISEAGE lab, which will reside at the ISU Research Park, will need $3 million to $5 million to get fully up and running. He promises that "This facility will draw good people, good resources and will become a focal point for security research."
    Click Here to View Full Article

  • "Issue of Human Writes: Putting All World Languages in Computer Text"
    SiliconValley.com (10/24/03); Cassidy, Mike

    Computerizing all languages in the world is the goal of Unicode 4.0, a massive database of 50 key writing systems established by the nonprofit Unicode Consortium. Almost 100 alphabets or "scripts" needed to write obscure languages remain to be encoded, and this initiative is being promoted and undertaken mostly by volunteers. Scholars and others think such a system would be tremendously useful as a tool to electronically publish texts in ancient languages. Many contemporary scripts--Chakma, Balinese, and Saurashtra--are also awaiting encoding. Deborah Anderson of the University of California-Berkeley is attempting to further the Unicode effort by leading UC-Berkeley's Script Encoding Initiative. She helps experts in obscure scripts prepare grant proposals for submission to the Unicode Consortium and other international standards committees; the proposals detail the script's alphabet, punctuation marks, and usage. Software developers apply the standards to make sure that everyone is using the same encoding system. The Unicode Consortium receives the lion's share of its funding from large technology companies that wish to benefit from a universal computer communications standard. However, interest has started to flag among these investors, now that practically all commercially lucrative languages have encoded scripts.
    Click Here to View Full Article
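    What encoding a script actually means is assigning each character a numeric code point that every system interprets the same way. A quick Python illustration using a few characters from scripts already in the standard:

```python
# Each character gets a unique code point; UTF-8 then serializes that
# number into bytes, so text survives transfer between systems intact.
for ch in ["A", "ñ", "א", "अ", "汉"]:
    print(f"U+{ord(ch):04X} -> {ch.encode('utf-8')}")
```

    A consortium proposal for a new script amounts to assigning code points like these to each of its letters, digits, and punctuation marks.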

  • "Drive Safely in a Car With the Gift of the Gab"
    Scotsman (UK) (10/26/03); Stoke, Christina

    Edinburgh University researchers aim to make motorists capable of operating their vehicles by vocal commands, and enable cars to offer advice and even warn drivers when they are driving recklessly. Dr. Oliver Lemon of the university's Human Communication Research Centre is developing such a vehicle with BMW and Bosch, using cutting-edge voice communication and computerized speech technology. Lemon says, "The idea behind the project is to have a computer interface which operates the radio, CD player, and heating." He explains that "rather than taking your hands off the wheel to press a button or look at the screen, you have a conversation with your car. It understands your voice, and replies in whatever voice you choose for it." The car's voice will be capable of emotional expression under certain conditions--it will be able to express anger or fear in response to bad driving. Lemon adds that the car will be given the ability to sense the driver's emotions in order to determine levels of stress or fatigue, which could lead to accidents. The vehicle will also be programmed to pick up erratic driving patterns, which are another sign of stress or tiredness. Lemon reports that developers are investigating how the car could be enabled to distinguish between the driver's spoken commands and conversations he or she is having with passengers or on a mobile phone. "Either the car will only answer requests which begin with its name, or we will teach it to analyze speech patterns so it knows when to ignore normal conversation," he says. The car will also be trained to answer only the voice of its owner. Lemon's research team believes a prototype vehicle could emerge within three years, while luxury models outfitted with the technology could be available by 2008.
    Click Here to View Full Article

  • "Apple's Latest 0.1 Adds a Lot"
    New York Times (10/23/03) P. E1; Pogue, David

    Apple's latest Mac OS X version, 10.3, adds many features that make the ultra-secure operating system more useful. Mac users have taken refuge in the security of Mac OS X over the past year as Windows users continue to struggle with viruses, spam, and software vulnerabilities. Among Panther's 150 new features is FileVault, which encrypts personal files when a user is not logged in; Apple claims the algorithm is so secure that it would take a 149-trillion-year decryption effort to break it. The Secure Empty Trash feature is another key privacy amenity, not only deleting files from the system, but overwriting the hard disk space with unintelligible data so that data-recovery investigations cannot collect "erased" information. Mac OS X Mail allows email only from address book addresses and recent correspondents, and can automatically screen out graphics that alert a spammer to an active email account when opened. Panther also includes a number of enhancements to the user interface, such as a super taskbar-type feature called Expose, which tiles open programs and documents on the screen on command, allowing users to quickly switch between tasks; faxing and putting files in Zip format are also much easier with new built-in features. Mac OS X also incorporates Windows' Fast User Switching concept, but adds eye candy by making the entire screen seem to rotate as though each user's display is a different face on a cube. TextEdit and Preview are simple word-processing and PDF-viewing programs, respectively, ensuring that Mac users have options should Microsoft, Adobe, or any other major software vendor ever stop making Mac versions of their products. Mac OS X 10.3's $130 price affords a feeling of true ownership, unlike Windows, which with its sign-up prompts and "activation" requirements makes the OS feel like rented property.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
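    The overwrite-before-delete idea behind Secure Empty Trash can be sketched as follows. This is a simplified illustration, not Apple's implementation; real secure-erase tools make multiple passes and also contend with journaling and filesystem metadata:

```python
import os

def secure_empty(path, passes=1):
    """Overwrite a file's bytes with random data before unlinking it,
    so the freed disk space no longer holds the original contents."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # replace contents in place
            f.flush()
            os.fsync(f.fileno())       # force the overwrite to disk
    os.remove(path)

# usage sketch
with open("secret.txt", "wb") as f:
    f.write(b"account numbers")
secure_empty("secret.txt")
print(os.path.exists("secret.txt"))  # False
```

    A plain delete only unlinks the directory entry; the overwrite step is what defeats data-recovery tools that scan freed blocks.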

  • "Write Once, Publish Often"
    e4Engineering (10/21/03); Firth, Simon

    Researchers at Hewlett-Packard's Bristol, UK, lab are developing new publishing software that automatically converts a master document into formats suitable for the Web, print, personal digital assistants (PDAs), or other channels. Called Multi-Channel Publishing, the tool will eliminate much of the graphics and layout customization required to convert a word-processed document into PDA format, for instance. With Multi-Channel Publishing, a master document is built using content, layout, styling, and semantic components, which are automatically put together by software called the Formatting Objects Authoring Tool (FOA). FOA is an open-source Java-based program created by HP researcher Fabio Giannetti that prevents document authors from having to code in the XSLT or XSL-FO languages. Giannetti says FOA is to publishing what Macromedia's Dreamweaver was to Web site creation. Using set rules, Multi-Channel Publishing documents know to always put a letterhead at the top while allowing page text to be split between two screens, if needed. Names, images, and personal addresses are dealt with according to their role within the document and the presentation format used. Multi-Channel Publishing can also deal with variable information, such as additional information customized for an individual, language group, or culture. HP says FOA is one of its most successful open-source projects to date, having been downloaded from sourceforge.net 16,000 times so far.
    Click Here to View Full Article
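    The role-based rendering idea can be sketched with a toy rule table. The roles, channels, and rules below are illustrative inventions, not HP's actual FOA or XSL-FO formats:

```python
# A master document tagged with semantic roles; per-channel rules decide
# how (and whether) each role is presented.
doc = [("letterhead", "Hewlett-Packard Bristol"),
       ("body", "Quarterly results attached."),
       ("address", "Filton Road, Bristol")]

RULES = {
    "print": {"letterhead": "== {} ==", "body": "{}", "address": "{}"},
    "pda":   {"letterhead": "{}", "body": "{}"},  # PDA view drops the address
}

def render(doc, channel):
    """Render only the roles a channel's rules mention, in document order."""
    rules = RULES[channel]
    return "\n".join(rules[role].format(text)
                     for role, text in doc if role in rules)

print(render(doc, "pda"))
```

    Because presentation decisions live in the rule table rather than in the document, adding a new output channel means adding one rule set, not reformatting every document.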

  • "Congress Turns Attention to International Piracy"
    InternetNews.com (10/22/03); Mark, Roy

    Rep. Adam Schiff (D-Calif.), who says digital piracy has caused enormous losses for the movie and recording industries, is the co-chair of the newly formed Congressional International Anti-Piracy Caucus, a bipartisan group of House and Senate members that plans to focus on piracy problems overseas and work with the Bush administration as the White House attempts to strengthen intellectual property protections abroad. Along with Schiff, Sens. Gordon Smith (R-Ore.), Joe Biden (D-Del.) and Rep. Bob Goodlatte (R-Va.) co-chair the caucus. "A vibrant sector of the U.S. economy is at tremendous risk due to widespread piracy of U.S.-made movies, music, software, video games, and other creative works," says Goodlatte. "The caucus will play an important role in defending the rights of creators and distributors to be compensated for the work that they do." The group plans to brief congressional delegations traveling overseas on piracy issues, and assist House and Senate committees holding hearings and reviewing legislation on piracy. The caucus also plans to demonstrate new technologies developed to protect creative works from piracy.
    Click Here to View Full Article

  • "New Typeface to Help Dyslexics"
    Wired News (10/21/03); Asaravala, Amit

    A Dutch designer has developed a new typeface intended to make it easier for dyslexics to read words on the Web. Created by Natascha Frensch, Read Regular gives each letter a distinctly individual form so that dyslexics will not confuse one character with another, and the typeface also uses simplified forms and extends the openings in letters such as c and e. Most Web sites that attempt to accommodate dyslexics use the sans-serif Arial typeface, but its similar forms for letters such as b and d, p and q, and u and n often cause problems for people who have some form of dyslexia. Although some organizations have turned to the Comic Sans typeface, others say its thick and asymmetrical characters do not present a professional appearance. Frensch, who is dyslexic, started the Read Regular project about three years ago while studying for her master's degree at the Royal College of Art in London. "From the start, Read Regular has been a personal journey, which, through encouragement from the people around me, has developed into a possible solution for others," she says. Frensch says Read Regular will be available to the public once she addresses some licensing and distribution issues. The U.K.'s Dyslexia Research Trust says that up to 10 percent of English readers have some form of dyslexia.
    Click Here to View Full Article

  • "Google Researcher Lectures on Internet Growth"
    Daily Nebraskan (10/24/03); Seravalli, Rachael

    Google.com senior research scientist Mehran Sahami discussed the progress of Web search engines and future challenges for the technology with an audience of computer science and engineering students at the University of Nebraska-Lincoln on Oct. 23. UNL assistant computer science professor Leen-Kiat Soh said he hoped the lecture would spark students' interest in computer science, and show them that research in the field offers both money-making and problem-solving opportunities. The number of Web users more than doubled to 320 million between 1999 and 2002; in the same period, the number of Web pages skyrocketed from 500 million to roughly 6 billion, and the number of daily Web queries increased fivefold to 500 million. "Obviously, we will need to build a system that will surpass those needs," Sahami noted. He said that Google.com researchers' chief goal is to build a search engine that can locate the exact information users want with greater speed and ease, and which boasts significantly improved spam-deterrent measures. Sahami said that older search engines ranked Web documents according to the frequency of the search term within the document, and this spurred spammers to create fraudulent Web pages embedded with specific words that allowed them to rank high in search results. He added that the earliest search engines operated on the assumption that Web pages were more or less comprehensible, queries were lengthy, and words were spelled correctly, when in reality the opposite applies. To keep results accurate and fast, Google.com employs a combination of ranking technologies that weigh the words in a query against the Web sites most likely to match them.
    Click Here to View Full Article
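    The frequency-based ranking weakness Sahami described is easy to demonstrate; the documents below are made up:

```python
def tf_rank(query, docs):
    """Order documents by raw count of the query term -- the naive scheme
    early engines used, which keyword-stuffed spam pages could game."""
    q = query.lower()
    return sorted(docs, key=lambda d: d.lower().split().count(q),
                  reverse=True)

docs = ["Cheap flights to Lisbon, book today",
        "flights flights flights flights flights"]  # keyword stuffing
print(tf_rank("flights", docs)[0])  # the stuffed page ranks first
```

    Combining text signals like this with query-independent measures of page quality is what lets a modern engine resist such stuffing.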

  • "Why Do We Care About Names and Numbers?"
    CircleID (10/22/03); McNamee, Joe

    In a CircleID article, Joe McNamee reflects on the findings of a report he and Tiina Satuli authored for the European Commission entitled "Policy Implications of Convergence in the Field of Naming, Numbering, and Addressing." McNamee describes how names and numbers stand for the resources that let telecommunications and Internet interests pinpoint reliable endpoints; because these resources are finite, a control system is needed to administer them. Scarcity related to the Internet and IP addresses is not as simply managed as scarcity in relation to telephony, given intellectual property concerns related to domain names and concerns about system integrity. For this reason, distribution of resources should follow an organized method regardless of whether government groups or businesses handle allocations. Without a set plan for naming and numbering, the goals of regulatory policy change and competition can be compromised. Convergence of communications technologies makes handling of names and numbers more complex, raising various regulatory issues, though the Internet's foundation in open standards mitigates some of them. Regulatory issues carry even more weight in relation to the convergence of Internet communications and standard telephony, because the two areas have varied regulatory histories. McNamee stresses that regulators should "understand the technological and market developments that are underway, and have coherent views on how to safeguard policy objectives like competition and respect for privacy in the converging marketplace." Legislation related to convergence is coming into place, but many questions still exist.
    Click Here to View Full Article

  • "But What About SCSI?"
    Computerworld Singapore (10/30/03) Vol. 10, No. 3; Wei, Lee Ser

    Although storage technologies serial ATA and Fibre Channel have garnered much of the limelight in recent months, the upcoming serial attached SCSI (SAS) standard will probably prove the most important for enterprise systems. SAS provides a perfect middle ground between the price emphasis of serial ATA and the performance emphasis of Fibre Channel. SAS is fully compatible with SCSI logic and shares the same physical and electrical connection interface with serial ATA, a feature that will allow companies to deploy both technologies in their data centers. International Data Corp. (IDC) research director Robert Gray says the ability to build with SAS and serial ATA where appropriate will decrease total cost of ownership for businesses. SAS tackles the issue of multiple devices by using expanders that offer 64 connections each, and these expanders can be added in an array so a total of 4,096 device connections are possible. Despite its versatility, SAS uses far fewer wires than its predecessor, parallel SCSI, thus reducing cross talk and allowing it to be used in much smaller computer form factors. Compatibility with previous storage technologies as well as upcoming standards such as Internet SCSI (iSCSI) makes SAS a safe bet for data center operations. Development components for SAS are expected for evaluation later this year, and will begin shipping in 2004; the SCSI Trade Association supporting SAS includes the same vendors that support serial ATA.
    Click Here to View Full Article
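    The expander arithmetic above works out as follows; a two-level fan-out is the simplest way to reach the quoted figure:

```python
# Each SAS expander offers 64 connections, and expanders can themselves
# hang off an expander's ports, so a two-level array addresses
# 64 x 64 end devices.
CONNECTIONS_PER_EXPANDER = 64
print(CONNECTIONS_PER_EXPANDER ** 2)  # 4096 possible device connections
```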

  • "Computing the Gains"
    Economist (10/25/03) Vol. 369, No. 8347, P. 70

    Using internationally comparable data and a new technique of studying sources of productivity growth, Harvard economist Dale Jorgenson has concluded that Japan and Europe, not just the United States, made productivity gains through IT investments in the late 1990s. IT's role outside America is often downplayed because American statistics consider firms' software expenditures to be an investment, while in Europe and Japan such spending is classified as a current business expense. Additionally, official statistics in Japan and much of Europe make fewer adjustments than in America for computer quality improvements. The method employed by Jorgenson uses European and Japanese data that are adjusted to include price deflators and software expenditure metrics similar to those used in the United States, and uncovers evidence that all G7 economies expanded their productivity growth in the late 1990s thanks to an upsurge in IT investments. IT capital spending contributed about as much to GDP growth in Japan as it did in the United States, while all European economies also experienced a rise in their IT capital stock--although in both regions the boost was partly offset by lower investment in non-IT areas. Jorgenson has also determined that, as in America during the latter half of the 1990s, Japan and all European nations--with the exception of Italy--experienced faster productivity growth when total factor productivity (TFP) is taken into account: Japan's TFP growth in the late 1990s increased to 1.1 percent annually, compared to America's 0.6 percent, while Germany, Britain, and Canada posted similar rates of overall TFP growth. Based on this analysis, Europe's IT-based productivity growth trails America's, but gradual labor-market reforms in Europe could lead to greater gains in the long run from new technology.
    Click Here to View Full Article
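    The growth-accounting arithmetic behind this kind of analysis treats total factor productivity as a residual: output growth minus the share-weighted growth of each input. The shares and growth rates below are illustrative numbers, not Jorgenson's estimates:

```python
def tfp_residual(output_growth, contributions):
    """TFP growth = output growth minus weighted input contributions."""
    return output_growth - sum(share * growth
                               for share, growth in contributions)

inputs = [(0.10, 0.20),   # IT capital: small income share, fast growth
          (0.25, 0.02),   # non-IT capital
          (0.65, 0.01)]   # labor
print(round(tfp_residual(0.04, inputs), 4))  # residual TFP growth
```

    This is why reclassifying software from expense to investment matters: it changes the measured IT-capital contribution, and with it the residual.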

  • "Issues in Critical Infrastructure Protection"
    Contingency Planning & Management (10/03) Vol. 8, No. 6, P. 36; Reagor, Barbara T.

    The U.S. telecommunications infrastructure has been strengthened in order to make the country's communications backbone more resilient and secure against natural, accidental, and deliberate disruptions, but many formidable challenges remain. There is no consistent interconnection between systems and networks, while many infrastructures have only partial linkage; major organizations--especially first responders--suffer from a paucity of technology integration; emergency response and recovery traits for developing network architectures have yet to take root; and demands for bandwidth, interoperability, and compliance with many technical specifications in order to support multimedia communications are stressing existing infrastructures. From a homeland security viewpoint, the telecommunications infrastructure must be shielded from cyberterrorism and unauthorized access, and bolstered through speedy threat evaluation, punctual distribution of relevant data to first responders, cross-agency communication, and quick implementation of response and recovery plans. The country has several well-entrenched initiatives to respond to natural disasters, man-made accidents, individual terrorist and sabotage acts, and general acts of war, but additional efforts must be made to contend with new threats, such as electromagnetic pulses. The Telecommunications Service Priority system set up by the FCC in 1988 supplies a regulatory, administrative, and operational architecture to support priority restoration and provisioning of national security and emergency preparedness (NS/EP) telecommunications services, while the Wireless Priority Service was introduced this past January to provide similar NS/EP priorities for the wireless domain.
The Homeland Security Department has embarked on several projects to shore up the telecommunications infrastructure, including the delineation of a proper security threshold, mapping out the national infrastructure to find more areas where resilience can be added, infrastructure vulnerability and risk assessment through collaboration with public and private sector representatives, and the leveraging of combined infrastructure assets via coordination with America's trading partners.

  • "Watching the Watchers"
    Discover (10/03) Vol. 24, No. 10, P. 24; Johnson, Steven

    MIT graduate student Ryan McKinley predicated his Government Information Awareness Web site on the supposition that if the government uses information technology to monitor Americans through programs such as Terrorism Information Awareness, then citizens should be able to monitor elected officials in the same way. McKinley's project is designed to cull publicly available data from government sites, C-SPAN, telephone directories, and other official sources, and integrate it into one database that traces intergovernmental relationships. Visitors to McKinley's site receive factoids about political figures and their constituencies as they browse. McKinley is developing tools that allow data to be tracked as it is entered. Government Information Awareness has been set up to accept contributions from everyday users, and a major goal of McKinley's is to devise an effective system for weeding out bad data. Using eBay's decentralized e-commerce strategy as a model, McKinley's system has users score each other's contributions according to quality. "The hope is with that you can come up with pretty simple rules to keep out the things that would make this kind of database unusable," states McKinley. "I honestly believe that there's enough genuine interest out there for people to put time into this, and so the amount of garbage that goes in will be pretty small."
    Click Here to View Full Article
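    The eBay-style peer scoring McKinley describes can be sketched as a trust-threshold filter; every name, rating, and the threshold here are invented for illustration:

```python
ratings = {"alice": [5, 4, 5], "bob": [1, 2, 1]}  # peer ratings per contributor
entries = [("alice", "Senator X sits on committee Y"),
           ("bob", "obvious junk data")]

def trusted(entries, ratings, threshold=3.0):
    """Keep only entries whose contributor's average rating clears a bar."""
    def avg(user):
        scores = ratings.get(user, [])
        return sum(scores) / len(scores) if scores else 0.0
    return [text for user, text in entries if avg(user) >= threshold]

print(trusted(entries, ratings))  # only the well-rated contribution survives
```

    Scoring contributors rather than individual entries is the design choice that keeps the filtering workload manageable as the database grows.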

  • "Open Source in Embedded Products"
    Siliconindia (10/03) Vol. 9, No. 6, P. 36; Singh, Inder

    Although more attention has been paid to the growing use of Linux and related open source offerings in server products, there has also been comparable penetration of such products in the embedded market, where they are showing up in telephone switches, consumer entertainment devices, missile control systems, and other systems. Price reductions in MIPS and megabytes are being accompanied by surging embedded software complexity, and open source software is viewed as an important tool to help deliver affordable embedded systems on time. Market research reports list Linux as the fastest growing and most popular embedded operating system, and that popularity will likely increase with the release of the Linux 2.6 kernel, which promises augmented real-time performance, simpler porting to new computers, large memory module support, microcontroller support, and a better I/O system. Linux is especially valued for its potential to provide an open, standard platform for embedded software that integrates with multiple vendors. The Embedded Linux Consortium's Platform Specification is expected to accelerate Linux's emergence as the leading open standards-based embedded software platform even further. The advantages of Linux over proprietary embedded operating systems include its royalty-free structure and freely available source code; wider hardware support than proprietary real-time operating systems; no single vendor lock-in; and a wealth of online resources and developer communities to aid embedded product development. Linux is also highly regarded for its durability, reliability, security, and well-integrated networking support.

  • "Where Are All the Women IT Leaders?"
    CIO Insight (10/03) No. 31, P. 76; D'Agostino, Debra

    An October survey from CIO Insight reveals a significant shortage of women in IT leadership positions--in fact, the number of female IT executives under 40 is less than half the number of female IT executives over 40. Some analysts call the economic downturn a key factor in the erosion of women IT leaders: "Downsizing means that if women haven't already reached that stage of their careers, then they may not be in the group that gets to stay," reports MAPICS CIO Sandra Hofmann. Meanwhile, the Information Technology Association of America (ITAA) estimates that women accounted for just 22 percent of all computer science and engineering undergraduate degrees from 1998 to 2000, while the percentage of women in the general IT workforce fell from 41 percent in 1996 to 34.9 percent in 2002. ITAA President Harris Miller chiefly blames these low percentages on a lingering "geeky" image of IT workers that discourages women from pursuing tech careers. The first step toward boosting the number of women IT leaders is to get more school-age girls interested in technology. Hofmann adds that there should be more female mentors available to other women in the IT field. A study conducted last year by the Center for Women's Business Research shows an increase in women entrepreneurs: Between 1997 and 2002, the number of female private business owners rose 14 percent to a total of 6.2 million. National Association for Female Executives President Betty Spence attributes this growth to women's dissatisfaction with having few opportunities for promotion in male-dominated tech businesses.
    Click Here to View Full Article

    For information on ACM's Committee on Women and Computing, visit http://www.acm.org/women.