Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published every week on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 7, Issue 829:  August 15, 2005

  • "In Silicon Valley, a Debate Over the Size of the Web"
    New York Times (08/15/05) P. C6; Markoff, John

    Debate erupted last week over how big the World Wide Web is when Yahoo! declared at an Internet search engine conference that there were upwards of 19.2 billion documents in its search engine index, more than double the 8.1 billion currently reported by Google; this led Google to raise questions about Yahoo!'s accounting methods. "The comprehensiveness of any search engine should be measured by real Web pages that can be returned in response to real search queries and verified to be unique," said Google co-founder Sergey Brin on Aug. 12, suggesting that Yahoo! had inflated its index with duplicate entries. Jeff Weiner of Yahoo!'s search and marketplace group insisted that the document count in its index was accurate. However, both sides of the debate agree that the relation of index size to the quality of returned results is loose, and perhaps somewhat inverse. Researchers at the National Center for Supercomputing Applications ran a random sample of 10,012 queries against both the Yahoo! and Google indices on Aug. 14 and found that Google returned 166.9 percent more results than Yahoo! on average, while Yahoo! turned up more results than Google in a mere 3 percent of cases. Both search engines fiercely protect the software underlying their collection techniques, and the continued secrecy will make accurate estimates of Web or index size very difficult, according to search engine experts. "The whole question of how big indexes are has clearly become extremely political and commercial," laments Stanford University professor Christopher Manning.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)
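    The NCSA-style comparison described above boils down to simple arithmetic: for a fixed sample of queries, record each engine's result count and compute how much more one engine returns than the other on average, plus the fraction of queries the smaller engine wins. The sketch below illustrates only that arithmetic; the query counts are invented placeholders, not the study's data.

```python
# Sketch of the index-comparison arithmetic: given per-query result
# counts from two engines, compute the average percentage by which
# engine A exceeds engine B, and the fraction of queries B wins.
# The sample counts in the demo are invented placeholders.

def compare_indices(counts_a, counts_b):
    assert counts_a and len(counts_a) == len(counts_b)
    # Per-query percentage by which A exceeds B.
    excess = [100.0 * (a - b) / b for a, b in zip(counts_a, counts_b)]
    avg_excess = sum(excess) / len(excess)
    b_wins = sum(1 for a, b in zip(counts_a, counts_b) if b > a)
    return avg_excess, b_wins / len(counts_a)

if __name__ == "__main__":
    engine_a = [1200, 950, 40000, 15]   # placeholder counts
    engine_b = [400, 500, 21000, 18]
    avg, frac = compare_indices(engine_a, engine_b)
    print(f"A returns {avg:.1f}% more on average; B wins {frac:.0%} of queries")
```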

  • "Searching for Skills"
    Asbury Park Press (NJ) (08/15/05); Ash, Lorraine

    The H-1B visa program was founded, ostensibly, so that U.S. industry could import skilled foreign workers to fill a shortage of candidates for cutting-edge tech, science, and engineering positions. NexGen Infosys owner Manoj Prasad, who came to the United States 10 years ago on an H-1B visa and is now an American citizen, agrees with many people that fresh H-1B tech graduates are more likely to be hired by U.S. companies than industry veterans who have not brought their skills up to date. Yet Prasad acknowledges that hiring available domestic talent is preferable, given how much it costs and how long it takes to locate, sponsor, and fund the arrival of an H-1B worker; what is more, H-1Bs are ineligible for any jobs requiring security clearance. Prasad and other tech industry players argue that the best workforce mixes domestic and H-1B professionals, and Prasad says he would rather import an H-1B worker to fill a domestic position than offshore the job. However, economist and National Research Council committee member Eileen Appelbaum contends that the H-1B option is usually framed in an unacceptable way: "Industry said in 2001, 'Let us have the H-1B visas and we'll do the work here, or you can say no and we'll just move the work offshore,'" she recalls. "Well, they got all the H-1Bs they wanted, and they still moved work offshore." The current yearly H-1B cap of 65,000 was recently raised by an additional 20,000 visas for applicants who have earned at least a master's degree at a U.S. school, which industry representatives say is an attempt to encourage talent nurtured in the United States to stay on. This is an increasingly difficult task at a time when overseas opportunities and better schools in more nations are luring workers away, according to Compete America Chair Sandra Boyd.
    Click Here to View Full Article

  • "Fewer Women Find Their Way Into Tech"
    Denver Business Journal (08/14/05); Mook, Bob

    With the number of women venturing into technology careers at its lowest point since the 1970s, nonprofits such as the National Center for Women and Information Technology (NCWIT) at the University of Colorado's Boulder campus are endeavoring to find out why. A recent survey found that one-quarter of 1 percent of incoming female college freshmen listed computer science as a probable major, down from a mid-1980s mark of 4.25 percent. Girls are frequently dismissive of math and science at the secondary school level: only 15 percent of the students who took the science Advanced Placement exam in 2004 were female, even though girls accounted for 55 percent of all students who took AP tests. The dot-com collapse has eroded general interest in technology, though IT suffers from an image problem that specifically deters women from pursuing it as a career, said NCWIT CEO Lucy Sanders. Despite the Bureau of Labor Statistics' estimate that 1.5 million IT jobs will be created by 2012, many young people are discouraged by concerns over the emerging trend of offshoring tech jobs. Sanders is concerned that the exclusion of women from the IT sector will undermine the healthy collaboration between genders that often generates the best results. Rather than a gender-specific high school curriculum, Sanders advocates an effort by educators to overhaul the image of technology and make it more appealing to girls, debunking the myth of the isolated programmer alone in a cubicle for eight hours a day and emphasizing IT's broad relevance across a diversity of fields.
    Click Here to View Full Article

    For information on ACM's Committee on Women and Computing, visit http://www.acm.org/women.

  • "H-1B Visa Limits Reached for '06"
    SiliconValley.com (08/13/05); Johnson, Steve

    U.S. Citizenship and Immigration Services announced on Aug. 12 that all 65,000 H-1B visas for fiscal year 2006 had been claimed in record time, whereas the H-1B cap was not reached in fiscal 2005 until October. The announcement spurred business interests to push for expansions to the H-1B program: ITAA President Harris Miller said the program is critical to American competitiveness in the high-technology market. "We believe a significant increase is required to meet the need for specialized skills and keep companies--and, as a result, jobs for U.S. workers--growing at a steady pace," he declared. A law enacted last year provided an additional 20,000 H-1Bs for foreign workers with master's or higher degrees from U.S. institutions, and 8,000 of these visas have thus far been apportioned for fiscal 2006. Intel's Tracy Koon said her company has been unable to find highly skilled computer engineers and other professionals in the United States, adding that the number of U.S. students enrolling in engineering programs is insufficient. However, Ira Mehlman of the Federation for American Immigration Reform claimed U.S. companies should not be courting foreign workers, given the vast pool of recently out-of-work domestic talent.
    Click Here to View Full Article

  • "NIST Creates Online Treasure Trove of Security Woes"
    Federal Computer Week (08/15/05); Yasin, Rutrell

    The National Institute of Standards and Technology's (NIST) National Vulnerability Database (NVD) is a comprehensive repository of cybersecurity data culled from all publicly available vulnerability resources that also supplies references to industry resources. NVD creator and NIST computer scientist Peter Mell says about 12,000 vulnerability entries have been posted on the NVD Web site, with roughly 10 new postings added daily. The public will be able to use NVD to gain detailed information on flaws in specific products and trends in industry segments, while developers who must import vulnerability data into their security offerings could benefit as well, according to Mell. The database is constructed wholly on the Common Vulnerabilities and Exposures (CVE) naming standard maintained by Mitre, which is used by some 300 security products to spot vulnerabilities and expedite interoperability between those products; Mell says NVD will further facilitate compatibility by augmenting the CVE standard with detailed vulnerability data. The public can freely avail themselves of NVD's vulnerability information as an XML feed, and Mell says the database can also produce statistics that extrapolate vulnerability-discovery trends. Unlike the Homeland Security Department's Technical Cyber Security Alerts and Vulnerability Notes, which only notify the public about the most critical flaws, NVD offers "an encyclopedia of everything," reports Mell. SANS Institute research director Alan Paller notes that users can employ NVD to answer difficult queries such as whether software from specific vendors is flawed. NVD is sponsored by the DHS' National Cyber Security Division as a complement to the department's suite of vulnerability management products, Mell says.
    Click Here to View Full Article
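    The XML feed Mell describes is meant to be imported programmatically into security products. The sketch below shows what such an import might look like; the element and attribute names ("entry", "name", "severity") are a simplified, hypothetical schema for illustration, not NVD's actual 2005 feed format.

```python
# Sketch of importing vulnerability data from an NVD-style XML feed.
# The schema here is hypothetical and simplified; a real consumer
# would fetch and parse NVD's published feed format.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<nvd>
  <entry name="CVE-2005-0001" severity="High">
    <desc>Buffer overflow in example component.</desc>
  </entry>
  <entry name="CVE-2005-0002" severity="Low">
    <desc>Information leak in example service.</desc>
  </entry>
</nvd>"""

def load_entries(xml_text):
    """Return a list of (CVE name, severity, description) tuples."""
    root = ET.fromstring(xml_text)
    return [(e.get("name"), e.get("severity"), e.findtext("desc").strip())
            for e in root.findall("entry")]

def high_severity(entries):
    """Filter for entries flagged High, the kind DHS alerts cover."""
    return [name for name, sev, _ in entries if sev == "High"]

if __name__ == "__main__":
    entries = load_entries(SAMPLE_FEED)
    print(high_severity(entries))  # ['CVE-2005-0001']
```

    Because every entry carries a CVE name, two products that both consume the feed can refer to the same flaw unambiguously, which is the interoperability benefit the CVE standard provides.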

  • "Long Live AI"
    Forbes (08/15/05) Vol. 176, No. 3, P. 30; Kurzweil, Ray

    Ray Kurzweil, author of the forthcoming book, "The Singularity Is Near: When Humans Transcend Biology," envisions advancements in artificial intelligence that will lead to better health, a cleaner environment, and other innovations that promise to radically change commerce, business, and society. He contends that our economy is saturated with "narrow" AI technology in which intelligent machines equal or outperform human intelligence at specific tasks, examples of which include e-mail filtering, heart disease diagnosis, landing aircraft, and guiding autonomous weapons. Kurzweil figures meeting the hardware needs for "strong" AI technology that exhibits the full range of human intelligence is an attainable goal, one that should be met by 2020. Fulfilling the software needs requires reverse-engineering the human brain, and the author reasons that effective mathematical models for most of the brain will be realized within 20 years, thanks to exponential advances in spatial and temporal brain-scanning resolution. Also hastening progress toward strong AI is machines' increasing ability to recognize patterns, and Kurzweil anticipates a melding of AI and nanotechnology by the 2020s that yields robots the size of blood cells that travel through the body, communicating with one another via a wireless local area network and sending data and software to and from the Internet. These devices will be able to maintain health and extend life by killing disease and reversing the effects of age, as well as interact with neurons to enhance and/or supplant sensory input and facilitate fully immersive virtual reality, according to Kurzweil.

  • "IQ Test for AI Devices Gets Experts Thinking"
    New Scientist (08/12/05); Graham-Rowe, Duncan

    Researchers Shane Legg and Marcus Hutter of the Swiss Institute for Artificial Intelligence have devised a concept for an IQ test for artificially intelligent machines. The test is envisioned as an alternative to traditional measures of human intelligence, which are ill-suited to systems whose senses, environments, and cognitive capabilities differ from those of people. Legg says the ability to reach goals in a broad spectrum of environments is generally regarded as the critical mechanism of human intelligence, and this idea can be extended to an AI system by gauging its ability to perform complex tasks within its specific environment, and then measuring the complexity of that environment against those of other AI systems. However, Legg acknowledges that the AI community will have to reach a consensus on the definition of the average environment, which will be a difficult task. Blay Whitby of the University of Sussex says many people are likely to oppose Legg and Hutter's test for a number of reasons: Not only do some people doubt that goals are an element of intelligence, but many might object to the possibility that the test would suddenly rate numerous computer programs as intelligent. Still, Whitby thinks the test is a good starting point, since it challenges the AI community to develop a universally accepted classification of intelligence. "This is a very important--perhaps the most important--issue to be resolved for the future of AI," he argues.
    Click Here to View Full Article
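    Legg and Hutter's "goals in a broad spectrum of environments" idea can be written as a single weighted score; the equation below is a sketch of the form they later published as universal intelligence, not the article's own notation.

```latex
% Universal intelligence of agent \pi: expected performance V_\mu^\pi
% summed over all computable environments \mu in the class E, each
% weighted by its simplicity (2^{-K(\mu)}, K = Kolmogorov complexity).
\Upsilon(\pi) = \sum_{\mu \in E} 2^{-K(\mu)} \, V_\mu^\pi
```

    Simpler environments dominate the score, while the sum over all of E captures the "broad spectrum" requirement; the consensus problem the article mentions corresponds to choosing the environment class E and its weighting.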

  • "Exploring the Life Robotic"
    Wellesley Townman (08/11/05); Hinchliffe, Beth

    Wellesley College physics professor Robbie Berg combines learning and excitement in initiatives such as his Math and Science Camp for Girls and the 16th Annual Young Science Program sponsored by the Committee for the Branch Libraries. Berg discussed and demonstrated robotics before an audience of children at the Hills Branch Library on Aug. 9, where he was impressed with the sophisticated answers some attendees gave to his challenging questions. The machines Berg demonstrated were created by his students, and the most notable invention was a robot firefighter that could negotiate a maze and put out a candle without touching any walls, extinguishing the flame with the release of air from an attached balloon that bursts from the candle's heat. Berg spotlighted a new approach to programming that is funded by a Lego grant. "The actual typing can get in the way of the ideas of programming, especially for little kids," he explained. "So we've created PICO blocks, a whole new way of thinking about programming, where you can drag interlocking icons to build up your program." His demonstration involved the faithful replication of a single note on a computer with a few keystrokes, and Berg wowed the audience even more by attaching pickles to a pencil, linking it to clips on the computer, and sliding the pickles back and forth, causing the programmed note to change in pitch.
    Click Here to View Full Article

  • "Kept Alive by Open Source"
    InternetNews.com (08/05/05); Kerner, Sean Michael

    Although the continuous influx of new software often renders older programs commercially obsolete, the open-source movement has extended the lifespan of many otherwise archaic applications, such as Gopher and DECnet. Software developers look to breathe new life into old programs for a variety of reasons: In some cases there is a legitimate problem to be solved, while other times it is simply a case of nostalgia. One method for preserving old applications employs new operating systems, most frequently Linux, as a platform on which to run them. The DECnet for Linux project has revived the peer-to-peer networking protocol from the 1970s that had faded into obscurity due to the pervasive emergence of TCP/IP; and the beloved Atari 2600 game system is still alive and well thanks to the open-source project known as Stella, which emulates the original on Linux, Mac OS X, and Windows. Open-source movements can also revive old computers through the application of modern operating systems, such as the LUnix project that enables the Commodore 64 to run as a Linux machine. Similarly, the Linux/APUS Kernel project has revived Commodore's Amiga by bringing Linux to its hardware. The Amiga is the singular focus of the Amiga Research Operating System (AROS), which aims to update the AmigaOS 3.1 operating system. The Amiga is still available commercially, but AROS developer Aaron Digulla believes the system's real hopes for survival lie with its open-source application: "AROS is one of the few things around the Amiga which seem to be alive," he says.
    Click Here to View Full Article

  • "Speech Verification Secures More Enterprise Apps"
    eWeek (08/03/05); Dyszel, Bill

    Voice prints are quickly replacing typed passwords and PINs as a user-friendly and more effective means of user authentication, according to a recent panel at SpeechTEK 2005. However, Diaphonics CEO Andy Osburn explains that voice prints are suitable only for certain situations. The technology is beneficial for law enforcement agencies that need to provide field-based officers with quick access to data and for banks that need to provide mobile workers with access to wire transfers. Sometimes, though, the technology's very simplicity leads users to doubt the safety of their data and therefore use the system less, says Intervoice design specialist Jennifer Wilmer.
    Click Here to View Full Article

  • "A Conversation with David Anderson"
    Queue (08/05) Vol. 3, No. 6, P. 18

    David Anderson of the U.C. Berkeley Space Sciences Laboratory directs grassroots supercomputing efforts such as SETI@home and the Berkeley Open Infrastructure for Network Computing (BOINC), in which volunteers donate their PCs' unused computing cycles to scientific research. "Volunteer computing can provide computing power and storage capacity way beyond what can be achieved with supercomputers, clusters, or grids," says Anderson, who notes that the BOINC project was created to address the shortcomings of early volunteer computing projects. These deficiencies included software development that was tougher and costlier than projected, and the resulting systems' lack of interoperability. BOINC establishes an infrastructure that can be used by both existing and future volunteer computing projects by accommodating those projects' requirements in a generic and simple manner. Anderson says a distinction must be drawn between volunteer computing and grid computing: The former involves the contribution of computing time by participants who are not accountable to projects, which means volunteers' lack of trustworthiness must be reflected in the infrastructure software via a redundant computing mechanism; the latter encompasses the sharing of secure, trusted, and centrally managed resources within or between organizations that are accountable to each other. BOINC features a digital signature mechanism to prevent its use as a tool for virus propagation. Volunteer computing can make up for the lack of financial resources needed to fulfill the computing power requirements of projects such as SETI@home, although Anderson says volunteers must have an incentive for donating their CPU cycles--namely, the feeling that they are making a worthwhile contribution to an interesting area of research.
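    The redundant computing mechanism Anderson mentions can be sketched simply: the same work unit goes to several untrusted volunteers, and a result is accepted only when a quorum of replicas agree. The quorum size and comparison-by-equality below are illustrative choices, not BOINC's actual validator code.

```python
# Sketch of redundant computing for untrusted volunteers: accept a
# work unit's result only when enough independent replicas agree.
# Quorum size and exact-match comparison are illustrative choices.
from collections import Counter

def validate(results, quorum=2):
    """Return the canonical result if at least `quorum` replicas
    agree on it; return None to signal that more replicas are needed."""
    if not results:
        return None
    value, count = Counter(results).most_common(1)[0]
    return value if count >= quorum else None

if __name__ == "__main__":
    # Three volunteers returned results for one work unit; one is bad.
    print(validate(["3.14159", "3.14159", "9.99999"]))  # 3.14159
    print(validate(["a", "b"]))  # None -> issue another replica
```

    A real validator would also have to canonicalize results (floating-point output can differ legitimately across platforms) before comparing them, which is part of why volunteer-computing infrastructure proved harder to build than early projects expected.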

  • "Driving IT Initiatives and Network Innovation"
    Campus Technology (08/05) Vol. 18, No. 12, P. 20; Grush, Mary

    Duke University CIO and National LambdaRail (NLR) Chairman Tracy Futhey reports that the campus networked environment is expanding in terms of access as well as the kinds of users exploring new avenues of application. "The more you've got broad access and experimentation through this commodity network...and the fact that so many people are being connected so easily and inexpensively, without much effort, leads to a lot of individual situations where we can try new applications," she says. "These new uses are being tried by individuals who are not necessarily developers, but may be people who have a different perspective and a great idea worth trying." Futhey explains that experimentation on campus networks is being encouraged by three drivers: Students, who are willing to adopt new technologies to make their lives more convenient, at least initially; the IT organization, which is focused on furthering its agenda through innovation; and the faculty, through their enthusiasm to employ and support technology in the classroom as well as in the research environment. Futhey also mentions a fourth driver, which is Duke's devotion to applying technology to all levels of campus life and the campus environment as part of the institution's academic strategic plan. She is excited by progress at the national, regional, and campus levels in achieving end-to-end high-speed access through newly available optical networking capabilities, and cites national-level projects such as Internet2 and NLR. Futhey also thinks grid and cluster computing are significant tools for improving research computing environments as well as giving faculty opportunities for cross-discipline collaboration.
    Click Here to View Full Article

  • "Extra-Preneurship"
    Futurist (08/05) Vol. 39, No. 4, P. 47; Snyder, David Pearce

    Traditional hierarchical business-management systems are being transformed by information technologies into extra-preneurships, or integrated virtual networks based on collaboration and self-actualization. Outsourcing is forcing organizational changes on the large enterprise so it can sustain itself in a global, tech-driven economy, but the emergent practice of open sourcing has even more potential to improve economic performance by allowing ordinary workers to create economic value. It is expected that a tidal wave of innovations, rather than any single technology, will drive the next techno-economic revolution. An open collaborative environment in which rank-and-file employees share their knowledge with their counterparts both inside and outside the organization will be needed if this tsunami is to be successfully weathered. Open collaboration will also enable workers to make their own work incomparably valuable, so they can earn higher compensation than what the global market pays their peers for comparable work elsewhere. Open knowledge-sharing networks are scarce among blue-collar and gray-collar employees because of three main obstacles: a lack of on-the-job Internet access, which should change over the coming years as telephone service and the Internet are integrated; not enough free time on the job for rank-and-file workers to devote to uncompensated work; and an apparent dearth of technical knowledge. It is practically a given that large institutions with a nationwide presence will need to support and operate online collaborative networks, and labor unions, employment agencies, and trade/industrial association networks are likely candidates.

  • "From Conception to Adolescence, the Speech Industry Is Developing a New Breed of Designers/Developers"
    Speech Technology (08/05) Vol. 9, No. 10, P. 61; Owens, Stephanie

    Speech technology standards such as VoiceXML and SALT are allowing skills to be transferred between the previously separate fields of design and development, and individuals who combine such expertise will become increasingly desirable in the speech industry as market innovation and compelling product offerings fuel enterprises' search for superior customer solutions. This talent is being cultivated in places such as Tufts University's Experimental College, which hosts a class that teaches students how to design and produce speech recognition systems. Students spend half their class time making the systems work, and the other half learning that the design of every aspect of a system is critical to the caller. Areas of concentration include working with the system's audio portion by recording and enhancing their own voices and friends' voices, as well as usability testing with professional voice talent. Enterprises are after people with experience in human factors, who are adept at meeting technical challenges such as call flow reversion, good-sounding prompts, and creating systems that can be pitched fairly rapidly. Human factors is "the scientific discipline concerned with the understanding of interactions among humans and other elements of a system, and the profession that applies theory, principles, data, and other methods to design in order to optimize human well-being and overall system performance." Human factors-oriented curricula became available to students once speech technologies entered the mix.
    Click Here to View Full Article

  • "Reinventing the Smart Phone"
    Software Development (08/05) Vol. 13, No. 8, P. 36; Lum, Rosalyn

    In a roundtable discussion, members of Forum Nokia PRO concur that the personal digital assistant (PDA) has been or will be supplanted by the smart phone in their local markets, though the majority foresee the emergence of a converged device. Consilient director of product management Steve Coache says the Canadian PDA market is smaller and more fragmented than the U.S. market, and cites a lack of standards for mobile device integration. He expects a tech convergence of handhelds and devices enabling compatibility with the rollout of Wi-Fi and improved wireless bandwidth, and stresses the need for a standard device-management interface. Zenitum Entertainment Computing President Albert Kim anticipates the emergence of wearable cell phones, although their marketability will hinge upon a larger video screen to display multimedia information; he says Korean consumers do not like PDA phones because of their bulkiness, adding that new PDAs must be leaner and offer dual communication channels. Mark Hillsdon, CEO of Hong Kong-based Barefoot Software, and Epocware/Paragon Software's Michael Fadeev agree that although a standard operating system would be advantageous from a developer's perspective, the market would stand to benefit more from competition between rival platforms. Christof Hellmis of Germany's gate5 explains that smart phones are more popular than PDAs in Europe, and expects America's love affair with data-centric devices to continue, while connected PDAs are likely to incorporate phone capabilities. U.S. Quickoffice's Craig Senick says many people desire a mobile phone that incorporates PDA functionality and other technologies, and he expects the U.S. smart-phone market to catch up with other markets once network speed and penetration rates increase.
    Click Here to View Full Article
    (Access to the full article is available to paid subscribers only.)

  • "Social Machines"
    Technology Review (08/01/05) Vol. 108, No. 8, P. 44; Roush, Wade

    Technologies that deliver uninterrupted connectivity are evolving into an infrastructure that supports continuous computing. Continuous-computing technologies follow us throughout our lives because they are more readily adaptable to our locations, schedules, and preferences, and they enable us to create online identities that are more accurate reflections of who we really are. With such devices firmly rooted in the social architecture, spending time "on the computer" will no longer feel like a distraction or interruption. Continuous computing stems from three enabling elements: The first is the increased simplicity and affordability of Internet access. The second element is the proliferation of cheap, wireless computing devices, wireless laptops in particular. The third driving force behind continuous computing is the Web's transformation into a platform for personal publishing and social software through Web-based services built using standardized programming tools and languages primarily developed by the open-source software community. With such tools, Web developers can construct "social services" that collate and redistribute the knowledge of large communities. These services will become more powerful as more and more people use them.
    Click Here to View Full Article

  • "Instant Messaging: A New Target For Hackers"
    Computer (07/05) Vol. 38, No. 7, P. 20; Leavitt, Neal

    The growing popularity of instant messaging (IM), especially among businesses, has made it an increasingly attractive target to phishers, malware authors, and other attackers. IMlogic CTO Jon Sakoda says IM attacks can propagate rapidly thanks to IM's real-time capabilities. Other factors encouraging IM attackers include a lack of safe computing practice among users; the false sense of security users feel due to IM's immediacy and informality; growing functionality and complexity of IM systems; and an absence of corporate IM-use policies. Messaging providers and security companies are attempting to thwart or mitigate IM attacks by monitoring and analyzing IM security risks through the IMlogic Threat Center and similar efforts, and are also educating consumers about safe computing practices. Many IM virus outbreaks cannot be halted by traditional antivirus technology, which fails to keep up with the rapid spread of IM communications. However, virus throttling shows promise as a method for slowing down and limiting the damage of messaging worm propagation. Furthermore, major IM networks are amending their clients to combat buffer overflow attacks enabled by substandard programming and memory management.