HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.

ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 5, Issue 458: Friday, February 14, 2003

  • "Software Success Has India Worried"
    New York Times (02/13/03) P. W1; Rai, Saritha

    India's fast-growing software industry is worried that political trends in the United States will lead to a pullback from offshore outsourcing. India's software industry trade group, Nasscom, held a meeting in Bombay this week with executives from American firms, but discussion of new legislation in New Jersey was cut short by one of Nasscom's vice chairmen out of fear it would make the issue even larger in the public eye. American delegates agreed the issue was best played down because of the strong emotions it evokes. Like manufacturing jobs that left the United States in previous decades, white-collar IT jobs are now leaving the country as the majority of the world's 500 largest companies outsource overseas. Recently, Sen. Shirley K. Turner (D-N.J.) proposed a bill that would require public agencies to contract only with firms using American-based workers unless the necessary skills could be procured only elsewhere. Other states, including Connecticut, Maryland, Wisconsin, and Missouri, are considering similar legislation, and Nasscom has dispatched a public relations firm and lobbyists to turn political sentiment in the United States. The group is also considering appealing to the World Trade Organization, since Nasscom President Kiran Karnik says such laws amount to nontariff trade barriers. Research firm Gartner recently suggested that Indian software firms take a proactive stance and promote the benefits of overseas outsourcing to the public, not just to corporate clients.
    (Access to this site is free; however, first-time visitors must register.)

  • "Development Continues on Pentagon's Massive Data-Mining System"
    Associated Press (02/13/03); Crenson, Sharon L.

    The Defense Department has awarded over $20 million in contracts to develop the Total Information Awareness (TIA) system, a program that aims to integrate private and public databases so they can be mined for signs of potential terrorist activity. Jan Walker of the Defense Advanced Research Projects Agency (DARPA) reports that 26 bids were submitted and four companies received contracts; DARPA has outlined five years for TIA development, three of which would be spent developing ideas and demonstrations, and two focusing on the most promising concepts. Critics of TIA such as Sen. Ron Wyden (D-Ore.) are concerned that the system will be used to spy on innocent citizens and undermine Americans' civil liberties; the Defense Department has responded by announcing a TIA oversight panel and denying allegations that it is building a massive database. TIA contractors include Raytheon, Telcordia, and Cycorp, which is working on a prototype database; Cycorp President Doug Lenat notes that company researchers have already created a system designed to recognize phone-calling patterns that could be typical of potential overseas terrorists. The contractors note that they are using bogus data in their TIA work.

    To read about ACM's reactions to TIA, visit http://www.acm.org/announcements/tia.html.

  • "Data Flood Feeds Need for Speed"
    Wired News (02/13/03); Dean, Katie

    An international team of physicists raised the bar for high-performance computing last November by sending 6.7 GB of uncompressed data at 923 Mbps between California and Amsterdam in 58 seconds over the Internet2 broadband network. The transfer ran 3,500 times faster than an average household broadband connection; the California PC ran Red Hat Linux and the Amsterdam PC ran Debian GNU/Linux. Data was carried in 9,000-byte jumbo frames, which outsize the packets usually sent over the Internet by a factor of six; Cisco supplied the router. Stanford Linear Accelerator Center director of computer services Les Cottrell says such speed upgrades are essential if physics researchers around the world are to share knowledge and collaborate in real time, especially with terabytes of data being generated each day. High-speed networking, for example, will be critical if American scientists are to access data gathered by the Large Hadron Collider (LHC) in Switzerland, notes Internet2's Greg Wood. Cottrell adds that high-speed data transfer will also be a boon to astronomy, the Human Genome Project, and telemedicine. The international team hopes to beat its speed record through a collaborative project with Intel and the Los Alamos National Laboratory.
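    The quoted figures are internally consistent; a quick back-of-the-envelope check in Python (purely illustrative):

```python
# Check the reported transfer rate: 6.7 GB of data in 58 seconds.
gigabytes = 6.7
seconds = 58
bits = gigabytes * 1e9 * 8           # decimal gigabytes -> bits
rate_mbps = bits / seconds / 1e6     # megabits per second
print(round(rate_mbps))              # ~924 Mbps, matching the reported 923 Mbps
```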

  • "Scheme Smoothes Parallel Processing"
    Technology Research News (02/19/03); Patch, Kimberly

    A team of scientists from several universities and research institutes has derived, from models found in nature, a way to coordinate massive parallel processing applications more closely. The researchers studied the growth of crystals and used a mathematical equation from that process to keep parallel processing in sync. Simulating large battlefield scenarios or managing cell phone networks requires huge amounts of processing power and programs that distribute the workload among connected machines. Keeping those distributed operations running smoothly and in sync has previously required central management, but the newly discovered method achieves the same result more efficiently without it. Instead, each computer in the network is linked to its nearest neighbors, as well as to one other random computer further away. The result is a small-world network similar to the social networks that allegedly link every person on the planet through six degrees of separation. The computers do not have to wait for the entire group to come into sync periodically, as in traditional models, but only have to check with one other computer. How closely the processing is synced can be varied with the frequency of the checks, which is controllable through the equation used. The research team found that the system model scales well to large networks and is likely to be ready for application in two years. Researchers involved in the project are from Mississippi State University, Rensselaer Polytechnic Institute, Los Alamos National Laboratory, and Florida State University.
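    The wiring scheme described above can be sketched in a few lines of Python. This is a toy model: the function names and parameters are invented, and the relaxation step merely stands in for the crystal-growth equation, which the article does not give.

```python
import random

def small_world_links(n, seed=0):
    """Toy small-world topology (n >= 4): each node links to its two
    ring neighbors plus one random distant node."""
    rng = random.Random(seed)
    links = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}
    for i in range(n):
        far = rng.randrange(n)
        while far == i or far in links[i]:
            far = rng.randrange(n)
        links[i].add(far)
        links[far].add(i)  # links run both ways
    return links

def sync_step(times, links, rate=0.5):
    """One relaxation sweep: each node nudges its local virtual time
    toward the average of its neighbors' times, with no central
    coordinator involved."""
    return [t + rate * (sum(times[j] for j in links[i]) / len(links[i]) - t)
            for i, t in enumerate(times)]
```

    In simulation, repeated `sync_step` sweeps pull the nodes' virtual times together using only local checks, which is the essence of the decentralized scheme.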

  • "Some Experts Say Cyberterrorism Is Very Unlikely"
    Minneapolis Star Tribune (02/13/03); Alexander, Steve

    Dire predictions of terrorists launching "an electronic digital Pearl Harbor" against critical U.S. infrastructure espoused by former White House cybersecurity adviser Richard Clarke and others are being disputed by some experts, given the complexities involved in launching a coordinated cyberattack and a shortfall of technical skills in potential terrorist states. Center for Strategic and International Studies analyst James Lewis downplays Internet security problems, noting that essential U.S. networks are far more heterogeneous, profuse, redundant, and capable of self-repair than alarmists have indicated. He adds that the intricacy of critical infrastructure systems--some of which are very old and are separate from the Internet--is another barrier. "A hacker or even a large group of hackers would need to find vulnerabilities in multiple systems to significantly disrupt the power supply, and even then, an attack might only disrupt service for a few hours," Lewis postulates. Righard Zwienenberg of Norman Data Defense Systems reports that cyberterrorists could conceivably cripple the Internet through a global denial of service attack, but notes that they will only have a short window of opportunity to exploit it before computer security firms update their detection software to handle it. Meanwhile, Symantec says in its Internet Security Threat Report for the second half of 2002 that analysis of Internet attacks demonstrates that hackers in possible terrorist nations have insufficient know-how to launch serious cyberattacks, and are further impaired by their reliance on outdated methods. Gartner concludes that such attacks will have minimal impact on the nation, given terrorists' current level of technical proficiency, though Gartner research director French Caldwell predicts that a targeted attack on a critical component of U.S. infrastructure will be developed within the next two years.

  • "Perspective: Explaining the Tech Brain Drain"
    CNet (02/13/03); Kanellos, Michael

    The United States is suffering a significant decline in proficient technology professionals--the National Science Foundation reported last week that the number of graduating science and engineering Ph.D.s slipped to 25,509 in 2001, while 41.1 percent of the doctorates were awarded to American citizens, compared to 43.3 percent in 1998. Meanwhile, France, Spain, China, Russia, India, and other countries are becoming more competitive in their research and development push. The United States is also outsourcing many tech operations to nations with lower labor rates in an effort to cut costs. The United States should adopt a three-pronged strategy to revive its tech workforce, writes Michael Kanellos. One step is invigorating the educational system to lure more high-school graduates into science and engineering; key to this is demystifying those fields for students. Another is to encourage the immigration of foreign students and workers to the United States by, for instance, awarding green cards to graduates and Ph.D.s. The third is for the country to analyze its basic tech strengths, such as project management and marketing.

  • "Are Developers Programmers or Engineers?"
    InfoWorld.com (02/12/03); Krill, Paul

    At the recent VSLive show in San Francisco, industry veterans Alan Brown and Alan Cooper discussed many of the problems endemic to software project management. Cooper, who is thought to have fathered the Visual Basic programming language, said that software programmers are frequently mislabeled as engineers, the difference being that engineers find solutions while programmers implement them. "Web designers are called programmers, programmers are called engineers, and engineers are called architects, and architects don't seem to ever get called," he exclaimed. Brown, who directs Rational Software's Rational Development Accelerator Initiative, noted that productivity could be significantly improved through component-based development so that software can be reused across disparate projects. He argued that at one end of the project management spectrum are projects developed according to individual input, and at the other are teams that operate consistently and predictably, and follow an unchanging life cycle. Cooper said that software developers generally work without supervision, which is a recipe for disaster; he advised that software projects should include people who have users' needs in mind to keep developers on track. Furthermore, he pointed out that an "adversarial relationship" between managers and programmers often leads to inaccurate time estimates.

  • "Disposable Science"
    Washington Times (02/13/03) P. B1; Boston, Gabriella

    The buildup of obsolete electronics in landfills can be reduced significantly thanks to a bevy of recycling and reuse programs, some of which are organized by computer manufacturers themselves. For instance, owners of old computers can visit Dell's Dellexchange.com and choose to auction off their hardware, donate it to a nonprofit, exchange it for a newer model, or recycle it. Meanwhile, Best Buy customers with old electronics can drop them off at annual recycling events held by several stores nationwide. Cell phone companies such as AT&T allow people to turn in outdated products at their retail outlets. Montgomery County, Md., has a permanent drop-off site for spent electronics in Rockville, managed through its Computer Recycling Program. Recycling the plastic components of electronics is difficult, and one alternative is to use a biodegradable polymer, according to Philip DeShong of the University of Maryland. Some manufacturers are designing more environmentally friendly products, such as smaller cell phones and desktops with lead-free flat-screen monitors. However, the EPA estimates that over 3.2 million tons of used electronics are dumped in landfills each year, while only 11 percent of obsolete computers were recycled in 2001. Proponents such as DeShong advocate a two-pronged strategy for electronics recycling: Manufacturers must design their products to be less detrimental to the environment, while consumers must adopt better recycling habits.

  • "Hey, Hasn't My Computer Heard You Somewhere Before?"
    New York Times (02/13/03) P. E8; Headlam, Bruce

    Accenture researchers Dana Le and Owen Richter have created a prototype portable computer that can record conversations, when they took place, and where they occurred. The device, known as a personal awareness assistant, could be used to remind users of critical information gleaned during social interactions--people's names, phone numbers, and so on--without being too obtrusive, according to Le. The first prototypes included a wearable computer with Global Positioning System (GPS) electronics, speech recognition software, an earbud, and two microphones, one of which records ambient sound in a 30-second loop while the other records user cues. The device's refinement was largely based on two realizations: that people are more comfortable carrying cell phone-like devices and earbud accessories than a full computer, and that keeping only audio files on the device makes it less intrusive. Richter says his lab created the personal awareness system to demonstrate future technology to Accenture customers rather than for commercial gain. The Accenture researchers say the most obvious application for the device would use the location-stamping feature, such as to remind wearers to buy products whenever they are near a grocery store; another would be to enhance collaboration by having one person feed information to the wearer via the earbud. However, Steve Mann of the University of Toronto says that many people are uncomfortable wearing earbuds, and is skeptical that audio information alone is enough to support collaboration and other activities. Meanwhile, Charmed Technology COO Joe DeVita muses that the technology's adoption by mainstream users could be hampered, since recording conversations is often viewed as a violation of privacy.
    (Access to this site is free; however, first-time visitors must register.)

  • "Perl Features of the Future--Part 1"
    NewsFactor Network (02/12/03); Brockmeier, Joe

    Perl creator Larry Wall is gearing up for a dramatic makeover of the programming language, which resembles natural language in many ways. For version 6, Wall has taken more than 300 requests for change into consideration, from what some consider to be the most dedicated and innovative development community. In his design documents released to the public, Wall has hinted at the changes to be implemented. Nat Torkington, Perl project manager for version 6, says the new iteration will clear out outdated legacy code that kept previous versions of Perl backwards compatible but complicated the integration of new features. Perl 6 will also have support for more recent features, such as threading and Unicode, built in from the ground up, with the help of a new interpreter called Parrot. Torkington also says the new regular expressions in Perl 6 add capabilities and make adapting the language to newer technologies easier. However, those involved in the project do not expect Perl 6 to catch on quickly, and instead expect a gradual migration over several years, similar to how businesses moved from Perl 4 to Perl 5. Perl expert Randal Schwartz says that development on Perl 5 will continue from the most current stable version, Perl 5.8, to Perl 5.10 and even Perl 5.12, since developers realize businesses cannot easily switch their mission-critical applications to new platforms.

  • "Perspective: Taking the Easy Way Out on H-1B"
    CNet (02/12/03); Zisman, Ronald A.

    The Immigration and Naturalization Service's annual report estimates that approximately 17 percent of the 1,064,318 immigrants granted permanent residence in the United States in fiscal 2001 were admitted under employment-based preferences, and only 7.9 percent of that number qualified under a category requiring at least a bachelor's degree. Some people in this category were brought in on H-1B visas. The percentage of immigrants in the U.S. population--8 percent--is at its lowest point in almost 80 years. The United States should develop programs to attract more immigrants with IT skills, as the European Economic Commission has done, writes Ronald A. Zisman. He adds that qualifications for U.S. immigration should favor highly educated, first-generation professionals capable of earning high incomes who will pass their work ethic on to their children. Zisman argues that critics focusing on H-1B workers are overlooking a transformative event in the technology industry--global software development outsourcing, which is being spurred by increasing numbers of educated workers overseas, falling product manufacturing costs, and Internet-based transfer of modular technology. Both European and American companies are outsourcing to overseas firms or building in-house offshore sites, which is significantly reducing the incidence of "bodyshopping" that ran rampant during the IT boom of the 1990s. U.S. companies were expected to spend $7 billion on global software outsourcing in 2002, up from $5.5 billion in 2000.

  • "Forget Moore's Law"
    Red Herring (02/10/03); Malone, Michael S.

    Former Forbes ASAP editor and author of "The Microprocessor: A Biography" Michael S. Malone warns that the high-tech industry's fixation on Moore's Law, which dictates that processing power and chip density double every 18 months, is an invitation to disaster. He does not doubt the veracity of Moore's Law, but writes that re-invigorating the electronics industry should be of greater concern. Malone argues that blind adherence to Moore's Law by entrepreneurs hoping to capitalize on e-commerce--which was encouraged by investors, stock markets, and venture capitalists--led to the dot-com boom and bust and the resulting economic fallout. A similar meltdown occurred in the telecommunications sector, while the biotechnology industry is on track to suffer the same fate. Malone points out that many high-tech experts are realizing that the evolution of business and society cannot keep pace with physics. He cites Google CEO Eric Schmidt's announcement several months ago that his company will not adopt the Itanium superchip as a watershed event that symbolizes a major change in thinking away from Moore's Law. Google has opted instead to embed smaller and less expensive chips in future servers because Schmidt says he is after "maximum functionality" rather than "maximum power." Companies that do not recognize this revolutionary shift in thinking will suffer greatly, predicts Netscape co-founder Marc Andreessen.

  • "Coalition Makes Play For Multiband"
    Internet.com (02/03/03); Lipset, Vikki

    At an IEEE 802.15.3a Task Group conference held last month in Florida, a group of five companies organized its own forum to mull an ultrawideband (UWB) standard based on multiband technology. However, most ultrawideband developers favor a singleband approach and have devoted considerable resources to its development. The group's members--Intel, Discrete Time, Time Domain, General Atomics, and Wisair--each concluded independently that "It's going to be difficult to deploy this commercially as singleband," according to Mark Bowles of Discrete Time. At present, the FCC allows UWB equipment to run in a 7.5 GHz-wide transmission band. Multiband technology splits that spectrum into 500 MHz bands that work together to route data around interference from outside systems. Ben Manny of Intel Research and Development says multiband technology is advantageous because it can coexist with multiple systems (especially 802.11a-based WLANs), can scale to faster data rates in coming years, and is flexible enough to adjust to varying spectrum allocations worldwide. On the other hand, a multiband approach is more difficult, costly, and power-hungry than singleband technology. The informal coalition will have opportunities to formally present its proposals to the IEEE in March and July, but an official standard could take up to 18 months to be established.

  • "Professor Directs Two Tech Efforts"
    Chicago Sun Times (02/13/03); Lundy, Dave

    Kris Hammond, founder and director of Northwestern University's Intelligent Information Laboratory (InfoLab) and Information Technology Development Laboratory (DevLab), studied and developed artificial intelligence for 12 years at the University of Chicago, and moved to Northwestern to apply his research to real-world problems. He says the purpose of InfoLab is "to reduce the friction that people constantly encounter when trying to find information, both online and offline" through a combination of artificial intelligence, information retrieval, and cognitive science. Hammond founded DevLab as a facility to help transfer academically-developed technologies to the commercial sector and build thriving, Chicago-based tech businesses; both labs dovetail with Northwestern's goal of nurturing students' programming and software engineering skills. He says "our students would like to know how to be not just programmers, but software engineers." Projects under development that Hammond thinks hold great potential include Watson, an information retrieval system that employs artificial intelligence, and a program incorporated into a TiVo box that collates closed-caption information from TV programs and presents relevant data in a micro-site built in real time. Hammond thinks the local business community would benefit significantly by using the university as a resource to solve problems, while academics would gain a better knowledge of business problems. He says that DevLab's chief purpose right now is to build value rather than make money, and praises Northwestern for not subscribing to the traditional license-and-leave practices of many tech transfer programs.

  • "Moore Predicts More Computing Advances"
    BBC News (02/11/03); Shiels, Maggie

    Dr. Gordon Moore, the co-founder of Intel, said his dictum that semiconductor density would continue to double every two years will remain true for another decade at least. After that, he said advances in silicon-based computing may slow, but there is no brick wall engineers' creativity cannot surmount. Moore spoke at the International Solid-State Circuits Conference's 50th anniversary meeting in San Francisco. "No exponential is forever," he told the gathering. "Your job is to delay forever." Heat and power leakage are the most dangerous threats to Moore's Law, he said, warning that a laptop requiring a kilowatt of electricity would be "very uncomfortable." Recently, Red Herring magazine reported that Moore's Law threatened the viability of the semiconductor industry since consumers would not be willing to pay for maximum performance when they already have access to suitable functionality (see news item above). Moore said such thinking did not take into account the cost-reduction aspect of rapid technological advance, and said as long as the speed of innovation kept up, companies necessarily had to compete on that front. He also said alternative platforms such as nanotechnology, molecular computing, and quantum computing would not be able to displace silicon technology for a long while, and estimated a cumulative $100 billion had been spent on silicon development to date. However, Moore did say that those technologies would likely supplement silicon-based computing eventually.

  • "Tangled Up in Spam"
    New York Times Magazine (02/09/03) P. 42; Gleick, James

    The rapidly growing tide of junk email, or spam, is inundating ISPs, infringing on email users' privacy, and eroding people's trust in the Internet, and thus far efforts to control it through grass-roots activism, spam filtering software, and legislation have been ineffective. Spammers have been aided by the email explosion of the early 1990s, as well as by service providers hungry for market share and the Internet's relatively open architecture. The FTC keeps a vast database of spam samples submitted by consumers, yet cannot really put pressure on spammers, since spam is not considered obvious fraud and therefore unlawful, despite the fact that spammers use deceptive means to evade being tracked down. Popular spam filtering programs become less effective as time goes on, because spammers are constantly looking for new ways to circumvent such blocks. Another problem with filtering programs, whether built into email programs by ISPs or developed by independent programmers, is the risk of blocking legitimate email. Solutions that may reduce such false positives include programs like SpamAssassin, which labels email as spam using an ever-growing system of rules and points based on features such as the appearance of certain words or requests for credit card numbers. A more recent technique is probabilistic filtering, in which the software tracks all the words in each email and determines their statistical probabilities as spam indicators--but even this method is not foolproof. Anti-spam legislation has been stymied by both corporate marketers and Internet supporters' reluctance to accept government regulation, although the Direct Marketing Association recently admitted that a lack of legal safeguards will be detrimental to legitimate marketers. Consumer Project on Technology director James Love argues that fining spammers would be an effective deterrent, as would an international anti-spam treaty; another, simpler solution would be to make it illegal to forge Internet headers and to require unsolicited bulk email to be tagged as such.
    (Access to this site is free; however, first-time visitors must register.)
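    The probabilistic filtering described above can be illustrated with a toy word-frequency scorer in Python. Everything here is invented for illustration (the training data, function names, and the naive averaging of per-word probabilities); real filters such as SpamAssassin combine many more signals.

```python
from collections import Counter

def train(spam_docs, ham_docs):
    """Count word frequencies in known spam and known legitimate mail."""
    spam = Counter(w for d in spam_docs for w in d.lower().split())
    ham = Counter(w for d in ham_docs for w in d.lower().split())
    return spam, ham

def spam_probability(word, spam, ham):
    """Estimate P(spam | word) from relative frequencies, with
    add-one smoothing so unseen words stay near 0.5."""
    s = (spam[word] + 1) / (sum(spam.values()) + len(spam) + 1)
    h = (ham[word] + 1) / (sum(ham.values()) + len(ham) + 1)
    return s / (s + h)

def score(message, spam, ham):
    """Naively average per-word probabilities into one spamminess
    score; real probabilistic filters combine them with Bayes' rule."""
    probs = [spam_probability(w, spam, ham) for w in message.lower().split()]
    return sum(probs) / len(probs)
```

    After training on a handful of labeled messages, a message full of spam-associated words scores well above 0.5 while ordinary mail scores below it, which is the statistical effect the article describes.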

  • "Intel Looks to Software"
    Computerworld (02/10/03) Vol. 37, No. 6, P. 33; Anthes, Gary H.

    Although best known as a hardware company, Intel is also a formidable software operation with more than 6,000 programmers in its ranks. Late last year, Intel created four senior fellow positions at the top of its research division, two of them going to director of microprocessor research Justin R. Rattner and general manager of software and solutions Richard Wirt. The pair say compiler software is becoming more effective through OpenMP, which lets programmers prepare code for hyperthreading and multithreading beforehand, and through dynamic compilers, which optimize operations even as the program is running. Rattner says the key to this type of dynamic optimization is program instrumentation that gives the compiler visibility into runtime execution. Wirt says Intel aims to build more and more parallelism into its hardware: first through threaded single-processor operations, then by placing multiple threaded processors on the motherboard, and finally through huge strings of multi-core processors. Existing applications can take advantage of this parallel processing with helper threads that line up data that would otherwise be missing and have it ready for the main thread, yielding performance boosts of 1.3 to 1.6 times on average. Wirt says Simple Object Access Protocol (SOAP) on the commercial side and Message Passing Interface (MPI) on the technical side will let application functions work together in a large distributed computing environment.
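    The helper-thread idea, a secondary thread staging data so the main computation never stalls, can be sketched in Python. This is only an analogy: Intel's helper threads operate at the compiler and hardware level, and every name below is invented for illustration.

```python
import queue
import threading

def process_all(keys, fetch, work, depth=4):
    """Main thread consumes items that a helper thread prefetches,
    so fetch latency overlaps with computation. None is reserved
    as an end-of-stream sentinel, so fetch must not return None."""
    staged = queue.Queue(maxsize=depth)

    def prefetch():
        for k in keys:
            staged.put(fetch(k))   # helper stages data ahead of use
        staged.put(None)           # signal: no more items

    threading.Thread(target=prefetch, daemon=True).start()
    results = []
    while (item := staged.get()) is not None:
        results.append(work(item))
    return results
```

    With a slow `fetch` (e.g. a memory or network access) and a busy `work` function, the two overlap instead of alternating, which is the source of the speedups the article cites.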

  • "The Race to Kill Kazaa"
    Wired (02/03) Vol. 11, No. 2, P. 104; Woody, Todd

    The U.S. entertainment industry has conducted a long and often frustrating legal campaign to shutter the popular Kazaa file-sharing service, and has been thwarted at many turns by Kazaa's decentralized operational structure, which is spread across several nations. The software Kazaa uses is located in Estonia and on a remote island off the British coast; the Kazaa.com domain is registered to LEF Interactive in Australia; and Kazaa's interface, FastTrack, is now owned by Sharman Networks, which is based in the island republic of Vanuatu and runs its servers in Denmark. Attempts by Hollywood lawyers to root out Sharman's investors and board members were deflected because Vanuatu is a haven for business confidentiality, but this did not stop them from seeking out Sharman CEO and LEF director Nikki Hemming, who eventually was deposed in Canada. At a hearing in late November, U.S. District Court Judge Stephen Wilson indicated there was a case for jurisdiction over Sharman, but Sharman, in anticipation of such a ruling, has been trying to change its image by bundling Kazaa with the Altnet peer-to-peer network, which compensates copyright owners for use of their works and gives them control over their distribution. The entertainment industry's lawyers dismissed this claim to legitimacy as a delaying tactic to sell more advertising. Brilliant Digital Entertainment CEO Kevin Bermeister believes that Altnet could be used to encourage more sharing of sanctioned content if Hollywood comes aboard, but the number of downloadable files on Altnet pales in comparison to the volume Kazaa offers for free. Furthermore, the wider appeal of Kazaa's services brings in more users, expanding Sharman's consumer base and revenue opportunities.

  • "Will Computers Replace Engineers?"
    Discover (02/03) Vol. 24, No. 2, P. 40; Haseltine, Eric

    A roundtable of technology experts debated how computers are encroaching on the engineering profession by automating engineering tasks, and what this portends for the future. When asked what he thinks is the most significant effect computers have had, Stevens Institute of Technology professor Lawrence Bernstein described a revolution in structural design that has led to, among other things, earthquake-resistant buildings in Japan; he also anticipated a bioengineering explosion soon thanks to the use of computers in analyzing proteins, RNA, and DNA. Meanwhile, Columbia University professor Al Aho, formerly of Bell Laboratories, said he foresees a time when machines and human beings are interchangeable, and believes that computers with computational power and memory equal to that of human beings will emerge within 20 to 30 years. However, consultant Jeff Harrow did not think computers are "anywhere close" to supplanting engineers, because they lack creativity and are chiefly concerned with carrying out "scut work." In his opinion, the most exciting trend in computing is the application of computers beyond the computing field, computerized surgery being an example. Nicholas Donofrio of IBM said computers play a key role in designing new computers, and forecast that they will be able to learn more from people in the future; he was most excited that computers are replacing physical construction and testing of products with simulation, which greatly streamlines the development process. Harrow, Aho, Donofrio, and others maintained that there will always be a need for engineers for a number of reasons, including the flood of new ideas and the widening of the field's scope. Harrow said, "[I]f we ever get to the point where there's nobody who understands what...[computers do], we're in deep trouble, because then we'll never be able to make any additional moves forward." He also speculated that the move to biologic computers could spawn self-replicating machines.
