HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.

ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 5, Issue 450: Monday, January 27, 2003

  • "Big Shift in IT Jobs to Outsourcing Predicted"
    Computerworld Online (01/24/03); Hoffman, Thomas

    Between 35 percent and 45 percent of North American IT professionals will lose their jobs to part-timers, overseas technicians, contractors, and consultants by 2005, predicts Foote Partners. Foote Partners President David Foote attributes this trend to the fact that domestic application development is no longer affordable for American companies. Meanwhile, Meta Group analyst Maria Schafer says the situation Foote predicts is more likely to take place in 2006, and adds that Web design and application development remain growth areas for American IT workers. However, she acknowledges that IT workers in India, South America, Eastern Europe, and other offshore regions can do certain kinds of tasks for up to 50 percent less money than their American counterparts. Four high-level IT managers do not think their companies will be affected by the projected outsourcing trend, given that keeping IT in-house has been more cost-effective for them than their own outsourcing efforts. "The reason we've been successful in light of 9/11 and the economy and the bursting of the dot-com bubble is that we're in control of our own destiny and not locked into long-term contracts that may or may not be relative to our business anymore," notes Lydian Trust CTO John Studdard. Nevertheless, Foote and Gartner analyst Jeremy Grigg advise IT workers to become more adept in technologies such as wireless networking and IT security, or familiarize themselves with project management.

  • "Worm Exposes Laziness and Microsoft Flaws"
    CNet (01/26/03); Lemos, Robert

    Most of the blame for this weekend's outbreak of the Sapphire worm, which exploited a flaw in Microsoft SQL servers and infected over 120,000 computers, has been directed toward administrators who failed to patch their software in the six months since the flaw was disclosed. Ironically, the worm began to proliferate just one day after Microsoft Chairman Bill Gates proclaimed via a memo to customers that his company had made significant strides in the year since its Trustworthy Computing initiative was launched, and mere days after the General Accounting Office estimated that the government spent a minimum of $2.9 billion last year on homeland security-related information technology. The Sapphire worm, rather than creating files and deleting data, resides in memory and is designed to spread rapidly. The worm flooded many networks with more data than much of their hardware could handle, cutting off some companies' Internet access. However, the worm's effects were hardly felt by consumers, according to Symantec's Oliver Friedrichs. "Some administrators might be at fault, but then some corporate managers might be at fault for understaffing, under-budgeting, and under-empowering their IT staff to be able to handle the security of their network," notes eEye Digital Security's chief hacking officer, Marc Maiffret. Incidents.org's Johannes Ullrich describes the incident as "a wake-up call" for those who thought the Internet was more secure because of increased corporate and federal concentration on the problem.

  • "Crime Is Soaring in Cyberspace"
    New York Times (01/27/03) P. C4; Tedeschi, Bob

    Cybersecurity consultants such as Ponemon Institute Chairman Larry Ponemon report that cybercrimes are increasing exponentially, yet quantifying losses is difficult because victimized companies are reluctant to publicly disclose electronic theft for a variety of reasons, including fear that it will inspire other hackers to attack them, shake the confidence of their customers and investors, or make them the target of rival businesses' ridicule. Ponemon adds that companies often hide these losses in their balance sheets, a practice that does not allow for "a clean picture of how expensive it is to have to deal with fraudulent or criminal activities." Mi2g estimates that the number of successful, confirmed worldwide hacker intrusions this month will probably exceed 20,000, compared to 16,000 in October. Last year, the FBI and the Computer Security Institute surveyed 500 computer security practitioners and found that 80 percent of respondents admitted that their companies sustained financial losses from hack attacks; the average loss was $2 million, according to 223 respondents who quantified the damage. Deloitte Touche Tohmatsu's Richard Power reports that the increase in cybercrime is partly attributable to the economic downturn, while cutbacks in corporate budgets and personnel only increase the difficulty businesses face in securing their computer systems. Law enforcement officials acknowledge that tracing cybercrime is hard, because hackers can use technology to remain anonymous--plus they have an advantage over the authorities in terms of skill and numbers. Complicating matters is the fact that perpetrators are often corporate insiders; in fact, Gartner analyst John Pescatore attributes 70 percent of cyber-intrusions to employees who sold information to competitors in hopes of getting better jobs or building a financial cushion to sustain them if they are let go.
    (Access to this site is free; however, first-time visitors must register.)

  • "Taming the Frontier"
    Wall Street Journal (01/27/03) P. R10; Totty, Michael

    The Internet is becoming more structured and regulated as government and businesses work to stamp out illegal or competitive activity. In 1996, Electronic Frontier Foundation founder John Perry Barlow wrote in the "Declaration of Independence of Cyberspace" that the Internet was outside of traditional boundaries of law and government jurisdiction. Since then, attempts to control Internet activity have grown. The Competitive Enterprise Institute reports that over 400 bills affecting the Internet were proposed in Congress during the 2002 legislative session. The music and movie industries have increased their legal assault, and are now targeting individuals, ISPs, universities, and corporations that allow copyright piracy on their networks. Plans are in place that would allow states and localities to collect taxes on Internet sales, which were exempted in 1998 by the federal Internet Tax Freedom Act. Still, despite their increased efforts, illegal file-sharing has continued to grow tremendously, as users of the former Napster service have shifted to new networks such as Kazaa. Last year, Kazaa had more than 9 million user accounts, up from just 1 million in the middle of 2001, and a court in the Netherlands, where the company is based, recently ruled it is not liable for the illegal activity of users on its network. Ohio State University law professor Peter Swire explains that increasing regulatory pressure will force "elephants"--powerful organizations too large to escape scrutiny--to comply, while "mice"--individuals and small, independent entities--will continue to evade authorities. He says the population of "mice" can be checked by restricting their resources. In addition, most users are law-abiding, and will not make the extra effort necessary to conduct illegal activity on the Internet.

  • "Little Progress on Women in IT"
    VNUNet (01/24/03); Fielding, Rachel

    A study conducted by the U.K.-based Women in IT Champions Group has found that government efforts to attract more women into the IT workforce have made little progress. The report concludes that though 36 percent of new hires in first quarter 2002 were female, they represented 46 percent of all leavers. Some women are quitting their careers to focus on family life, but older women are leaving as well. Speaking at the Women in IT conference, Secretary of State for Trade and Industry Patricia Hewitt declared that business, government, and industry must work together to halt the erosion of the female IT workforce. Rebecca George of IBM says the government should conduct more research into the reasons women leave their careers, adding that the assumption that the primary reason for quitting was an imbalance between work and home life was "naive." "[W]e think that women are leaving because of corporate cultural issues and because they want to work in an environment where they have more control," she explains. Meanwhile, Oracle human resources director Richard Lowther admits a plan to bring graduates from non-IT disciplines into IT positions was not very successful.

    For more information about ACM's Committee on Women in Computing, visit http://www.acm.org/women.

  • "Electrical Control of Electron Spin Steers Spin-Based Technologies Toward Real World"
    ScienceDaily (01/27/03)

    Quantum computing and spintronics technologies could be one step closer to reality thanks to a breakthrough from researchers at the University of Pittsburgh and the University of California at Santa Barbara (UCSB). The scientists report that high-speed electrical circuits can be used to locally control electron spin, a discovery that demonstrates a quantum logic gate that can work with gating technologies used in current electronics. The basis for the project was formulated by UCSB physics professor David Awschalom and the University of Pittsburgh's Jeremy Levy, who realized that electric fields could be converted into magnetic fields if the axis of spin rotation could be changed with an applied electric field. The researchers reasoned that a control mechanism could be provided by the creation of semiconductor sandwiches, fashioned from aluminum gallium arsenide and gallium arsenide, that also feature parabolic quantum wells. "The experiments show that it is possible to build a very scalable array of quantum gates using semiconductors in a relatively straightforward manner," explains Awschalom. Meanwhile, Levy calls the discovery an "enabling technology" for spintronics and a "feasibility demonstration" for quantum information processing. The research, which was funded by the Defense Advanced Research Projects Agency, was published on Science Magazine's "Science Express" Web site on Jan. 23.

  • "A Wish List For Commercial Grid Systems"
    Grid Computing Planet (01/22/03); Shread, Paul

    Fujitsu Laboratories researchers recently wrote that commercial grid computing networks are still immature, even as grid standards continue to improve. They said the Globus Toolkit 3.0, which is expected in alpha version soon, provides the most basic infrastructure for grid computing networks, but that commercial implementations require more refinement. Specifically, the Fujitsu paper said commercial grid computing technology needed to provide consistent, robust, and manageable service, but that this task was complicated by integration issues. Because Internet data centers often operate on many different platforms, the researchers said administrators often had to compromise robustness for timeliness and reliability. Also, in order to ensure service-level agreements are met, companies providing grid computing resources would likely have to over-provision. The Open Grid Services Architecture on which the Globus Toolkit is based could prove a solution for these problems, as it would create a common hosting environment linking different technology platforms. The researchers said a commercial solution would need to support Java program execution, batch processing, and system composition. In addition, usability features would need to be included, such as configuration management, autonomic capability, resource management, authentication management, and user-tracking.

  • "THE Key to User-Friendly Computers?"
    Business Week Online (01/22/03); Salkever, Alex

    Jef Raskin, who co-designed Apple's trend-setting graphical user interface (GUI), is also one of its most outspoken critics. In an effort to repair what he terms "fundamental flaws" that are the result of "incompatibilities between the designs of both GUIs and command-line interfaces and the way our brains are wired," Raskin and a team of volunteers are developing "The Humane Environment" (THE), a command architecture that integrates the GUI's advantages with more flexible command-line systems. BusinessWeek's Alex Salkever, who tried out a THE-based freeware text editor that runs on the Mac Classic operating system, notes that the tool's biggest plus is visibility, in which the user sees information only when necessary. A flashing blue block represents the cursor, while a single letter or text command appears within the block; the advantages include being able to see tabs using the mouse rather than switching back and forth between viewing modes, and being able to realign sentences and eliminate shortened lines more easily. Typed commands flash in blue text on the screen background, which enables the user to quickly spot and undo any command executed by an accidentally pressed key. Navigating a page using THE is done with a methodology Raskin calls LEAP. LEAP is enabled by pressing the shift key and then hitting and releasing the space key, which causes a "Command" prompt to appear under the text; leaping from one part of the page to another is achieved by hitting the ">" key, then typing in a set of characters. Salkever finds that using a mouse or scroll arrow is still more intuitive than LEAP, while overall he prefers the traditional GUI over THE. Raskin acknowledges that a learning curve is necessary when users transition between computer interfaces.
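    At its core, LEAP is an incremental text search bound to a modifier key: the cursor jumps to the nearest occurrence of whatever characters the user types, forward or backward. Setting the hardware-key details aside, that behavior can be sketched in a few lines of Python (the function name and signature here are illustrative, not taken from Raskin's implementation):

    ```python
    def leap(text: str, cursor: int, pattern: str, forward: bool = True) -> int:
        """Jump the cursor to the nearest occurrence of `pattern`:
        forward from the cursor, or backward toward the start of the
        text. Returns the cursor unchanged when there is no match,
        mimicking a failed LEAP search that refuses to move."""
        hit = text.find(pattern, cursor + 1) if forward else text.rfind(pattern, 0, cursor)
        return hit if hit != -1 else cursor

    doc = "the quick brown fox jumps over the lazy dog"
    pos = leap(doc, 0, "fox")                      # cursor jumps forward to "fox"
    pos = leap(doc, pos, "quick", forward=False)   # and back to "quick"
    ```

    Because the forward search starts one character past the cursor, repeating the same leap advances through successive matches, which is how the keyboard-driven navigation the article describes avoids the mouse entirely.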

  • "Teleworking Boom Causing IT Headaches"
    IT Management (01/21/03); Gaudin, Sharon

    Teleworking--the trend of professionals working at least part of the time out of the office through online connections--is expected to increase dramatically: The International Telework Association and Council (ITAC) estimates that the U.S. teleworker workforce is 28 million strong, and predicts that it will climb by 6 million annually over the next three years. However, the rising number of teleworkers poses a problem for IT departments, which are tasked with deploying equipment and extending their networks so that teleworkers can be just as connected and efficient as if they were operating on-site. "One of the main inhibitors to telework has been speed and with broadband that comes off the table," notes ITAC President and Kinetic Workplace CEO Tim Kane. "If I was an IT director, I'd look at these drivers and sit down and take a hard look at having a network that goes beyond the walls of the organization itself." He adds that deploying telework capability also involves boosting security, building additional virtual private networks (VPNs), supplying online access to corporate data, and providing each remote worker with multiple means of communication. "If the IT department hasn't thought through these technologies, then they're behind the curve," warns Mike Bell of Gartner Group, who explains that teleworking can save a company money--for instance, an in-office seat costs an average of $24,000, while a remote worker can cost just $6,000 to $8,000. IBM reckons that it saves over $1 billion over a decade thanks to its teleworkers, while IBM's Dynamic Workplace Solutions director Pam Stanford explains that teleworkers are also more productive.

  • "Remote Monitoring Aids Data Access"
    Technology Research News (01/22/03); Patch, Kimberly

    Researchers at Sandia National Laboratories have developed a new way to access large sets of remote data in close to real time. Often, businesses and researchers have a hard time visualizing and manipulating complex data over distances because of increased lag time, which hampers usability, according to principal technical staff member John Eldridge. Instead of sending entire chunks of data back and forth, the scientists developed a video card system that just sends the video signals. Tweaking advanced graphics cards originally developed for computer games, Eldridge and his colleagues compressed the signals and sent them to a Gigabit Ethernet interface card, which then routed the signals over the Internet. On the receiving side, special hardware recreates the video stream from the packets, decompresses it, and displays it on a monitor. While the process of sending video instead of data is also bandwidth-intensive, it is faster for the huge repositories of data today's businesses and researchers accumulate. Doctors, for example, could use it to view magnetic resonance imaging (MRI) files when diagnosing patients. Eldridge says the system uses a few tricks to boost speed, including sending only changes in the screen display and using reprogrammable logic chips to encode and decode video signals, instead of separate components. He notes that Sandia is looking for a partner to commercialize the system, and plans to adapt it for use with multi-screen displays.
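    The article does not detail Sandia's compression scheme, but one of the "tricks" it mentions--transmitting only the parts of the screen that changed--is a standard dirty-region technique. A minimal sketch of the idea, with illustrative names and frames represented as 2D lists of pixel values:

    ```python
    def dirty_tiles(prev, curr, tile=8):
        """Split two frames (2D lists of pixel values) into tile x tile
        blocks and return only the blocks that differ between frames,
        as (row, col, pixels) tuples; unchanged blocks are never sent."""
        changed = []
        rows, cols = len(curr), len(curr[0])
        for r in range(0, rows, tile):
            for c in range(0, cols, tile):
                old = [row[c:c + tile] for row in prev[r:r + tile]]
                new = [row[c:c + tile] for row in curr[r:r + tile]]
                if old != new:
                    changed.append((r, c, new))
        return changed

    # A single changed pixel dirties exactly one 8 x 8 tile of a 16 x 16 frame.
    frame_a = [[0] * 16 for _ in range(16)]
    frame_b = [row[:] for row in frame_a]
    frame_b[3][5] = 255
    deltas = dirty_tiles(frame_a, frame_b)
    ```

    A static desktop with a small moving region then costs only a handful of tiles per frame rather than the whole display, which is why delta transmission pairs well with the hardware compression the researchers describe.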

  • "Hot Off the Press"
    Nature Materials (01/23/03); Gerstner, Ed

    Organic electronics promise to allow low-cost, flexible circuits that could add intelligence to everyday items, but the technology still has some hitches before widescale commercial production is possible. Researchers have touted the ability to deposit polymer-based circuitry through ink-jet printing techniques and dye-transfer, but those methods require the polymers to be liquefied. More complex circuits require layering organic materials, and making the solvents used in those processes chemically compatible is difficult. Researchers have discovered a new thermal imaging technique that allows the polymer to be deposited as a solid, thus eliminating the need for solvents. The scientists use a thin film with multiple layers to deposit the organic material onto the substrate. The organic material is thinly filmed on the bottom layer, a transparent carrier layer is in the middle, and a metal that absorbs light is on the top. The researchers lay the multilayer film down on the substrate, and then focus a laser through the film, so that the organic material is fused with the substrate according to the laser's design. Heat generated during the process rules out many organic materials, but the scientists are able to create a thin-film transistor array 50 cm by 75 cm in area with polyaniline mixed with a dinonyl naphthalene sulphonic acid additive. Single-walled carbon nanotubes are included to increase conductivity. The team hopes to refine the process so that it can be used for the dielectric and semiconducting layers as well.
    (Access to this site is free; however, first-time visitors must register.)

  • "When Files Survive, But Not the Technology to Read Them"
    Philadelphia Inquirer (01/23/03) P. D9; Jesdanun, Anick

    Howard Besser, director of New York University's Moving Image Archiving and Preservation Program, comments that the assumption that digital records will last forever is false. Computer files may be preserved, but if the technology to read them becomes obsolete, discarded, or lost, such documents become useless. Without some way to preserve such technology, a "digital dark age" could result in which vast amounts of knowledge are forever lost. This erosion is happening right now: Joe Miller of the University of Southern California recently learned that he could not access data from the 1976 Viking Mars landings, which were stored on magnetic tape; a digital version of the approximately 900-year-old Domesday Book can only be read by specially tailored hardware and software that are aging; and PC users who migrate files from one format to another may experience losses. The JPEG image format, for instance, compresses images so that details are lost, while converting to the newer JPEG 2000 standard makes such loss even more pronounced. Rand preservation specialist Jeff Rothenberg prefers emulation--the imitation of old platforms to run old software--over migration, but Margaret Hedstrom of the University of Michigan says such a method may not appeal to average PC users. In the meantime, IBM researcher Raymond Lorie is developing the universal virtual computer, standardized rules that current and future computers will be able to comprehend. However, such projects can only take off if legal issues over the copying and adaptation of out-of-date software are resolved, according to Cornell University's Anne Kenney, while Kenneth Thibodeau of the National Archives adds that software companies adopting open standards would be a significant development.

  • "Interfaces of the Future"
    Technology & Business Magazine (01/13/03); Withers, Stephen

    Futuristic computer interfaces such as those envisioned by science fiction writers--machines that can read a user's gestures or facial expressions, for example--are still on the drawing board or in the lab, and there has not been a sudden appearance of a revolutionary mass-market technology since the introduction of the graphical user interface in the mid 1980s. However, in Australia, some sophisticated interfaces have been gradually penetrating niche IT markets, virtual reality (VR) being one of them. Alan Ryner of SGI reports that immersive VR has become "core infrastructure" in some sectors, including the military, manufacturing, and mining, oil, and gas exploration. VR enables military personnel to deal with vast amounts of data in command and control systems by representing it visually, while the automotive industry, which started using VR to simulate car crashes, has moved on to more advanced applications such as styling and design. Ryner also reports that VR-based design can speed up time to market and make Australian companies more competitive by allowing far-flung teams to collaborate on the same data set. Oil and gas companies can use VR simulations derived from seismic data to work out the best areas to drill. Ryner lists "hazard perception and situation awareness" as a developing market for VR, in which people can be trained and retrained using an interactive artificial environment that can model people's behavior. VR can be especially useful in analytical situations as a way to better serve people who are reluctant to handle large volumes of data.

  • "Internet2: The Virtual Sequel"
    Scientist (01/13/03) Vol. 17, No. 1, P. 14; Bunk, Steve

    Although Internet2 is capable of delivering data about 1,000 times faster than a standard 56 Kbps modem, the research network has not yet lived up to its potential in terms of applications. Technologies targeted for use with Internet2 may not become widely used for another five or 10 years, and may never filter through to the regular Internet. Some potential uses for the network, which is currently used by more than 200 American colleges, some 60 corporations, and federal agencies, include virtual labs, digital libraries, and tele-immersion. Virtual labs replicate real labs by using high-quality photos, three-dimensional images, genetic data, and point-and-click activation. A simulated sense of touch will be created through "force-feedback" interfaces, while voice, video, appliance readings, and simulations will be processed through a variety of computations across multiple branches of learning. Digital libraries are intended to house images using magnetic resonance, light microscopy, and digitized brain slices, for example. Tele-immersion refers to observing three-dimensional, real-time movements from actual or created sources. A virtual conference table is one possible tele-immersion application.
    (Access to this site is free; however, first-time visitors must register.)

  • "Electric Paper"
    New Scientist (01/18/03) Vol. 177, No. 2378, P. 34; Fildes, Jonathan

    Scientists are working to transform regular paper into an electronic display for moving images, changing colors, and text. After asking paper makers four years ago if they were interested in having electronic circuits, sensors, and displays added to their newspaper and packaging, Magnus Berggren and colleagues at Linkoping University and the Advanced Center for Research in Electronics and Optics in Sweden last year unveiled traditional paper that featured electronic displays. The display's "active matrix" resembled a laptop's thin-film transistor screen, with semiconducting polymers printed on paper to make its transistors and display cells. Berggren, who envisions electronic paper being used for large, low-resolution displays, is currently demonstrating a seven-segment display that resembles a digital clock. The technology could lead to chameleon-like wallpaper, flashing cereal boxes and toy packaging, poster-like displays in shops, and changing text in magazines within five years. However, challenges remain, including concerns about how to power electronic paper. Researchers also must find a way to make the displays change faster, and improve their color quality.

  • "Wi-Fi Stands Guard"
    EBN (01/13/03) No. 1345, P. 22; Dunn, Darrell

    The growth of Wi-Fi technology is being impeded by security concerns that have prompted corporate customers to postpone deployment, according to Dennis Eaton of Wi-Fi chip supplier Intersil. "Although we've seen a healthy amount of wireless LAN deployment, it isn't going to be mainstream--in the large enterprises, small to medium businesses, and all the way down to the public consumer--until we address this problem in a way that is cost-effective and simple," notes Intel's Patrick Bohart, who adds that enterprise adoption trickles down from office users to home users, thus playing a major role in consumer market capitalization. Three years ago a number of papers disclosed that the Wired Equivalent Privacy (WEP) security protocol bundled into the first Wi-Fi standard could be circumvented, and Eaton says that software tools designed to breach networks shielded by WEP subsequently appeared in the public domain, thus scaring potential enterprise clients. However, the loss of sensitive information carried on wireless networks remains a rarity. In response to the publication of the WEP problems, the IEEE organized Task Group I, which has been developing the Robust Security Network (RSN) using the 802.11i protocol, which Eaton says is about 80 percent finished. RSN is supposed to provide better encryption for legacy equipment and future wireless LAN (WLAN) devices through the Temporal Key Integrity Protocol and the Counter Mode with CBC-MAC Protocol, respectively. Until the standard is complete, Wi-Fi chipmakers will offer support for Wi-Fi Protected Access (WPA) as a backwards compatible replacement for WEP. Bohart adds that companies can supplement WPA with trusted platform modules, biometrics, smart cards, and other security measures, and explains that the Wi-Fi industry is experiencing growing pains similar to those of any other developing technology.

  • "Electroactive Polymers"
    Technology Review (01/03) Vol. 105, No. 10, P. 32; Huang, Gregory T.

    Researchers at the Artificial Muscle Research Institute at the University of New Mexico are using electroactive polymers to give robotic devices more lifelike movements. The lab is home to robotic fish, wings, and arms that swim, flap, and lift with muscles from electrically activated polymers, which are flexible and not inhibited by gears and bearings. Although scientists and engineers have been applying voltage to polymer compounds to make artificial muscles expand, contract, or bend since the early 1990s, polymers often consumed too much energy, did not generate enough force, and did not last long enough. However, researchers at Pennsylvania State University last September developed an electroactive actuator that requires one-tenth the voltage previously needed. The breakthrough means that robots that have the mobility of humans can be designed, as can implantable biomedical devices that could serve as microdelivery systems. The development "will enable faster implementation of science fiction ideas into engineering reality," says Yoseph Bar-Cohen, a senior research scientist at NASA's Jet Propulsion Laboratory. Observers expect the technology to be ready for market in five years, while smarter and more interactive materials could be available in 10 years.

  • "When It Doesn't Wash"
    Baseline (01/03) No. 14, P. 15; Gage, Debbie

    As consumer appliances are enhanced with software to make them more durable and more easily upgradeable, the question arises as to how to deal with buggy software. Analysts say the chief culprit behind such problems is the technology industry, which suffers from lax regulation and rushes products to market without adequate testing and development. The unreliability of highly-touted software can be a source of considerable embarrassment: For example, in May and July of last year there were major recalls of BMW 7-series automobiles because of buggy software in their electronic management units that affected the cars' performance. "BMW tried to do too many things at once with this car, and they underestimated the software problem," remarks Silicon Valley retiree Gary Conley, who currently owns a BMW 745i that exhibits various software glitches--many of which cannot be fixed by dealers. Auto industry expert and Automotive Consulting Group President Dennis Virag concurs, explaining that manufacturers made the mistake of contracting out software development because they were competing to add various electronic bells and whistles. Some technology vendors are doing away with quality assurance testing by implementing automated software development, but John Schmuck, a vice president at ABN AMRO Services, cautions that this does not necessarily eliminate software bugs; he advises companies to invest in planning and training, and to integrate testing tools. Nevertheless, IBM hopes to eliminate software errors in its development cycle thanks to a licensing agreement it entered into with Parasoft, which offers automated software development tools.

[ Archives ] [ Home ]