HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.

ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 4, Issue 409: Wednesday, October 9, 2002

  • "Tech Panel Recommends Better Data Sharing, Analysis in War Against Terror"
    SiliconValley.com (10/07/02); Puzzanghera, Jim

    Federal officials need to think more critically about how to share and analyze data, and build a decentralized IT network in order to collaborate with local and state officials, according to the Markle Foundation Task Force on National Security in the Information Age. The group's report, "Protecting America's Freedom in the Information Age," was presented to Homeland Security Director Tom Ridge, and detailed recommendations on how to maintain privacy and security while improving data sharing between homeland security-related agencies. For instance, the task force said that the FBI should separate its data-analysis function from its law enforcement tasks, as do the United Kingdom, Germany, France, and Israel. Moreover, analysis of sensitive personal information, such as financial records, should be handled by the Department of Homeland Security, and not the FBI. The California Anti-Terrorism Information Center was also touted as a model of how to connect local, state, and federal agencies to better defend the nation against terrorism. That system gives local law enforcement officials a quick and easy mode of communication with their federal counterparts. Besides organizational changes, the task force also said federal purchasing rules should be changed to make it easier for agencies to buy the best solutions, not the most convenient ones. The 34-member task force, chaired by former Netscape CEO James Barksdale, included several Silicon Valley executives and recommended that government take advantage of private-sector technology.

    To read more about ACM's activities in the area of database protection, visit http://www.acm.org/usacm.

  • "More Students High-Tail it Out of High-Tech Classes"
    USA Today (10/09/02) P. 1B; Kessler, Michelle

    U.S. colleges say fewer and fewer students are registering for computer science and engineering classes, and experts say the trend could lead to a severe shortage of workers once the economy revitalizes. Enrollment in a key computer science course at Ohio State University is off 30 percent this year, while a similar course at the University of Michigan is down 20 percent. Educators say this year's decline continues a trend that began last year. Students say the high-tech industry is less attractive now due to the problematic job market and the dot-com bust, which has cut many perks and lowered the chance for big financial rewards. Although current computer science majors should provide a healthy influx of graduates for several years, the decline in the number of new students likely will produce a shortage of tech workers in four or five years, says Virginia Tech's Verna Schuetz. Silicon Valley think tank Joint Venture says unfilled tech jobs cost Valley firms as much as $7 billion during the industry's boom.

  • "Cutting-Edge Tech Grabs Federal Grants"
    CNet (10/08/02); Lemos, Robert

    The National Institute of Standards and Technology (NIST) awarded $92 million in federal grants to 40 advanced-technology initiatives on Tuesday. Over $12 million went toward nanotechnology projects, including a proposal from General Motors, the State University of New York at Binghamton, and Superior MicroPowders to develop materials that can dissipate the heat generated by microprocessors 10 times better than current substances. Meanwhile, five companies were granted nearly $12 million for a project to boost the storage capacity of tape systems by increasing data density 250 times. InPhase Technologies won funding from NIST to develop holographic storage that can be rewritten as many as 1,000 times. Cinea received $2 million to create a technology designed to thwart pirates who record movies using a video camera. The company declared that it wants "to develop and test a prototype technology for distorting unauthorized recordings of digital movies without affecting human visual perception of the original version." NIST says the Advanced Technology Program awards are not intended to support product-development research, but to help companies build technologies that can be commercialized later on.

  • "Better PCs with Plastic Magnets"
    Wired News (10/07/02); Knapp, Louise

    Ohio State University researchers are designing a computer that could boast crash-proof data, high-speed processing, and instant boot-ups by exploiting electron spin using magnets. Electron spins normally point in random directions, but the application of a magnetic field polarizes them so that they align in one direction. Ohio State researcher Arthur Epstein adds that a computer that uses spintronics would boot up without any lag time since the data would be stored directly in RAM via a magnetic control drive. University of Utah chemistry professor Joel Miller says that a spintronic-based system would be able to process complex data faster and store more information. Furthermore, the non-volatile magnetic RAM would ensure that unsaved data can be retrieved in the event of a computer crash. Miller and Epstein collaborated on a magnet made from vanadium tetracyanoethanide, which retains magnetism at high temperatures; this is a marked improvement over other plastic magnets, which only function at extremely low temperatures. The Ohio State team successfully achieved spin polarization in the polymer, and David Awschalom of the University of California, Santa Barbara, reports that such systems could be mass-produced at relatively low cost. Manufacturing could also be simplified, since the magnets easily lend themselves to ink-jet printing. However, Miller says the plastic magnets' stability must be heightened before they can be incorporated into conventional computers.

  • "Are Tech Jobs Paying Less?"
    E-Commerce Times (10/08/02); Regan, Keith

    There has been a noticeable decline in both the number of available technology jobs and salary growth since the tech boom ended. Challenger, Gray & Christmas CEO John A. Challenger says that employers "are in no rush to hire, and apparently they are secure enough to believe they can find the talent they need for a lot less money." His firm reports that 75 percent of top-earning professionals who changed jobs in the second quarter of 2002 received salary reductions. Yet most salaries have increased slightly or remained level over the past two years, while the Bureau of National Affairs predicts that there will be an overall salary gain of 4 percent this year, with technical workers receiving the highest increases. Challenger says that employers are loath to institute pay cuts except as a last resort, since such measures can have a negative effect on worker productivity. Monster.com technology jobs expert Allan Hoffman notes that location and worker skills can seriously impact earnings, and adds that tech centers such as Silicon Valley and Washington, D.C., could see steeper salary declines due to the large number of qualified workers available. Some employees may be discouraged not by salary drops, but by jobs that have lost their glamour because perks such as stock options are being eliminated or downscaled. Skills that are still highly sought after include software design expertise, which has become a major focus of the security sector, Hoffman says.

  • "Quantum Leaps May Solve Impossible Problems"
    NewsFactor Network (10/07/02); Martin, Mike

    Australian mathematician Tien Kieu is challenging long-cherished notions of mathematics and computability as outlined by Alan Turing and Alonzo Church, whose Turing-Church Thesis holds that any problem solvable by an effective procedure can be solved by a computer, and that a problem no computer can solve is algorithmically insoluble. Kieu contends in a recent paper that there are so-called incomputable problems that could be solved using quantum mechanics. He also claims that his theory effectively solves two major conundrums: David Hilbert's tenth mathematical problem, which asks whether an algorithm can determine if an arbitrary algebraic equation in whole numbers has a solution; and Turing's halting problem. A quantum algorithm, Kieu explains, would enable him to search through a limitless number of possible solutions to Hilbert's problem in a finite amount of time. He adds that once the quantum computer is done searching it should halt, thus solving the Turing problem. Richard Gomez of George Mason University reports that Kieu's findings are consistent with those of quantum computation and quantum physics researchers. However, he says that "We are still trying to figure out where all the quantum calculations are going to take place, since there are not enough atoms in the universe--about 10 to the power of 80--to do the calculations and store the qubits--individual bits of quantum information."
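    The asymmetry that makes Hilbert's tenth problem hard can be sketched classically: a conventional program can confirm that an equation is solvable simply by enumerating whole-number candidates until one works, but it can never report "no solution" with certainty, because the search is unbounded. The sketch below (an illustration of that one-sided classical search, not of Kieu's quantum method) makes the point concrete:

```python
from itertools import count, product

def diophantine_search(f, n_vars, max_radius=None):
    """Enumerate whole-number tuples in expanding shells and return the
    first root of f.  If a root exists, this eventually finds it; if not,
    the search runs forever -- the problem is only semi-decidable."""
    for r in count(0):
        if max_radius is not None and r > max_radius:
            return None  # gave up; this proves nothing about unsolvability
        for point in product(range(-r, r + 1), repeat=n_vars):
            # visit only the new shell at radius r, skipping inner points
            if max(map(abs, point)) == r and f(*point) == 0:
                return point

# Solvable case: 1**3 + 1**3 + 3**3 = 29, found after a short search.
print(diophantine_search(lambda x, y, z: x**3 + y**3 + z**3 - 29, 3))
# Unsolvable case (x**2 = 2 has no integer root): without the cutoff,
# this call would never return.
print(diophantine_search(lambda x: x * x - 2, 1, max_radius=50))
```

    Kieu's claim amounts to compressing this unbounded search into a finite quantum computation, which is exactly what classical machines cannot do.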

  • "Beware of the Internet Toll Booth"
    ZDNet (10/07/02); Perens, Bruce

    The World Wide Web Consortium (W3C) recently rejected patented software for use in open standards in what is likely the first in a series of battles to keep software fees away from open technology standards. Had patented software been allowed into W3C standards, patent holders--usually large technology companies--would have possibly been able to collect fees whenever the standard was used under RAND conditions (reasonable and non-discriminatory terms). This would have kept many individual developers from creating widely distributed free Internet programs and penalized smaller companies that do not have extensive patent portfolios they can leverage to get cross-licensing deals, which large firms use to eliminate patent fees owed to one another. Patented software can still affect open standards because, under U.S. law, patent applications remain secret until granted; under the new W3C rule, any standard found to infringe a patent would probably be reworked to avoid the associated fees. Although patented software may sometimes be the best technical option, the blame for such conflicts lies with the current software patent paradigm, which was ostensibly set up to encourage technology innovation. However, as is demonstrated in the W3C case, software patents actually serve to hinder technological progress by restricting what developers can do without paying patent fees. The W3C decision also included a provision that allows patented software to be included in standards if the owner grants royalty-free rights to others for that specific open technology.

  • "CMU Taking a Leading Role in War Against Cyberterror"
    Pittsburgh Post-Gazette Online (10/08/02); Spice, Byron

    Carnegie Mellon University (CMU) and CERT Coordination Center staff have been working for two years to set up CMU's new Center for Computer and Communications Security, and a five-year, $35.5 million grant from the Department of Defense could further their efforts. The money will go toward an anti-cyberterrorism initiative that involves fixing long-standing internal network security holes that can be exploited by ordinary, mischief-making hackers and crooks as well as terrorists. Firewalls and other measures may help block access, but Software Engineering Institute director Stephen Cross reports that getting rid of the flaws entirely would be even more effective, and CMU's new center could be ideally suited to accomplish this because researchers are not beholden to shareholders or specific product lines. The center is focusing on research to embed artificial intelligence into disk drives and other computer elements so that they can detect intrusions and take appropriate action. Center director Pradeep Khosla adds that the confirmation of user identity and data authenticity is another research goal of the facility, and notes that user ID verification will be essential to commerce as well as homeland security. He predicts that multiple biometric tools will probably be merged together into a single ID solution. The center's 2002 budget will total $8 million, using money awarded not only by the Defense Department, but also by the National Science Foundation, the National Institute of Standards and Technology, and various state agencies and private companies. "We want Carnegie Mellon to be the top player in [the security] arena," declares Khosla.

    To read about ACM's work related to a national ID system, visit http://www.acm.org/usacm

  • "IT Advances to Drive Lots of Job Cuts, Gartner Predicts"
    Computerworld Online (10/07/02); Hoffman, Thomas

    Gartner released a top 10 list of IT forecasts at its Symposium/ITxpo 2002 conference on Monday, and among them was a prediction that continued technology advances will lead to millions of layoffs starting within the next two years. Such advances include IT systems that automate manual operations, a development that will "substantially lower the labor load of business," according to Gartner research director Carl Claunch. The No. 1 prediction made by Claunch on behalf of his company is that it will become more cost-effective to add new bandwidth rather than purchase new computers: The annual doubling of optical bandwidth capabilities will lead to more data service centralization and more computing resource sharing between companies following an application service provider model. He also predicted that decentralized IT operations will resurface by 2004, while most application decisions will be made by business units rather than IT. Other trends Gartner expects include mainstream penetration of business activity monitoring within five years, and the continued upholding of Moore's Law through 2010. By 2007, banks will be the chief suppliers of presence services, while many segments of the IT market will experience vendor consolidation--in fact, Claunch believes that 50 percent of current software vendors will be out of business by 2004. Finally, Gartner expects most major new systems to be either inter-enterprise or cross-enterprise, giving companies a macroeconomic shot in the arm. "This will have a clear and recognized effect on productivity," Claunch declared.

  • "Chicago Researchers Move Toward Molecular Transistor"
    Nanoelectronics Planet (10/02/02); Pastore, Michael

    University of Chicago chemists have created a diode from a single molecule, and report their findings in the Sept. 12 issue of the Journal of the American Chemical Society and the Oct. 2 issue of Angewandte Chemie. "Essentially, [chemistry professor Luping Yu] has shown that the important electronic properties of this circuit element can be engineered into a single polymer molecule," declares Reginald Penner of the University of California-Irvine. To synthesize the diode, the researchers first created two separate compounds with electronic properties that oppose each other, then combined them chemically. The mostly hydrogen-carbon compounds were incorporated into a monolayer sheet that was moved to a gold platform; there a scanning tunneling microscope was used to gauge the diode's properties. Yu and fellow researcher Man-Kit Ng were able to mass-produce the diodes after more than half a year of development. Their research was supported by funding from the National Science Foundation, the Air Force Office of Scientific Research, and the University of Chicago's Materials Research Science and Engineering Center. Yu believes the synthesis of a molecular transistor is within reach, but now faces the challenge of hooking up molecular elements into an operational computer.

  • "Computer-Human Conversation Closer to Reality"
    PC Magazine Online (10/02/02); Metz, Cade

    According to the Turing Test, a computer demonstrates true intelligence if it can carry on conversations with people without their realizing they are talking to a machine. Brainhat has developed a natural language operating system that can technically pass the Turing Test, according to CEO Kevin Dowd. The system is programmed to analyze natural language and respond to it with the help of speech recognition and voice synthesis software. "We extract semantic value from language by parsing through it, identifying different parts of speech, and organizing everything within various data structures," explains Dowd. Once these segments are understood, the system manipulates them to create a viable response. The more knowledge the system gleans from subsequent sentences as the conversation continues, the better it can comprehend sentences and provide responses. The system was tested when it was hooked up to a telephone line, and a woman accidentally called it; it was able to hold a conversation with the woman for several minutes, using speech patterns designed to imbue it with the personality of an archetypal Valley Girl. Brainhat is on track to develop a machine that can hold limited conversations, while language labs at IBM, Microsoft, and numerous U.S. universities are also pursuing the same goal. Brainhat has been hired by several organizations to help them build improved robots, and Dowd expects that their system can one day handle customer support, among other things.
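    The article does not describe Brainhat's internals, so the toy sketch below is purely illustrative of the parse-extract-respond loop Dowd outlines: tag parts of speech with a lexicon, read off a subject-verb-object structure, and manipulate that structure into a reply. Every name, word list, and rule here is invented.

```python
# Minimal hand-built lexicon mapping words to part-of-speech tags.
LEXICON = {
    "the": "DET", "a": "DET",
    "dog": "NOUN", "cat": "NOUN", "ball": "NOUN",
    "sees": "VERB", "chases": "VERB", "likes": "VERB",
}

def extract_semantics(sentence):
    """Tag each word, then read off a subject-verb-object data structure."""
    words = sentence.lower().rstrip(".?!").split()
    tagged = [(w, LEXICON.get(w, "UNK")) for w in words]
    nouns = [w for w, t in tagged if t == "NOUN"]
    verbs = [w for w, t in tagged if t == "VERB"]
    if len(nouns) >= 2 and verbs:
        return {"subject": nouns[0], "verb": verbs[0], "object": nouns[1]}
    return None  # sentence did not match the toy grammar

def respond(fact):
    """Manipulate the extracted structure into a follow-up question."""
    if fact is None:
        return "I did not understand that."
    verb = fact["verb"].rstrip("s")  # crude third-person de-inflection
    return f"Why does the {fact['subject']} {verb} the {fact['object']}?"

fact = extract_semantics("The dog chases the ball.")
print(fact)           # subject-verb-object structure
print(respond(fact))  # a reply built from that structure
```

    A real system like Brainhat's would handle far richer grammar, ambiguity, and accumulated context; the sketch only shows why "organizing everything within various data structures" makes generating a response a matter of data manipulation.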

  • "InfiniBand Finds Favor in Testing"
    Mass High Tech Online (10/07/02); Miller, Jeff

    The emerging InfiniBand standard promises to speed data transfer and ease input/output burdens on CPUs and could eventually replace the PC's ubiquitous PCI bus, although its first use likely will be to boost data center performance. In anticipation, a number of startup companies are looking to cash in on new InfiniBand deployments, as the technology nears its commercial debut next year. Several companies in Massachusetts, for example, are currently conducting beta tests for their products with potential customers. Paceline Systems and InfiniSwitch, for example, both have lent test machines to Sandia National Laboratories in California, as well as to other organizations. Intel researchers found that server chips spend 70 percent to 80 percent of their processing power managing input/output devices running on gigabit Ethernet, while InfiniBand technology eased that burden to just 3 percent to 5 percent of resources. Although InfiniBand is currently being aimed squarely at the data center and network systems, proponents one day hope to integrate it into desktop systems as well in place of the PCI bus. International Data's Vernon Turner says that the type of testing currently going on is critical for the success of InfiniBand. The technology was once seen as a shoo-in, with support from major IT vendors Intel, Microsoft, IBM, Hewlett-Packard, Compaq, and Sun Microsystems, but recently it looks as though it will have to be proven first. Intel recently announced it would not create silicon chips for InfiniBand, but would include integration support in its server software, while Microsoft announced it would not bundle InfiniBand drivers in its next .Net server system. Those setbacks indicate that the technology will be judged based on its actual merits and will not be a default upgrade.

  • "Most IT Not Ready for Cars, Says GM"
    VNUNet (10/03/02); Collins, Jonathan

    Tony Scott, GM's CTO of information systems and services, says the IT industry has not succeeded in providing the software and hardware necessary for in-car systems. The IT industry is "not ready yet," he said at Internet World Fall 2002 in New York. He said market development hinges on the creation of very reliable, cost-effective, and powerful hardware and software. Allied Business Intelligence says the market for in-vehicle wireless technology will grow to over $12.3 billion by 2007, up from just $2.2 billion in 2001. Bluetooth technology is expected to be incorporated into almost 20 percent of new cars across the globe by 2007. At present, laws require car makers to support their products through parts and services for 10 years after initial sale. But Scott doubts the IT industry will be able to match such support demands, given the short life-cycle of computer software. Ultimately, he says, in-car IT would have to be managed without impeding drivers, and fail-safes would have to be built in so that an IT crash would not wreck a moving car.

  • "Studying Evolution with Digital Organisms"
    Astrobiology Magazine Online (10/07/02); Bortman, Henry

    Researchers at Caltech's Digital Life Laboratory and Michigan State University's Center for Biological Modeling believe that observing self-replicating digital organisms will shed new light on Darwinian evolution. Using such organisms, Caltech's Chris Adami and others have noted that species exposed to high levels of mutation will develop a tolerance, and will consequently outlast less tolerant species with a higher reproductive rate. The researchers evolved two distinct digital species--one under low mutation, the other under high mutation--and then had them compete in environments with high and low mutation rates; the species that replicated faster could not adapt and prevail in a high-mutation-rate environment. Adami says the highly mutation-tolerant organisms survive "by rearranging their code, in a sense spreading it out over the genome so that they are less vulnerable to mutations." Although the initial computer code for the digital life forms is written by humans, Adami explains that the code is discarded after a certain number of generations, when it no longer applies to the environment. He thinks that the quasi-species behavior shared by both the prevailing digital species and viruses could lead to more effective anti-viral treatments. Adami is also collaborating with Ken Nealson at the JPL Center for Life Detection and fellow Caltech researcher Evan Dorn on devising a new means of detecting extraterrestrial life using this approach. However, University of Colorado astrobiologist Benton Clark thinks the appellation "digital organisms" is misleading, since he considers them artificial simulations rather than true life forms.
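    The trade-off Adami describes can be captured with a back-of-envelope model: a lineage's per-generation growth is its raw replication rate discounted by the chance that mutations kill its offspring, so a "flat," mutation-tolerant genome can beat a faster breeder once mutations are frequent. The rates below are invented for illustration and are not taken from the Caltech experiments.

```python
def expected_viable_offspring(birth_rate, fragility, mutation_rate):
    """Growth factor per generation: each offspring suffers mutation_rate
    mutations on average, each lethal with probability `fragility`.
    A mutation-tolerant genome spreads function across its code, so any
    single mutation is rarely fatal (low fragility)."""
    return birth_rate * (1.0 - fragility) ** mutation_rate

# Hypothetical species: one breeds fast but is brittle, one is robust.
fast_but_brittle = dict(birth_rate=2.0, fragility=0.40)
slow_but_robust = dict(birth_rate=1.3, fragility=0.05)

def winner(mutation_rate):
    a = expected_viable_offspring(mutation_rate=mutation_rate, **fast_but_brittle)
    b = expected_viable_offspring(mutation_rate=mutation_rate, **slow_but_robust)
    return "fast breeder" if a > b else "mutation-tolerant"

print(winner(0.0))  # low-mutation world: raw replication speed wins
print(winner(3.0))  # high-mutation world: tolerance wins
```

    The crossover mirrors the competition experiment above: which species prevails depends on the mutation rate of the environment, not on replication speed alone.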

  • "ACM Conference to Spotlight the Strengths, Apps of Object Technology"

    An upcoming conference on object-oriented technology will focus on the latest practices of reuse, enterprise components, programming challenges, Web services, new computing models, and much more. The 17th Annual ACM Conference on Object-Oriented Programming, Systems, Languages, and Applications (OOPSLA) will bring together practitioners, researchers, and students to share ideas and experiences in a broad range of disciplines linked by a common thread in object technology. Among the speakers slated for the event are Microsoft founder Bill Gates, who will ponder the future of programming in a world of Web services; Alfred Spector, vice president in IBM's Research Division, who will discuss the challenges of delivering on the promise of distributed systems; and editor Jerry Michalski, who will try to forecast where technology developers should place their energies. A trio of practitioner reports will explore how concepts that sound good on paper really work in the real world. The conference will be held in Seattle November 4-8 at the Washington State Convention and Trade Center.

  • "Stolen Code"
    New Scientist (09/28/02) Vol. 175, No. 2362, P. 36; Grossman, Wendy M.

    Europe appears to be taking a cue from the United States to make software patents allowable, a move that has split the European software community into supporters who claim it will spur innovation, and opponents who argue it will have the opposite effect. Both sides agree that a key ambiguity in Europe's patentability rules, the Munich Treaty, needs to be resolved: The treaty bans patents on "programs for computers," yet sanctions patents on anything that achieves a "technical effect." Most European patent offices do not grant software patents, yet 30,000 such patents have been approved, most of them by the European Patent Office. Those opposed to the patents argue that they are an unsuitable way to protect work in an ever-fluctuating field such as software development; they can give rise to "bad" patents with too broad a scope (a recurring problem in America); and they are not really the best way to inspire innovation. Free software pioneer Richard Stallman sides with the opposition, and explains that new software products are built upon multiple ideas, some of which are likely to be patented. Jeremy Philpott of Britain's Patent Office claims that the more restrictive patent processing system in Europe ensures that bad patents are avoided--the EU insists on "non-obvious" innovations, gives the public nine months to protest patents after publication, and takes up to a decade to grant patents. Foundation for Information Policy Research (FIPR) co-founder Ross Anderson notes that inventors may reap fewer financial rewards than they may expect, and his organization is one of several campaigning against software patents. However, a defeat of the EU directive, though unlikely, would probably prompt software-patent supporters to seek alternative routes, such as lobbying for changes to the Munich Treaty, or simply filing patent applications with the U.S. Patent Office.

  • "PCAST Aims to Expand Offerings to Stir Broadband Growth"
    White House Weekly (10/08/02) Vol. 23, No. 40, P. 5; Nance, Scott

    The President's Council of Advisors on Science and Technology (PCAST) last week issued a draft of a report to be released later this month that argues that more pervasive broadband adoption will help rejuvenate the economy. The council said direct government intervention was not the best way, but that fostering several strategic areas would create demand. PCAST cited statistics that show that over 60 percent of the American populace uses e-government applications, and added that broadband-specific e-government services would prove popular. The federal government could also promote telework initiatives at its agencies, and encourage telemedicine. The online facilitation of medical care would be especially useful in rural areas. Although the government was encouraged to take a hands-off approach in the market, PCAST did support greater federally funded research and development, since it would lead to more innovations. The draft report said that "we look for creative interaction among public and private sector scientists...to address the roadblocks with fresh approaches to technological design, public policy, and industry cooperation." Wireless broadband technologies and security were two areas that needed attention, the council said. PCAST estimated that broadband Internet access could give American consumers up to $400 million worth of economic benefits annually, while "the faster the deployment, the greater the estimated benefits."

  • "What Does the Internet Look Like?"
    Economist (10/03/02) Vol. 365, No. 8293, P. 77

    The Internet's structure has been the source of much argument: Understanding it is difficult, given its unplanned expansion, but simultaneously vital, because comprehending the interconnection of its hundreds of thousands of routing computers is necessary to use the network properly. Mapping out the Internet seems a futile task, since such maps are inherently incomplete and outdated, so Dr. Albert-Laszlo Barabasi and others at the University of Notre Dame, Indiana, have devised a general architecture designed to improve the accuracy of Internet models. The researchers model the Internet as if it were a natural system. Four years ago, Internet models were furnished by randomly generated graphs that used points to represent routers and lines to represent the links between them; however, this methodology misses two major elements--the preferential attachment of Internet links and the fact that the Internet has far more clusters of connected points than random graphs do. The Internet is also scale-free, a property that makes it highly reliable against random failures yet highly vulnerable to targeted attacks on its most connected routers. Dr. Barabasi explains that following a scale-free architecture will produce a more predictive and descriptive Internet model. The discovery of the Internet's scale-free nature has demonstrated that viruses can be prevented by changing the software on a relatively small number of highly connected hubs, rather than readjusting many more routers and expecting a cumulative effect, as random graph studies have indicated.
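    Preferential attachment, the tendency of new links to favor already well-connected routers, is what produces the scale-free structure described above. A minimal sketch of such a growth model (in the spirit of the Barabasi-Albert model, with sizes and seed chosen arbitrarily):

```python
import random

def preferential_attachment_graph(n, m=2, seed=42):
    """Grow a graph one node at a time; each newcomer attaches m edges,
    picking targets with probability proportional to their degree.
    The `endpoints` list holds each node once per incident edge, so a
    uniform draw from it is automatically degree-weighted."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    adj[0].add(1); adj[1].add(0)  # seed edge between the first two nodes
    endpoints = [0, 1]
    for new in range(2, n):
        chosen = set()
        while len(chosen) < min(m, new):
            chosen.add(rng.choice(endpoints))  # degree-proportional pick
        for t in chosen:
            adj[new].add(t); adj[t].add(new)
            endpoints += [new, t]
    return adj

adj = preferential_attachment_graph(200)
degrees = sorted((len(nbrs) for nbrs in adj.values()), reverse=True)
# A few hubs acquire far more links than the typical node: the
# heavy-tailed signature of a scale-free network.
print(degrees[:5], "vs. average", sum(degrees) / len(degrees))
```

    The hub-dominated degree distribution this produces is exactly why random router failures rarely matter (most nodes are peripheral) while removing the top hubs fragments the whole graph.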

  • "Nowhere to Run"
    Canadian Business (10/14/02) Vol. 75, No. 19, P. 112; Holloway, Ann

    The American, British, and Australian governments are bringing in fewer foreign technology workers in order to protect their domestic IT workforces, but Canada is taking the opposite approach. Although there was a 4.3 percent increase in professional, scientific, and technical services employment in August, Canadian firms still have a great need for personnel willing to work for a "fair" wage, according to Workstream CEO Michael Mullarkey. As a result, the government has eased its restrictions on bringing in outside workers. The enactment of the Immigration and Refugee Protection Act in late June allows Citizenship and Immigration Canada (CIC) to match foreign applicants to broader needs for "human capital," rather than pairing them up with specific jobs, notes CIC's Susan Scarlett. Employers can also import particular kinds of workers without requiring an individual labor market assessment to see if Canadian residents can fill the same positions. Furthermore, spouses and common-law partners of these temporary employees do not need job offers in order to apply for work permits, while foreign students at Canadian universities can more easily apply for temporary visas upon graduation. The CIC has signed agreements with nine provinces and territories to supply foreign IT workers to rural areas where they are most needed, although Mullarkey expects corporations will need such personnel in late 2003 and 2004 to help effect the replacement of pre-Y2K systems with new hardware and software. Canada is currently experiencing a significant shortage of Web architects, high-end Oracle and Unix administrators, and .Net platform specialists.

  • "Being Wireless"
    Wired (10/02) Vol. 10, No. 10, P. 116; Negroponte, Nicholas

    MIT Media Lab founder Nicholas Negroponte predicts wireless 802.11 systems will transform telecommunications from a centralized, proprietary mesh of networks to one that is open and free to most users. Although phone companies are pursuing 3G technology as the future for wireless data transmission, Negroponte points out that 3G is still more applicable to voice applications and will carry just 1 Mbps in two years. Meanwhile, 802.11b carries data at 11 Mbps and already has an estimated 15 million connections in the United States alone. While an unamplified 802.11 signal only reaches about 1,000 feet, Negroponte says that peer-to-peer technology will eventually allow for a robust, free network that can largely bypass links to the traditional Internet. He points out that as more individual nodes are linked to one another via peer-to-peer, the quality and capabilities of the network increase, unlike traditional networks, where more users means less bandwidth. Moreover, the technology is becoming cheaper and more functional, and special antennas can send signals further than 20 kilometers so that far-flung communities can also be hooked up. In the end, the traditional telecommunications infrastructure will be forced to recognize the strength of this new network, and authorities will reorganize spectrum accordingly, so that it is owned commonly and not parceled out to be used with less effectiveness.
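    Negroponte's claim that each added node strengthens the whole mesh can be illustrated with a simple geometric model: scatter radios over an area, link any pair within the roughly 1,000-foot unamplified range he cites, and measure the largest multi-hop cluster as density grows. The area, node counts, and seed below are arbitrary illustration values.

```python
import random
from collections import deque

RANGE_FT = 1000.0  # unamplified 802.11 reach cited in the article

def largest_mesh_cluster(n_nodes, area_ft=10_000.0, seed=7):
    """Scatter n_nodes radios in a square, link pairs within radio range,
    and return the size of the biggest multi-hop (peer-to-peer) cluster."""
    rng = random.Random(seed)
    pts = [(rng.uniform(0, area_ft), rng.uniform(0, area_ft))
           for _ in range(n_nodes)]
    def linked(i, j):
        (x1, y1), (x2, y2) = pts[i], pts[j]
        return (x1 - x2) ** 2 + (y1 - y2) ** 2 <= RANGE_FT ** 2
    unvisited, best = set(range(n_nodes)), 0
    while unvisited:  # breadth-first search over each connected component
        cluster, queue = 1, deque([unvisited.pop()])
        while queue:
            i = queue.popleft()
            for j in [j for j in unvisited if linked(i, j)]:
                unvisited.discard(j)
                cluster += 1
                queue.append(j)
        best = max(best, cluster)
    return best

# Past a density threshold, new nodes fuse isolated islands into one
# large relay network: the opposite of a shared channel, where adding
# users only divides the bandwidth.
print(largest_mesh_cluster(40), largest_mesh_cluster(200))
```

    This percolation-style behavior is the mechanism behind the "more nodes, better network" argument, though real deployments must also contend with per-hop latency and interference, which the sketch ignores.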
