
       HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 4, Issue 376: Monday, July 22, 2002

  • "Lawmakers Propose Volunteer Corps to Guard Nation's Technology"
    Agence France Presse (07/21/02); Hoskinson, Charles

    On July 19, the U.S. Senate passed the Science and Technology Emergency Mobilization Act, which calls for the formation of a National Emergency Technology Guard. The guard would be composed of scientists and technology experts volunteering to help forestall and respond to terrorist attacks on the country's communications infrastructure. The proposal would also set up a government agency to administer security technology sharing programs and commit $35 million toward the development of emergency communications programs. Furthermore, the bill would establish a "virtual technology reserve" of private equipment that authorities could borrow in times of emergency. When the bill was being debated in April, George Washington University computer science professor Lance Hoffman cautioned that the background and credibility of technology guard members should be checked in order to prevent infiltration by malicious individuals. The need for emergency measures is supported by FBI findings that al-Qaida terrorists are investigating ways to control digital switches for critical infrastructure systems via the Internet, while a recent Riptech report estimates that cyberattacks have risen 64 percent in the past year. The House of Representatives has approved a similar homeland security bill, and the next step is for the two chambers to reconcile their versions before the legislation goes to President Bush.
    Click Here to View Full Article

  • "Federal Bill Targets E-Waste"
    ZDNet (07/19/02); Skillings, Jonathan

    Rep. Mike Thompson (D-Calif.) on Thursday introduced legislation to create a national computer-waste recycling program funded by a $10 fee levied on retail sales of desktop and laptop computers, as well as monitors. Thompson's Computer Hazardous Waste Infrastructure Program (CHIP) would reside within the EPA, and is expected to be examined first by the U.S. House Committee on Energy and Commerce. Dell, Hewlett-Packard, IBM, Best Buy, and Staples have all announced voluntary recycling initiatives that charge consumers a fee at the end of a product's life if they bring it in for disposal. Most computer manufacturers are looking to head off a government recycling program at the state or federal level. Thompson says 70 percent of the lead and mercury in U.S. landfills comes from electronic waste, and that the 41 million PCs disposed of this year will skyrocket to 500 million per year by 2007. "We can't afford to continue endangering our health and environment and filling our landfills by ignoring the problems created by computer waste," says Thompson, who also notes that "up to 80 percent of e-waste is actually exported to Asia, where it ends up in riverbeds or is illegally and improperly disposed." Several states are also considering computer recycling legislation.
    http://zdnet.com.com/2102-1103-945092.html

  • "Do We Need a National ID Plan?"
    CNet (07/22/02); McCullagh, Declan

    Free Congress Foundation analyst Brad Jansen, one of the key members of an ad hoc coalition that vehemently opposes the creation of a national ID standard, is backing a White House proposal submitted last week that suggests federal agencies help set "minimum standards for state driver's licenses" to forestall abuses by terrorists. He adds that the Electronic Privacy Information Center shares his view of the matter. However, the ACLU--which formed the ad hoc coalition along with the Free Congress Foundation--is up in arms, claiming that the proposal is exactly what it has been fighting against. "That language points to the fact that the Bush administration appears to support national IDs through the standardization of state driver's licenses," comments ACLU legislative counsel Kate Corrigan. Jansen defends the proposal, noting that the government will allow states to control standards development, with federal agencies acting as consultants. The standardization of driver's licenses is part of President Bush's plan to create a Department of Homeland Security, but House Majority Leader Dick Armey (R-Texas) and colleagues have added pro-privacy provisions that limit the practice. Furthermore, they have insisted that the department include a privacy officer, and have advocated a ban on citizen-spy programs. After these changes were made, the proposal was approved by a special committee led by Armey; a floor vote is slated to take place this week.
    http://news.com.com/2010-1079-945347.html

  • "With 'Old' Design, Japanese Supercomputer Beats Top U.S. Machine"
    Associated Press (07/21/02)

    NEC's new Earth Simulator, built in Japan, can process 35.9 trillion calculations per second--faster than the combined computational power of the 15 fastest machines based in the United States, according to Jack Dongarra of the University of Tennessee. The NEC machine has revived the debate over which design is better: supercomputers using custom-built processors or those using off-the-shelf components. Proponents of custom-built machines such as the Earth Simulator say that manufacturers have placed too much emphasis on the cost-effectiveness of off-the-shelf processors, which are now used in almost 92 percent of supercomputers, up from about 27 percent in 1993. In addition, custom-built machines have more memory bandwidth, which allows data to feed into the processors faster. However, J. David Neelin of the University of California, Los Angeles, says the critical point in research involving supercomputers is not the speed of the machine, but the quality of the underlying theory being tested. IBM, Hewlett-Packard, and others say they are working on supercomputers even more powerful than the Earth Simulator. Still, Cray chief scientist Burton Smith says arguments for supercomputers based on off-the-shelf processors "are all based on strange economic theories--none of them are based on technical grounds."
    Click Here to View Full Article

  • "Alloy With Shape Memory May Be Ready For Broad Use"
    New York Times (07/22/02) P. C1; Feder, Barnaby J.

    Nanomuscle is working to mass-produce small actuators that replace motors in small devices. The Nanomuscle devices use nickel-titanium alloy wires with "shape memory" properties: the alloy's molecules change shape with temperature and can revert to a previous shape when heated. When heated by an electric current, the wires pulse back and forth between shapes, mimicking the action of muscles. Computer scientist Rod McGregor, an entrepreneur formerly in the software business, owns Nanomuscle and has signed a contract with the toymaker Hasbro to produce up to 10 million of the small devices. The nickel-titanium alloy, known as nitinol, has spawned a number of small materials companies hoping to commercialize its unique muscle-mimicking properties, but Nanomuscle is taking a distinctive approach by building the nitinol wires into modules. With modules, product designers do not have to know the exact mechanics of the actuators before incorporating them into their designs. To keep costs down, McGregor made certain that the specifications for the nanomuscles are compatible with current semiconductor-manufacturing techniques, and has contracted with a Taiwanese chip foundry to build them.
    http://www.nytimes.com/2002/07/22/technology/22NANO.html
    (Access to this site is free; however, first-time visitors must register.)

  • "Raising the Accessibility Bar"
    Wired News (07/22/02); Mayfield, Kendra

    Stanford University's Archimedes Project is an initiative to develop information interfaces that disabled people as well as the general population can use. Attracting this wide range of users depends on creating a system designed to last decades without the need for frequent upgrades, and giving it performance levels that surpass those of commercial products, according to project leader and co-founder Neil Scott. Unlike traditional adaptive technology, in which existing computers and devices are modified so people with certain requirements or disabilities can use them, the Archimedes Project offers a Total Access System (TAS) in which users equipped with an "accessor" can control any computer or information appliance without the need for specialized hardware or software. Human-centered interfaces such as speech recognition and head-and-eye tracking are embedded within the accessor so that they can serve users' individual requirements. Scott says this "allows people to truly mix and match different modalities...It gives them a huge amount of flexibility." However, he notes that the mainstream will not accept this technology until it improves: Voice recognition needs to be more reliable, and accessibility technology in general has a considerable learning curve. Scott adds that the Archimedes Project is trying to design a voice recognition system that can accurately match words to context through disambiguation software, while intelligent agent systems will enable computers to exhibit more human-like behavior. He says the goal of the project is to create accessibility technology that everyone can use and that is also affordable to the disabled population.

  • "Second Law of Thermodynamics "Broken""
    New Scientist Online (07/19/02); Chalmers, Matthew

    Chemical physicists at the Australian National University (ANU) have discovered through experimentation that the second law of thermodynamics--which states that a closed system becomes more disordered as time passes--can be consistently violated at the micron scale, a discovery that could have significant implications for nanotechnology. The researchers used a laser beam to measure the movement of latex beads suspended in water and calculate the beads' entropy. They uncovered a negative entropy change over time intervals of a few tenths of a second; the change became positive over intervals longer than two seconds. It has already been established that the probability of second-law violations increases at the atomic scale, but this is the first time such behavior has been detected at the micron scale. The outcome of the experiment aligns with predictions of the "fluctuation theorem" developed at ANU a decade ago; a standard statement of the theorem is sketched below. The scientists declare that "The results imply that the fluctuation theorem has important ramifications for nanotechnology and indeed for how life itself functions."
    http://www.newscientist.com/news/news.jsp?id=ns99992572
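
    For context, the following is a standard textbook form of the fluctuation theorem (paraphrased from the physics literature, not quoted from the article): it compares the probability of observing positive versus negative time-averaged entropy production \bar{\sigma}_t over an interval of length t,

        \frac{P(\bar{\sigma}_t = A)}{P(\bar{\sigma}_t = -A)} = e^{At}.

    Because the ratio grows exponentially with t, entropy-consuming behavior becomes vanishingly rare over longer intervals--consistent with the ANU team seeing negative entropy changes over tenths of a second but not beyond two seconds.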

  • "Higher Learning at Warp Speed"
    Los Angeles Times (07/22/02) P. C4; Singer, Paul

    Case Western Reserve University in Cleveland is setting up a $27 million fiber-optic network that will connect 16,000 computers and boast a top data delivery speed of 1 Gbps--roughly a thousand times the speed of the average home broadband link. Sprint and Cisco Systems will handle the upgrade: Sprint will receive technology testing rights, while the school will receive discounts on later upgrades. Case will develop the applications to run on the network, which will be designed to aid education. Case technology chief Lev Gonick says the new system can support the combination of high-definition video and high-definition audio. Internet2's Steve Corbato calls Case's effort "one of the most aggressive if not the most aggressive deployments" of network technology in the academic sector, and predicts that many university campuses will boast gigabit Ethernet technology in roughly two years. Caltech director of information technology John Dundas expects other institutions to adopt such network technology due to competitive pressures. Meanwhile, Carnegie Mellon University CIO Joel Smith reports that his school has established gigabit-speed connections between campus buildings and is undergoing a continuous upgrading process. Carnegie Mellon students and faculty can log in from any location via wireless connectivity.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Scientists Create Smallest Ever Laser-Like Light Beam"
    Scientific American Online (07/22/02); Moeller, Rachael

    A report in the current issue of Science magazine details how scientists created laser-like light beams with extremely short wavelengths, short enough to enable the viewing of nanoscale structures. Imaging at that scale has become a thorn in the side of nanoscale research, but the University of Colorado's Randy Bartels and his team hope they have an innovative solution. The Bartels team fired laser pulses lasting roughly one quadrillionth of a second through an argon gas-filled tube, and then through a "structured waveguide," in order to generate streams of photons with wavelengths of just tens of nanometers; the arithmetic connecting such wavelengths to imaging resolution is sketched below. The whole device can fit on a dining room table, and may one day help scientists and engineers view molecular behavior, test systems, or work with microelectronics. "In an arena such as microelectronics, any new tool that speeds development of a new technology can have a big economic and competitive impact," says JILA's Henry Kapteyn.
    Click Here to View Full Article
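
    As a rough illustration (the 30-nm figure is assumed for the example, not taken from the article), both a photon's energy and the finest detail a beam can resolve follow directly from its wavelength:

        E = \frac{hc}{\lambda} \approx \frac{1240\ \text{eV}\cdot\text{nm}}{\lambda}, \qquad \lambda = 30\ \text{nm} \;\Rightarrow\; E \approx 41\ \text{eV}, \qquad d_{\min} \approx \frac{\lambda}{2} = 15\ \text{nm}.

    Since the diffraction-limited resolution d_min scales with wavelength, light at tens of nanometers can resolve features more than an order of magnitude finer than visible light (400 to 700 nm) can.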

  • "A Conversation With the Inventor of Email"
    Internet.com (07/16/02); Gaudin, Sharon

    In an interview with Internet.com's Sharon Gaudin, email creator Ray Tomlinson of BBN Technologies says the invention of email was not very complicated, and that its impact was not immediately apparent because so few computers existed at the time. He acknowledges that email has had an effect on society, although it has not fundamentally changed people, but merely given them the means to communicate more rapidly. Tomlinson finds spam irritating, and thinks that current solutions are either ineffective or filter out too much; in his opinion, legislation is not the answer. He does not believe that email will change drastically in the next decade, although he does predict more integration with instant messaging and other communications formats, along with increases in bandwidth and improvements in cable modems and DSL. Tomlinson does not think there is any technological solution for cutting down the proliferation of worms and viruses via email. "You'd have to have the cooperation of the hacker for that to work," he notes. Tomlinson is currently engaged in a distributed-systems project in which expert software applications, dubbed agents, work from different locations to solve problems. The Java-based system is platform-independent, and Tomlinson's team is using both the Linux and Windows operating systems.
    Click Here to View Full Article

  • "With False Numbers, Data Crunchers Try to Mine the Truth"
    New York Times (07/18/02) P. E7; Eisenberg, Anne

    Consumers are often reluctant to answer personal questions online truthfully--such as their age and how much money they earn--and will often substitute false answers, which corrupts the data companies collect. "People are lying, and vendors don't know what is false and accurate, so the information is useless," says Ontario's commissioner of information and privacy, Ann Cavoukian. To circumvent this problem, Ramakrishnan Srikant and Rakesh Agrawal of the IBM Almaden Research Center have created a data-mining program that masks individuals' truthful answers while still extracting useful demographic information; a simplified numerical sketch of the idea appears below. The program adds a random value to each answer--for example, shifting the age a consumer puts down by up to a specified number of years--and then makes a series of mathematical estimates, informed by the known randomization process, to reconstruct an accurate description of the overall distribution. The random perturbation method the program uses has a long tradition, notes David F. Andrews, formerly of the University of Toronto. Approximating the actual distribution of age, salary, and so on can introduce some inaccuracy, according to Cavoukian, but she adds that "in return for about 5 percent inaccuracy, you have a privacy model in which individual answers are not used." Purdue University's Christopher W. Clifton believes that with the IBM program and others like it, companies could use raw data that is currently inaccessible.
    http://www.nytimes.com/2002/07/18/technology/circuits/18NEXT.html
    (Access to this site is free; however, first-time visitors must register.)
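
    The following is a minimal, hypothetical sketch of the general idea--additive random perturbation with aggregate reconstruction--and not IBM's actual algorithm; all names and numbers are invented for illustration. Each respondent's true age is masked with zero-mean noise before collection, yet accurate population statistics can still be recovered because the noise distribution is known.

      # Toy model of privacy-preserving perturbation (illustrative only):
      # each respondent's true age is masked with zero-mean uniform noise,
      # so no individual answer is trustworthy, but aggregate statistics
      # can still be recovered because the noise distribution is known.
      import numpy as np

      rng = np.random.default_rng(0)
      true_ages = rng.integers(18, 80, size=100_000)    # hypothetical population

      R = 15                                            # perturbation half-range
      noise = rng.uniform(-R, R, size=true_ages.size)   # zero-mean masking noise
      reported = true_ages + noise                      # all the vendor ever sees

      # Reconstruction: the noise has mean 0 and known variance R**2 / 3,
      # so the sample mean is unbiased, and the noise variance can simply
      # be subtracted from the observed variance.
      est_mean = reported.mean()
      est_var = reported.var() - R**2 / 3

      print(f"true mean {true_ages.mean():.2f}, estimated {est_mean:.2f}")
      print(f"true var  {true_ages.var():.2f}, estimated {est_var:.2f}")

    The same subtraction logic extends to reconstructing full histograms via iterative estimates, which is closer in spirit to what the article describes.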

  • "Taking Programming to the Extreme"
    Technology Review Online (07/19/02); Sherman, Erik

    To churn out quality software in a market clamoring for fast rollouts of new products and features, software companies are increasingly adopting development tactics that emphasize collaborative engineering. "Agile development" emphasizes teamwork, flexibility, and end-user cooperation. One form of agile development is extreme programming, in which two programmers work together--one writes the software code while the other checks for mistakes and looks for possible improvements; continuous testing is also a key element of the process. Peer programming involves development partners taking turns writing code and describing logic functions, and supports constant review through the regular switching of partners. Testing is also undergoing a change: For example, pharmaceutical software developer Phase Forward has instituted a strategy in which programmers and quality-assurance personnel continuously interact to find and fix bugs during application development, whereas the traditional approach involves programming followed by separate quality testing. Customer feedback is another form of quality control, one that Cognizant Technology Solutions follows. Rather than setting formal specifications and then writing code, a Cognizant project starts with prototypes that are refined in an "industrialization" cycle, in which future users make suggestions during the development process. However, consumers themselves may set the limits of software quality by determining what is acceptable.
    http://www.technologyreview.com/articles/wo_sherman071902.asp

  • "Another Dimension"
    Forbes (07/22/02) Vol. 170, No. 2, P. 173; Fulford, Benjamin

    Chip designers are rapidly running out of space for additional computing power, and many efforts to build a 3D chip solution have been dropped in favor of nanotechnology. However, two chipmakers have continued to pursue 3D chips, and both claim they are nearing success. Tohoku University professor and former Toshiba engineer Fujio Masuoka has designed a chip that features vertical circuitry and a cylindrical architecture; he says his design will boast a tenfold speed increase over flat chips while costing only a tenth as much. He expects his 3D chip to be ready in five years, while Matrix Semiconductor is working on a 3D chip of its own that it plans to start selling by the end of 2002. Matrix executives say they have worked out how to build a chip vertically and still conduct electricity, which prefers to move horizontally. The company has taken a cue from flat-panel displays, in which "thin-film" transistor circuits are positioned atop glass substrates. Matrix also employs a polishing process that smooths the metal sitting above the circuitry, allowing engineers to sandwich in an additional layer of transistors. The Matrix chip will offer significant time and cost savings because it is designed to be built at existing chip plants.
    http://www.forbes.com/forbes/2002/0722/173.html
    (Access to this site is free; however, first-time visitors must register.)

  • "How Colleges Get More Bang (or Less) From Technology Transfer"
    Chronicle of Higher Education (07/19/02) Vol. 48, No. 45, P. A24; Blumenstyk, Goldie

    Colleges are implementing policies to commercialize the fruits of their research, but measuring the success of this technology transfer is difficult. The University of Michigan has had to retool its technology-transfer effort to shore up a flagging bottom line: An Association of University Technology Managers study ranked the institution 65th out of 118 in royalties earned relative to the cost of research in fiscal years 1996 to 2000, and the university reported just 0.3 inventions for every $1 million spent on research. Reasons cited for Michigan's poor performance included a cultural resistance to commercialization, a less-than-entrepreneurial regional business environment, a risk-averse licensing office, and a technology-transfer office with a high turnover rate. Associate VP for research Marvin G. Parnes says that revamping the technology-transfer program was essential because "We really believe that our technologies and discoveries should get used throughout the world." In the last three years, the university has doubled its patenting budget and its technology-transfer staff; it has also increased its invention disclosures and license executions, and earned twice as much royalty revenue as it did a year ago. Meanwhile, student-faculty collaboration lies at the core of Brigham Young University's research program; small, marketable inventions often stem from this arrangement, and a greater focus on applied research is reflected in its high number of invention disclosures relative to research spending. Washington University in St. Louis needs to improve relations between faculty and the technology-transfer office if it wishes to increase invention disclosures, according to associate vice chancellor for technology management Michael G. Douglas; the institution's chief strength is executing licensing deals. The University of Maryland, Baltimore, has a strong record of invention disclosures and patents but leaves something to be desired in the licensing department; to close that gap, the university hopes to leverage an off-campus incubator and a research park under development to facilitate technology transfer via spinoff companies.

  • "Last Mile by Laser"
    Scientific American (07/02) Vol. 287, No. 1; Acampora, Anthony

    Nine out of 10 U.S. businesses with more than 100 employees sit within a mile of the nation's multibillion-dollar optical-fiber network yet cannot avail themselves of it, and many experts believe that free-space optics (FSO), which uses low-power infrared laser transceivers, is the best chance to bridge this last mile. Deploying FSO systems can cost as little as one-tenth the price of underground fiber-optic cables, and implementation time is far shorter--it takes just a few days to set up an FSO link, whereas laying cable can take up to a year. The lasers rely on line of sight, which means their transmission can be hampered by bad weather such as thick fog, but this problem can be mitigated by setting up a spiderweb-like "optical mesh topology" of transceiver nodes that makes the system fog-proof and provides redundant routes that can be used in case of link failure; a toy rerouting example appears below. The laser diode emits pulses of light in the 850-nm to 1,500-nm range as a collimated beam, which is collected by a receiver that recreates and amplifies the transmission. Auto-tracking capabilities at both ends of the transmission path are necessary so that the signal is not disrupted by building sway or the thermal expansion and contraction of building materials. The many multiplexing, demultiplexing, and switching functions needed to process the signals from a variety of computing and communications equipment in an optical mesh require that the signals be converted into a single format, a task handled by the network termination unit. A rival contender for last-mile transmission is point-to-point microwave radio, which is more expensive than FSO, requires licenses to operate in most radio bands, and has limited spectrum.
    Click Here to View Full Article
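
    A minimal sketch of the redundancy argument (the five-node rooftop mesh and its link names are invented for illustration, not taken from the article): when fog degrades one laser link past usability, a mesh still delivers traffic over the surviving line-of-sight hops.

      # Toy illustration of optical-mesh redundancy (hypothetical topology):
      # when one laser link is lost to fog, breadth-first search still finds
      # a route over the remaining line-of-sight hops.
      from collections import deque

      def shortest_path(links, src, dst):
          """Return the fewest-hop path from src to dst, or None if cut off."""
          graph = {}
          for a, b in links:                      # undirected laser links
              graph.setdefault(a, set()).add(b)
              graph.setdefault(b, set()).add(a)
          frontier, seen = deque([[src]]), {src}
          while frontier:
              path = frontier.popleft()
              if path[-1] == dst:
                  return path
              for nxt in graph.get(path[-1], ()):
                  if nxt not in seen:
                      seen.add(nxt)
                      frontier.append(path + [nxt])
          return None

      # Hypothetical rooftop transceivers A-E; each pair is one laser link.
      mesh = [("A", "B"), ("B", "C"), ("A", "D"), ("D", "E"), ("E", "C")]
      print(shortest_path(mesh, "A", "C"))            # ['A', 'B', 'C']
      fogged = [l for l in mesh if l != ("B", "C")]   # fog kills the B-C link
      print(shortest_path(fogged, "A", "C"))          # ['A', 'D', 'E', 'C']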

  • "Inside the Invention Factory"
    Red Herring (07/02) No. 115, P. 54; Bruno, Lee

    Bell Labs channels the ingenuity of 1,850 researchers, whose work collectively yields an average of four patents each day, and has paved the way for many innovative products--phones, VCRs, televisions, remote controls, CD players, stereos, and computers among them. But this has not made the institution immune to a downturn in the telecom sector that has severely impacted corporate parent Lucent Technologies, which suffered a 37 percent decrease in annual revenue last year; Lucent has been funneling at least 12 percent of that revenue into the labs since 1996. Lucent's long-term commitment to basic research, which can consume up to 10 years of development time, could be at odds with its need to generate short-term returns from product research, which averages two to three years. Lucent executives remain dedicated to the labs, whose inventions could help the company get back on top in the communications-equipment industry: Promising developments include an all-optical router switch for optical networks and gains in optical transmission distance. Whether Bell Labs researchers are being pressured to make new discoveries in order to shore up Lucent seems to lie at the heart of an inquiry into the work of scientists Hendrik Schon and Zhenan Bao, who reported the development of a potentially revolutionary molecular transistor last year; other researchers' inability to duplicate their results prompted the inquiry, although Lucent and Bell Labs officials deny that this indicates rushed research. However, Bell Labs executives acknowledge that basic research accounts for only a small portion of this year's funding allotment, and that business realities and the pace of innovation must be aligned. Furthermore, some former Bell Labs researchers claim that the labs are indeed kowtowing to the bottom line.
    Click Here to View Full Article

  • "Lawrence Lessig: The Thought Leader Interview"
    Strategy & Business (07/02); Fisher, Lawrence M.

    Stanford University law professor Lawrence Lessig says the Internet's freedom, growth potential, and innovation potential are being threatened by a slew of legislation, technology, and monopolization that would cede control of the medium to a few. He argues that the freedoms that sustain the openness of the Internet's architecture, infrastructure, and content are constantly under siege. Lessig contends that increased regulation is stifling innovation, which in turn restricts corporate growth. In his book "The Future of Ideas," he says the innovation commons provided by the Internet offers a neutral, nonrivalrous resource that anyone can use, and that this neutrality fosters the growth and sustainability of the open source movement, among other things; however, the commons is being eaten away by an enclosure that governments and corporations are erecting to stave off technological disruption. "Giving up that neutral platform will benefit some commercial innovation, but at the expense of a vast majority of opportunity," Lessig posits. Although he points out that such a development will not halt disruption, it could slow disruption enough to produce a less valuable outcome. Lessig argues that Microsoft's monopolization strategy threatens innovation, and says that current remedies cannot adequately defuse the threat, but he also says the company could be more beneficial if it follows its .NET Web services business model, which relies on an end-to-end architecture. He also opposes broadening copyright law, on the grounds that it merely gives publishers more rein to choke off competition.
