Compaq is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, Compaq offers performance-packed products and comprehensive services.

ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Compaq or ACM.

To send comments, please write to [email protected].

Volume 4, Issue 364: Friday, June 21, 2002

  • "Tech Firms Bemoan Bush Talk"
    Wall Street Journal (06/21/02) P. A4; Dreazen, Yochi J.

    Leaders in the technology industry are largely disappointed with President Bush's speech on national broadband rollout. Although they welcome the high-level attention paid to the topic, they say the Bush administration is weak on details, shuffling off the responsibility to the FCC. The FCC and backers in the fixed-line telecom business are pushing for deregulation of the digital subscriber line (DSL) market, in line with a congressional effort that has stalled in the Senate. The cable industry, whose broadband offerings are not regulated, opposes deregulation of DSL for competitive reasons, and the two powerful groups are vying for political support. Some Bush advisors argue that the government should not get too involved in the formation of an industry that still has yet to prove itself, and oppose tax credits for companies that provide broadband infrastructure to rural and poorer areas. But industry representatives say there are other peripheral issues that could be addressed, such as defining online copyright laws and right-of-way issues that hinder physical deployment.

  • "Study: Equal Security in All Software"
    CNet (06/20/02); Lemos, Robert

    At a technical conference in Toulouse, France, Cambridge University researcher Ross Anderson presented a paper concluding that idealized open-source programs have the same level of security as closed-source programs. Anderson says the key difference between the two kinds of software is that bugs are easier to detect in open-source software than in closed-source software. He arrived at the equal-security result by estimating the mean time before a program fails for both kinds of software, and on that basis believes open and proprietary systems will boast roughly equal growth in reliability and security assurance. Still, Anderson admits that his conclusions may not be borne out in the real world, where conditions are rarely ideal. Open-source and closed-source advocates each claim that their software is more secure, but Anderson's research supports neither argument, and his work will probably not resolve the debate. The paper also criticizes the Trusted Computing Platform Alliance, accusing the consortium of merely being a tool its members use to control PC content, rather than an agency dedicated to ensuring PC security.
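    Anderson's symmetry argument can be illustrated with a toy reliability-growth model (this sketch is our illustration, not Anderson's actual analysis or code): if opening the source multiplies everyone's bug-discovery rate by the same factor, the effect is a pure rescaling of testing time, which helps defenders and attackers alike and so leaves neither side better off.

```python
import math

def residual_failure_rate(bug_rates, k, t):
    # Each latent bug i causes failures at rate lam_i.  After testing
    # time t, bug i has been found and fixed with probability
    # 1 - exp(-k * lam_i * t), where k measures how easily testers
    # (or attackers) can exercise the code.  The expected residual
    # failure rate is the summed rate of the bugs still present.
    return sum(lam * math.exp(-k * lam * t) for lam in bug_rates)

# A spread of bug severities: many rare bugs, a few frequent ones.
bug_rates = [0.001 * (i + 1) for i in range(200)]

# Doubling code visibility (k) is exactly equivalent to doubling
# testing time -- the same speedup accrues to bug-finders and
# bug-exploiters, so the relative security position is unchanged.
closed = residual_failure_rate(bug_rates, k=1.0, t=100.0)
opened = residual_failure_rate(bug_rates, k=2.0, t=50.0)
print(abs(closed - opened) < 1e-12)  # -> True
```

    The model shows only the idealized case Anderson analyzes; his paper's point is precisely that real-world asymmetries can break this equivalence.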

  • "Fears of Misuse of Encryption System Are Voiced"
    New York Times (06/20/02) P. C5; Markoff, John

    A prominent European technologist and University of Cambridge computer scientist has released a paper that points out the possible misuses of the encryption technology supported by the Trusted Computing Platform Alliance, a group that includes tech heavyweights such as Intel, Microsoft, IBM, and Hewlett-Packard. The alliance's technology would embed encryption in hardware and software to prevent viruses, faulty software, and pirated DVDs from running. Similar encryption is already built into game consoles from Nintendo, Sony, and Microsoft, helping to ensure, for example, that software makers have bought the appropriate licenses from hardware vendors. The Cambridge scientist, Ross Anderson, is also chairman of the Foundation for Information Policy Research; he says the standard would run afoul of European Union laws and compete with the smart card industry emerging there. In the United States, privacy and security advocates are divided over the initiative because of its potential benefits and dangers to privacy. David Farber, University of Pennsylvania computer scientist and board member of the Electronic Frontier Foundation, says the Trusted Computing Platform Alliance technology would allow for more solid protection against viruses while providing intellectual property protection and privacy at the same time. Critics such as Cambridge's Dr. Anderson note that the technology could also be put to other uses, including the enforcement of electronic censorship campaigns.
    (Access to this site is free; however, first-time visitors must register.)

  • "Tech Access Law: Slow Progress"
    Wired News (06/21/02); Mayfield, Kendra

    Significant progress has been made since Section 508 of the Rehabilitation Act was signed into law a year ago, but the full impact of the mandate to make government Web sites and major hardware and software products accessible to disabled users has yet to be felt. Information Technology Industry Council (ITI) Section 508 working group chair Laura Ruby says the law has caused many vendors to refine their products so that they comply with 508's standards, but this compliance is gradual. The average product cycle can last as long as 24 months, and the accessibility guidelines have yet to be included in some of the newer products. Companies that make their products more accessible stand to gain an advantage in competing for government business: ITI President Rhett Dawson says that "The implementation of Section 508 has spurred greater competition and innovation in the federal marketplace, and will increase employment opportunities for people with disabilities in both the public and private sectors." ITI members such as IBM, Hewlett-Packard, and Compaq have spearheaded compliance initiatives, such as the creation of the Voluntary Product Accessibility Template (VPAT), a system that enables manufacturers to demonstrate how their products adhere to Section 508 regulations. However, VPAT is still relatively unknown to many companies, while others do not see Section 508 as relevant to them. Larger government agencies--the Defense Department and the Social Security Administration among them--already boast internal accessibility programs, while Ruby says that smaller agencies unaware of the issue face a steeper learning curve.

  • "Web Thinkers Warn of Culture Clash"
    Washington Post (06/21/02) P. E5; Jesdanun, Anick

    Engineers and policymakers met this week at the annual meeting of the nonprofit Internet Society to discuss how corporate and government initiatives could curtail the Internet's democratic aspects. Government efforts, which are motivated by control, and corporate efforts, which are driven by the need for profitability, are inhibiting the openness of the Internet, warned Google CEO Eric E. Schmidt. He cited the "balkanization" of instant messaging as an example--AOL users are unable to communicate with people on competing services because the services use incompatible standards. Internet pioneer Steve Crocker cautioned that current decisions "could stunt the Internet to where it becomes a mechanism for delivering entertainment, ads, and conducting consumer-oriented business for large players." Society co-founder Vint Cerf said that innovation is being stifled because information transmission has not caught up with information reception, while Stanford University law professor Lawrence Lessig stressed the danger of the government making knee-jerk decisions about regulation; he decried copyright safeguards designed to stop online theft that cut into fair use rights. All the same, many participants acknowledged that government and business are impatient to impose regulations, and cannot wait for issues such as copyright controls, spam, and online privacy to be resolved. Wolfgang Kleinwachter of Denmark's University of Aarhus urged Internet policymakers to exercise caution if they want to avoid two extremes: One would give organized crime an advantage, while the other would limit freedom and choice.

  • "Hard Lessons in Learning a Common Tongue"
    Financial Times (06/21/02) P. 9; Harvey, Fiona

    Web services, which will allow companies to take full advantage of the Internet, need a standardized way to communicate industry jargon and definitions between computer systems. XML, or Extensible Markup Language, is now the accepted platform for these future Web services, though its adoption in business requires dedication and rigorous work, according to experts. XML itself does not allow computers to communicate, but it provides a way for companies to define common standards and definitions for transactions, for example. Adhering to the created schemas will take a discipline that is uncommon for Web programmers, who are used to coding in HTML, traditionally a free-and-easy markup language. Standard Generalized Markup Language (SGML), a precursor to XML developed in the 1980s, failed to gain wide acceptance because it was intolerant of errors and was regarded as difficult to use. Today, however, all the major technology companies backing Web services efforts tout XML, leaving it to the industry's determination to implement XML successfully throughout its computer systems. Still, Gartner Group analyst Charles Abrams says, "It could take the rest of the decade to begin to get this sorted," and after that it will need continuous updating. Abrams says XML may succeed where SGML failed simply because with an X in its acronym, "XML is sexy and attractive, so programmers like it."
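    The role XML plays here can be sketched in a few lines. The element names and the purchase-order document below are invented for illustration; the point is that once trading partners agree on a shared vocabulary, conformance can be checked mechanically (real deployments use formal schema languages such as XML Schema rather than a hand-rolled check like this one):

```python
import xml.etree.ElementTree as ET

# A hypothetical vocabulary two trading partners might agree on:
# every purchase order must carry exactly these child elements.
REQUIRED = ["buyer", "seller", "sku", "quantity"]

ORDER = """\
<purchaseOrder>
  <buyer>Acme Corp</buyer>
  <seller>Widgets Ltd</seller>
  <sku>W-1042</sku>
  <quantity>250</quantity>
</purchaseOrder>"""

def missing_fields(xml_text):
    # Return the required elements the document fails to supply.
    root = ET.fromstring(xml_text)
    return [tag for tag in REQUIRED if root.find(tag) is None]

print(missing_fields(ORDER))               # -> []
print(missing_fields("<purchaseOrder/>"))  # -> ['buyer', 'seller', 'sku', 'quantity']
```

    The discipline Abrams describes lies not in the parsing, which is routine, but in getting an entire industry to agree on and stick to the vocabulary.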

  • "A Chip That Mimics Neurons, Firing Up the Memory"
    New York Times (06/20/02) P. E7; Eisenberg, Anne

    Dr. Theodore W. Berger, director of the University of Southern California's Center for Neural Engineering, envisions a computer chip that can be implanted in people's brains and mirror the functions of neurons. Such a device could be particularly beneficial for people whose cognition and memory are affected by a damaged or shrinking hippocampus--Alzheimer's patients, stroke victims, and epilepsy sufferers, for example. Neuroprosthetic research by Berger and his team involves the study of the rat hippocampus and the structure and connections of its principal neurons. "We've created mathematical models of what a neuron does when it receives a signal, and then, having processed it, sends out a different signal," boasts Berger. Dr. John J. Granacki of USC's School of Engineering is building circuits based on Berger's models, and has thus far designed and simulated chips with as many as 10,000 neurons; chips he has built are capable of replacing up to 100 brain cells. Implantation of such chips into the brains of rats and monkeys is a goal Berger plans to reach with the help of Wake Forest University's Dr. Samuel A. Deadwyler. Dr. Joel Davis of the Office of Naval Research, which is funding some of Berger's research, notes that the neuron substitute Berger has in mind is designed to enhance cognition rather than augment the senses, which is what bionic chips and other developing neuroprosthetics are designed to do.
    (Access to this site is free; however, first-time visitors must register.)

  • "New Method to Make Faster, Smaller Computer Chips"
    Reuters (06/19/02)

    Princeton University researchers led by Stephen Chou have hit upon a technique for printing nanoscale chip patterns faster, which could lead to more efficient production of computer chips with a concentration of transistors that is 100 times denser than current models. Using a mold and a laser, the team of electrical engineers has imprinted 10-nanometer features on a chip in a quarter of a millionth of a second; producing a chip using traditional etching methods can take as long as 20 minutes. The mold is pressed to a silicon substrate, while a pulse from the laser causes the material to melt and resolidify. Princeton has filed a patent application for the technique, which is called Laser-Assisted Direct Imprint (LADI). Stanford University's Fabian Pease reports in Nature that Chou's breakthrough will enable electronics makers to keep pace with Moore's Law. Such research could also be a significant step toward the development of more powerful memory chips and computers.

  • "Bush Urges Private Sector to Shore up Networks"
    InfoWorld.com (06/19/02); Harreld, Heather

    The federal government is seeking the cooperation of the private sector to safeguard the nation's critical infrastructure. Of key importance are the IT networks that facilitate infrastructure operation, which are managed and protected by business, not the military. This situation led to the establishment of the Critical Infrastructure Protection Board and the current formation of a "National Strategy for Securing Cyberspace," which is meant to consolidate support from the private sector. Although attacks up until now have been mostly the actions of hackers and other individuals, the government worries that a state-sponsored attack, or one organized by a terrorist organization, would be able to cripple the nation's infrastructure by taking IT systems offline. Not only are large corporations in charge of utilities and nationwide networks being solicited, but smaller businesses are also being encouraged to sign onto FBI InfraGard chapters set up in each state. Those chapters provide a safe environment for companies to share information about network threats with law enforcement, without the fear of public disclosure. Through InfraGard, the FBI will be able to analyze data and identify potential threats with greater accuracy and speed.

  • "Tiny 'Whiskers' May Advance Nanoelectronics"
    NewsFactor Network (06/20/02); Lyman, Jay

    Boron nanowhiskers produced by a collaboration between Washington University in St. Louis, Mo., the Semiconductor Research Corporation, and others may bring the dream of nanoelectronic circuitry one step closer. Washington University graduate student Carolyn Jones Otten explains that growing wires is a much more practical alternative to molding or shaping them out of metals, a technique that Washington University professor William Buhro predicts will soon reach a "small-size limit." Beyond a certain size, metallic wires thin out and break, which makes them unfit for nanoelectronics, according to Buhro. The boron nanowhiskers are grown as crystals, and could boast more durability and temperature resistance than carbon nanotubes. The work also encourages researchers to believe that the creation of boron nanotubes is a possibility. Building specific nanoelectronic devices is not a major part of the researchers' agenda, Buhro says. "We are merely doing a basic science study to support the eventual emergence of nanoelectronics," he explains.

  • "Silicon Quantum Computer"
    Nature Online (06/19/02); Ball, Philip

    The architecture for a silicon-based quantum computer that can be constructed using current methods has been outlined by Thaddeus Ladd and associates at Stanford University. The microelectronics industry has an advantage over other quantum computer efforts because it has refined the manipulation of silicon and its properties for decades. Ladd's device is similar to a solid-state quantum computer suggested by Bruce Kane of Australia's University of New South Wales: Kane's proposal involves phosphorus atoms embedded in silicon crystals that store quantum bits, or qubits, read and manipulated via nuclear magnetic resonance (NMR); however, that scheme would require NMR sensitive enough to detect single atoms, which remains impractical. Ladd's concept involves chains of silicon-29 grown on a substrate of silicon-28, with minuscule magnetic rods placed perpendicular to the chains controlling silicon-29's magnetic quantum states. Painstaking measurements on single atoms could be avoided by storing each qubit in many thousands of duplicates rather than in a single atom, with readouts accomplished by magnetic resonance force microscopy. The Stanford researchers claim the silicon quantum computer can be fashioned without "unrealistic advances in fabrication, measurement, or control technologies."

  • "Taking Security Concerns Private: U.S. Appeals to IT Firms"
    Washington Post (06/20/02) P. E5; Barbaro, Michael

    Hobbled by a lack of expertise and staff, government administrators believe that the only way to develop technologies to help secure federal IT systems and defend the nation's critical infrastructure is to turn to the private sector. The government's inability to protect the infrastructure by itself has become obvious since Sept. 11, and Paul Kurtz of the President's Critical Infrastructure Protection Board says that "Many of the answers [to security problems] lie outside the Beltway." President Bush's homeland security initiative and proposed increases to the IT security budget have added momentum. IT marketers and executives are expecting more federal investment in data-protection technology, network integration services, and information-sharing systems. Government projects being developed by private industry include a national early warning system and an anti-cyberterrorism software system. At a panel hosted by TechNews.com, Kurtz told about 150 IT representatives yesterday that "If you have a solution, you need to come to government." Capt. J. Katharine Burton of the National Communications System says that many agencies are pursuing private-sector collaborations out of fears that their budgets may be cut if they fail to comply with the Government Information Security Reform Act signed by President Clinton two years ago.

  • "Scientist Studies Robot Conversation Skills"
    Tallahassee Democrat (06/17/02) P. B6; Boyd, Robert S.

    Several experiments focus on teaching robots language skills, which will be an essential capability of machines designed to function as humanoid aides. Sony Computer Science Laboratory artificial intelligence expert Luc Steels proclaims that open-ended dialogues with "physically embodied robots" are possible. He has conducted an experiment in which an AIBO robot equipped with a video camera and a microphone was "taught" to recognize a new object--a ball--by forming concepts from the visual and audio input it received. The robot picks up on preprogrammed words spoken by its human teacher--such as "look," "listen," "good," "what is it?," "yes," and "no"--and associates them with the object. University of Maryland, Baltimore County robotics expert Tim Oates has carried out an experiment in which the robot learns language through observation rather than instruction. By listening to statements used to clarify a certain action being performed, the robot can relate particular objects to specific sound patterns, and answer questions about them; Oates notes that the robot is taught to understand English, German, and Mandarin Chinese. A third experiment involves two robots that teach each other language mostly free of human supervision: In it, one robot points to an object and speaks its name, sometimes making it up from a catalog of syllables if there is no record of it in its memory. The other robot points to the object in question when it recognizes it, and incorrect identifications are pointed out by the first robot.
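    The observation-driven approach Oates describes can be caricatured as cross-situational learning: count which words co-occur with which perceived objects, and let the strongest association win. The scenes and vocabulary below are invented for illustration; real systems work from raw audio and video rather than clean symbols.

```python
from collections import Counter, defaultdict

# Each "scene" pairs the objects the robot sees with the words it hears.
scenes = [
    ({"ball"},        ["look", "ball"]),
    ({"ball", "cup"}, ["ball"]),
    ({"cup"},         ["cup", "good"]),
    ({"cup"},         ["look", "cup"]),
]

# Tally word/object co-occurrences across all scenes.
cooccur = defaultdict(Counter)
for objects, words in scenes:
    for obj in objects:
        cooccur[obj].update(words)

def best_label(obj):
    # The word heard most often in this object's presence.
    return cooccur[obj].most_common(1)[0][0]

print(best_label("ball"), best_label("cup"))  # -> ball cup
```

    Filler words like "look" and "good" occur with many objects, so over enough scenes the object's true name accumulates the highest count, which is the intuition behind learning by observation rather than explicit instruction.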

  • "The End of the Revolution"
    Salon.com (06/14/02); Leonard, Andrew

    In Milton L. Mueller's new book "Ruling the Root," Mueller traces how the DNS and the Internet have evolved from the time when Jon Postel allocated domain names himself while being funded by the Defense Department, to a utopian ideal of decentralized free expression, to a place run by ICANN and being hammered into a totally new shape by trademark attorneys. Trademark interests have been able to boost trademark protection to unprecedented levels on the Internet. For instance, Yahoo! not only has trademark protection for its name as a domain name, but also for 30 types of misspellings of its name. In addition, Yahoo! can prevent other people and companies from registering domain names using the word "Yahoo" among other letters, numbers, or words, even if the registrant has no intention to compete against Yahoo!. Mueller says the idea that the Internet was created by idealistic technology professionals working together is true, and details how technologists lost control of the Internet. ICANN has squelched participation of elected board members and the Internet community in general, while also delaying new domains such as .biz and .info that could bring down the value of .com. The use of sunrise periods in recent new-TLD launches gave trademark protection without asking trademark interests to prove that infringement occurred--a prior restraint protection not granted to trademark interests in other media. Mueller says trademark interests now control the Internet's "point of market entry."

  • "Whose Domain Is It Anyway?"
    Wall Street Journal Online (06/17/02); Dyson, Esther

    What is at stake in ICANN reform is who should oversee ICANN: the Internet community, the U.S. government as Sen. Conrad Burns (R-Mont.) advocates, or a patchwork of national governments, writes Esther Dyson. Today ICANN fills a crucial role as the nexus of the Internet, and yet, Dyson writes, ICANN only offers standards of conduct for domain name selling and Internet services, and it remains "weak and powerless." Although ICANN's role needs to be bolstered, ICANN equally needs to remain a "weak" organization, stepping in to resolve conflicts rather than to exercise bureaucratic control. The reluctance of ccTLDs to sign ICANN contracts is a weakness that hampers ICANN's ability to manage the Internet while simultaneously protecting the Internet from governments. As for reform, U.S. government intervention would provoke a howl of protests from around the world. Instead, ICANN needs to reach out to non-U.S. managers by offering them a seat at the policy-making table. In addition, ICANN needs to become more accountable to individual users, and should, when its current leadership departs, replace it with people open to constructive criticism. ICANN should abide by its founding mandate and leave the goals of serving the poor, managing Internet prices, and preventing fraud to governments and industry.

  • "Schools Turn to Slumping Tech Sector to Recruit Teachers"
    eSchool News Online (06/18/02); Murray, Corey

    California is bringing in unemployed technology professionals to fill the void left by a shortage of quality math and science teachers. "What we are trying to do is take a talent pool that has dried up and redirect it," says Larry Rios, director of the Technology-to-Teacher project for the Ventura County Business and Employment Services Department. He says the project allocates $1.6 million for training programs that enable out-of-work professionals to qualify for over 200 currently vacant teaching slots, adding that participants are attracted to the program mainly so they can make a positive impact on students. Jesse Gonzalez, a former product manager for a voice-mail services firm training to become a math instructor, expects his work experience will help him relay the practical applications of math and science to students. Furthermore, Leslie Conery of the International Society for Technology in Education believes that educators who hail from the private sector could greatly benefit the deployment of technology in schools. Successful Technology-to-Teacher applicants must exhibit a fundamental knowledge of math, reading, and writing by passing the California Basic Education Skills Test, and be certified as displaced professionals. Once they have mapped out a certification path with an academic advisor, applicants must enroll in a university-backed teacher credential program paid for by the project. The U.S. Department of Education recently issued a report in which Secretary of Education Rod Paige stated that "If we are to meet the challenge of having a highly qualified teacher in every classroom by the 2005-06 school year, states and universities must take heed and act now to bring more [qualified educators] into our nation's classrooms."

  • "Why the Future Belongs to the Small-Minded"
    VNUNet (06/14/02); Moltzen, Edward

    Nanotechnology research could lead to significant breakthroughs in the next two decades, according to advocates. Recent developments include an Idaho National Engineering and Environmental Laboratory project that has yielded a "super-hard" steel coating; project leader Daniel Branagan wants to apply that research to silicon semiconductors and other materials. IBM has more than 20 years of nanotech research under its belt, and its innovations could greatly benefit the computer storage sector: One area of investigation involves coaxing the grains inside magnetic recording media to self-assemble more efficiently in two dimensions or as a simple thin film. Meanwhile, the Electron Microscopy and Microanalysis team of Australia's Griffith University reported last year that they manipulated thermally grown oxide layers on silicon substrates to form a hard drive that can hold 1,000 times more data than current models. Foresight Institute associate Gina Miller believes that "The true [nanotech] breakthrough will be to assemble a crude molecular machine that itself can build molecular machines." Venture capital investment in nanotech is expected to skyrocket from less than $100 million in 1998 to over $1 billion in 2002, while President Bush has requested more than $1 billion for the National Science Foundation--a chief supporter of nanotech--for fiscal 2003. Nanotech could further push the convergence of human biology and machines and the development of biomechanical interfaces, and this in turn could fundamentally change human culture and the perception of the surrounding world.

  • "Living on the Grid"
    InformationWeek (06/17/02) No. 893, P. 30; Ricadela, Aaron

    Grid computing enables enterprises to harness the raw computing power of many machines to process data, solve complex problems, run simulations, and perform other functions too large for single computers to handle, boosting efficiency and cost savings. For some companies, grid computing promises performance boosts even greater than those offered by more expensive systems: Bristol-Myers Squibb executive director Rich Vissa says that grid computing could raise performance levels by a factor of 100, while a $500,000 Linux cluster might only deliver a threefold increase. Grid computing views software as a service rather than as a set of orders running on a single computer or a small group of computers. Hewlett-Packard, IBM, Sun Microsystems, and Microsoft are developing grid-computing applications, while the companies on the cutting edge of grid computing either use proprietary software or products from Avaki, United Devices, or other startups. There are disadvantages--Pete Bradley of Pratt & Whitney notes that grid computing must contend with slow Internet transmission speeds, and requires file deletion and the freeing up of memory and processors when projects end; furthermore, most business operations can only be conducted behind the firewall. But such drawbacks may not deter IT managers who need more computing horsepower but cannot afford extra software and hardware. Still, grid computing may not necessarily optimize tasks that rely heavily on collaboration or customer service, such as supply chain management or e-commerce sales. Other barriers to the wide adoption of corporate grid computing are unresolved issues about manageability, cross-platform security, and data sharing.
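    The core idea, carving one big job into independent work units and farming them out to idle machines, can be sketched with a local thread pool standing in for grid nodes. This is an illustrative stand-in only: a real grid adds network distribution, scheduling, retries, and the security and manageability concerns noted above, and the work unit here is invented.

```python
from concurrent.futures import ThreadPoolExecutor

def work_unit(chunk):
    # Stand-in for an expensive, independent computation
    # (e.g., one slice of a simulation or screening run).
    return sum(x * x for x in chunk)

def run_job(data, n_units=4):
    # Carve the job into independent chunks, dispatch them to the
    # "nodes" (here, local threads), and combine the partial results.
    size = (len(data) + n_units - 1) // n_units
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=n_units) as pool:
        return sum(pool.map(work_unit, chunks))

print(run_job(list(range(1000))))  # -> 332833500
```

    The pattern only pays off when the work units are truly independent, which is why grids suit simulations and batch analysis better than the collaborative, transactional workloads the article flags as a poor fit.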

  • "Quantum Superbrains"
    New Scientist (06/08/02) Vol. 174, No. 2346, P. 24; Mullins, Justin

    The speed and performance of a quantum computer would make current supercomputer models seem like pocket calculators in comparison, but the various interests racing to build one face significant hurdles. One challenge involves solving the problem of decoherence, in which the quantum state is disturbed by even the slightest environmental interaction; this requires that quantum bits (qubits) be isolated yet connected so that the machine can perform logical operations. Advances toward solving this problem have been made with liquid nuclear magnetic resonance (NMR) devices, but they support only a small number of qubits--10 or 12 at most--while a practical quantum computer requires perhaps 100. Another area of research, particularly that of National Institute of Standards and Technology physicist David Wineland, concerns the capture and manipulation of particles in ion traps to form a quantum logic gate, but this approach also falls prey to decoherence. So does a setup involving a neutral atom trapped by light in a mirrored cavity and excited to a higher energy state by a laser beam. Meanwhile, several groups are pursuing a solid-state approach, developing quantum computer chips: Efforts include research into quantum dots, superconducting chips, and NMR chips. Physicists are also investigating photons as another possibility, but two-qubit operations with photons are almost impossible under normal conditions because the particles are unable to interact with each other, and there has been little reported progress on mediating such interaction. Despite these challenges, scientists are still hopeful because the creation of a quantum computer is still possible within physical laws, notes Artur Ekert of the University of Oxford.
