
       HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 5, Issue 456: February 10, 2003

  • "Pentagon Names Boards to Oversee Data-Search Plan"
    Reuters (02/07/03); Dunham, Will

    The Defense Department announced on Feb. 7 the creation of two panels tasked with overseeing the controversial Total Information Awareness (TIA) project, in response to criticism from Congress and other organizations that such a project, supposedly designed to prevent terrorist attacks, would be used to spy on innocent Americans through their computerized transactions. Undersecretary of defense for acquisition, technology, and logistics Edward Aldridge declared that he would lead an internal Pentagon oversight board composed of senior Pentagon officials, while the second panel would be set up outside the Pentagon and headed by Northwestern University professor of communications law Newton Minow. The first panel, which will meet later this month, will oversee the employment of terrorist-tracking tools and establish usage rules both inside and outside the Defense Department, while the second panel will supply the Secretary of Defense with advice on policy and legal issues. Although Sen. Ron Wyden (D-Ore.) lauded the formation of the oversight boards, he pledged to continue advocating legislation to halt TIA funding, which he attached to a spending bill passed by the Senate. If approved by the rest of Congress, the moratorium would remain in effect until the Pentagon furnishes a report detailing how TIA works and its impact on civil liberties.
    Click Here to View Full Article

  • "Don't Underestimate Cyberterrorists, Experts Warn"
    Medill News Service (02/07/03); Costello-Dougherty, Malaika

    Security experts warn that America's enemies can exploit cyberspace to wreak havoc on the nation's public infrastructure unless its online defenses are beefed up with the installation of security software and increased awareness among users of both the risks and the ways to avoid them. A report by Dartmouth College's Institute for Security Technology Studies notes that cyberattacks carried out to advance a political agenda are often harbingers of physical acts of terrorism, and warns that Internet-based assaults on information systems as well as public utility systems are "extremely likely" if the United States goes to war. IDEFENSE intelligence analyst Ken Dunham notes that a known hacker claiming allegiance to al Qaida has threatened to unleash a computer worm upon the United States if it attacks Iraq, and adds that this individual has the capability to do so. The Dartmouth report says that most cyberattacks are likely to be led by terrorist sympathizers or hackers who do it for the thrill. Dunham cautions that hackers can use freely available online scanning tools to find security holes in a matter of minutes, and can hinder the dissemination of information by altering or imitating legitimate Web sites. He adds that knowledgeable Web surfers can recognize spoofed sites by noticing the "@" symbol within the link. A recent Symantec report says that threat severity and computer vulnerabilities increased last year, and lists blended threats as the most pressing danger to Internet security; government agencies and security firms have joined forces to encourage users to deploy security software and adopt protective measures. "As long as we have vulnerabilities in cyberspace and as long as America has enemies, we are at the risk of the two coming together to severely damage our great country," declared White House cybersecurity adviser Richard Clarke in an email announcing his resignation.
    http://www.pcworld.com/news/article/0,aid,109261,00.asp
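
    The "@" trick Dunham mentions exploits the fact that anything placed before an "@" in a URL's host portion is treated as user information, so a link such as http://www.example-bank.com@attacker.example/ actually points at attacker.example. Below is a minimal sketch of such a check; the URLs and the flagging rule are illustrative, not taken from the article.

        # Minimal sketch: flag URLs that hide the real host behind an "@",
        # the spoofing trick mentioned above. Example URLs are hypothetical.
        from urllib.parse import urlsplit

        def looks_spoofed(url: str) -> bool:
            """Return True if text appears before an '@' in the URL's network location."""
            return "@" in urlsplit(url).netloc

        for link in ("http://www.example-bank.com@203.0.113.9/login",
                     "http://www.example-bank.com/login"):
            print(link, "->", "suspicious" if looks_spoofed(link) else "ok")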

  • "NASA Leads Efforts to Build Better Software"
    Computerworld Online (02/07/03); Thibodeau, Patrick

    The 1999 crash of the Mars Polar Lander, which was attributed to a software bug, made NASA officials realize that preventing a similar embarrassment would require an upgrade in software quality and the development of failure-proof systems. Following the crash, then-head of the NASA Ames Research Center Dr. Henry McDonald advised the agency to get more private-sector parties involved in its dependable-system initiative, and NASA did so by inviting top universities to participate in a collaborative effort. Furthermore, NASA plays a key role in the Sustainable Computing Consortium (SCC), which supports the development of software that always fulfills its function regardless of bugs, a breakthrough that could benefit all industries. One of the major problems this effort faces is the lack of definitive software reliability metrics. NASA and Carnegie Mellon University are jointly working on a software architecture that provides reliable computing, known as the High Dependability Computing Program. "A bad way to approach any kind of design is to look at it monolithically, to lump everything together and consider all the problems at once," notes Dr. Michael Evangelist, head of Carnegie Mellon's West Coast campus. "NASA is looking at ways to modularize design so you can focus individually on important things." The recent destruction of the space shuttle Columbia has revived interest in NASA's computer systems and software.
    Click Here to View Full Article

  • "Scientists of Very Small Draw Disciplines Together"
    New York Times (02/10/03) P. C4; Feder, Barnaby J.

    A recent three-day conference in Los Angeles was a rallying point for scientists and other advocates who wish to merge the disciplines of nanotechnology, biotechnology, information technology, and cognitive research into a single discipline collectively known as NBIC. Organizers believe that such a development will bridge the cultural gap between these disciplines, which often lack a unified terminology despite the fact that their researchers have common interests. For example, nanotech research focuses on nanoscale atom-molecule interaction, a process critical to living cell systems, which creates an overlap between nanotech and biotech. Meanwhile, electronics specialists are studying nanotech and biotech as they research ways to shrink computers, data storage systems, and communications devices. The fourth ingredient in NBIC--cognition--could lead to technology applications that could improve health and enhance people's ability to retain memory, control their moods, and communicate with machines. Institute for Global Futures President James Canton declared that "NBIC are the power tools of the 21st century," and noted that NBIC proponents are planning to ask the Bush administration to fund a program for collaborative NBIC initiatives at a yearly cost of hundreds of millions of dollars. National Nanotechnology Initiative head Dr. Mihail C. Roco has made NBIC convergence for both research and educational institutions a top priority. Speakers at the conference said that successful convergence will rely on basic changes in educational and organizational architecture.
    http://www.nytimes.com/2003/02/10/technology/10NANO.html
    (Access to this site is free; however, first-time visitors must register.)

  • "'Sticky' DNA Crystals Promise New Way to Process Information"
    Newswise (02/07/03)

    Researchers at the University of Minnesota have succeeded in forming a regularly structured DNA crystal with gold nanoparticles attached. Electrical engineering professor Richard Kiehl, who leads the project, says the feat is significant because matrices that join metallic and organic molecules are very difficult to construct. He says the regular pattern of the DNA crystal could act as a scaffold for a future nanochip, since the four bases that make up DNA naturally configure themselves into the double-helix chain. Kiehl says that chips using this structure would be able to store 10 trillion bits per square centimeter, or 100 times the amount that 64 GB DRAM memory is expected to store in 2010. In addition, the short interconnects allowed by DNA structures promise to eliminate traditional obstacles to faster information processing, namely the distance between where information is stored and shared. Kiehl predicts the efficiency provided by his future nanochip could allow computers to recognize images at the same speed as humans. The university team laid out these chains in DNA tiles, with extensions that connected one to the other. Gold nanoparticles were attached to vertical extensions of DNA so that they sat above the tiles. Kiehl says the structure could be compatible with carbon nanotubes or other conductive material, and that information could be stored either in electrical charges or in the magnetic states of the nanoparticles. The challenge now for the university team is to demonstrate that the structure responds correctly to the electrical functions necessary for computing.
    http://www.newswise.com/articles/2003/2/NANODNA.UMN.html
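
    As a rough sanity check derived only from the density figure quoted above, 10 trillion bits per square centimeter implies that each bit must fit in a cell only a few nanometers on a side:

        # Back-of-the-envelope check on the density quoted above: 10 trillion bits
        # per square centimeter implies a storage cell only a few nanometers wide.
        bits_per_cm2 = 10e12                         # 10 trillion bits per square centimeter
        cell_area_cm2 = 1 / bits_per_cm2             # area available to a single bit
        cell_side_nm = (cell_area_cm2 ** 0.5) * 1e7  # 1 cm = 1e7 nm
        print(f"implied cell size: about {cell_side_nm:.1f} nm on a side")  # roughly 3.2 nm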

  • "The Linux Kernel's Next Incarnation"
    NewsFactor Network (02/07/03); Brockmeier, Joe

    As Linux becomes a more important component of enterprise computing portfolios, IT managers should be aware of upcoming changes in the Linux kernel. Although popular Linux software such as Apache and Samba create a front end for the open-source platform, the kernel defines how Linux operates on the enterprise computing infrastructure--how many processors or how much system memory can be utilized effectively, for example. Linux always has two versions: an experimental version marked by an odd minor number, such as the current 2.5 series, and the official mainstream version, which carries an even number. Linux founder Linus Torvalds has frozen development on the most current official version (2.4), causing many to speculate that version 2.6 will be released sometime in 2003. Version 2.6 is likely to include a number of experimental efforts underway in version 2.5. SGI Linux engineering director Steve Neuner says Linux 2.6 will have a task scheduler that can make use of more processors, and will handle I/O issues better. Red Hat director of engineering Brian Stevens adds that storage will no longer be a limiting factor in enterprise Linux implementations, since version 2.6 will push the amount of storage handled by the operating system far beyond the current 2 TB limit. Sistina Software's Logical Volume Manager will also be updated in the new Linux kernel so that systems can dynamically accommodate multiple hard drives, and SGI's XFS Unix file system will be bundled into the kernel as well.
    http://www.newsfactor.com/perl/story/20698.html
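
    A rough illustration of the odd/even numbering convention described above: the kernel's minor version number is enough to tell a development series from a stable one (the version strings below are examples only).

        # Toy sketch of the Linux odd/even versioning convention: an odd minor
        # number (2.3, 2.5) marks a development series, an even one (2.2, 2.4, 2.6)
        # marks a stable series. Version strings are illustrative.
        def kernel_series(version: str) -> str:
            minor = int(version.split(".")[1])
            return "development" if minor % 2 else "stable"

        for v in ("2.4.20", "2.5.59", "2.6.0"):
            print(v, "->", kernel_series(v))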

  • "See Here!"
    Toronto Star Online (02/10/03); Ross, Rachel

    York University scientists are building an automated videoconferencing system that responds to hand signals, and its development includes research into the psychology of body language and camera-operator training. The research team, led by computer science professor John Tsotsos, has thus far given the system, dubbed GestureCam, the capacity to identify basic hand signals. GestureCam's first job is to recognize a human face by focusing on an assortment of flesh tones; it then takes direction from a person who holds up his right hand with a certain number of fingers raised to determine which camera operation--zoom, pan, etc.--it should follow. A second gesture indicates the scope of the camera's movement--whether it should zoom in or out, pan left or right, and so on. Following a pointing finger is only a partial solution--Tsotsos is currently developing a system that can comprehend American Sign Language. "It's not that we want to put the human [video camera] operators out of a job, we just want to bring the costs down so we can start deploying these systems everywhere," he explains. Tsotsos believes that the system, once perfected, could find use both for conferences and for distance learning.
    Click Here to View Full Article
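
    GestureCam's first stage, picking out a face by its flesh tones, amounts to color segmentation. The sketch below shows the general idea using OpenCV; the library choice, the HSV thresholds, and the synthetic frame are assumptions for illustration and do not describe York's actual implementation.

        # Illustrative flesh-tone segmentation, the first stage attributed to
        # GestureCam above. The HSV bounds are rough guesses, not values from
        # the York system.
        import cv2
        import numpy as np

        def skin_mask(bgr_frame):
            hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
            lower = np.array([0, 40, 60], dtype=np.uint8)     # rough lower bound for skin hues
            upper = np.array([25, 180, 255], dtype=np.uint8)  # rough upper bound
            return cv2.inRange(hsv, lower, upper)             # 255 where a pixel looks like skin

        frame = np.zeros((240, 320, 3), dtype=np.uint8)       # stand-in for one captured video frame
        mask = skin_mask(frame)
        print("skin-colored pixels:", cv2.countNonZero(mask))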

  • "Group Hopes to Give New Life to Desktop Linux"
    EarthWeb (02/05/03); Olavsrud, Thor

    The Desktop Linux Consortium (DLC), which is composed of leading vendors and open source organizations, hopes to deepen the penetration of Linux on the desktop, which Linux creator Linus Torvalds declared "inevitable" in a press statement. "Linux is firmly established in the server space, and now desktop Linux is coming of age," says DLC Interim Chairman Jeremy White. "The ultimate beneficiary of the consortium is the computing public, which will be assured a vibrant, open, stable alternative to closed proprietary systems, and an end to ever-escalating licensing fees." Microsoft and other big-name vendors see the open source movement as a threat to the traditional commercial software development (CSD) model. In its most recent 10-Q filing with the Securities and Exchange Commission, Microsoft says the only way to compete in a market that increasingly accepts the open source model is to cut prices on its products, which could in turn reduce revenues and operating margins. Open source proponent and DLC Interim Executive Director Bruce Perens says the DLC will ensure that all issues and events related to desktop Linux are treated fairly, that all vendors are fully represented, and that open source conventions are respected. However, International Data Corp. analyst Al Gillen says desktop Linux has its work cut out for it: For one thing, Linux accounted for only about 2.1 percent of client operating environment new license shipments in 2001, while Microsoft grabbed 93 percent. Gillen also observes that most consumers use Microsoft technology at home, which promotes its penetration in the corporate space. He says the consortium needs to get well-known, major vendors on its side in order to make inroads into the consumer population, and adds that this can only be achieved if the vendors see enough volume to justify such a shift in allegiances.
    http://itmanagement.earthweb.com/erp/article.php/1579921

  • "Suits Test Limits of Digital Copyright Act"
    National Law Journal Online (02/07/03); Seidenberg, Steve

    New lawsuits by Lexmark International and the Chamberlain Group are testing the limits of the Digital Millennium Copyright Act (DMCA), which observers say has been interpreted broadly by courts. Lexmark is suing Static Control Components and Chamberlain is suing Skylink Technologies, and both suits address the issue of whether businesses can use software copyrights to keep consumers from using add-ons or replacement parts from third-party companies. The companies allege that Static Control and Skylink gained unauthorized access to copyrighted work by creating technology that circumvents their protection measures, which they claim is a violation of the DMCA. Static Control has enabled consumers to use less-expensive toner cartridges from other manufacturers in Lexmark printers, while Skylink has enabled consumers to operate Chamberlain garage door openers with its universal remote controls rather than Chamberlain's own. Jonathan Band, an IP partner in the Washington, D.C., office of San Francisco's Morrison & Foerster, says the DMCA was meant to promote the use of technology to protect digital content, such as that stored on DVDs. In 1992, however, the 9th U.S. Circuit Court of Appeals ruled that copying software code for purposes of product interoperability constitutes fair use. Legal expert Jessica Litman, professor of IP law at Wayne State University Law School, says fair use may not be a defense in these cases, considering previous interpretation of the DMCA.
    Click Here to View Full Article

  • "Journalist Perpetrates Online Terror"
    Computerworld Online (02/06/03); Verton, Dan

    Computerworld has retracted a story about terrorists claiming authorship of the recent Slammer worm after learning that former Newsbytes.com reporter Brian McWilliams was impersonating the terrorist group that ostensibly made the claim. McWilliams has owned harkatulmujahideen.org for 11 months under the false name "Abu-Mujahid of Karachi," operating the Web site as one seemingly run by Islamic terrorists. In fact, the Web site once did belong to a Pakistani terrorist group that became internationally known when it executed Wall Street Journal reporter Daniel Pearl. That group's official Web site is now at ummah.net.pk/harkat/, and contact information on the genuine site refers people to harkatulmujahideen.org. McWilliams says he impersonated terrorists and ran harkatulmujahideen.org in order to collect information for news stories, and says that creating a hoax Web site is easy, partially because "it's so easy to conceal...the ownership of a domain." Computerworld reporter Dan Verton, who reported the original hoax claim and whose story was picked up by other news organizations after first appearing online with Computerworld, says McWilliams' alias "Abu-Mujahid" contacted him by email to assert the claim. McWilliams went so far as to stage a supposed hacker attack that defaced harkatulmujahideen.org in order to reinforce the site's authenticity, a stunt that fooled security company Mi2g.com into thinking the site was legitimate. McWilliams now says he regrets taking his hoax as far as he did.
    http://www.computerworld.com/printthis/2003/0,4814,78238,00.html

  • "For the Smart Dresser, Electric Threads That Cosset You"
    New York Times (02/06/03) P. E5; Eisenberg, Anne

    Electrotextiles--electrically conductive cloth--are being developed so that they can be applied to many wearable products in both the military and civilian sectors. The cloth is fashioned from synthetic or metallic fibers that can be linked to processors and batteries, and Rensselaer Polytechnic Institute physicist Dr. Michael S. Shur projects that electronic functions will be incorporated into all types of clothing in 10 years' time. He also predicts that one day such devices will no longer depend on external power sources, but will have a built-in power supply, such as solar cells woven into the fabric. The conductive fibers used in electrotextiles can transmit signals, notes John D. Ross of DuPont's Advanced Fiber Systems, which has developed Aracon, a form of Kevlar clad in metal and insulated with a polymer to protect the wearer's skin; the material is used in a prototype stretch cotton T-shirt that is equipped to monitor the wearer's health, according to Dr. Sundaresen Jayaraman of the Georgia Institute of Technology. He says the shirt, which he is presently adapting for firefighters, wirelessly sends vital signs to a PC via a pager-like device. Meanwhile, Infineon Technologies has incorporated an MP3 player into a jacket and hood, using conductive textile to connect its electronic components. Malden Mills Industries' David L. Costello reports that his company has created a lightweight electric blanket out of its Polartec cloth, which features woven-in stainless steel conductive fiber. Most electrotextile technology is being developed for the military, an example being an antenna that Foster-Miller has woven into a soldier's uniform.
    http://www.nytimes.com/2003/02/06/technology/circuits/06next.html
    (Access to this site is free; however, first-time visitors must register.)

  • "The Fate of UCITA"
    InfoWorld (02/03/03) Vol. 25, No. 5, P. 49; Foster, Ed

    Ed Foster writes an open letter to the American Bar Association's (ABA) House of Delegates in which he urges them to kill the Uniform Computer Information Transactions Act (UCITA), which has long been a target of criticism from customers of IT products. He explains that none of the latest UCITA amendments proposed by the National Conference of Commissioners on Uniform State Laws (NCCUSL) has satisfied opponents, and notes that they fail to address many of the major concerns highlighted by the ABA's Working Group on UCITA last year. Foster observes that UCITA advocates sent the ABA House of Delegates materials that categorize the opposition as representative of three specific groups--consumer organizations, library associations, and insurance companies; but he adds that UCITA opposition "goes far beyond" those three sectors. He writes that one of the often overlooked segments of the opposition is IT professionals, and remarks that both the ACM and IEEE are among those who have repeatedly sided against UCITA. These groups object to the bill because it will worsen the quality of software, reduce the choices American IT customers have, negatively impact the competitiveness of American products in the global market, and hobble attempts to boost the security of the U.S. technology infrastructure. Foster notes that UCITA only has the support of one interest group--major providers of IT products and services--but points out that not all major providers support the measure. The biggest UCITA lobbyists are Microsoft, AOL, and perhaps a few companies heavily influenced by Microsoft, he suggests. He closes his letter by asking the ABA House of Delegates whether UCITA solves a real-world problem, a question to which he has never received a direct answer.
    http://www.infoworld.com/article/03/01/31/05gripe_1.html

    To learn more about UCITA, visit http://www.acm.org/usacm.

  • "Proving IT"
    CIO Insight (01/03) Vol. 1, No. 22, P. 40; Duvall, Mel

    Prior to committing to big IT investments, companies are now setting up test labs to prove the business value of tech projects. The in-house testing facilities are usually overseen by CIOs and other IT executives, while their personnel are made up of business people and technologists. Furthermore, solid business results must be demonstrated within 90 days. Stevens Institute of Technology professor Jerry Luftman notes that IT labs must follow several guidelines, including instituting management by both IT and business; keeping projects to 90- to 120-day timeframes; and being careful not to take on too many projects at once. Information Economics President Sunil Subbakrishna adds that scalability issues should be considered for every project, and technology must be properly engineered. Teamwork is also a requirement for most IT labs, where project success often hinges on the alignment of IT and business leaders with business goals. Bell Canada's Centre for Information Technology Excellence (exCITE!) lab evaluates proposed projects using a team of business and technology managers, and then develops them under strict regulations. Bell has tested and approved many IT initiatives through exCITE! that together have yielded over $25 million in savings and $10 million in additional revenue. Twenty-five percent of computer and communications manufacturers and 21 percent of government IT executives polled by CIO Insight report that they are beginning to require more pilot projects.
    http://www.cioinsight.com/article2/0,3959,841192,00.asp

  • "10 Emerging Technologies That Will Change the World"
    Technology Review (02/03) Vol. 106, No. 1, P. 33; Roush, Wade; Waldrop, M. Mitchell; Fairley, Peter

    New technologies with significant implications for computing, manufacturing, security, and other vital areas are being developed by some of the best minds in their fields. University of California, Berkeley, scientist David Culler is working on small monitoring "motes" that could pave the way for wireless sensor networks: The motes gather environmental readings and transmit them to each other, but this relatively simple operation could form the basis for low-power wireless networks that can monitor virtually everything. Grid computing, pioneered by Argonne National Laboratory's Ian Foster and the University of Southern California's Carl Kesselman, will one day enable people to access computational resources quickly, regardless of their location; the researchers co-developed the standard, open-source grid deployment protocol package known as the Globus Toolkit, which is being used in practically every major distributed computing project under construction. The combination of traditional mechanical systems with new electronic elements and intelligent software--a technology known as mechatronics--is already being employed by automotive researchers to boost vehicle safety and reliability, and at the forefront of mechatronics research is Darmstadt University of Technology engineer Rolf Isermann. The commercialization of nanotechnology will hinge on finding an inexpensive way to mass-produce nanoscale features, a problem Princeton University's Stephen Chou is working on by employing nanoimprint lithography; he has found a way to stamp out features smaller than 10 nm by heating a solid surface with a laser, and is busy formulating a way to build complex microchips using this method. Nancy Lynch and Stephen Garland of MIT's Laboratory for Computer Science have created a computer language and programming tools that support more rigorous software development, which could lead to better-quality software that in some cases literally means the difference between life and death. Quantum cryptography promises unbreakable security of electronic communications because any attempt to read quantum-encrypted data would be detectable. It is this quality that Nicolas Gisin of the University of Geneva says will be critical to e-government and e-commerce.
    http://www.technologyreview.com/articles/emerging0203.asp
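
    The mote behavior described above, in which each node takes a reading and relays what it holds one hop closer to a collection point, can be caricatured in a few lines. The sketch below is purely illustrative and does not represent the Berkeley motes' actual software; the topology and readings are invented.

        # Toy sketch of multi-hop forwarding in a sensor network: each mote adds
        # its own reading to a buffer and passes the buffer one hop toward the
        # base station (node 0). Topology and readings are invented.
        chain = [3, 2, 1, 0]                     # farthest mote first; node 0 is the base station
        readings = {3: 19.8, 2: 22.1, 1: 21.5}   # e.g. temperature readings at each mote

        buffers = {node: {} for node in chain}
        for i, node in enumerate(chain[:-1]):    # every mote except the base station
            buffers[node][node] = readings[node]          # add its own reading
            next_hop = chain[i + 1]
            buffers[next_hop].update(buffers[node])       # relay everything one hop onward
            print(f"node {node} -> node {next_hop}: carries readings from {sorted(buffers[node])}")

        print("base station received:", buffers[0])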

  • "Pssst! This Note's For You"
    Discover (02/03) Vol. 24, No. 2, P. 19; Johnson, Steven

    Mainstream adoption of Global Positioning System (GPS) technology has reached critical mass, as evidenced by unexpected uses far beyond what its inventors intended. Originally developed as a way for smart bombs to more precisely locate targets, GPS has evolved into a tool for sophisticated social activities such as geo-caching. Geo-cachers use GPS to determine the location of receptacles containing trinkets so that they can find them, take out items, and add new ones. It is believed that the biggest forthcoming breakthrough for GPS will be the geographical storage of information. This would allow, for instance, a person with a mobile device to access data left by others in a specific physical area--information about the region, restaurant recommendations, and even virtual notes from friends. Andrea Moed, creator of Annotatespace.com, believes that the pastime of storytelling could be revolutionized by such technology, and has built a walking tour of New York's Dumbo (Down Under the Manhattan Bridge Overpass) neighborhood supplemented with location-based content. Swedish researcher and GeoNotes co-creator Fredrik Espinoza believes location-tracking technology can support much more open and social systems than the prepackaged tourist guides that services such as Vindigo offer.
    http://www.discover.com/feb_03/feattech.html
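
    The location-based notes described above boil down to a proximity query: given the reader's current GPS fix, return the notes that were left within some radius of it. A minimal sketch of that lookup follows; the coordinates, note text, and 200-meter radius are invented for illustration.

        # Minimal sketch of a location-based note lookup: keep only the notes whose
        # stored GPS coordinates lie within a short walk of the reader's position.
        # All coordinates, notes, and the 200 m radius are invented.
        from math import radians, sin, cos, asin, sqrt

        def distance_m(lat1, lon1, lat2, lon2):
            """Great-circle distance between two GPS fixes, in meters (haversine formula)."""
            lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
            a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
            return 2 * 6371000 * asin(sqrt(a))

        notes = [
            (40.7033, -73.9881, "Best coffee on this block is around the corner."),
            (40.7527, -73.9772, "Meet us under the clock at noon."),
        ]
        here = (40.7030, -73.9890)   # the reader's current GPS fix

        nearby = [text for lat, lon, text in notes
                  if distance_m(here[0], here[1], lat, lon) < 200]
        print(nearby)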

  • "Tech's Biggest Battle"
    Fortune (02/17/03) Vol. 147, No. 3, P. 78; Kirkpatrick, David

    Intel plans to crowd IBM and Sun Microsystems out of the 64-bit server market with its Itanium 2 chip, but rivals may still be able to triumph by using the time between the announcement of Itanium and its long-delayed launch to deepen their roots and focus on other 64-bit server technologies. The runaway success and enduring popularity of Intel's 32-bit chips among corporate customers could hinder the company's plans to get its clients to transition to Itanium; other obstacles include Itanium's lack of application support and the fact that companies will have to jettison their existing software to implement the technology. Intel's biggest competitor is IBM, which was encouraged by the success of its 64-bit Power4 chip not to adopt Itanium, and is confident that it can keep up with Intel's manufacturing know-how. Furthermore, IBM has extensive knowledge of Itanium's limitations, partly because Intel provided information about the technology when it courted IBM as a prospective adopter. Advanced Micro Devices (AMD) is considered a dark horse by many analysts, but the April release of its Opteron chips, which remain compatible with existing Pentium software, may give it an edge, though the company will have to overcome a dearth of marketing and manufacturing savvy; AMD plans to embed Opteron in servers and later PCs, and offer them at much cheaper prices than Itanium. In the meantime, analysts believe Sun will have no choice but to adopt Itanium for its own survival, because they doubt that sales of its SPARC processor will climb. In Intel's corner is Hewlett-Packard, which has committed unswervingly to Itanium so that it can concentrate on polishing its servers and gaining extra revenue from the sale of Itanium chips. Dell Computer supports Itanium because it hopes to ascend the corporate-computing ladder and become a provider of high-end corporate hardware, but the company is also considering offering AMD's Opteron; IBM has already forged a deal with AMD to exchange technology and manufacturing expertise, which could lead to Opteron chips ending up in IBM factories. In the event this happens, Intel could fall back on the launch of another 64-bit chip, code-named Yamhill.
    http://www.fortune.com/fortune/technology/articles/0,15114,418480,00.html

 
                                                                             