
Compaq is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, Compaq offers performance-packed products and comprehensive services.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Compaq or ACM.

To send comments, please write to [email protected].

Volume 4, Issue 347: Friday, May 10, 2002

  • "Judge Rules Copyright Law Constitutional"
    SiliconValley.com (05/08/02); Files, Jennifer

    In ruling that the Digital Millennium Copyright Act (DMCA) is constitutional, U.S. District Judge Ronald M. Whyte has denied a motion filed by Russian software maker ElcomSoft to dismiss a case alleging that it violated the DMCA by distributing a software program that copies electronic books. Because the DMCA controls the function of software, as opposed to its content, it does not violate the First Amendment, Whyte ruled. Consequently, he said there is no "generally recognized First Amendment right" for consumers to make backup copies of e-books or other online materials. Lawyers involved in the case stress the importance of Whyte's decision, since it represents the first ruling on the DMCA's constitutionality in a criminal court. The ruling has agitated proponents of online free speech. "What good are the public's rights if the tools needed to make fair use or access works in the public domain are illegal?" argues Electronic Frontier Foundation legal director Cindy Cohn. ElcomSoft lawyer Joseph Burton warns that Whyte's decision could put firms that distribute security tools such as password recovery software at even greater risk of prosecution. Whyte has scheduled a May 20 hearing to set a trial date for the ElcomSoft case.
    http://www.siliconvalley.com/mld/siliconvalley/3225820.htm

  • "At Senate Hearing, Cyberterrorism Fears on the Rise"
    Computerworld Online (05/09/02); Verton, Dan

    At a recent Senate hearing, Senate Governmental Affairs Committee Chairman Sen. Joseph I. Lieberman (D-Conn.) stressed the urgent need to secure the nation's critical infrastructure from terrorist attacks, particularly those carried out online. John Malcolm, head of the Justice Department's Criminal Division, went so far as to say that hacking could translate into physical attacks, such as a cyberterrorist commandeering and crashing an airplane by exploiting flaws in the FAA control system. The private sector controls 90 percent of the U.S. critical infrastructure, and the focus of the hearing was the exchange of information between that sector and the government. "The future battlefield is in private, not public hands," explained Sen. Robert Bennett (R-Utah), who presented a bill designed to encourage more information sharing by preventing private-sector data from being unwittingly revealed under the Freedom of Information Act. Notably absent from the hearing were the dissenting opinions of terrorism experts who believe the country is investing too much in a threat that is not imminent. Former CIA chief of counterintelligence Vince Cannistraro, for example, argues that terrorists see little incentive in cyberattacks because no actual violence or bloodshed is involved. Former CIA profiler Eric Shaw said, "Considering all possible threats is a nice, creative process but there is little evidence to suggest its practical benefit, other than funding of security-related projects that may not be needed."

  • "House OKs Stiffer Cybercrime Penalties"
    Reuters (05/08/02)

    A bill that toughens penalties for cybercriminals and increases Internet user surveillance has passed unanimously through the House Judiciary Committee. The proposal would require the U.S. Sentencing Commission to consider such factors as the motives of cybercriminals and whether sensitive government computers were involved in their misdeeds. If convicted for knowingly or recklessly committing a cybercrime that jeopardizes human lives, perpetrators could be sentenced to life in prison; current law imposes punishments based upon the amount of economic damage caused by cybercrimes, and the guilty receive minor jail sentences, if any. Under the proposed legislation, ISPs would be allowed to report non-immediate as well as immediate network threats without fear of legal reprisals, but must preserve electronic records for at least 90 days or face penalties. The Center for Democracy and Technology has criticized the legislation, sponsored by Rep. Lamar Smith (R-Texas), claiming that electronic privacy could suffer if government agencies have the power to force ISPs to disclose their records without a search warrant.
    http://zdnet.com.com/2100-1106-903235.html

  • "House Proposal Would Double Engineering Research Spending"
    EE Times Online (05/08/02); Leopold, George

    A bill introduced Tuesday in the House Science Committee would eventually double the budget of the National Science Foundation (NSF), which funds a large percentage of the nation's basic research in engineering and technology. Backers of the new proposal say the new funding level would set the NSF's budget on par with that of the National Institutes of Health, which President Bush said would receive double its budget for basic medical research by 2003. The new money proposed for the NSF would equal 15 percent increases for three consecutive years, effectively doubling the current budget at the end of the three-year period. The NSF money would boost "nanoscale science," networking research, educational efforts, new laboratory construction, and the purchase of research equipment. The engineering trade group IEEE-USA says the money would enhance the nation's development of nanotechnology, homeland security, IT, and electrical technologies. The NSF currently assists in training 25,000 graduate students annually, and backs 46 percent of basic engineering research done at U.S. universities.
    http://www.eetonline.com/sys/news/OEG20020508S0023

  • "Tiny Triumph for Science"
    Washington Post (05/10/02) P. A14; Gugliotta, Guy

    A German-led research team has successfully converted light into mechanical energy on the molecular level for the first time, which could have significant ramifications for nanotechnology. The experiment, reported today in Science, concerns a microdevice built in a diving board-like array, with a single molecule of the plastic azobenzene connecting the underside of the board to a glass sheet. When exposed to light waves of differing frequency, the molecule contracts or expands, pulling the board down or pushing it back up. However, the molecule eventually wears out and breaks after a certain amount of usage, and research team member Hermann E. Gaub admits the material's resiliency will need to be improved before it is ready for the marketplace. A physicist at the University of Munich's Nanoscience Center, Gaub believes the azobenzene molecule could prove beneficial to the development of ever-shrinking sensors and chemical analysis tools, or could be used as an alternative to circuitry. "You need to wire circuits," he explains. "Shining light on something is more straightforward."
    http://www.washingtonpost.com/wp-dyn/articles/A62712-2002May9.html

  • "Cold Comfort for Chip Talk"
    Nature Online (05/08/02); Ball, Philip

    A team of researchers at the TRW Space and Electronics Group has taken a notable step toward the development of faster and cheaper electronics by devising circuits based on niobium, a material that becomes superconducting at approximately nine degrees above absolute zero. This property, which is induced through the application of liquid helium, allows niobium chips to transmit data at least three times faster than conventional silicon chips, effecting rapid communications between superconducting microprocessors. Despite this breakthrough, Konstantin Likharev of the State University of New York at Stony Brook maintains that superconducting electronics still has a long way to go. Economic factors are likely to impede the technology's progress. Chipmakers would need to make a heavy investment in order to transition to a new building material and manufacturing process, while the liquid helium cooling requirement would translate into massive overheads. "Most chips don't need high performance," says Likharev, who adds that the technology will probably be relegated to niche markets at present.
    http://www.nature.com/nsu/nsu_pf/020506/020506-2.html

  • "Is Wi-Fi Heading Down the Wrong Track?"
    802.11 Planet (05/09/02); Sutherland, Ed

    FCC officials say that complaints about interference on the 2.4 GHz band from Wi-Fi devices are increasing along with the spread of the technology. Wi-Fi (802.11) operates in an unlicensed portion of the radio spectrum and is therefore not subject to FCC license control. Still, television stations, amateur radio groups, and satellite communications companies have all lodged complaints with the FCC, as have end users, who often misunderstand the technical nature of Wi-Fi technology. The head of the Spectrum Management Working Group, Dewayne Hendricks, says Wi-Fi is merely an intermediary technology and has recommended that the FCC begin regulating its use more strictly before it further degrades the performance of licensed devices. However, Hendricks is a known backer of the competing Ultrawideband wireless standard that recently won approval for limited use by the FCC. Wireless Ethernet Compatibility Alliance Chairman Dennis Eaton says criticisms of Wi-Fi are unfounded because the benefit of widespread wireless broadband is already apparent and Wi-Fi standards have interference protections built in.
    http://www.80211-planet.com/news/article/0,,1481_1107451,00.html

  • "On Internet of the Future, Surfers May Almost Feel the Spray"
    New York Times (05/09/02) P. E5; Taub, Eric A.

    The Integrated Media Systems Center of the University of Southern California is developing a tool that could revolutionize the subjective experience of video and act as a driver for the next-generation Internet. "You can stop looking at the imperfections in the picture and sound and completely lose yourself in the event," boasts center director Ulrich Neumann. The tool, known as Remote Media Immersion, combines a large video display with a surround-sound audio system that could prove particularly attractive to consumers, as their demand for home theater systems demonstrates. The video portion is recorded on high-definition cameras and compressed onto a hard drive using MPEG-2 compression technology; the picture remains far sharper than conventional video even though the raw data rate of 1.5 Gbps is compressed to 45 Mbps. The 10.2 audio system consists of five speakers positioned around the front of the display, and provides more precise sound origination than the 5.1 Dolby Digital standard used by DVDs. The programming for the system would be delivered on demand, and consumers would choose the programming they wish to view via a Web browser. However, the current Internet cannot accommodate the system's required transmission speed of 60 Mbps, so a new, high-bandwidth version will need to be built.
    http://www.nytimes.com/2002/05/09/technology/circuits/09NEXT.html
    (Access to this site is free; however, first-time visitors must register.)
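    The bandwidth figures in the item above can be sanity-checked with a little arithmetic. This sketch, using only the numbers cited in the article, computes the MPEG-2 compression ratio and the margin left between the 45 Mbps video stream and the 60 Mbps the full system is said to require:

    ```python
    # Back-of-the-envelope check of the Remote Media Immersion figures
    # cited in the article (raw HD rate, compressed rate, system requirement).

    RAW_RATE_MBPS = 1500     # 1.5 Gbps raw high-definition video
    COMPRESSED_MBPS = 45     # MPEG-2 compressed rate cited in the article
    SYSTEM_REQ_MBPS = 60     # total transmission speed the system needs

    compression_ratio = RAW_RATE_MBPS / COMPRESSED_MBPS
    headroom_mbps = SYSTEM_REQ_MBPS - COMPRESSED_MBPS  # budget left for 10.2 audio, overhead

    print(f"Compression ratio: {compression_ratio:.1f}:1")
    print(f"Non-video budget:  {headroom_mbps} Mbps")
    ```

    The roughly 33:1 compression ratio explains why the picture can still outclass conventional video: even after compression, 45 Mbps is more than ten times the bit rate of a typical DVD stream.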

  • "3D Printers Could Become Industrial Dynamos"
    NewsFactor Network (05/09/02); Hirsh, Lou

    MIT scientists led by Professor Emmanuel Sachs are developing a 3D printing technology with support from the National Science Foundation. Sachs says the technology is based on the principle that any configuration can be constructed out of stacked layers. The process uses layers of powdered ceramic, metal, or plastic composites only 30 to 200 microns thick, and MIT says that ink jets deposit a binding substance to cement the layers together. The technology is already being applied to industry in the production of 3D product models and plastic prototype casts, according to Z Corporation marketing manager Karen Kiffney. She says if actual parts could be manufactured by a printer, then components could be produced on demand, significantly reducing both shortages and surpluses of inventory. The technology could be used to make auto parts, for example, but it could also be applied to the medical industry--the construction of artificial bone grafts, for instance. The process is being refined to print out chambered pills and scaffolding for human tissue grafts. Other products the technology will likely be used to manufacture include metal components, ceramic molds, power plant filters, and electronics.
    http://www.newsfactor.com/perl/story/17669.html

  • "A Visual Rather Than Verbal Future"
    Washington Post (05/09/02) P. E1; Walker, Leslie

    A popular conceit is that speech will become the primary mode of communication between humans and computers, but University of Maryland computer science professor Ben Shneiderman sees things a little differently: He argues that data visualization technologies will offer users better computer control. Shneiderman says lab research proves that the act of speaking uses the same part of the brain as short-term and working memory, so speaking and thinking simultaneously is a difficult proposition. Visual interfaces have no such limitations, and the University of Maryland's Human-Computer Interaction Lab, which Shneiderman founded, is developing a number of projects. TimeSearcher is one such project--users can drag a graphical box over a vast database, expanding or shrinking it to better understand relationships; results are displayed on a nearby panel instantly. Another tool is dynamic search query software that allows users to refine searches and test various what-if scenarios using graphical sliders. Meanwhile, lab director Ben Bederson has developed PhotoMesa, software that enables users to view large collections of image directories as thumbnails, and then zoom in on specific images or groups. The lab's work is being applied to the real world--Wall Street analysts are trying out the timeboxes for stock analysis, although Shneiderman thinks the tool could be even more useful for analyzing genomic data; and Massachusetts-based Spotfire and drug companies are using the dynamic query sliders. If Shneiderman's predictions about visual tools pan out, it could either mean fewer software agents for the next-generation Internet or a new way of controlling them.
    http://www.washingtonpost.com/wp-dyn/articles/A56499-2002May8.html

  • "Why PC Design Must Change"
    ZDNet (05/10/02); Papadopoulos, Greg

    Sun Microsystems CTO Greg Papadopoulos reasons that computer design needs to be rethought if we are to ride out the next networking wave, in which trillions of things--light bulbs, environmental sensors, and radio-frequency identification tags--will be linked to the Internet to the point where the connection will become an essential part of daily life. In the third networking wave, the flow of information will change direction and move back to the Internet rather than originate from it. For the past few decades, microprocessors, disks, memory, and input/output have been basic system elements. That will change, Papadopoulos predicts, into a system model comprised of computers, storage systems, and IP networks. This architectural revamp will provide terabits of bandwidth, exabytes of data storage capacity, billions of IP connections, and microprocessors with dramatically higher scalability. "We must learn--and are in fact learning--to virtualize the elements of the network to create a single pool of resources that can be dynamically allocated, matching resources to services on the fly," Papadopoulos writes. This development will hasten the automation of change management, the reduction of total cost of ownership and system complexity, and the improvement of resource utilization. It will also significantly cut down the development cycle of distributed applications, according to Papadopoulos.
    http://zdnet.com.com/2100-1107-908769.html

  • "Tournaments Become Latest High-Tech Recruiting Tool"
    Boston Globe (05/06/02) P. C1; Lewis, Diane

    Employers are using programming competitions as a tool to find and assess the most qualified candidates for IT jobs--a relatively new trend, according to a random poll of recruiters in Boston. George Herchenroether of TopCoder says that his company hosts weekly programming matches that registered students can participate in to build up a ranking; those who rank high enough may become eligible to join a tournament that gives the winner a $100,000 grand prize and a listing as the top national college programmer. The ranking system also lists the coders' specific accomplishments so that recruiters can measure their abilities and competence. Sun Microsystems' Reggie Hutcherson says the contests offer participants an opportunity to refine their skills and test their mettle against professional programmers. However, Aquent COO Chris Moody says that such competitions--which rank people mainly by their technical know-how--may not necessarily be the best way for employers to find the ideal candidate. "Rarely are companies looking for [candidates with superior technical skills]," he explains. "Many also want cultural fit and experience fit...A person could be brilliant and not be right for the job."

  • "Autonomic Computing"
    Scientific American Online (05/06/02); Gibbs, W. Wayt

    An anonymous IBM manifesto issued last October contends that human intervention and administration cannot keep up with the advancing complexity of the IT infrastructure. Computer firm and university research leaders meeting at an April conference in Almaden, Calif., agreed with this assertion, although solutions to the problem vary. The manifesto urges that the best solution is autonomic computer systems that function in the same capacity as the involuntary nervous system. Such systems possess a sense of self, keep tabs on their components, and separate their public and private elements. A Columbia University effort has yielded systems with automated monitoring, tuning, and repair capabilities through the addition of software probes, gauges, and configuration controls. Active performance improvement, another trait of autonomic systems, lies at the heart of IBM's experimental Oceano system, which uses optimization algorithms to manage servers. Autonomic systems should also be able to heal themselves in the event of damage, and a Stanford University team led by Armando Fox has retooled a satellite ground station that can reboot its subsystems independently and come back online in six seconds rather than 30. Meanwhile, a recovery-oriented computing effort at the University of California, Berkeley, follows the belief held by David Patterson and his colleagues that computer system operations should not be concealed from human operators.
    http://www.sciam.com/explorations/2002/050602autonomic/

  • "Foundation Launches Middleware for 'Grid Computing'"
    NewsFactor Network (05/08/02); McDonald, Tim

    U.S. scientists will be able to more easily use grid computing applications with new middleware from the National Science Foundation (NSF). NMI Release 1.0 is a set of grid computing tools that enhances security and usability; it is part of the NSF's effort to develop a national middleware infrastructure for engineering, education, and science. Besides the Globus Toolkit and Network Weather Service, the tools include the eduPerson program, which allows researchers to more easily access information available on the network, regardless of platform compatibility or location. The NSF Middleware Initiative (NMI) bundle also comes with an authentication and authorization tool called Shibboleth. The release is aimed at making grid computing easier to use, as well as bridging the gap between divergent systems. Although most of the emerging U.S.-based computing grids use Globus open source protocols, more standards need to be established in order to ensure compatibility.
    http://www.newsfactor.com/perl/story/17645.html

  • "No Crisis Over 1,024-Bit Encryption"
    VNUNet (05/02/02); Middleton, James

    At the Financial Cryptography conference in March, cryptography experts claimed that 1,024-bit encryption was "compromised," but RSA Security says the situation has been misinterpreted. Burt Kaliski, director of RSA Laboratories, believes that estimates performed by the experts were done quickly, that they have proven to be significantly inaccurate, and that encryption is as reliable as ever. Kaliski says that although a machine could be built by the end of the decade that could break 1,024-bit encryption, such a machine would only be created if it offered a high return on investment. Kaliski says that 1,024-bit encryption still enjoys widespread industry support and that it will stand up "for a few years yet." He also says that as the U.S. relaxes encryption export rules, stronger encryption will follow.
    http://vnunet.com/News/1131452
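    The key-length debate above concerns RSA-style public-key encryption, whose security rests on the difficulty of factoring the public modulus. A toy sketch with deliberately tiny primes (the real systems under discussion use 1,024-bit moduli, which are astronomically harder to factor) shows the mechanics:

    ```python
    # Toy RSA with tiny primes -- illustrative only. The article's debate
    # is about 1,024-bit moduli; these values are chosen so the arithmetic
    # is easy to follow, not for any security.

    p, q = 61, 53                  # two small primes (a real key uses ~512-bit primes each)
    n = p * q                      # public modulus (3233); factoring n breaks the key
    phi = (p - 1) * (q - 1)        # Euler's totient of n
    e = 17                         # public exponent, coprime to phi
    d = pow(e, -1, phi)            # private exponent via modular inverse (Python 3.8+)

    message = 42
    ciphertext = pow(message, e, n)     # encrypt: m^e mod n
    recovered = pow(ciphertext, d, n)   # decrypt: c^d mod n
    assert recovered == message
    ```

    Breaking the key means recovering p and q from n alone; trivial at this size, but the cost grows so steeply with key length that, as Kaliski argues, a 1,024-bit-breaking machine would only be built if the payoff justified it.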

  • "Open for Business"
    InformationWeek (05/06/02) No. 887, P. 38; Ricadela, Aaron

    Although the open-source Linux system is making headway in the corporate sector, its entry is still complicated by trust issues, incompatibility with existing applications, inadequate documentation, and a lack of technical and business software support. An InformationWeek Research Web survey estimates that 80 percent of companies running Linux use it to serve Web pages, almost 60 percent use it for application development, and 58 percent use it for file and printer sharing. Two-thirds of respondents at 80 firms without Linux say they refuse to employ the system because it cannot operate key enterprise software. Meanwhile, IT vendors are hoping that Linux will become the platform of choice for utility and grid computing applications, given its ability to run on multiple microprocessors. The system is inexpensive and easily modified, factors that have contributed to its growth; in the survey, over 85 percent of technology managers who use Linux or intend to use Linux in the next 12 months are expecting higher numbers of Linux server licenses this year. However, many businesses remain averse to turning their toughest computing chores over to Linux, a reluctance that the more risk-tolerant scientific community does not share. Still, progress is being made: Over 80 percent of survey respondents who use or plan to use Linux say they will probably run key software on the platform, while 41 percent expect to run business applications on Linux in the next year.
    http://www.informationweek.com/story/IWK20020503S0009

  • "Why So Few Women?"
    IEEE Spectrum Online (05/02); Applewhite, Ashton

    New York University's Margaret Wright and Columbia University's Kathleen McKeown are disturbed by declining numbers of female computer science majors, and they say this trend sets in well before college. An interest in computers could help women secure IT jobs that offer generous salaries and exciting career opportunities, as well as the chance to refine developing technologies to better suit them. Wright and McKeown argue that a subtle form of job discrimination is taking place, one based on clashing communication styles: Wright says that in the male-dominated computing culture, extroversion and unabashed promotion of one's own accomplishments are taken as signs of intelligence, whereas the low-key, self-effacing approach that women may use indicates a lack of same. To improve this situation, certain gender markers should be exposed, such as the male perception that success only comes from a singular obsession with computing, a viewpoint that contrasts with women's need to have a balanced home and work life, according to authors Jane Margolis and Allan Fisher. McKeown contends that mentors will help women better understand how to achieve this balance. Wright and McKeown have implemented or plan to implement interdisciplinary programs at their institutions, so that people--women included--can see the multiple applications of computer science. Wright believes that computing should have a greater role in the curriculum, while both she and McKeown agree that students must learn basic computing principles if they are to advance to more sophisticated systems quickly. The educators also value academic-industrial collaborative ventures.
    http://www.spectrum.ieee.org/WEBONLY/resource/may02/care.html

    To learn more about ACM's Women in Computing, visit
    http://www.acm.org/women.

  • "What's Next For the Database?"
    Intelligent Enterprise (05/09/02) Vol. 5, No. 8, P. 28; Stodder, David; Kestelyn, Justin

    A roundtable discussion between NCR Teradata CTO Stephen Brobst, IBM Fellow Donald J. Haderle, Ken Jacobs of Oracle, and Stanford University professor Jeffrey D. Ullman offers insights on the future development of database technology. Brobst says that "Databases today are not only responding to simple queries, retrievals, and updates: There's enough intelligence in the database servers that they are often the ones making decisions, creating events, and then pushing the results back out to the applications." Although they generally agree that database problems such as online transaction processing and batch decision support have been rectified, the technology must be further developed so that it can more rapidly respond to business requirements, blend decision support workloads against an organization's single source of truth, and automate management and maintain high availability; Jacobs cites progress being made in this last area. The panelists agree that SQL has a long and healthy life ahead of it, even with XML making gains, and Jacobs believes the language is developing to the point where it can simultaneously handle structured and unstructured data. Brobst forecasts that data mining will be almost completely internal in two to five years. Jacobs says the shared disk model will support "a highly scalable, available, and manageable environment that will run on...new emerging hardware architecture," while Brobst maintains that a shared-nothing environment offers more efficiency for databases from a hardware perspective.
    http://www.intelligententerprise.com/020509/508feat1_1.shtml

  • "Hack-Proof Chatting"
    Discover (05/02) Vol. 23, No. 5, P. 26; Smalley, Eric

    Quantum encryption will presumably represent the ultimate in secure, hacker-proof communications, and research groups around the world are competing to make quantum encryption a reality. In quantum cryptography, a sequence of horizontally and vertically polarized photons conveys a key that encrypts and decrypts messages, and these photons can be polarized in four directions; any eavesdropper who tries to uncover the message would be found out, since determining the polarizations would require equal amounts of measurement and guesswork that can be easily detected. The main challenge is creating a quantum encryption-enabled communications network. A Los Alamos team led by physicist Richard Hughes is working on a "free space" system in which photons are transmitted through the air and encrypted messages can be sent to any point on earth by bouncing laser pulses off a satellite. IBM Almaden Research Center researcher Donald Bethune says such a system could be ideal for relaying secret messages of a diplomatic or military nature. Quantum cryptography will not be able to enter the mainstream until the efficiency of photon emitters and detectors is improved and quantum boosters are added. Hughes says that wireless optical communications systems currently in use can easily be supplemented with single-photon encryption. Meanwhile, Nicolas Gisin and colleagues at the University of Geneva have formed a company, id Quantique, through which they hope to sell prototypes of a quantum key distribution system based on fiber-optic cable.
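    The polarized-photon scheme described above is essentially the BB84 key-distribution protocol. A minimal simulation (random bases, then "sifting" out mismatched measurements; names and structure here are illustrative, not from the article) shows why an eavesdropper's forced guessing is detectable:

    ```python
    import random

    # Minimal BB84-style sketch: the sender encodes bits in one of two
    # polarization bases; the receiver measures in randomly chosen bases;
    # both keep only the positions where their bases matched ("sifting").
    # An eavesdropper measuring in the wrong basis randomizes the photon's
    # state, introducing errors the parties can detect by comparing samples.

    random.seed(1)                 # fixed seed so the sketch is reproducible
    N = 64
    alice_bits  = [random.randint(0, 1) for _ in range(N)]
    alice_bases = [random.choice("+x") for _ in range(N)]  # '+' rectilinear, 'x' diagonal
    bob_bases   = [random.choice("+x") for _ in range(N)]

    def measure(bit, send_basis, meas_basis):
        # Matching basis: the measurement is certain. Wrong basis: random outcome.
        return bit if send_basis == meas_basis else random.randint(0, 1)

    bob_bits = [measure(b, sb, mb)
                for b, sb, mb in zip(alice_bits, alice_bases, bob_bases)]

    # Sifting: keep only positions where the two bases agreed.
    sifted_alice = [b for b, sb, mb in zip(alice_bits, alice_bases, bob_bases) if sb == mb]
    sifted_bob   = [b for b, sb, mb in zip(bob_bits, alice_bases, bob_bases) if sb == mb]

    assert sifted_alice == sifted_bob   # no eavesdropper: sifted keys agree exactly
    print(f"Shared key bits: {len(sifted_alice)} of {N} photons sent")
    ```

    About half the photons survive sifting; if an eavesdropper had measured in between, roughly a quarter of the surviving bits would disagree, which is exactly the "easily detected" guesswork the article describes.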

 