
       HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM. To send comments, please write to [email protected].
Volume 5, Issue 575:  Monday, November 24, 2003

  • "Switching Allegiances in Computers"
    New York Times (11/24/03) P. C2; Markoff, John

    The Comdex PC supershow is shrinking while ACM's supercomputing conference (SC2003), held in Phoenix last week, is taking off: SC2003 is still just two-thirds the size of Comdex this year, but is growing rapidly while Comdex attendance dwindles. The shift reflects how the computer industry's attention has diverged, moving toward supercomputers capable of new tasks on one side and increasingly sophisticated everyday electronic devices on the other. The supercomputer industry has been rejuvenated by international competition, signified by the Japanese Earth Simulator, now the fastest computer in the world. The U.S. government is redoubling its efforts to build better supercomputers, not only to increase national competitiveness, but also to use in the fight against terrorism and to advance science. Another important aspect of the renewed supercomputer industry is the focus on grids, which IBM Deep Computing vice president David Turek says fundamentally changes the way people access computing power. Notably, Microsoft made its first SC2003 appearance and Apple Computer beefed up its presence after Virginia Tech researchers built an Apple G5 cluster that ranked No. 3 on the world's supercomputer list. The new interest in supercomputing is a vindication for IBM, the historical mainframe titan and maker of the processor used in the Apple G5; IBM's recently announced deal to make gaming processors for the next-generation Microsoft Xbox stirred talk about reconfigurable computing. IBM is looking into reconfigurable chips that would, for example, give Microsoft the flexibility to run existing software alongside newer applications. Industry consultant Rob Enderle says Comdex and SC2003 show how industry conferences have to keep abreast of technology trends or else be quickly marginalized.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Spam Pact Toughens Penalties, But Critics See a Lack of Muscle"
    Wall Street Journal (11/24/03) P. A3; Foley, Ryan J.; Clark, Don; Angwin, Julia

    Although the House has passed a compromise anti-spam bill that imposes tougher penalties against spammers and authorizes the FTC to create a no-spam list similar to the do-not-call registry, the commission reports that it has neither the technology nor the personnel to set up such a registry. The FTC could be ordered to establish such a registry with additional legislation, but the effort could run afoul of litigation; furthermore, FTC officials such as Howard Beales and Chairman Timothy Muris are highly skeptical that such a list would be successful. Under the bill, people who send fraudulent messages, such as those containing false advertising or bogus return addresses, could spend as much as five years in prison, and the most notorious spammers could receive fines of up to $6 million. The measure institutes an opt-out policy for people who receive spam, but ePrivacy Group's Ray Everett-Church describes the bill as having "all the earmarks of what we have seen fail in states that have passed anti-spam laws." Brightmail CEO Enrique Salem adds that enforcement will be more difficult than it would be with an opt-in policy in place. The legislation also forbids individual consumers from suing spammers, and supersedes state laws that grant consumers this privilege; consumer advocates argue that certain types of spam are legitimized under the legislation. Although the Direct Marketing Association approves of the idea of a national mandate to preempt a hodgepodge of state laws, it warns that a do-not-spam registry could hurt legitimate businesses. Despite critics' reservations, the Senate will probably approve the bill as early as Monday, and the president is expected to sign it into law soon after.

  • "Computer-Security Experts Challenge Researchers to Focus on Long-Term Solutions"
    Chronicle of Higher Education (11/21/03); Carnevale, Dan

    Purdue University's Eugene Spafford was one of five speakers at a recent Virginia conference who suggested strategies computer scientists could follow to implement long-term cybersecurity solutions. Spafford declared at a news conference that computer networks should be rethought to include embedded, effective, and easy-to-use security. However, he remarked that "Near-term needs are so pressing that they have soaked up most of the resources and most of the funding and left little for long-term thinking. It's an ongoing arms race in cyberspace." Spafford, who predicted that better network security will encourage people to engage in more online activity and create better services, identified four "grand challenges" that researchers should address within a decade: The elimination of spam, viruses, worms, and denial-of-service attacks; the development of tools to build large-scale, highly trustworthy networks; the creation of systems that give users the ability to comfortably control their privacy and security; and the design of risk-management analyses for computer systems that offer just as much reliability as financial investment risk-management analyses. Spafford also expressed hope that the federal government will allocate more funds to network security research. Other speakers at the forum included Sun Microsystems' Susan Landau, who noted that medical care could be significantly enhanced if security and reliability were incorporated into computer networks. The Virginia conference was held by the Association for Computing Machinery and the Computing Research Association, while the National Science Foundation used the event to announce that it would soon start accepting research proposals for improving computer security under its CyberTrust program.
    Click Here to View Full Article
    (Access for Paying Subscribers Only.)
    Eugene Spafford is co-chair of ACM's U.S. Public Policy Committee; http://www.acm.org/usacm

  • "E-Votes Must Leave a Paper Trail"
    Wired News (11/21/03); Zetter, Kim

    California Secretary of State Kevin Shelley announced Nov. 21 that all electronic voting systems in the state must be retrofitted to print out voter-verifiable ballots by 2006, enabling voters to ensure that their votes are properly recorded. Counties will not be allowed to buy any paperless e-voting system starting July 2005, and all machines will be required to produce a printed audit trail by July 2006. Shelley said the schedule he has laid out for deploying a statewide printed-receipt system gives plenty of time for new e-voting machines to be appropriately certified, adding that election officials and poll workers will have adequate time to be trained on the new systems, as well as to educate voters on how to use them. "This is one step toward strengthening [election integrity] and giving voters confidence that their vote was handled by the computer in the way they wanted it to be handled," proclaimed San Mateo County elections chief Warren Slocum. Shelley also called for the creation of a technical oversight committee, new security standards for e-voting machine manufacturers, software testing and auditing requirements, and random field testing on election days to guarantee that e-voting systems are operating smoothly. Shelley convened a task force of election officials, computer specialists, disabled community representatives, and members of the general public in February to debate e-voting security. Task force member and California Voter Foundation President Kim Alexander expects Shelley's actions to spur election officials in other U.S. states to implement similar measures. Another task force member, Stanford computer science professor David Dill, declared that Shelley's mandate "breaks the vicious circle where the vendors say they're not producing printers because they say there's no demand for them."
    Click Here to View Full Article
    To learn more about ACM's activities regarding e-voting, visit http://www.acm.org/usacm/Issues/EVoting.htm

  • "On the Web, Research Work Proves Ephemeral"
    Washington Post (11/24/03) P. A8; Weiss, Rick; Williams, Margot

    The Web's amorphous nature as an information repository--with Web pages expected to last only 100 days on average, according to the Internet Archive's Brewster Kahle--makes it an unreliable medium for scholars and scientists who reference articles and other pieces of data that exist online. A team led by physician Robert Dellavalle reported in last month's issue of Science that Internet references in footnotes from articles in the New England Journal of Medicine, Nature, and Science became increasingly inactive or unobtainable as time went on. "I think of it like the library burning in Alexandria," declared Dellavalle. "We've had all these hundreds of years of stuff available by inter-library loan, but now things just a few years old are disappearing right under our noses really quickly." Another recent study estimated that one-fifth of Internet addresses employed in a Web-based high-school science curriculum vanished over a year. Scientific footnotes are not the only things affected by the Web's fluidity--business, government, and other organizational data is just as liable to change addresses or disappear: In fact, David Worlock of Electronic Publishing Services reckons that one quarter of the 2,483 British government Web sites relocate to a new URL every year. The Internet Archive and Google are recording Web pages for posterity so that duplicates exist even after the originals have been pulled down. But the Web's ranks swell with an estimated 7 million new pages per day, putting electronic archivists in the unenviable position of playing catch-up. This has prompted others to design new indexing and retrieval systems that can find pages that have moved to new URLs.
    Click Here to View Full Article
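
    The decay described above is easy to check mechanically. As a rough illustration (not drawn from the studies cited, and using placeholder URLs rather than real footnotes), the Python sketch below issues an HTTP HEAD request for each cited address and reports which ones still answer:

      # Illustrative link-rot checker: report which cited URLs still resolve.
      # The CITED_URLS list is a placeholder; a real check would read the
      # footnote list from a file or database.
      import urllib.request
      import urllib.error

      CITED_URLS = [
          "https://www.acm.org/",
          "http://example.com/some-footnoted-page-that-may-have-moved",
      ]

      def is_reachable(url: str, timeout: float = 10.0) -> bool:
          """Return True if the URL currently answers with a non-error HTTP status."""
          request = urllib.request.Request(url, method="HEAD")
          try:
              with urllib.request.urlopen(request, timeout=timeout) as response:
                  return response.status < 400
          except (urllib.error.URLError, OSError):
              return False

      if __name__ == "__main__":
          for url in CITED_URLS:
              status = "alive" if is_reachable(url) else "missing or moved"
              print(f"{status:16} {url}")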

  • "Proposed Spam-Blocking Technology Is a Long Way Away"
    InternetWeek (11/21/03); Gonsalves, Antone

    The Anti-Spam Research Group (ASRG), along with an alliance of consumer email providers and other organizations, is attempting to control spam by developing and implementing sender-authentication solutions, but many are finding the challenge much more difficult than previously anticipated. ASRG co-chair John R. Levine admits that his predecessor Paul Judge's May forecast that some ASRG technologies would be ready for deployment within a few months was "too optimistic." The ASRG has no personnel and no budget--Levine says it is merely a coordinating body of anti-spam researchers affiliated with the Internet Engineering Task Force (IETF). Levine says that three major sender-authentication schemes are currently under consideration by his organization: Reverse MX, Sender Permitted From, and Designated Sender Protocol. All three proposals would permit a mail server receiving a message to query the email's originating domain as to whether the server that transmitted the message is authorized to send from that domain; at least a year will pass before the ASRG can submit one proposal to the IETF as a suggested international standard. Meanwhile, a commercial alliance that includes Yahoo!, America Online, Microsoft, and Earthlink was established in April to develop technology ahead of the ASRG's efforts. Their proposal calls for ISPs and any other body that runs its own domain name system (DNS) to use a private key in their mail servers to embed an encrypted code in the header of each outgoing email message; upon the mail's arrival at its intended destination, the receiving mail server would fetch the sender's key from the sender's DNS server to decode the header and authenticate the email's origin, while domains that send spam and other unwanted messages could then be blocked automatically. "What we really want to do is make sure that the Internet community is in agreement that this is a good solution, and an appropriate solution," says Yahoo! Mail's Miles Libbey. But the technology is in an early developmental stage and no general release deadline has been set.
    Click Here to View Full Article
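
    As a rough illustration of the idea shared by Reverse MX, Sender Permitted From, and Designated Sender Protocol--not the actual specifications--the Python sketch below checks whether the host delivering a message is on the sender domain's published list of authorized mail servers. The AUTHORIZED_SENDERS table is a stand-in for the DNS records a real implementation would query, and the domains and addresses are placeholders:

      # Minimal sketch of sender authentication: the receiving mail server
      # checks whether the host that delivered a message is authorized to
      # send for the claimed sender domain.
      AUTHORIZED_SENDERS = {
          # domain -> IP addresses its published policy allows to send mail
          "example.com": {"192.0.2.10", "192.0.2.11"},
          "example.org": {"198.51.100.5"},
      }

      def check_sender(claimed_domain: str, connecting_ip: str) -> str:
          """Return a coarse verdict for a single SMTP connection."""
          allowed = AUTHORIZED_SENDERS.get(claimed_domain)
          if allowed is None:
              return "neutral"   # domain publishes no policy; cannot authenticate
          if connecting_ip in allowed:
              return "pass"      # host is authorized to send for this domain
          return "fail"          # likely forged sender; candidate for rejection

      if __name__ == "__main__":
          print(check_sender("example.com", "192.0.2.10"))    # pass
          print(check_sender("example.com", "203.0.113.99"))  # fail
          print(check_sender("unknown.net", "192.0.2.10"))    # neutral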

  • "'Spyware' Steps Out of the Shadows"
    CNet (11/19/03); Borland, John

    Computer security researchers and policy makers are increasingly concerned about spyware, programs that lurk behind the scenes on people's computers, serving up pop-up ads or reporting computer activity to third parties. Rep. Mary Bono (R-Calif.) has introduced an anti-spyware bill that would help curb many of the most common types of spyware, such as the advertising programs bundled with popular free software like Kazaa and Grokster. The Consortium of Anti-Spyware Technology Vendors is also working to define spyware and adware so that more effective action can be taken against programs that fit those definitions. The software firms behind Ad-Aware and Pest Patrol lead the consortium and want to help other companies create non-infringing programs. Pest Patrol's Pete Cafarchio says clear definitions are needed so that companies have a standard to go by when using advertising in their software. Besides adware, some software vendors sell spying programs intended to record computer use, including typed passwords and other sensitive information; these programs are marketed as ways to monitor a child's or spouse's computer activity, but can be used for even more dangerous purposes. Experts say that programs such as these can penetrate corporate firewalls via the computers of employees working from home. This week, the Center for Democracy and Technology issued a report calling for legislation requiring clear disclosures and uninstall procedures for software programs.
    Click Here to View Full Article

  • "EU Cybercrime Agency Gets the Go-Ahead"
    IDG News Service (11/20/03); Meller, Paul

    A plan to form a European Network and Information Security Agency (ENISA) that would ease cooperation and data exchange pertaining to network and information security has gained the approval of European telecommunications ministers. ENISA will receive $39 million for its first five years and will work in support of the European Union's internal market. The group is set to start operations in January in Brussels, but will move to a permanent location later. ENISA will advise member states and the European Commission on security concerns, work to raise awareness of Internet security issues, and manage activities centered on assessing and managing risk. Telecommunications interests are pressing for ENISA to work with private groups. "Until today there has been no systematic cross-border cooperation or information exchange between the EU Member States," notes the European Commission, adding that the various states have made progress to different degrees with different approaches. "This is the challenge that the ENISA is set to meet," the commission states.
    Click Here to View Full Article

  • "The Future of Open Source in Security"
    EarthWeb (11/19/03); Bourque, Lyne

    Open source tools help network administrators develop more robust defenses against electronic infiltration, according to academic and industry attendees at the second annual Open Source Symposium held at Seneca College in Toronto. Though proprietary applications have benefits such as support agreements, open source technology provides security-minded administrators with the tools they need to innovate. A presentation on wireless security highlighted the number of open source accessibility testing tools available, including wavemon, airtraf, and wave stumbler, as well as network vulnerability tools such as Kismet, AirSnort, and Moxy. Using these tools, network administrators can view their wireless network from an outsider's perspective and adjust accordingly. Open source and computer security have a long shared history, as many existing tools have open source roots, including Nmap, SATAN, SAINT, SARA, GnuPG/PGP, and OpenSSL. Open source tools exist for virtually every security topic, and the ease with which they can be obtained has brought more people into the computer security field while, in many instances, sparing organizations the cost of building high-priced proprietary systems. Support for these technologies is available on mailing lists and forums, and often rivals that offered by traditional technical support. Integrating open source security technology also brings greater diversity and protection against particular vulnerabilities; attacks that exploit one vulnerability will more likely be isolated to a single server or service, unlike in a monoculture environment where products often share vulnerabilities.
    Click Here to View Full Article
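
    To illustrate, at the simplest level, the kind of probing that scanners such as Nmap automate, here is a minimal Python sketch that attempts TCP connections to a range of ports on a host the administrator controls and reports which ones accept. It is an illustration only, no substitute for the tools named above, and should be run only against systems you are authorized to test:

      # Minimal port-scan sketch: try TCP connections and record open ports.
      import socket

      def scan_ports(host: str, ports: range, timeout: float = 0.5) -> list:
          """Return the ports on `host` that accepted a TCP connection."""
          open_ports = []
          for port in ports:
              with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
                  sock.settimeout(timeout)
                  if sock.connect_ex((host, port)) == 0:  # 0 means connect succeeded
                      open_ports.append(port)
          return open_ports

      if __name__ == "__main__":
          # Scan the local machine's low ports as a harmless demonstration.
          print(scan_ports("127.0.0.1", range(20, 1025)))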

  • "Computers 'Hamper the Workplace'"
    BBC News (11/20/03)

    An iSociety report published on Nov. 18 caps a year of study with the conclusion that technology in the workplace is causing more problems than it is solving, primarily because of a "low tech equilibrium" in which workers lack tech skills and guidance, management is "naive," and tech support personnel are cut off from everyone else. The result is an "endemic annoyance" toward computers characterized by sober and stoic attitudes in which employees routinely accept technological failures and hold low expectations of technology's promised benefits versus what it actually delivers. The report's authors contend that "government, businesses and the technology industry must drive change, transforming workplaces from a mood of stoicism to optimism." The researchers found that when technical problems arise, people will either repair computers themselves if they have the knowledge or ask colleagues for help; failing that, they often criticize support staff, who are frequently disconnected from the rest of the workforce. Report co-author Max Nathan says the barriers between workers and technical support staff are cultural as well as structural and spatial. Around 40 percent of people said that support personnel practically speak a foreign language, according to the report. Nathan says spatial and cultural divides often lead management to isolate knowledgeable technical staff from the decision-making process, and he believes people should be given more opportunities to experiment with technology to make it work better for them.
    Click Here to View Full Article

  • "Segway Robot Opens Doors"
    Technology Research News (11/26/03); Smalley, Eric; Patch, Kimberly

    Cardea is a one-armed robot created by MIT researchers that can navigate hallways and open doors; it is built on the base of a Segway scooter, whose dynamic balancing is essential to keeping the machine's arm practical, according to MIT scientist Una-May O'Reilly. The five-foot, 200-pound Cardea prototype deploys a kickstand to prevent falling, while two cameras serve as a vision system and sonar sensors aid in navigation. Researchers plan to outfit the robot with two additional arms, heat sensors, and a head so that it can safely interact with people at human height; each arm will have six degrees of freedom and will be equipped with end effectors, or hands, while Cardea's vision system will be enhanced with improved panning ability and arm-vision coordination. As with precursor robots such as Cog and Kismet, Cardea will be programmed to learn in an unstructured environment through exploration, trained to recognize and manipulate objects, and taught to use vocal tones and facial expressions to interact with people. O'Reilly adds that researchers are investigating how to make the robot capable of recharging its own power supply. Cardea is one of a dozen federal and university projects involving robots built on Segway bases that were launched by the Defense Advanced Research Projects Agency. The Segway's dynamic balancing comes into play when Cardea's arm moves, which shifts the robot's center of mass. "Regardless of where the weight is on top of it...the platform is able to move with balance," O'Reilly notes.
    Click Here to View Full Article

  • "What is WSIS Getting At?"
    CircleID (11/19/03); Crawford, Susan

    Susan Crawford, assistant professor of law at Cardozo Law School, questions the rationale behind recent criticism of ICANN in relation to the planned WSIS meeting in December and worries that the organization is seeking too much control over the Internet. One accusation directed at ICANN centers on cybercrime and the protection of intellectual property rights, and Crawford notes a number of concerns related to such assertions, including the question of whether a government could actually run the Internet, especially through the DNS. "The most that governments will do will be to build walls between nations, requiring their ISPs to point only to approved sites," Crawford writes. Such actions, the author argues, really amount to setting up national Internets rather than actually controlling the greater system. Overall, the author questions the actual aims of the WSIS and whether ITU control of the Internet would in fact mean conditioning Internet access. "I really don't understand how there could be a common vision that will serve everyone's control needs AND provide Internet access to developing countries," Crawford writes. She says ICANN's operations are not linked to control needs or access, and the body should simply serve as a set of contracts and a forum for considering registry and registrar stability issues. If ICANN steps beyond that role, it will fail in the face of opposition, Crawford writes. What concerns the author is an apparent move toward an ICANN that tries to vie with the United Nations for influence and attempts to assert an opinion about material online and the publishing of identity information. In playing that role, ICANN would only encourage the United Nations to seek such control for itself.
    Click Here to View Full Article

  • "User-Friendly Gadgets in Pipeline"
    Nikkei Weekly (11/10/03) Vol. 41, No. 2106, P. 14

    Universities and electronics companies in Japan are working on user-friendly gadgetry, much of which is wireless and Internet-dependent. Sony Computer Science Laboratories has developed a CD player that can play CDs without removing them from their cases; the case is outfitted with a wireless integrated circuit that interacts with a home server where the music is stored. A student at Keio University has made a device that allows users to find specific CDs using sound cues: CDs are represented on a computer screen by particular icons, and when the user moves the cursor to an icon, the system plays the opening melody of the music on the CD. Meanwhile, Osaka University graduate student Yoichi Ikeda has developed a plastic bottle that "sucks in" data by placing the nozzle against a CD case; the bottle hardens as more data is consumed, and the user then inserts the nozzle into a speaker slot and squeezes the bottle to cause the music to play. The stored data is read on the bottle by a wireless integrated circuit. A wrist device devised by Toshiba's Corporate Research & Development Center can wirelessly control lights and other kinds of equipment when the user points to it, and Sony Computer Science Laboratories has developed similar technology to switch on appliances and telecommunications devices. The handheld gadget could one day enable users to transfer films from the video store to their home systems via the Internet. Sony Computer Science's Junichi Rekimoto reports that though Internet-accessing home information electronics systems should be commercially available in about three years, "no one would connect home appliances to the Internet if it's too troublesome to do so."
    Click Here to View Full Article

  • "IT on Wheels"
    InfoWorld (11/17/03) Vol. 25, No. 45, P. 63; Schwartz, Ephraim

    Fleet maintenance departments are taking advantage of telematics technology to build a reliable infrastructure to keep track of vehicles and other mobile assets. Such technology is being used to monitor driver behavior and vehicle location, both of which can significantly affect back-end systems. Driver behavior can be sent to the HR department for the purpose of training employees and evaluating their performance, while vehicle location can be fed into transportation, route management, and package-tracking systems. In addition, asset management figures derived from telematics can be channeled into accounting systems. Commercially available in-vehicle telematics applications fall into two categories: Driver-productivity applications that keep tabs on vehicle speed, crashes, location, and door-lock status; and diagnostic applications that collate information on vehicular elements such as braking systems, fuel usage, tire tread wear, etc. Qualcomm's OmniTRACS system routes diagnostic data from various in-vehicle monitoring programs to fleet managers at corporate headquarters, while IBM offers middleware such as WebSphere to tie the monitoring programs together. "Even though you can track down the location of a vehicle or get the diagnostic data, we haven't seen good software to leverage that data and turn it into an advantage," notes Gartner VP Thilo Koslowski. The enterprise will therefore need new business processes to accommodate additional data for telematics to work successfully.
    Click Here to View Full Article
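
    A hypothetical sketch of the data flow described above might look like the Python fragment below, which routes telematics readings to HR, route-management, or diagnostics systems depending on what they measure. The event fields and back-end system names are illustrative assumptions, not any vendor's actual product:

      # Illustrative telematics event router: map vehicle readings to the
      # back-end system that should consume them.
      from dataclasses import dataclass

      @dataclass
      class TelematicsEvent:
          vehicle_id: str
          kind: str      # e.g. "speed", "location", "braking", "fuel"
          value: float

      def route_event(event: TelematicsEvent) -> str:
          """Return the back-end system that should receive this event."""
          if event.kind in {"speed", "crash", "door_lock"}:
              return "driver_productivity"   # feeds HR training and evaluation
          if event.kind == "location":
              return "route_management"      # feeds transportation and package tracking
          if event.kind in {"braking", "fuel", "tire_wear"}:
              return "fleet_diagnostics"     # feeds maintenance and accounting
          return "unclassified"

      if __name__ == "__main__":
          events = [
              TelematicsEvent("truck-42", "speed", 71.0),
              TelematicsEvent("truck-42", "location", 0.0),
              TelematicsEvent("truck-42", "fuel", 18.5),
          ]
          for event in events:
              print(event.vehicle_id, event.kind, "->", route_event(event))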

  • "ISPs Take on DDoS Attacks"
    Network World (11/17/03) Vol. 20, No. 46, P. 27; Pappalardo, Denise

    Distributed denial-of-service (DDoS) attacks are increasing in frequency and intensity, but major ISPs expect to reduce the threat with new tools that anticipate and repel both true DDoS attacks and viruses and worms that mimic DDoS attacks. Paul Morville of Arbor Networks reports that though security breaches are on the rise, DDoS attacks have changed little in the last 12 to 18 months; the main shift in tactics is one away from relatively small assaults on individual hosts and toward larger, multiple-point sieges that target whole networks. AT&T, MCI, and Sprint are among the providers currently testing tools that detect DDoS attacks and DDoS-like worms and viruses: AT&T claims to have enhanced its backbone with proactive, network-based security and is considering anomaly detection equipment from Arbor, while meshing anti-DDoS technology from AT&T Labs with commercially available tools is also under investigation. AT&T director of IP security services Sanjay Macaw says there is no single solution for halting DDoS attacks--rather, the most effective strategy is to use a combination of tools to mitigate damages. MCI's Bob Blakely says his company will roll out a more "proactive" toolkit that will include network-wide anomaly sensing equipment. Sprint's John Pardun says the carrier intends to implement DDoS mitigation and intrusion-detection measures throughout its backbone in the next year, and adds that Sprint employs a "strong network-based platform" whose edge routers use stateful inspection to analyze traffic. MCI, AT&T, and Sprint plan to make their DDoS services available to customers for a fee. Many big ISPs are also part of an informal mail and voice-over-IP mailing list that network administrators often use to warn of attacks so ISPs can respond to them faster.
    Click Here to View Full Article
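
    As a rough, hypothetical sketch of the network-wide anomaly detection the carriers describe--not Arbor's or any provider's actual tooling--the Python fragment below keeps a moving per-destination traffic baseline and flags samples that far exceed it, which is one coarse signal of a flood:

      # Illustrative traffic anomaly detector: flag destinations whose rate
      # spikes far above an exponentially weighted moving baseline.
      from collections import defaultdict

      class RateAnomalyDetector:
          def __init__(self, threshold_ratio: float = 10.0, smoothing: float = 0.1):
              self.baseline = defaultdict(float)   # packets/sec baseline per destination
              self.threshold_ratio = threshold_ratio
              self.smoothing = smoothing

          def observe(self, destination: str, packets_per_sec: float) -> bool:
              """Update the baseline and return True if this sample looks anomalous."""
              base = self.baseline[destination]
              if base == 0.0:
                  # First sample for this destination: seed the baseline, no verdict yet.
                  self.baseline[destination] = packets_per_sec
                  return False
              anomalous = packets_per_sec > self.threshold_ratio * base
              # The moving average keeps the baseline adaptive to normal traffic growth.
              self.baseline[destination] = (1 - self.smoothing) * base + self.smoothing * packets_per_sec
              return anomalous

      if __name__ == "__main__":
          detector = RateAnomalyDetector()
          for rate in [100, 110, 95, 105, 5000]:   # the last sample simulates a flood
              print(rate, detector.observe("victim.example.net", rate))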

  • "Twilight of the PC Era?"
    Newsweek (11/24/03) Vol. 142, No. 21, P. 54; Levy, Steven

    The IT world is still reverberating from the shock of a 12-page Harvard Business Review article titled "IT Doesn't Matter," in which magazine editor and freelance writer Nicholas Carr argued that technology has turned into a commodity and no longer offers companies competitive advantage. Furthermore, he advised firms to "follow, don't lead." Mild-mannered Carr is not much of a threat to the IT industry himself, but the idea he crystallized is important and already entrenched in the minds of many CIOs and other corporate executives: CIO magazine publisher Gary Beach says IT buyers have a "we won't get fooled again" mindset that makes them question new systems and upgrades. There is also a growing disconnect between everyday people and the corporate IT world, since managers want to lock down PC functions while consumers demand more utility. People also are not seeing beneficial effects of increased productivity, other than the elimination of jobs and the ease with which companies can outsource work overseas. Still, the advent of the microchip, PC, and Internet all indirectly caused unforeseen waves of change that proved vitally important to companies. Napster is a good example of how the confluence of new technologies--more powerful PCs, PC storage, and the Internet--ignited broadband use, started the MP3 player market, and nearly toppled the entertainment industry; IBM vice president Nick Donofrio expects sudden innovations such as these to yield "six magnitudes of improvement over the next 35 years." Open Source Applications Foundation head Mitch Kapor says there is still tremendous room for improvement in software, and other experts see Web services and RFID spurring radical changes that companies had best keep up with. As Microsoft vice president Jeff Raikes puts it, "Who would you rather be--Wal-Mart or Sears?"
    Click Here to View Full Article

  • "Labs Look Ahead"
    eWeek (11/17/03) Vol. 20, No. 46, P. 60; Baltazar, Henry; Brooks, Jason; Caton, Michael

    EWeek Labs analysts note products and technologies that may generate a great deal of interest in the coming year as an economic recovery makes companies more willing to invest. Henry Baltazar writes that storage management software will be important, and companies could get a leg up by offering products that seamlessly combine tools to make storage management less of a headache. He adds that data security and recovery regulations will lead to more storage management products that offer "write once, read many" functionality and a heavier concentration on data migration tools. Jason Brooks anticipates the further penetration of open-source software into IT with the rollout of Sun's Java Desktop System, the Red Hat Enterprise Linux line, and the 2.6 version of the Linux kernel. In the area of security, Jim Rapoza predicts that new products and appliances that mesh security applications and interfaces into user-friendly systems will emerge in the next year, while Cameron Sturdevant thinks available patch management systems could decline in number as patch management is integrated with software distribution systems, and that patch testing research is likely to increase as well. Anne Chen expects the diversification of mobile computing to accompany users' migration from dial-up to broadband and their increasing use of WLAN hot spots and ad hoc wired connections. She also foresees greater use of cell phones and personal digital assistants to link to enterprise-class applications, while companies will continue to invest in laptops and Wi-Fi-enabled tablet computers; in addition, Chen believes mesh networking systems, ultra-wideband, and 802.15.4 will be heavily hyped. Francis Chu writes that the server management sector will experience advances in server automation, resource and asset management, and convergence technologies in 2004, while server virtualization technologies will maintain their momentum in server consolidation initiatives.


