Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published every week on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 6, Issue 633:  Monday, April 19, 2004

  • "Supercomputer Hacks Highlight Ed Security Challenge"
    IDG News Service (04/16/04); Roberts, Paul

    Under pressure from government regulations, increased user demands, Internet-borne attacks, and even legal threats from the private sector, universities are turning to advanced security technologies such as intrusion prevention systems. Universities have historically tried to keep their network infrastructures as open and accessible as possible, but new threats are making that obligation especially onerous: For example, hackers recently broke into the Linux and Solaris supercomputer systems at Stanford University using stolen IDs and passwords, then took advantage of shared folders on the system that were maintained to facilitate data sharing and system management. Unlike companies whose main network task is to protect information, universities act as ISPs facilitating access for users; this job is made more difficult now that students are constantly taking their laptop computers home, where they are often exposed to malware, and then plugging them back into the school network when they return. Boston College has begun using home-grown tools to quarantine infected computers, forcing students to play a more active role in campus network security. University of Georgia chief information security officer Stan Gatewood says some of his school's departments recently deployed a commercial messaging platform from Mirapoint in order to better manage spam email, and notes that managing university IT environments is a politically sensitive task since there are so many stakeholders. The need to balance different groups' needs is driving network management tools that make it easy to provision specific services with as little overhead as possible. Federal and state regulations are also playing a role in determining university IT policy and priorities, as are legal advisories from the music and movie industries concerning illegally traded material on campus networks. Some universities have begun to segment their networks in order to better manage competing needs, cordoning off student dorm networks, for example.
    Click Here to View Full Article

  • "Weighing the Results of PC Recycling"
    CNet (04/16/04); Spooner, John G.

    Next month Dell plans to publicly disclose its goal to boost the amount of hardware it recycles by 50 percent by weight of materials collected, which could spur more PC recycling as well as set up a common recycling metric for the computer industry in general. Another benefit of such a plan would be to silence critics who charge that the PC industry's recycling efforts have been sluggish. Dell is establishing recycling goals with the assistance of Calvert Group, the As You Sow Foundation, and other organizations, which warned the company that its shareholders could suffer if the PC manufacturer failed to address the hazards of electronic waste. Dell, IBM, and Hewlett-Packard all have PC recycling programs, but analysts say the volume of gear they recycle is relatively small: Dell estimates that since its first recycling program was established over 10 years ago, only 2 million out of the tens of millions of PCs it has shipped have been recycled; an IBM Global Financing representative claims that the number of PCs his company recovers weekly has risen from 15,000 in 2002 to 22,000 in 2003; and HP reports that it recycles almost 80 million pounds annually. However, the EPA reckons that 250 million PCs will be discarded between 2002 and 2007. Companies are also racing against the clock to prove that state and federal regulation of PC recycling is not necessary. Dell, HP, and IBM's recycling initiatives still leave plenty of room for improvement: A corporate IT executive poll by IBM Global Financing in January 2003 found that many respondents did not realize that their PCs' manufacturers probably offered a recycling program, while a Dell survey of business and institutional customers reported similar numbers. "As people become more and more aware of the environmental risk associated with [computers]...they'll be more likely to start getting them out of the stockpiles," predicts Pat Nathan, who directs Dell's "sustainable business" programs.
    Click Here to View Full Article

  • "Segway Battlefield Applications Explored"
    Union Leader and New Hampshire Sunday News (04/18/04); Morris, Jeanne

    The U.S. Defense Advanced Research Projects Agency (DARPA) has sent at least 15 modified Segway Human Transporters to researchers at MIT, Carnegie Mellon University, Stanford University, NASA, the Neurosciences Institute, and elsewhere with the goal of developing them into thinking, reasoning machines for battlefield operations. The Segways commissioned by DARPA are equipped with a platform that permits researchers to move the machines around without having people riding or steering them, or even directing them by remote control. The Segway Robotics Mobility Platform features laptops that let researchers develop software and control an array of navigational sensors. It is DARPA's aim that the platform be developed into robots that can cut the acquisition and maintenance costs of military systems, widen the scope of military hardware, and facilitate a dramatic shift in the speculation, design, construction, and employment of military systems. Researchers are focusing not only on making the robots capable of reasoning, but also on having them learn and improve performance through experience, communicate with humans, and take verbal orders. Carnegie Mellon is using the Segways to further its goal of building robots that can play soccer with people, and Carnegie Mellon computer science professor Manuela Veloso believes such a breakthrough could be applied to military reconnaissance and search-and-rescue missions. Computer science professor and scientist Oliver Brock of the University of Massachusetts says a platform outfitted with a visual system and arms to manipulate objects is in the works. Meanwhile, MIT researchers have made a Segway robot that can open doors and move through corridors.
    Click Here to View Full Article

  • "FTC to Look Closer at 'Spyware'"
    Washington Post (04/19/04) P. A4; Noguchi, Yuki

    Privacy advocates are in a furor over "spyware" and "adware," which is often installed on Windows PCs along with many popular programs--free music and file-sharing programs, for example--that users download off the Internet, sometimes without the user's awareness. The FTC will investigate the hazards of spyware at an April 19 workshop in Washington, D.C., focusing particularly on whether criminals will exploit such programs to steal users' Social Security and credit-card numbers, notes Howard Beales of the FTC's consumer protection division. Most spyware and adware programs are apparently used to track consumer preferences, but privacy experts and anti-spyware vendors warn that such programs can compromise consumers' control over their PCs, as well as impinge on their privacy by acting as surveillance tools for advertisers. Beales and many privacy proponents admit that the installation of adware is often permitted in licensing agreements users are required to consent to in order to download popular programs--agreements that many consumers do not fully read. Pest Patrol's Roger Thompson says a distinction has yet to be made between benign and malign spyware use, noting that the relatively low incidence of "malicious" spyware behavior does not erase the fact that such programs "[open] a back door that allows computers to be updated by the hacker and accept commands to log keystrokes, read files, or turn on the Web cam." U.S. legislators including Sens. Barbara Boxer (D-Calif.) and Conrad Burns (R-Mont.) have proposed a bill that would ban the installation of software on a PC without user notice and consent, and require that such software be easily removable.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Making Software Customisation a Commodity"
    IST Results (04/15/04)

    A new open-source software development platform allows organizations to more effectively manage variants of open-source applications. While version management software allows companies to keep track of and manage different versions of a vendor product, it does not adequately address the issue of in-house customization, says MECASP coordinating partner Remi Coulon. The project is funded by the European Commission's Information Society Technologies program, and has been placed under the GNU General Public License. Coulon says open-source software evolution is not linear like vendor-managed software and requires a more sophisticated variant management framework, especially since many of the changes result from a company's own customization efforts. MECASP is made up of three layers: a development database recording all software changes, a visual interface, and internal editors and compilers. The platform is also extremely flexible, able to run on different operating systems and in mixed environments, and applicable to applications built in different languages, such as XML or C++. MECASP partners presented their solution last December, showing how a CRM vendor could use the platform to track and manage changes to interface, logic, or database components. The platform effected changes at different levels, even allowing developers to make byte-by-byte adjustments. MECASP is especially well-suited for firms that maintain online services operating on customized software, according to MECASP lead developer Elaine Isnard. The group intends to produce a full-featured version if it is embraced by the open-source community.
    Click Here to View Full Article

  • "Could Open Source Elections Close Out Hanging Chads?"
    NewsForge (04/05/04); StoneLion

    Electronic voting systems based on open-source software are a better alternative to those using closed-source proprietary software, according to many computer experts. A local government in Ontario recently used a Linux-based system in its elections that was developed by the local firm CanVote; with an interactive voice response system at the core, the electronic voting system allowed citizens to cast ballots both by Web and phone, and spurred 52 percent voter turnout. CanVote developer and local resident Joe Church says Linux was chosen because it was cheap, easy to customize, and secure. Open-source electronic voting systems have also played a role in larger elections, such as in Australia in 2001, where 8.6 percent of ballots were cast electronically for the Legislative Assembly election: Australia worked with Software Improvements to develop the eVACS system, which is based on a pared-down version of Debian Series 1 and does not provide paper receipts; the limited functionality of the system provides for greater security, and each polling station is simply a network of simple PC terminals. The central eVACS server cleanses its hardware upon loading, and the code has been subjected to outside scrutiny. University of California, Berkeley, assistant professor David Wagner says the Australian model provides some guidance for the U.S. government, which has to date largely failed in implementing secure electronic voting systems. Wagner recently co-authored a report criticizing the U.S. Department of Defense's Secure Electronic Registration and Voting Experiment (SERVE) for overseas voters; the program was shelved, partially because government officials could not ensure the security of the proprietary code. Wagner suggested a compromise could involve code inspection by both major political parties, and that a verifiable paper receipt would be absolutely necessary to gain the trust of the public. "We may find ourselves with no chad, hanging or otherwise," he said. "I'm not sure that's an improvement."
    Click Here to View Full Article

    For more on e-voting, visit http://www.acm.org/usacm.

  • "New Web Protocol May Leave DSL in the Dust"
    NewsFactor Network (04/15/04); Martin, Mike

    North Carolina State University computer science researchers boast that current high-speed digital subscriber line (DSL) connections are positively "lethargic" compared to Internet connections using binary increase congestion transmission control protocol (BIC-TCP). NCSU's Jon Pishney says that a recent Stanford Linear Accelerator Center report shows that BIC-TCP was No. 1 in terms of stability, scalability, and fairness when compared with six other new protocols in a series of experiments. NCSU computer science professor Injong Rhee estimates that BIC is about 6,000 times faster than DSL and 150,000 times faster than existing 56K modems; he explains that in a typical scheme information is gathered at a remote site and is transmitted to and shared by various labs, which can cause traffic jams on current networks. NCSU's Khaled Harfoush, who developed BIC with Rhee and Lisong Xu, likens employing TCP to transfer data across high-speed networks to "using an eyedropper to fill a water main." BIC's speed is rooted in its binary search scheme, in which maximum network capacities are quickly detected while keeping data loss to a minimum. "What takes TCP two hours to determine, BIC can do in less than one second," proclaims Rhee, who adds that national disasters such as the recent power outage in the eastern United States and Canada could be staved off with such an approach. Meanwhile, Pishney says that BIC could be a valuable tool for national and international computing labs engaged in astronomy, meteorology, geology, and nuclear and high-energy physics research.
    Click Here to View Full Article
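    The binary search scheme described above can be sketched in a few lines. This is a simplified illustration, not NCSU's actual implementation: the parameter names and values (S_MAX, BETA, the example window sizes) are assumptions chosen for clarity. The idea is that after a loss, the protocol knows the capacity lies between a reduced window (the floor) and the window where loss occurred (the ceiling), and it probes the midpoint of that range each round trip instead of creeping upward linearly.

    ```python
    S_MAX = 32    # cap on per-round-trip window growth, in segments (illustrative)
    BETA = 0.125  # multiplicative decrease factor after loss (illustrative)

    def bic_on_loss(cwnd):
        """Packet loss: the window where loss occurred becomes the search
        ceiling, and the multiplicatively reduced window becomes the floor."""
        w_max = cwnd
        cwnd = cwnd * (1 - BETA)
        return cwnd, cwnd, w_max

    def bic_update(cwnd, w_min, w_max):
        """One loss-free round trip: jump toward the midpoint of the search
        range, but never by more than S_MAX segments at once."""
        midpoint = (w_min + w_max) / 2
        cwnd += min(midpoint - cwnd, S_MAX)
        return cwnd, cwnd, w_max  # the new window becomes the search floor

    # A loss at 1,000 segments, then ten loss-free round trips: the window
    # climbs back toward the detected capacity without overshooting it.
    cwnd, w_min, w_max = bic_on_loss(1000)
    for _ in range(10):
        cwnd, w_min, w_max = bic_update(cwnd, w_min, w_max)
    ```

    Because each step halves the remaining search range, the window converges on the capacity estimate in logarithmically many round trips, which is the sense in which "what takes TCP two hours to determine, BIC can do in less than one second."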

  • "Testing Times for Women in IT"
    VNUNet (04/14/04); Mortleman, James

    Information technology testing has become an increasingly attractive area of the IT profession for women. Five years ago, women accounted for just five percent of IT testers, but today they represent more than one-third. Now, according to Vizuri, a risk management and recruitment company, if women continue to enter the IT industry at their present pace, they will account for three-quarters of IT testers by 2006. Long dominated by men, IT testing is one of the most technical areas of the IT profession. However, IT testing is also very rewarding in terms of career opportunities, salary, and benefits. "Testers love a challenge, so the incentive of breaking the IT industry's glass ceiling is a compelling one," says Vizuri's Paul Dixon. "This surge is merely the start of women's increasing role in this sector--we're sure there'll be plenty more to come." Soft skills, such as interacting with clients and customers, have been an advantage for women as IT testers.
    Click Here to View Full Article

    For more on women in IT, visit ACM's Committee on Women in Computing, http://www.acm.org/women.

  • "Making the World Safe for Free Software"
    Salon.com (04/15/04); Manjoo, Farhad

    Daniel Egger, a partner at the venture capital firm Eno River Capital, wants to establish the legality of Linux so that his startup firm Open Source Risk Management can begin to offer insurance protection for users of the open source operating system. Egger's efforts are a response to the legal action that SCO Group, a small software company in Lindon, Utah, has taken against Chrysler and AutoZone for using Linux. SCO claims that its Unix software has been added to Linux, which essentially makes the Linux software an illegal copy of its intellectual property. OSRM is inspecting the operating system's code and has hired paralegal Pamela Jones to help the firm with its legal strategies. SCO has asked companies to pay it for the right to use Linux, and also has a $5 billion case against IBM, claiming its engineers stole code from its software and added it to the free operating system. Don Marti, editor of the Linux Journal, says some companies are putting off their plans to abandon Unix for Linux until the situation surrounding SCO's legal charges has been resolved. In the meantime, OSRM has embarked on an effort to certify everything inside Linux and where it came from. While open-source developers maintain that there is nothing dangerous about using Linux, Jones says OSRM is providing "a way for the community to fight and win against future nuisance lawsuits" that are sure to come.
    Click Here to View Full Article

  • "Internet Governance Debate Heats Up"
    IDGnet New Zealand (04/16/04); Bell, Stephen

    The process by which the Internet will be governed in the future has become a hot topic, generating plenty of debate in political and industry circles alike. Some countries are arguing that governments should play a bigger role in overseeing operational aspects of the Internet such as the management of the root DNS servers and the administration of the domain name space. Internet guru Vint Cerf made a solid case for the continued openness of the Internet at last month's United Nations ICT taskforce global forum, remarking that any Internet governance plans should focus on the use or misuse of the medium, not its technical operations. In defending his point, Cerf pointed out that the technical aspects of the Internet are evolving transparently, in a manner open to all stakeholders. Rules on using the Internet, on the other hand, are less clear and might need some attention, Cerf said, though he cautioned against stifling "the innovation and freedom to create that the Internet offers." The issue of Internet governance must be focused on areas that need government intervention, says Richard McCormack, the honorary chairman of the International Chamber of Commerce, who points out that the number of Internet users has doubled to 850 million in just four years. ICANN CEO Paul Twomey pointed out that the Internet governance working group to be established by UN Secretary-General Kofi Annan has a mandate that is nearly identical to ICANN's. Lyndall Shope-Mafole, chairwoman of South Africa's National Commission on Information Society and Development, took a view opposing Cerf's, arguing that the technical underpinnings of the Internet should be subject to government oversight to ensure the legitimacy of the process.
    Click Here to View Full Article

  • "A Voting Revolution in India?"
    Business Week (04/19/04) No. 3879, P. 53; Kripalani, Manjeet

    India expects to stop election fraud, cut costs, accelerate the electoral process, and boost voter turnout from 60 percent to 70 percent by deploying $200 electronic voting machines that are simple and easy to use. Voters enter their choice by pushing a button on a keypad next to the name and symbol of their candidate, and their choices are recorded on a chip in the control unit. The voting machine's manufacturers insist their product boasts superior security: Data cannot be decrypted and printed without a court order, while Bharat Electronics general manager R. Jagannathan maintains that the device features a microprocessor containing reprogramming-proof software. It is expected that the machines will save India's government as much as 10,000 tons of ballot paper in each national election, and the Election Commission believes the products will make electoral results available within a day as opposed to three. In addition, a single machine can record nearly 4,000 votes, whereas an individual ballot box can only support 600 votes. The e-voting machines will also make elections more transparent, which could present a problem without careful oversight: Many Indian voters have exhibited hostility toward people who vote against their candidates, sometimes to the point of committing violence. To avoid such incidents, electoral officials will need to closely monitor elections. A million of the machines will be used for national elections held between April 20 and May 10.
    Click Here to View Full Article

  • "Testing 101: AWOL on the College Campus"
    Software Development Times (04/15/04) No. 100, P. 1; Correia, Edward J.

    Few college and university computer science curricula include testing for quality, which is a vital ingredient in software development, according to industry experts. Go Pro Management President Robin Goldsmith maintains that college curricula overemphasize people and project management, leaving quality assurance in the lurch: "There's a notion that a project manager should not be directing attention to hands-on skills associated with testing, requirement analysis and design because if they are busy doing that, they are not doing project management," he notes. "Requirements determine what needs to be done, and testing determines if it's being done right." Gartner research director Theresa Lanowitz adds that software quality testing and career goals deserve equal consideration, while both she and Goldsmith agree that the scope of computer science curricula should be expanded to more fully include testing. "Colleges should have parallel tracks on how to put quality code to use by testing against requirements," asserts Lanowitz. A common argument against including software testing in curricula is that testing does not constitute a real career, according to Boston University's Azer Bestavros; Lynn Robert Carter of Carnegie Mellon University's Institute for Software Research International adds that software QA testers usually earn less money and are more likely to have their jobs outsourced. Florida Institute of Technology computer science professor James Whittaker strongly believes that companies must place more value on quality if they are to have commercially successful products. He concludes that quality will be the chief differentiator for applications once those applications and development tools become equal in terms of time-to-market and feature sets.
    Click Here to View Full Article

  • "The Makings of a Do-It-Yourself Supercomputer"
    Network World (04/12/04) Vol. 21, No. 15, P. 1; Leung, Linda

    The Flashmob1 supercomputing project at the University of San Francisco drew hundreds of computer enthusiasts together in an attempt to build one of the world's fastest computers in just one day. While the goal to break into the Top 500 supercomputing list was not met, the participants did manage to reach a 180-gigaflop benchmark. The full potential of the 700 machines donated was not met because a single failure broke the system and disrupted the benchmark testing. Special Flashmob software containing the supercomputing instructions and monitoring tools was installed on all the machines, but Flashmob organizers were surprised by network problems on the client side, such as improperly rated network interface cards or the Flashmob software attempting to use WLAN cards. Hewlett-Packard experts on hand explained that normal supercomputer clusters are tested over months before achieving their final benchmark scores. Flashmob organizers estimated it would take about four hours or less of benchmarking for the system to achieve a place in the Top 500 supercomputer list, which will next be published in June. The most successful Flashmob benchmarking attempt, achieving 180 gigaflops, used just 256 computers and lasted 70 minutes before a failed node brought the whole system down again. Although the effort fell short of its initial goals, it did prove that a powerful supercomputer capable of performing serious scientific research could be set up ad hoc within communities or schools, and that such resources were not just relegated to research laboratories and large institutions. University of San Francisco Flashmob organizers have already been contacted by numerous other universities about setting up another Flashmob supercomputing effort.
    Click Here to View Full Article

  • "Happy Memories"
    New Scientist (04/10/04) Vol. 182, No. 2442, P. 28; Daviss, Bennett

    Motorola has given other electronics manufacturers prototypes of a chip to evaluate that could one day start a PC instantly, let batteries last all day, never lose data, and operate at high speed. The magnetoresistive random access memory (MRAM) chip is the same technology Motorola demonstrated last October that some observers believe is the first step in enabling computer engineers to build memory into a processor chip. "Once you can put memory and processing physically together, then you can do away with disc drives and other forms of memory," says Gary Prinz, director of the Nanoscience Institute at the U.S. Naval Research Laboratory in Washington, D.C. "To do that, the memory has to update its data as fast as the processor requires, which magnetic memory can do." The technology makes use of a forgotten computer memory technology, metal ring memory, in the form of a magnetic tunnel junction device, in which layers of magnetic metals sandwiching an insulator can control the current flowing across the insulator. The electronics industry still must find a way to make the tunnel junctions small enough and cheap enough. But in theory, researchers would be able to build a complete computer onto a single chip and embed it into all kinds of everyday items. The technology represents a $40 billion annual market in instant-on computers that do not lose work, longer space missions, and faster and more powerful cell phones and PDAs. Although the technology has its skeptics--Unity Semiconductor President Darrell Rinerson says MRAM small enough for mass markets is not stable enough--others say it has promise. Jack Judy, at the University of Minnesota's Center for Micromagnetic Technologies, says, "You might need to make a major change in the structure to make it work better, but there's no doubt that clever engineers can solve these problems."

  • "Spam to Go"
    Technology Review (04/04) Vol. 107, No. 3, P. 22; Roush, Wade

    Spam is invading text messaging, with the volume of spam text messages originating in North America outstripping legitimate messages last year, according to messaging firm Wireless Services. The European Union, Japan, South Korea, and California have all passed laws to try to stem the tide, and Congress has told the Federal Communications Commission to create rules to protect cell phone users from unsolicited text messages, which can cost users money if their carrier charges for messaging. Both wireless companies and software vendors are acting on their own as well, fearing that mobile spam will discourage users from subscribing to new data services. Advanced 3G networks in South Korea, Japan, Europe, and parts of the United States allow multimedia messages and are already employing "opt-in" systems to help prevent unsolicited multimedia messages. Wireless Services, which shuttles text messages between U.S. carriers' networks, introduced software last year that builds on techniques used to block email spam, including Bayesian filtering and a quarantine system. The company wants to make its filters customizable for users. Lucent Technologies has a prototype that lets carriers create online menus so customers can specify what kinds of messages they want to receive, and when. Rick Hull, Bell Labs' director of network data and services research, says, "If the consumer can block a merchant from viewing his location information, the merchant has no idea they're passing by." Lucent says that such technology will be integrated with its existing Internet infrastructure software within a year.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)
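    The Bayesian filtering the article mentions can be illustrated with a toy naive-Bayes scorer. Everything here is an assumption for the sake of the sketch--the training messages, the whitespace tokenizer, and the scoring threshold are illustrative, not Wireless Services' product. The principle is simply that words seen more often in known spam than in known legitimate ("ham") messages push a message's score toward spam.

    ```python
    import math
    from collections import Counter

    # Toy training corpora (illustrative; a real filter trains on large samples).
    spam_msgs = ["win free ringtone now", "free prize claim now"]
    ham_msgs = ["meet for lunch at noon", "running late see you soon"]

    def train(msgs):
        """Count word occurrences across a corpus and return (counts, total)."""
        counts = Counter(w for m in msgs for w in m.split())
        return counts, sum(counts.values())

    spam_counts, spam_total = train(spam_msgs)
    ham_counts, ham_total = train(ham_msgs)
    vocab = set(spam_counts) | set(ham_counts)

    def spam_log_odds(message):
        """Sum of per-word log-likelihood ratios with add-one smoothing;
        a positive score means the message looks more like spam than ham."""
        score = 0.0
        for w in message.split():
            p_spam = (spam_counts[w] + 1) / (spam_total + len(vocab))
            p_ham = (ham_counts[w] + 1) / (ham_total + len(vocab))
            score += math.log(p_spam / p_ham)
        return score
    ```

    A quarantine system of the kind described would hold messages whose score exceeds some threshold rather than deleting them outright, and per-user customization amounts to letting subscribers adjust that threshold or the training data.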

  • "Paint the Light Fantastic"
    Discover (04/04) Vol. 25, No. 4, P. 24; Johnson, Steven

    Animating organic objects--clouds, trees, hair, fire, skin, and the like--in a computer is a tough challenge, since even an untrained eye can usually see through the artifice, often in a single glance. The two key components of computer modeling are the object's shape and the way light bounces off the object; the shape is usually defined by generating digitized wire frames, and then adding surface texture data. Illuminating objects realistically is the function of ray tracing, in which the computer calculates the independent trajectories of millions of photons, and then figures out how those trajectories would lead into a viewer's pupils. The more complex the computer-generated object is, the more unpredictable its interaction with light can be. To render hair realistically in the movie "Shrek" and its sequel, visual-effects supervisor Ken Bielenberg discovered that shadows play a crucial role, because without them hair glows artificially. Subsurface scattering is another unique light phenomenon animators must take into account if they wish to avoid characters with lifeless skin. Phenomenology offers a shortcut for accommodating these various light effects algorithmically. For "Shrek 2," the animators decided to render trees realistically by simulating their entire growth cycle, starting with digital seeds. Purdue University Rendering and Perceptualization Lab director David Ebert observes that modeling fire is even more complicated: Not only does fire contain particles that both reflect and emit light, but its gaseous content discharges transparent light, while particles of dust and soot give the flame a shadow. Recreating natural objects will become increasingly routine as computer-generated imagery plays an ever more important role in entertainment.
    Click Here to View Full Article
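    The core step a ray tracer repeats millions of times--testing where a light ray strikes a surface and how brightly that point is lit--fits in a few lines. This is a minimal sketch of a single ray-sphere intersection with Lambertian (diffuse) shading; the scene geometry and function names are illustrative, and a production renderer layers shadows, subsurface scattering, and many other effects on top of this.

    ```python
    import math

    def normalize(v):
        """Scale a 3-vector to unit length."""
        n = math.sqrt(sum(c * c for c in v))
        return tuple(c / n for c in v)

    def intersect_sphere(origin, direction, center, radius):
        """Return the nearest positive ray parameter t, or None on a miss.
        Assumes `direction` is unit length, so the quadratic's a == 1."""
        oc = tuple(o - c for o, c in zip(origin, center))
        b = 2 * sum(d * o for d, o in zip(direction, oc))
        c = sum(o * o for o in oc) - radius * radius
        disc = b * b - 4 * c
        if disc < 0:
            return None
        t = (-b - math.sqrt(disc)) / 2
        return t if t > 0 else None

    def shade(point, center, light_dir):
        """Lambertian term: brightness falls off with the angle between the
        surface normal and the direction toward the light."""
        normal = normalize(tuple(p - c for p, c in zip(point, center)))
        return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

    # Camera at the origin looking down -z at a unit sphere centered at (0, 0, -3):
    ray_dir = normalize((0.0, 0.0, -1.0))
    t = intersect_sphere((0.0, 0.0, 0.0), ray_dir, (0.0, 0.0, -3.0), 1.0)
    hit = tuple(t * d for d in ray_dir)
    brightness = shade(hit, (0.0, 0.0, -3.0), normalize((0.0, 0.0, 1.0)))
    ```

    Running this test for every pixel, bouncing rays recursively off each hit point, is what makes complex organic objects so expensive to render: every extra surface multiplies the number of unpredictable light interactions.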

  • "The Sensor Web: A Distributed, Wireless Monitoring System"
    Sensors (04/04); Delin, Kevin A.

    The NASA/Jet Propulsion Laboratory's (JPL) Sensor Web Project was designed as an instrument made up of multiple sensor platforms, or pods, that share information among themselves and perform as a single unit for the purposes of environmental monitoring and/or control. A Sensor Web pod comprises a radio that connects each pod to its local neighborhood; a microcontroller to house the system's protocols, control the radio, and perform data analysis when required; a battery pack that uses solar energy to recharge; and a sensor suite based on the particular application. The Sensor Web's communications protocol supports both omnidirectional and bidirectional information flow, and Sensor Web information falls into four distinct classes of data: raw data detected at a particular pod, postprocessed sensed data from a single pod or cluster of pods, commands entered into the distributed instrument by an outside end user, and commands entered by a pod. The network does not distinguish between the last two types of data, which makes the Sensor Web both field-programmable and self-adapting; bidirectional communication enables portal pods to be connected via satellite or the Internet so that the end user of a specific Sensor Web may be another web, extending a local web outside its geographical limits. NASA/JPL has been fielding Sensor Web technology in assorted environments, such as Florida coast wetlands, desert regions in New Mexico and Arizona, Antarctic ice sheets, and the Huntington Botanical Gardens in California. These implementations demonstrate that conventional metrics for assessing wireless sensor systems are not always applicable. The spread of Sensor Webs into various user communities will foster a demand for associated sensors to occupy these systems, and a push not only to improve existing sensors' fidelity, size, and cost, but also to supply new sensors that can meet new monitoring needs.
    Click Here to View Full Article
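    The four data classes the article enumerates can be sketched as a small message model. The class names, field names, and message layout below are illustrative assumptions, not NASA/JPL's actual protocol; the point of the sketch is the design property the article highlights: the network handles a command the same way whether it came from an outside user or from a pod, which is what makes the web field-programmable and self-adapting.

    ```python
    from dataclasses import dataclass
    from enum import Enum, auto

    class DataClass(Enum):
        RAW = auto()            # raw data sensed at a particular pod
        POSTPROCESSED = auto()  # processed data from one pod or a cluster
        USER_COMMAND = auto()   # command entered by an outside end user
        POD_COMMAND = auto()    # command originated by a pod itself

    @dataclass
    class PodMessage:
        source: str             # identifier of the originating pod or user
        data_class: DataClass
        payload: dict

        def is_command(self):
            """User and pod commands are treated identically by the network;
            only the contents matter, not who issued them."""
            return self.data_class in (DataClass.USER_COMMAND,
                                       DataClass.POD_COMMAND)

    # A pod retuning its neighborhood's sampling rate on its own initiative:
    msg = PodMessage("pod-07", DataClass.POD_COMMAND, {"sample_interval_s": 60})
    ```

    Under this model, a portal pod forwarding traffic over satellite or the Internet needs no special cases: any message whose `is_command()` check passes is applied, wherever it originated, so one Sensor Web can drive another.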