Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 6, Issue 616:  Wednesday, March 10, 2004

  • "Lingering Job Insecurity of Silicon Valley"
    New York Times (03/09/04) P. C1; Lohr, Steve; Richtel, Matt

Programmers and software engineers had no trouble moving from job to job during the technology boom of the 1990s, but despite promises that an economic recovery has begun and increased demand for workers is on the horizon, tech professionals left in the lurch by the downturn are still hurting--and in many cases, still jobless. The unemployment rate for computer scientists peaked at 5.2 percent in 2003, while the Bureau of Labor Statistics estimated that unemployment among electrical engineers, at 6.2 percent, was its highest in two decades; the jobless rate for all workers averaged 6 percent last year, compared to about 4 percent in 2000. Ronil Hira of the Rochester Institute of Technology attributes the lingering unemployment to a combination of corporate cost-cutting measures, including outsourcing to overseas contractors, automation, and a focus on squeezing more efficiency out of existing tech investments, all of which reduce demand for tech workers. "The productivity and cost-cutting are impressive but it means less hiring," observes Merrill Lynch analyst Steven Milunovich. Many economists attest that outsourcing has not heavily affected U.S. jobs and salaries in general, and argue that everyone will benefit from the trend, provided that American workers can update their skills and relocate to new jobs if need be. Indeed, Erik Brynjolfsson of MIT's Sloan School of Management reports that automation of labor has far more serious consequences for U.S. tech jobs than outsourcing. Opsware, for example, has helped companies save money and reduce the need for employees by providing software that helps run data centers more efficiently. A hopeful sign comes from an IBM survey of over 450 CEOs, 80 percent of whom said their companies planned to redirect their energies into new tech projects to spur growth this year.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Robotic Race Gets Off to Rocky Start"
    MSNBC (03/08/04); Boyle, Alan

The Defense Advanced Research Projects Agency (DARPA) is offering $1 million to the team that creates an autonomous vehicle that can successfully navigate over 200 miles of rough desert terrain in 10 hours without human assistance. More than 20 teams will compete in the race, known as the Grand Challenge, scheduled for March 13. Qualifying trials began at the California Speedway on March 8; eight robot vehicles were tested on the first day, and none completed the 1.25-mile obstacle course. (On Tuesday, a vehicle designed by a team from Carnegie Mellon University successfully completed the 1.36-mile test course.) Grand Challenge deputy program manager Thomas Strat says the first day of trials clearly demonstrated that the technological difficulties are considerable, but the challenge itself is "doable"; the trials are essentially a test of whether the unmanned vehicles can be operated and stopped safely, and have at least a chance of running in the March 13 race. The route of the race, which will extend from Barstow, Calif., to Primm, Nev., will remain secret until two hours before the competition begins. Even if no entrant completes the course, DARPA Director Anthony Tether says the return on investment could be four to five times the $13 million the agency has poured into the project. He adds that the vehicles participating in the race could end up being used in military operations within the next 10 years, as part of an effort to reduce casualties. Anthony Levandowski, an entrant whose team has developed a gyro-stabilized motorcycle for the race, believes his invention could help lead to autonomous cycles for military reconnaissance or robotic stunt bikes.
    Click Here to View Full Article

  • "External Forces Chip Away at Internet's Overseer"
    International Herald Tribune (03/08/04); Oakes, Chris

The Internet Corporation for Assigned Names and Numbers (ICANN) is still under pressure to clarify its mandate and open itself up to international interests. The recent ICANN meeting in Rome--one of three the organization holds each year--and the United Nations' International Telecommunication Union workshops in Geneva show continued discontent over how much participation international interests, especially poorer countries, have in Internet governance. Meanwhile, VeriSign filed a lawsuit in the United States against ICANN, alleging the organization overstepped its bounds in regulating VeriSign's business. ICANN President Paul Twomey says the claims of international exclusivity are untrue and that ICANN is actively seeking to add to the 30 countries already working in a new governmental advisory group; both Twomey and ICANN critics say there is confusion about the organization's mandate and what role the United Nations should play, and Twomey says ICANN is eager to work with any U.N.-delegated body on solving problems outside its solely technical jurisdiction, such as child pornography or closing the digital divide. Meanwhile, European Union innovation and technologies minister Lucio Stanca warned against increasing government involvement in Internet technical management. University of Zurich international relations expert Marc Holitscher says that, although ICANN does not affect issues such as the digital divide on a practical level, the current structure of the organization is itself an embodiment of that problem; he explains that the domination of western governments and multinationals in ICANN is the root cause of much of the current controversy. On the other hand, VeriSign maintains that its lawsuit against ICANN will actually help strengthen the international body by clarifying its role and jurisdiction as a purely technical regulator.
    Click Here to View Full Article

  • "Trouble on Silicon Valley's Doorstep"
    CNet (03/09/04); Southwick, Karen

    Former Oracle President Ray Lane, now a general partner with Kleiner Perkins Caufield & Byers, thinks Silicon Valley has reached a turning point as the software industry begins a "tectonic shift" to a business model where renovation rather than innovation is the chief priority. He explains that this change does not spell the end of Silicon Valley, but the industry must reconcile itself to a very different way of doing business than simply funding inventions by the bushel. Innovation will still take place, but at a slower rate than in its 1990s heyday. Lane notes that customers desire standards-based software characterized by security, simplicity, and integration, which leaves two choices open to software CEOs: Application of proprietary methods to the market or adoption of a service-based strategy for software provision and maintenance. Lane believes India's business model is more in tune with the renovation market, which will make it an industry to be reckoned with. Lane also thinks U.S. legislation to limit or ban offshore outsourcing will ultimately undercut America's productivity leadership. He contends that in the new business landscape, a successful software company must expand or contract according to demand; be able to function anywhere, anytime, and in any situation; respond and deliver to uphold demand; make both internal and external operations transparent in real time; and keep asset and labor content per unit of production to a minimum. Lane notes the possibility that the next major industry driver could be worldwide communication and information access, which would once again make software the biggest global industry, at least until biotechnology matures.
    Click Here to View Full Article

  • "Will PHP Live Up to Its Billing?"
    Enterprise Linux IT (03/08/04); Ryan, Vincent

Slated for availability in May 2004 is PHP 5, an update to the popular PHP open-source scripting language, whose promised upgrades include better native XML support, embedded SQL database capabilities, and an improved object-oriented programming model. Zend Technologies co-founder and Zend Engine co-creator Zeev Suraski comments that PHP 5 allows XML documents to be manipulated as objects, significantly reducing the amount of code that must be written to obtain or exchange a piece of XML data; the update also has a new Simple Object Access Protocol (SOAP) interface written in C, enabling programmers to devise Web services with less difficulty. The development of dynamic-content Web applications that draw on server resources is also streamlined by the bundled SQLite engine, which relieves developers of the burden of installing or integrating a full SQL database server. Suraski says SQLite takes up very little memory and carries an open-source license, so applications that do not require a central database or server cluster are ideally suited to PHP 5. If Java were a scripting language, PHP 5 is the form it would take, Suraski argues: "If you're creating hybrid applications that contain some Java components and PHP, then the syntax, behavior and semantics will be quite similar," he explains. John Coggeshall, author of "The PHP Developer's Handbook," reports that roughly 99 percent of his code ran reliably when he set up an object-based library with PHP 5, and says the only problem with PHP 5 is how inefficient it makes PHP 4 scripts look. OmniTI principal George Schlossnagle adds that enterprises wishing to develop fast Web applications with PHP would face fewer obstacles by using PHP 5. However, a recent Zend Technologies survey found that only about 20 percent of respondents were planning an immediate upgrade to PHP 5; the rest said they would upgrade within a year or not at all.
    Click Here to View Full Article
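    The embedded-database idea behind PHP 5's bundled SQLite can be sketched with Python, whose standard library binds the same engine; the table name and values below are purely illustrative. The point is that an embedded engine runs in-process, so a dynamic page can store and query state with plain SQL and no separate database server to install or administer.

    ```python
    import sqlite3

    # An embedded database needs no server process: the engine runs
    # in-process and stores data in a single file (or, here, memory).
    conn = sqlite3.connect(":memory:")  # illustrative in-memory database
    conn.execute("CREATE TABLE pages (path TEXT PRIMARY KEY, hits INTEGER)")
    conn.execute("INSERT INTO pages VALUES (?, ?)", ("/index", 0))

    # A dynamic-content request can read and update state directly,
    # with no central database or server cluster involved.
    conn.execute("UPDATE pages SET hits = hits + 1 WHERE path = ?", ("/index",))
    hits = conn.execute("SELECT hits FROM pages WHERE path = ?",
                        ("/index",)).fetchone()[0]
    print(hits)  # 1
    ```

    This is the niche Suraski describes: small, self-contained applications that would otherwise not justify a full SQL deployment.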

  • "Talking Up a Good Game"
    ABCNews.com (03/09/04); Eng, Paul

    Being able to communicate fluently with locals in foreign lands is a very valuable skill for soldiers engaged in peace-keeping and "nation-building" operations, and computer science professor Lewis Johnson of the University of Southern California's Center for Advanced Research in Technology for Education believes military personnel could gain such skills through interaction with computer games equipped with artificially intelligent "agents." A simulation program that serves such a function is being developed at USC with funding from the Defense Advanced Research Projects Agency. Johnson's research team has developed the Tactical Language Training System over the course of more than a year, with the assistance of instructors at the U.S. Military Academy. The USC researchers have augmented the consumer game Unreal Tournament with a speech-recognition engine and intelligent agents that "react" to the user's vocal delivery. Johnson says the first part of the game trains players in proper language usage by having them utter words and phrases into a microphone while software corrects errors such as mispronunciation; the second part of the game involves players testing their linguistic competence by participating in "missions" such as asking locals for directions or finding out where bombs in need of defusing are located. The virtual characters players interact with exhibit behavior that is unique to each individual user, and this behavior changes to reflect higher levels of difficulty commensurate with the user's improving skills. A virtual guide can be summoned when a player runs into difficulty to provide assistance. Col. Stephen LaRocca of the U.S. Military Academy's Center for Technology-Enhanced Language Learning is impressed by the USC language system, which is being tested on cadets, but insists that human teachers and classroom education will still be superior.
    Click Here to View Full Article

  • "Programmers So Far Underwhelmed by JSF"
    InternetNews.com (03/08/04); Wagner, Jim

    Programmers have thus far responded less than enthusiastically to JavaServer Faces (JSF), a standard for building Web applications at the presentation level that was recently approved by the Java Community Process. JSF provides developers with a standard kit of application programming interfaces for constructing Web apps, plus a tag library that works with JavaServer Pages, to deliver a programming experience more akin to Microsoft .NET, with drag-and-drop buttons, frames, forms, and other .NET-like elements. JSF advocates claim the standard offers more flexibility than the Apache Jakarta Project's Struts, which supports one-way element rendering, but Web app programmers will not be able to use the technology to its full potential until the standard is embedded within integrated development environments. Some programmers who posted their reactions on the TheServerSide.net developer forum complained that JSF only tacks on an additional layer of complexity. One programmer wrote, "Lets face it, graphical user interfaces should be designed (and implemented) using visual tools, not text editors. So, what I hope for in JSF is not an easier-to-read source file--it is not to have to read the source file at all." Innoopract founder and President Jochen Krause acknowledges that there will be some difficulty as programmers familiarize themselves with JSF, but insists that the specification is not too complicated, even though only a small percentage of programmers today could employ the standard in programs. He adds that criticism of JSF should wane within the next six to 12 months, when the first JSF-enabled Web apps start showing up on sites and Web portals. Krause and the committee that approved JSF expect that as the standard sees wider use, Web and desktop apps will increasingly overlap.
    Click Here to View Full Article

  • "North American IPv6 Task Force Kicks Off Next Phase of Moonv6"
    Market Wire (03/08/04)

    The second phase of testing on the Moonv6 multi-vendor IPv6 network was launched on March 7, according to an announcement by the IPv6 Forum's North American IPv6 Task Force (NAv6TF). The goal of Moonv6 phase II is to continue to justify the deployment of Internet Protocol version 6 for the North American market as well as "to eventually create a Native IPv6 backbone peering that will, in time, permit production services as new applications develop and entice markets to come to the Moonv6 evolution," according to NAv6TF Chairman Jim Bound. Moonv6, a joint project between NAv6TF, the University of New Hampshire--InterOperability Laboratory (UNH-IOL), Internet2, the Joint Interoperability Testing Command (JITC), and other U.S. Defense Department agencies, constitutes the most assertive collaborative North American market trial of IPv6 interoperability and applications yet. The second testing phase will evaluate elements of IPv6 central to the pervasive implementation of the protocol, which is designed to replace the current-generation Internet protocol, IPv4, over the next few years. Aspects to be assessed include network routing protocols, applications, security, and transition mechanisms. Verifying the reliability of the first three aspects is key to the commercial deployment of IPv6, while the fourth element is important to the switchover from IPv4 to IPv6. Once the second phase of testing is complete, Moonv6 will continue to function as a test bed for industry, academic institutions, research labs, Internet providers, the JITC, and government agencies to help develop IPv6 for wide-scale support and implementation across North America.
    Click Here to View Full Article
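    The scale of the change Moonv6 is exercising, and one common IPv4-to-IPv6 transition mechanism, can be illustrated with Python's standard ipaddress module; the prefixes below are the reserved documentation ranges, used here purely as examples.

    ```python
    import ipaddress

    # IPv6 widens addresses from 32 bits to 128, removing IPv4's scarcity:
    # a single standard /64 subnet holds 2**64 addresses.
    v4 = ipaddress.ip_network("192.0.2.0/24")   # IPv4 documentation prefix
    v6 = ipaddress.ip_network("2001:db8::/64")  # IPv6 documentation prefix
    print(v4.num_addresses)  # 256
    print(v6.num_addresses)  # 18446744073709551616

    # One transition mechanism embeds an IPv4 address inside an
    # IPv4-mapped IPv6 address, letting dual-stack hosts speak both.
    mapped = ipaddress.IPv6Address("::ffff:192.0.2.1")
    print(mapped.ipv4_mapped)  # 192.0.2.1
    ```

    Transition mechanisms like this mapping are exactly the fourth category of features the Moonv6 phase II tests are meant to verify.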

  • "IT's Final Frontier"
    InternetNews.com (03/05/04); Haley, Colin C.

    IT vendors and service providers are looking for new opportunities with NASA now that President Bush has committed to reinvigorating the space program. After the loss of the space shuttle Columbia, IT companies were uncertain about the government's future plans--but with a new infusion of $1 billion over the next five years and another $11 billion shifted to new moon and Mars missions, IT firms see a green light to develop innovative solutions. NASA has not yet signaled exactly what equipment will be needed, but it does have plans for tighter networking links between its 10 national offices and wants to better integrate its IT infrastructure; NASA's Brian Dunbar said the agency is also looking for smaller computer components that would lighten the load of equipment launched into space. NASA does not want untested, bleeding-edge technology, but rather proven solutions able to withstand the extreme environmental rigors of space. Large IT vendors are forming working groups with NASA to see how off-the-shelf technology can be used in the space program instead of costly custom-built solutions, while NASA is also ready to do more outsourcing, both to save money and to allow its internal IT staff to focus on core projects. Content-delivery network operator Speedera, for example, has signed a contract to bolster the availability of NASA's 3,000 individual Web sites; during the Mars Exploration Rover landing in January, Speedera handled 4.5 billion hits to the NASA portal. Cisco is another likely NASA partner: it established a space unit three years ago and is currently testing its routers in space. The technical problems of connecting space vehicles to Earth are similar to those faced by ground transportation companies, said Cisco space unit leader Rick Sanford. Cisco is also a pioneer in the industry effort to lay out interoperability standards for space communications.
    Click Here to View Full Article

  • "Robo Doc"
    Engineer (03/05/04); Excell, Jon

    Honda's Asimo robot, whose crowd-pleasing abilities include dancing and climbing stairs while moving quietly and fluently, is a milestone along the road toward a truly domestic robot, contends roboticist and artificial intelligence expert Prof. Edgar Korner, who heads Honda's European Research Institute. He is leading a team whose short-term goals include enhancing the fluency of Asimo's coordination and boosting the robot's speed beyond 1 mph, as well as equipping the machine with speech-recognition and face-recognition technology. "The next stage is to enable [Asimo] to develop the ability to 'think' for itself, to an extent where it can get on with its chores without bothering its owner," Korner explains, adding that AI will evolve with further insight into human and animal brains. He argues that, as with biological systems, robots' key elements--intelligence, mobility, and vision--should evolve in parallel rather than in isolation. Following this path should lead to domestic androids that can perform basic chores in a predictable environment in five years, while robots that can function in an unpredictable environment are at least another five years off. Korner notes, "If we want [an android] to be really useful as a service robot or a companion then it must be granted some degree of autonomy, and this, I expect, will be a step-by-step process, because we first have to establish some kind of control." He points out that Honda has stressed from the outset that the goal of its robotics research is the creation of a useful human servant, not a facsimile of a human being. At the same time, Korner acknowledges that a certain degree of anthropomorphism is encouraged to make society more accepting of robots, while Honda is also concentrating on making the machines secure enough to be entrusted with important user information.
    Click Here to View Full Article

  • "Navy Researcher Has Novel Security Visualization Technique"
    Government Computer News (03/04/04); Jackson, Joab

    A researcher at the Naval Postgraduate School in Monterey, Calif., has applied techniques from the field of thermodynamics to characterize data traffic on computer networks, and says the concepts about visualizing network activity should make it easier to fend off security attacks. David Ford, a senior research coordinator for the Defense Information Systems Agency, posted a paper, "Application of Thermodynamics to the Reduction of Data Generated by a Non-Standard System," in Cornell University's electronic repository for scientific papers in February. The paper is a formal explanation of prototype software, which Ford helped build, that visualizes the state of a network. The Therminator software learns the normal activity of a computer network and flags any unusual activity. Thermodynamics has long been used by mathematicians to make sense of complex environments, and Ford believes it can likewise be used to comprehend the movements of data packets, which behave similarly to molecules. "When a packet does something that is not within the intended flow, then it stands out like a sore thumb," says Ford. Conventional intrusion detection systems, by contrast, are often designed in a way that bombards security administrators with raw information.
    Click Here to View Full Article
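    The article does not describe Therminator's internals, but the general idea of characterizing a traffic mix so that anomalies "stand out" can be sketched with a simple Shannon-entropy baseline; the port numbers and traffic samples below are illustrative inventions, not Ford's actual model.

    ```python
    import math
    from collections import Counter

    def entropy(packets):
        """Shannon entropy (in bits) of a traffic mix, e.g. destination ports.

        A stable network has a characteristic mix; a sudden shift, such as
        a scan touching many ports uniformly, moves the entropy sharply
        away from the baseline."""
        counts = Counter(packets)
        total = len(packets)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    # Normal traffic: concentrated on a few services -> low entropy.
    normal = [80] * 90 + [443] * 10
    # Scan traffic: spread uniformly across many ports -> maximal entropy.
    scan = list(range(100))

    print(entropy(normal) < entropy(scan))  # True
    ```

    Plotting such a summary statistic over time, rather than raw alerts, is one way a visualization can avoid burying administrators in information.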

  • "Digital Rights Management: Fixing Wrongs?"
    Investor's Business Daily (03/09/04) P. A6; Howell, Donna

    The first line of defense against digital content piracy employed by technology and entertainment companies is digital rights management (DRM), and signs indicate that the technology is gaining acceptance among consumers. Hewlett-Packard recently announced that it has joined an organization that intends to standardize and license DRM, taking advantage of the FCC's ruling that new digital TV equipment must be able to identify use-restriction flags within digital content starting in July of next year. "You not only have to have compelling devices and services, but rights that go with that, which allow content owners to make money off entertainment in its different forms," explains HP director of business development Felice Swapp. Meanwhile, Movielink, which allows customers to download movies to their computers for a fee, uses Microsoft's Windows Media 9 DRM technology, and CEO Jim Ramo reports that its subscriber base is growing substantially. The once-notorious Napster, which also uses Windows Media 9, has sold over 5 million song downloads since being relaunched as a pay-for-play service in October 2003. "One of the things that has made our DRM technology a leader in the field is...the flexibility in deciding what business rules you want to apply," notes Jason Reindorp with Microsoft's Windows Digital Media Unit. In its last quarterly earnings report, Napster parent Roxio stressed the need to provide DRM measures that satisfy both content owners and consumers. However, Everett Ehrlich of the Committee for Economic Development says that legislation to control digital piracy could hinder innovation, and notes that the development of disruptive consumer entertainment technologies--blank videocassette tapes being one example--has ultimately benefited the industry.

  • "XMPP Transports Presence Data"
    Network World (03/08/04); Hildebrand, Joe

    Extensible Messaging and Presence Protocol (XMPP) allows for easy exchange of presence data among IM systems and enterprise software applications, writes Jabber chief architect Joe Hildebrand. The XML streaming protocol recently won IETF approval and already has thousands of deployments, thanks to its open and clear design. Because of its XML base, XMPP lets developers create diverse and flexible interfaces, business rules, and logic for their presence applications. Implementation was made as simple as possible, with XML extensions allowing exchange between different system gateways; these extensions are standardized by the Jabber Software Foundation in much the same way the World Wide Web Consortium approves Web formats. There are already many interoperable commercial implementations of XMPP and a large open-source community behind the protocol. In a customer relationship management application, XMPP would allow customer approval forms to be sent to mobile phones, or an escalated trouble ticket to be routed to the appropriate customer service resources; XMPP networks can also enable real-time communication of presence data between different organizations' applications. XMPP routing resembles that of the Simple Mail Transfer Protocol: servers locate, connect to, and authenticate one another using the domain in the destination address, so long as business and logic rules are adhered to. XMPP can thus serve as a universal transport layer for structured XML data, with presence and context awareness built in.
    Click Here to View Full Article
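    Because XMPP stanzas are ordinary XML fragments, any XML tooling can build or inspect them. A minimal presence stanza of the kind the protocol routes between servers can be assembled with Python's standard ElementTree; the addresses and status text are illustrative.

    ```python
    import xml.etree.ElementTree as ET

    # A minimal XMPP <presence/> stanza: presence is just a small XML
    # fragment carried over a streaming connection.
    presence = ET.Element("presence", attrib={"from": "alice@example.com/desk"})
    ET.SubElement(presence, "show").text = "away"
    ET.SubElement(presence, "status").text = "In a meeting"

    stanza = ET.tostring(presence, encoding="unicode")
    print(stanza)
    # <presence from="alice@example.com/desk"><show>away</show><status>In a meeting</status></presence>
    ```

    An application receiving such a stanza can parse it with the same tooling and apply whatever business rules it likes, which is the flexibility Hildebrand attributes to the XML base.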

  • "Big Apple"
    Government Computer News (03/08/04); Daukantas, Patricia

    Virginia Tech researchers are following up the successful construction of their System X supercomputer by developing the infrastructure for a Terascale Computing Facility. Using 1,100 commercially available Apple G5 systems with dual IBM PowerPC 970 processors, the researchers put together System X in less than three months, whereas most supercomputer construction efforts take an average of 12 to 18 months. In addition, the Virginia Tech supercomputer's $5.2 million cost was a fraction of the typical budgets for machines ranking high on the Top500 list of fast computers. System X's computational power of 10 trillion floating-point operations per second won it the No. 3 spot on the Top500 list. Virginia Tech built the supercomputer using a corps of over 150 volunteers, and other commercial components incorporated into the system included a rack-mounted cooling system from Liebert, InfiniBand switches and host control adapter cards from Mellanox Technologies, and Mac OS X. "What we've done is show that, even with a modest amount of money, you can build a machine capable of tremendous performance," boasts terascale center associate director Jason Lockhart, who adds that fluid dynamics, computational chemistry, and large-system dynamics are just some areas of research that will tap System X's resources. It was recently announced that System X will be upgraded to Apple's Xserve G5 rackmount server, which will improve electrical and cooling efficiency and halve System X's floor space. Virginia Tech expects that the Terascale Computing Facility, which will be linked to the upcoming 40 Gbps National Lambda Rail fiber-optic research network, will qualify for funding from the National Science Foundation under its Advanced Cyberinfrastructure Program.
    Click Here to View Full Article

  • "The Garden Where Perfect Software Grows"
    New Scientist (03/06/04) Vol. 181, No. 2437, P. 28; Bentley, Peter

    Peter Bentley of University College London's Digital Biology Group believes software will become more reliable, error-proof, and virus-resistant through biologically inspired evolution and adaptation, a change that will make programmers more like gardeners. Genetic programming seemed to offer a path toward computer evolution, but hit a wall because the sheer number of basic commands and functions that go into complex software made the process unworkable. Bentley has determined that this is due to genetic programmers' failure to take development, whose biological equivalent is embryogenesis, into account. Programmers were assigning a part of the solution to each software "gene," whereas in nature genes act like IF/THEN statements that determine the growth of an organism. From this realization, Bentley argues that computer programs could be evolved and developed instead of designed: All that is needed is a pair of numbers representing the name and concentration of a digital protein, plus digital genes set up to behave like IF/THEN statements. "Cells" can be represented by lists of the numbers assigned to genes and proteins, while function-determining genes can assign a traditional computing instruction to each cell. As in nature, a program grown iteratively from a single cell is bug-free, and Bentley reports that he has used this technique to grow a self-healing robot brain. The researcher believes that "If these ideas are taken up commercially...software will be developed in organic nurseries where it will evolve and grow like biological life to perform a specific task, adapting to its environment and changing its behavior to fit new environments."
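
    Bentley's scheme is only outlined above, but a toy version conveys the flavor: each "gene" is an IF/THEN rule that reads and adjusts digital protein concentrations, and iterating the rules "grows" a population of cells from a single seed. Everything here (protein names, thresholds, the division rule) is an illustrative invention, not Bentley's actual system.

    ```python
    # A "cell" is a mapping of digital protein name -> concentration.
    seed_cell = {"growth": 1.0, "signal": 0.0}

    # Genes as IF/THEN rules: (IF condition on concentrations, THEN
    # adjust this protein by this amount).
    genes = [
        # IF growth protein is high THEN accumulate signal protein.
        (lambda c: c["growth"] > 0.5, "signal", 0.4),
        # IF signal has accumulated THEN switch growth off (development ends).
        (lambda c: c["signal"] > 1.0, "growth", -1.0),
    ]

    def develop(cell, steps=10):
        """Grow an 'organism' by repeatedly firing every applicable gene."""
        cells = [dict(cell)]
        for _ in range(steps):
            for c in cells:
                for condition, protein, delta in genes:
                    if condition(c):
                        c[protein] = max(0.0, c[protein] + delta)
            # Cells still expressing the growth protein divide.
            cells += [dict(c) for c in cells if c["growth"] > 0.5]
        return cells

    organism = develop(seed_cell, steps=3)
    print(len(organism))  # 4: growth switched itself off after two divisions
    ```

    The genotype (the gene list) stays tiny no matter how large the grown structure becomes, which is why development sidesteps the scaling wall that stalled classical genetic programming.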

  • "The Myths of Open Source"
    CIO (03/01/04) Vol. 17, No. 10, P. 82; Wheatley, Malcolm

    The growing popularity of open-source software solutions among CIOs helps to dispel some of the myths surrounding the technology. Many companies claim that, contrary to myth, the price advantage of open source is not its most attractive feature: The real impetus for adopting open source is its reliability and stability, which allows more efficiency to be squeezed out of corporate activities. Support for open source does indeed exist--and is free, to boot--although there admittedly is no single source of information; nevertheless, Atsec consultant Klaus Weidner observes that multiple support services can be an asset, especially to companies tired of dealing with vendors that provide poor support or discontinue support of certain kinds of software. "The breadth of resources available for open-source applications is so great worldwide that we can get support, communicate with a developer or download a patch no matter the time of day," adds RightNow Technologies' Thomas Jinneman. Another myth is that there are no real savings in open source, but that notion is contradicted by users' own testimony of substantial net savings thanks to an absence of "vendor churn" and marginal cost of scale. The legal ramifications of open source are not as treacherous as some believe--there is an assortment of open-source licenses available, and some open-source vendors offer indemnification options to customers. Open source is also suitable for mission-critical applications, as evidenced by open-source deployments by banks such as Banca Popolare, which chose Linux to smoothly integrate disjointed apps into a Web browser, speeding up training cycles and transaction times and creating more opportunities for cross-selling the bank's services. Open source is also proving its viability on the desktop, as demonstrated by Baylis Distribution's decision to phase out Microsoft on desktops in favor of Linux and open-source productivity tools for spreadsheets, word processing, and other tasks; IT Director Chris Helps reports that the switchover has cut the cost of PC installation by about 50 percent.
    Click Here to View Full Article

  • "Future of News Delivery"
    Computerworld (03/08/04) Vol. 32, No. 10, P. 34; Rosencrance, Linda

    Dennis McLeod of the University of Southern California's Annenberg School for Communication in Los Angeles believes the near future will witness the emergence of a revolution in news representation and delivery thanks to new information technologies and integrated media systems. He is currently engaged in the User-Directed News project, an effort to develop "immersive" news experiences that are personalized and interactive. This technology could manifest itself as a head-mounted display the user wears to receive controllable multimedia newsfeeds consisting of audio, video, animation, text, and tactile feedback conveyed through gloves. Paul Grabowicz, director of the University of California, Berkeley's New Media Center, postulates that in three to five years multimedia journalism will have evolved into a form that users will access via preference-based entry points. Nora Paul of the University of Minnesota's Institute for New Media Studies expects future news delivery to feature "animated infographics" that represent events that are hard to comprehend in traditional text-based media, and that users can repeatedly play at their own pace. She adds that this technology is already available, but the news media is not employing it widely. Meanwhile, Rich Gordon of Northwestern University's Medill School of Journalism foresees a time when people will be able to view interactive multimedia news stories on handhelds. He asserts, "The buzzword for the future, no matter what platform, is interactive multimedia, which both represents user control as well as the multiple forms of media incorporated into a single format."

  • "Face-off: Are Anti-Spam Appliances Better Than Software?"
    Network World (03/01/04) Vol. 21, No. 9, P. 47; Chiu, Tim; Schneider, Ken

    Mirapoint senior manager Tim Chiu argues that anti-spam appliances are a more effective and comprehensive solution for blocking spam than software: He contends that appliances require significantly less ongoing management than software, offer a quicker return on investment, and carry a lower total cost of ownership. Gateway appliances, deployed at the network edge, deliver unmatched efficiency and manageability by unburdening the existing mail server and stopping security threats before they can penetrate the network. Chiu claims that customers can deploy protection faster with appliances, and can benefit from embedded performance and reliability optimizations that software on general-purpose servers lacks. He adds that software-based anti-spam schemes are vulnerable to malware, while appliance-based products boast a pre-hardened operating system with no open ports or vulnerable execution environment. Brightmail CTO Ken Schneider counters that effective anti-spam appliances--or any effective anti-spam measure, for that matter--owe their potency to strong software. He remarks that effective anti-spam software can run on any operating system, be set up on any hardware platform, and work with numerous email applications. "The most complete anti-spam software provides the best of these key characteristics--effectiveness (most spam stopped), accuracy (fewest false positives) and zero administration (automated and timely updates)--all in a platform-agnostic package," Schneider notes. He writes that software must be flexible enough to defeat spam attacks that grow ever more sophisticated as the arms race between spammers and anti-spam vendors escalates.

  • "The Great Robot Race"
    Wired (03/04) Vol. 12, No. 3, P. 132; McGray, Douglas

    On March 13, some 20 teams will compete to see whose autonomous vehicle can navigate a 250-mile off-road course from Barstow, Calif., to Las Vegas; the team whose vehicle reaches Sin City first, within 10 hours, wins a $1 million prize, courtesy of the Defense Advanced Research Projects Agency (Darpa). The official course will be disclosed to competitors only two hours before the contest begins, as a series of about 1,000 GPS waypoints, and the participating vehicles must contend with uneven terrain, water, natural and man-made obstacles, and the occasional person or animal with no human assistance, while having only "incidental contact" with each other. Competing machines will also have to manage speed, which can confuse visual sensors and onboard software and impede safe navigation. Darpa notes that nearly all contestants are building machines whose navigation systems combine radar, ladar, GPS, and stereo vision, while their software must interpret the layout of the path ahead of the vehicle in real time in order to make timely speed and course adjustments. Most teams' first step is making their vehicles highly durable and shock-proof, which allows for slightly less complicated software. But Carnegie Mellon University's Red Team, which has built an automated Humvee dubbed Sandstorm for the Grand Challenge, believes that no blend of hardware and software can safely manage the speed of the race without a clear understanding of the environment beforehand, which is why the team is compiling a detailed multimedia road map. Darpa is holding the Grand Challenge to develop robotics technology that could be applied to warfare by 2015, in the form of reconnaissance and sentry robots, "donkey" machines that transport supplies, and collaborative team robots or "wingmen." "We were trying to reach hobbyists, but this challenge has catalyzed and focused university courses and student research," notes Grand Challenge program chief and Air Force colonel Jose Negron.