Volume 5, Issue 554: Monday, October 6, 2003
- "Spam Fighters Turn to Identifying Legitimate E-Mail"
New York Times (10/06/03) P. C1; Hansell, Saul
Many email experts think that a system to identify legitimate email rather than spam will be a more effective strategy for controlling the spread of unsolicited commercial messages. This would entail the development of an email version of caller ID that allows the sender's identity to be verified by the receiver, which is a weighty challenge from a technical point of view. America Online senior director for mail operations Brian Sullivan says the world's four biggest service providers--AOL, Yahoo!, Earthlink, and Microsoft--have yet to agree on a framework for anti-spam standards, although there has been a consensus that such a system would be set up as an option for senders. Kevin Doerr of Microsoft says that clearly identified mail would be processed into the online equivalent of an airport express line: "You will only be frisked once and not thrown in with the unwashed masses," he explains. Major commercial emailers are expected to be the system's initial users, though ISPs believe small businesses and individuals will eventually be able to take advantage of the system as well. Most proposals suggest that several independent groups post email standards--a sender would select one standard to comply with, and its publisher would be responsible for monitoring the sender's observance of the protocol. Both simple and complex solutions have been proposed to identify senders in a counterfeit-proof way: A simple solution apparently favored by ISPs involves creating a registry of email servers with verified owners, based on the assignment of server-specific Internet Protocol addresses; the more complex approach would require every sender to label each message with a digital certificate that can be authenticated by the receiver's software or Internet provider. 
Also complicating matters is tension between bulk email companies and ISPs--the former are demanding a system that legitimizes their email and ensures its delivery, while the latter face pressure by complaining customers and do not wish to promise to deliver any specific messages.
(Access to this site is free; however, first-time visitors must register.)
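The digital-certificate approach described above can be sketched in miniature. In this illustration a shared secret issued by a hypothetical registry stands in for the public-key certificate a real deployment would use, and all names and keys are invented:

```python
import hmac
import hashlib

# Hypothetical registry-issued key; a real system would use public-key
# certificates rather than a shared secret.
SECRET = b"registry-issued-key-for-example.com"

def sign_message(sender: str, body: str) -> str:
    """Attach an authentication tag covering the sender identity and body."""
    tag = hmac.new(SECRET, f"{sender}\n{body}".encode(), hashlib.sha256)
    return tag.hexdigest()

def verify_message(sender: str, body: str, tag: str) -> bool:
    """Receiver recomputes the tag; a mismatch reveals a forged sender."""
    expected = sign_message(sender, body)
    return hmac.compare_digest(expected, tag)

tag = sign_message("news@example.com", "October offers inside!")
print(verify_message("news@example.com", "October offers inside!", tag))   # True
print(verify_message("spoofed@example.com", "October offers inside!", tag))  # False
```

The point of the sketch is the verification step: a spoofed sender identity cannot reproduce a valid tag, which is the property both the registry and certificate proposals aim for.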
- "With Software Jobs Migrating to India, Think Long Term"
Wall Street Journal (10/06/03) P. A2; Davis, Bob
Computer guru Ed Yourdon's performance as an IT job forecaster has been marked by peaks and valleys: In his 1992 book "Decline & Fall of the American Programmer," he prophesied that American programmers would lose their jobs to cheaper Indian professionals, but the apparent failure of this to happen prompted Yourdon to issue an apologia with his 1996 book "Rise & Resurrection of the American Programmer." Yet Yourdon's original prediction seems to be coming true after all with the rise in IT outsourcing to India--some 150,000 U.S. IT jobs have been lost to foreign competitors in the last three years alone--and Yourdon, now a director at Indian software firm iGate, is again forecasting a bleak outlook for American programmers. "It's almost like I've become a turncoat for issuing a warning, and the only ones listening were foreigners," he remarks. Yourdon projects that the least productive U.S. software professionals, who account for 20 percent of the American IT workforce, will vanish because of their consistent failure to update their skills and raise their efficiency. Indian software workers receive 25 percent less money than their U.S. counterparts, and with other poor countries vying for a piece of the software outsourcing market, India is under considerable pressure to maintain the salary disparity. However, India's growing presence as a software outsourcer may work to America's advantage: The more foreign software professionals train or work in the United States, the more talent is added to a pool that could be tapped to build U.S. firms with fast growth potential. Furthermore, as India's economy grows, so does its potential as a market for products and services made in America. U.S. workers are likely to benefit if overseas competition's long-term stimulus to American innovation outweighs its short-term cost in jobs.
- "Vote, With No Confidence"
Baseline (10/02/03); Cone, Edward
Electronic voting systems, such as the touch-screen machines that will replace traditional punch-card machines in California for the Oct. 7 recall election and the 2004 presidential primary, are problematic for several reasons: They lack a printed audit trail and there are no guarantees of safety from tampering. Furthermore, the dedication of e-voting machine vendors such as Diebold to upholding the security of their products is debatable; a recent Johns Hopkins and Rice University study found security holes in Diebold machines, while writer Bev Harris published internal Diebold memos online indicating that the company is aware of system vulnerabilities. Yet both state and federal legislation is strong-arming California counties to switch to paperless e-voting, even though some feel their current voting machines are adequate. Although there are no major technical headaches to adding a paper trail to Direct Recording Electronic (DRE) machines, it does add costs for paper and printing, and complicates maintenance. Many counties report they are unconcerned about machines that lack paper ballots--Angela Burrell of the Orange County registrar of voters says the county will in March introduce paperless DRE systems from Hart InterCivic. The chief allure of e-voting machines is the potential cost savings, along with promises that such systems yield a more accurate count. However, Yolo County clerk-recorder Freddie Oakley observes, "Election officials are very naive about technology. I'm not paranoid, but I do believe that 18-year-olds are sitting at home in their camouflage jammies, hacking these things." SRI principal scientist Peter G. Neumann warns that the move from punch-card machines to touchscreen voting is like "going from the frying pan into the fire," and argues that "lame" standards have been established to guarantee the accuracy of the new machines.
To learn more about ACM's concerns and actions regarding e-voting, visit
- "EU Directive Could Spark Patent War"
CNet (10/03/03); Broersma, Matthew
The Directive on the Patentability of Computer-Implemented Inventions recently approved by the European Parliament contains several amendments that Gartner states could create significant disparities between European and American software patenting approaches, which could in turn set off a patent war. The amendments prohibit the patenting of "pure" software and business methods, and do not require software makers to license patented technology for interoperability purposes. The patenting of pure software and business methods has become routine in the United States, and a magnet for critics who assert that such a policy retards innovation and competition. Gartner hypothesizes, for instance, that a patented e-commerce technology enforceable in the United States but without legitimacy in the European Union could make American users guilty of lawbreaking if they access an EU Web site that uses the technology. A U.S. official sent a letter to the European Parliament citing three articles of the directive as "problematic," and called for the deletion of one article declaring that patents cannot be leveraged to limit interoperability. The Foundation for a Free Information Infrastructure, a strong supporter of the amendments, said that relying on antitrust law to shield the software industry from corporate maneuvers to control standards of data exchange is ridiculous. The European Commission will review the patents directive, after which the parliament and the Council of Ministers will vote. Yet the commission has expressed doubts about the amendments' acceptability and has not ruled out the possibility of withdrawing the directive.
- "Buggy Software Taking Toll"
Orlando Sentinel (10/05/03); Cobbs, Chris
Software is increasingly pervasive in modern products, as are the glitches that inevitably show up in growing pools of code. Unlike the relatively lithe programs that operated computers years ago, many of today's software programs have millions of lines of code; enhanced programming tools have enabled this productivity, but the test and debug tools used to check these products have remained mostly unchanged, according to Florida Institute of Technology computer science professor Cem Kaner. SRI International principal scientist Peter Neumann predicts the software bugs that inhabit these programs will have more impact on people's everyday lives in the future--a message Neumann has brought before Congress five times as an expert witness. Besides inconvenience, software bugs cost the U.S. economy at least $59.5 billion each year, according to a study by the National Institute of Standards and Technology. Neumann says one underlying reason software is so error-prone is that the marketplace rewards speed-to-market and functionality more than reliability, and he notes that software engineering is a still-developing art compared to other engineering disciplines such as building architecture and civil engineering. Defects in software programs are also embedded in numerous details, and their presence is not as obvious as, say, a missing strut in a bridge. Carnegie Mellon University associate professor Philip Koopman says the average high-end European sedan made today uses around 70 computer chips and is programmed with 500,000 lines of code, with an average error rate of one defect per 1,000 lines of code. In a car, some of these defects could have life-threatening consequences; recent catastrophes caused by software errors include the 1999 loss of the Mars Climate Orbiter, doomed when programmers mixed metric and English measurements.
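Koopman's figures above imply a sobering back-of-the-envelope estimate for a single vehicle:

```python
# Expected latent defects in one high-end sedan, using the figures
# quoted above: 500,000 lines of code at one defect per 1,000 lines.
lines_of_code = 500_000
defects_per_kloc = 1
expected_defects = lines_of_code // 1000 * defects_per_kloc
print(expected_defects)  # 500 latent defects riding along in one car
```

Even if only a small fraction of those 500 defects sit in safety-relevant code paths, the arithmetic explains why automotive software reliability draws so much attention.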
- "Becoming a Security Guru Without Breaking the Law"
E-Commerce Times (10/04/03); Diana, Alison
Computer security is a hot area for students, especially those looking for a lucrative career after they graduate. There is a wealth of courses and resources offered by higher-education institutions for the prospective IT security expert: Agnes Chan of Northeastern University reports that her school has a Computer Science Ph.D. program with a specialization in security, while a master's degree in Information Assurance is planned for next year; Houston's Rice University offers courses in the subject areas of cryptography, wireless security, intrusion detection, tamper resistance, viruses, spam, smart cards, untrustworthy platforms, and modern programming; the London School of Economics established the Computer Security Research Center; many institutions promote security-related co-op or internship programs for students; and Thomas Algoe of New York's Hilbert College says that organizations such as the Computer Security Institute offer online resources available to anyone. He explains that ethical issues are a major area of concentration, and says that his approach to teaching information security focuses on social engineering issues rather than technical ones. "Most of the really 'good' hackers were social engineers rather than real technology people," he observes. Some members of the educational community do not think students need to write malware to devise a sound cyberdefense strategy, and have taken the University of Calgary to task for offering a course in which students are taught to code worms and viruses. Dan Wallach of Rice University asserts, "You learn about security by doing security. That doesn't require doing anything even slightly illegal." Chan notes that Northeastern will run experimental courses in which students try their hand at attacking cyberdefenses on an isolated network, while security-related courses will feature faculty oversight.
- "Pushing Peer-to-Peer"
Technology Review (10/03/03); Garfinkel, Simson
Peer-to-peer networking is a powerful concept that has never really taken off on the Internet, even though the Internet itself was originally intended to be peer-to-peer; instead, most of the Internet uses a client/server architecture that is simpler to build, but is also more inefficient and vulnerable to attack. To significantly disrupt a client/server network, an attacker needs only to take down the server machine or sever its network links. Making server data redundant on different machines is also difficult because the systems must be continuously synchronized. True peer-to-peer networks--ones that do not rely on central hubs to find other peers--are largely immune to these types of attacks because resources are spread throughout the network and are much more available. Researchers, however, are still struggling with implementing peer-to-peer concepts because of fundamental complexities, including how to protect against hostile peers, how to efficiently distribute data, and how to publish the presence of newly arrived peers. Nevertheless, as peer-to-peer architectures become more mature, the concept will likely take over critical functions such as domain name system archives, global Web caching, and resilience against major physical disaster. Already, peer-to-peer networks have overlaid the regular Internet: The Gnutella, Kazaa, and Morpheus file-sharing networks are overlay networks, as is Akamai's global Web cache infrastructure. These overlay systems also serve as test beds for peer-to-peer designs that are still too experimental to try on the regular Internet. The success of peer-to-peer technology in distributing digital music is just one example of how the concept will revolutionize distribution models.
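One standard answer to the "how to efficiently distribute data" problem above--used by later DHT-style research systems rather than by Gnutella itself--is consistent hashing: peers and data keys share one hash ring, and each key is stored on the first peer clockwise from its position, so no central index is needed. A toy sketch with invented peer names:

```python
import hashlib
from bisect import bisect_left

def ring_position(key: str) -> int:
    """Map a peer ID or data key onto a fixed 32-bit hash ring."""
    return int(hashlib.sha1(key.encode()).hexdigest(), 16) % 2**32

class HashRing:
    """Toy consistent-hashing sketch: the owner of a data key is the
    first peer at or after the key's ring position, wrapping around."""
    def __init__(self, peers):
        self.ring = sorted((ring_position(p), p) for p in peers)

    def owner(self, key: str) -> str:
        pos = ring_position(key)
        idx = bisect_left(self.ring, (pos, ""))
        return self.ring[idx % len(self.ring)][1]

ring = HashRing(["peer-a", "peer-b", "peer-c", "peer-d"])
print(ring.owner("some-song.mp3"))  # any peer can compute this locally
```

Because every peer can compute the same owner for a key locally, lookups need no central hub, and when a peer joins or leaves only the keys in its ring segment move--properties that address the availability argument made above.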
- "Advanced Chip Opens Door to Software Choice"
New Scientist (10/06/03); Ananthaswamy, Anil
Intel recently announced plans for Vanderpool, a next-generation computer chip that will be capable of running multiple operating systems simultaneously. Running several operating systems on the same hardware typically causes conflicts and crashes, which is why programmers write virtualization software to mediate between them. Intel's Mike Ferron-Jones says, "Vanderpool doesn't eliminate the need for virtualization software, but it's going to make it perform a lot better." Intel has so far declined to detail the hardware redesign. Gartner's Martin Reynolds characterizes Vanderpool as a "miniature operating system built for robustness and simplicity rather than for flexibility and complexity." Such a breakthrough, which analysts speculate could be one of the decade's most profound computer advancements, would enable garden-variety PC users to run Linux and other alternative operating systems--and their attendant applications--while retaining Windows. Determining who develops and sells the virtualization software is likely to be a major area of contention. The Vanderpool chip is expected to be introduced within five years.
- "Researchers Say They're a Few Years Away From Self-Healing Electrical Grid"
Chicago Tribune (10/02/03); Van, Jon
Software designed to anticipate future electrical power consumption by analyzing patterns of past electrical usage is undergoing testing at Argonne National Laboratory, where researchers believe such technology would be a cheaper and better alternative for avoiding major outages than rebuilding the national grid. The software, dubbed TELOS, was developed by Purdue University engineers and trained on a year's worth of data about Argonne's electrical usage; TELOS is currently running simulations that forecast power consumption, which are then compared to actual consumption. Argonne energy technology manager Yung Liu reports a fairly close correlation between the predicted and the actual outcomes, and says the scope of cascading power failures could be reduced significantly with computers, especially in cases where split-second decisions are required. Purdue engineering professor Lefteri Tsoukalas explains that nobody can effectively monitor and control national power grids, and argues that monitoring local grids and anticipating their specific power requirements is a far more effective solution than trying to predict the demands of the overall grid. This solution not only requires computer technology, but the establishment of sufficient generation and transmission infrastructure. Tsoukalas suggests that every electric meter in the United States could have an IP address. Six consortia, including one based at Purdue, are participants in a network reliability project that seeks to develop a "self-healing" electrical grid, and project manager Massoud Amin attests that smart grid technology Argonne and other groups are testing is a key component of the self-healing grid. "[The self-healing grid] is no longer a distant dream, but something we can make happen over the next three to five years, depending upon funding," he declares.
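TELOS's internals are not described in the article; as a rough illustration of the predicted-versus-actual comparison Liu describes, a naive forecaster might simply average the load seen at the same hour across recent days. All figures below are hypothetical:

```python
# Illustrative load forecaster (not TELOS itself): predict the next
# reading for a given hour as the mean of that hour over recent days,
# then compare the prediction with the actual meter reading.
def forecast(history, hour, days=7):
    """Average the load at `hour` over the last `days` days of history."""
    samples = [day[hour] for day in history[-days:]]
    return sum(samples) / len(samples)

# Hypothetical hourly load readings (megawatts), one list per day.
history = [
    [30, 28, 27, 35, 50, 62, 70, 65],
    [31, 29, 28, 36, 52, 60, 71, 64],
    [29, 27, 26, 34, 49, 61, 69, 66],
]
predicted = forecast(history, hour=4, days=3)
actual = 51.0
print(predicted, abs(predicted - actual))  # prediction within 1 MW of actual
```

A production system would use far richer models, but the comparison loop--forecast, observe, measure the error--is exactly the validation Argonne reports running.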
- "Playing With Technology"
Iowa State Daily (10/03/03); Mumford, Summer
Iowa State University held an open house to show off some advances in human-computer interaction (HCI). Faculty, staff, and students attending the open house were able to personally experience HCI, take a quiz by pointing lasers at a large screen in front of the room, and play the game Pong. Technology demonstrations included teleoperation, which allows virtual operation of remote-controlled objects, and Augmented Reality, which makes use of display screens built into glasses. In the future, teleoperation might allow a doctor in the United States to operate on a wounded soldier overseas, while Augmented Reality might allow a bulldozer operator to perceive the depth of a stake in the ground. Iowa State University has a new graduate and research program in HCI, which was approved by the Board of Regents in July, making it the second university in the country to offer a master's and a doctorate in HCI. The university's involvement with HCI dates back to the Atanasoff-Berry Computer, which was created at the university, said Jim Oliver, graduate program chair and associate professor of mechanical engineering. Oliver says HCI is promising and fertile ground for research and development. "We're just starting to scratch the surface," he said in a speech.
- "Q&A: Congressman Explains His Opposition to H-1B Visas"
Computerworld (10/01/03); Willoughby, Mark
The issue of immigration reform has put Rep. Tom Tancredo (R-Colo.) at odds with the Republican Party. Tancredo, chairman of the Congressional Immigration Reform Caucus, in early July introduced a bill that seeks to repeal H-1B visas for temporary workers, but H.R. 2688 still has not been scheduled for a hearing in the House Judiciary Committee. Tancredo says American workers are being displaced because the H-1B visas allow foreign workers to come to the United States, no one knows how many return home, and IT companies are now exporting an increasing number of jobs overseas. Tancredo says IT companies no longer oppose reducing the number of H-1B visas, but adds that they are now taking advantage of the L-1 visa program, which has no cap and few restrictions, and makes it difficult to determine whether visa holders are replacing U.S. workers. L-1 visas, which are good for seven years, have increased 58 percent from a year ago, while H-1B visas have declined, he says. Tancredo says President Bush opposes his bill, but notes there are efforts to restrict L-1 visas as well, including separate legislation introduced by Reps. John Mica (R-Fla.) and Rosa DeLauro (D-Conn.). The congressman says that temporary worker visas are unlikely to become an issue in next year's election unless Democrats can offer a viable alternative plan on guest-worker visas.
- "Huge Computing Power Goes Online"
BBC News (09/30/03)
A new computing project at the CERN laboratory in Geneva could have a profound impact on the world of computing in the years to come, CERN says. CERN will link computers from 12 countries around the world to study conditions resembling those just after the Big Bang, a scientific problem so enormous that some 70,000 computers will be used to analyze the data. Researchers will use Grid network technology to analyze the equivalent of more than 20 million CDs a year generated by smashing protons together at high energies in the Large Hadron Collider particle accelerator. The Grid will enable CERN to access processing power for a scientific problem that is clearly too massive for a single machine to handle by itself. Researchers envision such a virtual supercomputer network being made widely accessible, giving anyone enormous processing power from a desktop computer. "The technology now being developed for particle physics will ultimately change the way that science and business are undertaken in the years to come," says Ian Halliday, chief executive of the U.K. Particle Physics and Astronomy Research Council.
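The "20 million CDs a year" figure can be sanity-checked with quick arithmetic; the 700 MB per CD used below is a common CD-R capacity and is an assumption, not a figure from the article:

```python
# Rough scale of the annual LHC data volume described above,
# using decimal units (1 PB = 1,000,000,000 MB).
cds_per_year = 20_000_000
mb_per_cd = 700          # assumed typical CD capacity
petabytes = cds_per_year * mb_per_cd / 1_000_000_000
print(petabytes)  # => 14.0 petabytes per year
```

Roughly 14 petabytes a year makes concrete why a single machine cannot cope and a distributed Grid is needed.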
- "Innocent File-Sharers Could Appear Guilty"
New Scientist (10/01/03); Knight, Will
Peer-to-peer (P2P) networks might be vulnerable to attacks in which an unsuspecting party is tricked into downloading copyrighted files, according to computer experts, who were responding to "Entrapment: Incriminating Peer to Peer Network Users," an anonymous paper posted on a free Australian Web hosting service. The paper contends that file-sharers facing legal action from the Recording Industry Association of America (RIAA) should question the evidence of the music industry. The RIAA monitored P2P networks to discover the user names of alleged copyright infringers, and then went through Internet service providers to track down users. The paper maintains that the Gnutella file-sharing network can be used to trick a third party into believing an innocent user is searching for copyrighted files, and adds that innocent users can be fooled into hosting copyrighted files. Gnutella is vulnerable to such misuse, acknowledges Adam Langley, a U.K.-based P2P programmer. "The core point the author is making--the unreliability of the 'evidence' used to sue file sharers--is valid," says Ian Clarke, who invented the file-sharing network Freenet. Langley says other P2P networks might also be vulnerable and notes that innocent people could be victimized in other ways. Imperial College peer-to-peer networking researcher Theodore Hong notes that "technical weaknesses" in the design of P2P networks could shield users from liability.
- "Online Registries: The DNS and Beyond"
CircleID (10/01/03); Dyson, Esther
Esther Dyson has completed a report examining ICANN and the DNS in relation to the many active registries developing in the digital sphere, and a CircleID piece highlights some of the report's findings. Dyson contends that registries offer the most straightforward process for identifying, monitoring, and cross-referencing information, and are becoming better known in the computing sector. Groups need to monitor a greater amount of data within and outside of their own operations and are, in some cases, interested in going beyond a lookup to have more of an interactive relationship with people and things. The DNS represents the most visible active external registry, while the new Electronic Numbering (ENUM) standard, which converts telephone numbers into domain names, is emerging to support other Internet-based functions, as are the Handle System and the AutoID Object Name System (ONS). Dyson notes that questions about ownership, access, neutrality, and other issues are arising as these functions develop, pointing out that implementation of registries involves more than just software issues. "Registry systems need corresponding infrastructure: Standards, federated identity management, business 'protocols' and policies, and sometimes even political buy-in or active registration," Dyson writes. Governance issues concerning these registries remain in question, with ICANN's handling of the DNS still not proven either successful or unsuccessful even as ENUM and ONS are, in varying measures, dependent on the DNS. Dyson contends that the Handle System embodies superior technology and governance to the DNS but is not as visible.
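ENUM's mapping from telephone numbers to domain names follows a simple published convention: strip the number to its digits, reverse them, separate each digit with a dot, and append the e164.arpa suffix. A minimal sketch:

```python
def enum_domain(e164_number: str) -> str:
    """Convert an E.164 phone number to its ENUM domain name:
    keep only the digits, reverse them, join with dots, and
    append the e164.arpa suffix."""
    digits = [c for c in e164_number if c.isdigit()]
    return ".".join(reversed(digits)) + ".e164.arpa"

print(enum_domain("+44 20 7946 0123"))
# => 3.2.1.0.6.4.9.7.0.2.4.4.e164.arpa
```

The resulting name can then be looked up in the ordinary DNS, which is exactly why Dyson notes that ENUM inherits the DNS's governance questions along with its infrastructure.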
- "Fighting Talk"
The Engineer (09/29/03); Coppinger, Rob
The U.S. Defense Advanced Research Projects Agency (DARPA) is credited with the creation of the Internet, stealth aircraft, and other technologies and services originally designed for military use that have had a huge impact on civilian life as well. The agency will spend approximately $3 billion this year on 200 initiatives with primarily military applications, and DARPA director Anthony Tether says enhancing battlefield operations through pervasive communications technologies and robotic vehicles is a priority. Other projects DARPA is focusing on include U.K.-based initiatives to design battlefield lasers and secure orbital and battlefield communications, but there are many more that are classified. Without specifically naming such projects, Tether characterizes DARPA's role as a provider of technologies that allow the U.S. military to be omnipresent, and capable of monitoring and/or hitting any target. DARPA has stirred up criticism through its secrecy and several recent projects that have attracted controversy: A proposed terrorism futures market and Terrorism (formerly Total) Information Awareness (TIA), a system that would mine U.S. transactional databases to track terrorist suspects, were roundly blasted, and U.S. press reports expect the Senate to dramatically trim DARPA's budget as a result. Surveillance technology projects such as TIA have aroused a chorus of demands that DARPA declassify all its projects. And though DARPA research chiefly benefits the military, battlefield enhancement projects are causing friction between the agency and line combat officers, who dislike DARPA's apparent attitude toward war as a purely technical matter, according to Gary Chapman of the University of Texas.
Science and Government editor Daniel Greenberg notes that DARPA's tremendous influence is rooted in its vast financial resources and profound lack of accountability, while Tether argues that although DARPA itself may not be wholly responsible for breakthrough technologies like the Internet, their development would be considerably slower without the agency.
- "DHS Initiates Real-Time Cybersituation Project"
Computerworld (10/06/03) Vol. 31, No. 46, P. 16; Verton, Dan
The National Cyber Security Division (NCSD) of the Department of Homeland Security (DHS) is leading an effort to create a national cybersituation-awareness system capable of real-time analysis of cyberattacks, according to DHS executive Sallie McDonald. She says the NCSD has partnered with Symantec, SRI International, and Computer Associates International to devise a nonproprietary data collection system that operates on an automated security extranet and transmits incident reports to private-sector Information Sharing and Analysis Centers, which would in turn send the data to the situation-awareness system. "We will be deploying this in the federal sector, starting at the U.S. CERT first so we can see in real time what is happening across the nation," McDonald declared; U.S. CERT, a joint venture between Carnegie Mellon University's CERT Coordination Center and the Federal Computer Incident Response Center, was announced by the DHS on Sept. 15. Robert Liscouski of the DHS testified before Congress in September that the national incident reporting and analysis system will take information currently gathered by over 200 public, private, and university-based CERTs and send that data to U.S. CERT. He noted that his department expects to lower cyberattack response times to about 30 minutes within the next year. The new incident reporting and analysis system will make its debut at the first DHS-sponsored Cyber Security Summit in December, stated McDonald, while the DHS also intends to announce a security awareness project directed toward 50 million small businesses and home users. Bindview information security VP Scott Blake explained that recent controversies about the DHS' leadership and incident reporting framework have distracted people from government's failure to come through for the private sector with its national cybersecurity strategy.
- "Outwitting Spammers"
Network World (09/29/03) Vol. 20, No. 39, P. 48; Bort, Julie
The growing spam glut is a source of frustration for enterprises, which lose precious productivity in order to deal with unwanted emails. Spam filters are a popular anti-spam tool, but they come with their own drawbacks: Keeping networks up-to-date with the latest filters means frequent upgrades, while the risk that such tools will mislabel legitimate emails as spam increases as more filters are activated. "Machine-learning" technologies such as Bayesian filters and neural networks are being heralded as much more effective anti-spam measures, although they are not perfect. Users of Bayesian filters place spam and non-spam messages into two separate folders, and the filter trains itself to distinguish between the two by analyzing the unique identifying characteristics of the folders' contents; any errors the filter makes are sent by the end user to the appropriate folder, so the filter can note them. In this way, Bayesian filters can adapt to spammers' changing tactics, but the technology's chief disadvantage is its client-side orientation, making it unable to relieve the pressure that spam exerts on network processors. Some vendors are calling for Bayesian-like solutions that run at the email gateway to prevent both network clogging and false positives. Meanwhile, some vendors tout neural networks as a safer machine-learning alternative. The networks' spam-training software is placed on vendors' sites rather than on users' clients, and the email the network trains on is culled from bogus in-boxes set up for the express purpose of capturing spam. Neural network-enabled products function best when users update the gateway software at least once daily.
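The Bayesian training loop described above can be sketched in a few lines. Add-one smoothing and whitespace tokenization are simplifications, and all the messages are invented; real filters are far more elaborate:

```python
import math
from collections import Counter

class NaiveBayesFilter:
    """Minimal Bayesian spam filter in the spirit described above:
    train on two folders of messages, then score new mail by which
    folder's word statistics it better matches."""
    def __init__(self):
        self.counts = {"spam": Counter(), "ham": Counter()}
        self.totals = {"spam": 0, "ham": 0}

    def train(self, label, message):
        words = message.lower().split()
        self.counts[label].update(words)
        self.totals[label] += len(words)

    def score(self, label, message):
        # Sum of log-probabilities, with add-one smoothing so that
        # words never seen in a folder do not zero out the score.
        total = self.totals[label] + len(self.counts[label])
        return sum(
            math.log((self.counts[label][w] + 1) / (total + 1))
            for w in message.lower().split()
        )

    def classify(self, message):
        return max(("spam", "ham"), key=lambda lbl: self.score(lbl, message))

f = NaiveBayesFilter()
f.train("spam", "cheap meds buy now limited offer")
f.train("spam", "buy cheap stock picks now")
f.train("ham", "meeting agenda for the budget review")
f.train("ham", "please review the attached project plan")
print(f.classify("buy cheap meds"))         # spam
print(f.classify("budget review meeting"))  # ham
```

When the filter errs, the user refiles the message into the correct folder and retrains--the feedback loop that lets Bayesian filters track spammers' changing tactics.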
- "Ruling Over Unruly Programs"
CSO Magazine (09/03); Garfinkel, Simson
Sandstorm Enterprises CTO Simson Garfinkel writes that technical rather than legal issues make it theoretically impossible to write a program that can analyze any given suspect program to ascertain whether it contains friendly or unfriendly code. He explains that "The mathematics of computing make it impossible to write software that can figure out what other programs can do, prior to execution," and notes that current antivirus systems label programs as clean or infected by scanning them for known virus signatures--an approach that is ineffective when confronted with unknown viruses. Mathematician Alan Turing proved almost 70 years ago that the actions of even the simplest type of hostile program cannot be predicted. A popular strategy people use to "solve" the desktop security conundrum is to modify the operating systems so they will only run programs certified by publishers such as Adobe and Microsoft; but Turing's research demonstrates that even those programs may contain vulnerabilities. "Just about the only way to take back computer security from the morass that Turing created is to restrict what computer programs can do--that is, make computers less general-purpose," writes Garfinkel, who adds that a program's behavior can be made incalculable with very little effort. Another theoretical barrier works in security's favor: computers cannot efficiently crack truly complex "NP" problems, such as those underlying code-breaking, without defeating the mathematics that gives such problems their complexity; brute-force search--the longest and most arduous technique--is the only way people know to find a solution. Garfinkel acknowledges that solving an NP-complete problem, unlikely as that may seem, could facilitate the reverse-engineering of practically all encryption schemes that have ever been developed.
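Garfinkel's point about prediction-before-execution echoes Turing's diagonal argument, which can be sketched as follows. Here is_malicious() is a deliberately impossible stand-in for a perfect detector, and every other name is invented:

```python
# Sketch of the diagonal argument behind the claim above: suppose a
# function could always decide, before execution, whether a program
# will misbehave. A program that consults the detector about itself
# and does the opposite makes any such detector wrong.
def is_malicious(program) -> bool:
    """Stand-in for the perfect detector; Turing's result says no
    implementation can answer correctly for every program."""
    raise NotImplementedError("provably impossible in general")

def delete_every_file():
    raise RuntimeError("illustrative harm; never actually reached")

def contrarian():
    """Consults the detector about itself and does the opposite."""
    if is_malicious(contrarian):
        return             # labeled malicious -> behave harmlessly
    delete_every_file()    # labeled clean -> misbehave

# Whatever is_malicious() answered about contrarian, that answer would
# be wrong--which is why real scanners settle for signatures of known
# viruses rather than general pre-execution analysis.
```

The same construction explains Garfinkel's remark that a program's behavior "can be made incalculable with very little effort."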
- "Rethinking Software Testing"
Software Development Times (10/01/03) No. 87, P. 29; Rubinstein, David
Buggy software, products that fail to function as they are supposed to, and lost profits are the result of developers not testing their software until very late in the development process, and rushing through testing in order to get products out quickly. Industry experts broadly agree that the development cycle, software production costs, and time-to-market can be significantly reduced when flaws are detected earlier. According to vendors and industry experts, the chief strategy is to move testing to an earlier point in the development cycle--writing and testing code simultaneously being one example. Delaware North's Internet and Information Systems group has chosen a different route by keeping software development and quality assurance testing separate, employing virtualization software to replicate the development environment for testers and cut hardware and software costs, as well as reduce DLL conflicts and versioning difficulties. AutomatedQA's Robert Leahy reports that more developers need to conduct unit testing, and Peter Varhol of Compuware comments that developer/tester convergence requires new tools, while developers must become familiar with conducting static source-code analysis or unit tests before moving on to the next coding step. QACenter product manager Mark Eshelby believes that companies facing a tight product deadline can lower the odds of failure by identifying the highest areas of risk and testing them when the crunch comes, but says that automated tests remain an important component. He adds that 11th-hour code revisions can be made without dramatically affecting the test environment via the design of object-oriented test scripts and data-driven test automation.
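The "write and test code simultaneously" approach the article describes can be as simple as keeping unit tests next to the code they exercise. A toy example with invented names:

```python
# Illustrative unit test written alongside the code it checks, in the
# test-while-you-code style described above (function and values invented).
def applicable_discount(subtotal: float) -> float:
    """Return the discount rate for an order subtotal."""
    if subtotal >= 100.0:
        return 0.10
    if subtotal >= 50.0:
        return 0.05
    return 0.0

def test_applicable_discount():
    # Exercising the boundary values catches off-by-one errors here,
    # long before integration or release testing.
    assert applicable_discount(30.0) == 0.0
    assert applicable_discount(50.0) == 0.05
    assert applicable_discount(99.99) == 0.05
    assert applicable_discount(100.0) == 0.10

test_applicable_discount()
print("all unit tests passed")
```

Tests this small run in milliseconds, which is what makes it practical to run them after every change rather than deferring all testing to the end of the cycle.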
Buggy software, products that fail to function as they are supposed to, and lost profits are the result of developers not testing their software until very late in the development process, and rushing through testing in order to get products out quickly. Many development proponents strongly agree that the development cycle, software production costs, and time-to-market can be significantly reduced when flaws are detected earlier. According to vendors and industry experts, the chief strategy is to move up testing to an earlier point in the development cycle, writing and testing code simultaneously being one example. Delaware North's Internet and Information Systems has chosen a different route by keeping software development and quality assurance testing separate, employing virtualization software to replicate the development environment for testers and cut hardware and software costs, not to mention reduce DLL conflicts and versioning difficulties. AutomatedQA's Robert Leahy reports that more developers need to conduct unit testing, and Peter Varhol of Compuware comments that developer/tester convergence requires new tools, while developers must become familiar with conducting static source-code analysis or unit tests before moving on to the next coding step. QACenter product manager Mark Eshelby believes that companies facing a tight product deadline can lower the odds of failures by identifying the highest areas of risk and testing them when the crunch comes, but says that automated tests are still an important component. He adds that 11th-hour code revisions can be made without dramatically affecting the test environment via the design of object-oriented test scripts and data-driven test automation.