Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 7, Issue 750: Friday, February 4, 2005

  • "Prototype Printer Fails to Satisfy E-Voting Activists"
    Associated Press (02/03/05); Konrad, Rachel

    Diebold has developed a prototype printer designed to complement its touch-screen voting machines and satisfy many people's desire for voter-verifiable paper trails, although critics claim the device does not address all their concerns about intentional and unintentional voting snafus. Diebold's AccuView Printer Module displays voters' choices under glass or plastic alongside the voting machine, allowing users to cancel ballots and start over if they spot an error; what is more, elections officials can use the paper records to confirm close races. Avi Rubin with Johns Hopkins University's Information Security Institute reports that Diebold's move is but a baby step toward election security, and thinks that paperless voting systems should be prohibited entirely. The AccuView module has been criticized for not taking into account the potential for software bugs or faulty hardware that could record the wrong votes or make voting systems vulnerable to hackers, deletions, or other calamities. Stanford University computer scientist David Dill says the presence of a printer does not eliminate the potential for paper jams and breakdowns, and others claim printers further complicate the U.S. voting system quagmire. Critics are also concerned that county elections officials may not purchase printers because of budget limitations, while computer scientists are frustrated that the few private labs licensed to certify voting hardware keep their operations clandestine and outside of federal regulations. The Voting Systems Performance Rating group is developing standards for evaluating the reliability, accessibility, security, and privacy of voting machines.
    Click Here to View Full Article

    For information on ACM's e-voting activities, visit http://www.acm.org/usacm.

  • "Spammers' New Strategy"
    Washington Post (02/04/05) P. E1; Krim, Jonathan

    Experts warn that spammers have found a way to circumvent spamming machine "blacklists" by sending their junk email through ISP computers rather than transmitting it directly from individual PCs, a strategy that could further inflate the spam problem. Spamhaus Project director Steve Linford says spammers' discussion forums indicate that the leading bulk emailers are adopting this new tactic, and unless ISPs rapidly implement preventative measures, "We're really looking at a bleak thing." America Online director of anti-spam operations Carl Hutzler reports that ISP machines are now the direct source of 95 percent of all spam aimed at AOL members, and he and Linford agree that ISPs must be more assertive in monitoring and restricting the amount of mail sent from individual machines on their networks. Linford says ISPs should also adopt a more aggressive email authentication scheme, and laments the lack of improved anti-spam enforcement among many U.S. ISPs. "We're trying to get the word out, but we're not sure that people have taken us that seriously," Hutzler notes. Linford says the spammers' blacklist-subverting mechanism is contained in Send-Safe software available on a Web site hosted by MCI's UUNet Technologies division, and Spamhaus has repeatedly requested that MCI remove the site. Timothy Vogel with MCI's legal team for tech issues reports that his company cannot censor a product that is merely being advertised; what is more, UUNet does not actually host the site, but leases the Internet address to a company that serves as host. A recent study from Rockbridge Associates and the University of Maryland Robert H. Smith School of Business' Center for Excellence in Services estimates that deleting spam adds up to $22 billion in yearly productivity losses.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Global Software Patent Scrap Continues, EU Restarts Process"
    eWeek (02/03/05); Broersma, Matthew

The majority of the European Parliament's legal affairs committee voted on Feb. 2 to request that the European Commission restart the legislation process surrounding the European Union's proposed IT patents law. The current proposed legislation, which the Commission adopted as its Common Position last May, aroused the ire of open-source proponents claiming that the text would allow Europe to be inundated with pure software patents. Most observers doubt that the Commission will continue to push the current text, given EP members' fierce opposition. The Commission has several options at this point: To resubmit the original proposal or submit an amended proposal to the EP; conduct further analysis on the proposed directive's economic and legal ramifications; completely withdraw the proposal and push its reforms through other channels; or disregard the EP's vote and pass the current legislation. Or the Commission could request that the EU Council reexamine the proposal and develop a new Common Position that considers the Parliament's concerns as expressed in its first reading two years ago. Those who prefer a tighter clampdown on software patents doubt that legislators will agree to a Common Position similar to the one the Commission arrived at last May, given their recent realization of the dangers such patents represent. Software patents are currently unenforceable in Europe, whereas in America they are enforceable; this situation has made Europe a friendlier environment for open-source initiatives and smaller software developers. A representative of the European Information and Communications Technology Industry Association says his organization still fully supports the May 2004 text, arguing that "it blocks software patents, and we don't see what all the fuss is about."
    Click Here to View Full Article

  • "Sorting Out Web Services Security Standards"
    Computerworld (02/03/05); Salz, Rich

DataPower chief security architect Rich Salz notes that there is an excess of XML Web services security standards in various stages of development or deployment, which can be an overwhelming prospect for developers or architects; he therefore outlines a plan for sorting through these numerous specifications. Salz splits the development of Web services security standards into three components--initial development, standardization, and profiling. Initial development is frequently spearheaded by vendor-driven organizations such as the IBM/Microsoft-led WS-Workshop and the Liberty Alliance. The former, whose target is to devise a SOAP-based architecture for secure, distributed network transactions, appears to be standards-body-neutral, although not immune to political infighting among workshop members; the latter focuses on the development of Web- and XML-based federated identity. The primary standards bodies are the World Wide Web Consortium, which usually focuses on fundamental XML Web services standards, and the Organization for the Advancement of Structured Information Standards, which tends to develop higher-level standards. The standardization phase yields a formal open standard that is approved and often refined by an open standards body. Profiling standards and winnowing them down to a small number is often required to boost interoperability, and Salz writes that the Web Services Interoperability Organization (WS-I) is the source of the most mature and compatible standards. The author recommends that basic two-party interactions employ the WS-I Basic Security Profile to specify the use of XML Digital Signature and encryption for securing SOAP messages, and SAML for sender identification.
    Click Here to View Full Article

  • "Mirror That Reflects Your Future Self"
    New Scientist (02/02/05); Knight, Will

    Researchers at Accenture Technology's lab in France are developing a digital "mirror" that modifies a person's image to show the predicted effects of overindulgence, inactivity, and other factors based on the subject's behavioral patterns. "Helping people visualize the long-term outcomes of their behavior is an effective way to motivate change," contends Stanford University's B.J. Fogg. The system is designed to display an image of the subject via wireless camera, while computer software constructs a lifestyle profile based on camera surveillance of the person's daily activities, as well as dietary information provided by the subject himself. A different software package will then apply this profile to determine how the digital image should be changed to reflect anticipated weight gain, changes in skin tone, and other physical manifestations of excess. A third software package will modify the subject's face, and the Accenture team would like the system to function in real time. Accenture lab director Martin Illsey wants a prototype mirror to be ready by mid 2005. But though some people think the technology could be very effective in spurring users to change their lifestyle, others are skeptical. "I don't think any system which presents a negative image of the user will be taken up by many people," says University of Bristol ubiquitous computing expert Cliff Randell.
    Click Here to View Full Article

  • "UC Riverside Researchers Are Testing the Accuracy of Data Mining Programs Developed for Homeland Security"
    UCR News (02/03/05)

    The University of California, Riverside (UCR) is working with Lucent's Bell Laboratories on a data-mining validation project that will help the Department of Homeland Security test the accuracy of behavioral patterning. The one-year project is funded with an $800,000 grant from the Department of Homeland Security and will develop a model for testing how accurate data-mining programs are, especially data-mining used to identify patterns of behavior. The U.S. government plans to use such technology to better understand potential national security threats, says UCR statistics professor and principal investigator for the project Daniel Jeske; "These are the tools that look through different types of data and try to piece together a story," he says about the Department of Homeland Security data-mining tools that will be examined. "The tools say an event could happen based on patterns that are found in the data. Sometimes the tools are referred to as information-discovery systems." The data-mining accuracy model needs to be flexible to accommodate the widely varying types of data that will be used to identify behavioral patterns, and could be applied to commercial purposes in the future; marketers could use the system to more accurately characterize customer behavior, for example. The project will involve five graduate students and four staff members from UCR's statistics and computer science departments, as well as researchers from Bell Laboratories.
    Click Here to View Full Article

  • "A New Lease on Life for Biological Information"
    IST Results (02/04/05)

    European researchers working under the European Union's Information Societies Technologies program have created a set of information search and management tools that work with large, distributed data stores. The Online Research Information Environment for the Life Sciences (ORIEL) project provides software tools that help bioinformatics researchers more easily query different databases containing various data types, including publications information, complex molecular datasets, and multimedia data. Previously, executing a simple query meant finding relevant databases, searching them, and then patching the results together to create an integrated picture. Project coordinator Les Grivell says ORIEL tools can be used for other applications that involve large collections of multimedia information. ORIEL boosted existing semantic links between different databases, and created a number of widely used standalone tools, such as those that allow the exploration, extraction, and integration of information from scientific literature. The Information Hyperlinked over Proteins (iHOP) system creates interlinked references to genes, proteins, chemical compounds, and other information that is culled from the 15 million abstracts contained in the U.S. National Library of Medicine's PubMed bibliographic database; researchers can access iHOP and other ORIEL-created tools over the E-BioSci platform, which is a federated network of European life sciences platforms. E-BioSci provided a rigorous testing environment for ORIEL prototypes. Although federating all scientific data through interoperable standards used globally would be ideal, the reality is that tools such as ORIEL technology are needed to effectively make use of available scientific data, says Grivell.
    Click Here to View Full Article

  • "Open-Source Backer Warns of 'Patent WMDs'"
    IDG News Service (02/02/05); McMillan, Robert

    The issue of software patents was discussed by a panel of major open-source figures at the recent Open Source Development Labs (OSDL) Enterprise Linux Summit. Linux kernel project developer Linus Torvalds noted that the open-source community has long identified software patents as a problem, and said proprietary vendors are also starting to come to the same conclusion. It is estimated that 150,000 to 300,000 software patents exist in the United States, and many open-source developers regard the bulk of these patents to be trivial, arguing that software innovations are better shielded by copyright law. Lotus Development founder Mitchell Kapor partly blamed the proliferation of software patents on the U.S. Patent and Trademark Office's laziness in enforcing its own policies. He warned that Microsoft will eventually take action against open-source projects by launching "patent weapons of mass destruction" in the form of patent lawsuits. "Their business model no longer holds up in an era where it's clear that open-source is simply an economically superior way to produce software," Kapor said. In January, IBM released 500 of its patents to the open-source community in the hopes of encouraging IT innovation. Sun Microsystems made over 1,600 of its own patents available shortly afterwards, and OSDL CEO and panel host Stuart Cohen said more vendors are likely to follow suit.
    Click Here to View Full Article

  • "That Next Call May Come From a Wireless Hot Spot"
    New York Times (02/03/05) P. E8; Eisenberg, Anne

    Handsets that carry voice over Internet protocol (VoIP) calls over Wi-Fi are expected to penetrate the consumer and enterprise markets deeply, according to experts such as ABI Research analyst Philip Solis. Voice over Wi-Fi phones digitize voice calls and transmit them over the Internet in the same manner as VoIP devices--only wirelessly, over broadband networks. Solis expects the technology to be very appealing to people who wish to cut their long-distance phone bills, and he notes that dual-mode handsets that facilitate both voice over Wi-Fi and cellular calls are beginning to show up. Martin Fichter of Siemens Communications says such handsets will lower the amount of traffic on cellular networks, making it less urgent for carriers to build more cellular capacity and saving them money: "The call is being connected through the Internet, so everybody wins," he comments. Fichter also says the phones would allow a company's office staff to save money, while consumers residing in areas with poor cellular coverage could also benefit. VoIP service provider Vonage plans to start selling dual-mode handsets by mid 2005, according to CEO Jeffrey Citron. Meanwhile, Motorola's Bob Doerr says his company is working on a dual-mode phone for business clients that uses a single number for both cellular and broadband calls, and that should effect seamless call transfers between the two networks. Nokia's Dan McDonald is convinced that voice over Wi-Fi will proliferate to the point where "literally every mobile device will have Wi-Fi capability in it."
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Microsoft Invests in Europe Through EuroScience Initiative"
    EurekAlert (02/02/05)

    In his Feb. 2 keynote address to government leaders and public officials attending the Government Leaders Forum in Prague, Microsoft Chairman Bill Gates announced the EuroScience Initiative, a multi-year collaborative research program designed to ramp up science and computing innovation and fuel economically and socially valuable scientific discoveries. The initiative boasts three core research programs: Novel computing paradigms for tackling basic computing science challenges through the application of biological, chemical, and natural-world principles; computational science, a collaborative program to cultivate next-generation computational tools that augment researchers' productivity; and intelligent environments built from new applications for technologies such as sensors, embedded computing, and wireless networks that are seamlessly integrated with everyday life and society. Microsoft Research will commit considerable resources and expertise to the EuroScience Initiative and forge new public-private alliances with European institutions. A major area of investment is a series of Microsoft Research centers of excellence, the first of which officially opened on Tuesday. The goal of the center for Computational and Systems Biology, a joint venture between Microsoft, the Italian government, and the University of Trento, is to develop new computational tools for improving the comprehension and prediction of complex biological system processes, which could yield a better understanding of disease causes, more effective vaccines, and new treatments. Other major components include the Career Development Fellowship Program and additional scholarships, awards, and workshops designed to advance and support European intellectual capital.
    Click Here to View Full Article

  • "Antispoofing E-Mail Technology Deployed"
    IEEE Spectrum (01/31/05); Svoboda, Elizabeth

    Spammers have long relied on address spoofing to trick email users into responding to their messages, but two new technologies are being broadly deployed by email service providers, ISPs, and domain name holders in order to prevent spoofing. Spoofed email appears to come from respected domains, such as citibank.com or whitehouse.gov, in order to play on people's trust in those institutions; but the Anti-Spam Technical Alliance has approved two new sender-authentication mechanisms that, given proper and widespread implementation, effectively eliminate the threat of address spoofing. The members of the group include Microsoft, Yahoo!, America Online, Earthlink, and others, and they have been joined by thousands of other domain holders that have adopted the technologies. DomainKeys, developed by Yahoo!, associates cryptographic public keys with domains in the Domain Name System (DNS), allowing recipient servers to verify the cryptographic signature on message headers; only email that actually originated with the claimed domain will match the public key stored in the DNS. The Sender Policy Framework (SPF), developed by Microsoft and America Online, does not use cryptography, but instead uses IP addresses to show where email originated. The drawback with SPF is that forwarding can substitute the forwarder's address for the original, causing legitimate mail to fail the check. DomainKeys also faces a hurdle with forwarding because any information appended in transit invalidates the signature, which can make legitimate mail look like spam. SPF and DomainKeys are complementary technologies that together create a more secure email system; experts say spammers will still be able to operate within the overall email system, but spoofing will become much more difficult.
    Click Here to View Full Article
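    The SPF mechanism described above can be illustrated with a toy sketch (the registry contents, domain names, and function name here are purely illustrative, not real SPF records or APIs): each domain publishes the IP addresses authorized to send its mail, and a receiving server compares the connecting IP against that published list.

    ```python
    # Toy model: a "DNS" registry mapping each domain to its authorized
    # sending addresses. Real SPF publishes this in DNS TXT records.
    SPF_REGISTRY = {
        "example.com": {"192.0.2.10", "192.0.2.11"},
        "example.org": {"198.51.100.5"},
    }

    def spf_check(claimed_domain, sending_ip):
        """Return 'pass' if the IP is authorized for the domain, 'fail'
        if it is not, and 'none' if the domain publishes no record."""
        authorized = SPF_REGISTRY.get(claimed_domain)
        if authorized is None:
            return "none"
        return "pass" if sending_ip in authorized else "fail"

    print(spf_check("example.com", "192.0.2.10"))   # legitimate sender: pass
    print(spf_check("example.com", "203.0.113.9"))  # spoofed source: fail
    ```

    The sketch also makes the forwarding drawback concrete: a forwarding server's IP would not appear in the original domain's list, so legitimately forwarded mail fails the same check a spoofer does.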

  • "Video Game Doctor"
    California Aggie (02/02/05); Xie, Fei

    Doctors consider video games and other simulation technologies to be valuable tools for becoming more effective surgeons. Dr. James Rosser, director of the Beth Israel Medical Center's Advanced Medical Technology Institute, has found that students who frequently play video games perform laparoscopic surgery more efficiently and with fewer errors. The procedure employs joystick-controlled fiber-optic cameras and very small tools, and Rosser concludes that computer games "were the determining factor--more than years of experience, gender, dominant/non-dominant hand, all of that." Meanwhile, the UC Davis School of Medicine and Medical Center's Center for Visual Care trains students on cutting-edge instructional simulators such as "Stan," a mannequin that can mimic breathing, heartbeat, pupil dilation, and other bodily functions, as well as how those functions change when medication is administered. Center for Visual Care medical director Peter Moore notes, "We can provide a realistic setting to develop critical thinking and practice a variety of techniques that play out in real time" so that students can gain new skills and doctors can become more effective team members. Video-game companies and hospitals are also collaborating on simulations that can train professionals for mass-casualty events without disrupting hospital operations. However, there is little consensus on which health care problems video games should be designed to address. Federation of American Scientists' Learning Federation director Kay Howell hopes that events such as the recent Video Game/Entertainment Industry Technology and Medicine Conference will lead to guidelines.
    Click Here to View Full Article

  • "Mandrakesoft, BRGM to Take on Clustering Project"
    HPC Wire (02/02/05)

    The computer science research center INRIA will lead a research effort into finding ways to turn the unused computing power of a network of desktop machines into a cluster. As part of IGGI, a research project into cluster and grid computing, INRIA will study technologies that allow for the dynamic tracking and controlling of resources. Linux clustering offers a more affordable way to obtain the high speeds of supercomputers, but the need for dedicated machines set aside solely for clustering has been a drawback. "IGGI will develop an integrated solution for harnessing this power--so that a desktop machine can be running an Office program one minute and processing some kind of distributed computing job the next," says Jacques Vairon of BRGM, another participant in IGGI. The French geological survey institute will make its intranet (about 700 PCs) available for the project, and Mandrakesoft, the European leader in Linux and Open Source software, will make its Linux technologies more suitable for the IGGI's work. "An IGGI grid will have all the capabilities of a Linux cluster, but that cluster will constantly evolve according to available resources," says Mandrakesoft CEO Francois Bancilhon.
    Click Here to View Full Article

  • "Computer Woes Hinder FBI's Work, Report Says"
    Washington Post (02/04/05) P. A15; Eggen, Dan

    Inspector General Glenn Fine of the Justice Department reported yesterday that the FBI's crime prevention operations are significantly impeded because its current IT systems do not allow agents to adequately share information, while costly attempts to upgrade those systems may have to be jettisoned due to bad planning and flawed management. The FBI admitted last month that the latest version of the $170 million Virtual Case File system, designed to let agents handle almost all documents electronically, is already obsolete and is likely to be scrapped before deployment. Sen. Patrick Leahy (D-Vt.) told a Senate Appropriations subcommittee yesterday that "escalating costs, imprecise planning, mismanagement, implementation concerns and delays" were responsible for the bureau's botched tech overhaul, and criticized the FBI for not being more up-front with Congress about the problems. Officials with Science Applications International, the company contracted by the FBI to design the Virtual Case File software, said the project was plagued by management turnover and frequent design changes from the bureau, which Fine's report verified. FBI director Robert Mueller III acknowledged at the Senate hearing that the delayed computerized case-file system is frustrating, but disputed Fine's assertions that agents' job performance is suffering. He argued that the lack of such a system "does not prevent us from fulfilling our counterterrorism, intelligence and law enforcement missions." He also said that not all of the $170 million Virtual Case File investment will be written off if the project is scrapped, as some $66 million either has not been spent or has been channeled into products with other applications.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Strict US Visa Rules Benefit European, Asian Firms"
    Agence France Presse (02/02/05)

    Foreign executives discouraged from going to America because of strict U.S. visa policies are likely to turn to more cordial Asian and European countries. This could amount to significant business losses for American firms, warned U.S. National Foreign Trade Council President William Reinsch at a recent forum hosted by the Global Business Dialogue. Addressing an audience that included Homeland Security and State Department officials, Reinsch said the difficulties faced by foreigners entering the country for business, education, or tourism following Sept. 11, 2001 have cultivated a negative perception of the United States that will take a long time to shake off. The Santangelo Group estimated in a recent report that the U.S. economy lost at least $30 billion during the year up to last summer because of visa processing delays and denials. Reinsch disputed the assertions of immigration policy director Lora Ries and deputy assistant secretary for visa services Janice Jacobs that the government is progressively tackling the issue, citing a January meeting with company representatives; he reported that the meeting only demonstrated "idiosyncratic improvements here and there around the edges." Reinsch also noted that U.S. companies were facing difficulty in bringing in foreign customers for trade conferences or business negotiations, while sending executives for training or participation in collaborative software or high-tech projects was also fraught with problems. He concluded that companies discouraged by visa restrictions will establish branches elsewhere. Meanwhile, Microsoft leader Bill Gates told attendees at the recent World Economic Forum that the number of Asians entering U.S. computer science departments has dropped 35 percent because of visa restrictions, which is putting the United States' status as the "global IQ magnet of the world" at risk.
    Click Here to View Full Article

  • "A Step Forward in the Voting Wars"
    Newsweek (02/07/05); Levy, Steven

    Despite the pressing need for uniform electronic voting machine standards, their development has been hindered by fierce disagreements between computer scientists, election officials, and system manufacturers. The scientists say that election officials are unfamiliar with the realities of high-tech security; election officials think the scientists are ignorant of the intricacy of real-world voting; and the manufacturers are reluctant to publicly disclose their products' inner workings. However, members of all three groups as well as voting-interest proponents have formed a coalition dedicated to creating unified voting-system standards that states and counties could use to rate the security, privacy, reliability, and accessibility of polling devices before purchase. The Voting Systems Performance Rating (VSPR) coalition is the brainchild of cryptographer David Chaum, who says transparency is the most critical element in voting systems. The public nature of VSPR's work would enable voters to watchdog the voting process more effectively. Two of the three major U.S. election-equipment manufacturers have joined VSPR. Diebold's Mark Radke says his company is considering coming aboard, although he cannot say when it would make its decision. He also says a recent report commissioned by Ohio's secretary of state concluded that Diebold had patched all major voting-system flaws flagged in a previous study, but Johns Hopkins professor Avi Rubin claims the Ohio testing was carried out by people who "really didn't know much about security."
    Click Here to View Full Article

  • "Little Sensors, Big Bucks"
    Electronic Business (01/05); Takahashi, Dean

    Smart dust networks are being deployed in a number of test projects, just eight years after University of California computer science professor Kris Pister began researching the technology. Supervalu grocery stores use smart dust sensors to monitor the efficiency of their refrigerators, and British Petroleum monitors vibrations on tanker equipment via smart dust networks. Smart dust has moved from the laboratory to the real world in the last year, but cost and standards issues remain unresolved. The ZigBee Alliance represents one of the largest groups behind smart dust development and is creating the standards that will underlie smart dust radio networks. The market for ZigBee devices is expected to reach 150 million units by 2008, according to In-Stat analyst Joyce Putscher; predicted applications include building control systems that turn off air conditioning in unoccupied rooms and temperature-sensing chips that guarantee the quality of a bottle of wine to the customer. Science Applications International is researching smart dust networks that could be spread over battlefields and allow military commanders to track the movement of enemy troops. Currently, however, complete smart dust nodes that include sensors and radios cost between $25 and $125. Although there are 30 chip firms targeting the ZigBee market and others specializing in necessary components, industry insiders say companies that can integrate all the analog and digital parts needed for a smart dust mote will be successful. Intel intends to be a smart dust player even though the company has traditionally shied away from low-cost chip markets; Intel Research associate director Hans Mulder says the smart dust market will take several decades to mature, but that it eventually could outsell the CPU market.
    Click Here to View Full Article

  • "Quantum Key Distribution"
    Industrial Physicist (01/05) Vol. 10, No. 6, P. 22; Ouellette, Jennifer

    The ever-escalating arms race between code generators and code crackers could potentially be broken with quantum cryptography, and public and private sources worldwide are expected to inject approximately $50 million into quantum cryptography research over the next few years. Major research efforts are investigating quantum key distribution (QKD), an encryption methodology that functions according to Heisenberg's uncertainty principle, which dictates that a particle's behavior is disrupted by the mere act of observing or measuring that particle. Thus, any attempt to intercept quantum-encrypted data is immediately detectable by the sender and receiver. Most current data-transmission security systems employ public-key cryptography, in which messages are encrypted with a public key, which anyone with access to the global public-key registry can obtain, and decrypted with a private key that only the transmission's receiver holds. However, this scheme can be thwarted by a powerful enough computer, whereas a QKD scheme would constantly and randomly generate new private keys that are automatically shared by both sender and receiver. Secret keys are generally encrypted in either the polarization or relative phase of photons emitted by conventional lasers as very dim light pulses. Commercial QKD systems have been released by ID Quantique and MagiQ Technologies: ID Quantique's scheme encodes data in the photon's phase. The last several years have seen progress in various efforts to extend the transmission range of quantum-encrypted data, and NEC and Hewlett-Packard are developing quantum repeaters with this goal in mind. Another area of interest involves tapping the phenomenon of quantum entanglement to make transmission interception impossible.
    Click Here to View Full Article
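    The key-sharing step described above can be sketched classically as a toy simulation (a simplification of BB84-style sifting; the function name and details are ours, and real QKD of course uses actual photons, not a random-number generator): sender and receiver each choose random measurement bases, and only the bits measured in matching bases become the shared secret key.

    ```python
    import random

    def bb84_sift(n, seed=0):
        """Toy BB84-style sifting: keep only the bits where the sender's
        and receiver's randomly chosen bases happen to match."""
        rng = random.Random(seed)
        sender_bits = [rng.randint(0, 1) for _ in range(n)]
        sender_bases = [rng.choice("+x") for _ in range(n)]    # '+' rectilinear, 'x' diagonal
        receiver_bases = [rng.choice("+x") for _ in range(n)]

        # Matching bases read the sender's bit exactly; mismatched bases
        # model a quantum measurement that yields a random result.
        received = [b if sb == rb else rng.randint(0, 1)
                    for b, sb, rb in zip(sender_bits, sender_bases, receiver_bases)]

        # The two parties publicly compare bases (never bits) and discard
        # positions where the bases differ.
        keep = [i for i in range(n) if sender_bases[i] == receiver_bases[i]]
        return [sender_bits[i] for i in keep], [received[i] for i in keep]

    sender_key, receiver_key = bb84_sift(64)
    print(sender_key == receiver_key)  # sifted keys agree with no eavesdropper
    ```

    An eavesdropper measuring in her own random bases would corrupt roughly a quarter of the sifted bits, and comparing a public sample of the key would reveal that elevated error rate, which is the detectability property the article describes.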

  • "Computational Science Demands a New Paradigm"
    Physics Today (01/05) Vol. 58, No. 1, P. 35; Post, Douglass E.; Votta, Lawrence G.

    Los Alamos computational physicist Douglass Post and Sun Microsystems engineer Lawrence Votta write that, as a research methodology, computational science will not enjoy the same vaunted status as theory and experimentation until it meets three challenges: The performance challenge, the programming challenge, and the prediction challenge. The performance challenge requires the continued exponential growth of computer performance and is expected to sustain its momentum for at least another 10 years thanks to increasing processor speed and massive parallelization. The programming challenge, which involves the generation of code that can effectively tap the capabilities of ever-more complex computers, is on its way to being met as researchers start to develop languages and software tools. But the prediction challenge, whereby truly predictive complex application codes are developed, is the most pressing obstacle, as such codes are not only complex, but developed over long periods of time by large teams whose efforts are difficult to integrate. The authors explain that computational science must become mature enough to produce results whose accuracy and reliability matches those of theoretical and experimental science, and key elements in this maturation include retrospective study of successful and unsuccessful prediction efforts, and the community's acceptance of the insights they yield. A solid code verification/validation framework is essential to code predictions' credibility, and Post and Votta write of the need for better verification methods and greater support for code-validation testing throughout the scientific community. The authors also recommend that computational scientists should look to the IT community for insights on the qualities of successful code-development team organization, management, and leadership, as well as methods for improving software quality.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)