


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 5, Issue 454: Wednesday, February 5, 2003

  • "The Net Is Dangerous, Research Says"
    IDG News Service (02/03/03); Roberts, Paul

    In its latest Internet Security Threat Report, Symantec notes a 6 percent decline in the number of weekly cyberattacks on corporate networks between the first half and second half of 2002, as well as a decrease in the number of severe events. However, the security vendor also recorded over 2,500 new software vulnerabilities disclosed last year, an 81 percent increase over 2001 estimates, while moderate- and high-severity security holes rose almost 85 percent over 2001. Symantec says these flaws are especially attractive targets for blended attacks that exploit multiple security holes. The report finds that power and energy companies suffered more attacks, and more severe attacks, than companies in other industries, while telecommunications providers, financial services companies, and major nonprofits were also frequently targeted. Furthermore, 78 percent of all attacks recorded by Symantec in the last six months of 2002 were launched from Microsoft operating systems. Most cyberattacks originated in the United States and South Korea, while Poland and South Korea were the nations that suffered the most attacks per Internet user. Symantec suggests that companies and organizations could better protect their networks by honing their security configuration and patch management strategies, and by carefully examining vulnerable instant messaging and peer-to-peer applications that run on corporate networks. They should also be cognizant of internal threats and of the security risks of Web-based applications that can be accessed remotely.
    http://www.pcworld.com/news/article/0,aid,109187,00.asp
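
    The 81 percent figure implies a 2001 baseline that the article leaves unstated. A quick back-of-the-envelope check in Python, assuming the percentage applies to the total vulnerability count (an assumption, since the report's exact baseline is not given):

        # Implied 2001 vulnerability count from the figures above.
        # Assumes the 81 percent increase applies to the total count;
        # the 2001 baseline itself is not stated in the article.
        vulns_2002 = 2500        # "over 2,500 new software vulnerabilities"
        increase_over_2001 = 0.81

        implied_2001 = vulns_2002 / (1 + increase_over_2001)
        print(round(implied_2001))  # roughly 1,381 vulnerabilities in 2001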

  • "Cyber-Security Plan Counts on Private Sector's Input"
    eWeek Online (02/04/03); Fisher, Dennis

    The final version of the National Strategy to Secure Cyberspace will be issued in the next couple of weeks, and one of its priorities is a national cyberspace security response system that will rely heavily on information contributed by the private sector. Key to its success will be an expansive information-sharing initiative both inside and outside the federal government, while a separate office within the Homeland Security Department will be tasked with managing the exchange of information between government and industry. The plan also suggests that industry build a centralized network operations center that runs round-the-clock to evaluate the health of the Internet and integrates with both the cyberspace security response system and the Homeland Security Department's centralized capability. This facility would also direct the storage of critical-infrastructure-protection information volunteered by non-government agencies. Civil libertarians and privacy proponents are likely to take note of a recommendation that the Justice Department and the Census Bureau collaborate on improving data about victims of cybercrime, while another controversial point is the lack of suggested remedies for flaws in software products and critical protocols and systems. "As we move to wireless everywhere and universal Web-control of appliances, if the government doesn't act quickly, millions of unprotected systems will be made available to any attackers who choose to use them," warns SANS Institute research director Alan Paller. Whereas an earlier version of the strategy endorsed bringing market forces to bear on software vendors in order to boost their products' security, the latest draft merely recommends that the industry promote more secure products and deployments.
    http://www.eweek.com/article2/0,3959,861870,00.asp

  • "Copyright Legislation Unlikely, Both Sides Say"
    Los Angeles Times (02/05/03) P. C3; Sanders, Edmund

    Representatives of entertainment and technology concerns agreed at a Precursor Group investor conference in Washington, D.C., on Tuesday that there is little chance of Congress enacting any significant copyright legislation this year. The two sides have repeatedly locked horns over piracy and copyright, and have lobbied Congress for mandates that have fueled the controversy. However, such measures will probably remain in limbo while the government deals with more pressing issues, such as the economic slowdown and the increasing likelihood of war with Iraq. Motion Picture Association of America (MPAA) general counsel Fritz Attaway declared that no bills will pass as long as the various industries fail to reach some kind of accord. Although the Commerce Department's Bruce Mehlman anticipates "a lot of congressional sound and fury," he put the chances of Congress passing major legislation this year at 10 percent. Bills that became rallying points for both sides last year included a proposal from Sen. Ernest F. Hollings (D-S.C.) that would require tech companies to make copyright protection a standard feature, and a measure from Rep. Howard L. Berman (D-Calif.) that would give record companies the legal right to block file-sharing of copyrighted works. Meanwhile, Rep. W.J. Tauzin (R-La.) is drafting a proposal to accelerate the deployment of digital TV by settling major copyright issues. One bill that Mehlman thought had better odds of passage this year calls for the adoption of broadcast flags that block the retransmission of TV programs over the Internet.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Group Sees Beauty in an Attempt to Revive BeOS Operating System"
    Wall Street Journal (02/05/03) P. B7B; Baker, Nick

    Programmer Michael Phipps and his team of 20 other volunteers are attempting to create OpenBeOS, an open-source version of the BeOS operating system, the flagship product of now-bankrupt Be Inc. Palm halted BeOS development after purchasing Be's intellectual property in 2001, leaving the operating system at its fifth version. Phipps is pursuing OpenBeOS because he appreciates the artistic simplicity of the design and wants to create a nimble, robust operating system for PCs, even though that market is dominated by Microsoft and Apple. Phipps says other operating systems have accumulated unnecessary legacy baggage, which he calls "cruft" and compares to barnacles growing on the side of a ship. BeOS was built primarily for Be's BeBox computers and rarely caused system crashes, unlike many earlier versions of Microsoft Windows. Open-source leader Eric Raymond says this aesthetic aspiration is key to well-built code, just as it is to well-built bridges. "Bridges should look beautiful," Raymond opines. "If they don't, there's a kind of ungracefulness that's likely to reflect structural problems in the design." Raymond says that, given the popularity of Linux, programmers are more likely to fold OpenBeOS's best features into Linux than to use OpenBeOS itself. Still, Phipps expects to release a test version of OpenBeOS by August, two years after the project started, and to give it a new name in order to avoid branding conflicts with Palm.

  • "Report Envisions a Future Cyberinfrastructure That Will 'Radically Empower' the Science and Engineering Community"
    ScienceDaily (02/04/03)

    The National Science Foundation's (NSF) Advisory Committee for Cyberinfrastructure believes scientific and engineering research stands to gain significantly from coming advances and convergences in computing technology. In a report released today, "Revolutionizing Science and Engineering Through Cyberinfrastructure," the NSF committee argues that computing infrastructure connected to the Internet is crucial to scientific and engineering research needs. This cyberinfrastructure--which includes distributed computing and information-sharing technologies--may develop to the point that the medium becomes a fundamental component of how research is conducted and who gets to collaborate. University of Michigan professor and committee member Dan Atkins foresees cyberinfrastructure connecting research and science "on a global scale." However, more research funding is needed to develop the necessary technologies for this scientific information superhighway, the report concludes. Technology projects such as the NSF-supported NVO, GriPhyN, and NEES, and NSF-participant projects Partnerships for Advanced Computational Infrastructure (PACI), TeraGrid, and the Digital Libraries Initiative, all received praise in the report. The report envisions a cyberinfrastructure program that combines cyberinfrastructure research with research on science and engineering applications, market-grade software, and operations and equipment. The study argues that acting now will help prevent data from being locked away in non-interoperable applications and protocols.
    http://www.sciencedaily.com/releases/2003/02/030204075912.htm

  • "Investigation to Include Onboard Computers"
    Washington Post (02/04/03) P. A16; Cha, Ariana Eunjung; Tate, Julie; Drezen, Richard S.

    Although inquiries into the disintegration of the space shuttle Columbia during reentry are currently focused on its heat-resistant tiles, investigators have not ruled out the possibility that the computerized onboard flight controls could have made a catastrophic error that caused or contributed to the tragedy. Still, NASA shuttle program manager Ron Dittemore notes that there were no recorded malfunctions in the shuttle's computers before communications with the spacecraft ceased. The Columbia was equipped with four computers and one backup designed to maintain the ship's safety and course using data collected from the orbiter's sensors and satellites. The onboard systems registered a temperature surge on the left side of the shuttle and increased drag on the right wing, and compensated by igniting two of the four right-side jets; investigators are using software to simulate the shuttle's final moments to determine whether the computers may have over- or undercompensated. Naval Postgraduate School professor John Arquilla is doubtful that a software failure is primarily responsible for the crash, given the current evidence and the fact that more than 110 "programmed control flights" have been completed without a major software bug cropping up. Still, a 2002 report from the National Academy of Sciences recommended that off-the-shelf software products be eliminated from the shuttle program, given that they are more likely to develop glitches than customized programs. IBM contractors built the shuttle's core software, while the United Space Alliance maintains and upgrades it. Flight control software such as that used on the shuttle is also found in commercial aircraft.
    http://www.washingtonpost.com/wp-dyn/articles/A21041-2003Feb3.html
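
    The compensation sequence described above--sensors report asymmetric drag, the computers respond by firing jets--is, at its core, a feedback-control loop. The toy Python sketch below illustrates the over/under-compensation question investigators are probing; every constant and threshold in it is invented for illustration, and the actual shuttle flight software is vastly more sophisticated.

        # Illustrative only: a naive proportional controller of the kind
        # the article alludes to. Gains and thresholds are hypothetical.
        FIRE_THRESHOLD = 0.05   # rad/s of yaw-rate error before acting
        GAIN = 25.0             # jets commanded per rad/s of error

        def jets_to_fire(yaw_rate_error):
            """Return how many right-side jets (0-4) this toy controller fires."""
            if abs(yaw_rate_error) < FIRE_THRESHOLD:
                return 0
            return min(4, int(GAIN * abs(yaw_rate_error)))

        # Investigators' question in miniature: for the disturbance the
        # sensors reported, was the commanded correction too strong,
        # too weak, or about right?
        for error in (0.02, 0.08, 0.40):
            print(error, "->", jets_to_fire(error), "jets")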

  • "Data Storage Leap Could Produce Film Library on a Disk"
    NewsFactor Network (02/04/03); Martin, Mike

    A team of Washington University researchers led by experimental physics professor Stuart Solin reports that it has created a prototype disk drive with at least 40 times the storage capacity of conventional models, a breakthrough that could pave the way for drives with enough space to store a million novels or a thousand films, according to William Sweet of the IEEE. In typical magnetic storage devices, plastic disks coated with several layers of thin metallic films store information by converting electrical signals into magnetized "bits" on the film; increasing storage capacity means shrinking the bits, which weakens their magnetic field. The solution is to apply a magnetic field to the disk films to boost their electrical resistance and prevent electron leakage. Solin's team inserts a "cylindrical inhomogeneity" into the matrix of a homogeneous semiconductor; the inhomogeneity acts as a short circuit when current flows in at low field. When a high magnetic field is applied, the current is deflected around the inhomogeneity, effectively opening the circuit. "The transition of the inhomogeneity from short circuit at low magnetic field to open circuit at high magnetic field results in an enhancement of the measured magnetoresistance of the composite structure," Solin explains. "We call this extraordinary magnetoresistance, or EMR." He says his team's ultimate goal is a disk with terabit capacity.
    http://www.newsfactor.com/perl/story/20662.html
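
    Solin's quote defines the effect by the enhancement of measured magnetoresistance, which is conventionally quoted as the fractional change in resistance between zero and high field. A minimal sketch of that figure of merit, with made-up resistance values rather than Solin's measurements:

        # Magnetoresistance figure of merit: MR = (R_H - R_0) / R_0.
        # The resistances below are illustrative, not measured values.
        def magnetoresistance(r_zero_field, r_high_field):
            """Fractional resistance change between zero and high field."""
            return (r_high_field - r_zero_field) / r_zero_field

        r_0 = 1.0    # ohms: inhomogeneity shorts the current at low field
        r_h = 100.0  # ohms: current deflected around it at high field

        print("MR = {:.0%}".format(magnetoresistance(r_0, r_h)))  # 9900%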

  • "Sapphire/Slammer Worm Shatters Previous Speed Records"
    Newswise (02/04/03)

    The rapidity with which the Sapphire or Slammer worm proliferated over the Internet 11 days ago beat all previous speed records, according to a report from a team of California-based network security experts. "The...worm represents a major new threat in computer worm technology, demonstrating that lightning-fast computer worms are not just a theoretical threat, but a reality," warns Silicon Defense President Stuart Staniford. The team notes that the number of Sapphire worms doubled every 8.5 seconds in the first minute of the attack, and that the worm infected over 75,000 computers within 10 minutes of its Jan. 25 release. The worm spread much faster than Code Red because its software instructions were minuscule in comparison, small enough to fit inside a single network "packet" sent one-way to prospective targets. The capacity of individual network connections determines the speed of Sapphire's propagation; Staniford estimates that an infected computer with a 1 Mbps connection could send out 300 copies of Sapphire per second, while a machine with a 100 Mbps connection would generate 30,000 copies per second. Most of the havoc wreaked by Sapphire was "collateral damage" in which online services were shut down or blocked because of network overload, explains Colleen Shannon of the San Diego Supercomputer Center's Cooperative Association for Internet Data Analysis (CAIDA). Sapphire exploited a security hole in Microsoft SQL servers for which a patch was already available, but which many companies had failed to install. Using Internet "telescopes" to collect data about the attack, the team discovered that almost 43 percent of the machines infected by Sapphire were in the United States, nearly 12 percent in South Korea, and over 6 percent in China.
    http://www.newswise.com/articles/2003/2/SAPPHIRE.SSC.html
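
    The doubling figure makes the early spread a textbook exponential, and the per-connection copy rates follow directly from the worm fitting in one packet. A short Python model of both, assuming the commonly cited packet size of roughly 404 bytes (the article gives only the copy rates):

        # Early-phase growth: infections doubled every 8.5 seconds
        # during the first minute; after that, bandwidth limits and the
        # finite vulnerable population slowed the spread, so this curve
        # only describes the opening of the attack.
        DOUBLING_TIME_S = 8.5

        def infected(t_seconds, initial=1):
            return initial * 2 ** (t_seconds / DOUBLING_TIME_S)

        print("After 60 s: ~%d hosts" % infected(60))

        # Copy rate scales with link capacity because one ~404-byte UDP
        # packet (an assumed figure, not from the article) carries the
        # entire worm.
        PACKET_BITS = 404 * 8
        for mbps in (1, 100):
            print("%3d Mbps -> ~%d copies/s" % (mbps, mbps * 1e6 / PACKET_BITS))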

  • "Why We Need H-1B Professionals"
    ZDNet (02/03/03); Carroll, John

    Ireland-based software engineer and Turtleneck Software founder John Carroll argues that the H-1B visa program's current cap of 195,000 workers should be maintained in 2003 and perhaps extended beyond that. He writes that the fear of foreign IT workers from lower-income countries accepting below-market salaries in the United States is exaggerated. Carroll cites a 2001 Labor Department study that pegs the number of U.S. computer professionals at about 2.8 million, while a 2000 INS study of H-1B candidates estimates the maximum number of H-1B IT professionals entering the market at 105,300--a mere 3.7 percent increase in the number of workers available. Furthermore, American companies are more likely to favor an American programmer over a non-American because most software development requires close interchange with predominantly American customers. Carroll adds that, in purchasing-power terms, foreign IT workers expect a salary commensurate with what they would receive in their native countries, and the H-1B program requires hiring companies to pay salaries in line with market conditions. Carroll contends that cutting back on the number of IT workers allowed into America will encourage more U.S. companies to outsource IT work to low-cost, offshore regions. He writes that there are economic benefits to bringing in proficient, well-educated computer professionals, whose attraction to the U.S. sector reflects the buoyancy of the American economy. Finally, Carroll argues that the H-1B program can bridge the geographic and cultural gap between Americans and foreigners.
    http://zdnet.com.com/2100-1107-983066.html
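
    The 3.7 percent figure can be checked directly from the two studies Carroll cites; the exact inputs he used are not given, so small rounding differences are expected:

        # Quick check of the article's percentage.
        us_it_workforce = 2_800_000   # 2001 Labor Department estimate
        h1b_it_entrants = 105_300     # 2000 INS estimate (maximum)

        share = h1b_it_entrants / us_it_workforce
        print("{:.2%}".format(share))  # 3.76%, vs. the quoted 3.7 percent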

  • "Growth: Cities Try to Cash In"
    CNet (02/03/03); Junnarkar, Sandeep; Charny, Ben

    Cities such as Athens, Ga., and Long Beach, Calif., are among a growing number of communities that expect to boost revenues by offering Wi-Fi networking and related services--a considerable challenge, given that Wi-Fi is still largely a protean technology. Some cities are taking a direct route--Ashland, Ore., for instance, charges wireless networks a monthly fee for the use of its fiber-optic communication lines; others are hoping that Wi-Fi services will pay off in indirect ways, such as through business taxes, property licenses, and other kinds of government revenue. Although Wi-Fi networks are for now mostly free community access points established by nonprofits, enthusiasts, and municipalities, there is a chance that they will switch to paid service once telecoms and local governments see a way to profit from them. "People might be giving away free access today because it is easy and cheap to put the equipment in place, but we believe that to have a consistent, quality service, it is important to form a partnership with a national provider and to charge a fair subscription fee," comments Paul Mozak, Borders Books' director of business development. Major technology developers are trying to promote Wi-Fi, while smaller players expect Wi-Fi networking to become a basic utility, subject to utility-like regulations. This possibility does not sit well with certain businesses that already pay taxes and other government fees for water, power, and phone service. Some Wi-Fi advocates are concerned that the FCC might try to regulate the unlicensed spectrum, but the diverse array of products and technologies that use the spectrum, not to mention the protests that proposed regulations would likely provoke from manufacturers, makes this a remote possibility. The tradeoff, however, is overcrowding of the spectrum.
    http://news.com.com/2102-1033-982322.html

  • "Professor, Students Set Transistor Speed Mark"
    Chicago Sun Times Online (01/30/03); Wolinsky, Howard

    A team of University of Illinois graduate students led by electrical and computer engineering professor Milton Feng has developed the world's fastest transistor, which can achieve transmission speeds of 382 GHz. Feng expects this breakthrough to pave the way for 38 GHz microprocessors over 10 times faster than today's computer processors, and predicts that circuits with high-speed transistors will be available to government security agencies in two to three years, and for consumer products in four to five years. The key ingredient in Feng's transistor is indium phosphide (InP), whose speed in carrying electrons is unsurpassed. The Defense Advanced Research Projects Agency (DARPA) has poured $2.1 million into Feng's research in the hope that it will lead to circuits capable of digitizing analog data in real time for incorporation into "electronic combat systems," according to Feng's research assistant Walid Hafez. DARPA set a target speed of 500 GHz by spring 2004, but Feng expects to reach 700 GHz by then. The University of Illinois' Jose Schutt-Aine says Feng's research could lay the groundwork for analog-free "soft radio" devices that tune directly to preferred radio frequencies via software. "You could use your cell phone as your TV" with such a breakthrough, he notes. Feng believes basic circuits using his transistor will be available by this summer, and explains that faster transistors mean more secure signals.
    http://www.suntimes.com/output/news/cst-fin-chip30.html
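
    The jump from a 382 GHz transistor to 38 GHz processors reflects a common rule of thumb--usable clock rates run roughly a tenth of raw transistor speed--which appears to be the article's implicit assumption. A two-line check:

        # Factor-of-ten rule of thumb implied by the article's figures.
        transistor_hz = 382e9
        print("Switching period: %.2f ps" % (1 / transistor_hz * 1e12))  # ~2.62
        print("Projected clock: ~%.0f GHz" % (transistor_hz / 10 / 1e9)) # ~38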

  • "What Python Can Do for the Enterprise"
    NewsFactor Network (02/03/03); Brockmeier, Joe

    The open-source, object-oriented programming language Python is ideal for companies that need flexible code for use on a variety of platforms, but do not have a lot of programming resources. Python was created in 1991, the same year as Linux, and named after Monty Python's Flying Circus, says creator Guido van Rossum. He says Python is supported by nearly every computing platform, even the Palm operating system, and its applications port very easily from one platform to another. Python also allows programmers to finish their work faster, with short edit-and-test cycles and little code, making them more productive. Van Rossum points out, however, that Python is not suitable for performance-critical duties, such as device drivers and operating systems, but is ideal for rapid development projects, or for creating sparsely coded programs when mixed with other languages. Jython, a combination of Python and Java, allows Java applications to be developed quickly, for example. Other famous examples of Python code are the Apache Toolbox and the Oak DNS server. While fast development and simple coding are a boon to programmers, those aspects are even more important to people who are not full-time programmers. ActiveState senior developer David Ascher says scientists favor Python because it reduces their dependency on professional programmers and gives them more control over their own projects.
    http://www.newsfactor.com/perl/story/20645.html
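
    The productivity claim--short edit-and-test cycles, little code--is easiest to see in an example. A hedged illustration (the file name is invented): a word-frequency count that takes a handful of lines in Python but considerably more in C or Java.

        # Count word frequencies in a text file in a few lines of Python.
        from collections import Counter

        def word_counts(path):
            """Return a frequency table of whitespace-separated words."""
            with open(path) as f:
                return Counter(f.read().split())

        # Usage (assumes a local file named 'report.txt' exists):
        # print(word_counts("report.txt").most_common(5))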

  • "When Computer Code Becomes a Moral Dilemma"
    Boston Globe (02/03/03) P. C3; Bray, Hiawatha

    Security experts face a difficult moral choice when they discover new software flaws--whether to disclose them and risk their being used for malicious ends, or to hold back. Such was the dilemma British software engineers Mark and David Litchfield faced when they stumbled across a vulnerability in Microsoft's SQL Server database software. They notified Microsoft about the security hole, and waited until the company had released a patch before publicly announcing and detailing the flaw at a convention in Las Vegas. Security experts test potential bugs by writing an exploit program, and such programs are often distributed to other experts who wish to test their own systems or use them for research. The Litchfields' work came back to haunt them last weekend with the outbreak of the Slammer worm, which used exploit code that David Litchfield published as a template, according to an email from Litchfield himself. The Microsoft patch did little to halt the spread of Slammer because many companies had failed to deploy it. Hackers rely on such negligence to launch effective attacks, and the availability of exploit programs makes such attacks easy to carry out. Many computer security experts say that public disclosure of security vulnerabilities encourages rapid and complete fixes. But the Slammer experience has made the Litchfields more cautious: "The next big worm could take out enough critical machines that people are killed...I don't want to feel that I've contributed to that," David Litchfield wrote in his email.
    Click Here to View Full Article

  • "PC Makers to Fight New Copyright Fee"
    Wall Street Journal (02/05/03) P. B3; Mitchener, Brandon

    VG Wort, a German reprography rights society, wants German PC maker Fujitsu Siemens Computers to pay copyright owners a fee for every new computer it sells as compensation for the private digital copying of their works. A mediator from the German Patent Office suggested a levy of 12 euros per computer, compared to the 30 euros per computer that VG Wort originally sought. Fujitsu Siemens attorney Thomas Vinje says that his client, along with the rest of the PC industry, will likely reject the recommendation and challenge the levy in court. A Bitkom official claims the recommendation would cost German consumers an additional 70 million euros annually. Germany is the first European country where a collecting society has tried to foist a copyright tax on new PCs. However, there is a long tradition in Europe of imposing copyright fees on the sale of blank audio and videocassettes, and collecting societies are attempting to place such levies on the sale of other devices that could be used to make digital copies. European consumer organizations are fighting the PC copyright levies, claiming that they cause price increases and are founded on the mere conjecture that people are copying protected content with their computers. The United States, England, Ireland, and Luxembourg lack such a fee system.
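
    The Bitkom estimate implies a unit-volume assumption that is easy to back out (the article does not state it):

        # Implied German PC volume behind Bitkom's cost estimate.
        levy_per_pc_eur = 12
        annual_cost_eur = 70_000_000
        print("~%.1f million PCs/year" % (annual_cost_eur / levy_per_pc_eur / 1e6))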

  • "Women in IT (For a While)"
    ZDNet Australia (01/30/03); Oliver, Jane

    Last week's "Women in IT: Engaging and retaining for success" conference highlighted the problem of employee retention, which was raised by keynote speaker Patricia Hewitt, Britain's secretary of state for trade and industry. "[Women are] coming in but they're not staying," she lamented, attributing the trend to several factors, notably the decision to start families and discouragement upon hitting the glass ceiling. Jane Lodge of Midlands Deloitte and Touche cited a lack of ambition as the reason many talented women leave IT. Among the solutions Hewitt suggested was the reorganization of work time, such as shortening the work week to four days; she pointed to European companies that have instituted such practices, adding that they have suffered no loss in productivity or rise in staff turnover as a result. In April, British employees will gain the right to propose an amended work schedule to their employers, who can only reject it for business reasons. Also speaking at the conference was Elaine Clark, group manager for Chelsea FC, who recommended that prospective female IT workers have a clear idea of what they want to do, and then work out a plan to achieve their goals. Meanwhile, conference attendee Jane Oliver writes that the key to recruiting and retaining more female IT workers lies in eliminating certain industry practices she terms misogynistic.


    For more information about ACM's Committee on Women in Computing, visit http://www.acm.org/women

  • "DMCA Resists Challenges, Despite Recent Acquittal"
    National Law Journal (01/20/03) Vol. 25, No. 22, P. C3; Chovanes, Joseph

    Two court challenges to the Digital Millennium Copyright Act (DMCA), which is supposed to curb digital piracy by prohibiting certain efforts to bypass copyright protections, alleged that it violated the Constitution on a number of counts, all of which the courts rejected. The defendants in Universal City Studios Inc. v. Corley and U.S. v. Elcom Ltd. raised First Amendment issues, claiming the DMCA unconstitutionally restricts free speech--in their case, their computer programs--and attempts to ban protected third-party rights. Although both courts conceded that the programs constituted speech, they dismissed claims that the DMCA took aim at that speech, ruling the law content-neutral; as a result, the DMCA was evaluated under "intermediate scrutiny," and both courts determined that there was substantial government interest in upholding the law to prevent unauthorized copying, and that the DMCA balanced that interest against the burden on speech. In Elcom, the court dismissed the defendant's argument that the DMCA violates third parties' right to access public-domain materials, countering that the law protects only the use of a specific publisher's copy of a work in the public domain; it did agree with the defense's allegation that the law diluted third parties' right to make backup copies, but concluded that this was of no constitutional significance. In Universal City, the court found the defendant's position that the DMCA injured fair-use rights to be "extravagant," and observed that the defendant was arguing for fair-use rights for end users of the program, not for himself. As for the Elcom defense's claim that the DMCA's restrictions on copyright-circumvention tools stifled fair use and represented an overbroad limitation on free speech, the court ruled that overbreadth analysis did not apply because the law was not regulating spoken words or expressive conduct. The Elcom defendant also raised the Fifth Amendment, contending that the DMCA violated due-process guarantees by being too vague in its restrictions on circumvention tools, but the court rejected this as well. Despite the courts' overwhelming support for the DMCA, the jury in U.S. v. Elcom Ltd. acquitted the defendant.

  • "Information Highway Needs Women Drivers"
    Maryland Daily Record--TechLink (01/03) P. 5; McCausland, Christianna

    The belief that girls are more inclined toward social interaction, and that an interest in computers makes one a geek, is curbing girls' interest in technology, says Eileen Ellsworth, co-chairwoman of Girls in Technology (GIT). "These are stereotypes we need to disabuse girls of by making computers social and interesting to them," says Ellsworth, an attorney who also provides legal counsel to a software company. National studies of the societal and academic pressures girls face when they enter middle school support Ellsworth's belief that elementary school is the time to cultivate an interest in technology among girls. In addition to her involvement with GIT, which tries to get more girls interested in technology by providing speakers, mentors, and scholarships, Ellsworth is also executive director of Empower Girls, an organization designed to serve as a safe haven for girls as they explore computers and technology. Empower Girls gives girls an opportunity to take computers apart and learn how to use software programs. Girls will need to become more comfortable with computers and technology if they are to have rewarding careers in the future, adds Women in Technology board member Paula Jagemann. Technology touches every facet of contemporary life, and will continue to create exciting career opportunities in the years to come, she says.

    For more information about ACM's Committee on Women in Computing, visit http://www.acm.org/women

  • "Right Data, Right Now"
    InformationWeek (02/03/03) No. 925; Garvey, Martin J.

    The year 2003 will be marked by advanced storage-management software that allows business-technology managers to better control data in order to make critical decisions while lowering IT department costs. More companies are upgrading their storage-management infrastructures so that they can back up their data in real time. The coming year will see the emergence of enhanced storage products that boast greater ease of use and efficiency, lower operational costs, faster disaster recovery, better provision of real-time data, and improved assurance of business continuity. Thanks to virtualization and visualization features bundled into software, administrators are able to map, monitor, and manage storage systems as if they were a single resource. "We see the need for more and more of a mix between consolidated and distributed data," observes EMC CTO Mark Lewis. Veritas CEO Gary Bloom recommends that companies hire storage architects to optimize their storage resources and choose the most suitable storage and transport technology. Notable products from storage-management vendors to be introduced or enhanced this year include policy-based management software from Hitachi and IBM; Active Management from Hewlett-Packard; and a new version of EMC's Symmetrix system that boosts performance speed and capacity. Factors that could influence the future of storage technologies include emerging standards such as iSCSI, the still-indeterminate role of storage services, and the advent of new vendors and technologies.
    http://www.informationweek.com/story/IWK20030131S0006
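
    The virtualization idea above--many physical arrays managed "as if they were a single resource"--reduces to an aggregation layer. A toy Python sketch, with array names and capacities invented for illustration:

        # Toy storage pool: one logical view over several physical arrays.
        class StoragePool:
            def __init__(self):
                self.arrays = {}  # name -> (capacity_gb, used_gb)

            def add_array(self, name, capacity_gb, used_gb=0):
                self.arrays[name] = (capacity_gb, used_gb)

            def report(self):
                cap = sum(c for c, _ in self.arrays.values())
                used = sum(u for _, u in self.arrays.values())
                return "%d/%d GB used across %d arrays" % (used, cap, len(self.arrays))

        pool = StoragePool()
        pool.add_array("symmetrix-1", 2000, 1400)   # hypothetical arrays
        pool.add_array("hitachi-9900", 1500, 600)
        print(pool.report())  # the administrator sees a single resource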

  • "Can't We All Just Get Along?"
    IEEE Spectrum (01/03); Cass, Stephen

    Software companies feeling the pinch from the collapse of the dot-com bubble could achieve significant growth from businesses seeking to squeeze the most efficiency out of existing enterprise software; aiding them is an industry-wide push to make software more interoperable. Incompatibility has long been a staple of the software industry, given rival vendors' penchant for enhancing basic products with special features for competitive advantage, to the point that the products can no longer interoperate. By the late 1990s, however, many companies realized that homogenizing systems to one platform was impractical, and the advent of XML was a great leap forward for compatibility. In addition to being platform-independent, XML schemas supply some minimal certified information about the data they contain, and there is a database of XML schemas that developers can consult if they get confused. The latest effort is to establish interoperability between software applications that use other applications, which has resulted in plans to develop Web services. Leading software makers are moving to set interoperability standards by collaborating in consortia such as the Web Services Interoperability Organization (WS-I). However, Scott Valcourt of the University of New Hampshire's InterOperability Lab cautions that conforming to a standard does not guarantee compatibility without detailed review of the standard's written specification; otherwise, ambiguities that go unnoticed may cause different vendors to implement the standard in different ways. The formal standards process has also drawn fire for requiring consensus among rival vendors with a long history of mutual distrust.
    http://www.spectrum.ieee.org/WEBONLY/publicfeature/jan03/soft.html
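
    XML's platform independence, as described above, comes down to any conforming parser being able to read any producer's document. A minimal sketch using Python's standard library; the purchase-order vocabulary is invented for illustration:

        # Parse a small XML document--any standards-conforming library
        # on any platform would extract the same data.
        import xml.etree.ElementTree as ET

        doc = """\
        <order id="1002">
          <item sku="X-17" qty="3"/>
          <item sku="B-02" qty="1"/>
        </order>"""

        root = ET.fromstring(doc)
        for item in root.findall("item"):
            print(item.get("sku"), "x", item.get("qty"))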
