ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.
ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either HP or ACM.
To send comments, please write to firstname.lastname@example.org.
Volume 5, Issue 520: Wednesday, July 16, 2003
- "Probing the IT Skills Shortage"
E-Commerce Times (07/15/03); Millard, Elizabeth
Many corporate IT departments are riddled with vacancies because of cutbacks, and there appears to be a shortage of qualified IT personnel to fill these positions, even when organizations can afford to hire new staff. Sectors that are experiencing rapid technological advancement--security and wireless being two--suffer from worker shortages because there are not enough qualified IT people to keep pace with such accelerated development. A lack of sufficient staff is also symptomatic of legacy systems management, and Aberdeen Group President Steven Lane attributes this trend to the retirement of a large portion of the IT workforce in recent years. CompTIA's Gene Salois reports a shortage of skilled IT people in project management, and he identifies basic differences between traditional and IT projects as the root cause; he notes that traditional projects rely heavily on business management expertise, while IT initiatives "involve more communication, customer service, problem solving and changing the business process"--aspects often ignored in IT project manager training. Some sectors have IT employee gaps because of recent policy changes: The Health Insurance Portability and Accountability Act has made technology a primary driver of the healthcare industry. Healthcare organizations are scrambling to deploy technology as a result, but many companies have a dearth of qualified personnel to meet the challenge. Reduced competition as a consequence of the IT skills shortfall may be a positive development for those with the requisite expertise, but that does not mean that job candidates should ignore opportunities to expand their qualifications through additional training and certification. Meanwhile, Challenger, Gray & Christmas CEO John Challenger argues that the IT staff shortage is chiefly driven by professionals with unrealistic wage expectations.
- "Teaching Computers to Work in Unison"
New York Times (07/15/03) P. D1; Lohr, Steve
The creation of software that enables distributed or grid computing--the harnessing of multiple computer sources working in tandem to carry out a single function, such as a complex simulation--has been led by Dr. Ian Foster of Argonne National Laboratory and Dr. Carl Kesselman of the University of Southern California, who together founded the open-source Globus Project. Grid computing's long-term objective, which may take 10 or more years to reach, is to empower ordinary people to tap into supercomputing-like processing power on demand from a handheld computer or desktop. "The ultimate goal is a fundamental shift in how we go about solving human problems, and a new way of interacting with technology," Kesselman explains. With the help of Argonne software designer Steve Tuecke, Foster and Kesselman have devised fundamental grid standards including Globus resource allocation manager (GRAM), monitoring and discovery service (MDS), grid security infrastructure (GSI), and grid file transfer protocol (GridFTP). Rice University's Dr. Ken Kennedy proclaims that the Globus founders will have a major effect on computer science because they "envisioned big systems and then came up with simple but smart mechanisms for building those systems." The Globus researchers elected to release the software code as open-source to ensure the widest possible adoption. Grid computing took another step toward commercialization in July when the Globus Project issued new software tools that merge grid standards and corporate-developed Web services. Computer companies that are expressing increasing interest in grid computing include Microsoft, Hewlett-Packard, IBM, Sun Microsystems, and Platform Computing, while a National Science Foundation panel submitted a study calling for a yearly allotment of $1 billion to develop grid computing as a research tool.
(Access to this site is free; however, first-time visitors must register.)
- "Researcher's 'Privacy Appliance' Seeks to Harness Government Snooping"
Associated Press (07/14/03); Fordahl, Matthew
Privacy and security researcher Teresa Lunt is building a privacy protection device that would supposedly filter sensitive information out of government investigative queries. Her work at the Palo Alto Research Center is funded by the Defense Advanced Research Projects Agency (DARPA) for its Terrorism Information Awareness (TIA) project. Lunt says much of the debate about TIA is uninformed and that many people mistakenly believe the government is going to pool consumer data in a central database; according to what she knows, the TIA system would be a front end connecting to privately owned databases of consumer records, and her privacy appliance would filter out personal details while zeroing in on suspicious individuals. Eventually, investigators would go to a judge for a special court order to identify a limited number of suspects, and Lunt says the system should be no more invasive for most people than current marketing schemes. However, Electronic Privacy Information Center general counsel David Sobel worries that the judicial system is not designed to handle the type of situation TIA presents. A similar pairing of privacy device and government investigative system, which strips personal identifiers from hospital records so that bioterrorism experts can quickly confirm possible attacks, is set for testing this year. Investigators in that scenario, however, are looking for general trends, unlike anti-terrorism investigators trying to find the isolated, tell-tale signs of an impending attack. Proponents of TIA say that only such a system would have been able to uncover the Sept. 11 plot before the attacks occurred.
- "Cracking Data Hiding: Theory Can Help Disable Terrorists' Messages"
A new theory that establishes how much data can be hidden within a system and then supplies criteria on how to archive and decipher data has been developed by Jody O'Sullivan of St. Louis' Washington University and former graduate student Pierre Moulin. O'Sullivan says the hypothesis, which incorporates components of game, communication, and optimization theories, sheds light on both the properties of the best data-hiding tactics and the properties of the best data disruption tactics. Potential applications for this breakthrough include counterfeiting detection and the interruption and decoding of terrorist communications. "One hundred years from now, if someone's trying to embed information in something else, they'll never be able to hide more than is determined by our theory," O'Sullivan declares. "You basically plug in the parameters of the problem you are working and the theory predicts the limits." O'Sullivan serves as executive director of Washington University's Center for Security Technologies, which was founded last year as a facility committed to the development of anti-terrorist technologies, cybersecurity and critical infrastructure solutions, and defenses against environmental disasters and natural catastrophes. O'Sullivan and Moulin's theory was published in the March edition of IEEE Transactions On Information Theory. Fully applying the theory in advanced security systems will be a formidable challenge.
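The notion of a hard limit on hidden data can be illustrated with a toy least-significant-bit scheme. This sketch is purely illustrative and is not O'Sullivan and Moulin's construction: hiding one bit in each 8-bit cover sample caps the payload at one-eighth of the cover's size, a simple example of the kind of capacity bound the theory formalizes.

```python
def embed(cover, bits):
    """Hide one payload bit in the least-significant bit of each 8-bit
    cover sample. Capacity is therefore bounded at 1 bit per sample."""
    if len(bits) > len(cover):
        raise ValueError("payload exceeds hiding capacity")
    stego = list(cover)
    for i, b in enumerate(bits):
        stego[i] = (stego[i] & ~1) | b  # overwrite only the lowest bit
    return stego

def extract(stego, n):
    """Recover the first n hidden bits from the stego samples."""
    return [s & 1 for s in stego[:n]]

# Demo: each sample changes by at most 1, so the cover is barely perturbed.
cover = [200, 13, 77, 128, 9, 254, 66, 91]
payload = [1, 0, 1, 1, 0]
stego = embed(cover, payload)
assert extract(stego, len(payload)) == payload
```

An adversary disrupting the channel faces the mirror-image problem: randomizing low-order bits destroys the payload while barely distorting the cover, which is why the theory treats hiding and disruption as a game.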
- "Could Your Computer Be a Criminal?"
MSNBC (07/15/03); Sullivan, Bob
The worlds of virus writers, spam senders, and porn purveyors are converging, according to computer security experts, who cite a recent spate of Trojan horse infiltrations in home PCs. Lurhq senior intrusion analyst Joe Stewart witnessed one of these operations first-hand when he intentionally followed a spam message to Windows-update.com, where his computer was supposed to download a Microsoft security patch. Instead, the PC was directed to an Internet Relay Chat room where a hacker waited to install a credit-card fraud tool. In just one hour, approximately 800 PCs were roped into the scheme, observed Stewart. Computer security expert Richard Smith, who recently helped uncover a network of roughly 1,000 hacked computers used to send Web porn, worries about the increasing criminal nature of viruses and hack attacks. Unlike previous Trojan programs, the new automated software quietly runs on users' computers without disturbing normal operation, doing the dirty work of some hidden online criminal. Smith says he discovered the recent porn network when tracking the target host of a "phishing" email that lured users to update their PayPal.com password. These types of fraud attempts are common and quickly shut down at their source, but Smith notes that this one was able to stay operational for one week because it kept moving from one drone computer to the next. Experts argue that the group or groups that capitalize on zombie networks for criminal gain are relatively few right now, but are expected to grow as other criminal hackers catch on.
- "The Politics of Open-Source Software"
CNet (07/14/03); McCullagh, Declan
Over 70 proposed bills have cropped up both nationally and internationally suggesting that governments favor the procurement of open-source or free software, while the Initiative for Software Choice (ISC) was reportedly founded to ensure that governments do not show excessive preference for nonproprietary software, which could supposedly imperil the future of the global commercial software industry. The ISC, which was organized under the aegis of the Computing Technology Industry Association (CompTIA), is keeping tabs on proposed open-source legislation in a number of U.S. states, while Peru, Spain, Brazil, and South Australia are among the countries that have either enacted or are considering similar initiatives. ISC executive director Bob Kramer comments that open-source procurement legislation, if passed, would create a major headache for software resellers, who would be forced to dramatically modify their business models. However, free-software and open-source advocates claim that the ISC is a mouthpiece for Microsoft, a staunch free-software opponent and a founding member of the initiative. Declan McCullagh writes that passing legislation that only allows open-source or free software to be procured by the government would limit users' choice, especially when one considers that nonproprietary software is not always better than proprietary products. Even debating such legislation could provoke closed-source vendors to strike back with legislation of their own mandating the strict procurement of proprietary software. McCullagh writes that the wisest course of action is to allow government agencies to make their own individual software choices, thus upholding the free market model.
- "Attack of the Clones Becomes a Legal Drama"
Financial Times (07/15/03) P. 11; Cole, George
Hollywood and 321 Studios are locked in a bitter battle over whether the latter should be permitted to sell software products that allow DVDs to be copied. In the continuing saga of lawsuits and countersuits, the film studios maintain that 321 has violated the Digital Millennium Copyright Act (DMCA) by illegally circumventing digital copy controls, and insist that they will only be satisfied if its DVD-copying products are taken off the market. The argument proffered by 321 upholds "fair use" rights outlined by copyright law, and 321 CEO Rob Semaan insists that the primary purpose of the DVD-copying software is to allow consumers to make back-up recordings of their DVDs. "We recognize the rights of copyright owners and for people who own intellectual property to be well rewarded, but we don't think that means that they should have total control over how people use it," Semaan notes. "We are trying to take the middle road between bodies like the MPAA, who want to have the final say in how you use their products, and the Napsters, who say: 'If it's available, it should be free.'" MPAA legal counsel Patricia Benson claims that Semaan is misinterpreting the concept of fair use, which historically applies to only fragments of copyrighted material that are duplicated strictly for educational purposes. District judge Susan Illston said at a May hearing in San Francisco that the film studios made a convincing case, but at the same time questioned a provision of the DMCA that essentially makes copyright perpetual. If the court rules in favor of the studios and drives 321 out of business, Semaan is confident that similar services will start popping up and multiply Hollywood's difficulties in controlling its copyrights, as happened with Napster.
- "Computer Simulations: Modeling the Future"
TechNewsWorld (07/15/03); Koprowski, Gene J.
State-of-the-art computer simulation technologies are being developed and employed for commercial, medical, and military projects. The University of Southern California's Information Sciences Institute, in conjunction with Options Technology, incorporated artificial intelligence, cluster computing, and high-speed networks to perform a simulation for the Joint Forces Command involving 1 million computer-generated vehicles capable of autonomous movement and environmental response, whereas previous simulations could only support about 100,000 vehicles. "We are within sight of being able to create a large-scale, high-resolution battlefield environment detailed enough to let us experiment and see how a given system might perform," declared USC's Robert Lucas. This summer will witness a real-world military exercise at the same scale, giving analysts the ability to track vehicle and troop movements by controlling sensors in the field. Companies such as Insight and Clockwork Solutions are building modeling and simulation technology that can be applied to supply chain management: Insight's solution simulates the impact of a natural disaster or terrorist attack on an individual organization's supply chain, while Clockwork Solutions is working on software that can model industrial systems to anticipate the effects of wear and tear on factory operations over time. Meanwhile, researchers at the Case Western Reserve University School of Engineering have devised the Model Integrated Metabolic system as a tool that can simulate how the heart, liver, and brain react to physical exertion. Bayer, Organon, and other pharmaceutical companies are employing Entelos' PhysioLabs software, which can model different disease states to see how afflicted systems will respond to drugs.
- "A Quantum Leap in Cryptography"
Business Week (07/15/03); Salkever, Alex
Quantum cryptography is making headway through a number of solutions currently being tested. Government and financial services firms are likely to be the first to adopt quantum cryptography, given the nature of their operations. The technology leverages a basic law of quantum physics: merely observing a particle irrevocably changes its state. Current quantum cryptographic devices use single or paired photons to transmit data, and can therefore alert both sender and receiver when someone tries to tap the line. The U.S. Defense Advanced Research Projects Agency is spending $20.6 million on a number of quantum cryptography research projects around the country, including one by BBN that aims to create a super-secure fiber-optic loop supporting multiple parties. Other research at Los Alamos National Laboratory has succeeded in detecting single photons sent through the air, an innovation that could eventually mean satellite communications secured with quantum cryptography. Still, technical immaturity leaves much to be improved with quantum cryptography, such as the current 100 km reach of photons via fiber-optic lines and the unreliability of sensitive photon detectors. Quantum cryptography is also too slow to serve as a general data communications tool and is currently being used to distribute traditional digital keys; this pairing is still a tremendous improvement on previous key distribution schemes because it detects tampering and can refresh keys several times per second without disrupting data transmission.
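The tamper-detection idea can be sketched in software. The following is an illustrative simulation of the well-known BB84 quantum key distribution protocol, not a model of any vendor's device: sender and receiver encode and measure bits in randomly chosen bases, keep only the bits where their bases happened to agree, and an eavesdropper's measurements disturb the photons' states, showing up as errors in that sifted key.

```python
import random

def bb84_sift(n, eavesdrop=False, rng=random):
    """Toy BB84 key sifting. Photons are modeled as (bit, basis) pairs;
    measuring in the wrong basis yields a random bit, which is what
    exposes an eavesdropper as errors in the sifted key."""
    sifted_a, sifted_b = [], []
    for _ in range(n):
        bit = rng.getrandbits(1)         # Alice's raw key bit
        basis_a = rng.getrandbits(1)     # Alice's encoding basis
        send_bit, send_basis = bit, basis_a
        if eavesdrop:                    # Eve intercepts and resends
            basis_e = rng.getrandbits(1)
            if basis_e != send_basis:    # wrong basis: state is disturbed
                send_bit = rng.getrandbits(1)
                send_basis = basis_e
        basis_b = rng.getrandbits(1)     # Bob's measurement basis
        meas = send_bit if basis_b == send_basis else rng.getrandbits(1)
        if basis_b == basis_a:           # bases compared over a public
            sifted_a.append(bit)         # channel; keep only the matches
            sifted_b.append(meas)
    errors = sum(x != y for x, y in zip(sifted_a, sifted_b))
    return sifted_a, sifted_b, errors
```

Without an eavesdropper the sifted keys always agree exactly; with one, roughly a quarter of the sifted bits disagree, so publicly comparing a sample of the key reveals the tap before any data is entrusted to it.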
- "Pentagon Alters LifeLog Project"
Wired News (07/14/03); Shachtman, Noah
The Defense Advanced Research Projects Agency's (DARPA) LifeLog project seeks to build a database of every possible aspect of a person's life and use the information to trace relationships, incidents, and experiences. The agency argues that LifeLog could be used to develop computerized training programs more representative of the real world and robots designed to aid battlefield commanders. However, civil liberties advocates and defense analysts sharply criticized LifeLog's potential to become a profiling tool of unprecedented scope, and yet another part of a wide-ranging Pentagon initiative to monitor the activities of American citizens. DARPA responded to this criticism by modifying the LifeLog proposal request, forbidding LifeLog researchers from capturing images or audio of anybody "without that person's a priori express permission." The altered request also states that the acquisition of audio or visual footage of anyone apart from the user should be avoided even with a priori permission. Recipients of LifeLog research and development grants will be required to subject themselves to LifeLog profiling as a test of the system: They will travel to Washington, D.C., where their excursions and activities will be recorded by camera and geolocation, while biomedical sensors track their health. Additionally, all their credit card transactions, their email, and even reading material they are exposed to will be indexed into a searchable database. LifeLog bidders have until July 14 to submit their proposals.
- "Car Communication"
ABCNews.com (07/15/03); Eng, Paul
Carnegie Mellon University researchers are engineering various context-aware communications solutions for automobiles at the General Motors Collaboration Laboratory, including a gesture interface system that enables drivers to control various in-car operations through hand signals. The system involves an installed camera connected to a laptop containing software that interprets gestures captured by the camera as instructions. These directives for now govern "nonsafety-critical" car controls such as the radio. With such a system, a driver could, for example, direct the radio volume to go up or down by twirling his finger, or program the voice-mail system to automatically answer incoming calls with a wave of his hand. The GM Collaboration Laboratory is an $8 million joint venture between GM and Carnegie Mellon to design more intelligent car controls that allow drivers to multitask without being distracted. "We really want the car of the future to be an able companion, a thing that knows you, knows what you're up to, knows where you're going, and make intelligent suggestions," comments lab co-director Ed Schlesinger. Other interfaces being developed at the lab include information display systems in which data is projected on the windshield, and products that link mobile devices to automotive electronics through voice-recognition and wireless communications technology. Practical applications of these technologies are unlikely to be commercially implemented for at least five to 10 years.
- "Big Brother Gets a Brain"
Village Voice (07/15/03); Shachtman, Noah
The Pentagon's $12 million Combat Zones That See (CTS) project is, on the surface, designed to enhance military operations and troop safety in urban areas by combining video surveillance with computer technology: All movements in a given area would be constantly tracked, so that specific vehicles--and later people--could be cross-referenced with lists of known or suspected terrorists, as well as with the proximity of certain vehicles to terrorist incidents. However, "when [CTS] gets up and running, there's going to be a huge temptation to apply it to policing at home," suggests Jim Lewis of the Center for Strategic and International Studies. "There's almost a 100 percent chance that it will work, because it's just connecting things that already exist," he notes. After CTS identifies a car or driver, other controversial data-mining programs, such as the Defense Advanced Research Projects Agency's (DARPA) LifeLog and Total Information Awareness, could be used to access the individuals' online audit trails and activities. One of the difficulties in implementing CTS will be enabling cameras separated by hundreds, if not thousands, of meters to maintain an accurate ID of target vehicles and drivers, because variant lighting conditions and camera angles can change vehicle appearance; still, this problem does not seem to be impossible to tackle. Although Tom Strat of DARPA's Information Exploitation Office insists that devising homeland security solutions is not his agency's goal, he acknowledges that CTS technology could find its way into domestic surveillance initiatives. The domestic deployment of such technology could lead to a dramatic shift in people's behavior once they become aware of being monitored. Bill Brown of Surveillance Camera Players observes that awareness of video surveillance breeds paranoia, which goes against authorities' objective to foster more law-abiding, rational citizens.
- "A Better Way to Deal With Vulnerabilities"
EarthWeb (07/10/03); Desmond, Paul
Shawn Hernan of the Carnegie Mellon University Software Engineering Institute (SEI) told attendees at a presentation at the Institute for Applied Network Security's recent Southeast Network Security Forum that some 5,500 software vulnerabilities will be identified this year, in addition to the 4,129 defects detected in 2002. He estimated that companies would have to spend 229 eight-hour work days in order to read the descriptions of those vulnerabilities, while 69 more working days would be spent to install patches. This approach is untenable for most organizations, meaning that they become open to intrusion the moment a security hole that affects them is disclosed, and they remain so until the appropriate patch is deployed. There is also a hefty monetary cost associated with compromised servers, even when the hacker is not directly attacking one's company. "If your security strategy values patch production and response over [software] quality, you're doomed to failure," warned Hernan. He added that improving software quality requires organizations to consider security in the product's design stage, and the SEI and Carnegie Mellon's CERT Coordination Center have co-developed the Team Software Process (TSP)-Secure, a suite of techniques and processes for creating secure, high-quality software that is derived from current TSP criteria. "If you take a more disciplined approach to how you build software, you can avoid a lot of these problems early on in the process," declared Hernan. He reported that SEI workshops on TSP-Secure methods have been very successful.
- "Tech Firms Voicing Objections to Legislature's E-Waste Bill"
Silicon Valley/San Jose Business Journal (07/11/03); Roberts, Timothy
A number of tech companies spoke out against a California bill that would establish new rules for the disposal of electronic waste, but they were unable to sway the state's Assembly Natural Resources Committee, which ultimately approved the measure on a 7 to 3 vote. Although the bill passed the Senate on June 4, Sen. Byron Sher (D-Palo Alto) has offered amendments that address some of the concerns of the tech industry by moving the first year of the program from 2004 to 2005, requiring 50 percent of e-waste to be recycled or disposed of in the first year, and by giving the state Waste Board the power to set future targets. Sher's legislation seeks to make California the first state to force computer and electronics manufacturers to dispose of computers and TVs because of the lead and other hazardous materials contained in their products. Advocates of Sher's legislation are concerned about the potential poisoning of people as a result of obsolete computers and TVs in homes and the contamination of drinking water if computers and TVs placed in landfills leak lead into the ground. Companies would have to set up recycling or disposal plans, but they would be able to charge a $6 fee at the time of purchase for collecting units, and a maximum $10 fee if they use a public agency for recycling. However, manufacturers have not decided whether to recycle or dispose of the e-waste themselves, or pay cities and counties to provide such a service.
(Access to this site is free; however, first-time visitors must register.)
To view the California e-recycling bill, visit http://www.acm.org/usacm/Legislation/StateBills/CA_RecycleBill.htm.
- "DNA Makes Nano Barcode"
Technology Research News (07/09/03); Patch, Kimberly
Increasing the speed of computers means shrinking circuits, and Duke University computer science professor John Reif expects the limits of traditional, top-down lithography techniques to be reached within 10 to 20 years. Reif and associates have developed a bottom-up method of molecular self-assembly by coaxing synthetic DNA strands to configure themselves into a two-dimensional barcode-like "scaffold" that allows encoded patterns to be read by an atomic force microscope. This breakthrough can help boost the usefulness of nanoscale construction, according to University of Delaware computer science professor David Harlan Wood. The researchers employed a single DNA scaffolding strand containing fragments of base sequences that complemented segments of the DNA barcoding strands. These strands alternately possessed or lacked hairpin loops and also featured sections of base sequences that compelled barcoding strands to mesh with similar barcoding strands. The self-assembly process was programmed to generate two different 5-bit barcodes, 01101 and 10010, which were contained in a DNA lattice measuring 75 nm in length. "Using these patterned DNA lattices as scaffolds, we intend...to self-assemble molecular electronic circuit components...with the goal of forming molecular-scale electronic circuitry," reports Reif. He adds that the technique could be ready for practical applications within less than a decade.
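The pairing rule that drives such self-assembly can be sketched in a few lines. This is a generic Watson-Crick complementarity check for illustration, not the Duke team's actual sequence design: a barcoding strand anneals to a scaffold region only where the bases are complementary, which is what lets base sequences "program" where each strand attaches.

```python
# Watson-Crick base pairing: A binds T, C binds G.
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(strand):
    """Return the complementary base sequence of a DNA strand."""
    return "".join(COMPLEMENT[base] for base in strand)

def binds(scaffold_region, barcode_strand):
    """A barcode strand anneals to a scaffold region only if every base
    pairs up; the strands run antiparallel, so one is reversed."""
    return scaffold_region == complement(barcode_strand)[::-1]

# A strand docks at the one region it matches, and nowhere else:
assert binds("ATGC", "GCAT")
assert not binds("ATGC", "GGGG")
```

Designing distinct, non-cross-hybridizing sequences for each scaffold slot is what makes the assembly programmable; a bit's presence or absence in the resulting lattice is then read out physically by the atomic force microscope.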
- "Smooth Talkers"
CIO (07/01/03) Vol. 16, No. 18, P. 98; Edwards, John
Speech integration technology is of interest to companies that want to cut costs through the enablement of customer self-service or the automation of internal operations, such as the dissemination of critical information to employees or business partners. The technology is becoming a more viable and cost-effective choice for the enterprise, thanks to preconfigured templates and other packaged tools as well as faster processors; the spread of mobile wireless devices and speech input/output's advantages over keyboards and small displays are also causing speech integration's stock to rise. Speech integration products come in two categories: Directed dialogue products in which users are prompted to answer a series of questions with simple answers ("yes," "no," etc.), and more expensive natural language tools that can support a wide range of applications because of their ability to understand whole sentences and engage users in more conversational interplay. The technology's appeal to callers lies in its convenience, since it reduces or eliminates their dependence on a live agent to handle customer requests, though most observers doubt that speech integration will completely supplant flesh-and-blood support, especially for technical issues. The corporate advantages of speech integration include more cost-effective, round-the-clock user support and data access. However, CIOs caution that speech integration is not a cure for increasing call center costs, because the technology must be constantly monitored and updated. Furthermore, Dollar Thrifty Automotive Group's Bob DuPont says potential adopters must look beyond IT issues and try to drum up support for speech integration deployment across the entire enterprise.
- "Red Alert on the E-War Front"
New Scientist (07/05/03) Vol. 179, No. 2402, P. 38; Graham-Rowe, Duncan
A nightmare scenario in which a dedicated, technology-savvy enemy could knock out the United States' critical infrastructure via cyberattack was one of the salient points of the White House's National Strategy to Secure Cyberspace. This argument--along with a sense that such an attack is inevitable--has spurred the military, federal agencies, security consultants, and private companies to send representatives to hacking schools that provide crash courses in Internet intrusion techniques, so that they might formulate more effective countermeasures. One vulnerable point is the supervisory control and data acquisition (SCADA) programs that facilitate the centralized management of electricity, gas, and water supply networks; market pressures have changed SCADAs from customized, isolated systems to more nonexclusive systems that run on publicly accessible networks. "The Chinese use the same SCADA vendors as they use here in America," observes Bill Flynt of TRC Solutions. Cybersecurity worries prompted the Clinton administration to issue a presidential directive whose goal was to get private companies to boost their IT security investments and divulge any perceived threats, vulnerabilities, and abnormalities to law enforcement. This plan backfired, however: Companies did not want to reveal their security shortcomings out of fear they would jeopardize their competitive advantage. "Less than 10 percent of cyber-crimes get investigated because CEOs are reluctant to get involved," notes Information Technology Association of America President Harris Miller. Meanwhile, security consultants such as James Lewis of the Center for Strategic & International Studies believe the cybersecurity threats are overblown, and doubt that the entire country could be hamstrung by a cyberattack. Nevertheless, many companies have apparently beefed up their security with illegal programs that strike back against intrusions by disabling hackers' computers.
- "Big Players Push IPv6, But Masses Still Resist"
Network World (07/07/03) Vol. 20, No. 27, P. 1; Hochmuth, Phil; Greene, Tim
Supporters of Internet Protocol version 6 (IPv6) argue that industry should adopt the standard in order to exploit its advantages, which include better security and a solution to the IP address shortage. This shortage, which may not seem apparent now, is expected to become evident as handheld wireless devices proliferate. For this reason, the Internet Engineering Task Force is readying the Mobile IPv6 specification. IPv6 moved to draft standard status in 1998, and recently got a boost on two fronts: Cisco has IPv6-enabled its router and switch gear and firewall offerings, while Defense Department CIO John Osterholz announced that the standard will be a procurement requirement beginning in October 2003. He also set a 2005 deadline for all Defense Department networks to be fully compatible with all IPv6 networks. However, Gartner principal analyst Lawrence Orans observes that "for regular companies and businesses, [IPv6 adoption] is not on their radar screens," adding that IPv4's shortcomings are not that problematic for most U.S. enterprises. Experts such as Cisco's Martin McNealis cite implementation costs as one barrier to IPv6 acceptance, while another is the fact that U.S. companies and governmental entities still have an abundance of IP addresses. Additionally, the life of IPv4 addresses can be extended via numerous workarounds, such as Classless Interdomain Routing and network address translation, although such methods can reduce network security and impede network management. Still, North American IPv6 Task Force Chairman Jim Bound, a Hewlett-Packard staff fellow, says, "IPv4 cannot take us into the next century...We're keeping [IPv4] alive with chewing gum." Meanwhile, the ability to embed all types of devices, from credit cards to cars, with IP addresses promises to push wider adoption of IPv6 technology in the near future.
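The scale of the address shortage that IPv6 resolves is easy to check with Python's standard `ipaddress` module; this quick comparison is illustrative and is not drawn from the article itself.

```python
import ipaddress

# IPv4 addresses are 32 bits; IPv6 addresses are 128 bits.
v4 = ipaddress.ip_network("0.0.0.0/0")
v6 = ipaddress.ip_network("::/0")
print(v4.num_addresses)  # 4294967296, i.e. 2**32
print(v6.num_addresses)  # 2**128, about 3.4e38

# A single /64 IPv6 subnet, the customary LAN allocation, already
# dwarfs the entire IPv4 Internet (2001:db8::/64 is the RFC 3849
# documentation prefix):
lan = ipaddress.ip_network("2001:db8::/64")
print(lan.num_addresses // v4.num_addresses)  # 4294967296
```

The gap explains why workarounds like network address translation can stretch IPv4 indefinitely for an enterprise that already holds addresses, while device-per-address visions (credit cards, cars) only pencil out under IPv6.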
- "Reclaiming the Digital Commons"
Information Today (06/03) Vol. 20, No. 6, P. 33; Poynder, Richard
Certain information professionals are investigating ways to protect the "digital commons" in response to worries about projects that seek to privatize the Internet by imposing restrictions on public access to copyrighted content. Debate has sprung up regarding whether patented technologies, copyrights, or a combination of the two are a bigger threat to the digital commons: For example, compared to patents, copyrights are relatively cheap and last longer. Many librarians, worried that copyright legislation and digital rights management (DRM) technology will bar the public's access to important historical data, are leading the charge to preserve the public domain. One strategy, exemplified by Canada's Université du Québec à Chicoutimi, is to archive publicly available books and documents online. Another option is copyleft, the implementation of alternative copyright licenses to ensure that content stays freely available to anyone, even when modified; the most famous example is the GNU General Public License (GPL), which allows anyone to use, alter, and redistribute software under specific circumstances, and stipulates that any modified versions must also be distributed under the GPL. Meanwhile, the Creative Commons released a number of open licenses on the Web for public use in December 2002, the point being to "provide an easy, cheap way for people to announce to the world that they don't mind people making certain uses of their work on certain terms and conditions," according to Creative Commons executive director Glenn Otis Brown. Some people are concerned that releasing digital content into the public domain will be pointless if the search-and-retrieval systems needed to access such information are themselves proprietary, which is why the open-source encyclopedia Wikipedia, which supplies both copyleft content and search tools, is so attractive.
Prentice Hall editor in chief Mark Taub argues that open licensing can actually help print sales by allowing customers to sample content online prior to purchase.