ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM.

To send comments, please write to [email protected].

Volume 5, Issue 568: Friday, November 7, 2003

  • "W3C Criticizes Antirobot Tests"
    CNet (11/06/03); Festa, Paul

    Visual verification tests employed at popular Web sites such as VeriSign's Network Solutions, CNet's News.com, and Yahoo! and Microsoft's free email services are designed to thwart software robots trying to register accounts and amass data for spamming and other intrusive purposes, but a Nov. 5 draft from the World Wide Web Consortium (W3C) slams the tests for locking out visually handicapped users. The tests require account applicants and other users to read and type out characters that are visually distorted in bitmap images, a procedure that human beings can pass but computers cannot. W3C Web accessibility expert Matt May wrote in the draft that the test "comes at a huge price to users who are blind, visually impaired or dyslexic." Work-arounds are offered at certain sites: Hotmail provides an audio verification test, while Yahoo! reportedly allows users to call the company's customer service department for confirmation; the W3C draft notes that the first work-around involves a certain degree of distortion that could confuse people, while the second could impose a delay of up to 24 hours that might prevent users from completing transactions. The W3C pointed out that there are alternatives to the visual verification tests--biometric devices, credit card verification, logic tests, etc.--but none of them are foolproof. "What we attempted to do was provide a way to help people think through the problem they're trying to solve and to point out that the [visual verification] solution may not be the solution they think it is," declared May. The W3C reported that the spread of such tests is leading more international industry, standards, and accessibility groups to focus on the accessibility issue. Meanwhile, May hopes the best solution lies in federated identity systems that allow people to set up online IDs that are hard to spoof.
    Click Here to View Full Article

  • "Could Antivirus Apps Become Law?"
    IDG News Service (11/06/03); Gross, Grant

    Rep. Charles Bass (R-N.H.) suggested at a Nov. 6 congressional committee hearing that the nation's critical infrastructure could be bolstered by a federal mandate for all U.S. computer users to deploy antivirus software on their PCs. His proposal was sharply criticized by computer experts, who cited both ethical and technical reasons why such a measure would not work: VeriSign's Ken Silva said that such a law would be "tantamount to trimming a little fat off the Constitution" and that users would balk, while Internet Security Alliance CEO Bill Hancock noted that computers controlling factory automation and power plants are not designed to run antivirus software, so forcing them to do so could lead to an infrastructure collapse. There was also a lack of consensus over other ways the government could encourage cybersecurity--Richard Pethia of Carnegie Mellon University's CERT Coordination Center suggested that software vendors should be pressured to write more glitch-resistant code, a goal Silva claimed is unattainable. Silva and Hancock supported congressional promotion of cybersecurity education, while Pethia was skeptical that enough computer users could be reached through such an initiative, insisting instead that vendors must be held liable for security flaws in their products. Rep. Gene Green (D-Texas) declared that "The combination of email spam and viruses is like putting a SARS patient on every airline flight in the country," and argued that his Anti-Spam Act of 2003 would be an effective antivirus measure. A greater commitment of law enforcement resources to anti-cybercrime efforts was supported by Hancock and Business Software Alliance (BSA) President Robert Holleyman, with the latter also lobbying for international accords on cybercrime law enforcement and the creation of a global "culture of security." Hancock pointed out that American statutes will not curb cybercrime by themselves, since hackers and spammers will still find safe havens outside the United States.
    Click Here to View Full Article

  • "Despite Lack of Chads, Touch-Screen Voting Systems Drawing Fire"
    Investor's Business Daily (11/06/03) P. A8; Deagon, Brian

    California's decision on Nov. 4 to postpone the certification of an upgrade to software used in Diebold's touch-screen voting machines is seen as a triumph for academic researchers, computer scientists, and grass-roots activists who claim the systems are vulnerable to cyberspace intrusions. Doug Stone, a representative for California's secretary of state, reports that California intends to eventually buy more touch-screen e-voting machines, but purchases are on hold until allegations that Alameda County employed uncertified software in a recent election are thoroughly probed. Touch-screen machines are used in approximately 200 municipalities across 13 states, and more implementations are expected in order to comply with a federal mandate to modernize state voting systems by 2006. But Aviel Rubin of Johns Hopkins University says the transition to e-voting is being rushed, which leads to less rigorous security analysis and oversight. Rubin co-authored a report noting that vendors are proceeding to market products regardless of scientists' warnings about security and reliability issues, and criticized Diebold in particular for not adhering to a meticulous software engineering ethic. Some states have delayed their purchases of e-voting machines because of the report, while others are forging ahead because they are confident that the electoral process and the way voting machines are used ensure adequate security. Worries about the technology are increasing partly because of internal data and documents from Diebold and Sequoia Systems leaked onto the Web, which some critics say contain "smoking gun" evidence of security vulnerabilities. Meanwhile, legislation from Rep. Rush Holt (D-N.J.) calling for all voting machines to print a voter-verifiable ballot has yet to reach committee.

    To learn more about ACM's activities regarding e-voting, visit http://www.acm.org/usacm.

  • "New Measure of Success Cited for Statistical Prediction"
    EE Times (11/06/03); Brown, Chappell

    University of California-San Diego researchers have gained new insight into the well-known Good-Turing estimator, which was used to help break German ciphers during World War II. Variations of the estimator, created by I.J. Good and Alan Turing, are used today in spell-checking, data retrieval, and speech recognition programs. The researchers have discovered ways to quantify the accuracy of the Good-Turing algorithm, paving the way for near-optimal probability estimation and more advanced machine-learning software. Although using estimators with small sets of data is relatively easy, figuring out the probability of large sequences requires careful tailoring of the Good-Turing algorithm. Turing devised the estimator to model the probability that particular German keys would be used in secret communications, but the large set of keys introduced too much noise--a problem Good helped solve with a smoothing algorithm. Today, different smoothing algorithms are suited to different applications. By creating a mathematical formulation of how various Good-Turing estimators approach a definite limit, University of California-San Diego researcher Alon Orlitsky and colleagues expect to be able to create near-perfect estimators. Orlitsky says a surprising result of the research is that the famous add-constant estimator invented by Pierre-Simon Laplace, a pioneer of probability theory, is not as good as widely believed; although effective for small data sets, the add-constant estimator provides increasingly poor estimates as the data stream lengthens (a simplified sketch contrasting the two estimators follows below).
    Click Here to View Full Article
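
    As a rough illustration of the contrast described above (not code from the UCSD work; the function names and sample data are invented for the example), the sketch below shows the simple Good-Turing rule--reserve probability mass N1/N for never-seen items, where N1 is the number of items observed exactly once--alongside Laplace's add-constant estimator.

        # Illustrative sketch only: simple Good-Turing unseen-mass estimate vs.
        # Laplace's add-one (add-constant) estimator.
        from collections import Counter

        def good_turing_unseen_mass(sample):
            """Probability mass reserved for unseen items: N1 / N,
            where N1 is the number of items seen exactly once."""
            counts = Counter(sample)
            singletons = sum(1 for c in counts.values() if c == 1)
            return singletons / len(sample)

        def add_one_probability(item, sample, vocabulary_size):
            """Laplace's add-constant estimator with constant 1: (count + 1) / (N + V)."""
            counts = Counter(sample)
            return (counts[item] + 1) / (len(sample) + vocabulary_size)

        words = "the cat sat on the mat the cat ran".split()
        print(good_turing_unseen_mass(words))            # mass held back for unseen words
        print(add_one_probability("dog", words, 1000))   # add-one estimate for an unseen word

    As the sample grows, add-one keeps handing every unseen item the same fixed boost, which gives some intuition for why the add-constant estimator degrades on long data streams while Good-Turing variants hold up.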

  • "Virginia Tech to Offer Supercomputer in a Box"
    TechWeb (11/05/03); Gardner, W. David

    Virginia Tech is expected to release its groundbreaking off-the-shelf supercomputer architecture as a "supercomputer kit" once licensing and patenting issues are addressed, according to a school representative. With the assistance of 165 faculty members and students, Virginia Tech's computer science department installed a supercomputing cluster of 1,100 machines in 10 days at a cost of only $5.2 million, using commercially available dual-processor G5 Macs from Apple, InfiniBand cluster technology from Mellanox Technologies, a customized cooling system from Emerson Network Power, and secondary Gigabit Ethernet links from Cisco Systems. The system, which can manage 17 trillion operations per second, is expected to rank at least fourth in the list of the world's top supercomputers compiled by the University of Tennessee's Jack Dongarra. In contrast, Japan's Earth Simulator can handle 35 trillion operations per second, but that machine was built at a cost that topped $250 million (a rough cost-per-performance comparison follows below). The university representative noted that the National Security Agency and Argonne National Laboratory are among those interested in Virginia Tech's supercomputer technology. The supercomputer at the university will be applied to a variety of research areas, including chemistry, nanoscale electronics, and aerodynamics. The supercomputer kit is expected to become available in early 2004.
    Click Here to View Full Article
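
    A back-of-the-envelope comparison using only the figures quoted above (an illustrative calculation, not an official benchmark):

        # Rough dollars per trillion operations per second, from the figures above.
        virginia_tech_cost, virginia_tech_tops = 5.2e6, 17.0      # dollars, trillion ops/sec
        earth_simulator_cost, earth_simulator_tops = 250e6, 35.0  # cost is a lower bound

        print(virginia_tech_cost / virginia_tech_tops)      # roughly $306,000 per trillion ops/sec
        print(earth_simulator_cost / earth_simulator_tops)  # roughly $7.1 million per trillion ops/sec

    By that crude measure, the off-the-shelf cluster delivers its throughput at a small fraction of the Earth Simulator's cost per unit of performance.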

  • "Tech Prepares for Worldwide Contest"
    Collegiate Times (11/07/03); Pritchard, Jessica

    The Association for Computing Machinery's International Collegiate Programming Contest takes place Saturday at Radford University, where programming teams will compete in one of 30 regional competitions held worldwide. Teams of three programmers each, usually one graduate student with two undergraduates in tow, have five hours and just one computer to complete six to eight complex programming problems. The questions and setup are designed to test not only programming skill, but also ingenuity and teamwork. Virginia Tech is sending five teams of programmers to the contest, a regional the school has historically dominated over the past 20 years, faring especially well in the last 10 and often capturing first place. Virginia Tech senior math major Keith Ferguson says divvying up each problem into solvable components and then debugging the programming work usually takes up most of a team's time. Computer science master's student Greg Grothaus says the format and problems are fairly predictable, but that competing teams improve their performance each year. The competition boosts students' job prospects, says Grothaus, but computer science junior Joe Gleason says the greatest appeal of the event is the chance to test one's skill against the best programmers in the region, as well as internationally. The 2004 World Finals will be held in Prague, Czech Republic, this spring.
    Click Here to View Full Article

  • "The Many Shapes of Tomorrow's PC"
    Business Week (11/04/03); Salkever, Alex

    PCs have shrunk in size and grown in power, but their architecture and design methodology are relatively unchanged, as are their chief uses: word processing, graphics presentation, and spreadsheets. However, PC manufacturers will likely approach personal computing from new angles as they hit technological thresholds over the next 10 years, concurrent with the migration of many PC-associated operations away from the computer and onto networks or the Web. Faster chips guzzle more power and generate more heat, which has prompted researchers to concentrate on chips that operate more efficiently at current clock speeds; Intel's Shekhar Borkar notes that chipmakers could boost performance and promote power efficiency by building a computer on a chip via the inclusion of specialized circuits in the central processing unit, while Cooligy is developing a cooling system in which water is pumped through miniature radiators by electrically charged mechanisms. The proliferation of wireless broadband connectivity, broadband DSL, cable-modem service, and digital cell-phone networks may finally push the concept of the Network PC out of the theoretical realm, because users will be able to capture a fast signal from nearly anywhere. The growing reliability of online computing will reduce the need for laptops, while ergonomic problems with tiny screens should be addressed with the advent of flexible displays. Niche devices with PC-like capabilities have already begun to spread in the form of PCs that do not resemble traditional PCs and other systems, and the Linux open-source operating system is expected to play a dominant role in such devices in the short term. The expansion of storage capacity, furthered by the emergence of terabyte internal drives and other planned innovations, could allow users to realize new applications for PCs and PC-like devices, and perhaps enable home media servers to penetrate the mainstream in the next five years. This could give consumers less reason to have multiple PCs in the household.
    Click Here to View Full Article

  • "Computer Glasses Can Boost Memory"
    United Press International (11/03/03); Choi, Charles

    Memory glasses are spectacles outfitted with computer screens to serve as memory recall aids, and MIT researcher Rich DeVaul says a commercial product based on this concept could emerge within a few years. "Just about anyone could benefit from this system, particularly busy people who have too much to remember, older people who are having some additional memory difficulties, and people who work in specialized professions such as ER doctors or firefighters who need a huge amount of specialized information at their fingertips, but can't afford to be distracted by conventional memory aids," DeVaul explains. The glasses, which are linked to a computer installed within a vest, project subliminal flashes of data lasting 1/180 of a second so wearers will not be distracted, especially when they are engaged in an activity such as driving or combat. The prototype glasses were tested on volunteers sitting at desktop computers; after the volunteers attempted to memorize 21 name-face pairs displayed on the desktops, they had to match names to faces correctly while the memory glasses flashed one of three types of subliminal cues at them--blank screens, correct names for faces, and incorrect names. The tests showed that volunteers cued with the correct names matched names to faces with 50 percent more accuracy than those who received no cues, while DeVaul discovered that users were not misled even when incorrect names were flashed. "That's an unexpected find, and in science, any unexpected find is worth its weight in gold," notes Thad Starner of the Georgia Institute of Technology, who adds that the technology could be useful on desktops as well as in wearable computers. Many people perceive subliminal messaging as a shady technique designed to influence behavior, and Starner and DeVaul acknowledge that memory glasses will likely have to contend with this stigma.
    Click Here to View Full Article

  • "Passphrase Flaw Exposed in WPA Wireless Security"
    TechNewsWorld (11/06/03); Lyman, Jay

    Bob Moskowitz of TruSecure's ICSA Labs has written an online paper describing security flaws that could permit hackers to breach the Wi-Fi Protected Access (WPA) wireless cryptography protocol through dictionary attacks, and he urges WPA users to abandon the traditional practice of using simple eight-character passwords and get in the habit of employing longer passphrases. The researcher notes that the passphrase problem was discussed as the latest WPA standard was being created and deployed about 12 months ago, but points out that the specification's ease of implementation has resulted in greater vulnerability to attack. Moskowitz explains that intruders armed with network sniffers could compromise WPA (a simplified sketch of how a passphrase becomes a WPA key, and of such a dictionary attack, follows below). The WPA standard recommends passphrases of at least 20 characters; the maximum length is 63 characters. "If vendors supplied a tool to make good passphrases and allow people to put them in, that's all that would be needed," Moskowitz remarks. He maintains that the WPA protocol is a solid and measurable improvement over Wired Equivalent Privacy (WEP), despite the passphrase flaw. Moskowitz also reports that Microsoft's free Windows XP WPA software provides better security by employing separate encryption keys for different systems that link to the network, instead of a single key that could be exploited by hackers.
    Click Here to View Full Article
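
    Some background on why passphrase length matters here: in WPA's pre-shared-key mode, the key is derived from the passphrase and the network name (SSID) with PBKDF2-HMAC-SHA1, so an attacker who sniffs the right traffic can test candidate passphrases offline. The sketch below is a simplified illustration of that offline guessing, with invented names and data; it is not Moskowitz's code.

        # Simplified illustration of a WPA-PSK dictionary attack.
        import hashlib

        def wpa_psk(passphrase, ssid):
            """Derive the 256-bit pre-shared key from a passphrase and SSID
            (PBKDF2 with HMAC-SHA1 and 4096 iterations, as WPA-PSK specifies)."""
            return hashlib.pbkdf2_hmac("sha1", passphrase.encode(), ssid.encode(), 4096, dklen=32)

        def try_dictionary(captured_key, ssid, candidates):
            """Test candidate passphrases offline against a captured key."""
            for candidate in candidates:
                if wpa_psk(candidate, ssid) == captured_key:
                    return candidate
            return None

        captured = wpa_psk("password", "HomeNetwork")   # a weak eight-character choice
        print(try_dictionary(captured, "HomeNetwork", ["letmein", "password", "dragon"]))

    A long, uncommon passphrase is effective precisely because it is unlikely to appear in any such candidate list.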

  • "Study: Salary Gloom Ahead for IT Workers"
    CNet (11/06/03); Frauenheim, Ed

    Overall starting pay in the IT sector is set to fall an average of 1.6 percent next year, though certain in-demand areas such as IT security and systems audit will see gains, concludes a new report from IT staffing firm Robert Half Technology. Robert Half Technology executive director Katherine Spencer Lee says positions that saw tremendous growth during the late 1990s are now seeing reduced base pay and that CIOs are tightening their requirements for new hires. Other job data for the IT industry has been equally depressing, with the Economic Policy Institute reporting the highest unemployment on record for mathematicians and computer scientists, along with reduced inflation-adjusted wages for technical and professional workers. Meanwhile, the American Electronics Association reports that U.S. technology-sector employment fell roughly 10 percent to 5.1 million jobs between January 2001 and December 2002. Salary and employment figures have fallen along with the drop-off in corporate IT spending and the increased use of foreign labor, either through outsourcing or through H-1B and L-1 workers in the United States. Meta Group offers a contrarian view of the IT-sector gloom: The research firm's survey of North American IT managers shows a 5 percent rise in base salary overall. The Robert Half Technology report said data security analysts would see the greatest increase in starting salary--2.1 percent--while LAN administrators and desktop support analysts are expected to see 4.5 percent and 5.3 percent drops in base salary next year, respectively. The report says starting salaries for application architects, senior Internet developers, and systems auditors should rise next year.
    Click Here to View Full Article

  • "A World of Opportunity for Tech"
    Fortune (11/05/03); Kirkpatrick, David

    Information technology use and its global importance will be the theme of the United Nations World Summit on the Information Society in December, and David Kirkpatrick writes that all tech companies should support the conference and send executives to the event. Nitin Desai, U.N. Secretary-General Kofi Annan's special advisor for the Summit, says the event's timing is critical because technology demand is no longer a given. Desai says, "Now the opportunity is to get technology into geographies where it has not penetrated, and also to sell it into new areas of application" such as health, education, and government. The advisor says the goal of the Summit is twofold: To create a set of policy objectives for technology that will be accepted by all world nations, and to showcase successful applications and systems that can help facilitate development at a "policy trade show." Desai plans to spotlight projects such as an Indian initiative in which land records are made available online to peasants via kiosks, and the public posting of municipal contracts in Seoul, South Korea, over the Internet. Desai notes that many companies believe the summit is strictly a telecom conference, and acknowledges that the U.N. has not done enough to educate businesses otherwise. But he also points out that U.S. businesses are especially ambivalent because of a deep-seated skepticism about public intervention's advantages. Kirkpatrick observes that many U.N. member nations are uninterested in empowering their citizens with technology, which makes the summit look like lip service, to which Desai responds that "By setting markers on what is considered good policy, this sort of conference shifts the weight of opinion worldwide, and that's what counts."
    Click Here to View Full Article

  • "Making the U.S. Safe for Spam"
    Salon.com (11/04/03); Honan, Mathew

    Despite the media buzz surrounding the Senate's recently passed Can Spam Act, the spam situation in the United States is unlikely to improve any time soon. The bill, which is expected to reach the House floor soon, allows legitimate commercial spam as long as it offers a valid opt-out option; in addition, it gives the FTC the go-ahead to research a do-not-spam list similar to the list used for telemarketers. Although the intentions of the bill may be sincere, the effect will be disastrous, according to anti-spam activists. The main flaw is that it does not prohibit unsolicited email; simply offering an opt-out means each consumer email account could potentially receive one email from each of the 24 million small businesses in the United States, not to mention email from large corporations and illegal spammers, says Coalition Against Unsolicited Commercial Email co-founder John Mozena. He estimates that if just 1 percent of those small businesses sent a single message per year, each account would receive about 657 messages a day (the arithmetic is checked below). The federal act would also supersede state laws, such as the California anti-spam law that was set to go into effect in January 2004; that law, seen as an impetus for the national Can Spam Act, would have allowed individuals to sue spammers, effectively creating "an army of enforcement," says anti-spam attorney David Kramer. Under the Can Spam Act, spammers have more protection, since they can only be sued by ISPs, and federal regulators would have the additional burden of proving knowledgeable violation. EarthLink law and public policy vice president David Baker says the Can Spam Act is a move in the right direction, but that the do-not-spam list would be difficult to enforce against unscrupulous spammers; even worse, such a list could potentially be mined for legitimate email addresses.
    Click Here to View Full Article
    (Access to full article available to paying subscribers only.)
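
    Mozena's figure follows directly from the numbers quoted above; the short calculation below is just a check of the arithmetic.

        # 1 percent of 24 million small businesses, each sending one message a year.
        messages_per_year = 0.01 * 24_000_000 * 1
        print(messages_per_year)        # 240,000 messages a year per mailbox
        print(messages_per_year / 365)  # about 657 messages a day

    The point of the estimate is that even a tiny participation rate overwhelms an opt-out-only regime.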

  • "Lab Rat: Swapping Gets Legit"
    Red Herring (11/04/03)

    The much-maligned peer-to-peer (P2P) methodology may finally gain legitimacy, with corporations employing it to facilitate the exchange of data across distributed networks while researchers throughout the United States work to boost the security of P2P network content. A Stanford University P2P project initiated by HighWire Press assistant director Vicky Reich and developed by software engineer David Rosenthal sought to solve the problem of securing and accessing electronic journals at the school's libraries without relying on a centralized architecture. Rosenthal's solution, LOCKSS (Lots of Copies Keep Stuff Safe), is a secure, low-cost, and simple P2P digital preservation system built with funding from the National Science Foundation, Sun Microsystems Labs, Stanford Libraries, and the Andrew W. Mellon Foundation. All a library needs to join LOCKSS is the software client installed on a PC that is connected to the Internet and to the library's local network. LOCKSS generates a permanent cache of journal articles that is never erased, but it can only archive copies of the journals to which a specific library holds legitimate subscriptions. The security of cached files is maintained by a continuous auditing process that compares the same articles stored in the many caches across the network (a simplified sketch of the idea follows below), though this process is slow. Rosenthal reports that LOCKSS is an excellent means of storing static documents that need to be preserved for long periods. He says the project's ongoing goal is "to make a system that is hard for the bad guys to interfere with, without requiring that secrets--such as passwords and encryption keys--be kept secret for long periods of time."
    Click Here to View Full Article
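
    The auditing idea can be pictured with a small sketch (a deliberately simplified assumption, not the actual LOCKSS protocol): peers holding the same article compare content digests, and a cache whose digest disagrees with the majority is flagged so it can repair its copy from an agreeing peer.

        # Simplified illustration of majority-vote cache auditing.
        import hashlib
        from collections import Counter

        def digest(copy):
            """Content fingerprint used to compare stored copies."""
            return hashlib.sha256(copy).hexdigest()

        def audit(copies):
            """Return the caches whose stored copy disagrees with the majority digest."""
            votes = Counter(digest(copy) for copy in copies.values())
            majority_digest, _ = votes.most_common(1)[0]
            return [name for name, copy in copies.items() if digest(copy) != majority_digest]

        caches = {
            "library_a": b"journal article, volume 5",
            "library_b": b"journal article, volume 5",
            "library_c": b"journal article, volume 5 -- corrupted",
        }
        print(audit(caches))   # ['library_c'], flagged for repair from an agreeing peer

    In this picture a slow audit is tolerable: damage only needs to be detected and repaired eventually, which suits static documents preserved over long periods.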

  • "Experts Debating Future of IT Careers"
    Dallas Morning News (11/02/03); Godinez, Victor

    Richard Ellis of Ellis Research Services is one of a group of experts arguing that the offshoring of entry- and mid-level technology jobs has led to a permanent reduction in demand for IT professionals in the United States. He adds that although the pervasiveness of technology may increase, there is an overabundance of people available to build that technology. A recent report from Ellis on the status of the IT workforce estimates that the number of IT jobs, which stood at 2.5 million three years ago, has dipped by 150,000 since then, while a Gartner study predicts that 10 percent of professional IT positions and 5 percent of non-IT positions will be outsourced to overseas workers by the end of next year. "The United States does not lack--either now or in the foreseeable future--sufficient numbers of capable people who would like to work in IT," writes Ellis. "But those people may not be willing to conclude that long-run demands for their services will be good enough to support IT as a sensible career choice." There has also been a national decline in U.S. computer science and computer engineering enrollments: A report from the Computing Research Association indicates that the number of computer science and computer engineering program entrants fell from 23,090 to 23,033 last year; enrollment had jumped from about 10,000 in 1995 to 20,000 in 1997. Dr. Stu Zweben of Ohio State University is confident that, despite the permanent outsourcing of many entry-level positions, employers will seek more skilled IT employees in the United States. However, Ellis foresees a significant erosion of middle-class U.S. tech jobs if current trends are allowed to continue, and notes the lack of any serious strategy to address the problem.
    Click Here to View Full Article

  • "Indiana University's Advanced Visualization Lab Develops, Licenses, Deploys 'John-e-Box' Stereo Display Systems"
    AScribe Newswire (11/04/03)

    Recent innovations in off-the-shelf equipment including digital projectors, PC processors and graphics cards, and open source software tools are incorporated into the John-e-Box, a portable 3D stereo display system designed, licensed, and deployed by Indiana University's Advanced Visualization Lab as an analysis and communications instrument for researchers, students, and artists. Small groups of users can experience the John-e-Box's 3D effect simultaneously, and use it to understand the 3D architecture of engineering models, virtual environments, scientific data, and medical scans. As part of a National Science Foundation grant, IU will initially deploy eight John-e-Box systems across the campuses of IU Bloomington, IUPUI, and IU Northwest. The device was developed by the AVL's John N. Huffman and Eric Wernert, with the assistance of John C. Huffman of IUB's Chemistry Department, while Indianapolis-based CAE-net is commercializing the technology. The John-e-Box is part of an overarching initiative to allow users to directly access visualization technologies in IU labs, classrooms, and studios. "The John-e-Box represents significant steps forward in usability, maintainability, and affordability," declares IU Associate VP for Research and Academic Computing Bradley Wheeler. The AVL is developing, enhancing, or implementing a variety of visualization, virtual reality, advanced graphics, and visual collaboration technologies, of which the John-e-Box is just one component. "Many of the same basic technologies that have brought about revolutions in the gaming, entertainment and home theater industries are being harnessed to bring about significant improvements in the accessibility and affordability of these technologies to the broader university community," notes Wernert.
    Click Here to View Full Article

  • "Face-off: Should the U.S. Govt. Do More to Keep IT Jobs for Citizens?"
    Network World (11/03/03) Vol. 20, No. 44, P. 39; Biggs, Matthew; Turner, Scott

    Matthew Biggs of the International Federation of Professional & Technical Engineers reports that globalization is eroding the dominance of U.S. high-tech industries and destabilizing the security of high-tech and IT companies and their employees. Among the globalization-related causes Biggs cites are government-sanctioned free trade policies, a shortage of domestic education, tax schemes that allow and encourage corporations to export jobs to cheaper overseas labor markets, and visa programs that permit foreign guest-workers to displace American workers. "While corporate leaders casually refer to these practices as 'sound business policy,' we in the labor community more accurately describe it as 'corporate malfeasance,'" the author writes. Biggs adds that federal legislators must lobby for "fair and balanced trade policies" and oppose corporate tax schemes and unfair guest-worker visa programs if the United States is to avoid an economic debacle. NaviSite systems engineer Scott Turner comments that IT outsourcing is simply the latest manifestation of the historical commoditization cycle, and he believes that government attempts to control this trend will, in the end, come to naught. "We...have to understand, and accept, that the 'Technological Revolution of the 1990s' was a rare and wonderful one-time occurrence that we were lucky to have experienced, but it is not the job of the U.S. government to artificially and indefinitely extend that boom," Turner argues. The author notes, however, that IT workers can avoid losing their jobs to outsourcing by freshening up their skills and using them to boost the bottom line, thus cementing their value to the company. Turner also points out that small-to-midsize businesses are unlikely to outsource personnel because their IT development projects are too small to reap benefits from offshoring, while all companies still need help-desk and desk-side support workers.
    Click Here to View Full Article

  • "Developing IT Skills 'To Go'"
    Computerworld (11/03/03) Vol. 31, No. 50, P. 44; Brandel, Mary

    November will see the launch of the National Information Technology Apprenticeship System (NITAS), which seeks to build skills and credentials in at least seven IT tracks, and six NITAS pilot programs administered by the Computing Technology Industry Association (CompTIA) are producing data used to determine the components of these tracks. The first two NITAS tracks, IT Generalist and IT Project Management, are ready for deployment; tracks planned for next year include information assurance/security, IT enterprise management, database, Web e-commerce, and network specialization. In one of the pilot programs, McDonald's employees are trained to become IT project managers by mastering 37 skills while simultaneously applying those skills to real-world initiatives. "It responds to what employers are asking for--a combination of knowledge, certification and experience," explains Neill Hopkins of CompTIA. "This is the first system of its kind to directly relate IT training and certifications with actual on-the-job skill validation." McDonald's apprentices regularly meet with mentors to measure their progress, a process that CompTIA describes as contextual learning. Meanwhile, the Naval Undersea Warfare Center's Keyport, Wash., division is engaged in an apprenticeship program requiring participants to master key "building blocks," complete 120 credits of education or training, earn four certifications, and amass 4,000 hours of on-the-job learning over a three-year period. Robert Half Technology IT recruiter Jeff Markham reports that economic factors will play a critical role in wider corporate adoption of such apprenticeship programs.
    Click Here to View Full Article

  • "Open Source Everywhere"
    Wired (11/03) Vol. 11, No. 11, P. 158; Goetz, Thomas

    Open source is growing very popular among scientists and innovators because it allows challenging problems to be addressed with democratic principles, and its long-term promise is more efficient and accelerated innovation across many fields. One of the most widely publicized applications of the open-source approach is software development, which provides an excellent model for the strategy; open-source software is freely distributed to thousands of people via the Internet, and these thousands experiment and refine the code to improve it overall. Open source's popularity has expanded significantly because of the Internet, which makes massive decentralized projects feasible, while researchers have embraced open source as an alternative to a rigid intellectual property scheme that is seen as a hindrance to innovation. The U.S. patenting system has mutated into a cumbersome architecture that keeps important research and information out of the public domain, and gives intellectual property owners the right to charge for use of concepts and ideas, not just products--and scientists, engineers, scholars, programmers, and designers view this situation as anathema to true progress. Unsurprisingly, patent holders and companies that follow the closed R&D approach see open source as a threat, when in actuality it is far less costly to license from a trusted collaborative project. Open source, combined with virtually nonexistent replication and distribution costs stemming from technological advancement, could rekindle listless economic sectors, or support the creation of new ones. "Open source can build around the blockages of the industrial producers of the 20th century," says Yale law professor Yochai Benkler. "It can provide a potential source of knowledge materials from which we can build the culture and economy of the 21st century."
    Click Here to View Full Article

 
                                                                             