Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 7, Issue 774:  Monday, April 4, 2005

  • "Pentagon Redirects Its Research Dollars"
    New York Times (04/02/05) P. B1; Markoff, John

    The Defense Advanced Research Projects Agency (Darpa) is cutting back its basic computer science research at universities in favor of military contractors and projects focused on short-term results. Pentagon officials acknowledged the shift in priorities for the first time at a recent Senate Armed Services Committee hearing, where they said university projects accounted for only $123 million of the $583 million spent last year on computer science research, compared to $214 million out of $546 million devoted to university computer science projects in 2001. ACM President and University of California, Berkeley computer scientist David Patterson said the cutback could stanch the flow of important technology advances in the future. He says, "I'm worried and depressed. I think there will be great technologies that won't be there down the road when we need them." Besides cutting funding, Darpa has added requirements that graduate student contributors have U.S. citizenship and has classified some projects even though they were to be distributed under open source licenses. Darpa director Anthony J. Tether has led the changes in computer science funding and has said increased secrecy is needed for computer science projects because computer technology is now as vital to military operations as weapons themselves. The Network Embedded Sensor Technology program that was underway at five universities has been diverted to military contractors, for example. Darpa's funding changes have led some researchers to decline working with the agency, while others decry the shift away from long-term projects. University of Washington computer scientist Ed Lazowska says, "The federal government is...killing the goose that laid the golden egg." Former Darpa administrator Robert Kahn, now Corporation for National Research Initiatives president, said the changes are a necessary response to the pressures and demands on the agency.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Computers Obeying Brain Signals"
    Associated Press (04/03/05); Ritter, Malcolm

    Technologies designed to improve the quality of life for paralytics and other mobility-challenged people by enabling them to communicate and operate devices using brain-wave signals are under development around the world. The last few years have seen many promising brain-machine interface technologies come to the fore, thanks to combined advances in neuroscience, computer software, and electronics. Dr. Jonathan Wolpaw of the Wadsworth Center of the New York State Department of Health's Brain-Computer Interface lab has developed a wearable nylon mesh equipped with electrodes that monitor neural activity near the brain's surface and measure the brain's "beta rhythm" to direct the movements of an onscreen cursor; the beta rhythm originates from the section of the brain that receives movement-related information, and the patient learns to adjust this rhythm in order to control the cursor. Cyberkinetics Neurotechnology Systems chief science officer John Donoghue thinks implantable brain-computer interfaces such as his BrainGate are more effective tools because they allow patients to operate devices simply by imagining their limbs moving. Scientists say implantable systems can facilitate much more sophisticated control and natural movement of artificial or even human limbs than Wolpaw's scalp recorder, though Wolpaw thinks that scalp electrodes could translate thought into movement just as effectively when combined with software. Duke University scientist Miguel Nicolelis envisions lightweight "wearable robots" that enable paralytics to walk and reach for objects, while Dr. Philip Kennedy of Neural Signals believes electrodes implanted in the brain's speech centers would enable "locked-in" patients to communicate via a synthesizer.
    Click Here to View Full Article
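
    As a rough sketch (not the Wadsworth lab's actual signal processing), beta-rhythm cursor control amounts to estimating signal power in the beta band (roughly 13-30 Hz) and mapping it to a cursor command. The toy below uses the Goertzel algorithm on a synthetic 20 Hz signal; the sampling rate and threshold are arbitrary choices for the example.

```python
import math

# Toy beta-rhythm cursor control: estimate power near 20 Hz (beta band)
# with the Goertzel algorithm and map it to a cursor direction.
def goertzel_power(samples, freq, rate):
    w = 2 * math.pi * freq / rate
    coeff = 2 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    # Squared magnitude of the signal component at `freq`
    return s_prev**2 + s_prev2**2 - coeff * s_prev * s_prev2

def cursor_direction(samples, rate=256, threshold=1000.0):
    return "up" if goertzel_power(samples, 20.0, rate) > threshold else "down"

rate = 256
strong_beta = [math.sin(2 * math.pi * 20 * t / rate) for t in range(rate)]
weak_beta = [0.05 * s for s in strong_beta]
print(cursor_direction(strong_beta), cursor_direction(weak_beta))  # up down
```

    In a real system the patient, given visual feedback, gradually learns to raise or suppress this band power at will; the software's job is the feature extraction and mapping shown here.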

  • "Revamp for Web Navigation System Urged"
    New Scientist (03/31/05); Biever, Celeste

    A report funded by the U.S. National Academies, the Department of Commerce, and the National Science Foundation is calling for changes that would make the Internet's Domain Name System (DNS) less susceptible to corruption by spammers and identity thieves. The DNS is a network of servers that connects domain names to Internet protocol addresses. According to the report, DNS records are too easily corrupted by denial-of-service (DoS) and spoofing attacks that can disable Web sites. The report suggests increasing the number of "copy" DNS servers and distributing them throughout the world to help thwart such DoS attacks. In order to prevent another common scam, in which identity thieves "poison" a cache of DNS records in order to point legitimate Web addresses (usually banking Web sites) to fake sites, the report calls for the deployment of DNS Security Extensions (DNSSEC), which use cryptographic keys to let resolvers verify that returned DNS records are authentic. The report authors also recommend that the DNS "continue to be run by a non-governmental body," although the system's current governing body, ICANN, is affiliated with the U.S. government. Karl Auerbach, a former member of ICANN, says a better alternative would be to have DNS servers run by several different private companies in a free-market system, which he says would make the system harder to attack.
    Click Here to View Full Article
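
    The cache-poisoning scam described above can be sketched with a toy validating resolver. This is an illustration only, not real DNSSEC (which uses public-key signatures, chains of trust, and dedicated record types); the zone key, names, and addresses below are invented for the example.

```python
import hmac
import hashlib

# Toy stand-in for a signed DNS record: an HMAC over "name=ip" plays the
# role of DNSSEC's cryptographic signature.
ZONE_KEY = b"zone-signing-key"  # hypothetical key for this sketch

def sign(name, ip):
    return hmac.new(ZONE_KEY, f"{name}={ip}".encode(), hashlib.sha256).hexdigest()

cache = {}

def cache_record(name, ip, sig):
    cache[name] = (ip, sig)  # anyone can write to the cache (the threat)

def resolve(name):
    ip, sig = cache[name]
    # A validating resolver rejects records whose signature does not verify.
    if not hmac.compare_digest(sig, sign(name, ip)):
        raise ValueError(f"bogus record for {name}: validation failed")
    return ip

# Legitimate record, signed with the zone's key:
cache_record("bank.example", "192.0.2.10", sign("bank.example", "192.0.2.10"))
print(resolve("bank.example"))  # 192.0.2.10

# A poisoning attempt injects a forged address it cannot sign:
cache_record("bank.example", "203.0.113.66", "forged")
try:
    resolve("bank.example")
except ValueError as err:
    print(err)
```

    Without the verification step, the second `resolve` call would silently hand back the attacker's address, which is exactly the redirection-to-fake-sites attack the report warns about.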

  • "Women Dominate IT Courses But More Men Get Degrees"
    Stuff (NZ) (04/04/05); Schwarz, Reuben

    The Tertiary Education Commission (TEC) estimates that female IT students at tertiary institutions outnumbered male IT students in 2004, yet more men than women continue to earn IT degrees. Fifty-five percent of students in polytech IT courses last year were women, compared to about 38 percent of students studying IT at university over the past two years; women accounted for 63 percent of IT students at private training schools in 2004, compared to less than 50 percent in 2003. However, just 25 percent of Bachelor of Information Science students at Wellington-based WelTec polytech were female last year, although WelTec director Murray Wills hopes that more women will be drawn to polytech IT courses because classes are smaller and there is a greater focus on application. TEC's Bill Lennox thinks the gap between TEC and polytech estimates may be due to the TEC's concentration on courses taken, compared to polytechs' emphasis on students attempting to gain qualifications. Women in Technology general manager Cheryl Horo notes that TEC's figures do not estimate the number of women who complete their courses or enter the industry upon graduation, which would demonstrate an even smaller female presence. Gwyn Claxton with Auckland University of Technology's (AUT) computer and information sciences school reports that many potential female students are discouraged from enrolling in IT courses because they perceive IT as a geeky, mathematically inclined boys' club. Fifty percent of AUT's faculty are women, but that does not seem to have encouraged more female IT enrollments. Horo sees a need to step up efforts to repudiate the geeky image of IT students by "[promoting] careers in industry rather than qualifications."
    Click Here to View Full Article

    To learn more about ACM's Committee on Women in Computing, visit http://www.acm.org/women.

  • "'Body Talk' Could Control Mobiles"
    BBC News (04/04/05); Twist, Jo

    University of Glasgow professor Stephen Brewster says visually based mobile-device interfaces can be problematic to use on the move, and his team has been working on "audio clouds" to increase the safety and ease of use of such gadgets by facilitating control and operation via sound and movement. "We hope to develop interfaces that are truly mobile, allowing users to concentrate on the real world while interacting with their mobile device as naturally as if they were talking to a friend while walking," Brewster explains. The Engineering and Physical Sciences Research Council is funding the three-year audio clouds research project. Brewster and his Multimodal Interaction Group have devised a method for controlling gadgets using gestural input and 3D audio output; the latter involves the development of bone-conduction headphones. The team has also been working on systems that employ motion-sensitive accelerometers to translate movement into commands for the device. One student project involves a wizard game where each player possesses one of the setups prototyped by Brewster's team. The scheme involves a person "hearing" another person in a different location through the audio cloud, and using gestures to interact with the other person. The Multimodal Interaction Group is also investigating techniques for managing the vast amount of data and functions accessible via smart phones, but Brewster expects a considerable amount of time to pass before audio-cloud gadget control becomes inexpensive and socially acceptable.
    Click Here to View Full Article

  • "Feds Complete Internet Traffic Report"
    Associated Press (03/31/05); Bridis, Ted

    The National Research Council has released an Internet traffic study that was commissioned by Congress seven years ago, just as the Internet boom was getting started. The 283-page report is titled "Signposts in Cyberspace" and remains relevant despite the time that has passed and the changes to the Internet marketplace, according to leading experts. Since the report was commissioned in 1998, Google became a leading Web company, Napster popularized MP3 file sharing, and the number of Web addresses the report was initially slated to study has grown from 2.2 million to over 65 million. The report details information on the domain name system (DNS) and trademarks, and says the current system is extremely well-suited to meet current and future needs of the Internet. The 13 DNS root servers should continue to be operated by volunteers, but their locations should be more geographically spread out to prevent disruption by natural disaster or terrorist attack. Concentrations of root servers around Washington, D.C., and Los Angeles should be dispersed, according to the study, which also advocates more Internet address suffixes to ensure an adequate supply for new Web sites and email accounts. National Research Council computer science and telecommunications director Charles Brownstein says the study was delayed in part because of dramatic changes to the Internet as those issues were being studied, but also because Congress did not fund the project's $1 million price tag until 2001. Paul Vixie, a prominent technologist who helped review the report, says the time it took to produce the study only underscores the rapid pace of change on the Internet.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "IST: Driving Road Transport Research"
    IST Results (03/31/05)

    IST-funded projects will play a significant role in the European Road Transport Advisory Committee's (ERTRAC) 20-year plan to improve road transport through research. The MITRA project's goal is to develop a prototype central information system that police and emergency services could access to know the location and contents of vehicles carrying dangerous materials in real time, thus enabling them to be prepared when they arrive at the scene of an accident. EURAMP, meanwhile, seeks to refine the traffic metering and control systems that coordinate motorway entry in an effort to reduce congestion, and pilot projects to test new metering and control algorithms are planned in Israel, Britain, the Netherlands, Germany, and France. Increasing driver safety through an adaptable driver/vehicle interface is the motivation behind the AIDE project, and Volvo Technology's Jan Arfwidsson says such an interface "can analyze the driver situation and the information load at that moment, and adapt the information accordingly." The Global System for Telematics (GST) project focuses on creating international telematics standards to accommodate the burgeoning population of in-vehicle electronic systems in an effort to make vehicles safer and reduce traffic congestion and atmospheric pollution, according to Peter Van der Perre with ERTICO. Another initiative to improve in-vehicle communications is SAFETEL, whose goal is to fortify in-vehicle electronics against electromagnetic disturbances and interference from other vehicle electronic systems. "We want to produce the standards and test methods that will help designers produce in-vehicle electronic systems that are better-protected against unwanted signals such as noise and cross-talk," notes Giovanni D'Anzieri of Italy's System Design and Research Association.
    Click Here to View Full Article

  • "Computerworld Development Survey Gives Nod to C#"
    Computerworld (03/28/05)

    A Computerworld survey of developers found that 72 percent of respondents used Microsoft's C# programming language, while 66 percent used Java; the third, fourth, and fifth most-used programming languages were Visual Basic (62 percent), C++ (54 percent), and JavaScript/ECMAScript (50 percent). Microsoft .Net was the framework/API of choice for 51 percent of respondents, and 48 percent said Unified Modeling Language was not in use at their organizations, compared to 33 percent reporting that it was. Thirty-seven percent said they used mostly Java, 26 percent said they used mostly .Net, and 23 percent reported using both Java and .Net. Half of the polled developers said they use open-source code, although the majority said that 64-bit applications, Linux applications, or wireless applications are not under development at their organizations. Fifty-eight percent reported that their organizations were already utilizing Web services and developing more projects; 15 percent said pilot Web services projects were underway, 12 percent claimed to have an active interest, and 9 percent indicated no interest. An integrated development environment (IDE) was the preferred editor for 45 percent of respondents, while 35 percent said they used both IDE and text editors and 17 percent said they favored text editors. Thirty-eight percent of those surveyed held the title of IT manager, and 36 percent held the title of developer. Thirty-nine percent of respondents said their company produces or sells software and services for internal use only, and 58 percent indicated that they develop applications that are deployed for corporate-wide or business-to-business use.
    Click Here to View Full Article

  • "How Universities' Intelligent Web Project Unlocks the Information That Really Counts"
    Computer Weekly (03/29/05); Kavanagh, John

    In his annual BCS and Royal Signals Institution lecture, Southampton University School of Electronics and Computer Science professor Nigel Shadbolt discussed intelligent Web searches that would produce highly accurate results through their understanding of Web page contents and their relevance to the user. "The huge growth of the Web and the massive advances in technology mean that computing brute force can deliver so much content so quickly that there is not enough human processing power to go through it in detail," he explained. "We are starting to see a requirement for our machines to know enough about structural descriptions to make the first cut of what we might be interested in, instead of giving all of it." The vehicle for intelligent searches is a "semantic Web" that enables computers to infer context through the use of metadata tags that describe the content's meaning and its relationship to other objects. Shadbolt is coordinating a six-year, multi-university research effort to make intelligent Web search practical, with a budget of 8.8 million pounds. Southampton University has developed a Web service for classifying computer science documents trained on 300,000 papers from a digital library, using the library's classification scheme, the authors' and editors' classifications, and machine learning and statistical analysis. Sheffield University is at work on a service for analyzing any Web page and applying the metadata to objects of interest. Shadbolt said another area of concentration is the storage and retrieval of the reams of metadata, along with the concept of an information lifecycle that spans generation, usage, publication, maintenance, and decommissioning. Thus far the use of the semantic Web has been limited to military and scientific applications by specialist communities, which illustrates the need for consensus of terminology to facilitate effective search and results presentation.
    Click Here to View Full Article
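
    The metadata-tag idea behind the semantic Web can be sketched as a toy store of RDF-style (subject, predicate, object) triples with one step of inference; every identifier below is invented for the example, and real systems use standardized vocabularies and query languages such as RDF Schema and SPARQL.

```python
# Toy "semantic" metadata store: triples describe meaning and relationships,
# so a query can match by type rather than by keyword.
triples = {
    ("paper:42", "dc:title", "Learning to Rank"),
    ("paper:42", "dc:creator", "person:shadbolt"),
    ("paper:42", "rdf:type", "cs:MachineLearning"),
    ("cs:MachineLearning", "rdfs:subClassOf", "cs:ComputerScience"),
}

def query(pred, obj):
    """Subjects matching (?, pred, obj), following one subClassOf step."""
    narrower = {s for s, p, o in triples
                if p == "rdfs:subClassOf" and o == obj}
    targets = {obj} | narrower
    return sorted(s for s, p, o in triples if p == pred and o in targets)

# A keyword search for "ComputerScience" finds no document containing that
# string, but the typed query does, via the subclass relationship:
print(query("rdf:type", "cs:ComputerScience"))  # ['paper:42']
```

    This "first cut" by structural description, rather than by raw text match, is exactly the capability Shadbolt describes.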

  • "Apache Rolls Cocoon 2.1.7"
    DevX News (03/24/05); Kerner, Sean Michael

    The recently released Apache Cocoon 2.1.7 Web development framework from the Apache Software Foundation (ASF) features enhanced speed and ease of use through JDK 5.0 interoperability, various portal engine augmentations, and additional features in the Cocoon Forms architecture, according to Cocoon developer Carsten Ziegeler. "All these changes are driven/feedback by needs/requirements of real world projects," notes Ziegeler, who sees Cocoon as a competitor to other open-source Web development frameworks rather than a rival to enterprise offerings. He reports that the Cocoon Project has long been perceived as a research initiative, but nowadays people are realizing that Cocoon is both a practical and elegant tool for constructing complex Web applications. "It seems that especially the forms framework and the flexible portal engine are two key components [for building successful applications]," Ziegeler remarks. One Cocoon-based application, Apache Lenya Content Management System (CMS), was MIT's leading recommendation for CMS usage in a September 2004 report. Ziegeler foresees the next major Cocoon iteration, version 2.2, as boasting even more ease of use for developers. The Cocoon Project is one of the biggest ASF efforts to be based on Java.
    Click Here to View Full Article

  • "NASA Tests Shape-Shifting Robot Pyramid for Nanotech Swarms"
    SpaceRef.com (03/29/05)

    NASA's TETwalker robot is the prototype for autonomous nanotechnology swarms (ANTS) that will be able to assemble themselves into practical instrumentation or traverse uneven terrain by changing their shape. The TETwalker resembles a three-sided pyramid with electric motors at each corner or "node" that drive the expansion and retraction of telescoping struts that form the sides; this constitutes the robot's means of locomotion. The robot's field tests in Antarctica revealed that its performance could be improved by design modifications: Relocating the motors from the corners to the middle of the struts, for instance, will boost the nodes' reliability. Shrinking the TETwalkers down can be accomplished by replacing the motors with micro- and nano-electro-mechanical systems and the struts with metal tape or carbon nanotubes. TETwalker swarms would be able to reconfigure themselves to perform numerous tasks, such as flattening themselves into an aerodynamic shield while traveling through a planet's atmosphere; forming into a snaky configuration to slither over rocky terrain; and growing an antenna to send information to Earth. Furthermore, the TETwalkers' nodes will be designed to disconnect and reconnect to different struts so any damage the swarm suffers can be repaired. The researchers are investigating the facilitation of autonomous swarm movement, navigation, and collaboration through artificial intelligence. The stability and strength provided by the TETwalkers' tetrahedral shape means that they will not be impeded by falls--unlike current robotic rovers, notes ANTS project principal investigator Dr. Steven Curtis.
    Click Here to View Full Article

  • "MIMO Holds Promise for Wireless"
    eWeek (03/28/05); Garcia, Andrew

    New MIMO technology promises to dramatically increase the range and throughput of wireless networks, but proposed MIMO implementations could hinder either performance or compatibility with existing technology. MIMO-OFDM (multiple-input, multiple-output orthogonal frequency-division multiplexing) will be part of the IEEE 802.11n standard due for ratification late next year, and the TGn Sync proposal, which offers multiple antennas and spatial multiplexing technology, gained an advantage over the competing WWise proposal in this month's downselect vote. While garnering a majority, the TGn Sync proposal still lacked the 75 percent vote needed to confirm its position as a first draft of the standard. Among the mandates for 802.11n MIMO technology is backward compatibility with previous 802.11 technologies, which will mean switching from higher-capacity 40 MHz channels to the 20 MHz channels used in older implementations when such devices are present. Testers at eWeek Labs measured approximately twice the bandwidth with Linksys' MIMO-enabled Wireless-G Broadband Router with SRX and CardBus client adapter products over standard 802.11g technologies, though the performance boosts required slight changes to access point settings; also, large organizations will likely be hesitant about using CardBus adapters. The Linksys proprietary MIMO products use Airgo's True MIMO with spatial multiplexing. Spatial multiplexing involves separate spatial signatures sent over the same channel, while another MIMO proposal, Video54's BeamFlex, uses multiple identical data streams to boost signal distance and coverage. Another beamforming technology from Atheros Communications uses two channels, which could prove problematic in crowded RF environments.
    Click Here to View Full Article
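
    Spatial multiplexing as described above can be sketched with a toy, noiseless 2x2 channel: two symbol streams share the same channel at the same time, and a receiver that knows the channel matrix separates them by inverting it (a zero-forcing receiver). Real receivers contend with noise, channel estimation, and OFDM; the channel gains below are invented.

```python
# Toy 2x2 spatial multiplexing, real-valued and noiseless for clarity.
H = [[0.9, 0.3],   # H[i][j]: gain from transmit antenna j to receive antenna i
     [0.2, 0.8]]

def transmit(x):
    """y = H x: what the two receive antennas see when both streams overlap."""
    return [H[0][0] * x[0] + H[0][1] * x[1],
            H[1][0] * x[0] + H[1][1] * x[1]]

def recover(y):
    """Zero-forcing: x = H^-1 y, using the explicit 2x2 inverse."""
    det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
    return [(H[1][1] * y[0] - H[0][1] * y[1]) / det,
            (-H[1][0] * y[0] + H[0][0] * y[1]) / det]

sent = [1.0, -1.0]                 # one symbol per antenna, same time slot
received = transmit(sent)          # the streams arrive mixed together
print([round(s, 6) for s in recover(received)])  # [1.0, -1.0]
```

    The payoff is that two symbols were delivered in the airtime of one, which is where MIMO's throughput gain comes from; the cost is needing multiple antennas and an accurate estimate of H.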

  • "Carnegie Mellon Unit Looks to Advance IT Security, Reliability"
    Computerworld (03/28/05) P. 23; Thibodeau, Patrick

    Pradeep Khosla, dean of Carnegie Mellon University's Carnegie Institute of Technology and co-director of CyLab, explains in an interview that CyLab is focusing on next-generation IT systems that incorporate measurability, sustainability, security, and trustworthiness. He says that CyLab absorbed the Sustainable Computing Consortium, whose goal was to enhance the quality and reliability of software by reducing the number of bugs. Khosla says CyLab splits up its research into "thrusts": its resilient and self-healing systems thrust, for example, is not about security per se, although it does address some security issues. Other thrusts Khosla mentions cover user authentication and access control, data and information privacy, business economics, and threat detection modeling. The CyLab co-director notes that CyLab has the same goals as IBM's autonomic computing initiative, although their approaches differ--CyLab, for instance, usually concentrates on higher-risk problems. Khosla reports that CyLab has produced a practical secure storage demo system which is being expanded to include self-security, self-analysis, and self-repair. Such a system would enable users to trace data packets back to the source, and Khosla predicts that a lab-developed coding scheme for facilitating packet tracing will become commonplace in the next three to five years. He thinks CyLab's backers could put malicious code detection on the CyLab 2006 agenda at next month's meeting.
    Click Here to View Full Article

  • "IPv6 Addresses Its Problems"
    ZDNet UK (03/29/05); Goodwins, Rupert

    IPv6 today offers little benefit to U.S. and European enterprises or ISPs, which receive more immediate payback when investing in basic network capabilities such as bandwidth, security, and reliability. However, IPv6 has garnered significant interest from Asian companies that need the improved configuration and quality-of-service characteristics, not to mention vastly more addresses, for increasingly networked Asian households. With older IPv4, NAT technology allows multiple devices on private networks to use a single public address, but NAT offers limited and inflexible support for multiple services and devices operating behind the router; in contrast, IPv6 offers automatic configuration and understanding of quality-of-service requirements, and can support networked phones, televisions, game consoles, and recording devices. IPv6 improves VoIP operation through home routers, and is sure to benefit from the rapid expansion of VoIP telephony. Japanese ISPs NTT and IIJ are enthusiastic about IPv6, while many other overseas groups want to free themselves of IPv4's address limits. Another aspect of increasing IPv6 interest is government support, such as the Defense Department's June 2003 requirement that vendors support IPv6 in their network equipment. Specialist network deployments such as the European GEANT research system and European Internet Exchange Association provide valuable practical experience for later commercial enterprise rollouts. Many large companies are already accumulating IPv6 technology by default when they buy new equipment, and could have significant infrastructure and implementation resources available once the business benefits of IPv6 migration become more apparent.
    Click Here to View Full Article
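
    The automatic configuration mentioned above can be sketched with the classic modified EUI-64 scheme from IPv6 stateless address autoconfiguration: a host derives a 64-bit interface identifier from its network card's MAC address and appends it to the prefix its router advertises, with no DHCP server or manual setup. The prefix and MAC below are documentation/example values.

```python
import ipaddress

def eui64_interface_id(mac: str) -> int:
    """Modified EUI-64: insert ff:fe into the middle of the 48-bit MAC
    and flip the universal/local bit (bit 2 of the first octet)."""
    octets = [int(b, 16) for b in mac.split(":")]
    octets[0] ^= 0x02                          # flip the U/L bit
    eui = octets[:3] + [0xFF, 0xFE] + octets[3:]
    iid = 0
    for b in eui:
        iid = (iid << 8) | b
    return iid

def slaac_address(prefix: str, mac: str) -> ipaddress.IPv6Address:
    """Combine a router-advertised /64 prefix with the derived identifier."""
    net = ipaddress.IPv6Network(prefix)
    return net[eui64_interface_id(mac)]

addr = slaac_address("2001:db8::/64", "00:1a:2b:3c:4d:5e")
print(addr)  # 2001:db8::21a:2bff:fe3c:4d5e
```

    This is the kind of plug-and-play addressing that makes IPv6 attractive for households full of networked phones, televisions, and consoles, since each device can number itself the moment it hears a router.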

  • "Robot Translators Decipher Mountains of Enemy Messages"
    Knight-Ridder Wire Services (03/16/05); Boyd, Robert S.

    Machine translation (MT) is seen by U.S. intelligence agencies as an important tool in fighting the war against terrorism, as the supply of human translators is limited and a vast backlog of untranslated material exists. Spurred by terrorist attacks and wars in the Middle East, MT research has yielded products that soldiers in the field, for instance, can use to understand the basic meaning of documents in a foreign language, reports Army Research Laboratory researcher Melissa Holland. Booz-Allen Hamilton machine translation systems manager William McClellan notes that MT systems can automatically screen thousands of documents and route those that meet certain criteria to linguists and domain specialists, thus easing military and intelligence officers' burden of determining which documents to prioritize for translation. The original technique for MT was to teach computers the traditional rules of grammar, but this method made for sluggish progress, given the inherent complexity and ambiguity of human language. The technology took an enormous stride in the 1990s with the application of statistical analysis to repositories of untranslated texts in which new messages were compared to millions of archived sentences, words, and phrases to rapidly uncover the most probable translation. Data-driven machine translation is a process in which the computer scans text, lists every possible meaning of every word in the sentence, and organizes them in every possible order until it finds the most likely match for a good translation. Stephen Richardson, head of Microsoft's Machine Translation Project, notes that MT is being increasingly applied to international commerce, although the war on terror is still the chief driver of MT research and development.
    Click Here to View Full Article
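
    The "most probable translation" idea behind the statistical approach described above can be reduced to a toy word-by-word sketch. Real systems learn phrase tables from millions of archived sentence pairs and combine them with language models; the words and probabilities below are invented for illustration.

```python
# Toy statistical translation: for each source word, pick the target word
# with the highest (invented) translation probability.
table = {
    "la":     {"the": 0.9, "it": 0.1},
    "maison": {"house": 0.8, "home": 0.2},
    "bleue":  {"blue": 0.95, "sad": 0.05},
}

def translate(sentence):
    out = []
    for word in sentence.split():
        candidates = table.get(word, {word: 1.0})  # pass unknown words through
        out.append(max(candidates, key=candidates.get))
    return " ".join(out)

print(translate("la maison bleue"))  # the house blue
```

    Note that the output preserves French word order ("the house blue" rather than "the blue house"); this is why production systems also score candidate orderings with a language model rather than translating word by word.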

  • "Road to Mars"
    EE Times (03/28/05) No. 1364, P. 1; Wilson, Ron

    Getting humans to Mars will take technologies that are currently off the industry's road map, but that are possible with basic microelectronics. NASA intends to meet this challenge by incrementally testing each key component, starting with unmanned vehicles sent into orbit, then proceeding to lunar missions and pre-positioned equipment and provisions. "This is a design that depends on modularity," explains Georgia Institute of Technology Guggenheim School of Aerospace Engineering professor Robert Braun, who stresses that the long travel times and communication lags of a Mars mission make self-repairing autonomous systems critical. He also says human-machine communications will need to be raised to higher levels for a successful Mars mission: Not only will robots pave the way for human exploration by setting up fuel, water, and food stores for later missions, but close interaction between people and autonomous agents will be an essential ingredient of manned expeditions. Robot agents will need to maintain abstractions of themselves, continuously updated via sensors that could relay the abstractions' current status to the astronauts. These sensors would need networks for facilitating fault-tolerant, deterministic connectivity in volatile environments, as well as the flexibility to perform segment rerouting should the vehicle suffer damage. Braun sees a need for small, reliable, and implantable sensors and on-board predictive models to monitor the astronauts' health, while processing the huge volume of data acquired by these various sensors will require a massive amount of computing power and logic. The autonomous technologies developed for space missions could be spun off into practical applications ranging from caregiving to battlefield operations to waste clean-up to automotive navigation.
    Click Here to View Full Article

  • "Search for Tomorrow"
    InformationWeek (03/28/05) No. 1032, P. 45; Claburn, Thomas

    Expanding federal and corporate investments in emergent IT areas such as electronic medical records, Internet telephony, and anti-terrorism technology are making improved methods for searching digital information more and more desirable. New approaches are under investigation for making it possible to retrieve information from databases, documents, Web pages, or audio and video clips; identify the names of people, places, dates, organizations, and dollar amounts, and establish their relationships to each other; and extract meaning from sounds and images. Keyword searches are the most common type of search, but retrieving the most relevant results via this method does not work well in enterprises, where experts say about four-fifths of the data on hand is unstructured. Google offers a special server "appliance" that companies can employ to index their information and expose it through the typical Google user interface; Google Enterprise general manager Dave Girouard says Google's PageRank algorithm determines data's relevancy by measuring over 100 variables. Keyword searching is criticized for being insensitive to PC users' activities, and Autonomy produces software that performs background scans of an organization's documents and then suggests search results that may be most relevant to the user's current project. IBM engineers are looking into search engines that can analyze multimedia, one example being the Unstructured Information Management Architecture, which is designed to assist other programs' acquisition, analysis, and presentable configuration of text, audio, and video. Other new search technologies being developed include Nexidia's "phonetic search engine" that can search recordings 50 times faster than their actual playback speed by analyzing phonemes, and a "question and answer system" from Arizona State University professor Dmitri Roussinov that can answer queries without depending on a database of linguistic conventions.
    Click Here to View Full Article
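
    The link-analysis core of an algorithm like PageRank (one of the 100-plus variables Girouard mentions) can be sketched as a power iteration over a toy three-page graph; the graph and damping factor below are illustrative only.

```python
# Minimal PageRank by power iteration: a page's rank is the chance a random
# surfer, who follows links with probability d and jumps to a random page
# otherwise, lands on it.
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
pages = list(links)
d = 0.85                                      # damping factor
rank = {p: 1 / len(pages) for p in pages}

for _ in range(50):                           # iterate until ranks settle
    new = {p: (1 - d) / len(pages) for p in pages}
    for p, outs in links.items():
        for q in outs:                        # p shares its rank across links
            new[q] += d * rank[p] / len(outs)
    rank = new

for p in sorted(rank, key=rank.get, reverse=True):
    print(p, round(rank[p], 3))
```

    Page "c" ranks highest here because it collects links from both other pages, which is the intuition behind using link structure as a relevance signal.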

  • "The Enterprise Blogosphere"
    InfoWorld (03/28/05) Vol. 27, No. 13, P. 42; Delio, Michelle

    Companies are adopting blogs and wikis as tools for improving dialogue among workers, customers, and the public because they facilitate information exchange and establish specially tailored, user-friendly data archives. Martin Wattenberg with IBM Watson Research Center's collaborative user experience team says blogs and wikis fulfill opposite roles: Blogs are used by individual users to put forward personal views, while wikis are a platform for integrating many views into a coherent, collectively edited whole. "The strength of blogs and wikis is that they provide direct interaction with readers," notes Bluebill Advisors CEO Frank Gilbane, who recommends that enterprise blogs be devoid of marketing jargon reminiscent of press releases if they wish to engage readers. He also says companies have a responsibility to establish a clear employee blogging policy so that workers will not leak sensitive information to readers, and to ensure that public blogs aimed at specific audiences will be written by specific, communication-adept personnel. Blogs that are too candid can trigger a branding implosion, according to concerned public-relations professionals. Wikis, meanwhile, are seen as excellent tools for disseminating information as well as accumulating feedback: "Unlike blogs, wikis are designed for continual editing of a set of documents, making them very suitable for developing a knowledge base," remarks IBM researcher Bob Gruen; in addition, wikis offer an easily accessible technique for facilitating collaborative content creation by groups or teams. Edward Williams, who supervises the fraud and security department at a consumer bank, says there must be an emphasis on user accountability and frequent review of postings when wikis are employed to exchange important data. Gilbane expects corporate wikis to more closely resemble blogs as permission-based features that enable posted content to be more rigorously controlled are incorporated.
    Click Here to View Full Article

  • "Hollywood Profits v. Technological Progress"
    Chronicle of Higher Education (04/01/05) Vol. 51, No. 30, P. B24; Ben-Atar, Doron

    As the U.S. Supreme Court listens to arguments in the MGM Studios v. Grokster case, a historical perspective shows the futility of entertainment companies' efforts to stop the spread of new technology, writes Fordham University history professor Doron Ben-Atar. If entertainment companies are successful in stopping the development of P2P software in the United States, innovation in that area will simply shift to other nations. Historically, U.S. economic success was founded on a strict intellectual property regime paired with loose enforcement of intellectual property laws; intellectual property protection was written into the Constitution from the beginning, and the American Patent Act of 1790 introduced the prior use principle, requiring patented ideas to be without precedent anywhere in the world. At the same time, the first 50 years of the United States saw wholesale piracy and theft of trade secrets, as evidenced by the textile mills at Lowell, Mass., for example. The result of both strict patent requirements and de facto encouragement to infringe on those protections was tremendous innovation and rapid dissemination of improved technology. American products were better and cheaper than those made elsewhere, and there was a culture of free exchange of ideas that made the United States the center of innovation and creative entrepreneurship. If the entertainment industry succeeds in holding P2P software firms secondarily liable for copyright infringement, as the Bush administration has suggested in a friend-of-the-court brief, file-sharing will not stop, P2P software development will move to other countries, and the United States will undermine the free flow of knowledge, Ben-Atar concludes.
    Click Here to View Full Article