Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 6, Issue 707:  Monday, October 18, 2004

  • "Poll: Antiterror Tech Plans Are Flawed"
    CNet (10/18/04); Cooper, Charles

    A joint CNet News.com/Harris Interactive poll of 1,133 people conducted in late August found that Americans are generally unconvinced that the federal government is adequately applying technological solutions to homeland security, although they do display an eagerness to embrace technology to bolster security, even at the cost of rolling back civil liberties. Just 45 percent of respondents felt that the government's current tech programs are working, a figure that mirrors criticism leveled at technology spending and related operations by watchdog organizations since the Department of Homeland Security was founded. "From the beginning, we were concerned that this reorganization had gone the way of the Department of Energy--bigger bureaucracy and few results to show for it," says National Taxpayers Union VP Pete Sepp. "The picture isn't complete, but the brush strokes we've seen so far are pretty ugly." Roughly 53 percent of respondents said they would accept the repeal of certain privacy statutes, and 70 percent approved the legalization of more intensive interrogation techniques; over 80 percent said they were willing to carry a national ID card, while approximately 70 percent believed such a card would improve security. Counterpane Internet Security founder Bruce Schneier says such results reflect an overabundance of faith in technology among Americans. However, almost 53 percent of those polled thought the government could do more with existing technology, while just 15 percent felt safer than they did a year ago and a mere 20 percent expected to become safer in the future. Around 92 percent of respondents indicated that their vote in the upcoming presidential election would be affected by the candidates' position on security, and 62 percent said they would support a tax increase to underwrite new security measures.
    Click Here to View Full Article

  • "Problems You Can Shake a Joystick At"
    Washington Post (10/18/04) P. A1; Vargas, Jose Antonio

    The Serious Game Summit convening in Washington, D.C., this week will focus on what is considered by many to be the next major market for the video game industry: Non-entertainment sectors that value games as problem-solving tools, be those problems medical, educational, or military in nature. "What this emerging serious-games movement is trying to do is bring the concept of hands-on simulation training to as many activities as possible," notes Serious Game Summit keynote speaker and author Jim Dunnigan. Challenges the summit will concentrate on include using games to teach scientific principles to children, training soldiers to adapt to foreign cultures, and encouraging teamwork, says summit organizer and Digitalmill head Ben Sawyer, who quietly started the serious game movement with David Rejeski of the Woodrow Wilson Center. Rejeski explains that, with the exception of the military, discussion of video games inside the Beltway gives short shrift to their educational value, focusing instead on intellectual property or game violence and its impact on children. The U.S. military saw the usefulness of video games as a training tool early on, and has spent the last several years devising games such as America's Army and Full Spectrum Warrior, which Michael Macedonia of the U.S. Army's Program Executive Office for Simulation, Training, and Instrumentation describes as "first-person thinker games" that emphasize strategy rather than more traditional first-person shooter games. Guidance Interactive Healthcare founder Paul Wessell has developed Glucoboy, a glucose meter that can be linked to a Nintendo GameBoy, as a tool for encouraging kids with diabetes to score well on their blood tests. Other serious-game products include Virtual U, which simulates the life of a university president, and the Federal Budget Game, which will be available online before 2006.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "The Brains Behind AI"
    Wired News (10/18/04); Dean, Kari Lynn

    Stanford University computer scientist Daphne Koller recently earned a $500,000 MacArthur Fellowship on the strength of her work on solving basic machine learning problems and investigating the fundamentals of intelligence, an effort that is significantly expanding the scientific knowledge needed to construct computer programs capable of effective learning and intelligent reasoning. Association for Uncertainty in Artificial Intelligence Chairman Eric Horvitz cites Koller's development of knowledge-representation "fabrics" sewn together from logic and probability threads. Koller's UC Berkeley postdoctoral advisor, Stuart Russell, is convinced that her work will help lead to computer systems that can truly comprehend natural language and answer questions based on all global information. Koller explains that accommodating an overload of information "using sophisticated data management and analysis tools is probably going to be one of the key tasks that [computer science] researchers have to face this decade." She has authored algorithms geared toward randomized game-theory strategies that have broadened the suitability of game-theory applications to the world of computerized solutions, and is currently focusing on using computer modeling to iron out the interplay between genes, proteins, and other cellular operations. The knowledge of cell functions gleaned from such research could lead to breakthroughs in cancer research. "Daphne has been at the forefront of work that demonstrates how real-world challenges can drive the developments of new theoretical principles and how those can be applied to teach us more about the world," Horvitz maintains.
    Click Here to View Full Article

  • "Instant Messenger Could Control Hacked Computers"
    New Scientist (10/13/04); Biever, Celeste

    American computer programmer Abe Usher's "Nmapbot" software robot can employ instant messenger (IM) to scan a remote network for vulnerable computers and use them to launch denial-of-service attacks once they are commandeered. Usher released Nmapbot to call attention to IM's potential security risks, but he notes that computer security staff can also use the bot to remotely scan networks using an IM-equipped cell phone or personal digital assistant. Nmapbot participates in IM chat sessions by waiting for specific directives and then performing the designated action. Nmapbot can scan for exploitable computers by running Nmap software, which also gives hackers a better picture of how tough the machines' defenses are by checking to see what programs are already running. Tenable Network Security CTO Ron Gula postulates that if Nmapbot were downloaded on a machine inside a firewall, then a hacker would be able to remotely access the computer over IM and initiate the port scan from inside the network; Usher's bot can also launch denial-of-service attacks by sending out "pings" to all the networked computers with counterfeit "from" addresses. Meanwhile, Johannes Ullrich with the SANS Internet Storm Center reports that bots such as Nmapbot do not really expand a hacker's powers: For instance, hackers are already employing peer-to-peer networks and Internet Relay Chat to control compromised machines. Nevertheless, Usher warns that "the potential is there to control a vast army of IM bots through one master controller."
    Click Here to View Full Article
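    The pattern described above--a bot that sits in a chat session, waits for specific directives, and maps them to local actions--can be sketched in a few lines. This is a hypothetical illustration of the dispatch loop only, not Usher's code: the command names and the scan stub are invented, no IM connection is made, and no actual scanning is performed (a real bot would hook the handler into an IM client library and, in Nmapbot's case, shell out to Nmap).

```python
# Hypothetical sketch of an IM bot's command-dispatch loop. The bot
# treats each incoming chat message as a directive: the first word
# selects a handler, the rest become arguments. All names here are
# invented for illustration; "scan" is a harmless stand-in that does
# not touch the network.

def scan(target: str) -> str:
    # Stand-in for invoking a port scanner such as Nmap on `target`.
    return f"would scan {target}"

def status() -> str:
    # Stand-in for a status query directive.
    return "bot is idle"

COMMANDS = {"scan": scan, "status": status}

def handle_message(text: str) -> str:
    """Parse one incoming chat message into a command and arguments."""
    parts = text.strip().split()
    if not parts or parts[0] not in COMMANDS:
        return "unknown command"
    name, args = parts[0], parts[1:]
    return COMMANDS[name](*args)
```

    A real IM client would call handle_message on every message received in the chat session and send the return value back as the bot's reply.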

  • "Secure Online Transactions Worth Talking About"
    IST Results (10/13/04)

    The iPROVED project has earned one of 20 European IST Prizes awarded this year on the merits of its Vocalid crypto-acoustic smart card, which enables secure online transactions over any terminal anywhere in the world. The card can also be used anywhere physically because it interoperates with conventional magnetic stripe or chip readers, and iPROVED project director Serge Parienti reports that user feedback was generally "positive" among the 5,000 banking and telecom managers and decision-makers who tested the card in five European countries. The Vocalid card uses a dynamic authentication code consisting of a cryptogram, a fluctuating series of numbers that randomly changes every time the card is used. The cryptogram cannot be viewed by anyone, including the user, because acoustic transmission makes typing or displaying the authentication code unnecessary. Only the card can transmit the required authentication data to the server, which shields the Vocalid cardholder against hackers. The card is universally suitable for all kinds of card issuers, including e-learning, health care, call centers, government agencies, administration, financial institutions, and e-tailers. Between 38 percent and 41 percent of the users participating in the Vocalid trials said they would carry out more online transactions if iPROVED was rolled out on a global scale. Meanwhile, over 65 percent felt that Vocalid online payments demonstrated a need for additional security.
    Click Here to View Full Article
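    The idea of a cryptogram that changes on every use can be illustrated with a counter-based one-time code in the style of HOTP (RFC 4226). This is an analogy, not the Vocalid algorithm (which the article does not disclose): card and server share a secret key and a counter, each use produces a fresh code, and both sides advance the counter after a successful authentication, so an intercepted code is useless for replay.

```python
# One-time code generation per RFC 4226 (HOTP), shown as an analogy
# for a per-transaction cryptogram. Card and server each hold `secret`
# and a synchronized counter; the code changes with every increment.
import hashlib
import hmac

def one_time_code(secret: bytes, counter: int, digits: int = 6) -> str:
    msg = counter.to_bytes(8, "big")
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): take 4 bytes at an offset given
    # by the low nibble of the last digest byte, mask the sign bit.
    offset = digest[-1] & 0x0F
    code = int.from_bytes(digest[offset:offset + 4], "big") & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

    With the RFC 4226 test key b"12345678901234567890", counter 0 yields "755224" and counter 1 yields "287082", showing how successive uses produce unrelated codes.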

  • "Gadgets Getting Connected With DLNA"
    IDG News Service (10/15/04); Williams, Martyn

    Members of the Digital Living Network Alliance (DLNA), a consortium of over 180 companies that includes Intel, Microsoft, Sony, and Hewlett-Packard, announced plans at the recent CEATEC Japan 2004 exhibition to roll out commercial products based on version 1.0 of the group's home networking standard by year's end. Version 1.0 was published this past June. Chairman of the DLNA board of directors Scott Smyers reports that the transition from standardization to product deployment is accelerated because version 1.0 builds on existing standards: The network the specification requires is based on wired or wireless Ethernet running IP version 4, while media is conveyed over the network via HTTP, and Universal Plug and Play (UPnP) facilitates the discovery, control, and management of linked devices. Interconnecting the various devices is just one component needed to fulfill the vision of a digital home network; another key element is getting the devices to communicate in common formats, which in version 1.0 consists of the JPEG image, Linear PCM audio, and MPEG2 video formats. A follow-up specification, version 1.1, will encompass optional media formats that include GIF, PNG, and TIFF images; MP3, Windows Media Audio, AAC, AC-3, and ATRAC3 audio; and the MPEG4 Part 2, Part 10, and Windows Media Video 9 video formats. Smyers says the DLNA will concentrate on integrating existing or new technologies instead of devising its own standards. "DLNA's intention is to grow the technology over time because new technologies are always being invented," he notes. Firms are working outside of the DLNA on solutions that will accommodate digital rights management systems, which Smyers says is not part of the DLNA guidelines. "Our mission is not to promote technology that doesn't have a place but promote technology that is open," he asserts.
    Click Here to View Full Article
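    The UPnP discovery step that DLNA 1.0 relies on begins with an SSDP M-SEARCH request multicast over UDP to 239.255.255.250 port 1900; devices on the LAN reply with unicast responses describing themselves. The sketch below only builds the search datagram (sending it and parsing responses are omitted); the search target "ssdp:all" asks every UPnP device to respond.

```python
# Build an SSDP M-SEARCH datagram as used for UPnP device discovery.
# Only the request construction is shown; a real client would send
# this over a UDP socket to the SSDP multicast address and collect
# the unicast replies that arrive within the MX window.

SSDP_ADDR = ("239.255.255.250", 1900)

def build_msearch(search_target: str = "ssdp:all", mx: int = 2) -> bytes:
    lines = [
        "M-SEARCH * HTTP/1.1",
        f"HOST: {SSDP_ADDR[0]}:{SSDP_ADDR[1]}",
        'MAN: "ssdp:discover"',   # mandatory extension header, quoted
        f"MX: {mx}",              # max seconds a device may delay its reply
        f"ST: {search_target}",   # search target: ssdp:all matches everything
        "", "",                   # request ends with a blank line (CRLF CRLF)
    ]
    return "\r\n".join(lines).encode("ascii")
```

    A DLNA control point would use a more specific ST header (a UPnP device or service type) to find, say, only media servers rather than all devices.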

  • "Computer History Museum to Honor Five Innovators"
    Mercury News (10/18/04); Poletti, Therese

    The Computer History Museum will induct five new fellows at its annual awards dinner and ceremony on Oct. 19. Software Arts co-founders Dan Bricklin and Bob Frankston will be honored for their development of VisiCalc, the first electronic spreadsheet software program designed for the PC, which became a template for Lotus Development's Lotus 1-2-3 spreadsheet. Both Bricklin and Frankston earned computer science degrees from MIT, where Frankston contributed to Project MAC and the Multics mainframe time-sharing operating system. Washington Advisory Group principal Erich Bloch's induction is based on his engineering management of IBM's Stretch supercomputer and his work on the initial employment of miniaturized circuits in the IBM 360 mainframe. "A lot of things we learned in Stretch were carried over into the 360," explains Bloch. The success of the IBM System/360 project drove the transformation of the data processing business and contributed greatly to IBM's ascent, and the late Bob Evans will be recognized by the museum for his leadership of the project. The fifth honoree is Niklaus Wirth, who earned his fellowship for the development of the Pascal computer language. Other languages Wirth has helped develop include Algol-W, Euler, Modula, and Oberon.
    Click Here to View Full Article

  • "$4.5 Million for Research on Collaboration Technologies at Universities Across Canada"
    CNW Telbec (10/12/04)

    The Network for Effective Collaboration Technologies Through Advanced Research (NECTAR) will receive a contribution of $4.5 million, according to an Oct. 12 announcement from Minister of Industry David L. Emerson. The new research network, backed by the Natural Sciences and Engineering Research Council of Canada (NSERC) under its Research Networks Grant Program, was established to develop technologies for improving remote virtual collaboration in order to "help Canadian companies bring new products and services to market to further support collaboration," said Emerson. The NECTAR research effort will involve the participation of 11 researchers at the University of Toronto, the University of British Columbia, the University of Calgary, the University of Saskatchewan, Dalhousie University, and Queen's University, with University of Toronto computer science professor Dr. Ronald Baecker filling the post of NECTAR's scientific director. Public- and private-sector partners will donate almost $1.2 million to the project, along with expertise and resources to guarantee the maximum dissemination of research results. NSERC's Research Networks Grant Program supports large-scale, complex projects involving collaboration on a common theme across multiple sectors, and that show the added benefits of a networking strategy in which both private- and public-sector partners contribute. "[NECTAR] will undoubtedly provide a better understanding of the possibilities collaboration technology offers," declared NSERC President Tom Brzustowski.
    Click Here to View Full Article

  • "Privacy Eroding, Bit by Byte"
    Washington Post (10/15/04) P. E1; O'Harrow Jr., Robert

    Less expensive computers, growing networks, and innovative engineers are all chipping away at people's ability to remain anonymous and unobserved: Security cameras, grocery-store loyalty cards, Web cookies, and now biometric identification systems are contributing to increasingly large stores of personal data. The most recent tools to help record personal data came from VeriChip, whose implantable microchip keeps health information available in cases of emergency, and Google, which launched a service that allows people to scan their own computers the same way they search the Internet. Internet security consultant Richard Smith says users enjoy the convenience these data-storing services provide, although "people kind of instinctively know there's a dark side to this." Ohio State University law professor and privacy expert Peter Swire says all these data-recording systems mean people's every move is becoming part of a permanent record. In 1999, University of California at Berkeley researchers predicted the amount of information stored would double by now, and that trend is confirmed by the growing stores of data at commercial consumer data aggregators such as Acxiom, which today stores about 1 million times more information on each adult American than when the company went public two decades ago. Commercial information services providers tap video-mining services as well as the traditional credit-card and loan records, and in the future could store information collected by pervasive RFID sensors as well. Some experts see a bright side in this growing trove of recorded information, such as futurist David Brin, who says online personal information will help break down social barriers and reconnect the global populace in a new worldwide "village." But even Brin admits there is the possibility that the data could be misused.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Mind Over Matter"
    Richmond Times-Dispatch (10/14/04); Loft, Kurt

    Emory University physicist and author Sidney Perkowitz postulates that self-aware robots could emerge as a result of a convergence between biological and mechanical systems--a convergence that Perkowitz claims is already underway with advancements in bionics and the science's movement toward neurorobotics. Some 25 million people in the United States have artificial components in their bodies, and people are expected to incorporate even more machinery as the human life-span increases and medical technologies become more sophisticated. Perkowitz eventually foresees "the formation of direct connections between living organic systems and nonliving ones at the neural and brain levels." Humanoid robot technology is making progress on several fronts: Japan's Humanoid Robots Project receives an annual government grant of $38 million to develop a machine with a five-year-old's physical, mental, and emotional capabilities, among other things; MIT researchers are working on robots that can imitate human actions and responses as well as smell odors; and Honda Motor's Asimo robot is impressing people with its facial recognition, movement, dexterity, balance, and command response capacities. However, these robots are a far cry from cyborgs--cybernetic organisms with a natural or naturally programmed brain. Perkowitz reasons that cyborg technology opens a can of worms, such as whether such machines can be self-directed, are capable of functioning intelligently in the real world, or are truly conscious.
    Click Here to View Full Article

  • "ICANN Bulks Up Board in Face of UN Moves"
    Computer Business Review (10/12/04)

    ICANN, through its Independent Nominating Committee, has appointed two new members to its board of directors while also announcing that board Chairman Vint Cerf will return to helm the board for another three years. The two new members appointed by the committee are Brazil's Vanda Scartezini and Japan's Joichi Ito. Both appointees are viewed as strategic moves to blunt Internet governance efforts by the United Nations and the International Telecommunication Union (ITU). Brazil and China were both strong supporters of a failed attempt to get the ITU to take over ICANN's responsibilities, so bringing Scartezini aboard could strengthen ICANN's relationship with Brazil. Scartezini's experience includes representing Brazil on ICANN's Government Advisory Committee; serving as the Brazilian National Secretary of Industrial Technology; and serving as the National Secretary of Information Technology. Ito, the founder of Infoseek Japan and PSINet Japan, has stated that he prefers ICANN to the ITU for managing the Internet. Ito says he decided to join ICANN "to try to prove that the people of the Internet can govern themselves without direct involvement from nation-states." If ICANN proves incapable of managing IP addresses, the name space, and other important elements, "ICANN will be dissolved and the ITU will step in," says Ito.
    Click Here to View Full Article

  • "Q&A: Agile Software Development Maximizes Project Rewards"
    Enterprise Systems (10/12/04); Swoyer, Stephen

    Independent consultant Roy Miller says traditional software development methods lead to budget overruns, late projects, and products of limited usefulness, while agile software development is a much more natural process that allows software development teams to more accurately target user needs and produce truly useful products. Traditional software development is based on the "scientific management" theories of industrialist Frederick Taylor, who transformed factory-type industries by instituting rigid management and worker frameworks. When software development began in the early 1950s, companies applied this thinking to software development, even though it dehumanized and regimented a process that requires a lot of communication and adjustment to succeed. Miller speaks from experience when he describes the months of planning that precede software projects--plans he says are largely worthless because of unexpected obstacles and shifting business requirements. Software development is a "messy problem" in which the solution and the understanding of the problem evolve together. With agile software development methods such as extreme programming, meetings are held every week to discuss issues and requirements, and weekly plans are made on that basis. Requirements are written on note cards, which seems anathema to more analytical programmers and managers. Miller suggests placing reticent programmers on the production side of software development, where they can apply their scientific management method to finding production-level bugs in existing applications.
    Click Here to View Full Article

  • "Tech Doors Opening to the Blind"
    Star (Malaysia) (10/14/04); Rozario, Brigitte

    Visually impaired people are gaining a small toe-hold in Malaysia's IT sector, according to testimonies from disabled Malaysians in the small tech-savvy state of Penang. B.G. Lim, a visually impaired programmer for the Altera semiconductor firm, usually inverts the text and background colors on his screen and uses keyboard shortcuts as much as possible to accommodate his limited peripheral and night vision; he has stopped using screen readers because of the hardware resources those programs required, and sometimes asks colleagues to "borrow their eyes." St. Nicholas School for the Blind technology training head Silatul Rahim Dahman, 36, was the first foreign blind student in the United States to train in technology, and now trains blind students from all over Malaysia in basic computer skills as well as database operation and Web design. Although learning computers is difficult for blind people, computers improve their access to information, mobility, and job opportunities. "Technology has somehow helped to minimize these problems," Rahim says. "We don't have to move around as much. We can connect via email, as well as read Web pages. We can work from home." Rahim aims to learn more programming skills so that he can help local developers create Malaysia-specific speech reading software that would be more affordable and translate the Bahasa Malaysia language more accurately. Both Lim and Rahim say the Malaysian government should do more to encourage IT accessibility and employment for disabled people through policies and private-sector benefits.
    Click Here to View Full Article

  • "One Grid to Rule Them All"
    Economist (10/07/04) Vol. 373, No. 8396, P. 74

    CERN's Large Hadron Collider (LHC) is expected to churn out about 15 petabytes of data per year once the machine is up and running in 2007, and analyzing this massive load will take the collective effort of approximately 100,000 of today's fastest PCs. Tasked with processing the LHC-generated data is the LHC Computing Grid (LCG), a distributed computing system that has expanded from 12 participating computing facilities one year ago to some 80 centers in 25 nations, together comprising more than 7,000 computers. LCG project manager Les Robertson is confident that the rest of the required computing power will be available by the time the collider is ready. The progress of the LCG project was reviewed at a September conference on computing in high-energy physics in Interlaken, Switzerland, where several "data challenges" were disclosed. One of the challenges detailed at the meeting involved using the grid to process simulated LHC data, but at only 25 percent of the collider's expected data generation rate. The LCG project has aroused the interest of industry heavyweights such as IBM, Hewlett-Packard, Oracle, Intel, and Enterasys Networks, which are using the LCG as a testbed for cutting-edge hardware and software under the aegis of their openlab partnership with CERN. LCG is thought to be the largest and most global computing grid in existence, although there are competing grids such as NASA's Information Power Grid and Scandinavia's NorduGrid, which is also being used by particle physicists. The vision of a single grid is a long way off from being realized because academic progress toward universal grid middleware standards has been sluggish.
    Click Here to View Full Article

  • "Scouring the Planet for Brainiacs"
    Business Week (10/11/04) No. 3903, P. 100; Engardio, Pete; Roberts, Dexter; Sandler, Neal

    Product research and development has become a world-spanning enterprise thanks to the Internet, inexpensive telecom links, and interactive-design software improvements, and the value of "global innovation networks" is proving itself as time-to-market becomes an increasingly important factor for companies in the wake of the economic recession. Also contributing to this trend is the fact that many firms have not been seeing returns comparable to their R&D investments. Krishna Nathan, director of IBM's 200-engineer Zurich Research Laboratory, says, "There is tremendous pressure on industry to innovate more--and do it more quickly." In response, companies are spending more on R&D while also expanding the research net worldwide. Innovation benefits from this R&D diffusion: Companies can ramp up their development cycles through round-the-clock recruitment of global R&D teams, with the result that consumers and industry will get cheaper and more diverse products faster. And multinationals can tap a deep reservoir of worldwide university researchers for consultation. "The more we can leverage outside talent and companies with great ideas, the more product we can get out," boasts Doug Raser, head of global strategic marketing for Texas Instruments. Global innovation networks also enable new players to enter the market with less difficulty. Keeping pace with the accelerated momentum of innovation is a particularly formidable challenge, one that IDEO meets by accumulating data on scores of chip, software, and manufacturing suppliers in 100-plus countries. "You have to be tireless about updating these databases because the half-life of most of the information is about four weeks," notes IDEO smart-products unit leader Dave Blakely.
    Click Here to View Full Article

  • "Cluster Headaches"
    Federal Computer Week (10/11/04) Vol. 18, No. 36, P. 39; Moore, John

    Various companies are developing file systems designed to improve the data storage performance of increasingly popular Linux clusters. The open-source clusters deliver a huge amount of computing power while costing dramatically less than proprietary symmetrical multiprocessor (SMP) systems, but the clusters cannot deliver the same level of performance because they do not possess SMP machines' proprietary interconnects, buses, and file systems. Idaho National Engineering and Environmental Laboratory (INEEL) CTO Dan Wickard reports that effectively weaving numerous computer processors with correspondingly high-performance input/output channels for transferring data into and out of storage is a major challenge, but analyst Arun Taneja says significant breakthroughs have emerged in the last five years. A collaborative effort between Hewlett-Packard and Cluster File Systems has won a contract to develop a scalable global secure file system as part of the Advanced Accelerated Strategic Computing Initiative. The project emphasizes the development of the Lustre file system for large-scale Linux clusters, which is maintained as open-source software by Cluster File Systems under the GNU General Public License. The National Center for Supercomputing Applications and Lawrence Livermore National Laboratory are using Lustre, and Cluster File Systems CEO Phil Schwan says Lustre will be deployed on Lawrence Livermore's IBM Blue Gene/L cluster and Sandia National Laboratories' Red Storm cluster. Meanwhile, IBM has made its General Parallel File System available for Linux clusters, even though it was originally designed for Unix, while INEEL officials are implementing Sun's QFS file system on a pair of Solaris boxes, with plans to export the systems to a Linux cluster through the Network File System (NFS).
    Click Here to View Full Article

  • "Global Talent. Local Markets. India Calls. Are You Ready?"
    Siliconindia (09/04) P. 28; Shankar, Pradeep; Revanna, Harish

    A recent SiliconIndia poll of Indian American executives in U.S. corporations found that 60 percent are actively seeking opportunities to emigrate back to their native country, an attitude becoming more prevalent as India's economy and outsourcing market expands and U.S. businesses become more willing to offshore jobs. Also fueling this reverse brain drain are the lack of a global outlook in the United States and an excessively management-centric U.S. market. Most returning Indians are higher management workers with at least 15 to 20 years of professional experience under their belt. They are not returning because they are homesick, but because they wish to take advantage of improvements in the quality of life, better job perks and challenges, greater foreign investment, and the growing presence of cutting-edge technology development projects in India. Another plus cited by interviewed returnees is the separation of technology and management in Indian companies, through which they have secured higher positions than they had in American companies. BEA Systems' Bhavin Sheth notes that "The opening up of Indian markets has resulted in availability of the same goods and services one is used to in the U.S. at an affordable price," and adds that these goods and services cost more as a percentage of one's salary in India than in the United States. Most of the interviewed returnees admit to pining for the adopted countries where they acquired citizenship, but their homesickness is assuaged somewhat by modern communications such as the Internet and IP phones. Returnees are also appreciative of Indian culture for offering their relocated families a more social, less materialistic environment in which to grow up.

  • "Sir Tim Berners-Lee"
    Technology Review (10/01/04) Vol. 107, No. 8, P. 40; Frauenfelder, Mark

    Sir Tim Berners-Lee, inventor of the World Wide Web and founder of the World Wide Web Consortium (W3C), has spent 15 years working on the Semantic Web, a globally distributed database enabled by tagging Web-page data with definitions and weaving them together so that programs can truly understand the data and establish new connections between pieces of information. He expects such a breakthrough to enhance people's lives and revolutionize drug discovery, cancer research, and other life-sciences initiatives that involve a meshing between multiple fields, to name just one application; the development of artificial intelligence is another application Berners-Lee thinks the Semantic Web will be well suited for. Establishing cross-field interoperability and addressing privacy and intellectual property issues are also key to the Semantic Web's viability. Berners-Lee says the first phase of the Semantic Web's development is complete with the W3C's approval of standards for the syntactic and semantic languages necessary for enabling more efficient information-sharing between computers. The syntactic language provides a common data format in which to render information with hard data, while the semantic language establishes ontologies--arrays of names for concepts within the data, such as date and time, temperature, pressure, an event, location, and a transaction--whose connections are exploited by a system of rules. Berners-Lee says the Semantic Web dovetails with his vision of "a single Web of meaning, about everything and for everyone." This "consensual space" exists independently of hardware, software, operating systems, culture, format, language, character set, writing orientation, and so on, a concept he terms "Web universality." Berners-Lee hopes the Semantic Web will build up into a critical mass, fueling an exponential increase in the desire for expansion in keeping with the growing value of information resources.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)
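    The Semantic Web's underlying data model is the triple: subject, predicate, object. The toy triple store below adds one inference rule--transitivity over a chosen predicate--to show how new connections between pieces of information can be derived from existing ones, as the article describes. The facts and the "locatedIn" predicate are invented for illustration; real systems express this in RDF, with ontologies (e.g., OWL) declaring which predicates carry such rules.

```python
# A minimal triple store with one inference rule: if a predicate is
# declared transitive, chains a -> b -> c imply a direct a -> c fact.
# This is a toy illustration of Semantic Web-style inference, not an
# RDF implementation; the example data is invented.

def transitive_closure(triples, predicate):
    """Return the input facts plus every (a, predicate, c) implied
    by chains (a, predicate, b) and (b, predicate, c)."""
    facts = set(triples)
    changed = True
    while changed:
        changed = False
        for (a, p1, b) in list(facts):
            if p1 != predicate:
                continue
            for (b2, p2, c) in list(facts):
                if p2 == predicate and b2 == b and (a, predicate, c) not in facts:
                    facts.add((a, predicate, c))
                    changed = True
    return facts

triples = {
    ("Interlaken", "locatedIn", "Switzerland"),
    ("Switzerland", "locatedIn", "Europe"),
}
inferred = transitive_closure(triples, "locatedIn")
# The fact ("Interlaken", "locatedIn", "Europe") is now derivable
# even though it was never stated directly.
```

    Scaled up, the same principle lets a program that knows a drug targets a protein, and that the protein is implicated in a disease, connect the drug to the disease--the kind of cross-field linkage Berners-Lee expects the Semantic Web to enable.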