Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM. To send comments, please write to technews@hq.acm.org.
Volume 7, Issue 752:  Wednesday, February 9, 2005

  • "White House 2006 Budget Proposal Would Boost IT Spending"
    IDG News Service (02/08/05); Gross, Grant

    President Bush's proposed $2.57 trillion federal budget for fiscal 2006 raises IT spending 7 percent and National Science Foundation (NSF) funding almost 26 percent, although 150 government programs would be scaled back or jettisoned. Approximately 55 percent of the requested $65.1 billion federal IT budget would be committed to defense and domestic security. The plan entails a more than 7 percent increase in federal agency information security spending, while the science and technology budget would fall from $61.7 billion in fiscal 2005 to $60.8 billion in fiscal 2006. The budget request is also pushing for a permanent extension of the Research and Experimentation Tax Credit, which gives U.S. companies engaged in R&D a credit of up to 10 percent of their R&D spending. If approved, Bush's 2006 budget would axe the National Institute of Standards and Technology's (NIST) Advanced Technology Program, which apportions funding to "high-risk, high-payoff" commercial technology R&D and which opponents have criticized as "corporate welfare." Office of Science and Technology Policy director John Marburger says the program's funds could be put to better use in other NIST projects, while supporters such as 2004 Democratic presidential nominee John Kerry say the program is needed to cultivate vital new technologies, including less expensive manufacturing breakthroughs. The Advanced Technology Program received about $137 million in the 2005 budget, which funded work on experimental wireless technologies and computer chip research.
    Click Here to View Full Article

  • "Made in Lower-Cost America"
    CNet (02/08/05); Frauenheim, Ed

    A new way for American companies to save money without offshoring IT operations is emerging: "Homeshoring" work to midsize U.S. cities or rural regions, where labor costs are cheaper, partly because housing is less expensive. Rural Sourcing founder Kathy White maintains that urban tech workers frequently hail from smaller communities, and her startup taps talent that prefers to stay in those communities; she also says her company, which has partnered with several non-elite universities, has found favor with U.S. business leaders who were educated at such schools. Ciber, a Colorado-based systems integration company, is setting up low-cost, "made in America" application development centers in Oklahoma City as part of a plan to create more than 1,000 new jobs throughout the nation. Ciber's Cibersites President Tim Boehm states that homeshoring offers customers "another choice in avoiding the hidden costs of offshoring, such as language gaps, intellectual-property protection, travel, time schedules, infrastructure vulnerability, political risks and increasingly high employee turnover." Meanwhile, Ciber CEO Mac Slingerlend reported last month that midsize cities often boast large populations of students, retirees, and military personnel whose experience often exceeds that of overseas IT professionals. Dell has also been opening facilities in midsize cities such as Lebanon, Tenn., and Twin Falls, Idaho, while a planned manufacturing plant in Winston-Salem, N.C., will benefit from a 15-year, $225 million tax credit from the state. Another factor that makes homeshoring palatable to companies is the attractiveness of the smaller-community lifestyle and culture to employees. The mayor of Oklahoma City, for instance, cites the town's high quality of life, inexpensive cost of living, and sparse traffic as advantages.
    Click Here to View Full Article

  • "Improving Computer-Supported Work Through Scenario-Based Evaluation"
    Penn State Live (02/07/05)

    Penn State researchers led by School of Information Sciences and Technology professor Steven Haynes conducted an 18-month study of an integrated digital environment (IDE) for managing hundreds of military vehicles, yielding strategies for boosting the system's efficiency and usefulness. The research entailed examining IDE usage scenarios that defined the actual contexts in which those involved in the study employed the system, and four types of benefits were identified: measurable benefits; tangible benefits in the form of saved time; intangible benefits such as enhanced personnel management; and unrealized benefits that could be achieved via a system-by-system redesign and organizational efforts. "Scenarios encourage consideration of the contextual factors that impact effective use of collaborative systems," noted Haynes. Evaluating multiple-user computer-supported systems is difficult, given the complicated organizational, social, and psychological workplace processes they support; but evaluation is key to the design, assessment, and justification of organizational investment. In a paper presented at the Association for Computing Machinery's Computer Supported Cooperative Work Conference last November, the researchers detailed a scenario-based evaluation method developed as an alternative to assessments that concentrate on feature sets and their performance traits. Researchers worked on a Marine Corps base, studying how different users with unique requirements and expectations employed the IDE, and uncovered the system's organizational context as well as ways to improve its design.
    Click Here to View Full Article

  • "Software Ties Marks to Digital Text"
    Technology Research News (02/16/05); Patch, Kimberly

    Hillcrest Communications software engineer Kevin Conroy notes that word processor users regularly proofread printed documents, but cannot transfer annotation marks to the documents' digital counterparts, leaving them little option but to manually re-enter the marks electronically. Conroy and other researchers at the University of Maryland are developing a system that enables users to import paper-based annotations back into a word processor, using software that pins annotations to the processor's layout and formatting engine. With the open-source distributed paper-augmented digital document software, users can annotate their own printed copies of a document, and then compile all the annotation marks into a single electronic version. The process starts when a user prints a document onto paper that carries a unique page identifier and a faint pattern of X and Y coordinates; Conroy says a copy of the document is saved in a database, along with the page identifier. The user then goes over the document with a pattern-reading digital pen, and the annotations are uploaded to the computer, letting the user select which marks to import into the document in the word processor. Conroy says the software simplifies document creation, editing, and management, irrespective of whether the document is in printed or electronic form. "This infrastructure is vital to...group writing efforts often found in business, government and academic settings," he reports. Conroy and colleagues detailed their work at the User Interface Software and Technology (UIST '04) conference last October.
    Click Here to View Full Article
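
The mechanism described above reduces to a lookup: each pen sample carries a page identifier plus X/Y coordinates, which are matched against the stored document's layout to find the text a mark covers. The sketch below illustrates that anchoring step with a made-up layout table; the project's actual data structures are not described in the article.

```python
# Hypothetical sketch of anchoring pen strokes to a digital document,
# assuming each printed page stores a layout table mapping character
# offsets in the stored document to printed bounding boxes.

from dataclasses import dataclass

@dataclass
class Box:
    offset: int                  # character offset in the stored document
    x0: float; y0: float; x1: float; y1: float

def anchor_stroke(points, layout):
    """Return the character offsets whose printed boxes the stroke crosses."""
    hits = set()
    for (x, y) in points:
        for box in layout:
            if box.x0 <= x <= box.x1 and box.y0 <= y <= box.y1:
                hits.add(box.offset)
    return sorted(hits)

# Two "words" laid out side by side on the printed page.
layout = [Box(0, 10, 10, 50, 20), Box(6, 60, 10, 100, 20)]
stroke = [(12, 15), (55, 15), (70, 15)]   # a pen stroke crossing both words
print(anchor_stroke(stroke, layout))      # -> [0, 6]
```

In the real system the page identifier would first select which document and page the layout table belongs to; the lookup above then ties each mark to the text it annotates.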

  • "Analysis: New DHS Pick Faces Cyber Dilemma"
    United Press International (02/07/05); Waterman, Shaun

    Former federal prosecutor Michael Chertoff is almost certain to be confirmed as Homeland Security secretary and is already facing pressure over how to deal with cybersecurity. Congress, some government officials, and industry representatives have been pushing for an assistant secretary position for cybersecurity within the Department of Homeland Security (DHS); critics say cybersecurity currently has too low a profile to attract the best people, and that the National Cyber Security Division is too far removed from upper management to be effective. Former DHS cybersecurity chief Amit Yoran left in part out of frustration with bureaucracy, according to insiders. Chertoff told Congress during his confirmation hearing that he had already chosen a special cybersecurity advisor to serve within his office, a move that Yoran said indicated Chertoff took the matter seriously. Chertoff also previously headed the Justice Department's Computer Crime and Intellectual Property section, which means he has some familiarity with computer issues. Though members of Congress are seeking support for legislation that would create an assistant secretary for cybersecurity, opponents of such a move say cybersecurity management should not be separated from the Infrastructure Protection group because physical and electronic threats are intertwined. Former Energy Department assistant secretary Ed Badolato, who is being considered for the Infrastructure Protection post, said elevating the cybersecurity chief would not harm integrated protection if managed properly, and would help to lure experts from the cybersecurity industry; turnover is one of DHS' major problems, he said. Another senior DHS official said the cybersecurity issue was being played up by vendors hoping to increase sales to the government.
    Click Here to View Full Article

  • "An Open-Ended Future for Open Source"
    IST Results (02/09/05)

    Open source software and free software (OSS/FS) is entering its golden age, according to European researchers who point to its unique commercial and technical advantages. OSS/FS allows users freedom in how they use, modify, and distribute the code, and also allows them to make money from it, either by selling applications or value-added services. The abundance of available OSS/FS projects is exemplified by the 90,000-plus projects available from Sourceforge.net, says ACE-GIS scientific coordinator David Skogan, whose European Commission-funded project provides tools for gathering geographical information via Web services. Developers use ACE-GIS to link to various Web services available online and compile that information into an application that provides multi-faceted analysis of particular geographic locations; ACE-GIS applications could assist officials in the event of a chemical plant explosion, for example, by showing affected areas, transportation routes, and emission levels. Another EU-backed Information Society Technologies project using the OSS/FS model is ScadaOnWeb, which serves as a complementary technology to industrial companies' existing monitoring and control applications. ScadaOnWeb especially meets the needs of energy-sector companies after energy market deregulation because it facilitates online data-sharing and provides a flexible, standards-based, open technology that can be adopted by various parties. Though most OSS/FS applications are standalone, ScadaOnWeb project coordinator Michael Schwan expects complementary systems to be the entry point for OSS/FS in sectors dominated by proprietary software. Skogan predicts that OSS/FS' share of the commercial software market will equal that of proprietary software within 10 to 20 years.
    Click Here to View Full Article

  • "Expert: Cooperate, or Risk Hobbling Moore's Law"
    CNet (02/07/05); Kanellos, Michael

    Interuniversity MicroElectronics Center senior fellow Hugo De Man told attendees at the International Solid-State Circuits Conference (ISSCC) on Feb. 7 that universal wireless interdevice communications can only be realized with portable devices equipped with microprocessors capable of executing 100 billion calculations per second while using only about 1 watt of electricity and costing no more than $10 each. Chips that function in stationary devices, meanwhile, will perform 1 trillion operations per second while consuming approximately 10 watts of power. Achieving this breakthrough entails greater cooperation between companies, in the form of shared research and collaboration between hardware and software firms at the beginning of product development. "There is a need for worldwide sharing of interdisciplinary research costs to keep [Moore's Law] happy," De Man explained. Daeje Chin, South Korea's minister of information and communications, made an ISSCC presentation in which he predicted that 90 billion radio frequency identification sensor chips will be manufactured annually by the end of the decade, and that they will probably constitute the most rapidly growing segment through 2020. A 3.5G telephone standard that occupies the space between 3G (whose data transfer rate maxes out at 2,000 Kbps) and 4G (whose expected peak speed is 100 Mbps) could also be a profitable near-term technology. Chin gave ISSCC attendees a glimpse of WiBro, a broadband transmission technology that transmits signals at 700 Kbps; with WiBro, for instance, cell phone users will be able to view several programs on their handsets.
    Click Here to View Full Article

  • "Concept of $100 Laptop for World's Poor Is a Winner"
    USA Today (02/09/05) P. 3B; Maney, Kevin

    MIT Media Lab founder Nicholas Negroponte's unveiling of a $100 laptop concept at last week's World Economic Forum spotlighted a workable idea for delivering advanced communications technology to people in hostile, poverty-plagued regions. The laptop concept boasts more affordable display technology, a free operating system based on Linux, 1 GB of solid-state memory, and cheaper batteries that can be recharged with a hand crank. Several objectives are driving the push to bring technology and the Internet to the world's poor. They range from the billions of new customers tech companies stand to gain to an erosion of anti-American sentiment as developing nations' economies, living standards, educational systems, and cultures prosper from Internet penetration. Negroponte's Hundred Dollar Laptop Corp. project has drawn backing from major companies (AMD, Google), researchers (Hewlett-Packard's Alan Kay), and celebrities (Quincy Jones, Bono). Meanwhile, the Our Economy nonprofit is working to bring the Internet into low-income households, and AMD is rolling out its $185 Personal Internet Communicator, a small device that lacks a keyboard and screen. Kevin Maney suggests that President Bush's recently released federal budget should commit funding to a "Bits of America" program for spreading technology to lower-income world populations.
    Click Here to View Full Article

  • "EU May Pursue Software Patent Bill"
    Reuters (02/08/05); Grajewski, Marcin

    The European Union's proposed software patent bill has been held up for two months by Poland's opposition to rules that critics say would hurt small developers and limit freely available open-source software, but an anonymous Polish diplomat indicates that Poland could withdraw its objections and allow a second reading in the European Parliament to move forward. "If the Luxembourg presidency puts the computer patent directive as an A-point [an issue to be approved without a debate], we will raise no objections," the diplomat says, noting that amendments to protect small and midsize firms are nevertheless needed. Poland's refusal to ratify a compromise text in December encouraged the Parliament to oppose the bill, and the EU assembly's Legal Affairs committee recently voted to request that the legislation be discarded. Parliament leaders are currently debating whether to formally submit the request to the executive European Commission, and the Commission's Oliver Drewes said at a press conference that "It is the moment for us to keep options open." Should the computer patent bill be supported by EU ministers convening on Feb. 17, the legislation would be re-submitted to the Parliament, where revisions could be proposed. If this happens, legislators would need to arrive at a final compromise with EU member states. Major software developers object to the draft law on the grounds that it does not provide sufficient protection for long-gestating inventions.
    Click Here to View Full Article

  • "Stanford Selected as First Regional Center for Department of Homeland Security's National Visual Analytics Work"
    Stanford Report (02/07/05)

    Stanford University will serve as the first Regional Visualization and Analytics Center for the Department of Homeland Security (DHS), developing new tools and methods needed to manage, represent, and analyze large amounts of information. Specifically, Stanford will develop network traffic analysis methods that can help identify compromised computers; conduct psychological experiments to test the effectiveness of visual representation techniques; and investigate graphical representations of associations between files, authors, and time. Stanford engineering professor and project co-principal investigator Pat Hanrahan says the project's results could prove useful in helping individuals manage ever-increasing amounts of information. The DHS' National Visualization and Analytics Center aims to create improved data visualization tools and methodologies, which should help government officials identify terrorist threats; several regional centers will be established in 2005. Stanford was chosen for its expertise in visual analytics, and its contract calls for psychological tests that will evaluate how useful space and time aspects are in different visual representation techniques. Psychology professor Barbara Tversky serves as co-principal investigator on the project, which will also involve staff from the Computer Science Department and the Computer Graphics Lab. Stanford researchers will also develop graphical representations that improve association investigation, such as the links between authors and documents written within a certain span of time or in a particular field of expertise. The Stanford contract also includes study on computer network traffic analysis, especially finding ways to identify compromised machines that pose a threat to networks and data.
    Click Here to View Full Article

  • "Data Centers' Quest for Cool: Heat Issues Put Servers at Risk"
    Investor's Business Daily (02/08/05) P. A5; Brown, Ken Spencer

    Heat output is becoming increasingly problematic for data centers, and the advent of clusters, grids, and other technologies that pack servers closely together will likely exacerbate the situation. The bulk of the 250 watts of electricity a typical server consumes ends up as heat, and Hewlett-Packard engineer Chandrakant Patel says each dollar users spend to power up their machines adds up to two dollars in cooling and heat-related maintenance. The usual data-center cooling strategy involves an air conditioner that pumps cold air through the building; fans on each server's chassis draw in the cold air, carry off heat, and vent it back toward the air conditioner. Server rows in data centers are frequently arranged back-to-back so that hot exhaust air vents into a shared aisle, the idea being that the hot air is then drawn back into the air conditioner; however, the air often drifts into other aisles and warms air that should stay cool. The typical response is to turn up the air conditioner, which devours money and cancels many of the cost savings promised by densely packed systems, while increasingly cold air carries the risk of damaging the systems. Technology companies and data center managers are attacking the problem with a multi-pronged strategy that includes better cooling of hot processor components, lower-voltage chips, and heat management for individual servers. Linux Networx employs Fluent's airflow modeling software to forecast how temperature throughout a building will be influenced by factors such as floor plans. Meanwhile, Patel and other HP engineers are developing intelligent data centers that monitor temperature and can adjust airflow on the fly. And the tightly packed servers in IBM's Blue Gene supercomputer are kept cool with sheet metal panels on both sides of each server.
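
The article's own figures make the economics easy to sketch: roughly 250 watts per server, nearly all of it emerging as heat, and Patel's rule of thumb that every dollar of power implies two dollars of cooling and heat-related maintenance. The electricity rate below is an assumption for illustration, not a figure from the article.

```python
# Back-of-the-envelope heat load and running cost for one rack, using the
# article's figures (~250 W per server, $2 of cooling per $1 of power).
# The $0.10/kWh electricity rate is an assumed value for illustration.

def rack_cost_per_year(servers=40, watts_each=250, dollars_per_kwh=0.10):
    kw = servers * watts_each / 1000          # nearly all of this becomes heat
    power_cost = kw * 24 * 365 * dollars_per_kwh
    cooling_cost = 2 * power_cost             # Patel's 2:1 rule of thumb
    return kw, power_cost, cooling_cost

kw, power, cooling = rack_cost_per_year()
print(f"{kw:.1f} kW of heat; ${power:,.0f} power; ${cooling:,.0f} cooling/yr")
# -> 10.0 kW of heat; $8,760 power; $17,520 cooling/yr
```

Packing servers more densely raises the kilowatt figure per square foot without changing the 2:1 multiplier, which is why the cooling bill, not the power bill, dominates the savings calculation.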

  • "Division Over Skills Crisis"
    Australian IT (02/08/05); Riley, James

    Opinion is split in the technology sector over whether a shortage of Australian IT workers exists. The Information Technology Contract and Recruitment Association (ITCRA) argues that the government should increase visas for skilled immigrants by 10 percent and relax Skilled Independent visa eligibility requirements for foreign IT workers; ITCRA executive director Norman Lacy says such workers are needed to teach domestic ICT graduates on projects. He also says the government should devise strategies to either lure back about 1 million skilled Australians from overseas, or replace them via the migration program. Meanwhile, the Graduate Careers Council of Australia has determined that finding full-time work is a problem for 30 percent of computer science graduates. Insurance Australia Group CIO David Issa says he is more troubled by a shortage of skilled IT workers stemming from declining numbers of domestic computer science students. Despite a new report indicating that local tech graduates' career prospects are being undermined by an excess of migrant tech professionals, the Australian Computer Society (ACS) has not yet formulated a policy on the issue. "The policy is likely to recommend changes to the current skilled migration program to make it more responsive to changes in the ICT labor market conditions," declares ACS CEO Dennis Furini. Australian Information Industry Association executive director Rob Durie reports that any domestic IT shortage is still too small to justify an immediate increase in IT migrants, though his organization will meet with Immigration Minister Amanda Vanstone to talk about increasing the flexibility of the skilled migration program.
    Click Here to View Full Article

  • "PC Recycling on Congressional Agenda--Again"
    TechNewsWorld (02/08/05); Mello Jr., John P.

    Rep. Mike Thompson (D-Calif.) introduced legislation calling for the establishment of national e-waste recycling standards that would require consumers to pay a maximum fee of $10 on all PCs, CRT monitors, or other federally designated devices; 3 percent of the fee would go to recyclers to cover administrative costs. This marks the third time Thompson has filed such a bill, which resembles a California law that has been in effect since the beginning of 2005: Under the law, consumers are charged a $6 to $10 recycling fee on TVs and CRT monitors, with recyclers pocketing 28 cents for every pound of e-waste they recycle. The only other state with e-waste recycling laws, Maine, extracts handling and recycling costs from TV and CRT monitor manufacturers. Texas Campaign for the Environment executive director Robin Schneider reports that Maine's recycling law and Thompson's law are emerging as the two leading e-waste recycling models. "We favor the approach of the Maine model, which is that electronic producers are responsible for the end of life of their equipment," she notes, arguing that the $10 consumer fee in Thompson's proposal will not sufficiently cover costs. In addition, Schneider contends that a flat fee model is unfair in that it will keep the cost to manufacturers the same regardless of whether their products are recyclable or not. She says that if Thompson's bill is rejected once more, "We have a better shot at getting more states to pass a producer responsibility approach and to get that up and running in a few places, rather than to push right now at the national level." Meanwhile, Sen. Ron Wyden (D-Ore.) is expected to file a measure that calls for the creation of e-waste recycling tax breaks.
    Click Here to View Full Article

  • "Project Honeypot Aims to Trap Spammers"
    New Scientist (02/05/05) Vol. 185, No. 2485, P. 26; Biever, Celeste

    The tide of spam can only be countered by a partnership between technology and legislation, stresses John Praed of the Internet Law Group. This was demonstrated by the tracking down, prosecution, and conviction of spammer Jeremy Jaynes, who may face nine years of incarceration for activities that netted him about $750,000 per month. Paul Graham, organizer of MIT's annual Spam Conference, says evidence uncovered at Jaynes' office suggests that spammers think spam filters are easier to thwart than they actually are. Filters, which scan messages for words typical of junk email, can sometimes be fooled by large amounts of random text spammers insert within their messages; spammers can also hijack computers with viruses and use them as spam launching pads. One tool Webmasters can use to build evidence against spammers is Chicago lawyer Matthew Prince's Project Honeypot software, which exploits a provision in the federal CAN-SPAM Act that criminalizes the harvesting of email addresses for spamming. The software can transform a Web site into bait for such harvesters: When "crawler" software visits the site, the software produces a bogus email address that the crawler captures, and records the time, date, and crawler address; this ensures that any mail sent to the fake address originates from the spammer. Prince admits that spammers will likely come up with anti-honeypot countermeasures, but says he has countermeasures of his own to deal with that scenario. Still, Graham notes that though Jaynes' conviction was cause for rejoicing at the Spam Conference, the battle against spammers is far from over.
    Click Here to View Full Article
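
The trap Prince describes, a unique bait address generated per visit and logged against the crawler's address and timestamp, can be sketched in a few lines. This is an illustration of the idea only, not Project Honeypot's actual code; the domain and IP address below are placeholders.

```python
# Sketch of the honeypot idea: embed a unique, per-visit address in the
# page served to a crawler, and record which crawler saw it, so any mail
# later sent to that address traces back to the harvester.

import hashlib, time

LOG = {}

def bait_address(crawler_ip, domain="example.org"):
    stamp = time.time()
    token = hashlib.sha1(f"{crawler_ip}:{stamp}".encode()).hexdigest()[:12]
    addr = f"trap-{token}@{domain}"
    LOG[addr] = {"ip": crawler_ip, "time": stamp}   # evidence record
    return addr

def harvester_of(spammed_addr):
    """When spam arrives at a trap address, look up who harvested it."""
    return LOG.get(spammed_addr)

addr = bait_address("203.0.113.9")
print(harvester_of(addr)["ip"])   # -> 203.0.113.9
```

Because each address is shown to exactly one visitor, mail arriving at it is evidence linking the spam back to the harvesting crawl, which is what the CAN-SPAM harvesting provision requires.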

  • "Matrix Realized"
    Science News (01/29/05) Vol. 167, No. 5, P. 72; Brownlee, Christen

    Brain-computer interfaces (BCIs) could give disabled people more independence and better quality of life, if the technology is perfected. BCIs developed over the past 30 years either direct electrical impulses into the brain or tap the brain's electrical output: Cochlear implants are an example of the former, while the latter category encompasses "neural prostheses." Such prostheses are difficult to develop, given the smallness and precision of their electrodes as well as problems in decoding the neural impulses they pick up. Electrodes have shrunk to the point that they can be implanted within the brain to record signals from individual neurons, and California Institute of Technology neuroscientist Richard Andersen has spent the last six years recording the brain activity of primates with such technology. The next step is to develop computer software that can translate those impulses into machine movements, and breakthroughs in this area include experiments at Duke University and the University of Pittsburgh in which monkeys direct mechanical arms by thought. BCIs have also been tested with human subjects: Cyberkinetics' BrainGate system, for instance, enabled a quadriplegic to open email, switch TV channels, control a robotic arm, and activate and deactivate lights. However, Andersen notes that electrode implantation is not an exact science--incorrect positioning, scarring, and natural movement around the brain can interfere with a BCI's function. John Donoghue at Cyberkinetics says the ultimate challenge for BCIs is overcoming people's fear of invasive surgery, while some researchers expect future BCIs to be noninvasive devices that pick up neural signals outside the skull's surface.
    Click Here to View Full Article
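
The article does not name the decoding algorithms used in the primate experiments, but a classic scheme in this line of work is the population vector: each neuron fires most strongly for a preferred movement direction, and firing rates above baseline "vote" for those directions. A minimal sketch, with made-up neurons and rates:

```python
# A minimal population-vector decoder. Each neuron has a preferred
# direction; the decoded movement direction is the rate-weighted sum of
# those directions. Neurons and rates below are invented for illustration.

import math

neurons = [   # (preferred direction in radians, baseline firing rate)
    (0.0, 10.0), (math.pi / 2, 10.0), (math.pi, 10.0), (3 * math.pi / 2, 10.0),
]

def decode(rates):
    """Rates above baseline 'vote' for each neuron's preferred direction."""
    vx = sum((r - base) * math.cos(pd) for (pd, base), r in zip(neurons, rates))
    vy = sum((r - base) * math.sin(pd) for (pd, base), r in zip(neurons, rates))
    return math.atan2(vy, vx)

# Strong firing from the "rightward" neuron, baseline firing elsewhere:
angle = decode([30.0, 10.0, 10.0, 10.0])
print(round(math.degrees(angle)))   # -> 0 (i.e., movement to the right)
```

Real systems fit the preferred directions and baselines from recorded data and decode velocity as well as direction, but the weighted-vote structure is the same.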

  • "A Musical Head Trip"
    Discover (02/05) Vol. 26, No. 2, P. 18; Johnson, Steven

    New software allows amateurs to compose or produce professional-sounding musical scores. Software that automatically adjusts pitch and otherwise manipulates instrumental playing or voices has long been available to those in the music industry, but when Apple introduced its GarageBand software last summer, groups of avid users began creating and sharing new piecemeal songs, often political satires that poked fun at then-presidential contender Howard Dean's "scream speech" or at President Bush's malapropisms. GarageBand's primary tool is a library of thousands of sound loops, such as a couple of measures of a drum beat, bass lines, or acoustic guitar strumming. Users can insert those loops where needed, then customize them by adding or subtracting notes; changing tempo is as easy as moving a slide bar on the screen. More sophisticated desktop sound-editing software allows people who sing out of tune to see where their notes are off and then correct them. Melodyne plots sung notes on a musical scale, allowing people to adjust the pitch of their voice or even use the same recording to create a harmonic version a few notes higher or lower. Vocaloid is another application that basically acts as a synthesizer for the human voice, and comes pre-loaded with three types of English-speaking voices and one Japanese voice; with 2,500 phonemes, or language components, behind each voice, the Vocaloid software can synthesize singing for lyrics and music the user types in or inputs via a MIDI file. In the future, top artists could license their voices in the same way they license their songs to record companies. Eventually, software such as Melodyne and Vocaloid will become easier to use and be incorporated into mass-market products such as GarageBand.
    Click Here to View Full Article
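
The pitch adjustment Melodyne performs boils down to arithmetic on frequencies: find the nearest equal-tempered semitone and move the note there, or shift everything by a fixed number of semitones to build a harmony line. This sketch shows only that underlying math, not Melodyne's algorithm:

```python
# Pitch correction as arithmetic: snap a sung frequency to the nearest
# equal-tempered semitone, or transpose it to build a harmony line.

import math

A4 = 440.0   # reference pitch

def snap_to_semitone(freq_hz):
    """Snap a frequency to the nearest equal-tempered note."""
    semis = round(12 * math.log2(freq_hz / A4))   # semitones from A4
    return A4 * 2 ** (semis / 12)

def transpose(freq_hz, semitones):
    """Shift a note up or down, as when generating a harmony version."""
    return freq_hz * 2 ** (semitones / 12)

print(round(snap_to_semitone(435.0), 1))   # a flat A4 pulled up -> 440.0
print(round(transpose(440.0, 12), 1))      # one octave up -> 880.0
```

The hard part of real pitch correction is resynthesizing the audio at the new frequency without artifacts; the note targets themselves come from exactly this kind of calculation.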

  • "Building High-Speed Lanes on the Information Highway"
    Scientist (01/17/05) Vol. 19, No. 1, P. 24; Daviss, Bennett

    A new paradigm for scientific research is being ushered in by meganetworks, which integrate new computing, data-storage, and networking technologies into a platform for geographically dispersed, multidisciplinary e-collaboration. One meganetwork is the National Lambda Rail (NLR) network in the United States, which harnesses over 11,500 miles of dark optical fiber to link numerous U.S. universities and research institutions. "Because we own the infrastructure, we can be more responsive to researchers' needs and more flexible in deploying new networking technologies," states NLR CEO Tom West. Meanwhile, the National Institutes of Health's (NIH) Biomedical Informatics Research Network (BIRN) seeks to merge basic neuroscience knowledge with high-quality imaging as well as test a new collaborative research and resource configuration scheme. To this end, BIRN has implemented a trio of projects: Function BIRN, which tracks brain dysfunction in schizophrenia; Mouse BIRN, which focuses on animal models of brain cancer, Parkinson's, and other diseases; and Morphometry BIRN, which is probing depression, cognitive dysfunction, and Alzheimer's. TeraGrid is a network of nine supercomputing centers and research universities that taps 20 teraflops of computing power and almost 1 petabyte of online storage capacity using open-source grid software developed by the Globus Alliance. This toolkit "gives scientists the ability to run applications on multiple remote resources with a single login across all sites and to do high-performance data transfer," reports TeraGrid grid infrastructure group director Charlie Catlett. The success of meganetworks hinges on resolving privacy, data security, and intellectual property issues and on fostering a shift in the culture of scientific research.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)

  • "A Conversation With Tim Bray"
    Queue (02/05) Vol. 3, No. 1, P. 16; Gray, Jim

    In an interview with Microsoft Bay Area Research Center manager Jim Gray, Sun Microsystems director of Web technologies Tim Bray puts his career in perspective. His stellar trajectory began in 1986 as coordinator of the University of Waterloo's New Oxford English Dictionary Project, and then he founded the Open Text Corporation to market the OED technology as a proto-search engine. Bray was first introduced to the concept of the Web and search engines at a Standard Generalized Markup Language (SGML) conference, where he was inspired to build a system that copies data and displays URLs on the results page; he recalls that the final product scaled well with data size but poorly with high data rates. Bray then co-edited the World Wide Web Consortium's XML specification, and he notes that the idea was to jettison the unused presentation element of SGML while retaining the content portion. He was next involved in the development of RDF, which spawned the Semantic Web concept, although Bray remains skeptical about the Semantic Web's fundamental goal of inferring meta-data via natural-language processing. His interest in the Virtual Reality Modeling Language (VRML) led to the founding of Antarctica, a supplier of polished visual interfaces to complex data sets. Looking back, Bray is surprised that the idea of XML did not come earlier. "I guess one explanation is that, to do it properly, you really need to crack the internationalization nut," he muses. Bray also authored a trio of well-known applications: Lark, the first XML processor, which was written in Java; Bonnie, a Unix file system benchmark that taps the system's semantics; and Genx, a library written in C that facilitates the efficient generation of well-formed, canonical XML. Bray says "XML is a lot more complex than it really needs to be," and notes that some of its features, such as entities, have been more problematic than expected.
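
The design idea behind a generator like Genx can be sketched in a few lines: expose only operations that escape text and balance tags, so the output is well-formed by construction. This Python sketch illustrates the principle; it is not Genx's actual C API:

```python
# Well-formed-by-construction XML output: the writer escapes all text and
# tracks open elements, so callers cannot produce unbalanced or unescaped
# markup. A sketch of the Genx idea, not its real interface.

class XmlWriter:
    def __init__(self):
        self.parts, self.stack = [], []

    def start(self, name):
        self.parts.append(f"<{name}>")
        self.stack.append(name)

    def text(self, s):                      # escaping happens here, always
        self.parts.append(s.replace("&", "&amp;").replace("<", "&lt;"))

    def end(self):                          # closes the innermost open element
        self.parts.append(f"</{self.stack.pop()}>")

    def done(self):
        assert not self.stack, "unclosed elements"
        return "".join(self.parts)

w = XmlWriter()
w.start("note"); w.text("a < b & c"); w.end()
print(w.done())   # -> <note>a &lt; b &amp; c</note>
```

Because `end()` pops the element the writer itself opened, mismatched tags are impossible, which is the property that makes this style of API attractive for generating XML from C.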

  • "Naming Names"
    American Scientist (02/05) Vol. 93, No. 1, P. 6; Hayes, Brian

    Brian Hayes writes that the advent of computer technology has transformed the nature of name-giving into a model that demands greater uniformity and uniqueness among names, while also forcing many names and numbers into an inflexible format with a fixed number of digits or letters drawn from a finite alphabet. For instance, a country code for Internet addresses, as assigned by the Internet Assigned Numbers Authority, must have precisely two characters taken from the 26-letter English alphabet, so the number of available codes totals 676. However, such a standard establishes namespaces that rapidly fill up, so that not everyone can get their first choice of name or code. Hayes draws a link between searching randomly for an unused name and the process of hashing, in which items of data are dispersed in a seemingly random fashion throughout a table in computer memory for quick retrieval. This scattering is not truly random, as the position of each item is established by a deterministic "hash function." A simulation of the filling of a namespace based on this assumption suggests that the likelihood of getting one's first choice drops once the space is considerably more than half full, but the logic of this analysis falls apart because people have specific, not random, name preferences. Hayes analyzes statistical bias within a namespace using International Air Transport Association airport codes, and concludes that finding unclaimed names in such a space is even more difficult. In addition, he admits that the experiment fails to account for the fact that many of the code words may have been chosen because the first choice was already claimed, while statistical bias fluctuates with the filling factor.
    Click Here to View Full Article
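
Hayes' random-preference simulation is easy to reproduce: fill a 676-slot namespace (the size of the two-letter country-code space) to a given fraction, then check how often a newcomer's randomly chosen name is still free. Under that simplifying assumption the expected success rate is simply one minus the filling factor; the trial count and seed below are arbitrary.

```python
# Simulate the filling of a 676-slot namespace of two-letter codes and
# measure how often a newcomer gets its (randomly chosen) first choice.
# Random preferences are the article's own simplification; real name
# preferences are far more clustered, which makes collisions worse.

import random

def first_choice_rate(fill_fraction, size=676, trials=2000, seed=1):
    rng = random.Random(seed)
    taken_count = int(size * fill_fraction)
    wins = 0
    for _ in range(trials):
        taken = set(rng.sample(range(size), taken_count))  # current occupants
        wins += rng.randrange(size) not in taken           # newcomer's pick free?
    return wins / trials

for f in (0.25, 0.50, 0.90):
    print(f"{f:.0%} full: first choice granted ~{first_choice_rate(f):.0%}")
```

With random preferences the results track 1 − fill fraction closely; Hayes' point is that biased preferences (everyone wants the "obvious" code) push the real-world odds well below that line long before the space fills.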