Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.

ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either HP or ACM. To send comments, please write to

Volume 5, Issue 581: Wednesday, December 10, 2003

  • "Developers Take Linux Attacks to Heart"
    CNet (12/09/03); Lemos, Robert

    Open-source software groups have been shaken by the string of attacks on their host servers, including machines governing the Linux kernel, the Debian Project, the Gentoo Linux Project, and the GNU Project. Previously, hacker attacks rarely focused on open-source software efforts, mainly targeting widely used Microsoft products instead; the recent spate of attacks may simply signal the growing popularity of open-source software, but for the developers themselves it underscores the need for tighter security. Though the attacks on Gentoo Linux on Dec. 1 and on the Linux kernel servers in November were detected quickly and caused little damage, the GNU Project and Debian development systems remain offline as administrators ensure security. Free Software Foundation general counsel Eben Moglen, whose organization maintains the GNU Project, says digital signatures will now be required from developers to verify their identity. Linux creator Linus Torvalds says the recent break-ins did not seriously threaten the integrity of the open-source projects because they have sufficient checks and balances throughout the system to make sure malicious hackers cannot insert damaging code. "The kernel source code is endlessly replicated," he explains. Still, Apache Software Foundation developer Justin Erenkrantz says the popular Apache open-source Web server software will be moved to a more secure source-code maintenance system called Subversion, which requires cryptographically signed transactions and reduces the number of local accounts.
    Click Here to View Full Article
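    The integrity argument Torvalds makes rests on comparing cryptographic digests of independently replicated source trees. Below is a minimal sketch of that idea using Python's hashlib; the file names and contents are invented, and the real kernel and Debian infrastructure relies on PGP signatures and tooling not shown here:

```python
import hashlib

def digest_tree(files):
    """Compute one SHA-256 digest over (path, content) pairs, walked in
    sorted order so independently computed digests are comparable."""
    h = hashlib.sha256()
    for path, content in sorted(files.items()):
        h.update(path.encode())
        h.update(content)
    return h.hexdigest()

# Two independently hosted replicas of the same (hypothetical) tree.
replica_a = {"kernel/sched.c": b"int nice;", "Makefile": b"all:"}
replica_b = dict(replica_a)
assert digest_tree(replica_a) == digest_tree(replica_b)

# A tampered replica is immediately detectable by digest mismatch.
replica_b["kernel/sched.c"] = b"int nice; /* backdoor */"
assert digest_tree(replica_a) != digest_tree(replica_b)
```

A mismatching digest only reveals tampering; the digital signatures Moglen describes additionally bind a tree to a known developer's key.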

  • "Intel to Release First 'Causal' Learning Algorithms"
    EE Times (12/08/03); Johnson, R. Colin

    Intel will release details of its probabilistic network library (PNL) learning package at this week's Neural Information Processing Systems (NIPS) conference. The open-source PNL can merge separate data flows from real-time sensors as easily as it detects and identifies objects and conditions, allowing causal relations to be easily inserted into control programs that oversee sensor networks. The direction of causality will be represented by directed graphs. Intel researcher Gary Bradski reports, "The probabilistic graphical models in PNL provide a formal way, for the first time in science, of describing causality." The PNL suite is a one-size-fits-all core inference engine rather than an assortment of disparate methods; the suite's beta version will be released first, followed by a "gold" release next summer. Bradski says, "The proper terminology is that this is a probabilistic graphical model, and these techniques cover a huge range of formal techniques--everything from principal component analysis to Kalman filters to Markov models" employed in tracking and speech. The PNL offers Bayesian networks, which are termed directed acyclic graphs, as well as a broad scope of undirected models and certain directed/undirected graph combinations.
    Click Here to View Full Article
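    PNL itself is a C++ library whose API the article does not show; as a generic illustration of the directed acyclic graphs it implements, the sketch below performs exact inference on a hypothetical two-node Bayesian network (Rain -> WetGrass) with invented probabilities:

```python
# P(Rain) and P(WetGrass=true | Rain) for a toy two-node network.
p_rain = 0.3
p_wet_given = {True: 0.9, False: 0.1}

def p_rain_given_wet():
    """Posterior P(Rain=true | WetGrass=true) via Bayes' rule,
    enumerating the two possible states of the parent node."""
    joint_true = p_rain * p_wet_given[True]          # P(Rain, Wet)
    joint_false = (1 - p_rain) * p_wet_given[False]  # P(~Rain, Wet)
    return joint_true / (joint_true + joint_false)

posterior = p_rain_given_wet()   # about 0.794
```

Observing wet grass raises the probability of rain above its 0.3 prior, which is the kind of causal reading Bradski describes, here done by brute-force enumeration rather than PNL's optimized inference engine.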

  • "Massive Software Engineering Reform Is a Must"
    ZDNet (12/06/03); Farber, Dan

    This month's National Cyber Security Summit brought together federal government and major private sector software groups, and resulted in the creation of a task force meant to improve national software security. The Security Across the Software Development Life Cycle task force is just one of five groups created at the summit, and is charged with making recommendations on how to improve software engineering, deployment, and management. Computer Associates chief security strategist and task force chair Ron Moritz says an important idea that needs to be hashed out is that of licensing software engineers in the same way legal, accounting, and medical professionals are licensed; Moritz points to a number of existing examples where professional licenses are managed by the federal or state government, often in conjunction with professional organizations. Software groups such as the CERT Coordination Center, SANS Institute, and Computer Security Institute could provide standards for software security. Task force education subgroup chair Fred Cohen says professionalism is an important issue in software security, and that university degrees provide a basic level of certification. Moritz believes education is important as well, and says ethics training should be incorporated into computer science instruction, since vulnerabilities are often exploited by software engineers. Barriers to certification are huge, including added cost, time to certify the workforce, and difficulty in getting long-time programmers to conform to new, better practices. If recommended by the task force, a move toward software certification would take many years, but Moritz believes the move must begin now; he also says more resources need to be funneled into basic software development research efforts that identify what makes software secure.
    Click Here to View Full Article

  • "IEEE: Chinese Security Standard Could Fracture Wi-Fi"
    IDG News Service (12/09/03); Lemon, Sumner; Evers, Joris

    IEEE senior executive Paul Nikolich fired off a letter to Chinese government officials Li Zhonghai and Wang Xudong last month warning that China's deployment of the GB15629.11-2003 WLAN standard threatens to fragment the market for wireless networking equipment. China's standard bears a close resemblance to IEEE's Wi-Fi wireless networking specification, with the major disparity residing in the security protocol: Wi-Fi uses Wired Equivalent Privacy (WEP), while GB15629.11-2003 employs WLAN Authentication and Privacy Infrastructure (WAPI). All WLAN equipment sold in China is required to use the standard starting Dec. 1, though the compliance deadline has been extended to June 1, 2004 for certain WLAN products. "We are concerned that mandatory use of the [Chinese WLAN] standard would prohibit the use of 802.11 standard products and thereby limit choice and increase costs to users," wrote Nikolich in his letter. He admitted that current Wi-Fi network security leaves a lot to be desired, and explained that his organization is trying to address the problem through the development of the 802.11i standard--and expressed a willingness to discuss China's own network security concerns in the hope that they too could be allayed through the rollout of 802.11i. If the two standards bodies fail to reach an accord, then WLAN equipment providers may have to supply products that support both standards or offer two separate types of WLAN gear, one based on 802.11 and one based on GB15629.11-2003. However, IEEE 802.11 Wireless LAN Working Group Chairman Stuart Kerry has not ruled out the possibility of bundling WAPI into 802.11 so a fracturing of the WLAN product market can be avoided.
    Click Here to View Full Article

  • "States Scrutinize E-Voting as Primaries Near"
    CNet (12/08/03); Festa, Paul

    With the presidential primaries on the horizon, some U.S. states are taking second looks at electronic voting systems as concerns about security and accuracy surface. Much of the latest voter backlash against e-voting systems, critics say, stems from blunders committed by Diebold Election Systems, chief among them the disclosure that its flagship product is insecure and partisan political statements by CEO Walden O'Dell. However, Diebold's David Bear thinks that second thoughts about e-voting are bubbling up as states rush to comply with the Help America Vote Act, which mandates that states modernize their voting systems by 2004 in order to receive federal aid. Ohio Secretary of State J. Kenneth Blackwell wants to push back e-voting from March to August in response to the disclosure of dozens of potential security risks in four e-voting systems discovered by InfoSentry and Compuware; Blackwell indicates that fixing the software problems in certain cases will require new certification by state and federal governments. Nevada is expected to choose between Diebold and Sequoia e-voting systems in the coming week. Meanwhile, Secretary of State Dean Heller plans to hold town-hall meetings with elections officials to address voters' e-voting worries, and is awaiting the results of an analysis of e-voting machines by the state's Gaming Control Board. Maryland state senators' call for the deployment of paper verification systems has spurred the state to order e-voting machine audits, while California Secretary of State Kevin Shelley gave all cities and counties in the state until July 2006 to supply touch-screen voting systems that print out a paper ballot. "A huge movement has developed across the nation, with citizen activists joining computer scientists, academics, lawyers, and nonprofits to demand verifiable voting systems," declares California Voter Foundation President Kim Alexander.
    Click Here to View Full Article

    For information about ACM's activities regarding e-voting, visit

  • "Yes, They Can! No, They Can't: Charges Fly in Nanobot Debate"
    New York Times (12/09/03) P. D3; Chang, Kenneth

    Foresight Institute Chairman Dr. K. Eric Drexler and Nobel laureate Dr. Richard E. Smalley of Rice University are waging a long-standing feud in print about the possibility of creating self-replicating "nanobots" that could construct almost anything. Smalley sees no practical, error-free method for precisely configuring millions of atoms into nanostructures through molecular assembly, and argued in a 2001 issue of Scientific American that a molecular assembler would need a "finger" that manipulates each atom into the desired position--and fitting enough fingers into such an assembler in so small an area is impossible. Drexler has argued that the creation of nanobots is a possibility, since his proposals do not violate basic physical laws such as conservation of energy or the speed of light, and this year countered in an open letter posted online that his proposal for a molecular assembler never included multiple "Smalley fingers." His proposed assemblers function in the same way as enzymes and ribosomes, which prompted Smalley to retort that enzymes and ribosomes generally only work in water, which would restrict the kinds of materials that could be constructed. Drexler responded to this observation by describing the operation of a theoretical "nanofactory" where computers are employed "to assemble small parts into larger parts, building macroscopic products." Furthermore, Drexler has argued that Smalley is incorrect in his claim that enzymes only function in water. Smalley and Drexler's nanobot debate does not seem to have made much of an impact on the opinions of nanotechnology researchers in general.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "One Small Step Made on the Long Road to Realizing Quantum Computers"
    Daily Yomiuri (12/09/03); Otsu, Terumitsu

    Japanese scientists at NEC and the Institute of Physical and Chemical Research have successfully built a fundamental element of a quantum computer--a "quantum gate" that could be a component of the quantum equivalent of a computer chip. NEC research fellow Tsai Jaw-Shen reports that the gate can only function in extremely low temperatures because it relies on superconductivity, though he hopes that the operating temperature can be raised to a level more comparable to that of conventional computers. Tsai theorizes that a quantum computer can complete calculations, such as factorizing large numbers, much faster than even the most powerful existing supercomputer, while other potential applications include determining the properties of proteins and molecules, and solving biochemical, biological, environmental, and climatological problems. Kazuo Nakamura of NEC's Quantum Information Technology Group explains that quantum computers could also decode practically any encrypted message, though quantum cryptography itself promises to be unbreakable. He wagers that a considerable amount of time must pass before quantum computers become a reality, and estimates that only 10 percent of the job has been accomplished thus far. "The single Qubit [quantum bit] was completed in 1999 and the two Qubit operation has been completed this year," Nakamura notes. "But we have to integrate these two components."
    Click Here to View Full Article
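    The mathematics behind a quantum gate can be sketched without superconducting hardware: a single qubit is a two-amplitude state vector, and a gate is a 2x2 matrix applied to it. The example below is standard textbook math, not a model of NEC's device; it applies a Hadamard gate, which puts the qubit into an equal superposition and, applied twice, restores the original state:

```python
import math

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix by a 2-element state vector."""
    (a, b), (c, d) = gate
    x, y = state
    return (a * x + b * y, c * x + d * y)

s = 1 / math.sqrt(2)
H = ((s, s), (s, -s))                 # the Hadamard gate

qubit = (1.0, 0.0)                    # the |0> basis state
superposed = apply_gate(H, qubit)     # equal superposition of |0> and |1>
restored = apply_gate(H, superposed)  # H is its own inverse
```

The two amplitudes of `superposed` are equal, meaning a measurement would yield 0 or 1 with equal probability; chaining such gates across multiple qubits is the integration step Nakamura says remains to be done.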

  • "Online Learning and the ROI of Training High-Tech Wizards"
    E-Commerce Times (12/08/03); Diana, Alison

    Companies seeking to boost the skills of their IT employees and individuals looking to advance their careers can choose from a score of e-learning products and services, and ascertaining the return-on-investment (ROI) of e-learning is critical in the current economy. Michael Brennan of International Data reports that most companies determine ROI in terms of easily measurable factors such as usage and completion, but there is a growing segment deploying systems and processes to help evaluate whether e-learning material is making a significant impact on IT workers' day-to-day activities, as well as gauge the actual financial ROI of various training sources. Candy Haynes of Deloitte's U.S. Consulting Group notes that around 20 percent of her company's user experience was e-learning prior to Sept. 11, 2001, but e-learning users accounted for 85 percent of Deloitte's users worldwide as of last year. Furthermore, Deloitte has been able to reduce training costs by 60 percent to 70 percent over the last three years while providing additional training. Useractive CEO Patrick Brown recommends that corporations evaluate e-learning's ROI by considering such factors as how much more time is freed up for employees, a decline in time-to-expertise, and how Internet-based training "[reinforces] employee IT skills with safe, personalized, hands-on experience with personal coaching anytime, anywhere." SkillSoft e-learning strategy director Dorman Woodall points out that it is more cost-effective for businesses to invest in e-learning tools and services outside the company than develop e-learning content internally. Industry executives add that e-learning is also being leveraged by IT workers to address immediate needs. "The Holy Grail is...breaking down training in all its elements--both content and how it's delivered--to make it more and more practical: to deliver the right information at the right time to the right learner," explains Eduventures analyst Eric Bassett.
    Click Here to View Full Article
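    The financial ROI Brennan alludes to reduces to net benefit over cost. A trivial sketch with wholly invented figures, not numbers from any company mentioned above:

```python
def roi(benefit, cost):
    """Return-on-investment as a fraction: (benefit - cost) / cost."""
    return (benefit - cost) / cost

# Hypothetical numbers: e-learning saves $170,000 in travel and
# instructor fees against a $100,000 platform-and-content spend.
r = roi(170_000, 100_000)   # 0.7, i.e. a 70 percent return
```

The hard part, as the article notes, is not this arithmetic but attributing a credible dollar benefit to skills improvement in the first place.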

  • "Researchers Report Progress Embedding Devices in Fabrics"
    EE Times (12/09/03); Mokhoff, Nicolas

    Researchers gathered at this week's International Electron Devices Meeting detailed their work in the field of ambient intelligence, with a particular emphasis on electronic textiles. "Ambient intelligence is the vision that technology will become invisible, embedded in our natural surroundings, present whenever we need it, enabled by simple and effortless interactions, attuned to all our senses, adaptive to users and context and autonomously acting," explained Werner Weber of Infineon's Corporate Research Laboratory of Emerging Technologies. One area Infineon is focusing on is the embedding of electronics into clothing concurrent with the development of grid-distributed sensor and display functions for textile applications, the goal being to incorporate security and monitoring functionality into canvas covers, carpets, and wallpaper. Another Infineon research project involves textile interconnect and packaging technology through the employment of a polyester narrow fabric in which silver- and polyester-coated wires replace certain warp threads, while a flexible printed circuit board is attached to the fabric as well; this breakthrough has led to the development of an MP3 player system controlled by either a keypad or speaker-independent voice recognition. Other e-textile research reported at the conference includes the University of Tokyo's integration of organic field-effect transistors and rubber pressure sensors for synthetic skin, a significant development for touch-sensitive next-generation robots. Meanwhile, University of California-Berkeley researchers detailed the creation of flexible transistors directly on fibers without relying on traditional lithography, a promising step towards scalable e-textile manufacturing.
    Click Here to View Full Article

  • "A Potluck Party for XML" (12/05/03); Boulton, Clint

    XML application interoperability will be the main theme of this week's XML Conference and Expo 2003, which is hosting demonstrations by BEA, Adobe, and others. Major standards organizations such as the Organization for the Advancement of Structured Information Standards (OASIS), the World Wide Web Consortium, and the Web Services Interoperability consortium are presiding at the expo. OASIS will demonstrate WS-Reliability interoperability, with the participation of Hitachi, Fujitsu, Oracle, NEC, and Sun Microsystems; they will perform the role of server or client while various network problem scenarios--dropped, duplicated, and disordered messages--are run. OASIS will also showcase health epidemic management via ebXML, UBL, and XACML. Adobe's solution to the problem of combining data and presentation to enhance business processes via its XML architecture will be discussed in a keynote speech by Adobe's Shantanu Narayen. Other keynote speakers will include BEA Systems' Adam Bosworth, who will talk about the popularity of XML and its impact on application architecture; and Dan Shiffman of IBM's Almaden Research Center, who will demonstrate the construction of large-scale systems within a firewall through the use of Model Based Design and XML RPC. A Hewlett-Packard representative will also be on hand to showcase HP's employment of XML to transfer XML-based content between internal and external assets with Trigo, Documentum, Oracle, and other partners. "The big trend at the conference this year is 'This is what we did and it has worked,' whereas in previous years the proposals were 'This is what we'd like to do,'" notes XML 2003 Chair Lauren Wood.
    Click Here to View Full Article
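    WS-Reliability's wire format is SOAP headers, which are not shown here; but the underlying technique the interoperability demo exercises--sequence numbers that let a receiver discard duplicates and deliver messages in order, waiting at gaps left by drops--can be sketched as:

```python
def deliver_in_order(received, expected_start=1):
    """Given (seq, payload) pairs that may arrive duplicated or out of
    order, return payloads in sequence order with duplicates removed.
    Delivery stops at the first gap, leaving dropped messages for the
    sender to retransmit."""
    by_seq = {}
    for seq, payload in received:
        by_seq.setdefault(seq, payload)   # ignore duplicate arrivals
    delivered = []
    seq = expected_start
    while seq in by_seq:                  # halt at the first missing number
        delivered.append(by_seq[seq])
        seq += 1
    return delivered

# Disordered, duplicated, and with message 3 dropped in transit.
msgs = [(2, "b"), (1, "a"), (2, "b"), (4, "d")]
assert deliver_in_order(msgs) == ["a", "b"]
```

Message 4 is held back until 3 is retransmitted, which is exactly the ordered-delivery guarantee the OASIS scenarios test between vendors.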

  • "Hey, Santa, Just So You Know: Techies Are Up to No Good With 'Smart Dust'"
    USA Today (12/10/03) P. 3B; Maney, Kevin

    Researchers at the Pentagon, UCLA, the University of California-Berkeley, and elsewhere are developing smart dust, minuscule devices or "motes" equipped with sensors, computer chips, and radio transmitters that can be deployed into widely distributed ad hoc mesh networks and make decisions on their own. "That's the grand vision of the technology. The sensors will not be dumb but will collaborate," says Alan Broad, chief engineer of Crossbow, a company that has taken a leadership position in the rollout of smart dust. The potential applications for smart dust range from environmental monitoring to enhanced battlefield strategy to structural integrity readings. The technology still has technical hurdles to overcome, not the least of which is size: Researchers intend to shrink the motes down by 50 percent every 18 months. Building small but powerful batteries for the motes is another challenge, while the devices' limited memory capacity precludes the use of sophisticated software programs--a task made all the more difficult by the complexity of setting up ad hoc mesh networks and making group decisions. Smart dust developers are aiming to deliver motes no bigger than a grain of rice within five to 10 years. Berkeley researchers plan to distribute approximately 200 motes over the Golden Gate Bridge to monitor its movement in windy conditions, and California's James Reserve is already employing motes to track changes in habitat.
    Click Here to View Full Article
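    The multi-hop relaying at the heart of an ad hoc mesh can be modeled as a breadth-first flood over a graph of radio neighbors. The sketch below uses an invented five-mote topology and is not Crossbow's actual protocol:

```python
from collections import deque

def flood(neighbors, source):
    """Breadth-first flood from one mote: returns the hop count at
    which each reachable mote first hears the message, modeling
    multi-hop relay where every mote rebroadcasts exactly once."""
    hops = {source: 0}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for nxt in neighbors.get(node, ()):
            if nxt not in hops:
                hops[nxt] = hops[node] + 1
                queue.append(nxt)
    return hops

# Hypothetical five-mote deployment strung along a structure.
mesh = {"m0": ["m1"], "m1": ["m0", "m2", "m3"], "m2": ["m1"],
        "m3": ["m1", "m4"], "m4": ["m3"]}
hops = flood(mesh, "m0")
```

Mote m4 is out of direct radio range of m0 but still hears the message in three hops, which is why a bridge-length deployment needs no long-range transmitters.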

  • "Transforming Non-Geeks Into Computer Whizzes"
    Silicon Valley Biz Ink (12/05/03); Ascierto, Rhonda

    Ellen Spertus, who runs an interdisciplinary computer science graduate program at Mills College in Oakland, Calif., is attempting to break myths about computer geeks in the hopes of bringing more women--and other persons with non-technical backgrounds--into the technology field. She notes that many people, women in particular, are late bloomers when it comes to using computers; their interest in computer science is kindled usually after they have entered the workforce, and the Mills program intends to offer people a way to merge computer science with other areas of interest, such as liberal arts. Spertus explains that such an interdisciplinary course of study is advantageous in a number of ways: People skilled in other fields as well as computer science can design unique products that are beyond the skills of traditional computer scientists. In addition, such people might be more difficult to replace with cheaper overseas labor. Examples that Spertus cites include a student who collaborated with a Livermore Labs researcher to develop an audio icon called an earcon; the student's background in electronic music served her well by allowing her to design pleasant tones to relay data. "Our students are interested, really, in applying computer science to real problems and making use of knowledge they have of other fields," Spertus notes. She observes that many girls are put off by computer science because of a widespread stereotypical view of computer people as nerdy and socially isolated, while the transfer of computer science programs from the liberal arts division to the engineering division has also contributed to the problem. "The idea [behind the Mills program] is [to provide] a route for women into computer science, since some of them were discouraged from doing it when they were in college or never considered it," Spertus asserts.
    Click Here to View Full Article

    For information about ACM's Committee on Women in Computing, visit

  • "Yahoo Proposes New Internet Anti-Spam Structure"
    Reuters (12/05/03); Berkowitz, Ben

    Yahoo! says its Domain Keys software, which may be launched next year, will fight spam by changing the way the Internet authenticates a message's sender, and the company adds that it will be made freely available to developers of the Web's major open-source email software and systems. Under Domain Keys, the sending system signs each outgoing message with its private key and embeds the signature in a message header, while the receiving system checks the Domain Name System (DNS) for the public key registered to the sending domain. If the public key cannot verify the signature, the email is assumed not to be authentic and is blocked. "One of the core problems with spam is we don't know, Yahoo! doesn't know, the user doesn't know...if it really came from the party who it says it came from," says Yahoo!'s Brad Garlinghouse. "What we're proposing here is to re-engineer the way the Internet works with regard to the authentication of email." He adds that the technology only needs adoption by a few major providers to work.
    Click Here to View Full Article
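    The pattern described--sign a message with a domain's private key, verify with the public key published in DNS--can be illustrated with textbook RSA. The key below uses toy primes for readability and would be trivially breakable; Yahoo!'s actual scheme and key sizes are not disclosed in this article:

```python
import hashlib

# Toy RSA key pair for illustration only: n = 61 * 53 = 3233,
# public exponent e = 17, private exponent d = 2753
# (17 * 2753 = 46801 = 1 mod lcm(60, 52)).
n, e, d = 3233, 17, 2753

def sign(message, priv=d):
    """Sender side: hash the message and raise it to the private
    exponent; the result travels in a message header."""
    h = int(hashlib.sha256(message).hexdigest(), 16) % n
    return pow(h, priv, n)

def verify(message, signature, pub=e):
    """Receiver side: undo the signature with the public key (as
    fetched from DNS) and compare against a fresh hash."""
    h = int(hashlib.sha256(message).hexdigest(), 16) % n
    return pow(signature, pub, n) == h

mail = b"From: alice@example.com\n\nHello"
sig = sign(mail)
assert verify(mail, sig)                    # authentic mail passes
assert not verify(mail, (sig + 1) % n)      # altered signature is blocked
```

Note that the private key never leaves the sending system; only the signature does, which is what makes publishing the public key in DNS safe.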

  • "U.N. Control of Web Rejected"
    Washington Times (12/08/03); Zaracostas, John

    In preparatory talks for the upcoming World Summit on the Information Society, the European Union, Canada, and Japan have helped the United States turn back a bid by developing nations to put the Internet under the control of the United Nations or its member governments; however, involved parties will be asked to establish a mechanism, under U.N. auspices, to study the governance of the Internet and offer recommendations by 2005. The draft declaration due at the end of the conference contains references to freedom of the press and freedom of information online, over protests by China and Vietnam, but differences remain between African nations and developed nations on the creation of a global digital solidarity fund. "For the first time, we see governments internationally recognizing that which we have talked about for many years--that the Internet is a responsibility not only of governments, but also primarily of the private sector, civil society and others both in the developed and the developing countries," says Ambassador David A. Gross, chief of the U.S. delegation to the conference. ICANN President and CEO Paul Twomey lauds the decision to create a working group on Internet governance, but many developing nations are not confident. "We feel as the system gets more complex, we don't want the whole question of Internet governance to be concentrated around the existing ICANN, which is closely linked to the U.S. Department of Commerce," says one senior Brazilian diplomat.
    Click Here to View Full Article

  • "Computers That Read Your Mind"
    Economist Technology Quarterly (12/06/03) Vol. 369, No. 8353, P. 6

    Researchers at Germany's Fraunhofer Institute for Computer Architecture and Software Technology (FIRST) and the Benjamin Franklin University Clinic are attempting to develop a non-invasive brain-computer interface (BCI) that allows users to control computers by thought, and thus far have enabled volunteers to play video games using an electroencephalograph (EEG) connected by external electrodes. The BCI is significant because the users do not need a lot of training to operate the system, whereas most BCIs require roughly 200 hours of practice. Such a BCI, if perfected, could give paralyzed people the ability to manipulate a cursor and type messages, lead to a whole new class of video games, and shorten vehicle braking and steering response times. A practical BCI device must be able to determine the user's intention from a single brain-wave reading, rather than averaging it from hundreds of readings, which is the traditional way to produce textbook brain-wave samples, notes FIRST's Klaus-Robert Muller. The EEG as well as the brain generate background noise that can complicate single-trial readings, so the FIRST team has designed a program to filter out such noise, similar to how the brain's reticular activating system functions. The commercialization of the BCI depends on making the EEG cheap and portable enough to be sold as a peripheral, while the electrode-based EEG connection needs to be made faster and even less invasive. Muller says the use of conductive gel should be eliminated. A non-invasive, gel-free method significantly reduces the voltages of brain signals, so a powerful low-noise amplifier is needed to take up the slack.
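    FIRST's filtering program is not described in detail in the article; as a generic stand-in for smoothing a noisy single-trial signal, a simple moving-average filter over sampled values:

```python
def moving_average(signal, window=3):
    """Smooth a sampled signal by averaging each point with its
    neighbors--a crude illustration of the kind of noise reduction an
    EEG pipeline performs before classifying a single-trial reading."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

noisy = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]   # rapidly alternating "noise"
smooth = moving_average(noisy)            # swings are visibly damped
```

Real single-trial EEG filters are far more sophisticated (and must preserve the signal components carrying intent), but the principle of trading raw amplitude for a cleaner estimate is the same.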

  • "The New Internet"
    Computerworld (12/08/03) Vol. 31, No. 55, P. 36; Mearian, Lucas

    Academic researchers are already testing out the new Internet on the PlanetLab project that links 160 servers at 65 sites across 16 countries. The researchers say PlanetLab acts as a testbed for new Internet applications and expect that as the system grows, more popular aspects of it will be slowly integrated into the mainstream Internet. The project is led by Intel and involves about 70 academic researchers worldwide. Participants say PlanetLab should grow to 1,000 server nodes connecting via all major Internet regional and long-haul links. University of California, Berkeley, researcher John Kubiatowicz is working on one PlanetLab application called Infrastructure for Resilient Internet Systems (IRIS) that uses a peer-to-peer system to break up data into small pieces and rebuild it when needed; IRIS would mitigate denial-of-service attacks because of the massive load-balancing done across the system, as well as speed information transfers. A related project Kubiatowicz is working on is called OceanStore, which would store multiple encrypted copies of files at many points throughout the network. Kubiatowicz says OceanStore ensures data will remain available as long as the network is operating, unlike current archival methods which often rely on single copies stored on isolated physical media; the system would require archival fees paid to ISPs, but would provide infinite storage and immediate access from any network connection. Another interesting PlanetLab experiment is called Netbait, a program that tracks worms and viruses as they propagate and travel through the network, providing administrators with more upfront information before their system is attacked. MIT professor Frans Kaashoek says the most useful PlanetLab applications will eventually be adopted by more people and standardized, slowly upping the Internet's level of intelligence.
    Click Here to View Full Article
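    IRIS's actual encoding is not detailed in the article, but the basic pattern--break data into pieces, distribute them across peers, and verify each piece on reassembly--can be sketched as follows (a simplification: real systems also add redundancy so missing pieces can be reconstructed):

```python
import hashlib

def split(data, size):
    """Break a byte string into fixed-size chunks, each paired with a
    SHA-256 digest so a copy fetched back from a peer can be verified."""
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    return [(hashlib.sha256(c).hexdigest(), c) for c in chunks]

def rebuild(chunks):
    """Reassemble chunks retrieved from the network, refusing any
    chunk whose digest does not match (e.g. a corrupted replica)."""
    out = b""
    for digest, chunk in chunks:
        if hashlib.sha256(chunk).hexdigest() != digest:
            raise ValueError("corrupt chunk")
        out += chunk
    return out

original = b"peer-to-peer storage spreads load across the system"
assert rebuild(split(original, 8)) == original
```

Because each chunk can be fetched from whichever peer holds it, load spreads across the network, which is the property Kubiatowicz credits with blunting denial-of-service attacks.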

  • "The Secret Is Out"
    New Scientist (11/29/03) Vol. 180, No. 2423, P. 24; Anderson, Mark Kendall

    Quantum communication has long been publicized as completely hack-proof, but quantum hacking is an area of research that engineers are exploring in parallel with the development of true quantum networks--and they are uncovering possible exploits that quantum encryption designers never anticipated. "The models that tell us quantum cryptography is hot stuff are drastically simplified," explains Harvard University's John Myers. Quantum communication encryption's basic incarnation is the BB84 scheme devised by IBM's Charles Bennett and the University of Montreal's Gilles Brassard, in which a message sender (Alice) and receiver (Bob) use both a public link and a quantum communication link to set up a secret quantum key used to encrypt messages that an eavesdropper (Eve) cannot guess without being detected, since Eve's measurement of Alice's photons disturbs their quantum state. However, engineers have found several practical techniques that eavesdroppers could use to correctly guess the key: In a photon number-splitting attack designed by Nicolas Gisin of the University of Geneva, Alice's laser accidentally releases two or three photons instead of just one, and Eve diverts and measures these extra photons without Alice and Bob knowing. In another quantum hack, known as a frying attack, Eve sends an intense pulse of laser light into the detector Bob uses to register 1s, rendering it inoperative and leaving Bob capable of receiving only 0s; Alice and Bob's key will therefore be all 0s, which means that their data will be unencrypted without their realizing it. "In general, I do not think that a real quantum cryptography system will ever be 100 percent secure, because a real system will always implement an approximation of the theorist's system," states Gisin.
    Military and intelligence agencies as well as financial firms are employing commercial quantum communication products, but establishing secure quantum communication in a public Internet is a more complex proposition, especially since there is such a wide variety of quantum communication schemes.
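    The sifting step of BB84--Alice and Bob compare their measurement bases over the public channel and keep only the positions where the bases matched--can be simulated in an idealized, eavesdropper-free form (real systems add error estimation and privacy amplification on top of this):

```python
import random

def bb84_sift(n, seed=42):
    """Idealized BB84 without Eve: Alice sends random bits in random
    bases; Bob measures in random bases. Where his basis matches
    Alice's he reads her bit exactly; elsewhere his result is random
    and the position is discarded during sifting."""
    rng = random.Random(seed)
    alice_bits = [rng.randrange(2) for _ in range(n)]
    alice_bases = [rng.randrange(2) for _ in range(n)]
    bob_bases = [rng.randrange(2) for _ in range(n)]
    bob_bits = [bit if ab == bb else rng.randrange(2)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    key_a = [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)
             if ab == bb]
    key_b = [bit for bit, ab, bb in zip(bob_bits, alice_bases, bob_bases)
             if ab == bb]
    return key_a, key_b

key_a, key_b = bb84_sift(64)
assert key_a == key_b   # sifted keys agree when no one eavesdrops
```

Eve's presence would show up as disagreements between the sifted keys; the attacks described above are dangerous precisely because they evade this check at the hardware level rather than in the protocol.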

  • "Sentient Data Access Via a Diverse Society of Devices"
    Queue (11/03) Vol. 1, No. 8, P. 52; Fitzmaurice, George W.; Khan, Azam; Buxton, William

    University of Toronto researchers have developed a user model for cooperative ubiquitous computing known as sentient data access, in which wearable/mobile computers are used to relay referential data to embedded computing environments at particular locations. This model consists of three fundamental elements: Fixed location devices or terminals, physical or virtual data pointers called identifiers (such as UPC symbols and radio frequency tags), and mobile wireless devices, or containers, that house identifiers and can also function as personal portable user interfaces to terminals (PDAs, bar-code readers, and Smart Cards being examples). Approaching the terminal UI in a consistent manner can ease interactions with diverse terminals. Systems based on this model must meet two basic challenges: They must be able to determine the proper representation to be loaded onto the terminal according to the given identifier, and offer a way for users to choose an alternate representation when incorrect predictions are made. The researchers have outlined a trial environment for sentient data access via a variegated society of devices by conceptualizing an automotive design space whose terminal components include a Powerwall for exterior evaluations and large presentations; a Vision Dome for interior assessments; 51-inch plasma displays; and a Chameleon viewer for intuitive inspection of 3D models. These terminals, along with the diversity of accompanying identifiers and containers, are integrated through the PortfolioBrowser software infrastructure. 
    The heart of the system architecture is a relational database that chiefly stores digital assets that can be organized into projects, while key components include the ability to map between data type and target applications that may depend on the type of terminal; adaptive, programmable heuristics are also needed so that an appropriate asset can be delivered according to the contextual state, as well as the identifier, terminal, and location involved.
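    The first challenge above--predicting a representation from an identifier and terminal, with a user override when the prediction is wrong--amounts to a lookup with a fallback. A sketch using invented asset and terminal names (the PortfolioBrowser's real heuristics are adaptive and richer than a static table):

```python
# Hypothetical mapping from (asset type, terminal) to a representation.
REPRESENTATIONS = {
    ("3d-model", "powerwall"): "full-scale exterior render",
    ("3d-model", "chameleon"): "handheld 3D inspection view",
    ("document", "plasma"): "slide presentation",
}

def choose_representation(asset_type, terminal, override=None):
    """Predict a representation for the given terminal, letting the
    user supply an alternate when the prediction is wrong."""
    if override is not None:
        return override
    return REPRESENTATIONS.get((asset_type, terminal), "generic thumbnail")

assert choose_representation("3d-model", "powerwall") == "full-scale exterior render"
assert choose_representation("audio", "plasma") == "generic thumbnail"       # fallback
assert choose_representation("3d-model", "powerwall",
                             override="wireframe") == "wireframe"            # user choice
```

Keeping the override path explicit is what satisfies the model's second requirement: a consistent way for users to correct a bad prediction at any terminal.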