
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM. To send comments, please write to [email protected].
Volume 7, Issue 824:  Wednesday, August 3, 2005

  • "Advocates Urge Election Assistance Commission to Require Verification and Accessibility"
    Kansas City infoZine (07/28/05)

    Advocates for disabled voting rights and for accessible voter-verified paper records (AVVPR) joined forces on July 28, calling on the Election Assistance Commission to make verifiable and universally accessible voting systems a requirement. "Most people are realizing that there is no need for conflict between accessibility for voters with disabilities and the integrity of the voting system for everyone," said Verified Voting Foundation founder and Stanford computer science professor David Dill. Shawn Casey O'Brien, former executive director of the Unique People's Voting Project, said the mobilization of both AVVPR and accessible voting advocates demonstrates the compatibility of voting system security and accessibility, which will give the disability vote considerable political sway in the state of California. California Secretary of State Kevin Shelley made AVVPR a state requirement based on recommendations from his Task Force on Touch-Screen Voting, on which Dill and O'Brien both served. Dill cautioned that technology is not a panacea for all accessibility problems, and called for the establishment of guidelines that provide accessible and trustworthy voting systems. AVVPR are required in 24 states, and all polling places will soon need to offer accessible voting systems in order to comply with the Help America Vote Act.
    Click Here to View Full Article

    For more on e-voting, visit http://www.acm.org/usacm.

  • "DNS Servers--an Internet Achilles Heel"
    CNet (08/03/05); Evers, Joris

    At last week's Black Hat conference in Las Vegas, security researcher Dan Kaminsky presented the results of his survey that found susceptibilities to DNS cache poisoning in almost 10 percent of the 2.5 million Domain Name System machines he scanned. The incentive to attack a DNS server is largely financial, according to the SANS Internet Storm Center, as hackers are frequently paid on the basis of how much malicious software they install on people's PCs; they can also gain access to sensitive information such as Social Security and credit card numbers. DNS cache poisoning attacks, sometimes known as pharming, substitute the address of a malicious site for that of a legitimate, popular site, redirecting users to bogus pages that might upload corruptive software or prompt users for sensitive personal information. Since DNS servers can each support thousands of users, the number of vulnerabilities Kaminsky found could expose millions of Internet users to phishing attacks, identity theft, or other threats. Kaminsky said the vulnerable servers he found are powered by the Berkeley Internet Name Domain software, and are in need of an upgrade to a configuration that does not employ forwarders for DNS requests. BIND 4 and BIND 8 in forwarder configurations are porous, and the equivalent of "Internet malpractice" on the part of service providers, said Nominum Chairman Paul Mockapetris. Kaminsky appealed to the managers of DNS servers to upgrade and audit their systems, as a scan similar to the one he conducted could easily be performed by a hacker trolling for vulnerable servers.
    Click Here to View Full Article
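
    The poisoning attack described above comes down to a resolver caching a forged answer and then handing that answer to every later client. The short Python sketch below illustrates only that failure mode; the resolver class, hostnames, and addresses are hypothetical and are not Kaminsky's scanning method or BIND's actual behavior.

      # Illustrative sketch of DNS cache poisoning: a naive resolver caches whatever
      # answer arrives first, so one forged response misdirects every later lookup.
      # All names and addresses are hypothetical (RFC 5737 documentation ranges).

      class NaiveResolver:
          def __init__(self, upstream):
              self.upstream = upstream      # authoritative answers we *should* trust
              self.cache = {}               # hostname -> IP, kept until TTL expiry

          def accept_response(self, hostname, ip):
              # Flaw: no check that the response matches an outstanding query,
              # came from the queried server, or carries the right transaction ID.
              self.cache.setdefault(hostname, ip)

          def resolve(self, hostname):
              if hostname not in self.cache:
                  self.accept_response(hostname, self.upstream[hostname])
              return self.cache[hostname]

      legitimate = {"www.example-bank.com": "192.0.2.10"}
      resolver = NaiveResolver(legitimate)

      # Attacker races the real answer and "poisons" the shared cache first.
      resolver.accept_response("www.example-bank.com", "203.0.113.66")

      # Every user of this resolver is now silently sent to the attacker's server.
      print(resolver.resolve("www.example-bank.com"))   # -> 203.0.113.66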

  • "New Rating System Aims to Take Mystery Out of Open-Source Tools"
    InformationWeek (08/01/05); Kontzer, Tony

    To allay managers' fears over the unpredictability of open source software, Intel, Carnegie Mellon University, and SpikeSource have joined forces to compile a rating system that will help IT departments decide which programs are worth adopting. The Business Readiness Ratings will solicit feedback from the open source community to inform others on the viability of a particular program. The ratings, which can be viewed on the Web, apply 12 criteria, including such measures as functionality, scalability, and security, to commercial software as well, though the system is primarily concerned with open source. It is expected that the large number of responses to a given program will mute any efforts on the part of promoters to tamper with the rating system by flooding the site with positive feedback. "Generally you find accurate results when you've got a large number of people weighing in," says SpikeSource CEO Kim Polese. Polese says the Business Readiness Ratings will provide developers with tools to improve their code. She says, "Open-source developers more than anything want to see their users using what they're writing. This is a way to ensure that." The Business Readiness Ratings should be available by the end of the year, once the project's sponsors have solicited enough feedback to test whether their model works. Polese says feedback from developers and IT managers could change how the model looks when it's released.
    Click Here to View Full Article
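
    Polese's point about large response pools is simple arithmetic: the more genuine ratings a program attracts, the less a fixed number of planted scores can shift its average. A small illustrative sketch with entirely made-up numbers:

      # Why a large response pool blunts ballot stuffing: the same 50 fake top
      # scores shift a small sample far more than a large one.

      def average_rating(genuine_scores, fake_count, fake_score=5.0):
          scores = list(genuine_scores) + [fake_score] * fake_count
          return sum(scores) / len(scores)

      genuine_small = [3.0] * 100        # 100 honest ratings averaging 3.0
      genuine_large = [3.0] * 10_000     # 10,000 honest ratings averaging 3.0

      print(round(average_rating(genuine_small, 50), 2))   # ~3.67: 50 fakes move it a lot
      print(round(average_rating(genuine_large, 50), 2))   # ~3.01: same 50 fakes barely register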

  • "Reddy Awarded 2005 Honda Prize From the Honda Foundation"
    Carnegie Mellon News (08/01/05)

    The Honda Foundation has awarded Carnegie Mellon University professor Raj Reddy the 2005 Honda Prize for his work in computer science and robotics, particularly as it pertains to "Eco-Technology" that is not only efficient and profitable, but also environmentally friendly. Reddy was recently honored as the first recipient of Carnegie Mellon Qatar's Mozah Bint Nasser Chair of Computer Science and Robotics, and was co-chair of the President's Information Technology Advisory Committee from 1999 to 2001. His achievements in artificial intelligence earned him the 1994 ACM A.M. Turing Award, while his work in developing countries secured him a Legion of Honor in 1984. The Honda Foundation cited Reddy's status as founding director of the Carnegie Mellon Robotics Institute and his commitment to accepting and teaching researchers from companies and universities all over the world in an effort to improve the international robotics community. "As a result, robotics has become one of the most promising technological areas for today's industry as well as future society in the sense that it helps create more harmonious relationships between man and nature through the involvement of intelligent machines," declared the foundation. Carnegie Mellon Provost Mark Kamlet pointed to Reddy's current attempts to bridge the digital divide through the PCtvt personal computer and the Million Book Digital Library. Dean of the School of Computer Science Randal Bryant lauded the professor for his dedication to the concept of technology that improves the quality of life while keeping humans' environmental impact to a minimum. Reddy's areas of study include AI, human-computer interaction, and speech and visual recognition by machine.
    Click Here to View Full Article

    To view Raj Reddy's ACM A.M. Turing Award citation, visit
    www.acm.org/awards/turing_citations/reddy.html.

  • "Initial Report Undersold E-Vote Snafus"
    Inside Bay Area (CA) (08/03/05); Hoffman, Ian

    The first mass testing of the new Diebold AccuVote TSx electronic voting systems in the United States last month revealed far more failures than previously reported. During a mock election employing 96 such machines in California, almost one-third of the units had some kind of problem: Nineteen machines suffered 21 screen freezes or system crashes and had to be restarted by Diebold technicians, while 10 units suffered 11 printer jams. This was almost double the number of problematic systems reported in earlier tests, the results of which prompted California Secretary of State Bruce McPherson to authorize the mock election. Seventeen California counties and many other counties in Utah, Ohio, and Mississippi are considering acquiring the TSx, but the California test could put a serious crimp in those plans. McPherson said last week the TSx was too unreliable for him and California voters, while the printer jams alone were enough to spur e-voting critics to gather before the Alameda County administrative offices on Aug. 2 and demand that county supervisors scuttle plans to purchase the Diebold machines. University of Iowa computer science professor and computerized voting systems expert Douglas Jones says the results of the test weren't surprising. Diebold says the problems will be fixed and the machines will be tested again later this month.

    To learn more about ACM's activities regarding e-voting, visit http://www.acm.org/usacm.

  • "Apple Offers a Mouse to Counteract the One-Button Blues"
    New York Times (08/03/05) P. C5; Markoff, John

    With its new Mighty Mouse offering, Apple Computer has changed its mouse form factor to accommodate multiple buttons instead of just one. "The really interesting thing about Apple coming out with a multiple-button mouse now is it shows just how far the goal posts have moved in terms of what constitutes ease of use for an ordinary computer user," said Institute for the Future research director Alex Soojung-Kim Pang. The mouse's inventor, SRI International computer researcher Douglas Engelbart, envisioned the device as having considerably more than three buttons, which is the industry standard. A mouse with 10 buttons would be a far more powerful tool once mastered, he argued. William English, who co-designed the mouse with Engelbart, said the original model had three buttons simply because the wood case crafted by the SRI machine shop had room for three microswitches. Apple's Mighty Mouse is equipped with five sensors rather than discrete buttons: Two sensors read left and right mouse clicks, two more side-mounted sensors serve as a single button when the device is squeezed, and a magnetic sensor tracks the movement of a "squirrel ball" for scrolling. The emergence of multiple-button mice seems to reflect consumers' increased prowess in multiple-button pushing thanks to the explosion in cell phones, game consoles, and other handheld electronic products.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "CMU Online Game Will be Used to Teach Computers to See"
    Pittsburgh Post-Gazette (08/01/05); Spice, Byron

    Carnegie Mellon University researchers are developing online games designed to advance computer technology. One such game is Peekaboom, whose goal is to help train computers to see via the interaction of two players. In Peekaboom, one player receives an image paired with a word that describes the image or a component of the image, while the other player must correctly guess the word; the guesser initially sees no image, but in the course of the game the first player gradually reveals the object by moving his cursor over the image. Gradually showing the object or highlighting images within the larger context of the photo is necessary for teaching computers to see. The images and words used in Peekaboom are taken from another CMU online game, the ESP Game, in which participants generate descriptive labels for images. CMU computer vision researcher Alexei Efros says earlier assumptions that computers could be trained to recognize an object if provided a rough geometry of the object were not very successful. A better strategy is to show the computer many "segmented" images of the object, but acquiring so many images leads to a bottleneck that training games such as Peekaboom could perhaps remove. Peekaboom is the brainchild of CMU computer science doctoral student Luis von Ahn, who is planning follow-up games designed to create explanatory sentences for images and help computers better understand common knowledge that consistently vexes machines.
    Click Here to View Full Article
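
    A rough sketch of the data a Peekaboom-style round yields: the revealing player's cursor strokes, accumulated up to the moment the guesser names the word, outline where the object sits in the image, which is the kind of "segmented" training example the article says computer vision needs. The function names, coordinates, and words below are invented for illustration and are not CMU's implementation.

      # Minimal sketch: strokes revealed before a correct guess give a labeled
      # region (here just a bounding box) usable as vision training data.

      def bounding_box(points):
          xs, ys = zip(*points)
          return (min(xs), min(ys), max(xs), max(ys))

      def play_round(target_word, reveal_strokes, guesses):
          """Return (label, box) once a guess matches the target word, else None."""
          revealed = []
          for stroke, guess in zip(reveal_strokes, guesses):
              revealed.extend(stroke)          # player 1 uncovers more of the image
              if guess == target_word:         # player 2 recognizes the object
                  return target_word, bounding_box(revealed)
          return None

      strokes = [[(120, 80), (130, 85)], [(125, 95), (140, 100)]]
      guesses = ["tree", "dog"]
      print(play_round("dog", strokes, guesses))
      # -> ('dog', (120, 80, 140, 100)): a labeled image region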

  • "'Virtual Clay'"
    Buffalo News (08/01/05); Rivoli, Jonathan

    University at Buffalo engineering professor Thenkurussi Kesavadas' "virtual clay" system integrates sculpture with 3D computer design through an interface that enables users to shape virtual objects on a computer screen by molding an actual lump of clay or Play-Doh using a sensor-equipped glove. Kesavadas embarked on the virtual clay project as a way to address popular computer design packages' inability to harness people's natural artistic skills. The location of the user's hand and the amount of pressure it applies to an object are sent to the computer by the glove's sensors, and the computer translates this data into virtual indentations on the digital object. In addition, the virtual clay system can refine the computer's mimicry of the indentations so that sculptors can create indentations that are impossible in the physical realm. Users can choose their initial on-screen shape from an array of configurations. Virtual clay promises to seamlessly connect digital data and the art of sculpting, and could be of great benefit to the design world, according to Mk Haley, a special effects designer and communications director for ACM's SIGGRAPH. The system is slated for commercial rollout in 2007.
    Click Here to View Full Article

  • "Experts Warn on Cyber-Security"
    National Journal's Technology Daily (07/26/05)

    Purdue University computer-security professor Eugene Spafford and Akamai Technologies chief scientist Tom Leighton painted a bleak picture of U.S. cybersecurity before a panel of congressional staffers at a Tuesday event sponsored by the Congressional Research and Development Caucus. Leighton testified that the chief cybersecurity R&D underwriter, the National Science Foundation, has an annual budget of just $30 million for cybersecurity, and ends up funding only 8 percent of cybersecurity proposals. The recently disbanded President's Information Technology Advisory Committee (PITAC), which both Spafford and Leighton were on, released a report in February estimating that of the $238 million federal agencies spent on cybersecurity research last year, $136 million went to classified programs whose new approaches cannot be transferred to the private sector. "We agree with the PITAC that improved coordination of federal cybersecurity R&D activities is key to increasing the efficiency and effectiveness of the government's investments in this area," said White House science adviser John Marburger at PITAC's April 14 meeting, where he announced the merger of the Interagency Working Group on Critical Information Infrastructure R&D and the Networking and Information Technology Research and Development Program. However, Spafford and Leighton called for regular long-term R&D funding. "It's going to take a very large and significant failure before people understand" the importance of cybersecurity research, Spafford warned.
    Click Here to View Full Article

    Eugene Spafford is chair of ACM's U.S. Public Policy Committee: http://www.acm.org/usacm.

  • "Hackers Annihilate WiFi Record"
    Wired News (08/02/05); Zetter, Kim

    The iFiber-Redwire team won the third annual Wi-Fi shootout contest at last week's DefCon hacker conference by sustaining an unamplified WiFi connection between a laptop on Nevada's 8,500-foot Mount Potosi and another on the 18-foot Utah Hill mound in Utah--a distance of almost 125 miles--for three hours. This is the second consecutive win for the Ohio-based team, which won last year's shootout by sustaining a 55.1-mile connection. Tests showed that an unamplified WiFi signal could be supported across a maximum distance of 300 miles, according to iFiber-Redwire member Ben Corrado. Fellow team member Andy Meng said the signal strength of the 125-mile connection was better than that of the testing phase's 8-mile connection, proving that the signal's limitations were due to terrain interfering with the line of sight rather than equipment. The iFiber-Redwire team originally hoped to beat the 192-mile record made by Alvarion and the Swedish Space Corporation three years ago, but they did succeed in beating an 82-mile WiFi connection set up across the Great Salt Lake in 2003. Conference judges confirmed the distance and connection by using GPS coordinates, and provided secret, encrypted messages to the teams to transmit.
    Click Here to View Full Article

  • "The Fingerprint of Paper"
    Boston Globe (08/01/05); Cook, Gareth

    A new deterrent for forgers and fraudsters could be in the making with British researchers' announcement last week of a relatively cheap method for distinguishing between authentic documents and counterfeits. The technique involves the laser scanning of a document, whose paper, cardboard, or plastic surface contains microscopic imperfections resulting from manufacture that cannot be duplicated; the pattern or "landscape" of these imperfections is recorded by the laser scanner, generating a fingerprint of the document that can be used to verify its authenticity. Imperial College London nanotechnology professor Russell Cowburn says the technology could secure a document in two main ways. One technique involves laser scanning a spot on the document when it is first issued, producing a code that could be stored in a central computer file; the document would be scanned again when someone tries to use it, and compared to the database to confirm that it is genuine. The other technique would have the document's fingerprint numerically encrypted and then printed on the document as a bar code, which would also be stored in a database for later comparison. The only way a forger could circumvent the scanner in both instances would be to break into the database and alter the information. The U.S. Government Printing Office, which is in charge of printing passports, plans to independently test the scanner in the next few months, says CTO Michael Walsh. The method is detailed in the current issue of Nature.
    Click Here to View Full Article
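
    The first technique Cowburn describes is essentially register-then-verify: record a fingerprint of the document's surface at issue time, then later accept the document only if a fresh scan matches the stored fingerprint closely enough (an exact match is unrealistic, since no two scans are identical). A hedged Python sketch of that flow, with made-up fingerprint vectors and a hypothetical similarity threshold:

      # Register-then-verify sketch: the fingerprints here are invented vectors
      # standing in for the scanner's record of microscopic surface texture.

      import math

      def similarity(a, b):
          """Cosine similarity between two fingerprint vectors."""
          dot = sum(x * y for x, y in zip(a, b))
          norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
          return dot / norm

      registry = {}                              # document ID -> fingerprint on record

      def register(doc_id, fingerprint):
          registry[doc_id] = fingerprint

      def verify(doc_id, rescanned, threshold=0.95):
          return similarity(registry[doc_id], rescanned) >= threshold

      register("passport-001", [0.9, 0.1, 0.4, 0.7])
      print(verify("passport-001", [0.88, 0.12, 0.41, 0.69]))   # genuine re-scan -> True
      print(verify("passport-001", [0.1, 0.9, 0.6, 0.2]))       # forged copy -> False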

  • "The Sniffer vs. the Cybercrooks"
    New York Times (07/31/05) P. 3-1; Rivlin, Gary

    As the motivation for hackers shifts from the pursuit of bragging rights to high-stakes economic plundering, many corporations are enlisting the services of sniffers, security analysts who peer through the eyes of a hacker to exploit a system's vulnerabilities in the name of improving its security. A recent survey found that over 87 percent of the companies polled conduct penetration tests, up from 82 percent a year ago; companies in North America spent more than $2 billion on security consulting last year, up 14 percent from 2003, says Gartner analyst Kelly Kavanagh. Sniffers such as independent consultant Mark Seiden often resort to unorthodox techniques to expose a system's vulnerabilities. While he is a former programmer with considerable technical expertise, Seiden may be best known for his innovative methods for gaining access to companies' most sensitive information, such as using disguises to infiltrate restricted places. Once inside, Seiden is an expert at figuring out where a data center is housed, and by blending in, picking locks, and shimmying through air ducts to drop through a ceiling into an otherwise secure room, he has exposed weaknesses in many high-profile companies. The most porous security is most likely to be found in a physical building, where file cabinets with cheap locks and unsecured backup tapes offer a wealth of sensitive information to someone such as Seiden. Though his creativity and uncanny ability to think like a cyber-criminal have kept him in high demand, he acknowledges that "you can't prevent a determined adversary who has unlimited resources from breaching security." But as Gartner analyst Richard Mogull points out, even though 100 percent security will forever be an illusion, sniffers such as Seiden can help companies protect against the vast majority of would-be hackers who "have only rudimentary skills."
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Who Should Run the Internet?"
    Washington Times (08/01/05) P. A11; Zarocostas, John

    In this interview with the Washington Times, ICANN President Paul Twomey reacts to a new report assembled by the United Nations' 40-member Working Group on Internet Governance (WGIG). The authors of the report failed to reach a consensus as to how the Internet should be run, but they did offer four scenarios and a broad definition of Internet governance that will be discussed further at the World Summit on the Information Society (WSIS) this fall in Tunisia. Twomey applauds the WGIG's definition of Internet governance, which goes beyond jurisdiction over domain names. While declining to comment specifically on the WGIG's four proposed governance models, Twomey says he hopes the more than 500 agreements signed between ICANN and various Internet stakeholders, including national governments, will remain intact. "We remain very strongly committed to the way in which the present Internet remains stable," Twomey says. Twomey does not expect the United Nations to attempt to control the technical aspects of the Internet, though he concedes that some will likely argue for "a radical UN new treaty organization to run all this." While he acknowledges that governments should have a role, Twomey warns against politicizing the "core infrastructure" of Internet governance. Twomey says he believes that the new report reflects a more moderate tone from the working group, as opposed to the tone at the end of the first summit in late 2003, which was "very politicized and at the same time very ignorant."
    Click Here to View Full Article

  • "Java Preps Dive Into Real-Time Role"
    EE Times (08/01/05); Lammers, David

    Though still plagued by programming difficulties, real-time Java is making inroads into the mainstream. However, as the language moves to supplant traditional C-based code, it still struggles for legitimacy, as a recent survey found that 9 percent of embedded developers are including a Java Virtual Machine in their projects, and of those, fewer than half reported that their projects will incorporate real-time features. Many of the smaller companies offering Java packages are finding more business coming from the military, as the language's real-time potential is particularly attractive for automating secure, digital communications for soldiers in the field. Sun's JavaOne conference in June and the unveiling of the Java Real-Time System (RTS) have helped drive the recent surge in attention to Java; Sun engineer Greg Bollella said the first iteration of RTS is derived from Solaris 10, and has latencies of 10 to 20 microseconds on dual-processor, Sparc-based machines. Bollella said Sun will next support either the x86 or PowerPC platform. Java RTS supports both real-time and non-real-time threads, offers control over asynchronous events, and supports asynchronous transfer of control. To improve real-time Java, Sun, IBM, and other promoters are developing algorithms to address the garbage collection issues that have posed problems in applications such as military aircraft.
    Click Here to View Full Article

  • "HumanML: The Vision"
    DM Direct (07/29/05); Peltz, Jay

    The Organization for the Advancement of Structured Information Standards (OASIS) is developing the Human Markup Language (HumanML) specification to more formally represent human traits (cultural, physical, psychological, and so on) and augment the fidelity of human communication by delivering machine-readable subtext via the use of XML. The goal of HumanML is to support sharing, collaboration, and relating as the Internet and other digital communications tools make accurate information exchange between culturally and socially disparate peoples a growing priority. HumanML is envisioned as a tool for mining massive volumes of textual and multimedia content by facilitating the inclusion of human-related contextual clues; providing document markup standards that enable the human-computer interface to automatically adjust to system users' cognitive or psychological needs; and supplying standardized markup to support automated handling and management operations that take information security and releasability across all granularity strata into account. Accelerating global interconnectivity is leading to information overload, and HumanML is designed to ease the burden and complement semantic-based technologies such as the Semantic Web that are attempting to assign context and meaning to digital content. Both governmental and non-governmental areas can benefit from HumanML. In the former sector, HumanML can help facilitate interoperability across numerous federal agencies and departments. In the latter sector, HumanML offers a standardized means to relay and establish contextual meaning, giving researchers and others a more refined way to locate the specific information and knowledge they are looking for.
    Click Here to View Full Article
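
    The core idea, conveying machine-readable human context alongside a message, can be illustrated with a few lines of Python that emit an XML fragment. The element names below are invented for illustration and are not the actual OASIS HumanML schema.

      # Hypothetical example of attaching human context (culture, tone, intent)
      # to a message as XML so software can interpret the subtext.

      import xml.etree.ElementTree as ET

      message = ET.Element("message")
      ET.SubElement(message, "text").text = "We should talk about the schedule."

      context = ET.SubElement(message, "humanContext")
      ET.SubElement(context, "culture").text = "en-US"
      ET.SubElement(context, "emotionalTone").text = "concerned"
      ET.SubElement(context, "intent").text = "request-meeting"

      print(ET.tostring(message, encoding="unicode"))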

  • "Healing Power"
    Computerworld (08/01/05) P. 21; Hoffman, Thomas

    Emerging and existing technologies could help the U.S. electric grid become adaptive and more reliable, but cost, security, and other issues are hindering such enhancement. Among the technologies cited by industry experts is software that can predict demand by analyzing patterns of electrical consumption; systems that could be employed during peak demand periods to alert industrial customers that they should halt their use of electricity; and intelligent sensors affixed to transformers and other elements to detect and report malfunctions. More effective and reliable communication between relays and other grid components is essential, and this cannot be achieved without a stable communications backbone. PJM Interconnection's Michael Bryson notes that the incorporation of intelligent sensors makes grids more susceptible to cyberterrorism, while Doug Fitchett with American Electric Power says the superior security offered by fiber-optic technology comes with the drawbacks of high cost and no clear return on investment. Cost is a particularly critical factor in the rollout of new grid technology, and IDG analyst Rick Nicholson says utilities will not invest in self-healing solutions without sufficient incentive. Bryson expects the power industry to migrate to a publish-and-subscribe communications system in the next five to 10 years; such a system offers greater efficiency than IP-based networks. An intelligent power grid is the goal of the Electric Power Research Institute's (EPRI) IntelliGrid Consortium, which envisions the linkage of sensor-equipped grid components to public and private communications networks so that utilities and regional transmission organizations can be rapidly notified about equipment failures or deviations that could cause power interruptions. The sensory data could be captured and processed by "decentralized substations," according to EPRI's Don Von Dollen.
    Click Here to View Full Article
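
    The publish-and-subscribe model Bryson mentions decouples grid sensors from the parties that need their alerts: a sensor publishes a fault event once, and every subscribed operator is notified without the sensor knowing who is listening. A minimal sketch, with a hypothetical in-memory broker and made-up sensor data:

      # Toy publish-and-subscribe broker for grid telemetry; topic names and
      # event fields are invented for illustration.

      from collections import defaultdict

      class Broker:
          def __init__(self):
              self.subscribers = defaultdict(list)   # topic -> list of callbacks

          def subscribe(self, topic, callback):
              self.subscribers[topic].append(callback)

          def publish(self, topic, event):
              for callback in self.subscribers[topic]:
                  callback(event)

      broker = Broker()
      broker.subscribe("transformer/fault", lambda e: print("Utility control room:", e))
      broker.subscribe("transformer/fault", lambda e: print("Regional transmission org:", e))

      # A sensor on a transformer detects an anomaly and publishes once;
      # both subscribers are alerted.
      broker.publish("transformer/fault", {"station": "substation-12", "temp_c": 131})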

  • "Facing an Innovation Deficit"
    Federal Computer Week (08/01/05) Vol. 19, No. 25, P. 18; Sternstein, Aliya

    Some U.S. legislators believe a lack of innovation stemming from declining federal IT R&D budgets threatens to enervate America's economy, and they are organizing a National Conference on Science, Technology, Innovation, and Manufacturing to decide what steps should be taken to remedy this situation. A 2005 study from the American Electronics Association points to a fall-off in federally funded non-defense IT research in the past 15 years, while the proposed fiscal 2006 budget entails cutbacks in the National Science Foundation's IT research funding efforts. The United States will lose its global economic leadership in five to seven years without substantially higher investments in risk-taking IT research, predicts ITAA President Harris Miller, who supports the institution of a permanent R&D tax credit as an incentive for companies to pursue long-term IT research projects. Office of Science and Technology Policy director John Marburger said at an April science forum that the fiscal 2006 budget proposal has been misinterpreted by critics, noting that 5.6 percent of that budget would go toward non-defense R&D, in contrast to the 5 percent average of the last 30 years. Marburger also said the proposal earmarks $71 million in additional funding for the NSF's K-12 Math and Science Partnership Program. Meanwhile, John Palguta with the nonprofit Partnership for Public Service warned that a considerable percentage of federal IT, science, and engineering R&D has been offshored, and federal agencies must address the challenge of ensuring the retention of expertise to coordinate federal research programs and contracts. A lack of federal funding for supercomputing is also a sore point among researchers, despite House lawmakers' insistence that they are committed to re-invigorating this area.
    Click Here to View Full Article

  • "Engineering EverQuest"
    IEEE Spectrum (07/05) Vol. 42, No. 7, P. 34; Kushner, David

    Sony Online Entertainment's spectacularly successful EverQuest online role-playing game, which runs on upwards of 1,500 servers worldwide, exemplifies how a company manages accelerated growth and develops proficiency in scaling up computing technology. The EverQuest universe is made up of dozens of virtual worlds that reside in server clusters of between 20 and 30 dual-processor computers, and individual processors within each cluster are dedicated to generating different geographical details. When a player logs on, the program being used links to servers in the cluster the player's character was last inhabiting, which then downloads the data describing all items within the virtual environment; logging on requires the latest software update, which is distributed via downloadable software patches at two- to four-week intervals. While a patch is being installed, the servers are down so that Sony can update the servers' code and send the software to the players, but the patches do more than fix glitches; they can also provide new content. A major content expansion entails the addition of new dual-processor servers, and Sony recently started using blade servers to support the increase of computing power without exceeding the server farm's capacity. Sony also employs just-in-time computing, which apportions computer resources based on user demand, to more efficiently manage new game content. The biggest problem faced by Sony's team of EverQuest troubleshooters is bugs that can wreak havoc with the game's economy, which has enough overlap with the real-world economy to give cheaters an incentive to forge currency. Computer games are the leading driver of graphics chip design as well as a major motivating force behind new and creative software.
    Click Here to View Full Article

  • "Simulating Ancient Societies"
    Scientific American (07/05) Vol. 293, No. 1, P. 76; Kohler, Timothy A.; Gumerman, George J.; Reynolds, Robert G.

    Archaeologists are using computer simulations to model the forces that shaped the culture and history of ancient societies, one example being the use of agent-based software to simulate how environmental factors might have influenced the Anasazi civilization of the American Southwest. A model of the Long House Valley in Arizona simulated agricultural and settlement patterns to explain why the Anasazi abruptly left the area about seven centuries ago. Environmental data such as rainfall was fed into a digitized map, and a computer program randomly introduced agents representing households and traced their movements as they sought the best land for farming maize. The computer-driven movements of the Anasazi aligned with the archaeological record up to A.D. 1270, but the simulation of later years predicted a depleted Anasazi presence in the valley following a drought, whereas the actual record indicated that the area was completely empty. This led to the conclusion that sociopolitical elements, ideological elements, or environmental forces not included in the model played a role in the Anasazi exodus. A major advantage of computer simulation is its allowance for experimentation: Archaeologists can refine models incrementally to test new variables and see whether they bring the virtual prehistory more into line with the archaeological record. More advanced simulations that take cultural and social factors into account are underway, and the insights gleaned from the earlier models, though simplistic, can be applied to current problems such as the preservation of natural resources.
    Click Here to View Full Article
    (Access to full article is available to paid subscribers only [pay per article also available].)
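
    The modeling approach is straightforward to sketch: household agents claim the best available farm plots each year, and any household that cannot meet a subsistence threshold emigrates. The toy Python model below is only in the spirit of the Long House Valley simulation; its yields, drought timing, and thresholds are invented and are not the published model's parameters.

      # Toy agent-based model: households occupy plots in order of quality and
      # leave the valley when their plot falls below the subsistence need.

      import random

      random.seed(7)
      NEED = 60          # bushels a household must harvest to remain in the valley
      N_CELLS = 40       # farmable map cells

      def plot_yields(drought):
          """Potential maize yield of every cell this year, best-first;
          drought years depress the whole distribution."""
          base = 45 if drought else 100
          return sorted((base * random.uniform(0.5, 1.5) for _ in range(N_CELLS)),
                        reverse=True)

      households = 25
      for year in range(1260, 1301):
          drought = 1276 <= year <= 1299       # stylized stand-in for the Great Drought
          yields = plot_yields(drought)
          # Households take plots in order of quality; those left with a plot
          # below subsistence emigrate, thinning the population through the drought.
          households = sum(1 for i in range(min(households, N_CELLS)) if yields[i] >= NEED)
          if year % 10 == 0:
              print(year, "households remaining:", households)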


 