ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 697:  Wednesday, September 22, 2004

  • "Activists Find More E-Vote Flaws"
    Wired News (09/21/04); Zetter, Kim

    Prominent e-voting critic and activist Bev Harris and a computer scientist claim to have uncovered even more flaws in a Diebold e-voting system that could potentially allow hackers to manipulate votes in the upcoming presidential election. In a demonstration to officials in the California secretary of state's office several weeks ago--one that will be repeated for federal legislative personnel and reporters in Washington, D.C., on Sept. 22--Harris showed that the Global Election Management System (GEMS) software that sums up votes recorded on Diebold machines in polling places is vulnerable. GEMS produces two data tables--one made up of rows showing votes for each candidate registered by voting machine memory cards at each precinct, the other comprised of precinct data summaries--that do not always match. Harris says the vote summaries can be changed without altering the raw data, leaving officials with no way to detect the discrepancies apart from manually counting votes from each polling station and comparing them to the GEMS vote summaries. Despite assurances from Diebold that the two data sets are so closely integrated that any summary changes would be reflected in the precinct data and automatically flagged by the system, Harris states that a script written in Visual Basic can fool the Windows-based system into thinking that the votes have not been altered. She says "any teenager or terrorist with a laptop" could perform this trick thanks to a "hidden program" in GEMS that flags vote data as old or current by marking it with a 0 or a -1. Simply switching the numbers is enough to deceive the system into misreading old data as new data, as Harris demonstrated.
Diebold retorts that such manipulation can be spotted easily thanks to a system of checks and balances within the electoral process, but Lawrence Livermore National Laboratory's David Jefferson remarks that poll workers and election officials do not always follow procedures, adding that the vulnerability stems from ineptitude on Diebold's part rather than deliberate maliciousness.
    Click Here to View Full Article

    For information on ACM's activities involving e-voting, see http://www.acm.org/usacm.
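    The flaw Harris describes amounts to two tables that should always agree but can be edited independently. A minimal sketch of the kind of cross-check that would expose such tampering follows; the table layouts here are invented for illustration and are not Diebold's actual schema.

```python
# Recompute candidate totals from per-precinct raw rows and compare
# them to a separately stored summary table. Any mismatch indicates
# tampering or corruption in one of the two tables.
from collections import Counter

raw_rows = [
    # (precinct, candidate, votes) as reported by each memory card
    ("P1", "Smith", 120), ("P1", "Jones", 98),
    ("P2", "Smith", 310), ("P2", "Jones", 334),
]

summary = {"Smith": 430, "Jones": 440}  # separately stored summary table

def audit(raw_rows, summary):
    recomputed = Counter()
    for _, candidate, votes in raw_rows:
        recomputed[candidate] += votes
    # Report every candidate whose summary differs from the raw total.
    return {c: (recomputed[c], summary[c])
            for c in summary if recomputed[c] != summary[c]}

print(audit(raw_rows, summary))  # {'Jones': (432, 440)}
```

    In this toy data, Jones's summary (440) disagrees with the raw precinct total (432), which is exactly the class of discrepancy Harris says GEMS fails to flag.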

  • "NSF Announces Two Cybersecurity Centers to Study Internet Epidemiology and 'Ecology'"
    National Science Foundation (09/22/04)

    The National Science Foundation (NSF) has awarded funding to 33 new cybersecurity projects, including two cybersecurity research centers that will model IT security threats in terms of ecology and epidemiology. The $30 million Cyber Trust program is the NSF's main cybersecurity effort and addresses a range of issues, including resilient architecture, fundamental cryptography research, multidisciplinary research, and education and workforce training. In the latest round of funding, $6.4 million went to Carnegie Mellon University for the Security Through Interaction Modeling (STIM) center, headed by Mike Reiter: The STIM center will study Internet ecology to understand the interactions between networks, computers, humans, and cyberattacks. Research into these interactions will show how healthy network interactions differ from attacks and the behaviors of different application "species," such as peer-to-peer applications or email. A second center based at the University of California, San Diego (UCSD), will look at how viruses and worms propagate globally and develop early-warning systems and defenses to suppress these outbreaks. The Center for Internet Epidemiology and Defenses will be headed by UCSD's Stefan Savage and Vern Paxson at the University of California at Berkeley. Cyber Trust program director Carl Landwehr says the centers will not only study existing technology and infrastructure, but will develop new knowledge and techniques that will enable more secure systems in the future. In addition to the two centers, Cyber Trust funds went to 12 new team projects and 19 smaller projects; Landwehr says the number of proposals, nearly 400 entries, showed significant interest in cybersecurity research among the academic community.
    Click Here to View Full Article

  • "Fixing a Busted IT Research System"
    CNet (09/21/04); Frauenheim, Ed

    Computing Research Association Chairman James Foley says U.S. national competitiveness is threatened by the lack of federal funding in computer science, difficulties in nurturing new computer scientists, and the increasing numbers of engineers in other countries. The number of computer science and engineering graduate students is declining, along with undergraduate computer science enrollments and graduate science students in general, while countries such as China, Korea, Japan, and Finland are producing more engineering students per capita than the United States. In the longer term, this trend will affect the United States' ability to compete in areas dependent on scientific advancement, such as computers, telecommunications, and medicine. Computing, however, has enjoyed greater independent funding for new Ph.D.s than other scientific fields in the United States, placing those people in faculty positions faster. Foley says giving new Ph.D.s independent projects early on is crucial because that is a period when their creative energy is greatest. The National Science Foundation and the university tenure process, however, are holding computing back from making large, conceptual advances due to their focus on the short term. Overall, the outlook for computer science and engineering education in the United States remains good despite the dot-com bust, the rise of offshoring, and the visa restrictions imposed on foreign students after Sept. 11, 2001. Foley, a Georgia Tech professor, says more immigrants are seeking computer science graduate degrees than Americans because the field has historically been a path toward upward mobility. He expects offshoring to continue to pull away basic programming jobs, while the U.S. computer science profession will focus more on system architecture and system design, which require understanding business and end-user needs.
    Click Here to View Full Article

  • "Reports on Spam Levels Paint Differing Views of the Problem"
    Wall Street Journal (ONLINE) (09/21/04); Bialik, Carl; Creighton, Deborah S.

    Accurately measuring the extent of the spam problem and the effectiveness of strategies to combat it is complicated by inconsistent statistical reports on the volume of junk email, and the fact that the most oft-cited reports are furnished by antispam software vendors. An August estimate by MessageLabs determined that spam constituted 84 percent of all email, while a report from Brightmail indicated 66 percent. Meanwhile, FrontBridge Technologies and Brightmail claim that the spam problem continues to expand, while AOL contends that spam growth has been level for the past 12 months. The antispam companies supplying these reports usually cull their data from email they scan for corporate clients, which may not represent a cross-section of Internet users, though both vendors and certain analysts believe spam-fighting products' mainstream penetration is reducing this sampling partiality. Still, the inconsistency between spam level reports has been a frustrating factor for legislators: For example, spam level estimates accumulated by the Organization for Economic Cooperation and Development (OECD) varied so wildly as to discourage the organization's attempt to evaluate the spread of spam and the performance of countermeasures. "There's not much out there except what's coming from private companies, where the methodology differs and we don't know how it differs," remarks Dimitri Ypsilanti with the OECD. Muddling matters are divergent definitions of spam among antispam companies and nations, while some spam filters operate by amassing reports from users, whose characterization of spam is not always objective. Furthermore, the reported numbers are mean averages that can be distorted by major spam attacks against a few companies.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)
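    The last point is easy to see with a toy calculation: a mean spam rate can be dragged upward by a handful of heavily attacked clients while the typical client's rate barely moves. The figures below are invented for illustration.

```python
# Seven hypothetical clients' spam percentages; the last two are
# absorbing a major spam attack. The mean jumps well above the
# typical value, while the median stays close to it.
daily_spam_pct = [62, 64, 65, 63, 66, 99, 98]

mean = sum(daily_spam_pct) / len(daily_spam_pct)
median = sorted(daily_spam_pct)[len(daily_spam_pct) // 2]

print(round(mean, 1), median)  # 73.9 65
```

    A vendor reporting the mean here would claim roughly 74 percent spam, even though five of the seven clients sit in the low-to-mid 60s.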

  • "Agent Model Yields Leadership"
    Technology Research News (09/29/04); Patch, Kimberly

    Researchers at Los Alamos National Laboratory and two universities have developed a software model for studying economic markets and quantitative sociology, and for optimizing communications among robot collectives. The model is based on the classic minority game, where multiple agents compete to be in the minority of each round of decision-making; by adding a limited social network between the agents, the researchers were able to create a leadership structure that ultimately led to smarter and more adaptive performance than classic models without social network influence. Large and complicated systems such as the stock market are difficult to model because of the number of independent agents and choices available. Computing all possible scenarios is impossible with today's technology, but the researchers' model uses quantitative representations for agent behavior. Los Alamos National Laboratory researcher Zoltan Toroczkai says real human agents actually make decisions inductively rather than through deductive reasoning, as is assumed in classic game theory models; this is because, as with the computer models, figuring out all the possibilities is simply too difficult. The social network influence links each agent to its nearest neighbors and has them rely on the most recently successful agents for advice. Interestingly, the model grew more volatile when denser connectivity was added, since some leader agents' opinions became too popular and destabilized the system. Toroczkai says the software model could eventually help arrays of robots operate in conjunction where no human control is possible, such as on Mars explorations, although this type of technology would not be ready for another 10 to 20 years. The research was funded by the National Science Foundation, the Department of Energy, the Research Corporation, and the Alfred P. Sloan Foundation.
    Click Here to View Full Article
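    The classic minority game underlying the model is simple to state: each round, every agent picks one of two sides, and the agents on the less popular side win. The sketch below implements only this bare game with randomly choosing agents; the researchers' social network and leader-following mechanisms, which are their actual contribution, are omitted.

```python
# Minimal minority game: an odd number of agents guarantees a strict
# minority every round, so by construction fewer than half the agents
# can ever win a round.
import random

random.seed(1)

N_AGENTS, ROUNDS = 101, 200

def play_round():
    choices = [random.randint(0, 1) for _ in range(N_AGENTS)]
    ones = sum(choices)
    minority_side = 1 if ones < N_AGENTS - ones else 0
    # Winners are the agents who picked the minority side.
    return sum(1 for c in choices if c == minority_side)

wins = [play_round() for _ in range(ROUNDS)]
print(sum(wins) / ROUNDS)  # average winners per round, a bit under half
```

    Extensions of this game replace the random choice with strategies conditioned on recent history or, as in the Los Alamos work, on the advice of recently successful neighbors.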

  • "Second Thoughts for a Designer of Software That Aids Conservation"
    New York Times (09/21/04) P. D2; Christensen, Jon

    In the six years since its development, the Marxan computer program created by Dr. Hugh Possingham of the University of Queensland, Australia, and grad student Ian Ball has been employed to structure many environmental conservation and biodiversity plans around the world. Possingham now thinks, however, that simple rules of thumb long cherished by conservationists might be a better option, based on the results of a recent experiment he carried out with Dr. Sandy Andelman of the University of California, Santa Barbara, and MIT's Dr. Eli Meir. Marxan is designed to analyze data about species and their ecosystems and from it extrapolate an optimal plan for the most efficient network of reserves to maintain biodiversity in a region. Using actual species and habitat data from a conservation plan directed by Andelman for the Nature Conservancy in the U.S. Pacific Northwest, Possingham and colleagues compared the results of a strategy produced by Marxan over the span of a decade with those of a plan organized along simpler rules, which were to acquire and protect land that supported the richest biodiversity at the time, incorporated irreplaceable habitat, or hosted species that conservationists wanted to preserve. Marxan operated on the assumption that only a few properties would become available for protection annually and that some would be lost due to development or degradation, and its optimal plan called for the purchase of property only if it was identified as essential in the plan. The simpler rules strategy outperformed Marxan's plan in almost every case, and from these results Andelman concludes that "Given the rate at which the world is changing, unless you can implement the [optimal] plans in a year or so, they're outdated." Although Possingham agrees with this assessment, many conservationists still value the optimal plans that software such as Marxan can provide, citing their usefulness as a fund-raising tool as well as a vehicle for public accountability.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)
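    The "simpler rules" strategy is essentially a greedy heuristic: each year, protect whichever available parcel adds the most currently unprotected biodiversity. A sketch with invented parcels and species (not the Nature Conservancy's data) illustrates the idea.

```python
# Greedy "richest biodiversity first" reserve selection. Each year,
# buy the parcel that adds the most species not yet protected.
parcels = {
    "A": {"owl", "frog", "fern"},
    "B": {"frog", "newt"},
    "C": {"owl", "lynx", "newt", "moss"},
    "D": {"fern"},
}

def greedy_reserve(parcels, years):
    protected, chosen = set(), []
    available = dict(parcels)
    for _ in range(years):  # one purchase per year
        if not available:
            break
        # Rule of thumb: maximize newly protected species.
        best = max(available, key=lambda p: len(available[p] - protected))
        chosen.append(best)
        protected |= available.pop(best)
    return chosen, protected

chosen, protected = greedy_reserve(parcels, years=2)
print(chosen, sorted(protected))  # ['C', 'A'] covers all six species
```

    The appeal of such rules, per the experiment above, is that they act on whatever is available now, whereas an optimal plan can be invalidated while its essential parcels wait to come on the market.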

  • "Scientists Help Needy Regions"
    Daily Californian (09/22/04); Tang-Quan, Sharon

    Richard Newton, dean of UC Berkeley's College of Engineering, says that a prototype "peace corps for technology" initiative has been organized through a joint effort between the Haas School of Business and the Engineering College's Management of Technology (MOT) program. The program was started under the auspices of the Information and Communication Technology for Billions (ICT4B) project, whose goal is to bring affordable information and communications technology to the segment of the world's population that earns less than $2,000 a year, according to Special Assistant to the Chancellor for Science and Technology Tom Kalil. Such technologies must be low-power, low-cost, multilingual, and usable for people with poor literacy. "Used creatively, these technologies can be powerful tools for reducing poverty and expanding access to health care, education, and government services," explains Kalil. Another goal of the ICT4B project is to bridge the chasm between the rich and the poor, with a concentration on open communication and information-sharing between groups. Berkeley computer science professor Eric Brewer notes that the successful deployment of the technology hinges on the participation of nongovernmental groups with insight into local cultures. An alliance between the MOT program and the United Nations allocated funding to teams of students so that they could journey to Africa, Latin America, and Asia to study how new technologies could be employed for indoor lighting, information access, microfinance, clean energy, and cervical cancer prevention.
    Click Here to View Full Article

  • "Congress Tackles Taxing Issues"
    IDG News Service (09/20/04); Gross, Grant

    Technology lobbying groups are upping their pressure on Congress to vote on some significant bills relating to Internet taxation and copyright, among other things. Some 40 organizations, including the Information Technology Association of America (ITAA) and the Association for Competitive Technology (ACT), are against the Inducing Infringements of Copyright (Induce) Act sponsored by Sen. Orrin Hatch (R-Utah), which would let artists and entertainment companies sue companies that market products that "induce" copyright infringement, with a particular focus on peer-to-peer networks that host unauthorized file-trading. ACT President Jonathan Zuck and others argue that sellers of technologies with important legitimate uses could be liable under the Induce Act, while Rep. Rick Boucher (D-Va.) warns that the proposal could give the content community free rein to ban anything that does not meet with their approval. Boucher supports the Digital Media Consumers' Rights Act, which proposes the required labeling of copy-protected compact discs and the revision of the Digital Millennium Copyright Act to clearly stipulate the legality of marketing hardware or software with "significant" uses outside of copyright infringement. Zuck explains that ACT's opposition to the bill is based on the contention that intellectual property and fair use rights are already fairly balanced by the courts. Tech organizations have asked Congress to declare a permanent moratorium on taxes unique to the Internet, as opposed to a four-year extension passed by the Senate in April. ITAA general counsel Joe Tasker notes that state legislatures may consider taxing Internet access in the absence of congressional action. Moratorium proponents believe taxation would inhibit the growth of the Internet in the United States, while opponents claim the ban could mean billions of dollars in lost revenues for states.
    Click Here to View Full Article

  • "Getting Computer Vision Systems to Recognize Reality"
    IST Results (09/21/04)

    The IST program's VAMPIRE project involves testing the theory that Visual Active Memory (VAM) plays an essential role in the cognition and learning processes of computer vision systems. Without VAM processes, objects and behaviors cannot be learned and categorized against a dynamically changing environment, and the VAMPIRE project is unique in its tight linkage of object acquisition and recognition processes, an approach inspired by the workings of the human brain. In other words, the system is designed to learn by gathering knowledge while simultaneously supplying information to the user. VAMPIRE's general memory infrastructure archives a visual event, learns new concepts, and recalls previous events to furnish the required object categorization. Scenarios employed to test the VAM thesis include the use of two static cameras to give an action recognition system visual input within an office environment so that it can develop a consistent interpretation of the scene with the resulting object and action-recognition algorithms. Other experiments involve interactive object learning, the location of augmented reality (AR) users, and scene augmentation based on visually perceived events. The project has also yielded advances in real-time object tracking, contextual reasoning categorization, attention cue utilization, object model acquisition from a small group of example images, and hybrid tracking that combines inertial and visual cues. VAMPIRE's technologies will be demonstrated at the IST 2004 event, which takes place from November 15-17, 2004, in The Hague, The Netherlands.
    Click Here to View Full Article

  • "Policy, Not Technology Creates Barriers to Info Sharing"
    United Press International (09/20/04); Waterman, Shaun

    In a Sept. 17 interview, Karen Evans of the White House's Office of Management and Budget said that policy and practice, rather than technology, were the root cause of federal agencies' inability to share information, which the Sept. 11 Commission cited as a major obstacle to the progress of the United States' war on terrorism. The commission recommended that both Congress and the White House take remedial steps by establishing a decentralized, horizontally integrated "trusted information network" to allow employees in different agencies to search each other's databases. It is the job of the Information Systems Council created by President Bush in August to set up "an interoperable terrorism information-sharing environment" by year's end. Establishing baseline standards for information exchange between intelligence agencies, other federal agencies, and state and local law enforcement, along with policies and personnel practices to comply with those standards, is the responsibility of the CIA director, Attorney General John Ashcroft, and Homeland Security Secretary Tom Ridge. This directive has aroused the concern of privacy and civil-liberties proponents, who believe that the free exchange of information between all federal agencies will erode citizens' privacy, while government use of private-sector data may be an infraction of the Fourth Amendment. ACLU legislative counsel Charlie Mitchell argues that the blockage of info-sharing between federal agencies was erected to set limits for government power, while former CIA official Lee Strickland says the "special needs" of the government could override the amendment's prohibition on unreasonable search, seizure, and trespass by the state. Zoe Baird with the Markle Foundation, which set up a task force in 2003 that recommended the same solution to the info-sharing problem supported by the Sept. 11 Commission, calls for "new checks and balances" to ensure that people are required to explain why they need the data, and how it relates to terrorism.
    Click Here to View Full Article

  • "Alice Chatbot Wins for Third Time"
    BBC News (09/20/04)

    American programmer Richard Wallace's chatbot, Alice, won the Loebner Prize for the most convincing computer program to display human-like conversation for the third time on Sept. 19 in New York. The annual international contest rates entrants according to a modified version of the Turing Test, which suggests that a computer can be considered intelligent if its conversational ability is equivalent to that of a human. Alice functions according to a complex series of rules that regulate the chatbot's responses to a question. British programmer Rollo Carpenter, whose Jabberwacky chat program came in second, notes that Wallace's program "is based around a set of big and complex 'if statements' that analyze the text and respond to the one thing that you have immediately said." He claims that Jabberwacky, by contrast, is characterized by greater freedom and openness. Carpenter believes the era of artificial intelligence that can learn is rapidly approaching. "It is inevitable because a hand-coded system cannot keep up with an exponentially growing system which learns dynamically," he contends. Alice is the recipient of the Loebner Prize's bronze medal and $2,000 cash.
    Click Here to View Full Article
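    A cascade of pattern-matching rules of the kind Carpenter describes can be sketched in a few lines. Alice's actual rules are written in AIML and are vastly more numerous; the three patterns below are invented for illustration.

```python
# Toy rule-based responder: try each pattern in order against the
# user's last utterance, and answer with the first rule that matches.
import re

RULES = [
    (re.compile(r"\bmy name is (\w+)", re.I), "Nice to meet you, {0}."),
    (re.compile(r"\bhow are you\b", re.I),    "I am functioning normally."),
    (re.compile(r"\?$"),                      "That is a difficult question."),
]

def respond(text):
    for pattern, template in RULES:
        m = pattern.search(text)
        if m:
            return template.format(*m.groups())
    return "Tell me more."  # fallback when no rule matches

print(respond("Hello, my name is Ada"))  # Nice to meet you, Ada.
print(respond("What is the weather?"))   # That is a difficult question.
```

    The responder considers only the most recent input, which is exactly the limitation Carpenter points to when contrasting hand-coded systems with ones that learn from accumulated conversation.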

  • "USC Computer Scientist Receives Presidential Early Career Award"
    USC Viterbi School of Engineering (09/09/04); Ainsworth, Diane

    Cyrus Shahabi, research director of information management in the USC Viterbi School of Engineering's Integrated Media Systems Center, has won the 2004 Presidential Early Career Award for Scientists and Engineers (PECASE) for his work in multidimensional databases and related methods for storing and analyzing streaming data. Scientific data analysis, education, and medicine are just some of the fields these technologies can be applied to, while a prototype streaming architecture called Yima has been devised at USC for the purpose of managing multiple concurrent high-bandwidth streams of images and sound, synchronized over the Internet to single-frame accuracy. Shahabi received a five-year, $400,000 Faculty Early Career award from the National Science Foundation last year for research, teaching, and outreach programs in the management of immersive sensor data streams, which ties in to his current project, An Immersidata Measurement System (AIMS), whose chief goal is to overcome the hurdles in managing multidimensional sensor data streams produced in 3D immersive environments. "These are the user interfaces of the future, which will become increasingly popular as the next generation of the Internet--Internet 2--comes online," Shahabi explains. The AIMS project involves the development of methodologies that Shahabi is applying to the design of backend storage and database frameworks for scientific data analysis and education. The data analysis project is being underwritten by Chevron-Texaco's Center for Interactive Smart Oilfield Technologies at the Viterbi School and NASA/JPL, while the education project is focused on the development of a computer-based, interactive classroom. The PECASE awards honor individuals with "talents and potential that are expected to make them leaders in 21st century science and technology."
    Click Here to View Full Article

  • "Ready or Not (and Maybe Not), Electronic Voting Goes National"
    New York Times (09/19/04) P. 1; Zeller Jr., Tom

    Almost one-third of the 150 million-plus registered U.S. voters will use electronic voting systems in the upcoming presidential election, regardless of whether the machines are truly ready for such a wide-scale deployment. The controversy that has erupted over e-voting machines is between advocates who view the technology as a panacea for hanging chads and similar outdated technologies that made the 2000 election such a debacle, and critics who argue that the systems lack the reliability and security needed to guard against election fraud. Allegations by computer scientists, civil libertarians, and local voting rights groups that the machines are riddled with vulnerabilities have provoked accusations of fearmongering from election officials and e-voting system providers. A common proposed solution from activists is the addition of paper ballots, which has been mandated by at least one state (Nevada). "Without an actual paper ballot, we are then left with only the computer's word for the election results," states the Free Congress Foundation. Information Technology Association of America President Harris Miller contends that many e-voting critics' fears are spurred by ignorance, and says that e-voting machine vendors have their systems' source code audited for bugs and hidden programming by independent inspectors; however, critics note that such auditors are usually paid by the vendors, and add that they failed to find major vulnerabilities in e-voting software uncovered by scientists and election officials throughout the country. Experts such as Aviel Rubin of Johns Hopkins University note that it is too late now to switch to a less worrisome voting technology, which at this point would only cause more security problems than it would solve. If the presidential election is close, experts think that the general uncertainty over e-voting could lead to a fiasco comparable to the 2000 election.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

    For information on ACM's activities involving e-voting, see http://www.acm.org/usacm.

  • "Internet Governance Under Spotlight in Geneva"
    Computer Business Review (09/20/04)

    The International Telecommunication Union kicks off a two-day meeting on the United Nations' Working Group on Internet Governance (WGIG) on Sept. 20, and in attendance will be representatives from governments, organizational bodies, and businesses. The meeting will focus on creating the WGIG, which is expected to be composed of a few dozen members of private, public, and civil organizations from an array of countries. The WGIG will eventually come up with a definition for "Internet governance" that will help decide what position the United Nations will take on the issue of governing the Internet. As of Sept. 17, only the governments of the United States, Japan, Canada, and Norway had taken a position on the issue, with Norway calling for "a better balance of influence concerning the present domain name management." ICANN's Governmental Advisory Committee should receive more funding and its scope should be expanded from "a mere counseling role," Norway adds. The United States submitted an opinion that emphasized "supporting continued private sector leadership" and "avoiding overly prescriptive or burdensome regulation." The U.S. submission also argued that "innovation, expanded services, broader participation, and lower prices will arise most easily in a market-driven arena, not in an environment that operates under substantial regulation." Japan and Canada's submissions were similar in nature to the U.S. position, though they noted that governments are inevitably getting more involved with the Internet.
    Click Here to View Full Article

  • "When Will These Tech Wishes Come True?"
    Investor's Business Daily (09/21/04) P. A4; Deagon, Brian

    At a recent meeting of the Media Entertainment Technology Alliance, moderator and kenradio.com host Ken Rutkowski had participants put together a wish list of currently unavailable technologies. Among the desirable products is an instant-on PC, which InterVideo's Mike Ling says is likely to become even more desirable as more and more consumers use PCs as home multimedia entertainment centers; for now, however, PCs take longer to start up as more files are added to the hard drive. Another wished-for technology is swappable hard drives that would allow people to easily transfer content between different hardware, and fleshing out this vision requires extending to households, in an easy-to-use and uncomplicated format, the same cable, optical, and wireless networking that links corporate data. An intelligent cell phone that knows its owner's identity and can update its data based on a user's location was cited at the meeting; Motorola's Rob Shaddock expects such a device to become a reality once broadband improves so that data and graphics can be more easily and quickly exchanged on cell phones. He also notes that batteries will need to be upgraded if more sophisticated cell phone applications--such as the simple transfer of data between phones and support of multiple phone numbers--are to be added. Battery life is also cited by Research In Motion President Mike Lazaridis as a major challenge to the development of handhelds that can project large displays. Consumer Electronics Association staff director Matt Swanston believes a truly universal remote that can control every possible electronic device will eventually emerge, and classifies the "killer app" as a product that even an elderly person can operate without any trouble.

  • "Taking Stock of E-Paper"
    Computerworld (09/20/04) Vol. 32, No. 38, P. 23; Rosencrance, Linda

    Several companies are pursuing the commercialization of electronic paper (e-paper), a flexible polymer sheet that combines the reflectivity of real paper with low power requirements and lightweight batteries thanks to its bi-stable characteristics, which allow images displayed on the sheet to be retained even after the power is cut. E-paper needs no backlighting or emissive light source, and the material contains electronic ink particles that display as either black or white when an electrical current is applied. Analyst Tom Ashley notes that e-paper cannot yet support color, while Information Display Magazine editor Kenneth Werner points out that updating and rewriting a page on e-paper takes too long to make full-motion video workable, at least for now; Ashley, however, is confident that second-generation e-paper technologies will be more affordable, flexible, and versatile. Commercial e-paper products include Gyricon's SmartPaper, which the company has deployed in an e-paper pricing-sign system for retail stores controlled by software that wirelessly connects the system to in-store pricing databases. E Ink, Royal Philips Electronics, and Sony have co-launched an e-ink display in Sony's paperback-sized Librie e-book reader, which currently allows users to download and store 500 digital books of about 250 pages each. On the horizon is E Ink's RadioPaper, a next-generation smart paper that will resemble paper in both appearance and feel, says E Ink's Darren Bischoff. The company expects its e-ink technology to be incorporated into a wide array of devices, such as handheld computers, cell phones, digital watches, calculators, and car dashboards. Meanwhile, Fujitsu Laboratories is working on a paperlike display for production in 2006 that Fujitsu's Isao Hirano says could be used with a terminal for reading business papers downloaded from a PC.
    Click Here to View Full Article
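    The bi-stable behavior described above--an image that persists after the power is cut and changes only when a pixel is actively driven--can be sketched as a toy software model. The class and method names below are invented for illustration and do not correspond to any real e-paper driver or API:

```python
class BistableDisplay:
    """Toy model of a bi-stable e-paper panel (illustrative only)."""

    def __init__(self, width, height):
        # 0 = white, 1 = black; particles rest in their last position.
        self.pixels = [[0] * width for _ in range(height)]
        self.powered = False

    def power_off(self):
        # Unlike an LCD, cutting power does NOT clear the image.
        self.powered = False

    def write(self, x, y, value):
        # Updating a pixel requires actively driving it; this slow
        # rewrite step is why full-motion video is impractical on
        # current e-paper.
        self.powered = True
        self.pixels[y][x] = value

    def read(self, x, y):
        # The reflective image is viewable regardless of power state.
        return self.pixels[y][x]


display = BistableDisplay(4, 4)
display.write(1, 2, 1)           # draw a black pixel
display.power_off()
assert display.read(1, 2) == 1   # image retained with the power cut
```

    The contrast with an emissive display is the point of the sketch: state lives in the "ink" itself, so power is consumed only during writes, not while an image is merely being shown.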

  • "Values of Community Source Development"
    Syllabus (09/04) Vol. 18, No. 1, P. 36; Brooks, Lois

    Stanford University director of academic computing Lois Brooks notes that open source is becoming increasingly pronounced in higher education, and points out a movement toward initiatives in which institutions combine their resources and expertise to develop products that the education community can use. She suggests that "We need to think about open source not as a product or as a way of distributing code, but rather, as a philosophy about how we develop tools in the higher education community." Community source is a new open source model that hybridizes the cathedral development model--defined by isolated teams of experts and top-down decision making--with the bazaar model, characterized by large groups of developers whose collective contribution to a project exceeds their individual efforts; the hybrid promises better ideas, accelerated development, faster debugging, and fewer bottlenecks. Brooks outlines four key components of the community source model: consistency in the goals, responsibilities, and timing of colleagues at other institutions; engagement of commercial partners, whose expertise with numerous campus environments provides solid product support; funding agencies that underwrite development projects and increasingly shape goals, foster collaboration between project partners, and leverage contacts to cultivate community structures; and inclusion of all of higher education and potentially kindergarten through grade 12. The author cites MacKenzie Smith and others who claim that successful community sourcing depends on turning users who begin as consumers into stakeholders. "One significant value of engaging in the community source process is that it teaches us to look more widely, to be more accepting of things 'not invented here,' and to think more strategically about the build/buy/adopt argument," notes Brooks.
    Click Here to View Full Article

  • "The War Room"
    Wired (09/04) Vol. 12, No. 9, P. 150; Silberman, Steve

    The Institute for Creative Technologies (ICT) at the University of Southern California is a convergence point for military experts, visual effects artists, research scientists, and videogame developers, who are busy creating artificial environments that replicate battle conditions with astonishing realism at relatively low costs, for the purpose of training soldiers. Academic studies have demonstrated that people learn faster and retain more knowledge when they are immersed in simulated environments, and the Army launched the ICT to take advantage of this phenomenon. The genesis of the ICT was a 1996 workshop that brought together representatives from Intel, the Defense Advanced Research Projects Agency, a major Hollywood f/x studio, and elsewhere to contemplate "experiential" computing and digital storytelling. TV exec Richard Lindheim says that eliminating the Army's stereotypical views of f/x experts as nerds removed a large obstacle to collaboration. A key ICT deployment is the Joint Fires and Effects Trainer System (JFETS) at Fort Sill in Oklahoma, where troops interact in a synthetic environment generated by wall-mounted flat-panel displays, surround sound, and Windows and Linux software. Essential to the JFETS deployment is the after-action review, in which soldiers engage with artificial senior commanders immediately following a simulated mission to go over their performance. The Fort Sill project is directed by Hollywood f/x specialist Diane Piepol, who put the JFETS system together out of off-the-shelf hardware that enables interoperability between the simulation technology and the military's legacy systems. Combat is only one aspect of the simulations, which also cover diplomatic strategies soldiers must learn and practice in order to successfully negotiate complicated situations in dicey locales with potentially hostile residents. ICT has also developed a desktop PC package for captains-in-training based on real-life situations gleaned from interviews with officers; characters are represented by digital avatars imbued with artificial intelligence, speech-and-text recognition software, and Army doctrine.
    Click Here to View Full Article


 

 