
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 713:  Monday, November 1, 2004

  • "E-Voting Tests Get Failing Grades"
    Wired News (11/01/04); Zetter, Kim

    The certification of secure e-voting systems is riddled with flaws: so-called independent testing labs lack transparency because e-voting machine vendors pay for the tests and authorize who gets to view the results; federal voting system standards are rife with loopholes that permit voting system components to escape testing altogether; and procedures for tracking certified software are pitiful. University of Iowa computer scientist Doug Jones, an early critic of touch-screen voting machine design and encryption, says election officials barely acknowledge the deficiencies of the certification process out of fear that public confidence in elections, and voter turnout, will erode. Lawrence Livermore National Laboratory researcher David Jefferson blames the National Association of State Election Directors (NASED) for failing to institute stricter testing standards that would ensure more transparency, but former chair of NASED's voting systems board Tom Wilkey counters that the federal government's refusal to pay for testing leaves little option but for vendors to foot the bill. Once a system passes testing, it is the state's responsibility to carry out functional tests to ensure that the machines comply with state requirements. But poor communication between labs and state officials makes it impossible to trace a bug back to the lab that certified the system, or to force vendors to rectify the flaw. The most heavily criticized test is the source-code audit: Despite SysTest director of IT operations Carolyn Coggins' assurance that the labs analyze the machines' counting accuracy and painstakingly read the source code to check for compliance with coding conventions, Jones counters that there is greater emphasis on programming etiquette than on secure design. "There's no deep examination of cryptographic protocols to see whether the programmers made the best choices in terms of security," he asserts. The most contentious aspect of the federal voting machine standards is security: the standards do not specify the level of security vendors must provide.
    Click Here to View Full Article

    To read about ACM's activities regarding e-voting, visit http://www.acm.org/usacm.

  • "Hollywood Whistles a High-Tech 'Toon"
    CNet (10/29/04); Olsen, Stefanie

    Hollywood's love affair with computer technology continues to blossom, as evidenced by the recent successful initial public offering of DreamWorks Animation. The popularity of computer animation in feature films is spurring the industry's drive for increasingly sophisticated programs to achieve long-term goals such as realistic rendering of human beings and convincing facial expressions. However, Siggraph advisor and University of Georgia computer science professor Scott Owen observes that the growing sophistication of computer-generated (CG) effects is the reason why render times for computer-animated movies have remained roughly constant despite increasing computer speed. Many computer animation companies are trying to cut rendering times: DreamWorks, for instance, updates its processors constantly so animators can get instant feedback on scene changes, and the studio licensed computing power from Hewlett-Packard for extra rendering brawn in the final three months of "Shrek 2's" production; in addition, studios are transitioning from proprietary operating systems to open-source alternatives such as Linux. Computer animation has allowed the traditional "flip-book" method to be supplanted by software that renders the sequences and movement in between key frames to achieve fluidity. Owen says realistic CG human characters will remain an elusive goal until animators better understand human movement, actions, and perceptions. Because an audience can be repelled by CG human characters that are too realistic, animators try to approximate the complexity of human appearance and emotion while avoiding precise simulation. Another reason animators choose "stylized realism" is audiences' growing unconscious discernment of CG effects in movies. A toy sketch of in-betweening appears below.
    Click Here to View Full Article
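
    The "in-betweening" mentioned above can be made concrete in a few lines. This is only a toy: real systems interpolate with splines over full character rigs, while this sketch linearly interpolates two invented joint angles between two key poses.

```python
def lerp(a, b, t):
    """Linearly interpolate between a and b at parameter t in [0, 1]."""
    return a + (b - a) * t

def inbetween(key_a, key_b, n):
    """Yield n intermediate poses between two key poses (dicts of joint angles)."""
    for i in range(1, n + 1):
        t = i / (n + 1)
        yield {joint: lerp(key_a[joint], key_b[joint], t) for joint in key_a}

# Two invented key poses for a waving arm, angles in degrees.
pose_start = {"shoulder": 10.0, "elbow": 95.0}
pose_end   = {"shoulder": 60.0, "elbow": 30.0}

for frame, pose in enumerate(inbetween(pose_start, pose_end, 3), start=1):
    print(f"frame {frame}: " + ", ".join(f"{j}={v:.1f}" for j, v in pose.items()))
```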

  • "Organised Chaos Gets Robots Going"
    New Scientist (10/27/04); Knight, Will

    An experiment by Japanese researchers suggests that the way biological systems learn to move may resemble the behavior of chaotic systems, which the researchers used to control the movements of a simulated multi-legged robot. University of Tokyo roboticists Yasuo Kuniyoshi and Shinsuke Suzuki postulated that just as the chaotic mathematics underlying the weather can generate clear patterns such as weather fronts, similar systems might give rise to locomotive movement patterns. "We, and animals, seem to be able to work out how to move in different situations without going through thousands of trial-and-error situations like today's robot-control software does," notes Kuniyoshi. Chaotic systems amplify small effects so quickly that their behavior becomes unpredictable beyond a short period of time. The researchers tested their concept by computer-simulating a robot with a dozen legs, each controlled by a chaotic mathematical function; the functions were initially fed 12 randomly selected parameters, after which sensory data from each limb was channeled back into the controlling chaotic function. Kuniyoshi and Suzuki found that blending starting parameters in specific ways caused the legs to quickly adopt "walking-on-the-spot" behavior, though the robot got nowhere; placing a weight at one end of the machine, however, produced a scampering locomotion that the robot could vary to climb over impediments. The machine accomplished these feats without conventional programming, and its behavior emerged faster than it would have if it had been "evolved" using genetic algorithms. A toy simulation in the spirit of this setup appears below.
    Click Here to View Full Article
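
    The control idea is easy to miniaturize in code. The sketch below is in the spirit of the experiment, not the researchers' actual model: each of 12 simulated legs is driven by a logistic map (a textbook chaotic system), a toy "sensor" value is fed back into each map as the article describes, and a constant body tilt stands in for the weight that got the robot moving. All parameter values are assumptions chosen for illustration.

```python
import random

N_LEGS = 12
A = 3.9            # logistic-map parameter in the chaotic regime
COUPLING = 0.15    # strength of the sensory feedback (assumed value)

# Start each leg's controller from a random parameter, per the experiment.
state = [random.random() for _ in range(N_LEGS)]

def sensor(leg_phase, body_tilt):
    """Toy stand-in for a limb's sensory reading (how loaded the leg is).
    In the real experiment this came from the physics simulation."""
    return 0.5 * leg_phase + 0.5 * body_tilt

def step(state, body_tilt):
    new = []
    for x in state:
        x = A * x * (1.0 - x)                      # chaotic update
        s = sensor(x, body_tilt)                   # proprioceptive feedback
        x = (1.0 - COUPLING) * x + COUPLING * s    # fold sensing back in
        new.append(min(max(x, 1e-6), 1.0 - 1e-6))  # keep inside (0, 1)
    return new

# An asymmetric "weight" (a constant tilt) is what let the simulated
# robot move forward rather than walk on the spot.
body_tilt = 0.3
for t in range(50):
    state = step(state, body_tilt)
    gait = "".join("#" if x > 0.5 else "." for x in state)
    print(f"t={t:2d}  legs: {gait}")
```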

  • "Women Techies Make Workplace Gains"
    Small Times (10/26/04); Stuart, Candace

    A confluence of factors is boosting the role of women in the technology field, including corporate awareness, programs targeting girls, and the increasingly interdisciplinary nature of new technological study. Nanotechnology, for example, involves less traditional research and more interdisciplinary skills; individuals who can adapt to this type of work are extremely valuable to their organizations, and companies such as IBM are making significant efforts to attract and retain them. First-time mother and IBM research team leader Frances Ross, for instance, says the flexibility IBM offers will likely mean her microscopy and video recording skills will still be used to help IBM integrate nanoscale components into chip manufacturing processes, and she plans to ease back into her role as leader of a six-person team as her new baby grows. Government statistics show an increasing proportion of women in the sciences, though engineering remains overwhelmingly male-dominated, with only about 20 percent of engineering undergraduate degrees going to women in 2000. IBM and other companies are starting to address what they believe is the root of the problem with science camps and other programs that encourage engineering skills and interests in younger girls. IBM also hosts a networking and leadership conference for women in technology, and has increased its percentage of women in executive roles by 400 percent since 1995. Ross says nanotechnology and other interdisciplinary fields will attract more women, much as biology has become more diverse; in 2000, more women earned biology undergraduate degrees than men. Companies are also seeking to create diverse workforces that can meet customer demands more effectively.
    Click Here to View Full Article

  • "Teammates Train for Code-Crunching Match"
    Albany Times Union (10/29/04); Aaron, Kenneth

    A three-person team of computer science majors at Siena College spent last Friday solving word search problems while the rest of the campus eased into the weekend. The students were practicing for this Saturday's preliminary round of the Association for Computing Machinery (ACM) International Collegiate Programming Contest in Springfield, Mass. IBM sponsors the prestigious programming competition, which sends preliminary winners on to a regional contest, and about 30 regional winners on to the finals, to be held in Shanghai in April. The winning team will receive medals, $10,000, and other prizes. College students appear to be losing interest in crunching code: The number of computer science majors fell 23 percent from the previous year to 17,706 in 2003. Gabby Silberman, program director of the IBM Centers for Advanced Studies and a contest executive, concedes that computer science is not glamorous, but compares coding to composing sheet music. "I think that writing music is programming, in a sense," says Silberman. "It's a formal notation for giving instructions and how to go about doing something."
    Click Here to View Full Article

  • "We're Funny in the Brain"
    Times Online (UK) (10/30/04); Burne, Jerome

    The Oct. 30 Festival of the Art and Mind will feature a presentation of artificial intelligence expert Kim Binsted's Witty Idiomatic Sentence Creation Revealing Ambiguity in Context (Wiscraic), a computer program capable of rudimentary joke-making. "If computers are going to interact with humans via language, they are going to have to do humor," explains Binsted. Neuroscientists are fervently pursuing how the brain processes jokes, which can differ according to the type of joke. For instance, recent MRI analysis of subjects at London's Institute of Neurology showed that anecdotal jokes, puns, and "semantic jokes" follow different neural pathways to the medial ventral prefrontal cortex: semantic jokes go through the temporal lobe, while puns go through the speech-determining Broca's region. Wiscraic is being used as an educational tool for Japanese students learning English: The program produces jokes ("The friendly gardener had thyme for the women") and deconstructs them to clarify the idiomatic use of specific words (in this case, "time"). Binsted reports that students using the program retain more vocabulary and tend to work longer when the occasional joke is generated. She notes that another program, Joke Analysis Production Engine (Jape), is helping improve the socialization skills of children with severe language problems by letting them make puns about any word they select. "It kind of gives them a leg-up into the world of humor," Binsted says. A toy sketch of the homophone-substitution idea follows below.
    Click Here to View Full Article
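
    The pun strategy the article describes (find an idiom word with a homophone tied to a profession, then swap it in and explain the swap) can be sketched in a few lines. The mini-lexicon below is invented for illustration and is not Wiscraic's actual data or algorithm.

```python
# Toy pun generator: the idioms, homophones, and professions here are
# invented examples, not Wiscraic's lexicon.

HOMOPHONES = {
    # idiom word -> (sound-alike word, profession associated with it)
    "time": ("thyme", "gardener"),
    "sole": ("soul", "fishmonger"),
}

IDIOMS = {
    "time": "had {word} for the women",
    "sole": "was the {word} survivor",
}

def make_pun(idiom_word):
    pun_word, profession = HOMOPHONES[idiom_word]
    template = IDIOMS[idiom_word]
    joke = f"The friendly {profession} {template.format(word=pun_word)}."
    # The deconstruction step the article describes for language learners:
    gloss = (f'"{pun_word}" sounds like "{idiom_word}", so the sentence plays '
             f'on the idiom "{template.format(word=idiom_word)}".')
    return joke, gloss

joke, gloss = make_pun("time")
print(joke)    # The friendly gardener had thyme for the women.
print(gloss)
```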

  • "For the Blind, a Welcoming Web"
    Business Week (10/27/04); Lacy, Sarah

    Advocates for the blind are working to entice rather than exhort companies to make their Web sites and services accessible to the visually impaired, in accordance with voluntary guidelines established by the World Wide Web Consortium in 1999. A year earlier, the federal government revised the Rehabilitation Act to mandate compliance with certain basic Web-site accessibility criteria, but most disability advocacy groups think the 14-year-old Americans with Disabilities Act (ADA) should be instituted as the legal standard for Web accessibility; some of the more combative advocates, such as New York Attorney General Eliot Spitzer, have been using the ADA as legal leverage to goad companies into compliance. However, Bradley Hodges of the National Federation of the Blind reports that organizations such as his would prefer to resolve the issue collaboratively rather than involve the courts or the federal government; with such groups consulting with companies, the argument goes, the results are more likely to benefit the visually handicapped. The Nielsen Norman Group's Jakob Nielsen estimates that the ranks of companies with accessible Web sites swell by a mere 4 percent every year. Implementing site accessibility for blind people can be a costly endeavor if a company is not already engaged in a site redesign; Forrester Research pegs a Web site retrofit at roughly $160,000. On the other hand, vendors of Web site usability and accessibility compliance tools, such as Watchfire, are reporting brisk sales. A toy example of the kind of check such tools automate follows below.
    Click Here to View Full Article
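
    One concrete flavor of what compliance tools check: the W3C guidelines call for text alternatives on images, since a screen reader cannot describe a bare <img> tag. The sketch below flags images with no alt attribute using Python's standard HTML parser; it is a toy, not how Watchfire works, and real tools cover far more of the guidelines.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flag <img> tags that lack alt text."""
    def __init__(self):
        super().__init__()
        self.problems = []

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "img" and "alt" not in attributes:
            self.problems.append(attributes.get("src", "<unknown image>"))

page = ('<html><body><img src="logo.gif">'
        '<img src="chart.png" alt="Q3 sales chart"></body></html>')
checker = AltTextChecker()
checker.feed(page)
for src in checker.problems:
    print(f"missing alt text: {src}")   # -> missing alt text: logo.gif
```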

  • "US Expert Drums Up Support for Supercomputing"
    China Business Weekly (10/26/04); Boru, Zhu

    Chinese supercomputing firm Galactic Computing aims to help China establish world-leading supercomputing capabilities using blade technology. Galactic founder Steve Chen recently met with senior Chinese officials from the education, commerce, public security, health, IT, and science and technology ministries to discuss partnerships; the aim is to apply Galactic's supercomputing technology in application development partnerships with leading Chinese academic institutions. "We have selected eight top colleges or research institutions as development partners, each for a specific industrial application," Chen says. A member of the U.S. National Academy of Engineering and the American Academy of Arts and Sciences, Chen was a vice president at supercomputer maker Cray before founding his own IBM-funded firm, Supercomputer Systems, in 1987. In that era it was difficult to import supercomputing technology into China, but China now has the opportunity to develop its own supercomputing technology instead of relying on imported intellectual property. Galactic's third-generation blade supercomputer operates at an average 1-teraflop calculation speed but can be scaled up to 50 teraflops, which would make it one of the fastest supercomputers in the world, according to Chen. He says U.S. companies are still working on blade supercomputing hardware and have not yet begun to write applications that exploit blade supercomputing's cost advantages. Because blade hardware can be added easily, owners do not need to buy completely new systems to increase capacity, and Chen envisions a type of public supercomputing grid infrastructure for China built on his blade technology.
    Click Here to View Full Article

  • "Three Minutes With Ray Kurzweil"
    PCWorld.com (11/01/04); Spring, Tom

    In his book, "Fantastic Voyage: Live Long Enough to Live Forever," futurist Ray Kurzweil describes how a person's lifespan can be radically extended through three processes or "bridges." The first bridge is biochemistry reprogramming, in which a person slows down aging and disease with supplements, intravenous therapies, and other existing techniques so that he can live long enough to benefit from the second bridge, which is the biotechnology revolution. This revolution, which Kurzweil estimates will peak in one to two decades, will involve a reprogramming of fundamental biological information processes through such breakthroughs as computer-designed therapies, implanted biochemical monitors, and artificial organs. Biotechnology will dovetail with the third bridge, nanotechnology, in which microscopic technology--most notably disease-destroying, intelligence-expanding nanobots--will greatly augment a person's physical and mental abilities; Kurzweil expects the second decade of the 21st century to be nanotech's "golden age." In fact, he anticipates that the nanotech-driven nonbiological element of human intelligence will hold sway by the 2030s. The futurist enthuses that nanotech "can also produce radical wealth creation in that we will be able to manufacture essentially any physical product from inexpensive raw materials costing pennies per pound." Kurzweil acknowledges that these technologies have a dark side, but argues that discarding them would only amplify the danger by forcing development underground, where responsible users would lack ready access to the tools required to develop defensive applications. He is confident that, as with computer viruses, nanotech development will be paralleled by the evolution of defensive technologies to keep potentially disastrous consequences in check.
    Click Here to View Full Article

  • "For Math Whizzes, the Election Means a Quadrillion Options"
    Wall Street Journal (10/26/04) P. A1; Forelle, Charles

    This year's close presidential election has many political amateurs with a mathematics bent analyzing state polling data. In the last four years, the rise of blogging, the availability of polling data, and anticipation of a 2000 redux have prompted people such as Lawrence Allen to build probabilistic models and Web sites on which to publish their up-to-date results. Allen, an unemployed computer programmer, used Matlab and 1,700 state polls found online to estimate the chances of either President Bush or Sen. Kerry winning 15 closely contested states and the election: Results taken Oct. 20 put Bush in the lead, though when Allen mixed those raw results with historical data indicating that undecided respondents lean toward the challenger, Kerry came out on top by a wide margin. University of Minnesota economics professor Andrea Moro is using the Monte Carlo method to predict the election outcome, while former Bell Labs physicist John Denker has compiled enormous stores of polling data in an Excel spreadsheet. California State University math lecturer and programmer Matthew Hubbard says incorrect predictions four years ago that President Bush would win the popular vote convinced him that news organizations lack the math expertise to properly examine polling data. Hubbard's computer model parses 16.8 million possible outcomes and, as of Oct. 23, gave Kerry a 73.9 percent chance of winning the electoral college vote, but he says the closeness of the race means even slight changes have a huge impact. Most amateur mathematical analyses consider between 30,000 and 16 million possible outcomes, focusing only on the contested states; voter turnout, polling bias, and decisions on how to gauge undecided voters and which polls to use also influence the results. A minimal Monte Carlo sketch of the approach appears below.
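
    The Monte Carlo approach Moro uses can be sketched briefly: treat each contested state as a weighted coin, tally electoral votes per trial, and count how often a candidate clears 270. The states, probabilities, and "safe" vote split below are invented for illustration, not any modeler's actual inputs. (For scale, exhaustively enumerating 24 two-way state outcomes gives 2^24, roughly 16.8 million, combinations.)

```python
import random

# Toy Monte Carlo election model; all inputs are illustrative assumptions.
BATTLEGROUNDS = {   # state: (electoral votes, P(Kerry wins the state))
    "FL": (27, 0.50), "OH": (20, 0.52), "PA": (21, 0.55),
    "WI": (10, 0.51), "IA": (7, 0.48),  "NM": (5, 0.50),
    "NH": (4, 0.55),  "NV": (5, 0.45),  "MN": (10, 0.54),
}
KERRY_SAFE, BUSH_SAFE = 216, 213   # assumed uncontested electoral votes
TO_WIN = 270                       # majority of the 538 electoral votes

def simulate_once():
    """One simulated election: flip each battleground's weighted coin."""
    kerry = KERRY_SAFE
    for votes, p in BATTLEGROUNDS.values():
        if random.random() < p:
            kerry += votes
    return kerry

def win_probability(trials=100_000):
    wins = sum(simulate_once() >= TO_WIN for _ in range(trials))
    return wins / trials

print(f"Estimated P(Kerry reaches 270 electoral votes): {win_probability():.3f}")
```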

  • "GOP Beats Dems on Tech-Friendliness"
    CNet (10/28/04); McCullagh, Declan

    A compilation of key technology votes in Congress by CNET News.com illustrates the predominance of Republicans in terms of tech-friendliness: Republican senators' average score was 61 percent, compared to Democratic senators' 46 percent, while Republican House members earned a 68 percent collective score to Democrats' 52 percent. Sen. Jim Bunning (R-Ky.) captured the highest score for tech-friendliness in the Senate--86 percent--while the lowest score of 11 percent went to Sen. Ernest Hollings (D-S.C.), who drew fire from software and hardware firms with his proposal to embed anti-copying technology within their products. In the House, Rep. Butch Otter (R-Idaho) scored highest with an 88 percent lifetime rating, while Rep. William Lipinski (D-Ill.) scored lowest with 11 percent. Democratic presidential candidate Sen. John Kerry (D-Mass.) earned a lifetime voting rating of 44 percent, partially due to his position on the Digital Millennium Copyright Act and Internet taxation. The House average for tech-friendliness was 60 percent, while the Senate average was 53 percent; senators were ranked against 10 key votes over the past decade, while representatives were ranked against 12 key votes over the last six years. Will Rodger of the Computer and Communications Industry Association says the scorecard results demonstrate that politicians who promote themselves as tech advocates do not always follow through. Robert Atkinson of the Progressive Policy Institute argues that below-average scores do not necessarily reflect opposition to e-commerce and Internet growth, as the congressmen may support pro-tech measures in appropriations bills, or other proposals that never reach the floor.
    Click Here to View Full Article

  • "Sun, Others Prep Java Integration Spec"
    InternetNews.com (10/27/04); Boulton, Clint

    Java Business Integration (JBI), a scheme for effecting service-oriented architecture (SOA)-based Java software integration, has been published by a group of vendors led by Sun Microsystems. JBI is designed to ease the integration of software from different vendors in order to facilitate more code reuse and portability; the spec will enable integration developers to trim costs by placing a JBI-compliant BPEL engine, an XSLT-based transformation engine, or XPath-based routing services into a single container. More than 22 vendors are pushing the spec, also called Java Specification Request (JSR) 208, through the Java Community Process: Members of the consortium include Novell, Sonic Software, Apache, Oracle, JBoss, and IONA, while Sonic's David Chappell reports that former JSR 208 advocates BEA and IBM recently withdrew their support. He says the reason for this reversal may stem from the prospect of standardized integration, which could result in "an increased choice for customers and a reduced cost, and IBM, being one of the entrenched vendors in this space, probably feels threatened by this trend." ZapThink analyst Jason Bloomberg contends that a great deal of Sun and IBM's rivalry over Java comes from both vendors' claims of leadership in the Java software market. He and fellow ZapThink analyst Ronald Schmelzer caution that though the JBI announcement is positive for the Java community, there could still be considerable user lock-out. "There's nothing here that enables the development of heterogeneous SOAs that are deployed on a wide range of infrastructure types, including Microsoft, mainframe and non-Java environments," warns Schmelzer. He adds that the lack of support from IBM and BEA will hinder JBI's attractiveness. A conceptual sketch of XPath-based routing appears below.
    Click Here to View Full Article
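
    Of the pluggable services named above, XPath-based routing is the simplest to illustrate: a container inspects each XML message against ordered XPath rules and hands it to the first matching service. JSR 208 itself is a Java specification; the sketch below uses Python's standard library only to keep the concept short, and the message format and route table are invented.

```python
import xml.etree.ElementTree as ET

ROUTES = [
    # (XPath test against the message, destination service)
    (".//order[@priority='rush']", "rush-fulfillment"),
    (".//order",                   "standard-fulfillment"),
    (".//invoice",                 "accounts-receivable"),
]

def route(xml_message: str) -> str:
    """Return the destination service for a message; first matching rule wins."""
    doc = ET.fromstring(xml_message)
    for xpath, destination in ROUTES:
        if doc.find(xpath) is not None:
            return destination
    return "dead-letter"

msg = "<envelope><order priority='rush' id='42'/></envelope>"
print(route(msg))   # -> rush-fulfillment
```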

  • "Nuke Agency Develops Linux Tools"
    Federal Computer Week (10/25/04); Sarkar, Dibya

    The Energy Department is giving Linux a boost by funding open-source performance analysis tools. The agency's National Nuclear Security Administration is helping to fund a $6.8 million project in which Silicon Graphics will develop an open-source version of its commercial SpeedShop tool in cooperation with the University of Maryland and the University of Wisconsin. Open/SpeedShop, which will be available by summer 2006, will help users analyze the performance of applications and tasks, eliminate bottlenecks and bugs, and enhance overall application performance; Silicon Graphics will also offer a Pro version with added features. "To me, the idea that an agency could give a vendor incentive to move something to the open-source world, that's pretty powerful," says Steve Reinhardt, principal engineer at Silicon Graphics, who adds that the company would not have developed the tool on its own because it would have been too expensive. The tools will help speed up Linux-based research at government labs, universities, and other institutions. A small illustration of the profiling workflow such tools support appears below.
    Click Here to View Full Article
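
    Open/SpeedShop targets parallel codes on Linux and is far more capable than anything shown here, but the basic workflow performance tools support (run the program under a profiler, then rank the hotspots) can be demonstrated with Python's built-in profiler as a stand-in:

```python
import cProfile
import pstats

def slow_sum(n):
    """Deliberately naive workload so the profiler has a hotspot to find."""
    return sum(i * i for i in range(n))

def workload():
    for _ in range(50):
        slow_sum(100_000)

# Profile the run, save the raw stats, then rank functions by cumulative time.
cProfile.run("workload()", "profile.out")
stats = pstats.Stats("profile.out")
stats.sort_stats("cumulative").print_stats(5)   # top five bottleneck candidates
```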

  • "Key Internet Domain Up for Grabs"
    TechNews.com (10/25/04); McGuire, David

    ICANN is preparing to decide what company should take over management of the .net Internet domain--the world's fourth-largest--when VeriSign's contract expires next year. With a contract worth $30 million a year and 4.9 million Internet addresses registered under .net, there is much at stake in the battle over the domain, particularly since the online operations of many key Internet users, including service providers, e-commerce enterprises, and site operators, depend on the stability of .net. Tom Galvin, VeriSign's vice president for government relations, is calling ICANN's choice of a new .net operator "the most important decision ICANN has ever had to make." The company has the support of several large tech firms, including Microsoft, for its re-bid for operation of .net, but VeriSign's critics and rivals, including Dublin-based Afilias, are accusing it of using scare tactics to garner such support. Ram Mohan of Afilias notes that his company was able to replace VeriSign as operator of .org last year without causing any major disruptions to Internet activity within that domain. Internet Systems Consortium President Paul Vixie adds that changing power over .net from VeriSign to a new operator would help ensure "that we don't have all of our eggs in one basket." VeriSign hopes that its recent struggles with ICANN, including a lawsuit it filed in an attempt to reduce the group's power, will not hinder its bid for .net. ICANN general counsel John Jeffrey says a neutral, as-yet-to-be-named third party will be appointed to oversee the process.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Building Smarter Storage"
    InfoWorld (10/25/04) Vol. 26, No. 43, P. 47; Erlanger, Leon

    Many fiscally tight companies apportion high-performance storage, storage area network (SAN) bandwidth, and services to essential and trivial applications in equal measure, a costly practice that often leads to reckless overprovisioning. Enterprises that deploy automated storage policies and procedures based on each individual application's requirements can reap cost savings, improve regulatory compliance, guarantee the availability and disaster recovery of mission-critical information, and establish a platform for information lifecycle management. Hitachi Data Systems CTO Hubert Yoshida argues that "The time has come to take a look [at storage] from the application down," a philosophy that many storage management and hardware vendors have adopted. Application-awareness solutions are diverse and follow a number of alternate strategies; selecting the most suitable one depends on the enterprise's desired level of automation and standardization as well as the kinds of services the company wishes to support. The fragmented application-awareness market is typical of any new technology in an early developmental phase. An application-based SAN scheme may allocate a certain level of service to mission-critical applications and channel less critical information through other storage tiers, while a metadata server keeps tabs on the data's location; a minimal sketch of such a policy table follows below. Most analysts and vendors concur that employing application-aware storage throughout a heterogeneous SAN and the entire data lifecycle is the most effective approach. Standardization is an important first step, but locking down policies and service levels will entail planning, consulting, and collaboration among application owners, storage managers, and business units whose processes are affected.
    Click Here to View Full Article
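
    The policy idea reduces to a small mapping: classify each application, and let the class (rather than a one-size-fits-all default) drive tier, replication, and backup choices. The class names and service levels below are invented for illustration.

```python
TIERS = {
    # class: (storage tier, replicas, backup interval in hours)
    "mission-critical": ("tier-1 FC SAN",   3, 1),
    "business":         ("tier-2 SATA",     2, 24),
    "archival":         ("tier-3 nearline", 1, 168),
}

APPLICATIONS = {
    "order-processing": "mission-critical",
    "email":            "business",
    "old-projects":     "archival",
}

def provision(app: str):
    """Look up the application's class and print its storage service levels."""
    tier, replicas, backup_h = TIERS[APPLICATIONS[app]]
    print(f"{app:16s} -> {tier:15s} replicas={replicas} backup every {backup_h}h")

for app in APPLICATIONS:
    provision(app)
```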

  • "Degrees of Change"
    CIO (10/15/04) Vol. 18, No. 2, P. 54; Datz, Todd

    A decline in college IT enrollments spells trouble for CIOs in need of staff who can deftly balance technical and business issues to keep the IT department effective, efficient, and competitive with outsourcers. Steps colleges and IT schools are taking to reverse this trend include overhauling computer science curricula to emphasize business skills and working to dispel stereotypical perceptions of the IT field as socially isolating. For instance, Boston University's School of Management offers a program in which graduates earn both an MBA and a Master of Science in Information Systems, while the University of California-Irvine's Graduate School of Management is developing a joint computer science/business MS. Georgia-Pacific VP James Dallas remarks that his company prioritizes knowledge of business drivers, communication, business processes, project management, presentation, and innovation over IT competence, noting that "I find it easier to take someone with a business major and an appreciation of technology, then teach them IT, than take someone with tech skills and try to teach them business." McKesson chief technology strategist Michael Keselman says there is healthy demand for people skilled in the development and maintenance of complex systems, but adds that computer science grads are surprisingly unprepared for such jobs because their college training glosses over networks and databases in favor of basic algorithms, object orientation, and application programming languages. Furthermore, retooling curricula and creating new courses and degree programs is a slow process that involves hiring new faculty, securing funding, winning over bureaucracies, and justifying the changes with research. John King of the University of Michigan's School of Information reports that business schools are de-emphasizing IT because so few professors have enthusiasm for the subject.
    Click Here to View Full Article

  • "Reading Between the Lines"
    EDN Magazine (10/14/04) Vol. 49, No. 21, P. 48; Dipert, Brian

    Radio-frequency identification (RFID) technology has drawn interest from governments seeking ways to monitor the activities of residents and citizens, and from businesses looking for more automated, less expensive supply-chain management. One advantage of RFID over bar codes is that individual RFID tags can carry unique identifying data sequences; furthermore, RFID tags, unlike bar codes, do not require line-of-sight laser scanning to be read. Widespread adoption of RFID is currently hampered by competing and incompatible power techniques, encryption and modulation schemes, and frequencies, but these differences should be resolved eventually. An RFID tag's built-in storage capacity is partly determined by whether the RFID reader supports an Internet connection, an intranet connection, or both. Making RFID pervasive will require cooperation between the two primary groups responsible for RFID standardization, the International Organization for Standardization and EPCglobal. The potential market growth of RFID tags could be far outstripped by the opportunities for vendors of software and hardware that stores, transfers, and transforms RFID information; a small sketch of that middleware role appears below. Because privacy issues are such an important factor in RFID adoption, the industry will need to promote consumer awareness, private-sector caution, and legislative oversight if it wishes to thrive. RFID vendors can differentiate their products along a number of dimensions, including size, power consumption, communication strength, and, most importantly, cost.
    Click Here to View Full Article
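
    The software opportunity mentioned above starts with a mundane fact: readers emit a noisy, highly redundant stream of (reader, tag) events, and middleware must condense that stream into inventory facts. The sketch below, with an invented event format, collapses a read stream into each tag's latest location.

```python
from collections import defaultdict

raw_reads = [
    # (reader location, tag id) -- duplicate reads are normal, not errors
    ("dock-door-1", "urn:epc:tag:0xA1"),
    ("dock-door-1", "urn:epc:tag:0xA1"),
    ("dock-door-1", "urn:epc:tag:0xB7"),
    ("shelf-12",    "urn:epc:tag:0xA1"),   # tag later seen deeper in the store
    ("shelf-12",    "urn:epc:tag:0xC3"),
]

def latest_locations(reads):
    """Collapse the read stream to each tag's most recent sighting."""
    seen = {}
    for location, tag in reads:
        seen[tag] = location        # later reads overwrite earlier ones
    return seen

inventory = defaultdict(set)
for tag, location in latest_locations(raw_reads).items():
    inventory[location].add(tag)

for location, tags in sorted(inventory.items()):
    print(f"{location}: {len(tags)} unique tag(s) -> {sorted(tags)}")
```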

  • "The Perils of Polling"
    IEEE Spectrum (10/04) Vol. 41, No. 10, P. 34; Cherry, Steven

    Over one-quarter of registered U.S. voters will cast their ballots with direct recording electronic (DRE) systems in the presidential election, despite the technology's lack of security, reliability, and auditability. The e-voting quagmire is the result of a number of unfortunate situations, including conscious decisions by overeager vendors to deploy systems without sufficient testing (or any testing in some cases); a formidable system-design challenge that is much tougher than most people assume; misplaced confidence that state and local officials will make fair and balanced efforts to choose and deploy optimal voting solutions; and state-level oversight of elections giving rise to a crazy quilt of voting regulations and policies. Most DRE systems are paperless, which makes independent recounts and verifiability impossible, and many critics have proposed the introduction of paper ballots. For now though, there appears to be a growing consensus to view verified-voting paper trails as an option rather than a requirement. The Help America Vote Act (HAVA) of 2002 promised $30 million to the National Institute of Standards and Technology (NIST) to develop more exacting election system analysis criteria, but this money has not been allocated; nor did the legislation specifically stipulate that election officials have to comply with any new HAVA regulations in order to receive funding for equipment purchases. Meanwhile, voting equipment certification is not mandatory, and critics think the testing process is untrustworthy because the companies authorized by the Federal Election Commission to evaluate the equipment are paid by vendors. Technologists and academics' advice to address e-voting security and reliability problems has often been rejected by the election community, which repeatedly argues that disclosing technical details about the equipment would compromise its security.
    Click Here to View Full Article


 