
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 598:  Monday, January 26, 2004

  • "Risky E-Vote System to Expand"
    Wired News (01/26/04); Zetter, Kim

    The U.S. government has elected to proceed with the deployment of the Secure Electronic Registration and Voting Experiment (SERVE) system for use in the November presidential election, despite a research panel's report that the system is too vulnerable to tampering or intrusion. SERVE, designed as a way for Americans stationed overseas to register online and vote in absentia via their PCs, was criticized by the panel as susceptible to viruses, worms, spoofing attacks, and denial-of-service attacks. Johns Hopkins University researcher and report co-author Avi Rubin argues that SERVE's security flaws cannot be solved because they are also flaws inherent to PCs and the Internet, but SERVE manager Carol Paquette insists that the system has safeguards to keep such dangers in check. She promises that the workplace computers people use to vote on SERVE will be fortified with firewalls and other intrusion countermeasures, and adds that election officials will recommend that home users install antivirus software on their PCs and run virus checks prior to election day. Rubin counters that antivirus software can only identify known viruses, and thus is ineffective against new e-voting malware; furthermore, attacks could go undetected because SERVE lacks voter verifiability. Rubin and the three other researchers who furnished the report were part of a 10-member expert panel enlisted by the Federal Voting Assistance Program (FVAP) to assess SERVE. Paquette reports that of the six remaining FVAP panel members, five recommended that the SERVE trial proceed, and one made no comment. Lawrence Livermore National Laboratory researcher and FVAP report co-author David Jefferson is worried that if the November SERVE trial goes off without any noticeable hitches, then organizers will expand the system under the false assumption that it is secure.
    Click Here to View Full Article

    To learn more about ACM's activities involving e-voting, visit http://www.acm.org/usacm/Issues/EVoting.htm.
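
    Rubin's point that antivirus software can only identify known viruses comes down to signature matching; a minimal Python sketch illustrates it (the signatures and payloads below are hypothetical, not any real product's detection database):

        # Minimal sketch of signature-based scanning: only byte patterns already in
        # the signature database are flagged, so brand-new e-voting malware passes.
        KNOWN_SIGNATURES = {
            "OLD-WORM":  b"\xde\xad\xbe\xef",       # hypothetical signatures
            "OLD-VIRUS": b"MELISSA-LIKE-PATTERN",
        }

        def scan(payload):
            """Return the names of known signatures found in the payload."""
            return [name for name, sig in KNOWN_SIGNATURES.items() if sig in payload]

        print(scan(b"\xde\xad\xbe\xef plus other bytes"))   # ['OLD-WORM'] -- known threat caught
        print(scan(b"never-seen-before voting trojan"))     # [] -- unknown threat goes undetected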

  • "Small Robotic Devices Fly Like Birds"
    Newswise (01/23/04)

    Sunil K. Agrawal of the University of Delaware is designing and building flying robots inspired by the hummingbird and the hawkmoth, whose potential applications range from military surveillance to industrial maintenance to enhanced law enforcement and search and rescue operations. Agrawal says the hummingbird is a particularly valuable biological model because of its ability to hover, which would be a critical feature in effective surveillance. An early version of the robotic bird was constructed out of balsa wood and sported paper wings that flapped under the power of rubber band engines; a later design featured battery-powered wings and drew the attention of real birds when it flew, Agrawal notes. The robot bird's current design has Mylar wings and a body made from carbon fiber composite, which reduces its weight dramatically and makes the frame less fragile. Agrawal says the research team's current efforts focus on minimizing the robot's mass and power requirements, while later initiatives will seek to make the machine small enough to fit in the palm of a hand as well as integrate flight controls. "We want to demonstrate that the flapping wing machines can be built and optimized and, eventually, we would like to expand from a single flying machine to a group of cooperative flying machines," he adds. Agrawal says his team intends to study new robot designs in a wind tunnel to collect information on force and torque, which will be used to anticipate how to improve and control the machines' movement and then polish future designs via computer modeling. The National Science Foundation, the National Institutes of Health, the U.S. Air Force, and the National Institute of Standards and Technology are underwriting Agrawal's research.
    Click Here to View Full Article

  • "Memory: Beyond Flash and DRAM"
    Business Week (01/21/04); Kharif, Olga

    Despite not catching as many headlines as computer processors, memory chips are set for a renaissance as demand for faster, cheaper, and denser memory increases rapidly. The markets for traditional flash and DRAM memory are set to grow by about 40 percent this year, according to IC Insights, but the performance advances of these chips still lag other computer components and degrade overall functionality. Device manufacturers have to choose between power-hungry DRAMs that draw electrical charge to retain stored data and more expensive but slower flash memory that retains stored data when turned off. Improvements on DRAM and flash, such as the DDR II interface, stacked architecture, and better insulators, will not boost those memory types enough to satisfy market demand. Established technology vendors and a host of startup companies are working on new technologies that will combine the best qualities of DRAM and flash while remaining price-competitive: Among the most developed contenders are ferro-electric RAM, magnetic RAM (MRAM), and phase-change RAM (P-RAM). Ferro-electric RAM uses exotic materials that seal electrons in so well as to preserve data even when current is turned off, but still needs higher capacities and lower costs to compete; MRAM uses magnetic forces rather than electric charge to store data and may enable instant-on PCs in about four years; P-RAM, or ovonic memory, keeps data ready for quick access and stores data up to 100 times faster than flash memory. Startup ZettaCore claims to be readying prototype molecular memory that could reduce memory manufacturing costs to just one-fifth of DRAM costs because molecules can be sprayed onto base material, avoiding several costly silicon manufacturing steps. IBM and other firms are working on organic memory techniques that could turn simple media such as business cards into memory storage to be read by special reader devices, though their organic nature would make them less robust than inorganic-based memory.
    Click Here to View Full Article

  • "Linux's 'Center of Gravity'"
    CNet (01/21/04); Shankland, Stephen

    Open Source Development Labs (OSDL) CEO Stuart Cohen declares that his organization's goal "is to become the center of gravity for the Linux industry," and a focal point for IT vendors, the development community, and users. He says the expansion of OSDL to include corporate customers was implemented with the establishment of a customer advisory council in the United States, while similar councils are planned for Europe and Asia; Cohen notes that the purpose of the councils is to give corporate users a voice so that technical workgroups and marketing workgroups are in sync. The CEO says that OSDL will soon announce the inclusion of several governments as members, though Linux's appeal to government clients varies around the globe: Total cost of ownership appears to be a major consideration in the United States, while Europe seems more interested in where Linux originates, and Japan is concentrating on software imports versus exports. Cohen promises that OSDL will detail its desktop initiative at LinuxWorld this month, and notes that its definition of desktop covers not just calendaring and email, but also client/server applications, branch office, point-of-sale terminals, the help desk, the IT and engineering departments, and grid computing. Cohen says OSDL's customer council was the instigator for the formation of a legal defense fund to protect Linux users against threats of copyright infringement lawsuits by the SCO Group, so that users are more comfortable continuing or beginning to implement Linux solutions. Criteria for qualifying recipients are now being finalized, he adds. Cohen dismisses SCO's charge that Linux's so-called code infringement is attributable to shoddy code vetting, arguing that every step of Linux code development is viewed by all members of the development community. He points out that OSDL has no reason to believe the infringement accusation has any merit, given that SCO has refused to make the offending code available.
    Click Here to View Full Article

  • "Timers to Shrink Microchips, Cell Phones"
    United Press International (01/21/04); Choi, Charles

    National Institute of Standards and Technology (NIST) researchers have invented microscopic electronic clocks that could replace larger, more expensive timers in microchips, leading to significant reductions in price as well as size. NIST physicist William Rippard speculates that such an innovation "could lead to shrinking of components in computers, leading to faster and more powerful computing, as well as cell phones, things like global positioning system units, as well as broadband Internet and wireless connectivity." Computers, cell phones, and other electronic devices currently employ millimeter-scale oscillators of quartz, ceramic, or similar materials to stay synchronized; in addition, the crystals are mated to the microchips, not integrated into them. Last year, a team led by Cornell University physicist Dan Ralph developed 100-nm-wide "nano-pillars" formed from sandwiched layers of magnetic and non-magnetic metals, while the injection of electrons into the pillars induced fluctuation of their magnetic fields. The problem was that the oscillations' numerous frequencies made stable synchronization impossible; the NIST team therefore developed a timer that generates a pure tone similar to the sound emitted by a tuning fork. Rippard believes the NIST oscillator is more robust than the Cornell oscillator because it boasts a 40-nm-wide electrical contact atop a larger magnetic layer. Another advantage over the Cornell device is a lack of electron leakage, which NIST researchers believe is responsible for better performance. The NIST oscillator could also function as a transmitter because its oscillation frequency can be tuned within the 5 GHz to 40 GHz range, which encompasses automotive collision-avoidance radar and high-speed Internet communications.
    Click Here to View Full Article

  • "Fort N.O.C.'s"
    MSNBC (01/20/04); Meeks, Brock N.

    VeriSign operates the "A" root server somewhere in Virginia at the end of a small highway, in one of many nondescript mini-office parks in the Washington, D.C. area. Invited visitors to the site see no markings or other signs that would indicate the role of the four-story building, but inside, an electronic badge is required just to access the reception area, while access to the elevator requires signing several papers basically amounting to non-disclosure agreements. VeriSign's "A" root server acts as the central address book for the worldwide network of 13 Internet root servers, 11 of the others being operated by a variety of academic and corporate entities on a volunteer basis. Most root servers are protected by similar "security through obscurity," says Internet expert Christopher Ambler, who leads a tour through the VeriSign facility and explains that corporate and government officials have become much more aware of the role of the Internet in global commerce and even in critical national infrastructure in the past two years. VeriSign operates another root server many miles away in the same region, and protects both sites with an investment of $150 million. VeriSign network security vice president Ken Silva, a 20-year veteran of the National Security Agency, says his company spends more money than any other root server operator to protect the machines, though that money is primarily meant to secure the company's other Internet operations, such as its .com and .net registry operations. At the "A" root server building, that security investment is seen in the layered security needed to access levels three and four, and any of the dozen doors on those floors. The Network Operations Center, the actual nerve center where experts monitor Internet health in real time, is protected by a double-doored hallway requiring two handprint scans. CNN and CNN Headline run on two of the 13 monitors while another displays root server loads from all over the world--the same screen officials at the Department of Homeland Security monitor.
    Click Here to View Full Article

  • "Online Reference to Reach Milestone"
    SiliconValley.com (01/25/04); Gillmor, Dan

    The posting of the 200,000th article to the Wikipedia online encyclopedia in the coming days or weeks will be a turning point in the life of the project, which is supported by a grass-roots volunteer community whose population expands daily. Nearly anyone can contribute or edit material on Wikipedia, since banning a user's access to the site is reserved for only extreme cases of abuse. In its three or so years of existence, Wikipedia has proved itself remarkably resilient against cyber-vandalism, article content conflicts, and underdeveloped articles. Most Wikipedia articles maintain a neutral tone, while facts are supplemented with multiple perspectives when the subjects of the articles are controversial: "The only way you can write something that survives is that someone who's your diametrical opposite can agree with it," notes Wikipedia founder Jimmy Wales. At the heart of Wikipedia is "Wiki" software that permits all users to edit all posted pages and makes each editorial change visible to anyone. Wales estimates that roughly 200 users contribute to Wikipedia on a more or less daily basis; about 1,200 work on the site regularly, while tens of thousands more contribute infrequently. Vandalism is discouraged by the many volunteers who repair every blemish to the Wiki within minutes. Wikis are also starting to form the core of companies and are being implemented as collaboration and planning tools by private firms.
    Click Here to View Full Article
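
    The "Wiki" model described above, in which anyone can edit a page and every change stays visible and reversible, can be illustrated with a toy revision history in Python (the field names and revert policy below are illustrative assumptions, not Wikipedia's actual software):

        # Toy wiki page: every edit is appended to the history, so any change --
        # including vandalism -- remains visible and can be reverted in minutes.
        class WikiPage:
            def __init__(self, title, text, author):
                self.title = title
                self.history = [(author, text)]

            def edit(self, text, author):
                self.history.append((author, text))

            def revert(self, author):
                # Restore the previous revision by appending it as a new edit.
                _, previous = self.history[-2]
                self.edit(previous, author)

            def current(self):
                return self.history[-1][1]

        page = WikiPage("Neutral point of view", "Articles should be written neutrally.", "alice")
        page.edit("ALL CAPS VANDALISM", "anonymous")
        page.revert("bob")                   # a volunteer repairs the blemish
        print(page.current())                # original text restored
        print(len(page.history))             # 3 -- every revision is still recorded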

  • "Spam Law Generates Confusion"
    Wired News (01/26/04); Ulbrich, Chris

    Email marketers at the Jan. 22 Spam and the Law Conference raised the issue that provisions of the Can-Spam Act, in effect since the first of the month, remain vague, underscoring their worries about potential vulnerability to lawsuits by ISPs and government agencies. Institute for Spam and Internet Public Policy President Anne Mitchell reported that many people who lack legal training are interpreting Can-Spam in their own way, which has led to widespread confusion: "There is a lot of fear, and a lot of that is fear of the unknown," she commented. Can-Spam threatens substantial punishments for spammers who use false email headers, commandeer computers, or refuse to honor recipients' opt-out requests--nor are spammers' clients necessarily exempt from these penalties. Among the provisions that had attendees confused was the requirement that advertisers maintain a list of users who have opted out of receiving email and include their postal address in each email sent on their behalf; one question raised at the conference was how this applies to emails carrying multiple ads. ISP and antispam software vendor representatives attending the conference noted that they are exercising more caution in their email campaigns with Can-Spam in effect, but ePrivacy Group's Ray Everett-Church posited that the law is of little concern to mainstream email marketers. "I think the reality is that most companies who are engaged in email marketing are not going to be deeply affected by [Can-Spam], because that law is geared towards dealing with abusive and deceptive practices, most of which legitimate companies are wise enough to avoid," he pointed out. Verizon Online general counsel Thomas Dailey opined that third-party bulk emailers are especially at risk, while Wilson Sonsini Goodrich & Rosati attorney Dave Kramer doubted that Can-Spam would make ISPs any more effective against spam, arguing that they were granted no additional powers or authority under the law.
    Click Here to View Full Article
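
    A minimal Python sketch of the kind of pre-send check the opt-out and postal-address provisions call for (the addresses and rules below are illustrative only, not legal guidance or any vendor's implementation):

        # Illustrative pre-send check: honor opt-out requests and require the
        # advertiser's postal address to appear in every message sent.
        suppression_list = {"optout@example.com"}        # recipients who opted out

        def may_send(recipient, body, postal_address):
            if recipient.lower() in suppression_list:    # opted out: do not mail
                return False
            if postal_address not in body:               # missing postal address
                return False
            return True

        body = "Big sale this week!\n\nAcme Corp, 123 Main St, Springfield"
        print(may_send("optout@example.com", body, "123 Main St, Springfield"))    # False
        print(may_send("customer@example.com", body, "123 Main St, Springfield"))  # True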

  • "IST Labs Project Images of the Future"
    Centre Daily Times (PA) (01/26/04); Miller, Gwenn

    The purpose of Penn State University's Information Sciences and Technology (IST) Building is to host research in a variety of fields with the overall goal of preparing students for high-tech careers--or more specifically, "to build leaders in problem solving with technology," according to School of Information Sciences and Technology dean James B. Thomas. The $58.8 million IST facility, which is shared by the university's Department of Computer Science and Engineering (CSE) and the School of Information Sciences and Technology, will house research focused on homeland security, the human genome project, and other grand challenges the United States is facing. Penn State President Graham Spanier announced at the Jan. 22 IST dedication ceremony, "I truly believe we are on the cutting edge of the digital age, and have the power to make a more outstanding contribution to Pennsylvania through technology." The CSE department's Intelligent Systems Laboratory is dedicated to the development of computing systems that are controlled by voice and gesture, an example being "Dave," a multimodal interface platform that displays geographical information in response to vocal commands. The IST Building is also home to the Information Science Laboratory, whose goal is to increase the challenge and interactivity of video games. Assistant professor of information sciences and technology Magy Seif El-Nasr demonstrated a project to improve human-machine interaction by immersing viewers in a more realistic rendition of dramatic material. Meanwhile, the Applied Cognitive Science Lab focuses on building more accurate human behavioral models and improved modeling tools, as well as abstracting human behavior through reviews and by collating additional information to support model construction. Frank Ritter of IST concentrates on modeling human behavior through such tools as the ER1, a robotic wheeled laptop.
    Click Here to View Full Article

  • "Quantum Dice Debut"
    Technology Research News (01/21/04); Smalley, Eric

    Government researchers and scientists at MIT have created a system to introduce limited randomness in quantum operations. Random numbers are essential for core computing tasks such as creating chance and variation in games and simulations, encryption, and taking accurate samples of large data stores. In quantum computers, the introduction of limited randomness would counteract ambient noise that threatens sensitive qubits, the atomic spin states that are the basis for quantum computing. Generating randomness in quantum calculations is tricky because of the infinite number of qubit configurations, each of which represents a possible calculation. The researchers' innovation is to limit the possible outcomes while maintaining randomness. Thus, future quantum computers would have a method of estimating imperfections or errors in processors caused by decoherence, or interaction with light, heat, electricity, or magnetism. Random quantum operations would essentially control operations that allow quantum computers to characterize noise and limit its effect. The system is very basic and comprises a three-qubit prototype controlled by a nuclear magnetic resonance device; so far, the researchers can only estimate very rough characteristics of ambient noise, such as its overall strength, but the development signals a big leap forward in quantum computer development. The quantum randomness generator could also prove useful in quantum communications tasks such as quantum encryption.
    Click Here to View Full Article
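
    The everyday uses of randomness the article cites, such as taking accurate samples of large data stores, are easy to illustrate with ordinary pseudorandom numbers (a classical Python sketch, not the quantum generator described above; the data set is synthetic):

        # Estimate a property of a large data set from a small random sample --
        # one of the routine uses of random numbers mentioned above.
        import random

        random.seed(42)
        data = [random.gauss(100.0, 15.0) for _ in range(1000000)]   # stand-in "data store"

        sample = random.sample(data, 1000)                # uniform random sample
        estimate = sum(sample) / len(sample)
        truth = sum(data) / len(data)
        print("sample mean %.2f vs true mean %.2f" % (estimate, truth))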

  • "Disabled to Get Greater Access to Linux"
    SiliconValley.com (01/21/04); Takahashi, Dean

    The Free Standards Group says it has established a task force to develop accessibility standards for Linux. Scott McNeil, executive director of the Free Standards Group, says a standard version will make it easier for Linux developers to create software and hardware for disabled people; Linux developers have already created speech synthesizers that read text aloud. The strategy should encourage the development of keyboards and other devices that would be compatible with any Linux operating system software or applications. The Bay Area group wants to make Linux as accessible to people with disabilities as Microsoft's Windows, which has introduced add-on features for the disabled since the mid-1990s. IBM, Hewlett-Packard, Sun Microsystems, Red Hat, and a number of universities support the efforts of the Free Standards Group. Janina Sajka, the American Foundation for the Blind's director of technology research and development and one of the estimated 10 million Americans who are visually impaired, has used a special version of Linux for five years and says, "When this technology works, it changes people's lives profoundly."
    Click Here to View Full Article
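
    A minimal Python sketch of the screen-reading idea, assuming a command-line speech synthesizer such as espeak is installed (the article does not name a specific tool; espeak is an assumption):

        # Speak a short text aloud by shelling out to an external synthesizer.
        # Assumes the "espeak" command is available on the system PATH.
        import subprocess

        def speak(text):
            subprocess.run(["espeak", text], check=True)

        speak("Welcome to the Linux desktop.")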

  • "Perfecting Protection"
    The Shorthorn (01/22/04); Garcia, Josie

    A team of researchers at the University of Texas at Arlington is studying new ways to help protect the nation from terrorist attacks. The researchers are involved in a five-year project, the Pervasively Secure Infrastructure, that is being funded by a $1.6 million grant from the National Science Foundation. Data collection and data mining are the focus of their work: the researchers aim to gather and process data about the environment or a suspicious-looking person and communicate it in real time to determine whether a threat is genuine. For example, the technology could be used in a device such as a camera that can focus on eye movements and detect potentially troublesome situations. Another outgrowth of their research, which could help in a national disaster, is a pattern-finding device that would be able to determine the location of a person under rubble without digging through stone. The researchers even envision their work leading to sensors that would be able to add up merchandise taken from store shelves and charge the items to the credit card of the person who took them. "Shoplifting would essentially go away," Behrooz Shirazi, a computer science and engineering professor, says of the crime-fighting capabilities of their research. "We think this is the wave of the future." Dr. Sajal Das, the project's principal investigator, and his team are also working with researchers at Pennsylvania State University and the University of Kentucky.
    Click Here to View Full Article

  • "Workstation Clustering: Strength in Numbers"
    Workstation Planet (01/22/04); Nadel, Brian

    Advanced nuclear weapons design, the search for genetic disease markers, and the hunt for alien life-forms are just some of the projects being undertaken with the help of supercomputers composed of thousands of commercially available workstation processors lashed together in a massively parallel architecture. "The [processors'] parallel structure...lets them each work independently and then combine their results to come up with the answer," notes Steve Conway of Cray. Such machines can be constructed for 50 percent or less of what a traditional supercomputer would cost. Los Alamos National Laboratory's Lightning supercomputer, used for nuclear explosion simulation, can perform more than 11 teraflops using about 3,000 2 GHz AMD Opteron 244 workstation processors, while the University of California and Lawrence Livermore National Laboratory's Thunder supercomputer can perform over 20 teraflops on nearly 4,000 1.4 GHz Itanium 2 workstation processors. Virginia Polytechnic Institute's Big Mac can carry out about 17 teraflops using 2,200 64-bit PowerPC 970 processors, and was built at a relatively frugal cost of $5.2 million. Big Mac is used for research in molecular modeling, acoustics, aerodynamics, and nanotechnology. The SETI@Home project relies on a virtual supercomputer made up of millions of PCs whose owners donate the idle capacity of their machines to analyze radio signals from outer space for signs of intelligent life. An extension of this concept is the Grid, a distributed computing architecture connecting all major computers at universities and research facilities into a vast supercomputing resource that is available on demand.
    Click Here to View Full Article
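
    Conway's description of processors that each "work independently and then combine their results" is the basic pattern behind these clusters; a minimal single-machine Python sketch using the standard multiprocessing module serves as a toy stand-in for a real cluster scheduler:

        # Split a large job into independent pieces, compute them in parallel,
        # then combine the partial results.
        from multiprocessing import Pool

        def partial_sum(bounds):
            lo, hi = bounds
            return sum(i * i for i in range(lo, hi))

        if __name__ == "__main__":
            chunks = [(i, i + 250000) for i in range(0, 1000000, 250000)]
            with Pool(processes=4) as pool:
                partials = pool.map(partial_sum, chunks)   # workers run independently
            print(sum(partials))                           # combine the results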

  • "Coming to Grips With Grids"
    InfoWorld (01/19/04) Vol. 26, No. 3, P. 42; Scannell, Ed

    Though enterprise IT executives are intrigued by the possibilities of grid computing, wide-scale mainstream adoption may be several years away because of lingering doubts about the technology's maturity, its lack of proven financial value, and the disparate terminologies vendors employ to describe distributed computing. Mary Johnston Turner of Summit Strategies asserts that "Grid purchasing decisions will be driven by CIOs and architects looking to save money, improve their services level, and increase IT flexibility," while leading grid technology vendors are confident that grids are the optimum choice for integrating departmental services or starting to build a complete utility computing environment. Grid technology projects cannot move forward until issues of manageability and the seamless integration of security products with existing enterprise infrastructure are settled. Grids' ability to maximize efficiency by more fully tapping idle servers is very attractive to early adopters, while IBM's Dan Powers believes corporate users could initially use grid computing to more efficiently schedule jobs within the network, provision vital server-based operations, and visualize information across all enterprise servers. Some executives are finding grid architecture deployment via interweaving assorted grid technologies with Web services to be an effective and thrifty strategy. Corporate users should initially streamline existing infrastructure by consolidating as many datacenters as possible to run all IT resources, and then see whether they can refine their midrange servers into a unified base of application servers. Once this is done, users must choose a standardization platform and discard the others, thus simplifying deployment, support, and management issues. Shahin Kahn of Sun Microsystems thinks early grid implementations must be able to automatically detect available resources and supply workloads and other resources, seamlessly uphold software portability and mobility, and assign workloads to available resources on the fly.
    Click Here to View Full Article
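
    Kahn's requirement that early grids detect available resources and assign workloads to them automatically can be sketched in Python as a simple least-loaded scheduler (the node names, capacities, and jobs below are hypothetical):

        # Toy grid scheduler: send each job to the node with the most free CPU slots.
        import heapq

        nodes = {"node-a": 8, "node-b": 4, "node-c": 16}     # free CPU slots (hypothetical)
        free = [(-slots, name) for name, slots in nodes.items()]
        heapq.heapify(free)                                   # max-heap via negated counts

        def assign(job, cpus_needed):
            slots, name = heapq.heappop(free)
            remaining = -slots - cpus_needed
            if remaining < 0:
                raise RuntimeError("no node has enough free capacity for " + job)
            heapq.heappush(free, (-remaining, name))
            return name

        for job, need in [("render", 6), ("simulate", 6), ("index", 3)]:
            print(job, "->", assign(job, need))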

  • "Dawn of a New PC"
    CIO (01/15/04) Vol. 17, No. 7, P. 89; Edwards, John

    PCs are poised to experience a major cosmetic change in the next few years, but whether enterprises will appreciate it is a subject of debate. Though many CIOs would like to avail themselves of the advanced capabilities promised by forthcoming technologies, budget constraints will dictate that organizations only upgrade their hardware on an as-needed basis, according to Chris Shipley of The Demo Conferences for IDG Executive Forums. Future PCs promise speedier chips and more power via chip-oriented architectures such as Intel's Hyper-Threading, AMD's HyperTransport, PCI Express, Serial ATA, Serial-Attached SCSI, and ExpressCard. A next-generation notebook from IBM that folds up like origami is being tested, while many enterprises are exploring the possibilities of Tablet PCs, which offer the advantage of pen-based input, and low-cost wireless personal digital assistants. Next-generation desktop and mobile systems could boast organic light emitting diode (OLED) displays that are ultrathin, space-efficient, and require no backlighting, though the technology cannot yet compete with liquid crystal displays in terms of low cost and size. Wireless adoption is expected to ramp up with new standards such as 802.11x and 802.11g that promise faster communications and interoperability across different technologies. Gartner fellow Martin Reynolds maintains that "The market is more than ready for" thin-client devices whose purported benefits include lower cost, better security, and improved management. Many analysts are confident that enterprises will ultimately purchase new hardware in the face of rising security and maintenance costs on old equipment.
    Click Here to View Full Article

  • "The Smart-Dust Revolution"
    The World in 2004 (01/04) P. 121; Anderson, Alun

    The current definition of progress as squeezing more computing power into the same space will be overtaken by an emerging information revolution heralded by billions of tiny, intelligent sensors that can self-organize into scalable, fault-tolerant networks, and whose limited brain power is offset by sheer numbers. An extreme example of this sensor revolution is "smart dust"--incredibly small computers that proponents believe could be distributed within the atmosphere and enable people to instantly access any Earth-related information, such as weather. Meanwhile, practical devices touted as business efficiency solutions have started to hit the market. Sensors that monitor the operations of cooling and heating systems in buildings are already in use, while other applications include industrial machinery fault detectors and moisture and temperature readers for fertile soil. And three years ago, the U.S. Army successfully tested small sensor "motes" as a potential surveillance tool. Radio-frequency identification (RFID) tags are very attractive to supermarkets and other retail outlets because of their potential to revolutionize inventory, supply-chain management, and customer convenience. Automated stores with minimal staff could become a reality if RFID tags become inexpensive enough to outfit on individual items. Sensor installation is relatively low-cost because the networks communicate wirelessly, and power consumption is also low since sensors talk to one another via short-range hops.
    Click Here to View Full Article
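
    The short-range, hop-by-hop communication that keeps sensor power consumption low can be sketched in Python as fewest-hop routing through neighboring motes (the network below is a hypothetical toy graph, not any real deployment):

        # Toy multi-hop routing: a mote's reading reaches the base station by hopping
        # through nearby motes instead of using one long, power-hungry radio link.
        from collections import deque

        neighbors = {                      # which motes are within short radio range
            "mote-1": ["mote-2"],
            "mote-2": ["mote-1", "mote-3", "mote-4"],
            "mote-3": ["mote-2", "base"],
            "mote-4": ["mote-2"],
            "base":   ["mote-3"],
        }

        def route(src, dst):
            # Breadth-first search for the fewest-hop path.
            queue, seen = deque([[src]]), {src}
            while queue:
                path = queue.popleft()
                if path[-1] == dst:
                    return path
                for nxt in neighbors[path[-1]]:
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(path + [nxt])
            return []

        print(route("mote-1", "base"))     # ['mote-1', 'mote-2', 'mote-3', 'base']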

  • "If He's So Smart...Steve Jobs, Apple, and the Limits of Innovation"
    Fast Company (01/04) No. 78, P. 68; Hawn, Carleen

    Apple has had a long history of developing technological milestones but getting cheated out of the lion's share of the profits by competitors. Over the past 23 years, Apple has slipped from the No. 1 vendor in the PC industry to No. 9, and recorded only $6.2 billion in revenues for the fiscal year ending on Sept. 27, 2003. The company's single-minded focus on innovation may be its Achilles' heel, a theory that gains greater credence when measured against other outfits known as nurturers of trend-setting, useful, and "cool" technologies--Xerox's Palo Alto Research Center being just one example--that have seen little financial return. Boston Consulting Group's James Andrews says businesses can increase their chances of financial success by choosing one of three innovation models: The integrator model, whereby a company is responsible for the entire innovation cycle; the orchestrator model, which keeps design in-house while manufacturing, marketing, and other operations are farmed out to a strategic partner; and the licensor model, in which a company licenses its products in order to get maximum distribution for minimum investment. Apple, with its dogged dedication to innovation, appears to have subscribed to the integrator model since the beginning, which has led to a profound shortage of developers, and thus fewer products to operate on Apple machines. Innovation of business models is thought to be far more important to a company's success than tangible innovation, and Apple has had a poor track record in this regard. "If you can't wrap...innovation into a compelling value proposition, with a dynamic distribution strategy and attractive price points, then the innovation isn't worth much at all," argues Strategos Chairman Gary Hamel. Apple also comes up short in follow-through--building a solid sales force, strategically collaborating with developers, and providing compelling customer services are frowned upon because they lack the coolness or sexiness of pure innovation.
    Click Here to View Full Article

  • "Broken Machine Politics"
    Wired (01/04) Vol. 12, No. 1, P. 144; O'Donnell, Paul

    The promise of direct recording electronic (DRE) devices to simplify the voting process, lower costs, make ballots accessible to all, eliminate overvoting, and avoid the debacle of the last presidential election has been tempered by revelations that electronic voting systems suffer from glitches, vulnerabilities, and a lack of auditability that could compromise elections. Computer experts and election observers grumble that the technology that should have ensured secure, accurate democracy is now supporting an even more flawed electoral infrastructure. The state of Maryland held up the $55.6 million deployment of thousands of Diebold e-voting machines when a report by Johns Hopkins researcher Aviel Rubin demonstrated that it was possible to isolate the encryption keys that protect data on the devices. Other problems the report outlined included the use of the obsolete Data Encryption Standard and the presence of unencrypted passwords on the smartcards voters use to log in. Maryland officials commissioned Science Applications International to conduct an outside review of the Diebold machines, which uncovered many of the vulnerabilities mentioned in the report; the deployment of the DREs went forward once Diebold patched the most grievous security faults. Rubin explains that computer security deficiencies are more attributable to people than technology: Assigning particular people to handle particular tasks--usually through password-protected access--is the core of computer security, but the secret ballot system does not allow specific voters to be linked to their selections. However, Harvard's Rebecca Mercuri contends that Rubin's report may have shifted focus away from poor-quality software by concentrating on the dangers of hacking. At a February 2003 meeting of the Santa Clara County Board of Supervisors, Stanford's David Dill proposed that a paper trail be incorporated into e-voting systems to ensure the accuracy of elections; his suggestion influenced California secretary of state Kevin Shelley, who announced that a printed audit trail must be a DRE requirement by 2006.
    Click Here to View Full Article


