
ACM TechNews sponsored by Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 635: Friday, April 23, 2004

  • "China Will Keep Pursuing Digital Standards"
    Wall Street Journal (04/23/04) P. B1; Chen, Kathy

    Despite its decision to shelve a wireless-encryption standard in response to international pressure, China is going ahead with other initiatives to set standards for global technologies such as radio-frequency identification (RFID) and digital compression. Not only would this approach better position China to negotiate royalties or technology transfers, but it would also give Chinese industries an edge over competitors by claiming early ownership of nascent technologies. U.S. companies have been rattled by China's standardization efforts, which, if successful, would force them to customize products for the Chinese market. The Chinese government has established a working group that is holding discussions with 24 multinationals to shape a Chinese RFID standard. Meanwhile, China's video-compression technology, which was defined with assistance from Microsoft, Cisco, and other multinationals, is being promoted as superior to MPEG. Its licensing terms are also purported to be better: Licensees of the Chinese standard only have to deal with a single licensing body, whereas MPEG-4 licensees must hold separate fee negotiations with each individual patent holder. Anne Stevenson-Yang of the U.S. Information Technology Office reports that Chinese firms are naturally inclined to be leading standards-setters, but cautions that "you want to do it through inducement, not dictate, or you'll cut off your companies from export markets, international customers and collaboration in technology." China's wireless-encryption standardization effort may have been nixed, but policy makers note that the country is working on several compulsory security standards for routers and Internet-related hardware and software.

  • "Privacy Advocates Swap Horror Stories"
    PC World (04/22/04); Brandt, Andrew

    The hazards of e-voting, FBI wiretaps, and the accuracy of Google searches were just some of the topics discussed at the ACM's 14th Annual Computers, Freedom, and Privacy conference in Berkeley. Keynote speaker and Stanford University computer science professor David Dill warned that electronic voting systems are insecure, and their lack of an audit trail raises the possibility of a "nightmare scenario" in which accidental or intentional errors in electronic voting booths could jeopardize the democratic process and fix elections without anyone realizing it. Meanwhile, former FBI agent Mike Warren reported that the bureau carried out 2,426 Title III wiretaps in 2001 using Carnivore technology. He added that the FBI is lobbying the FCC to require ISPs to install devices that allow the agency to tap Internet phone calls via Voice over IP technology. Harvard researchers and University of Leipzig professor Marcel Machill disclosed the results of a study on the use of Google and how people's reliance on the search engine impacts the Web sites they find. They concluded that overreliance on one search engine hurts the accuracy of Web searches: Machill's study of the first 20 results of the term "back pain" at major search engines revealed that the most accurate search on the German version of AOL yielded only about eight sites relating to back pain, while the remainder consisted of spammed sites that often had no connection to the search term. The researchers also noted that search restrictions vary for different versions of Google--hate speech sites banned by the German version, for instance, are permitted in the U.S. version. Privacy International's Big Brother Awards ceremony "honored" the U.S. Transportation Security Administration, Northwest Airlines, and Seisint for poor administration of the no-fly list, supplying passenger records to the government in defiance of privacy policies, and granting the federal government access to the MATRIX search tool, respectively.
    Click Here to View Full Article

  • "Scientists Peg Data's Speed Limit"
    Associated Press (04/21/04); Bergstein, Brian

    Scientists report in the April 21 edition of Nature that the maximum speed at which data can be written to disk and then retrieved is around 1,000 times faster than that of today's cutting-edge storage devices. This conclusion was reached by using Stanford University's particle accelerator to blast electrons at a portion of the magnetic data recording medium at nearly the speed of light. The duration of the energy pulses was just 2.3 picoseconds, and the researchers observed randomized magnetic patterns. Danilo Pescia of the Swiss Federal Institute of Technology noted that the experiments demonstrated that magnetic recording speed can be sped up 1,000 times before it reaches its limit. "In order to go beyond this limit, some completely new technology will be required, of which we do not know anything yet," he wrote. Stanford researchers directed the project, whose participants included Seagate Technology engineers and a scientist at Moscow's Landau Institute for Theoretical Physics. Seagate CTO Mark Kryder doubted that the results of the experiment will have a dramatic impact on the data storage industry. "Certainly we are not going to start packaging linear accelerators into hard disk drives, so the kinds of speeds achieved in these experiments would never be observed in an actual recording device," he quipped.
    Click Here to View Full Article
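
    As a rough back-of-the-envelope check on that 1,000-fold figure: the 2.3-picosecond pulse duration comes from the article, while the nanosecond-scale per-bit write time assumed below for 2004-era drive media is an illustrative assumption, not a number from the study.

      # Rough sanity check of the "1,000 times faster" claim (illustrative only).
      pulse_duration_s = 2.3e-12      # energy pulse length reported in the Nature study
      assumed_write_time_s = 2.0e-9   # assumed per-bit switching time for current drive media
      speedup = assumed_write_time_s / pulse_duration_s
      print(f"approximate speedup: {speedup:.0f}x")   # ~870x, i.e. on the order of 1,000x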

  • "Who Should Keep Out The Hackers?"
    Washington Post (04/22/04) P. B1; Krim, Jonathan

    A recent report to the Department of Homeland Security that places the responsibility for healing the Internet's security holes squarely on the shoulders of tech providers has gained credence with an April 21 DHS advisory urging users and network operators to act quickly to prevent a pair of vulnerabilities in Cisco routers from disrupting the Internet. The report, issued by the National Cyber Security Partnership on Monday, contends that "vendors are placing the entire burden of securing products on their users" by not making software secure by default. The study contrasts sharply with the tech industry's view, which is that the chief solution to cyber-security threats is a combination of user education and law enforcement. The report's authors also take to task the credo that effective security fixes will be produced faster by market forces as long as the government does not get involved. The study calls for better quality control, new security standards, and additional collaboration with customers. SANS Institute director Alan Paller, an outspoken critic of the software industry, describes the report's suggestions as "the essential first steps in improving cyber-security in America." Some tech providers have responded to the report with claims that they are pouring money into security products, and insisting that more responsible users are a key element of cyber-security strategy. DHS officials do not see their agency as a taskmaster that whips the software industry into shape, and there is no single agency in charge of getting industry to adopt cyber-security recommendations. The task force that furnished the report to DHS included representatives from Cisco Systems, IBM, Oracle, banks, academia, and the military.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Push for Voting Changes May Not Cure All Ills"
    Wall Street Journal (04/22/04) P. A4; Weinstein, Elizabeth

    Despite the best intentions, the modernization of the U.S. electoral process required by the Help America Vote Act (HAVA) of 2002 is behind schedule, which means that many voters will still rely on traditional, non-computerized voting systems for the November presidential election. Jurisdictions in at least 34 states are expected to deploy touch-screen electronic voting machines by November, but concerns about their vulnerability to hacking and tampering have become an especially contentious issue--enough to prompt lawmakers such as Rep. Rush Holt (D-N.J.) to introduce legislation calling for the installation of paper trails in order to ensure accurate recounts. "It's not the potential for problems but the potential for scrutiny that's going to be the big difference," notes Electionline.org director Doug Chapin. HAVA requires first-time voters who did not provide ID when registering by mail to confirm their identity to poll workers, and has set a January 2006 deadline for each state to build its own uniform voter-registration database, an expensive requirement that so far only nine states have complied with. The distribution of federal funding to help states upgrade or replace their voting systems in compliance with HAVA has been sluggish due to bureaucratic delays. Election Data Services President Kimball Brace doubts that promises by the Election Assistance Commission to release more money as early as mid May will make a difference. "What that means for counties and states is that they're still behind the eight ball," he explains. Sens. Hillary Clinton (D-N.Y.) and Bob Graham (D-Fla.) have also introduced legislation calling for voting systems to generate paper trails. Clinton says, "If we have huge problems again, people will fundamentally lose confidence in our democracy and in their vote." Nevada has already installed printers for all of its touch-screen machines, and California is expected to install them by 2006.

    To read more about ACM's activities and concerns involving e-voting, visit http://www.acm.org/usacm.

  • "Linux Creator Calls Backporting 'Good Thing'"
    InternetNews.com (04/20/04); Kerner, Sean Michael

    The practice of backporting newer Linux features into older versions sparked controversy at the Real World Linux Conference in Toronto, when SUSE CTO Juergen Geck said competitor Red Hat's backporting threatened to fragment Linux. In an email exchange with internetnews.com, Linux creator Linus Torvalds said backporting had benefits and detriments, but that as long as the practice was done to best meet customer needs, it was a good thing. Other Linux luminaries took various stands on the issue: Ximian co-founder and Novell vice president Miguel de Icaza said backporting was commonplace and even necessary in order to provide key features, such as the Native Posix Threading Library (NPTL). Former Debian Linux project leader Bruce Perens, however, took issue with Red Hat's particular backporting practices, noting that one of his large customers refused the Red Hat kernel despite running a Red Hat distribution. By backporting features from Linux version 2.6 into version 2.4, Red Hat's kernel became so far diverged from the mainline kernel that customers were dependent solely on that company for support, Perens said. He tempered his criticism by saying that as long as the Red Hat kernel subscribed to the General Public License, the threat of forked distributions and vendor lock-in was minimal. "Since all of the parties involved can copy any software from any of the forks into their own one, because of the GPL license, forks tend to merge," Perens said. Torvalds said backporting did cause maintenance hassles, but that as long as customers eventually migrated to the newer kernel, the difficulties would be short-lived; he also pointed out that backporting helped subject new features to a much wider testing audience.
    Click Here to View Full Article

  • "Nonlinear Nets Approach Runway to Wireless Apps"
    Electronic Engineering Times (04/19/04) P. 55; Brown, Chappell

    New research in neural networks has yielded a network architecture for telecommunications devices and a mathematical model equivalent to the universal Turing machine model used for computers; the developments are one more step in creating man-made systems that function as efficiently and powerfully as biological information-processing systems. International University Bremen researcher Herbert Jaeger says his Echo State Networks (ESNs) design makes it possible to merge nonlinear networks into devices without having to understand exactly how they function. His technology is based on autonomous robot control research previously conducted at the Fraunhofer Institute for Autonomous Intelligent Systems. Because of its enclosed architecture, the ESN can be designed into a system using a black box approach, meaning that designers do not need a comprehensive mathematical model of the neural network. Jaeger says ESNs are being implemented digitally through field-programmable gate arrays, but not through the more efficient analog circuitry, because digital technology is able to suppress noise that would disrupt the ESN. He is working together with the Fraunhofer Society to offer consulting services to telecommunications firms, and expects that ESN technology will be used to build next-generation cellular networks that are self-organizing. Separately, Wolfgang Maass at the Technical University of Graz has created a more sophisticated nonlinear model, called liquid-state networks because input signals reverberate through them with diminishing amplitude. Maass came upon the discovery after observing feedback circuits in the brain. By integrating voltage spike trains, Maass expects to mimic the brain's signal processing capability; he also found that networks composed of relatively few neurons could be easily adjusted to implement many different finite-state machines.
    Click Here to View Full Article
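
    The "black box" quality of an ESN is easiest to see in code: the recurrent reservoir is generated randomly and never trained, and only a linear readout is fitted to data. The sketch below is a minimal, generic echo state network written in Python with NumPy; it is not Jaeger's implementation, and the reservoir size, scaling constant, and toy prediction task are all illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(0)

      # Fixed random reservoir: designers never model these weights explicitly,
      # which is the "black box" property described above (sizes are illustrative).
      n_in, n_res = 1, 200
      W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
      W_res = rng.uniform(-0.5, 0.5, (n_res, n_res))
      W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))  # spectral radius < 1 gives the "echo" property

      def run_reservoir(inputs):
          """Drive the reservoir with an input sequence and collect its states."""
          x = np.zeros(n_res)
          states = []
          for u in inputs:
              x = np.tanh(W_in @ np.atleast_1d(u) + W_res @ x)
              states.append(x.copy())
          return np.array(states)

      # Toy task: predict the next sample of a noisy sine wave.
      t = np.linspace(0, 20 * np.pi, 2000)
      signal = np.sin(t) + 0.05 * rng.standard_normal(t.size)
      states = run_reservoir(signal[:-1])

      # Only this linear readout is trained (here by ridge regression); the reservoir stays fixed.
      ridge = 1e-6
      W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                              states.T @ signal[1:])
      prediction = states @ W_out
      print("readout training error:", float(np.mean((prediction - signal[1:]) ** 2)))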

  • "E-Translators: The More You Say, the Better"
    Christian Science Monitor (04/22/04) P. 15; Lamb, Gregory M.

    Scientists are improving electronic translation devices by using entire phrases instead of individual words. Phrases are far less ambiguous than individual words, says Carnegie Mellon University systems scientist Robert Frederking, who works in the school's Language Technologies Institute. NEC has developed a working prototype that allows two-way phrase translation and is testing the handheld device at Tokyo's Narita International Airport; a commercial product is expected out by the end of the year. The U.S. military deployed its first electronic translation device, the Phraselator, in Afghanistan two years ago, and is currently using the one-way translation tool in Iraq; the Phraselator understands common English phrases without the computer learning a particular user's voice, then translates those phrases into a foreign language. Different languages can be loaded onto the handheld device by switching a secure digital card. Defense Advanced Research Projects Agency director Tony Tether testified before Congress last month that the device was being used at Iraqi checkpoints, during searches, and with prisoners of war; the one-way translation simplifies the device and makes it more reliable. A two-way device was tested by the military in Croatia in 2001, but worked well only half the time, according to Frederking, who participated in the test. VoxTech, the company that produces the military's one-way Phraselator, says that after the Sept. 11 attacks in 2001, the Defense Department asked the firm to speed development. Scientists foresee civilian versions in the near future, including simpler devices for tourist use. In a recent Technology Review article, researcher Yuqing Gao predicted that two-way translation will be built into regular PDAs and mobile phones.
    Click Here to View Full Article
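
    The one-way, phrase-based approach lends itself to a very simple illustration: rather than translating word by word, the device matches recognized speech against a fixed catalog of phrases and plays back a stored translation. The Python sketch below uses a small hypothetical phrase table (with Spanish targets purely for readability); it is only a toy lookup, not the Phraselator's actual software.

      import difflib

      # Hypothetical phrase catalog: canned English phrases mapped to stored translations.
      # A real device would play back recorded audio in the selected language.
      PHRASES = {
          "do you need a doctor": "¿Necesita un médico?",
          "please show your identification": "Por favor, muestre su identificación",
          "we are here to help": "Estamos aquí para ayudar",
      }

      def translate(recognized_text, cutoff=0.6):
          """Match recognized speech to the closest canned phrase and return its stored translation."""
          match = difflib.get_close_matches(recognized_text.lower(), list(PHRASES), n=1, cutoff=cutoff)
          return PHRASES[match[0]] if match else None

      print(translate("Please show your identification."))   # close enough to a canned phrase
      print(translate("what is the capital of France"))      # no match, so the device stays silent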

  • "Applying Grid Middleware to Industry"
    IST Results (04/21/04)

    Information Society Technologies' DAMIEN project has produced a middleware toolset that can be applied to computational- and communication-intensive tasks, such as simulation testing on aircraft designs. "Most Grid projects tend to be aimed at new applications, such as searching across databases or Web crawling, etc., in which computer performance is less relevant than data handling," notes Professor Michael Resch of Universitaet Stuttgart's High Performance Computing Center (HLRS). "DAMIEN had a different emphasis, and was about using distributed resources for classical simulations that require large computational effort." DAMIEN involves extending the capabilities of popular simulation tools from the non-distributed environment to the distributed Grid environment, a task that was accomplished by building middleware through the adaptation of established tools such as the Message Passing Interface. Resch says the first step in the extension was the integration of an extra communication level that mirrors distributed environments; the second step was the modification of the tools to accommodate Quality of Service handling; and the third and final step was improving the usability of the tools and distributed environments. The HLRS professor reports that three DAMIEN tools have been extended and released commercially, while the fourth tool was released as an open-source application for the benefit of the research community. The DAMIEN toolset has been used to help simulate acoustical properties for aircraft, and was employed in the RNAfold bioinformatics application at the International Supercomputing Conference 2003.
    Click Here to View Full Article
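
    DAMIEN's starting point, the Message Passing Interface (MPI), is the standard mechanism such simulations use to exchange data between processes. The fragment below is a generic two-process exchange written against the mpi4py binding, shown purely to illustrate the programming model the project extended; it has no connection to DAMIEN's own code, and mpi4py is simply one convenient way to call MPI from Python.

      # Generic MPI message exchange; run with something like `mpiexec -n 2 python demo.py`.
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()

      if rank == 0:
          # In a coupled simulation, this might be one solver sending boundary data to another.
          comm.send({"step": 1, "pressure": 101.3}, dest=1, tag=0)
          print("rank 0 sent boundary data")
      elif rank == 1:
          data = comm.recv(source=0, tag=0)
          print("rank 1 received", data)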

  • "Virtual Reality the World Over"
    Wired News (04/21/04); Kahney, Leander

    Some 180 virtual reality panoramas were taken in 40 different countries during the vernal equinox on March 20 as part of the World Wide Panorama project. The panoramas--360-degree images stitched together from still photographs on a computer--are designed to give users the experience of standing in the center of a scene. VR panoramas are still generally relegated to niche applications, despite supporters' wishes that the format become more popular. Increasing its popularity is the rationale for the World Wide Panorama initiative, according to project organizer and cartographer Landis Bennett. In partnership with Don Bain of the University of California at Berkeley's Geography Computing Facility, Bennett invited submissions from VR hobbyists and professionals around the world, a community that is about 1,500 strong, the cartographer reckons. "This is by far and away the biggest coordinated event among independent VR producers," boasts Bennett. The panoramas were furnished with Apple's QuickTime VR, the most popular existing VR technology. International QuickTime VR Association President Michael Quan believes the lack of broadband is the primary reason why VR panoramas have been so slow to take off. He thinks that the practice of VR photography will receive a much-needed jolt with the emergence of widescreen HDTVs and other technology.
    Click Here to View Full Article

  • "Software Makers Ready Desktop Lockdown"
    CNet (04/20/04); Becker, David

    Leaked corporate documents have made headlines in recent weeks, including an incriminating email from Microsoft, a memo at the SCO Group, and RealNetworks' failed plans to partner with Apple. That media attention, plus the entry of desktop giants Microsoft and Adobe Systems, has given new credence to the emerging document rights management market. AMR Research VP Scott Lundstrom says the market is still beset by technological problems: Currently, most solutions offer just enough protection to dissuade normal users from copying or changing information, but would not stop any serious attempt to steal or manipulate it. Vendors say enterprise customers are receptive to the basic concept of document rights management, but the market and technology's immaturity is holding back greater adoption. Business customers are worried about vendor lock-in, especially restrictions that would prevent use of third-party applications, and are holding out for greater standardization, according to Gartner analyst Ray Wagner; he says document rights management could be one of the biggest wins for an open-source approach in the enterprise space, especially given that the most advanced academic encryption and security work is dedicated to peer review and open-source development. Even vendors admit that interoperability will be key, especially since users are not willing to change the way they currently work. There are still tremendous technical differences among document rights management products, such as their dependence on central servers and flexibility in changing protections. Any successful solution cannot be too restrictive lest users balk, while it cannot be too open and thus become useless, says Sealed Media CEO George Everhardt; he and other specialty software executives expect the entry of large players such as Microsoft and Adobe to bolster their own credibility.
    Click Here to View Full Article

  • "More Brain Power to Your Engine"
    The Engineer (04/16/04); Fisher, Richard

    Researchers at the University of Missouri-Rolla (UMR) are working on an engine that can boost its fuel efficiency and halve harmful gas emissions through the use of neural network software modeled after learning mechanisms of the human brain. It is hoped that such a breakthrough would enable engines across the board to adjust their performance in real time by calculating their dynamics "on the fly," according to UMR electrical engineering professor Dr. Jagannathan Sarangapani. A neural network could prevent engines from losing stability with excessive exhaust gas recirculation by making tiny adjustments to the fuel-air ratio between engine cycles. UMR's Dr. Jim Drallmeier notes that engines would be more efficient and emit less carbon dioxide when precise fuel demand can be calculated--but such a task will not be easy. "In a fraction of a millisecond a control system would need to measure an engine parameter [for example, heat release], determine what this is, predict where it's going on the next cycle and push the engine in the right direction," explains Drallmeier. "This is not something I would anticipate on a 2008 vehicle." So far, the neural network control system has only been tested in simulations. If successful, the technology could be applied to engines of all types.
    Click Here to View Full Article
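
    To make the cycle-by-cycle idea concrete, the following Python sketch measures each simulated cycle's heat release, updates a deliberately tiny online-learned model (standing in here for the neural network), and nudges the fuel-air ratio before the next cycle. Every number and the simulated engine response are invented for illustration; this is not the UMR team's controller.

      import random

      TARGET_HEAT = 1.0
      w, b = 0.5, 0.0              # tiny stand-in model: heat_release ~ w * ratio + b
      ratio = 0.9                  # current fuel-air ratio (arbitrary units)
      lr_model, lr_ctrl = 0.05, 0.1

      def engine_cycle(r):
          """Stand-in for the real engine: heat release depends on the ratio, plus noise."""
          return 1.2 * r - 0.1 + 0.02 * random.gauss(0, 1)

      for cycle in range(200):
          heat = engine_cycle(ratio)         # measure this cycle's heat release
          pred = w * ratio + b
          err = pred - heat                  # update the model from the measurement
          w -= lr_model * err * ratio
          b -= lr_model * err
          if abs(w) > 1e-6:                  # nudge the ratio toward the target for the next cycle
              ratio += lr_ctrl * (TARGET_HEAT - heat) / w

      print(f"final ratio ~ {ratio:.3f}, last heat release ~ {heat:.3f}")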

  • "Wooden Computers Offer 'Greener' Desktop"
    Nature (04/19/04); Pearson, Helen

    Swedx believes wooden computers might ease some of the concerns people have about the impact of electronic waste on the environment. The Swedish company manufactures computer monitors, keyboards, and mice encased in wood that decomposes faster than the plastic skeletons of common personal computers. Standard plastic computer casings tend to include brominated flame retardants to protect against fires, but the chemicals are believed to cause cancer if they accumulate in human tissue. Swedx has sold several thousand ecofriendly PCs, which cost about 30 percent more than plastic PCs, since it introduced them last year, and other companies are considering manufacturing wood-encased PCs as well. The wood is logged from managed Chinese forests. However, wooden PCs will not end all of computing's environmental problems: cathode ray tube monitors contain lead, microchips contain heavy metals such as cadmium, and both are often dumped into landfills, where their toxic materials can leach into the environment.
    Click Here to View Full Article

  • "Why Does a Technical Manager Function as a Regulator?"
    ICANNfocus (04/19/04)

    Around the middle of last month, the Center for Regulatory Effectiveness (CRE) sent a letter to the National Telecommunications and Information Administration (NTIA) regarding the issue of Internet governance. The content of the letter addressed the transparency of ICANN's processes. The CRE also submitted a letter to ICANN itself, but unlike the NTIA, ICANN did not provide a substantive or prompt response. The NTIA's response to the CRE's letter sheds light on several important issues, including the appropriateness of ICANN serving as both a private technical administrative body and as a public regulator. These dual roles are exemplified by the fact that ICANN is responsible for the day-to-day technical management of the DNS but has also expanded its scope to include regulation of the DNS, including the pricing of registry services. ICANN's Articles of Incorporation define ICANN's organization and operation as being "exclusively for charitable, educational, and scientific purposes," yet the NTIA states that it cannot comment on ICANN reports, such as the Strategic Plan, that contain "proprietary and business-confidential information." This type of information is normally considered to be competitor-sensitive, but ICANN does not have any competitors, meaning that ICANN should not have confidential business information. Finally, the NTIA's response states that it met in private with ICANN and that certain ICANN documents were kept hidden from the public, undermining any claims that ICANN adheres to transparency and accountability in its processes.
    Click Here to View Full Article

  • "IT's Uneasy About Being Green"
    eWeek (04/19/04) Vol. 21, No. 16, P. 53; Coffee, Peter

    The information technology industry is facing some serious challenges to the way in which companies build computers, as concerns grow about the toxic materials in units. In the 2004 annual report from the Washington-based Worldwatch Institute, personal computers are described as "a toxics trap," in that they contain lead; cadmium; beryllium; plastics such as polyvinyl chloride, which sometimes includes brominated flame retardants; and hexavalent chromium, the pollutant attacked by activist Erin Brockovich. Groups such as the Silicon Valley Toxics Coalition and the National Safety Council say more than two-thirds of heavy metals in U.S. landfills come from dumped computers and other electronic waste. Some computer manufacturers are exporting electronic waste to developing countries, but there are concerns that computer recycling workers could become overly exposed to lead, which could cause kidney damage, nervous system damage, sterility, and birth defects. Semiconductor manufacturers Intel and NEC Electronics have plans to reduce lead in products ahead of the European Union's Restriction of Hazardous Substances Directive, which takes effect July 1, 2006. Meanwhile, lead-free solders could cause short circuits in electronic components, which might force computer makers to turn to other materials such as nickel. Lawmakers in Europe want the IT companies to foot the bill for the disposal of old computers. However, in the United States, a recycling fee would be added to the sale price of a PC, under a proposal by Rep. Mike Thompson (D-Napa Valley, Calif.).
    Click Here to View Full Article

  • "The Internest"
    Economist (04/15/04) Vol. 371, No. 8371, P. 78

    The Georgia Institute of Technology's Dr. Craig Tovey and Oxford University's Dr. Sunil Nakrani are taking a cue from honeybee colonies to optimize Internet server performance. Their approach is based on the observation that bee colonies maximize the rate of nectar collection by determining how long, and in what numbers, bees forage flower patches, in much the same way that Internet host providers maximize their profits by switching their computers between different Web applications so they can adjust to variable levels of demand. Similarly, a comparison can also be drawn between the downtime penalty a computer incurs when switching between applications and the trade-off a bee incurs for switching between flower patches. About one-fifth of a hive's population consists of nectar gatherers, while the rest are food storers; the forager bees decide how worthwhile patches are by seeing how long it takes to locate an unoccupied food storer, and communicate the value of the patches to other foragers through a "waggle dance." Tovey and Nakrani have extended this concept to Internet hosts by likening individual servers to nectar gatherers and customer requests to flower patches, and have created a "honeybee algorithm" that enables a server to generate an "advert"--a sort of electronic waggle dance--to notify other servers in a "hive" of how important and profitable its customers are. The honeybee algorithm was tested against the greedy algorithm that underlies server allocation at most Internet host providers. The former outperformed the latter by up to 20 percent when Internet traffic was in heavy flux, while the opposite was true when traffic was more uniform.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)
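
    A toy rendering of the scheme in Python: each application posts an "advert" whose strength reflects its current demand per assigned server (a crude stand-in for the waggle dance), a few servers per step re-read the adverts and probabilistically switch, and the result is compared with a greedy allocator that locks every server to whatever application looked best at the start. The workloads, advert formula, and all constants below are invented for illustration; this is not Tovey and Nakrani's algorithm or experiment.

      import random

      random.seed(1)
      APPS = ["shop", "news", "mail"]
      N_SERVERS = 30

      def demand(t):
          """Invented, fluctuating per-application demand (requests per time step)."""
          return {"shop": 40 + 30 * (t // 20 % 2),
                  "news": 25,
                  "mail": 15 + (10 if t % 7 == 0 else 0)}

      def serve(alloc, dem):
          """Requests handled this step: each server processes up to 2 requests of its application."""
          return sum(min(dem[a], 2 * sum(1 for s in alloc if s == a)) for a in APPS)

      def honeybee_step(alloc, dem):
          # Advert strength = demand per server currently assigned to the application
          # (the electronic waggle dance); a few servers re-read the adverts and may switch.
          adverts = {a: dem[a] / max(1, sum(1 for s in alloc if s == a)) for a in APPS}
          total = sum(adverts.values())
          for i in random.sample(range(len(alloc)), k=3):
              r, acc = random.uniform(0, total), 0.0
              for a in APPS:
                  acc += adverts[a]
                  if r <= acc:
                      alloc[i] = a
                      break

      greedy = [max(APPS, key=lambda a: demand(0)[a])] * N_SERVERS   # fixed initial choice
      bees = [random.choice(APPS) for _ in range(N_SERVERS)]

      handled_greedy = handled_bees = 0
      for t in range(200):
          dem = demand(t)
          handled_greedy += serve(greedy, dem)
          handled_bees += serve(bees, dem)
          honeybee_step(bees, dem)

      print("greedy handled:", handled_greedy, " honeybee handled:", handled_bees)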

  • "Security Holes Force Firms to Rethink Coding Processes"
    Network World (04/19/04) Vol. 21, No. 16, P. 1; Messmer, Ellen

    Microsoft recently released 14 patches to fix critical security holes in Windows XP, Windows Server 2003, and earlier iterations, but security experts say throwing patches at the problem is not the solution; a much more effective strategy is building tools and processes for creating more robust code and finding and eliminating bugs during development. Sanctum CTO Steve Orrin notes that many organizations' approach to fixing flaws involves a code assessment partnership between the software programmers and the security manager's personnel. However, he reports that time-to-market sometimes forces evaluators to skimp on the code review, while corporate policy suggestions in written secure-coding techniques "usually sit on a shelf gathering dust." Microsoft currently employs about 12 security experts to support roughly 20,000 software engineers, and program manager Michael Howard says the company is considering implementing more training via the Internet. Meanwhile, Microsoft evaluates its products using in-house code-review tools and outside contractors such as eEye Digital Security. eEye employs staff who are expert in finding security flaws and uses proprietary tools that are themselves assessed by checkers outside the company; eEye COO Firas Raouf notes that his firm sometimes encourages researchers to pinpoint and patch security holes by holding contests for vacations or other prizes. There is a migration toward automated security code reviews, with companies such as Sanctum, Spi Dynamics, Parasoft, and HB Gary offering products in this vein. Foundstone President and CTO Stuart McClure will release a report in May indicating that Linux software is about 10 percent more buggy than Microsoft software.
    Click Here to View Full Article
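
    The automated reviews mentioned above can start as simply as flagging calls to C functions with a long history of buffer-overflow trouble. The Python sketch below is a deliberately naive scanner offered only to illustrate the idea; the commercial tools named in the article do far more, such as data-flow analysis and configurable rule sets.

      import re
      import sys
      from pathlib import Path

      # Deliberately naive static check: flag C calls that are frequent sources of overflow bugs.
      RISKY_CALLS = ("strcpy", "strcat", "sprintf", "gets", "scanf")
      PATTERN = re.compile(r"\b(" + "|".join(RISKY_CALLS) + r")\s*\(")

      def scan(path):
          findings = []
          for lineno, line in enumerate(Path(path).read_text(errors="replace").splitlines(), start=1):
              for match in PATTERN.finditer(line):
                  findings.append((lineno, match.group(1), line.strip()))
          return findings

      if __name__ == "__main__":
          for c_file in sys.argv[1:]:
              for lineno, call, context in scan(c_file):
                  print(f"{c_file}:{lineno}: possibly unsafe call to {call}(): {context}")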

  • "IT Job Market Causes Concern"
    InformationWeek (04/19/04) No. 985, P. 76; Malykhina, Elena

    With outsourcing dampening their hopes of finding stable, well-paying IT work after graduation, Queens College computer-science students are considering alternate careers, while some are planning to pursue jobs outside the United States. Queens senior Theodore Karoutsos laments, "If all jobs are sent overseas, what will be left for us? It's a general feeling of hopelessness." U.S. IT degrees are highly valued overseas, as is fluency in a foreign language: Karoutsos, for instance, is hoping his command of Greek, in addition to a graduate degree, will give him an edge in the Greek job market. Queens College computer-science professor Dr. Kenneth Lord notes that some students have elected to concentrate on managerial and high-end professions that are less likely to be outsourced. Computer engineering, game programming, analysis, IT management, and Linux administration are just a few of the alternate fields students are considering. "Computer-science students won't take over hardware design from the engineers, but they're about to play a larger role than they previously did," predicts Christopher Vickery, another Queens computer-science professor. Some students are combining their computer-science majors with other majors--political science, for example--to qualify for more stable and long-lasting jobs and to have a fall-back position in case the IT career does not pan out. Lord urges students to participate in internships in order to gain valuable practical experience in the business world. Lord says, "The market is turning around...There will always be computers, the industry will continue to grow, and it will need students with experience in technology."
    Click Here to View Full Article

  • "Wary of E-Voting, Some Professors Sound the Alarm"
    Chronicle of Higher Education (04/23/04) Vol. 50, No. 33, P. A18; Schmidt, Peter

    Respected academics are criticizing the security of direct recording electronic voting machines and online voting, alleging that such systems could be tampered with by just about anyone and be used to commit electoral fraud without the public's awareness. Their warnings have prompted lawmakers to take actions that e-voting advocates believe hinder the modernization of the electoral process; moreover, proponents maintain that critics' conclusions are unscientific and clouded by emotion. Harvard University fellow Rebecca Mercuri was among the first to deeply study e-voting security, and proposed that a gaping hole could be filled in by equipping machines with printers to provide a voter-verifiable paper trail. Her proposal was adopted by Rep. Rush Holt (D-N.J.) into a bill making such printers a required component of all e-voting machines in the United States--a bill that has since stalled. Mercuri's is not the only proposed e-voting security solution: MIT's Edwin J. Selker thinks machines could have audio devices that let voters confirm their votes through headphones and record those choices on tape, while private industry researchers advocate cryptography. Opponents such as Michael I. Shamos, a Carnegie Mellon University professor involved in the Caltech/MIT Voting Technology Project, cite research indicating that paper-based voting was chiefly responsible for the debacle of the last presidential election, a fiasco that prompted the federal government to push for electronic voting. A July 2003 report on the security of Diebold Election Systems machines by Johns Hopkins' Aviel Rubin--based on e-voting software code discovered online by Bev Harris--found fundamental flaws, including the reliance on smart cards and the system's vulnerability to tampering. Rubin and colleagues also convinced the federal government to terminate a project that aimed to prove the viability of Internet voting for Americans stationed abroad; Rubin's group raised concerns that hackers and terrorists could exploit such a system to rig or disrupt elections.

    To read more about ACM's activities and concerns involving e-voting, visit http://www.acm.org/usacm.


                                                                             
    [ Archives ]  [ Home ]

 