Volume 7, Issue 749: Wednesday, February 2, 2005
- "Open-Source Leaders Accept New Challenges"
eWeek (02/01/05); Galli, Peter
Leading figures of the open-source software movement described their jobs and discussed the challenges ahead at the Open Source Development Labs Enterprise Summit on Feb. 1. Linux kernel creator Linus Torvalds attributed Linux's success to the concurrent execution of various projects, noting that "There are competing and often symbiotic relationships between the various projects, which is a big positive." Linux 2.6 kernel maintainer Andrew Morton credited the kernel's decoupled nature for its success, and said that well-defined standards and interfaces are needed. He also reported that 2.6 maintenance and updates are proceeding at a rapid pace. Mitch Kapor of the Open Source Applications Foundation said the most successful software projects are those that manage the flow of people into the project, while enlisting people with the right blend of interpersonal skills is also critical. He noted that his current interests include developing the Chandler open-source personal information manager and the CalDAV calendaring standard, which will be deployed in both open-source and proprietary versions this year. CollabNet CTO and Apache Foundation co-founder Brian Behlendorf said he is largely committed to exploring companies' software problems and to writing software, noting that most software projects are doomed to failure. Kapor said he foresees the dynamics behind open-source software, such as decentralized cooperation on modularized technology, spreading over the next five years, while Torvalds shrugged off any visionary notions, arguing that important issues could be overlooked if people are too focused on "the beautiful utopia ahead."
Click Here to View Full Article
- "The Future of Imaging: What Comes Next?"
Advanced Imaging Pro (02/01/05); Mazor, Barry
Advanced imaging researchers note that their field is changing quickly due to the technological benefits of the consumer digital boom, military and homeland security research, and enhanced image processing. Rochester Institute of Technology imaging scientist Roger Easton says that within the next three to five years, improved spatial and spectral resolution over the visible spectrum, as well as advances in x-ray fluorescence and confocal microscopy, will benefit his digital imaging efforts to recover handwritten text from 10th-century manuscripts. Other near-term advances in imaging include image databasing and more effective processing techniques for hyperspectral data, says Montclair State University Center for Imaging and Optics director Stefan Robila. In-house applications have also helped make sensor and lens technology less expensive. Sensor technology is currently one of the main drivers of imaging advances, enabling ever-larger sensor arrays composed of ever-smaller pixels. Another important field of work is image authentication, which will be especially helpful in legal and criminal contexts now that nearly all images are distributed in digital form, says John Kerekes, an associate professor at the Rochester Institute of Technology's Chester F. Carlson Center for Imaging Science. Homeland security is driving research in sensing technology, especially in the non-visible range, and biomedical imaging poses a more palatable alternative to invasive security procedures. With technologies such as face recognition and terahertz imaging, however, people need to be wary of sacrificing too much of their privacy, since those technologies also present opportunities for gross misuse, says Kerekes.
Click Here to View Full Article
- "U.S. Government Computer Blunders Are Common and Expensive"
Associated Press (01/31/05); Bridis, Ted
The FBI's failure to deliver its Virtual Case File project, a $170 million effort to set up a paperless, globe-spanning information-sharing system to more effectively investigate terrorists and criminals, is the latest in a long line of costly technology debacles the U.S. government has suffered over the last decade or so. The bureau said Virtual Case File was insufficient and out of date, and might be scrapped in favor of off-the-shelf software. Critics call the FBI fiasco a demonstration of the government's tendency to custom-build software, which can lead to significant increases in cost and complexity. Still, experts note that commercial products may need extensive refinement in order to be effective tools for terrorist-tracking and other services. Attempts by the FAA and the IRS to overhaul, upgrade, or replace key systems have also been criticized for missed delivery dates and budget overruns. TenFold CEO Nancy Harvey and others argue that corporate tech upgrade failures occur with almost the same frequency as federal upgrade failures, the difference being that the former receive nowhere near as much publicity as the latter. Some industry experts believe the FBI made a wise move in considering shelving Virtual Case File now, as the potential $170 million write-off would be far less expensive than other federal tech failures. "They should build off-ramps early in the process, so if they think things are going south, they can push the reset button," recommends Pentagon deputy CIO Paul Brubaker.
Click Here to View Full Article
- "Elite Collegiate Programmers to Compete at ACM's IBM-Sponsored International 'Battle of the Brains' in Shanghai"
Business Wire (02/01/05)
The 29th Annual World Finals of the ACM International Collegiate Programming Contest (ICPC) will pit 78 three-student teams against each other to solve at least eight real-world programming problems in five hours, with the international champion's crown going to the team that correctly solves the most problems in the least amount of time. This year's competition will also serve as an introduction to POWER parallel computing technologies in a separate challenge to build parallel applications on an IBM eServer Blue Gene supercomputer. "This event will give young programmers exposure to advanced programming environments, an experience that will help launch their careers in information technology," remarks IBM Centers for Advanced Studies program director Dr. Gabby Silberman. Sponsorship of the ACM contest plays a central role in IBM's university-based programs, which are dedicated to cultivating programming skills to nurture a more competitive, diverse, and innovative IT workforce. IBM supports the National Innovation Initiative by aiding universities through the provision of technology and expertise to produce "in demand skills for an On Demand world," technology grants for collaborative problem-solving research, and recruitment channels to recognize the most talented candidates for the IT workforce. IBM will be collaborating with ICPC team coaches to provide schools with technology and software, and to gain more insight into how professors keep their curricula up to date. Baylor University professor and ICPC executive director Dr. Bill Poucher says the contest has expanded five-fold since 1997 thanks to the cooperative effort among ICPC, ACM, IBM, and universities worldwide.
Click Here to View Full Article
For more information on ICPC, see http://icpc.baylor.edu/icpc/.
- "Passenger Screening, Take 10"
Wired News (01/31/05); Singel, Ryan
Traction is once again building for an upgrade to the airline passenger-screening system after several false starts. One false start was the Transportation Security Administration's (TSA) CAPPS II system, which drew criticism from Congress and privacy groups partly because of its use of commercial databases and algorithms to assess a passenger's threat level, an approach opponents argued could make things difficult for students, poor people, and those whose data is outdated. The latest proposal is Secure Flight, a centralized computer system for confirming passenger ID that is currently being tested with millions of actual passenger records supplied by airlines under a TSA mandate. The proposal requires airlines to give data dumps to the TSA, which will flag potential threats using a unified watch list coordinated by the Terrorist Screening Center. The TSA also wants to test whether passengers' IDs can be verified with commercial databases, but the Electronic Privacy Information Center's Chris Hoofnagle says private databases should not be employed for airline security no matter how the Secure Flight tests pan out. "These databases are only accurate enough for targeting of junk mail," he maintains. The TSA is prohibited by Congress from even testing the use of commercial databases until the Government Accountability Office verifies that solid privacy policies are in place. The TSA plans to gradually deploy Secure Flight this year, once congressional critics are appeased.
Click Here to View Full Article
- "Open Source Law Center Opens Doors"
InternetNews.com (02/01/05); Wagner, Jim; Singer, Michael
The Open Source Development Labs (OSDL) has set up a legal group to assist companies using free and open source software; the group will be headed by copyright lawyer Eben Moglen and initially staffed by two full-time lawyers, with two more expected to be added later this year. The Software Freedom Law Center (SFLC) was announced at the Enterprise Linux Summit, and will focus on providing front-end legal advice to help companies avoid patent litigation and other legal problems down the road. Specifically, the group will provide litigation support; advice on asset stewardship, IP claim conflicts, and licensing; and training for lawyers interested in learning about the General Public License (GPL). The SFLC is expected to help revise the GPL in its next iteration, and is also guided by Stanford law professor Lawrence Lessig, OSDL general counsel Diane Peters, World Wide Web Consortium technology and society director Daniel Weitzner, and Public Patent Foundation founder and executive director Daniel Ravicher. The SFLC was established with a $4 million investment from the OSDL, and is looking to set up offices in Asia and Europe, where legal issues regarding free and open source software are more complex. Peters says the SFLC will complement commercial source code investigation groups and IP insurance services, and would protect open source users from the legal maneuvers of companies whose business is threatened by open source software. Just last year, the OSDL and several vested commercial interests set up a $3 million Linux Legal Defense Fund to help pay for costs related to SCO lawsuits. The SFLC is a logical next step for the open source community, which today provides billions of dollars in value to businesses, says Moglen.
Click Here to View Full Article
- "Chatting Freely With Animated Historical Characters"
IST Results (02/02/05)
IST's NICE project has developed software that can facilitate natural and interactive conversation with animated characters, one application being a computer-generated Hans Christian Andersen that resides in a museum, chatting with visitors and answering questions verbally and nonverbally. The virtual Andersen can also determine whether people's remarks are understandable, irrelevant, or insulting, and respond accordingly. Project coordinator Niels Ole Bernsen explains that NICE has developed "domain-oriented" speech systems that imitate human interaction and dialogue rather than task-oriented speech systems that simply read out information. He notes that computer games companies are interested in the system, which is supported by a management module based on language understanding. Other important NICE technologies are its speech recognizers, which use acoustic data compiled from Swedish- and English-speaking children in numerous countries. In addition, a special Web-searching module allows the virtual Andersen to answer questions outside the system's knowledge, a feature that Bernsen says offers enormous potential to the entertainment and education industries. Morgan Freeman of Liquid Media, a NICE project partner, says the system enables characters to respond in a far more sophisticated way than those in comparable PC games; he observes that the advanced graphics and content of next-generation PCs and game consoles cannot reach their full potential without intelligent content and character generation, which the NICE technology is well suited for.
Click Here to View Full Article
- "Digital Detective Hunts Hidden Messages"
UDaily (University of Delaware) (01/28/05); Thomas, Neil
University of Delaware graduate students are working to develop steganalysis methods that can uncover hidden messages in digital images, hoping to foil possible terrorist communications. Steganography is the practice of hiding messages, and steganalysis is the practice of discovering those hidden messages. University of Delaware electrical and computer engineering professor Charles Boncelet is leading a team of graduate students in creating algorithms that can detect subtle abnormalities in images that indicate added data; the techniques developed for digital image steganalysis will later be applied to video and audio, he says. Eventually, the team plans to create automated scanners that can quickly analyze large amounts of multimedia for steganographic messages. Modern computer-based steganography places secret messages within normal-looking files so that only people who know where to look and have the key can extract the message. The National Science Foundation is funding the University of Delaware research for one year, including training in intelligence techniques for students. The U.S. Army Research Laboratory is also collaborating, as Boncelet says steganography is a major concern for governments because it could allow terrorist organizations to broadcast orders or other communications. Boncelet says, "When you hide a message in a digital image, you change the image a little bit. If you change the image too much, it gives it away."
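To make the detection idea concrete, here is a minimal Python sketch of one common scheme, least-significant-bit (LSB) embedding, together with a naive statistical check; it is purely illustrative and not the Delaware team's algorithm, and the synthetic pixel data stands in for the biased histogram statistics of a real photograph.

    import random
    from collections import Counter

    def embed_lsb(pixels, bits):
        """Hide a bit sequence in the least-significant bits of a pixel list."""
        stego = list(pixels)
        for i, bit in enumerate(bits):
            stego[i] = (stego[i] & ~1) | bit   # overwrite only the LSB
        return stego

    def pair_imbalance(pixels):
        """Mean imbalance between histogram pairs (2k, 2k+1); LSB embedding
        tends to equalize each pair, so an unusually low value hints at
        hidden data (the idea behind chi-square-style steganalysis)."""
        counts = Counter(pixels)
        num = sum(abs(counts[2 * k] - counts[2 * k + 1]) for k in range(128))
        den = sum(counts[2 * k] + counts[2 * k + 1] for k in range(128))
        return num / den if den else 0.0

    # Synthetic 8-bit "image" whose LSBs are biased, standing in for the
    # structured statistics of a real photograph.
    cover = [2 * random.randint(0, 127) + (random.random() < 0.2)
             for _ in range(100_000)]
    secret = [random.getrandbits(1) for _ in range(80_000)]
    stego = embed_lsb(cover, secret)

    print("cover imbalance:", round(pair_imbalance(cover), 3))  # roughly 0.6
    print("stego imbalance:", round(pair_imbalance(stego), 3))  # noticeably lower

Real steganalysis tools work on far subtler traces, but the principle is the same: hiding data nudges an image's statistics away from what an unaltered image would show.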
Click Here to View Full Article
- "In Search of More: The 'Friendly' Engines That Will Manage the Data of Daily Life"
Financial Times (02/01/05) P. 13; Waters, Richard
The search-engine business has begun to usher in a new age of information retrieval that could transform many facets of daily life; key to this transformation is making more of the world's data accessible to software "crawlers" that index information and make it searchable. Leading search engines have taken steps toward this vision in recent months by developing enhanced services such as hard-drive searching, Web video searching, TV news transcripts queries, and digital libraries. But although searchable TV programming and video content is highly desirable among users, most of the content is inaccessible or in an incompatible format; what is more, content owners and other commercial information companies as yet see no business incentive in digitizing their material. Another innovation users are clamoring for is search engines that can directly answer queries by retrieving actual data instead of Web page links, more accurately determine what the user is after, and align the results more closely to the user's preferences based on past searches. Looking five years ahead, Google technology director Craig Silverstein expects most users to conduct Web searches on mobile handsets using voice commands, with the devices communicating the results vocally as well. He also anticipates that search functions will eventually be assimilated and masked by a more intelligent Internet, becoming both pervasive and invisible. "Ten years from now, it will be much more difficult to distinguish searching for something from seeing and using it," forecasts technology author Charles Ferguson. To survive this change, search engines will need to formulate a new business model.
Click Here to View Full Article
- "Power to the Pen"
TheFeature (02/01/05); Frauenfelder, Mark
Mobile computer interfaces often rely on pens and touchscreens, and two new techniques from Microsoft researcher Ken Hinckley promise to make pen use more efficient, as well as open up new possibilities. Hinckley is focusing on simplifying the process of manipulating data with pens: Instead of having to stop writing notes and go to the menu to select a command, users would be able to quickly choose data and manipulate it by drawing a couple of circles and a line. Menu-based commands are much less of a burden on normal PCs because expert users memorize keyboard shortcuts and computer mice are more accurate pointers. With pen devices, users have a difficult time tapping accurately, especially when they have to move the pen farther; Hinckley's Scriboli would allow users to simply circle the desired data--a group of thumbnail pictures, for instance--and then implement commands by drawing a line in one of eight directions, each of which executes a different command. The technique leverages people's handwriting skills and minimizes the number of times they have to lift the pen from the screen surface. Another data manipulation technique called Stitching is meant to allow the natural sharing of data between people's mobile devices; currently, people have to know the other person's computer address in order to send them files, but Stitching would allow users to drag an item to the edge of their screen, pick it up, and then drop it onto the other person's device screen. The technique allows for much more natural face-to-face conversations and emulates the passing of physical papers from one person to another. Hinckley says Stitching could be implemented with RFID to identify the pen, but a more efficient method would be to measure timing and location of the pen so that it does not need its own ID.
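As a rough illustration of the direction-to-command idea (the command names and geometry below are invented for the example, not taken from Hinckley's Scriboli), a short stroke's direction could be mapped to one of eight actions like this:

    import math

    # Hypothetical command set; the article does not list Scriboli's actual commands.
    COMMANDS = ["copy", "cut", "paste", "delete", "group", "ungroup", "move", "tag"]

    def command_for_stroke(x0, y0, x1, y1):
        """Map a pen stroke from (x0, y0) to (x1, y1) to one of eight commands
        based on its direction, using 45-degree pie slices."""
        angle = math.degrees(math.atan2(y1 - y0, x1 - x0)) % 360
        sector = int(((angle + 22.5) % 360) // 45)
        return COMMANDS[sector]

    # A stroke drawn mostly to the right picks the first command.
    print(command_for_stroke(100, 100, 160, 105))   # -> "copy"

Because each 45-degree sector is large, the stroke does not have to be precise to land on the intended command, which is part of what makes gesture-based selection practical on a small pen display.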
Click Here to View Full Article
- "Software Taming Gene Data Pool"
Wired News (02/02/05); Philipkoski, Kristen
Microarrays enable scientists to determine what genes are being expressed and how, but extracting meaningful information from the vast amount of data generated by microarrays is a tough challenge. Software engineers, however, have stepped up to address it. A Salk Institute project to predict post-injury genetic changes related to spinal cord injuries involved damaging rodent spines and recording how the animals' genes responded over time. TopCoder software donated by company founder Jack Hughes made it possible to sift through the pile of data produced by Affymetrix's GeneChip microarray technology to learn which specific genes were active or inactive three hours, 24 hours, seven days, and 35 days post-injury. The pool of freely available online gene-expression datasets is growing as Stanford University, the Allen Brain Atlas, the National Institutes of Health, and other institutions post their findings. "Web tools for sharing large-scale gene-expression data are very important not only for laboratories to share data, [but also to] reduce unnecessary duplicative efforts, and to pool data for achieving statistical significance," comments Rutgers University cell biology and neuroscience professor Dr. Wise Young. Research sponsored by VizX Labs estimates that the market for such products could one day be worth $400 million, up from about $92 million now. VizX's GeneSifter, a product designed to provide user-friendly interfaces for gene-expression data analysis, was used by Washington State University scientists for research that has yielded several intriguing breakthroughs, including one that could lead to a better understanding of infant heart disease.
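As a simplified illustration of the kind of screening such software performs (the gene names, expression values, and two-fold threshold below are invented; this is not TopCoder's or GeneSifter's actual code), a time-course dataset might be filtered for responsive genes like this:

    def changed_genes(expression, threshold=2.0):
        """Return genes whose expression moved at least `threshold`-fold
        (up or down) at any post-injury time point relative to baseline."""
        flagged = {}
        for gene, (baseline, *timepoints) in expression.items():
            ratios = [t / baseline for t in timepoints]
            if any(r >= threshold or r <= 1 / threshold for r in ratios):
                flagged[gene] = ratios
        return flagged

    # Invented values: baseline plus readings at 3 hours, 24 hours, 7 days,
    # and 35 days after injury.
    data = {
        "geneA": (10.0, 35.0, 22.0, 12.0, 9.0),   # strong early response
        "geneB": (8.0, 9.0, 7.5, 8.2, 8.1),       # essentially unchanged
    }
    print(changed_genes(data))   # geneA is flagged, geneB is not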
Click Here to View Full Article
- "Law Barring Junk E-Mail Allows a Flood Instead"
New York Times (02/01/05) P. A1; Zeller Jr., Tom
Instead of curbing the growth of unsolicited junk email, the year-old federal Can Spam Act has helped it along: Estimates put spam at about 80 percent of all email sent, up from between 50 percent and 60 percent before the law was enacted. Antispam proponents such as Spamhaus Project founder Steve Linford contend that the law has legalized spamming by essentially granting bulk advertisers permission to send junk email as long as they adhere to certain regulations. Critics argue that Can Spam's biggest loophole is the requirement that recipients must opt out of being retained on an emailer's list; violators simply use opt-out requests to confirm that email addresses are valid and actively used. Institute for Spam and Internet Public Policy CEO Anne Mitchell says it is ridiculous to think that law enforcement agencies could halt spam's growth instantly, and notes that filters' general success probably contributed to the increase by forcing spammers to send out more junk email in order to maintain the dollar rate of return. Sen. Conrad Burns (R-Mont.) says judging Can Spam's effectiveness now is premature, noting in an email that the Federal Trade Commission may simply need a little prodding to enforce the law. Microsoft Internet safety enforcement lawyer Aaron Kornblum sees value in pursuing lawsuits against spam enablers under Can Spam, explaining that "Our objective with sustained enforcement activity is to change the economics of spamming, making it a cost-prohibitive business model rather than a profitable one." Unfortunately, analysts foresee the spam problem worsening as spammers take advantage of malware to turn PCs into "zombie" spam distributors and steal working email addresses from ISPs, while spam-friendly merchants subscribe to "bulletproof" Web hosting services to keep their Web sites offshore and out of U.S. jurisdiction.
Click Here to View Full Article
(Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)
- "Experts Using Computer Simulations in Delta 4 Probe"
Spaceflight Now (01/30/05); Ray, Justin
Computer simulation is being used to help determine why the main engines of a Delta 4-Heavy rocket cut off prematurely during a demonstration flight on Dec. 21, 2004. The lead suspect is cavitation, whereby liquid oxygen being piped through the feedline from the tank to the engine changes into gas, which may have fooled certain sensors into thinking that the liquid oxygen was depleted. This theory is being explored with simulations that model the flow of oxygen through the feedline on each booster, and the Air Force's Space and Missile Systems Center reported on Jan. 28 that initial results had yielded a better understanding of feedline flow dynamics. The statement indicated that the investigative team will gradually incorporate more detail into the simulations to more accurately represent the feedline and tank's internal structure, as well as simulate flight conditions at the time the engines cut out early. "The simulation team will use these models, along with the flight sensor data, to identify where cavitation is most likely to have developed in the liquid oxygen flow for the Heavy demonstration flight conditions," said the center. Simulations of liquid oxygen fluid dynamics during previous Delta 4 missions where the problem did not crop up will also be run to determine how accurate the cavitation models are. "I'm confident the data will also enable us to fine-tune all aspects of our future Delta 4 heavy-lift missions," declared Col. John Insprucker, the Space and Missile Systems Center's Evolved Expendable Launch Vehicle program director.
Click Here to View Full Article
- "Joysticks in the Classroom: Game-Design Programs Take Off"
Chronicle of Higher Education (02/04/05) Vol. 51, No. 22, P. A29; Mangan, Katherine S.
Colleges across the country are offering students courses--and even degrees--in video-game design, despite the misgivings of academicians as well as dedicated game enthusiasts. Seasoned game designer Warren Spector admits that the concept of college-trained developers is so new that initial skepticism among the professional community is unavoidable, but he says graduates will be embraced by the leading game companies, whose workforces are so large "that they'd be crazy not to encourage any form of preprofessional training they can find." Graduates of Southern Methodist University's video-game design program are especially well-positioned for employment, since the local area is swarming with video-game companies. The program integrates computer science and the humanities, with entering students selecting an emphasis in art, "level design," or software development; required introductory courses cover game development's impact on culture, society, law, and business. Student teams collaborate to produce games under the supervision of 12 faculty members, all of whom are game-industry veterans. Such courses can also prepare students to deal with the grueling "crunch times" for which many critics have excoriated the gaming industry. Peter Raad, executive director of the Southern Methodist program, says the project initially drew scorn and unease among the faculty, given the recent negative publicity surrounding violent games. However, he expects the program will ultimately be recognized as a training ground for future game designers as well as a place where disciplines are combined to create more educational games.
Click Here to View Full Article
(Access to this article is available to paid subscribers only.)
- "Simulating Fallujah"
Computerworld (01/31/05) P. 28; Verton, Dan
To better train soldiers for dangerous urban conflict situations, the U.S. military is using fully automated, self-contained virtual environments that combine state-of-the-art IT systems and graphics engines with conventional explosives and fireworks to enhance their realism. VirTra Systems' Michael Kitchen says an emerging market for "fourth-generation warfare" products is being fueled by the war against terrorism. "Through the development of [advanced simulators], the soldiers can be placed in situations that are taken from actual combat incidents or created for specific missions," he notes. Paul Debevec, a filmmaker at the Institute for Creative Technologies at the University of Southern California, says programmers have only just begun to exploit the capabilities of PC graphics cards, which have overtaken those of costly simulation engines in recent years. "In particular, the cards now allow for arbitrary floating-point calculations to be performed at each pixel and vertex of a model, and the frame buffers have sufficient bit depth to represent the full range of light seen in the real world, from deep shadows to blinding sun," he says. Debevec expects the cards' speed and parallelism to advance to the point where they will be able to produce real-time 3D models that precisely mimic real-world surface reflections, enough polygons to generate realistic virtual human beings, and real-time simulations of light inter-reflections between the sky, walls, ground, and attire. Research is also underway to create virtual characters that can reason and interact with trainees using natural language. These characters would represent fellow soldiers and both friendly and unfriendly locals, exhibiting unpredictable, unscripted behavior to seem more true-to-life.
Click Here to View Full Article
- "Reverse Engineering a Controversy"
Information Today (01/05) Vol. 22, No. 1, P. 17; Pike, George H.
Many digital products blend both copyrighted and non-copyrighted components, and licensing restrictions or technological mechanisms backed by the Digital Millennium Copyright Act (DMCA) or other legislation can make accessing the non-copyrighted elements infeasible. Reverse engineering helps software developers get around this problem by enabling them to probe applications to uncover their underlying source and object code, separate the copyrighted components from the non-copyrighted ones, and then use the acquired knowledge to make new products or variants of the original products. Reverse engineering is a gray area in terms of legality, as demonstrated by a recent federal appellate court's decision to overturn an injunction against Static Control Components (SCC), which reverse engineered the source code on a Lexmark microchip embedded in Lexmark toner cartridges so as to create an alternative chip that would allow both third-party and refilled Lexmark cartridges to interact with Lexmark printers. Lexmark took legal action against SCC for copyright infringement as well as violation of the DMCA's anti-circumvention provisions, but the appellate court ruled that the DMCA did not apply because SCC's chip did not circumvent access to Lexmark's printer software, as Lexmark had claimed; rather, the SCC chip used the software to access the printer. The court also found that reverse engineering was supported as fair use in this case. In addition, the primarily functional rather than creative nature of the Lexmark chip's code did not demonstrate enough originality to warrant copyright protection.
- "Coming to a Store Near You"
Software Development (01/05) Vol. 13, No. 1, P. 28; Henry, Peter
Radio frequency identification (RFID) technology has only recently started to catch on, despite being available for over two decades--and its adoption by retailers, distributors, and producers is encouraging experimentation. RFID is headed toward ubiquity, and will be a determinant in how a company competes in business, as well as how people personally shop. Major retailers such as Wal-Mart and Target are ahead of the curve in their aggressive application of RFID to supply-chain optimization. RFID's immediate industrial success faces a number of obstacles, among them standards and technology platforms still in flux, public concerns about privacy infringement and information abuse, a lack of integration, variable performance, and radio-frequency ranges. RFID operates primarily at three frequencies--13.56 MHz, 915 MHz (favored by retailers), and 2.45 GHz (favored by the medical and pharmaceutical industries)--each of which operates within a permitted modulated range and has its own performance quirks in tag size, read range, and tolerance to the materials on which tags are affixed. Support for multiple frequencies looks likely in the future. Meanwhile, the performance of RFID tags can be affected by the RFID system configuration, the material the tags are placed on, location, and whether the tags are passive (backscatter) or active (battery assisted). With proper planning, testing, incremental implementation, and minimal disruption of existing business processes, RFID can trim labor operations and costs, maintain stocking levels, supply immediate physical accountability, optimize the supply chain, deliver actionable status more quickly, prevent loss, track theft, instigate dynamic marketing, and facilitate event management and tracking in real time.
- "A Contested Election for Touch-Screen Voting"
Digital Transactions (01/05) Vol. 2, No. 1, P. 38; McKinley, Ed
Almost 30 percent of U.S. voters used touch-screen electronic voting machines in the 2004 presidential election, and e-voting advocates claim the systems' relatively error-free performance is a confirmation of the technology's reliability. However, critics are calling for deeper analysis of e-voting machines, tougher standards, and more safeguards; adding fuel to their fire were complaints of undervoting, voter disenfranchisement, and system malfunctions during the election. University of Iowa professor Douglas Jones, a member of the Iowa Board of Examiners for Voting Machines and Electronic Voting Systems, would like to see a federal election oversight agency created to enact stricter voting-machine standards and reduce the learning curve poll workers face when new technology is introduced. Touch-screen manufacturers dispute critics' claim that their systems are highly vulnerable to hacking, arguing that the lack of an Internet connection keeps this threat to a minimum. Opponents counter that bugs and fraud could crop up during the programming process before the machines are shipped for field use, while the modem or wireless transmission of election returns between precincts and counties is also a point of vulnerability. Meanwhile, some states have passed legislation requiring touch screens to be outfitted with voter-verifiable paper trails: Experts think California's mandate to have paper receipts in all voting systems by next year could be applied nationally. Michael Kerr of the Information Technology Association of America's Election Technology Council supports continued analysis of the paper-audit issue by the National Institute of Standards and Technology and other groups before laws are passed that could overlook better techniques. Critics also complain of inadequate testing of touch-screen systems, and allege that manufacturers are choosing and paying testing companies, thus introducing bias.
For information on ACM's e-voting activities, visit http://www.acm.org/usacm.