ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 622:  Wednesday, March 24, 2004

  • "U.S. Students Shun Computer Sciences"
    San Jose Mercury News (03/24/04); Schoenberger, Karl

    Enrollment of U.S. undergraduate students in computer technology and engineering fields declined by 19 percent in the 2002-03 academic year, while the number of newly declared computer science majors experienced a precipitous 23 percent drop-off last year, according to the Computing Research Association's annual Taulbee Survey. David Hayes, chair of San Jose State University's computer science department, reports that enrollment in the department has decreased by one-third over the last three semesters, while the number of majors has fallen from over 2,000 students to fewer than 1,400. Hayes says, "I'm hoping this trend will stabilize, but I don't know that it will." Many educators say students are concerned about offshore outsourcing leading to fewer job opportunities. "We should be prepared to take a look at what we're teaching and ask if we're preparing our students for jobs that are going to be subject to outsourcing," suggests Ohio State University's Stu Zweben, who directed the Taulbee Survey. He is also confident that the marketplace will continue to require a steady influx of tech graduates. "If you believe that most of the solutions to the complex problems that companies need to solve haven't been written yet, well, that's what we should be preparing our students to do," Zweben comments. Falloffs in computer-science undergrad enrollments at leading universities appear to be proceeding at a slower pace than those at second-tier schools, but Stanford University's Robert Gray says these students have had to lower their career expectations in the wake of offshoring.
    Click Here to View Full Article

  • "Online Swindlers, Called 'Phishers,' Lure the Unwary"
    New York Times (03/24/04) P. A1; Hansell, Saul

    A growing number of "phishing" emails is worrying Internet companies, banks, and insurance firms, which often end up absorbing the cost of identity theft. ISP Earthlink has launched an aggressive campaign against phishers, who cost the company approximately $100,000 in customer support and related costs for each phishing email distributed, and Earthlink chief privacy officer Les Seagraves says criminal enterprises are assuming the bulk of phishing activity, which has increased significantly in just a few months. A growing share of phishing emails and Web sites trace back to Russia, Eastern Europe, and Asia, and the attacks are becoming more sophisticated. Phishing originated when America Online billed customers per hour and miscreants, posing as AOL customer service agents, used instant messaging or email to trick other users into giving them their passwords--so the phishers could stay online at others' expense. Spam filtering company Brightmail says 4 percent of all its processed email in February consisted of phishing messages, up from just 1 percent of messages in September. Although banks say phishing accounts for a small percentage of overall identity theft and fraud cases, victims are often hard hit because they do not realize they have been targeted: People sometimes do not understand that unauthorized charges on their credit card bill are linked to a security alert they received from PayPal, for instance. Experts say phishers more often rely on fear than anything else, though their first goal is simply to excite some emotional response so that people reply quickly. Phishing emails and Web sites now use more polished grammar and less suspicious Web addresses such as "yahoo-billing.com." The most recent version of Microsoft's Internet Explorer makes it more difficult for phishers to use fake Web sites, and companies such as Earthlink and eBay have released toolbars to alert users when they are being phished. Even with education and prevention efforts, law enforcement will remain necessary.
    Click Here to View Full Article
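
    Deceptive addresses of the kind described above, such as "yahoo-billing.com," can be flagged with simple string heuristics. The sketch below is illustrative only (the brand list and rules are invented here, not taken from any vendor's actual anti-phishing toolbar):

```python
# Flag hostnames that embed a well-known brand name but are not actually
# hosted under that brand's registered domain.
KNOWN_BRANDS = {"yahoo": "yahoo.com", "paypal": "paypal.com", "ebay": "ebay.com"}

def looks_like_phish(hostname: str) -> bool:
    hostname = hostname.lower().rstrip(".")
    for brand, legit in KNOWN_BRANDS.items():
        if brand in hostname:
            # Legitimate hosts end with the brand's registered domain,
            # e.g. "billing.yahoo.com"; "yahoo-billing.com" does not.
            if not (hostname == legit or hostname.endswith("." + legit)):
                return True
    return False

print(looks_like_phish("yahoo-billing.com"))   # True
print(looks_like_phish("billing.yahoo.com"))   # False
```

    Real toolbars combine such checks with blacklists and page analysis; a brand-substring test alone would miss many attacks and flag some legitimate sites.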

  • "Next Net Moves Forward"
    CNet (03/22/04); Reardon, Marguerite

    The second phase of testing for Internet Protocol version 6 (IPv6) was recently completed as the Moonv6 testbed network was put through its paces for two weeks in March by the North American IPv6 Task Force, the Defense Information Systems Agency's Joint Interoperability Testing Command, and other entities. IPv6 is looked upon as the heir to IPv4, which many people say does not furnish sufficient IP address space to support the millions of devices that will likely swell the Internet in the coming years thanks to the emergence of mobile communications and new IP services. The Moonv6 network was tested for quality of service, security, networking protocols, application handling, and end-to-end domain name server functionality on all major operating systems. The first phase of testing, which was wrapped up last October, only evaluated fundamental routing applications and basic network configurations. The second phase subjected routing protocols such as Border Gateway Protocol and Open Shortest Path First to more rigorous testing, and also covered 10-Gigabit Ethernet and firewall functionality. The Moonv6 project demonstrated that IPv6 and IPv4 can coexist on the same network, which should make the eventual switchover less of a headache. The Moonv6 initiative and the adoption of IPv6 in the United States have been chiefly spurred by the Defense Department, which requires that all agencies be IPv6-enabled within two years. AT&T VP Rose Klimovich observes that most of the American interest in IPv6 is coming from the government.
    Click Here to View Full Article
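
    The IPv4/IPv6 coexistence that Moonv6 validated is what lets ordinary applications stay protocol-agnostic: a client resolves a name, gets back a mix of IPv6 and IPv4 addresses, and tries each in turn. A minimal sketch of that standard dual-stack pattern (not part of the Moonv6 test suite itself):

```python
import socket

def connect_dual_stack(host: str, port: int) -> socket.socket:
    """Try every address the resolver returns, IPv6 and IPv4 alike."""
    last_err = None
    for family, socktype, proto, _name, addr in socket.getaddrinfo(
            host, port, socket.AF_UNSPEC, socket.SOCK_STREAM):
        try:
            s = socket.socket(family, socktype, proto)
            s.connect(addr)
            return s  # first reachable address wins, regardless of IP version
        except OSError as err:
            last_err = err
    raise last_err or OSError("no addresses resolved")
```

    Because the loop is driven by whatever the resolver returns, the same client code keeps working before, during, and after a network's migration to IPv6.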

  • "Copy Protection Plan Squeezes Home Users"
    EE Times (03/22/04); Yoshida, Junko

    Consumer electronics makers, PC hardware firms, and software companies are working together to define an "Authorized Domain" in which all the digital content in a home would be subject to technical constraints. Metadata attached to the content would regulate how it is copied, stored, displayed, or redistributed, if at all. The Digital Video Broadcast Project says its Copy Protection/Copy Management (CPCM) scheme is meant to enable end users to freely share legal digital content among the different devices they own. The goal is to formulate a cheap and simple solution before "Napsterization" of digital video content forces numerous proprietary solutions onto the market--if that happens, industry insiders say, consumers would be stuck with incompatible devices, and manufacturers and broadcasters would have to conform to different copy-protection standards. CPCM will allow content owners to broadcast usage rules, but will not require consumers to register their devices and will not include a return path feeding usage data to companies. "We want to do this without depending on Big Brother," said CPCM subgroup chairman and Walt Disney Television executive Chris Hibbert. In-Stat/MDR senior analyst Mike Paxton describes CPCM as a "fairly benign" solution, but says it could create more or less controversy depending on how it is implemented. The most difficult part for the industry coalition will be creating the actual copy protection and management software; previous attempts to encase digital content in software defenses have been defeated by hackers, who then go on to create limitless copies of the content. CPCM would also encounter difficulties when devices themselves change hands or when households split up.
    Click Here to View Full Article

  • "Pay Once, Share Often With LWDRM"
    Wired News (03/23/04); Van der Pluijm, Henny

    Light Weight Digital Rights Management (LWDRM) technology developed by Germany's Fraunhofer Institute has the potential to resolve the bitter feud between the music industry and file sharers by granting consumers more freedom without compromising the rights of the industry. The institute considers the music industry's control over the development of current digital-rights systems to be those systems' biggest weakness; LWDRM allows consumers to decide what to do with the song or video clip they purchase by downloading a digital certificate from an independent certification authority, which affixes itself to the file and records the nature of its usage. Consumers will be permitted to copy music clips within a fair-use scheme, and will be responsible for what happens to the clips once they are tagged with the certificate. The number of people who can copy or borrow the products can be adjusted by record companies. Though the certificate can be circumvented, the adoption of LWDRM could improve the music industry's public image. A Fraunhofer representative declared that the institute will license the LWDRM technology with a free and fully integrated online store in an effort to draw industry support, as well as attach an integrated digital-payment system to the technology. "The aspects of interoperability, strong security, user friendliness and the licensing system are very strong points of LWDRM," noted GESAC secretary general Veronique Debrossis. LWDRM was originally developed for the MPEG-4 format, but Fraunhofer modified the technology for the highly popular MP3 format.
    Click Here to View Full Article
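
    The article does not spell out LWDRM's file format, but the general idea of binding a purchaser's identity to a file with a verifiable tag can be illustrated with a simple signed record. The field names and the HMAC scheme below are assumptions for illustration, not Fraunhofer's actual design (which uses certificates from an independent authority):

```python
import hmac, hashlib, json

def tag_file(data: bytes, owner_id: str, key: bytes) -> dict:
    """Attach a signed ownership record to a media file's bytes."""
    record = {"owner": owner_id, "sha256": hashlib.sha256(data).hexdigest()}
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return record

def verify_tag(data: bytes, record: dict, key: bytes) -> bool:
    """Check that the record is authentic and matches this exact file."""
    claimed = {k: v for k, v in record.items() if k != "signature"}
    payload = json.dumps(claimed, sort_keys=True).encode()
    good_sig = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(good_sig, record["signature"])
            and claimed["sha256"] == hashlib.sha256(data).hexdigest())
```

    The point of such a tag is accountability rather than prevention: the file still plays anywhere, but a copy that escapes into the wild carries a record of who released it, which is the trade-off LWDRM makes in exchange for consumer freedom.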

  • "Robolympics Contestants Shoot for Gold"
    Nature (03/18/04); Pearson, Helen

    Some 414 robots and 600 human entrants have registered for the first-ever Robolympics, an international competition that will test the machines' mettle across a variety of disciplines through an assortment of grueling contests. Robotics Society of America President James Calkins, who founded the Robolympics, says the event offers roboticists an opportunity to meet and exchange ideas, and is designed to increase the public awareness of robotics. Scheduled contests include the Humanoid Robot World Cup Soccer Tournament, wherein fully autonomous machines will be required to shoulder weights, sprint 1.2 meters, navigate an obstacle course, and land a football penalty kick. "It is really testing the limits," notes University of Manitoba computer scientist and Robolympics contestant Jacky Baltes. Another competition will put combat-designed robots through their paces, with those that successfully eliminate their rivals winning as much as $3,400. Participants in the Ribbon Climber contest must race up a carbon-fiber ribbon, while a robo-triathlon will be held to determine the fastest wheeled and legged machines, as well as the fastest robots on water. Researchers say that events such as the Robolympics inspire the development of other robots with practical applications, such as machines that hunt for survivors in the wreckage of an earthquake, or "space elevator" technology in the case of the Ribbon Climber competition.
    Click Here to View Full Article

  • "Ad-Hoc Wireless Networking Technology for Battlefield Environments"
    ScienceDaily (03/23/04)

    Under the auspices of its Multidisciplinary Research Initiative (MURI) program, the U.S. Defense Department has apportioned about $3 million in funding over three years to a project to develop "space-time processing for tactical mobile ad-hoc networks" for establishing wireless mobile communications on the battlefield. The initiative involves six universities and is being led by electrical engineers at the University of California, San Diego. Ad-hoc wireless networks will help address problems soldiers in battlefield environments face, such as the difficulty in maintaining communications while moving about without being jammed or monitored by the enemy. Resilience and redundancy are also key, as the networks will have to remain operational even when communication nodes are damaged, destroyed, or out of range. A tactical ad-hoc network could incorporate hardware including backpack-mounted mobile radios, laptops, handheld computers, antennae affixed to vehicles, and airborne relays. The adaptability of such a network to changing conditions will depend heavily on a cross-layer algorithm that will allow the different layers of communications management programs to make collaborative decisions. There will also be an investigation into new antenna technology, coding, and error-correction systems, such as multiple-input/multiple-output (MIMO) devices that could allow communications to be maintained in volatile areas. James Zeidler of UCSD's Jacobs School of Engineering reports that tactical ad-hoc networks could also be applied as an emergency response tool for police and firefighters.
    Click Here to View Full Article

  • "Technology Solution to Slicing Spam Lags"
    CNet (03/22/04); Olsen, Stefanie; Festa, Paul

    Efforts to develop anti-spam technology standards are displaying a profound lack of unification, and some anti-spam experts are taking a long, hard look at the standards issue's progress in the wake of AOL, EarthLink, Microsoft, and Yahoo!'s joint lawsuit against scores of spammers. There have been few public signs of teamwork between the members of the Anti-Spam Technical Alliance (who are also the plaintiffs in the lawsuit), but they are individually developing anti-spam measures: AOL recently started testing its DNS-based Sender Policy Framework (SPF); Yahoo! has discussed plans to support the proposed DomainKeys email sender authentication scheme; and Microsoft has devised an email verification scheme of its own, Caller ID for Email, that focuses on message headers rather than senders. Members of the alliance acknowledge that agreement on common standards has proceeded slowly, partly because the problem is so complicated and there is little conclusive research into how effective these separate standards would be. Outblaze CTO Suresh Ramasubramanian predicts that components of the more viable of these standards initiatives will eventually be integrated into a compromise proposal. An AOL spokesperson says coalition members intend to test each other's proposed solutions, but are still engaged in separating the workable from the unworkable solutions. SPF, which has been deployed by AOL and Google and selected for IETF assessment, is a leading candidate for the common technical anti-spam solution. MX Logic's Scott Chasin suggests that proposed technical solutions developed by the Internet Research Task Force's Anti-Spam Research Group might attract more backing than any one company's proposal, and adds that technical solutions will have to be complemented by education and legislation if spam is to be effectively corralled.
    Click Here to View Full Article
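
    SPF works by having a domain publish, in a DNS TXT record, which hosts are allowed to send mail on its behalf; a receiving server compares the connecting IP against that record. A toy evaluator over an already-fetched record is sketched below (the real SPF specification has many more mechanisms; this handles only `ip4` and a trailing `-all`/`~all`, and the sample addresses are invented):

```python
import ipaddress

def spf_check(txt_record: str, sender_ip: str) -> str:
    """Evaluate a fetched SPF TXT record against the connecting IP."""
    assert txt_record.startswith("v=spf1")
    ip = ipaddress.ip_address(sender_ip)
    for term in txt_record.split()[1:]:
        if term.startswith("ip4:"):
            # The sender is authorized if its IP falls in a listed network.
            if ip in ipaddress.ip_network(term[4:], strict=False):
                return "pass"
        elif term in ("-all", "~all"):
            # No mechanism matched: hard fail or soft fail per the record.
            return "fail" if term == "-all" else "softfail"
    return "neutral"

print(spf_check("v=spf1 ip4:64.12.0.0/16 -all", "64.12.1.5"))  # pass
print(spf_check("v=spf1 ip4:64.12.0.0/16 -all", "10.0.0.1"))   # fail
```

    Schemes like DomainKeys take the complementary approach of signing message content cryptographically, which is why the alliance members see the proposals as candidates to be combined rather than strict rivals.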

  • "Clever Critter May Detect Hard-Drive Failures"
    NewsFactor Network (03/23/04); Martin, Mike

    Variations in temperature may presage a hard-drive failure, and Carnegie Mellon University (CMU) researchers have designed a new sensor to detect such variations. Chriss Swaney of CMU reports that gauging the amount of heat produced in a hard drive on a daily basis could help extend the life of the drive as well as save time and money. Michael Bigrigg of CMU's Institute for Complex Engineered Systems explains that there has been an almost 100 percent increase in the volume of material stored on hard drives over the past three years. He adds that researchers expect the amount of recorded global-climate data to swell from 2 billion GB in 2000 to 15 billion GB by the end of the decade. CMU's "Critter Temperature Sensor," which is the size of a dime, attaches to the user's desktop and watches for the slightest temperature variation. Bigrigg says the sensor will not only allow hard-drive failures to be predicted early, but will help researchers draw new insights on energy waste. However, Laszlo Kish of Texas A&M University cautions that heat is not the only factor that can lead to hard-drive failures. "This protection, which is to switch off the drive when it is too hot, would work only with one class of failures," he notes.
    Click Here to View Full Article
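
    The article does not describe the sensor's algorithm, but the basic idea of flagging anomalous temperature variation can be sketched as a rolling-baseline check. The window size and threshold below are invented for illustration:

```python
from collections import deque

class TempMonitor:
    """Warn when a reading deviates sharply from the recent average."""
    def __init__(self, window: int = 60, max_delta_c: float = 5.0):
        self.readings = deque(maxlen=window)  # rolling window of recent temps
        self.max_delta_c = max_delta_c

    def add(self, celsius: float) -> bool:
        """Record one reading; return True if it looks anomalous."""
        anomalous = False
        if self.readings:
            baseline = sum(self.readings) / len(self.readings)
            anomalous = abs(celsius - baseline) > self.max_delta_c
        self.readings.append(celsius)
        return anomalous
```

    As Kish's caveat in the article suggests, a scheme like this catches only thermally driven failures; electrical and mechanical faults need other signals, which is why drives also report SMART attributes beyond temperature.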

  • "Computers and Sensors Find Home in UF Seniors Project"
    Orlando Sentinel (03/21/04); Sichelman, Lew

    University of Florida researchers are developing and testing assisted-living technologies designed to aid senior citizens without sacrificing their freedom. A lab in UF's computer science engineering building is decked out as a mockup of a living space outfitted with assorted assistive devices, such as smart appliances and sensors that monitor a subject's movements, all of which are linked to a central computer network. "What this home demonstrates is the evolution from [assistance] devices to [assistance] environments," explains Sumi Helal, director of technology development for the UF Rehabilitation Engineering Research Center on Technology and Successful Aging. Examples of future assisted-living innovations that could be incorporated into the home include microwave ovens that automatically cook food according to specifications embedded in radio frequency identification (RFID) tags affixed to packages, and that can tell residents when the meal is ready via a video display; monitors programmed to alert caregivers if the resident runs into trouble; mobile phones that can control lights, curtains, the TV, and the stereo by vocal command; and sensors that notify residents of leaks, strangers at the door, and mail deliveries. Helal is confident that such an environment will reach mainstream acceptance in less than 10 years, once the diverse technologies have been integrated. He notes, however, that policy makers will have to determine whether these technologies are acceptable or constitute a breach of privacy, while retailers will have to sell products with built-in RFID tags. William Mann of UF's College of Health Professions cites studies indicating that seniors who use such devices are less costly to the health system, and usually exhibit a slower onset of decrepitude. He adds that the elderly and new technology are more compatible than many people think.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "'Nano-Lightning' Could Be Harnessed to Cool Future Computers"
    Newswise (03/22/04)

    Purdue University scientists have developed an ion-driven cooling device for electronic circuits that could prove significantly more reliable and less expensive than liquid-based cooling, and perhaps enable laptops to employ higher-performance chips whose heat output is beyond the tolerance of current laptops. Mechanical engineering professor Suresh Garimella says that conventional silicon manufacturing methods used in the semiconductor industry could be employed to fabricate most of the device's features. The device, whose development was partly funded by the National Science Foundation, involves a chip upon which are closely placed positively charged and negatively charged electrodes; the negatively charged electrodes consist of carbon nanotubes whose tips boast a maximum width of 5 nm. Passing current through the electrodes causes the negatively charged nanotubes to generate electrons, which are drawn toward the positively charged electrodes. The electrons' reaction with the surrounding air ionizes the air molecules, producing nano-scale lightning bolts and generating currents similar to corona winds. Future cooling devices modeled after this template will boast an "ion-generation region" where electrons are discharged, and a "pumping region" where another array of electrodes will produce the cooling effect by drawing in the clouds of ionized electrons and pumping them forward by rapidly switching between voltages. A later version of the design might use a thin film of diamond in place of the carbon nanotubes, since diamond is easier to fabricate and more durable. Garimella says the chief advantage of this concept is that "the creation of air as well as the cooling is all happening on one chip."
    Click Here to View Full Article

  • "Digital Revolutionary: Interview With Leonardo Chiariglione"
    Scientific American (03/22/04); Pistoi, Sergio

    After ushering in the MP3 and MPEG standards, electronics engineer Leonardo Chiariglione says the digital revolution is still far from being realized: His new Digital Media Project (DMP) is meant to complete the picture and bring about a new digital music experience for end users, new business opportunities for the music industry, and new ways for creators to express themselves. Although Chiariglione says music piracy is deplorable, he admits that the music industry's intransigence and unwillingness to go completely digital has resulted in a "stalemate" where both sides lose. MP3 was not meant to enable rampant online music piracy when the standard was approved in 1992; at that time, no one imagined that, within a decade, PCs would be powerful enough to decode MP3s and the Internet would have so much bandwidth. Chiariglione says online music stores such as Apple's iTunes are not good solutions because they actually impose more limits on users than did physical media such as tapes and store-bought CDs. DMP aims to give consumers unprecedented freedom with legally acquired music, and in the process will take away much of the incentive for online piracy, he says. The DMP system will not use a single algorithm, but will be a framework that can update algorithms via the Internet or wireless connections in order to mitigate cracking exploits; furthermore, each manufacturer would be able to use their own algorithms compatible with the universal standard. Chiariglione says future digital media standards will enable new Web media applications, such as media searches on the Web. By inputting sequences of digital music or video online, future Web search engines would be able to find similar media or content related to that media, and such a system could also be used for distribution.
    Click Here to View Full Article

  • "All Our Lives Are on File, for Sale"
    Atlanta Journal-Constitution (03/20/04); Stanford, Duane D.

    Many companies sell or trade consumers' personal information, gathering it from Internet sites, credit applications, government Web sites, and other sources. Much of the data is publicly available, and volunteered by consumers, but privacy experts worry that the accumulation of such data could be a problem, especially since consumers do not know who is collecting their data and where it is stored. Jeff Matsuura, director of the law and technology program at the University of Dayton School of Law, says, "I put a little information out here, and a little information there, and I don't really expect it's going to get all sucked up into one big file. I don't think people appreciate that is happening, and people would be more concerned if they did." Those in the business of accumulating information go to public records and data stores such as courthouse files and government agencies for their data. ChoicePoint gets public records from government agencies and courthouses and maintains a database of over 19 billion records, which is used to complete background checks. Information portal Abika offers cheating checks and psychological profiles, and co-founder Jay Patel got started by tracing email messages; his company gets 10,000 orders per month, he says. Abika uses a spider search program to find information on the Internet, and checks chat rooms, newspaper and magazine articles, and company databases. Freedom of Information director Charles Davis believes that people overreact to the issue of privacy, first by giving up information easily and then complaining about it. Davis worries that privacy fears will cause governments to further restrict access to public records, a practice already on the rise due to national security concerns. Davis believes public records must remain open to public scrutiny. He says, "The great irony here is that the same thing that makes it so frightening is the same thing that makes it so empowering."
    Click Here to View Full Article

  • "IIT Software to Break Language Barriers"
    Times of India (03/20/04); Pandey, Jhimli Mukherjee

    Students and professors in IIT Kharagpur's computer science department are collaborating with leading Indian linguists to develop translation software designed to close the gap between the "educated elite" of India's urban centers and the less literate rural residents. "We have been given the responsibility of bridging this digital divide, so that rural people are also able to make use of the Internet," notes IITKgp professor Sudeshna Sarkar. The Union ministry of communications and information technology commissioned the institute to develop the software, and asked that the software initially focus on Eastern India's major languages--Assamese, Bengali, Hindi, and Oriya. The institute's computer science department is the only IIT network member that boasts a Communications Empowerment Lab, which is headed by Anupam Basu. He explains that the languages' grammar and the elements they share will be critical to the refinement of the software. "We are studying how verb forms change when we change the tense or mood of an expression," Basu adds. Sarkar says the software is also being designed to ease communication for users who are blind or have cerebral palsy, as well as other physically challenged users. One way the software does this is by converting computer text into speech.
    Click Here to View Full Article

  • "Location Awareness: The Key to Better Mobile Networks"
    IST Results (03/18/04)

    Mobile phones' location awareness capabilities could be tapped to significantly improve mobile network performance and management, as demonstrated by tests conducted under Information Society Technologies' CELLO project. The network-capacity difficulties mobile operators must contend with will only increase as user numbers and data rates skyrocket thanks to Web browsing and other widely used services. The traditional strategy for boosting a mobile network's capacity is to scale back the radius of base stations' area of coverage--known as "cells"--and add more cells, but many metropolitan areas already boast near-maximum cell density. CELLO project coordinator Jaako Lahteenmaki, from the Finnish research institute VTT Information Technology, says trials have proved that "it's possible to enhance 2G and 3G networks by using the location information available from the cellular network and mobile phones." When used together, the network and cell phones enable a massive distributed network-monitoring system that can gauge the spatial distribution of users and their signal data. Network operators could increase the accuracy of network performance data--and obtain it in real time--using location information broadcast by mobile phones, which would allow for faster coverage problem fixes. At the heart of the CELLO project is the Mobile Network Geographic Information System, a tool that collates location-specific information and stores it in a database, where it can be tapped by other network-planning and optimization applications. The results of the CELLO trials were sent to the 3GPP mobile communications standards body.
    Click Here to View Full Article

  • "Blueprint for Code Automation"
    Computerworld (03/22/04) Vol. 32, No. 12, P. 25; Sliwa, Carol

    Model-driven architecture (MDA) is slowly gaining acceptance among organizations that start with small projects highlighting MDA's benefits: For instance, the state of Wisconsin used MDA to develop a new Web-based unemployment insurance benefit system, laying down business process specifications on a diagram instead of writing code. Proponents of MDA say the approach provides a common language for business and IT sides, more disciplined code, and cost-savings from code reuse. MDA is sponsored by the Object Management Group (OMG) and uses Unified Modeling Language (UML) and other code-generating tools, and OMG CEO Richard Soley says at least two companies are creating all of their code with the MDA approach. Business analysts and developers jointly map out how the system will work using UML descriptions and create a model, then the chosen code-generation tool automatically generates implementation code. Wisconsin Workforce Development project director Lee Carter says another benefit of MDA is the ability to translate code back into business requirements, making it far easier for future development teams to understand the thought behind the application; he adds that MDA allowed the project to be completed much faster than anticipated. IT outsourcing provider PFPC reports similar gains in efficiency as well as accuracy and fewer hidden design flaws, while PFPC senior architect Ian Muang says developers in different locations can also work together more easily with MDA. This commonality might also facilitate simpler switches in runtime platforms since business and application logic is first defined in a platform-independent model (PIM) before being run through the code-generation tool. UML 2.0 should also provide more flexible and effective action semantics. Experts note, however, that MDA has serious drawbacks, including significant cultural resistance and tools that do not adhere to standards.
    Click Here to View Full Article
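
    The generate-from-model step described above can be caricatured in a few lines: a platform-independent description of an entity is turned into implementation code mechanically. The model format below is invented and vastly simpler than UML, and the `Claim` entity is a hypothetical nod to the unemployment-insurance example, not Wisconsin's actual schema:

```python
# A toy "platform-independent model": one entity with typed attributes.
model = {"entity": "Claim",
         "attributes": [("claimant", "str"), ("weeks_paid", "int")]}

def generate_class(m: dict) -> str:
    """Emit Python source for the modeled entity."""
    args = ", ".join(f"{n}: {t}" for n, t in m["attributes"])
    body = "\n".join(f"        self.{n} = {n}" for n, _ in m["attributes"])
    return (f"class {m['entity']}:\n"
            f"    def __init__(self, {args}):\n{body}\n")

source = generate_class(model)
namespace = {}
exec(source, namespace)  # the generated code is itself runnable
claim = namespace["Claim"]("A. Smith", 12)
print(claim.weeks_paid)  # 12
```

    Because the model, not the emitted code, is the source of truth, retargeting to another platform means swapping the generator, which is the portability argument MDA proponents make.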

  • "All Eyes on Google"
    Newsweek (03/29/04) Vol. 143, No. 13, P. 48; Levy, Steven; Stone, Brad

    The Google search engine developed by Stanford grad students Sergey Brin and Larry Page has turned Web searching into a profitable enterprise and become so vital that former Lycos CEO Bob Davis exclaims, "The Internet without search is like a cruise missile without a guidance system." Google's enormous success has spurred a surge of competitors ranging from Internet behemoths to startups to Microsoft. Yahoo! has funneled over $1 billion into the overhaul of its search engine to make it more competitive with Google. One ingredient in this overhaul is an approach trailblazed by Overture, a Yahoo! acquisition, in which search results contain "paid inclusions" along with normal results; Yahoo! has also initiated a Content Acquisition Program that allows companies to supply feeds of their pages to Yahoo!'s search index in return for a fee. Meanwhile, Internet Archive founder Brewster Kahle wants open-source search to be an option for consumers, and to that end is providing the infrastructure for such noncommercial engines. Microsoft is plotting its own attack strategy with a search engine retooling that will purportedly leverage software employed by hundreds of millions of users, and may include Implicit Query software that extrapolates what a user might search for based on his or her activities. Stanford research guru Anna Patterson says Microsoft's technical talent cannot hold a candle to Google's, but Microsoft VP Yusef Mehdi counters that his researchers' skills are being grossly underestimated. Google will face major competition to facilitate and exploit a number of search technology breakthroughs over the next few years, in such fields as deep content, personalization, artificial intelligence, localization, and multimedia. So that Google can keep pace, the company has instructed engineers to try to address the 100 most important search tasks listed by Google's top execs, and to devote 20 percent of their time to personal projects.
    Click Here to View Full Article

  • "Primary Concerns"
    Baseline (03/04) No. 28, P. 15; Cone, Edward

    During the Democratic primary on March 2, Maryland used touch-screen voting machines statewide for the first time with no instances of voter fraud, but RABA Technologies security advisor Michael Wertheimer says the silence is deafening because those with malicious intent have had an opportunity to scout out weaknesses for attack in November. Commissioned by the state, RABA issued a report on Maryland's electronic voting in January and recommended several steps including physically securing the machine cases, installing firewalls to protect the central election computer system, patching the Windows 2000 software used on that system, and implementing stronger password protection. RABA was also able to dial into the server used to tabulate votes. Maryland election director Linda Lamone says the quiet March 2 primary proved the system is secure and that some of RABA's recommendations were acted on. The state did not fix the central election computer system yet because an update could destabilize it; Lamone says the remaining recommendations will be followed through by November. Wertheimer, a former National Security Agency employee, says he is disappointed with state officials' response so far and says they do not appreciate the present danger: An election worker or other insider serious about tampering with votes would need just five minutes with a server to change key code, he says. If hidden code was distributed throughout the state, it would endanger more than 3 million votes. State officials have staunchly opposed paper receipts, saying they would drive up costs and complexity, but Wertheimer says banks, e-commerce sites, and even gas station pumps all issue paper receipts and electronic election machines should be no exception; even more worrisome is the fact that ballots have no names inscribed on them.
    Click Here to View Full Article

    For information about ACM's concerns with e-voting, visit http://www.acm.org/usacm.

  • "Search Beyond Google"
    Technology Review (03/04) Vol. 107, No. 2, P. 34; Roush, Wade

    Google's enormous success with its search engine--and its apparent inability to develop a follow-up innovation momentous enough to sustain the company's market dominance--is encouraging Microsoft and other companies to invent their own tools that could eventually capture a good portion of Google's revenue. The breakthrough technology that makes Google so successful, the PageRank algorithm, scores Web pages according to the number and importance of the other pages that link to them, but critics note that the most popular Web pages are not necessarily those most relevant to the user's query. One company that offers an alternative search scheme is Teoma, whose search software is designed to ascertain Web site communities relevant to the search topic and extract from them the "authority" pages, or those with the most links; once the credibility of these pages is confirmed by checking to see if they are listed on resource pages created by subject specialists or aficionados, the engine ranks results based on how often each page is cited by authority pages. Another alternative search engine, Mooter, groups search results into a starburst configuration of clusters or themes, and ranks them according to several factors, including how frequently the search keywords and the cluster name crop up on each page. Mooter's inventors have pooled their knowledge of psychology, software, and neural networks to enable Mooter to more precisely tailor responses to individual users by taking note of which clusters or links the user clicks on. Microsoft, meanwhile, wants to make Web searching a key component of its Longhorn Windows upgrade, and is pursuing search software that can process natural-language queries. The software rewrites search phrases into a more easily processed format by applying rules it has learned from a database of sample sentences. The software, along with other breakthrough technology Microsoft is working on, may ultimately find use in the new Windows File System, which would be set up to allow all the data on a computer to be searched in a Google-like manner and accessed by any application.
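    The core idea behind PageRank--that a page's score depends on the scores of the pages linking to it, not merely the raw link count--can be sketched with a short power-iteration routine. This is a simplified illustration on a hypothetical four-page web, not Google's production algorithm; the `damping` factor and the toy link graph are assumptions for the example.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Compute PageRank scores by power iteration.

    links maps each page to the list of pages it links to.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with a uniform distribution
    for _ in range(iterations):
        # Base score every page receives regardless of links.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # A page passes its rank evenly to the pages it links to.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # Dangling page: spread its rank evenly over all pages.
                for target in pages:
                    new_rank[target] += damping * rank[page] / n
        rank = new_rank
    return rank

# Hypothetical toy web: page C is linked to by every other page,
# so it ends up with the highest score even though it has only one outlink.
toy_web = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
ranks = pagerank(toy_web)
```

    Note that page D, which nothing links to, ends up with the minimum base score--exactly the "popularity is not relevance" property critics point to, since an obscure but highly relevant page can rank low.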
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)
