ACM TechNews is sponsored by Thunderstone. Learn more about Texis, the text-oriented database providing high-performance search engine features combined with SQL operations and a development toolkit, that powers many diverse applications, including Webinator and the Thunderstone Search Appliance.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Thunderstone or ACM. To send comments, please write to [email protected].
Volume 7, Issue 782: Friday, April 22, 2005

  • "Interest in CS as a Major Drops Among Incoming Freshmen"
    Computing Research News (05/05) Vol. 17, No. 3; Vegso, Jay

    The results of a survey by the Higher Education Research Institute at the University of California, Los Angeles (HERI/UCLA) indicate a more than 60 percent decline between fall 2000 and fall 2004 in the number of incoming freshmen who expected to major in computer science (CS). The popularity of CS as a major among female undergraduates dropped 80 percent in the last seven years, and 93 percent since its all-time high in 1982. The Computing Research Association's Taulbee Survey of Ph.D.-granting CS departments confirms a four-year, 39 percent decline in the number of newly declared CS majors, while each of the last two years has seen a 7 percent drop in enrollments. A gap has always existed between the share of newly enrolled female undergraduates indicating CS as a possible major and the share of their male counterparts, but that gap has widened dramatically over the last few decades, doubling in the 1980s and tripling in the 1990s. CS appears to have lost its allure among incoming female freshmen, a trend that previously led to a fall-off in the number of women earning CS degrees in the 1980s. Given the next expected decline in degree production, foreshadowed by the HERI/UCLA figures, it is difficult to see how CS can meet projected future demand for IT professionals without boosting female undergraduates' participation.
    Click Here to View Full Article

  • "U.S. Gets New Cyberterrorism Security Center"
    Computerworld (04/21/05); Weiss, Todd R.

    April 21 marked the official unveiling of the Cyber Incident Detection Data Analysis Center (CIDDAC) at the University of Pennsylvania; CIDDAC is a private-sector facility set up to monitor America's business infrastructure for real-time detection of cyberthreats. CIDDAC executive director Charles Fleming says the center is designed to help victimized companies reluctant to share information with the government, and to eliminate the bureaucracy that can slow down federal agencies' response to threats. Critical industries are being offered intrusion-detection services by CIDDAC under the aegis of a pilot project supported by the FBI and the Department of Homeland Security's Science and Technology Directorate. The tools to facilitate these services are Remote Cyber Attack Detection Sensor (RCADS) appliances that will be deployed outside corporate networks. The appliances can automatically and instantly route any intrusion data to the CIDDAC center, where it is assessed immediately and then relayed to law enforcement agencies. The authorities can employ the data to collate attack signatures that government investigators can use to more rapidly identify, pinpoint, and subdue cyberthreats. FBI assistant special agent Shawn Henry says the data compiled through CIDDAC will allow the FBI and other law enforcement entities to thwart future attacks instead of merely responding to intrusions. Fleming says CIDDAC users will enjoy better protection against cyberthreats while still maintaining the privacy of their sensitive corporate data, adding that "privacy, trust, and anonymity are absolute essentials for the private sector to participate, and without the private sector, there is no program."
    Click Here to View Full Article

  • "Overly Smart Buildings"
    Technology Research News (04/27/05); Bowen, Ted Smalley

    Though new technology is available to create intelligent building systems that monitor and control a building's environment and security, architects and engineers still face problems with complexity, incompatibility, component failure, difficult operation, and outmoded operating parameters. Buildings are tough proving grounds for complex and interdependent technologies, says Harvard University architecture professor Michelle Addington. Real-world behavior is unpredictable, which complicates efforts to measure, analyze, and respond to different situations. The application of commercial off-the-shelf technology also poses problems because intelligent buildings introduce new limitations and special cases: Distributed sensors, for example, require a specialized method of polling sensor nodes because of individual units' susceptibility to error and limited processing capacity. Swiss researchers recently conducted an intelligent building test that attempted to limit complexity by measuring just one parameter, user comfort; the researchers built fuzzy logic software that counted how many times building users changed their environment settings, and eventually the software was able to adjust settings to desired levels automatically (a simplified sketch of this override-driven approach follows this item). The benefits of intelligent buildings are also questionable given the additional testing required, additional network maintenance, and technology obsolescence. Intelligent building technology also may not be applicable to older buildings. Harvard's Addington says one solution is to measure and control fewer parameters.
    Click Here to View Full Article
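    The following Python fragment is a hypothetical sketch of the override-driven idea described above, not the Swiss researchers' fuzzy logic software: it simply counts occupant overrides and nudges an automatic temperature setpoint toward the values users keep requesting.

# A hypothetical sketch, not the Swiss team's actual software: occupant
# overrides are counted and the automatic setpoint drifts toward the values
# users keep requesting.

class ComfortLearningThermostat:
    def __init__(self, setpoint_c=21.0, learning_rate=0.2):
        self.setpoint_c = setpoint_c        # current automatic setpoint (deg C)
        self.learning_rate = learning_rate  # how strongly one override shifts it
        self.override_count = 0             # proxy for occupant discomfort

    def record_override(self, requested_c):
        """An occupant manually changed the temperature; learn from it."""
        self.override_count += 1
        self.setpoint_c += self.learning_rate * (requested_c - self.setpoint_c)

    def automatic_setpoint(self):
        return round(self.setpoint_c, 1)

if __name__ == "__main__":
    stat = ComfortLearningThermostat()
    for requested in (23.0, 23.5, 23.0):    # occupants keep nudging it warmer
        stat.record_override(requested)
    print(stat.override_count, "overrides; new setpoint:", stat.automatic_setpoint())

    A fuzzy-logic controller would replace the linear update with rules over membership functions, but the feedback signal, counted user overrides, is the same one the Swiss test measured.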

  • "Researchers Propose Early Warning System for Worms"
    eWeek (04/20/05); Naraine, Ryan

    Professors Shigang Chen and Sanjay Ranka of the University of Florida's Computer and Information Science and Engineering department have written a paper proposing an early warning system for TCP-based Internet worms that promises to eliminate known vulnerabilities in current early warning systems. "The goal is to have a system to issue warnings at the very early stages of an attack and to provide information for security analysts to control the damage," the paper states. Chen says the plan combines a series of methods for automatically identifying the concentrated scan activity that signifies an ongoing worm assault. He notes that the system monitors a "used" address space and pinpoints scan sources by watching outbound TCP RESET packets, which indicate failed inbound connection attempts, making localization more accurate and fortifying the system against anti-monitor measures (a simplified sketch of this RESET-counting idea follows this item). Chen says any existing distribution mechanisms--email, pagers, etc.--could be employed to post worm propagation advisories. Also included in Chen and Ranka's proposal is an anti-spoof protocol that can detect hosts potentially compromised by worms by winnowing out bogus scan sources, as well as a "system sensitivity" performance metric for gauging how responsive an early warning system is in broadcasting an ongoing worm attack. Chen says the system is designed for local deployment or co-deployment among enterprise networks. A distributed anti-worm system that defends against high-bandwidth distributed denial-of-service attacks has also been designed by Chen's team.
    Click Here to View Full Article
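    The Python fragment below is a hypothetical illustration of the RESET-counting idea, not Chen and Ranka's system: each outbound TCP RST means an outside host tried to reach a closed port, so many RSTs aimed at one external address suggest that address is scanning the monitored network.

# A hypothetical illustration of the RESET-counting idea, not the authors'
# system: count outbound TCP RSTs per external destination and flag
# destinations that accumulate too many.

from collections import Counter

SCAN_THRESHOLD = 100    # outbound RSTs to one external address before alerting

def find_scan_sources(outbound_packets, threshold=SCAN_THRESHOLD):
    """outbound_packets: iterable of (dst_ip, tcp_flags) tuples for traffic
    leaving the monitored address space; returns suspected scan sources."""
    rst_counts = Counter()
    for dst_ip, tcp_flags in outbound_packets:
        if "RST" in tcp_flags:              # evidence of a failed inbound connection attempt
            rst_counts[dst_ip] += 1
    return [ip for ip, count in rst_counts.items() if count >= threshold]

if __name__ == "__main__":
    # 150 resets sent back to one probing host, plus some ordinary traffic
    traffic = [("203.0.113.9", "RST")] * 150 + [("198.51.100.4", "ACK")] * 20
    print(find_scan_sources(traffic))       # -> ['203.0.113.9']

    A deployed system would add time windows, the anti-spoof checks described above, and the advisory-distribution step.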

  • "Scientists Concerned About Slowdown in U.S. Government Research Spending"
    Voice of America News (04/20/05); McAlary, David

    Planned cutbacks in federally funded research could be detrimental to long-term scientific innovation, the United States' readiness for future warfare, and America's international technological dominance, according to an American Association for the Advancement of Science (AAAS) report to be presented at an April 21 conference in Washington. "The concern is that by decreasing federal investments in R&D just at a time when high-technology products are more important than ever in our economy, and just as countries like India and China and South Korea are increasing their investments in these areas, the U.S. could be endangering its future economic health," says AAAS R&D policy director and report author Kei Koizumi. Whereas the inflation rate stands at approximately 2 percent, government research spending is only expected to increase by 0.1 percent. The AAAS study says defense research spending will lag behind inflation for the first time in 10 years, while energy and environment research will be dealt a particularly harsh blow. Koizumi says the bulk of U.S. R&D projects will continue to be funded by the Defense Department under the new budget proposal. He notes that more and more funding is being channeled into weapons system development, leaving basic research in the lurch. "The importance of basic research is that eventually the Pentagon is going to need to be ready for the wars of the future, and meeting some very tough technical challenges," he maintains. Koizumi also says thinner R&D portfolios resulting from the government funding cutbacks mean fewer resources to train graduates in science and engineering, and that "more scientific ideas may be pursued and exploited overseas, rather than in the U.S."
    Click Here to View Full Article

  • "North America to Have Fewer Developers Than Asia/Pacific"
    TechWeb (04/19/05); Gonsalves, Antone

    The Asia/Pacific region will overtake North America in the overall number of software developers starting in 2006, according to new research from International Data Corp. (IDC). "For the large economy of North America, which has historically been the home of most developers, program development automation techniques, offshoring, and increasing productivity are all likely to put downward pressure on the rate at which the number of professional developers in North America increases," says IDC analyst Stephen Hendrick. The number of professional software developers will increase through 2008 at a compound annual growth rate (CAGR) of 25.6 percent in China and 24.5 percent in India; both markets have huge populations and will continue to spend heavily on information technology. The Middle East and Africa will see a CAGR of 18.3 percent. By 2008, there will be a total of 14.9 million software developers worldwide, as the global developer population grows at an annual rate of 9.8 percent (the compound-growth arithmetic is sketched after this item).
    Click Here to View Full Article
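    For readers unfamiliar with compound annual growth rate, the short Python fragment below shows the arithmetic behind figures like those above; the starting head counts are made-up placeholders, since the article does not supply IDC's base numbers.

# Compound annual growth arithmetic only: final = base * (1 + rate) ** years.
# The base figures below are hypothetical placeholders, not IDC data.

def project(base_2004, cagr, years=4):
    """Project a 2004 head count forward at a constant compound annual rate."""
    return base_2004 * (1 + cagr) ** years

print(round(project(1_000_000, 0.256)))     # hypothetical China base, 25.6% CAGR, 2004-2008
print(round(project(1_000_000, 0.245)))     # hypothetical India base, 24.5% CAGR
print(round(project(10_000_000, 0.098)))    # hypothetical worldwide base, 9.8% CAGR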

  • "After Open-Source Controversy, Torvalds Turns to 'Git'"
    IDG News Service (04/20/05); McMillan, Robert

    Linux creator Linus Torvalds last week started "git," a new project to develop software for rapidly managing changes to the Linux kernel, following a conflict between open-source developer Andrew Tridgell and BitMover, developer of the BitKeeper software Torvalds had used to manage the kernel's development since 2002. BitMover permitted Linux developers to use a free version of BitKeeper for kernel development, but this allowance was revoked when the company became displeased with Tridgell's attempts to create an open-source version. Free software proponents counter that Tridgell's code allowed them to contribute to the kernel without having to accept BitKeeper's proprietary software license. In an email interview, Torvalds said, "Git, to some degree, was designed on the principle that everything you ever do on a daily basis should take less than a second." A number of "subsystem maintainers" manage the Linux kernel: Each maintainer accepts changes and bug fixes for a different portion of the kernel and then submits them to either Torvalds or Andrew Morton. Not all kernel developers employed BitKeeper, but critics claim most developers were pressured into using the software because Torvalds used it. Torvalds said many functions available with BitKeeper will be absent from git, which will also be more difficult to use. He said the BitKeeper-to-git transition's real impact would lie in how much the new software slows down the contributing subsystem maintainers. Torvalds described Tridgell's software as "bad" because it lacked a useful purpose and carried no benefits to Linux developers or BitMover.
    Click Here to View Full Article

  • "Innovation Moves From the Laboratory to the Bike Trail and the Kitchen"
    New York Times (04/21/05) P. C2; Postrel, Virginia

    In his book, "Democratizing Innovation," Eric von Hippel of the MIT Sloan School of Management's Innovation and Entrepreneurship Group writes that many innovative industrial and consumer products are being developed by users first, a conclusion supported by mounting empirical evidence. Users can employ inexpensive computer-based design products to flesh out their concepts, while the Internet offers an easy outlet for users to share their designs. The book cites open source software as an obvious example of home-grown products. In an interview, von Hippel says, "Users are designing exactly what they want for themselves; they have a market of one. Manufacturers are trying to fit their existing investments and existing solution types to the largest market possible." Manufacturers' practice of mass customization yields less flexibility because it involves integrating pre-specified elements, and von Hippel recommends manufacturers devote more study to "lead users" who invent their own creative prototypes to accommodate the uniqueness of their pursuits. Biking enthusiasts, for instance, have developed a broad spectrum of innovations, from protective clothing to tire enhancements for icy terrain, to better enjoy their hobby. A 3M study conducted by von Hippel and colleagues noted an eightfold increase in sales from lead user-developed product concepts over sales from concepts developed in-house. Von Hippel believes industries should take a cue from semiconductor makers, who placed customizable chip design and fabrication in the hands of clients via toolkits.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Designing a Jetson Mobile"
    CNet (04/21/05); Spooner, John G.

    As a designer for Ford Motor, Anthony Prozzi is responsible for predicting what consumers will want in the automobile of the future. His Mercury Meta One concept car, recently displayed at the New York International Auto Show, demonstrated Prozzi's belief that future consumers will want personalized interfaces that provide all of the communications functionality they can access at home. The Mercury Meta One featured three LCD screens and one touch-screen display that can be configured to suit different drivers' tastes, such as showing more or less gauge information or using different screen orientations. The dashboard noticeably lacked the buttons and knobs traditionally used to control radio and temperature functions because, Prozzi says, the entire interface can be more easily personalized if it resembles that of a personal computer. Sensory feedback, however, is still important in cars because it allows drivers to operate controls without taking their eyes off the road, and Prozzi is considering virtual keyboards and buttons that can be projected onto different surfaces, perhaps allowing the passenger to use in-depth functions without needing bulky hardware. Another notable element of the Mercury Meta One design is the key, which is more of a personalized remote control than a mechanical tool; electronic buttons are used for the car's ignition and shifter. Prozzi says future car keys will be status symbols, much as stylized credit cards are flaunted today, but that future keys will also be able to authenticate the user, store personal information, and communicate wirelessly with other personal devices. He says future cars should also be integrated with people's personal technology, such as day planner programs, so that cars can assist in mapping travel routes, providing reminders, and fulfilling other assistant tasks.
    Click Here to View Full Article

  • "Map Project Unlocks Europe's Landscape"
    IST Results (04/20/05)

    The IST-funded Geospatial Info-Mobility Service by Real-Time Data-Integration and Generalization (GiMoDig) project could potentially allow limitless numbers of users to access a massive repository of geographic data collated by European mapping agencies for an unlimited number of applications. Project coordinator and Finnish Geodetic Institute professor Tapani Sarjakoski says the commercialization of GiMoDig is several years off, but notes that "the GiMoDig software is now a working prototype and it can reveal the applicability of the approach, based on the data available in the four partner National Mapping Agencies [NMAs] involved in the project." The NMAs have been compiling geographic data on their respective countries for more than a century, but the information is kept in national silos that can be accessed only through the agencies and their partners, using various technologies. GiMoDig helps users in different countries access the data with different devices. The integration element of GiMoDig involves blending data sets from different countries together, while the generalization element entails rendering maps in a vector format to maintain relevance to the user while keeping the amount of data reasonable for easy downloading and use on mobile devices. The user employs an integrated GPS receiver in a mobile device to request a map based on the current location (a simplified sketch of this request pattern follows this item). The data is retrieved from the agency databases in real time, and the map downloads to the screen and displays relevant features based on user preferences. Sarjakoski says some of the knowledge acquired through GiMoDig will be applied to the Infrastructure for Spatial Information in Europe (INSPIRE) project, whose goal is to stimulate the development of a European spatial data infrastructure that supports integrated spatial information services for users.
    Click Here to View Full Article
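    The following Python fragment is a hypothetical sketch of the request pattern described above; the endpoint URL and parameter names are invented for illustration and are not GiMoDig's actual interface.

# A hypothetical sketch of the location-based map request; the host and
# parameter names below are placeholders, not GiMoDig's real service.

from urllib.parse import urlencode
from urllib.request import urlopen

def fetch_vector_map(lat, lon, scale, layers):
    params = urlencode({
        "lat": lat, "lon": lon,             # position from the device's GPS receiver
        "scale": scale,                     # governs how strongly features are generalized
        "layers": ",".join(layers),         # user-preferred feature classes
        "format": "gml",                    # vector output keeps the download small
    })
    url = "https://gimodig.example.org/map?" + params   # placeholder host
    with urlopen(url) as response:          # real-time query against the agency data
        return response.read()

# Example call (commented out because the host above is only a placeholder):
# gml_data = fetch_vector_map(60.2055, 24.6559, 50000, ["roads", "water"])

    Returning vector data rather than pre-rendered images is what allows the client to generalize and restyle the map for a small mobile screen.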

  • "Patently an EU Tangle"
    Financial Times-IT Review (04/20/05) P. 6; Cane, Alan

    The European Commission's directive on the "patentability of computer-implemented inventions (CII)" has been a lightning rod for controversy. The proposal to harmonize CII management throughout the European Union via a comprehensible patent protection system is touted by advocates as essential to making Europe a key player in the global tech economy. But the open source community and other opponents criticize the directive's current draft as potentially detrimental to innovation and competition, as it could allow software patents that big organizations might leverage to discourage smaller businesses from pursuing technological advancements. Open-source developers believe software can be produced more effectively if authors can modify and distribute source code free of restrictions. However, former Cornell University professor Robert Lind conducted an independent study, commissioned by the European IT industry association EICTA in 2004, concluding that the open source community's view that patents inhibit innovation is "an unproven hypothesis." His basic argument was that patent protection is rarely used by the software industry, while telecoms and other tech industries see the removal of such protections as disadvantageous. "It is important to put to rest...the notion that allowing patents on CII will ultimately lead to the patenting of discoveries, mathematical methods, scientific theories, aesthetic creations, schemes, rules and methods for performing mental acts, playing games, or doing business, and presentation of information," Lind wrote. By this reasoning, the EU Council of Ministers' "common position" on the CII patentability directive, which supports patent protection, is valid.

  • "Pushing Virtual Reality"
    Des Moines Register (IA) (04/16/05); Vinluan, Frank

    An Iowa State University competition for students in the human-computer interaction program allows researchers to test the capabilities of the Virtual Reality Applications Center, whose virtual reality technology has already been used to study molecular structures and weather patterns. "We wanted people to push the limit a little bit on the hardware, show what it could do," says Jim Oliver, director of the center. Professors chose two games as winners of the contest: one that immerses the player in a mountainous terrain to smack a ball through panels of moving blocks, and another that places the player in a medieval castle to slay "monsters" with a sword. ISU's facilities feature one of the country's few C6 "cubes," a 10-foot-by-10-foot room that surrounds users with three-dimensional images and sound. The center received new equipment last year, and will get more upgrades this summer as a result of a $2.8 million grant from the U.S. Air Force. Students will be able to do even more with the virtual reality technology next year.
    Click Here to View Full Article

  • "Return of 3-D--and No Goofy Glasses"
    Christian Science Monitor (04/21/05) P. 14; Lamb, Gregory M.

    Three-dimensional imagery is enjoying a resurgence in the entertainment sector, and innovators are furthering the technology for applications in fields that include science, engineering, security, and advertising. After a brief flirtation with the format in the 1950s, 3-D films are generating new interest thanks to the advent of techniques and equipment that eliminate the need for the juxtaposition of images captured by multiple cameras, and allow movies already filmed with a single camera to be converted into 3-D. Special eyewear is still required to view 3-D movies in all their glory, but companies such as LightSpace and Opticality are developing "naked eye" 3-D displays for which such accoutrements are unnecessary. LightSpace's DepthCube system rear-projects images onto 20 layered screens at 20 times the ordinary rate of video projection, giving the viewer an illusory sense of depth behind the screen and eliminating the eyestrain associated with glasses-based 3-D. A naked-eye 3-D system from Opticality is being exhibited in Japan as a 180-inch "video wall" that exploits the "motion parallax" effect, which allows observers to see different views as they move in relation to the wall. Opticality's Keith Fredericks expects 3-D technology to find its way into movie screens, televisions, handheld games, and cell phones, while LightSpace foresees the technology's incorporation into computer terminals. LightSpace President Alan Sullivan says the U.S. government has an interest in 3-D X-ray machines that can provide a clearer picture of the contents of packages, luggage, and other objects passing through airports. Fredericks says progress toward a true 3-D hologram is moving forward, although such a breakthrough would require 1 trillion pixels.
    Click Here to View Full Article

  • "Getting Flat, Part 1"
    Linux Journal (04/14/05); Searls, Doc

    New York Times columnist Thomas Friedman's new book, "The World is Flat: A Brief History of the Twenty-First Century," provides unique insight into how the open-source and free software movements fit into the context of other important "flattening" events and movements, writes Linux Journal senior editor Doc Searls. The book says the fall of the Berlin Wall, the emergence of the Windows operating system, the digital revolution, the popularity of the World Wide Web, and the proliferation of application-to-application workflow programs all break down hierarchical barriers and foster further flattening forces, such as the open source movement. However, Friedman fails to recognize that open-source and free software ethics were behind important Internet protocols and efforts. He also places undue importance on Netscape's browser in commercializing "alphabet soup" protocols such as TCP/IP and HTTP, when those protocols had been positioned for widespread use much earlier by the hacker culture that spawned them. As Eric Raymond says, "This wasn't an accidental byproduct of doing neat techie stuff; it was an explicit goal for many of us as far back as the 1970s." Friedman correctly understands the value of open source software in providing infrastructure, or what he calls "vanilla" software, but does not recognize that open source thus expands the software market by creating new possibilities for using open-source products commercially. Instead, he falls into the trap of viewing free and open-source software as primarily a threat to commercial software, and even includes an Economist quote labeling open source a "post-capitalist model of production." Searls says open source is neither anti-business nor anti-capitalist; in fact, he says "more money will be made because of open source than will be made with open source."
    Click Here to View Full Article

  • "Letting the Net Speak for Itself"
    SiliconValley.com (04/18/05); Nunberg, Geoffrey

    Stanford University linguist and "Going Nucular" author Geoffrey Nunberg dismisses as groundless the fear that non-English Web content is being crowded out by a domineering Anglo-Saxon viewpoint. He says native English-speaking Web users have become a minority, while non-English Web content continues to grow. The declining percentage of English Web pages, as indicated by comparisons between a 1997 study Nunberg co-authored and a 2002 analysis from the Online Computer Library Center, leads Nunberg to conclude that more than 50 percent of Web content will be non-English within several years. He says the Web's international nature has prompted numerous sites in other countries to post English versions as a courtesy to English-speaking foreigners, but the proliferation of Internet use among small businesses and individuals abroad is causing a drop-off in the proportion of such sites. Nunberg argues that the non-English-speaking world's claims of Web marginalization stem from search engines' tendency to rank popular English-language pages highly in their query results. He says the non-English world should see this trend as a positive sign that the distribution of information is no longer restricted by geography. Nunberg maintains the digitization of libraries and other media in the near future will make anytime-anywhere access to such content a reality. However, he cautions that this trend "also increases the inequalities between [English and other] languages and ones spoken in poor and developing nations, where online access is restricted to a small elite and where even basic information is not available online in the local languages."
    Click Here to View Full Article

  • "UC Berkeley-USC Project to Study "Digital Kids""
    UC Berkeley News (04/14/05); Maclay, Kathleen

    The John D. and Catherine T. MacArthur Foundation is funding $3.3 million in research that it ultimately hopes will help improve how schools use new digital media as a tool for learning. Over the next three years, Peter Lyman, a professor at UC Berkeley's School of Information Management & Systems (SIMS), will head a team of investigators looking into how young people from ages 10 to 20 use cell and camera phones, Web logs, instant messaging, gaming devices, and other digital media to create and share knowledge. Mizuko Ito, a communications research scientist at the University of Southern California who has studied the use of digital media among kids in the U.S. and Japan, and Michael Carter, an educational software developer at the Monterey Institute for Technology and Education, will join Lyman in the study, which will also analyze how this digital environment is affecting learning. "Common sense suggests that exposure to digital media affects young people in formative ways, reflected in their judgment, their sense of self, how they express their independence and creativity, and in their ability to think systematically," says Jonathan Fanton, president of the foundation. "So far, there is little empirical evidence to back this up."
    Click Here to View Full Article

  • "Step Into the Future"
    InformationWeek (04/18/05) No. 1035, P. 32; Dunn, Darrell; Garvey, Martin J.

    Many data centers are planning upgrades to take advantage of new technologies and make their operations more efficient, but an InterUnity Group/AFCOM survey of 161 data-center professionals finds most respondents concerned that their companies are purchasing new equipment without devoting sufficient attention to power or cooling needs. Liebert general manager Steve Madara is convinced that liquid cooling technology, or some variant of it, will emerge as the primary solution for future enterprise data centers, while Dell, Hewlett-Packard, and IBM have started offering evaluation services to help clients design blade-server environments with optimal thermal properties. "We'll end up with small mainframes because the new servers will be so dense and so hot," predicts Newell Rubbermaid data-center and network analyst Paul Watkins, who also thinks liquid cooling innovations could address the reduced ventilation space such a move entails. Some companies polled in the InterUnity/AFCOM survey will have to change the interaction dynamic between data-center managers and high-level IT executives when developing infrastructure plans: Twenty-six percent of respondents indicate concern about a lack of communication with senior management, and 27 percent cite low interaction between IT and facilities departments. In order to comply with the Sarbanes-Oxley Act and other recent legislation, companies "are digging deeper and deeper and spending more and more money to make these [data centers] more robust and safe from every possible threat of downtime or loss of critical information," says Syska Hennessy VP Michael Fluegeman. Reliability boosts via the deployment of dual-bus UPS systems and the construction of partitioned data centers are expected. Despite these challenges, almost 50 percent of the surveyed companies are not transitioning to utility-computing models.
    Click Here to View Full Article

  • "Together--At Last"
    Computer Business Review (03/05); Clarke, Gavin

    Billions of dollars in development costs and unrealized potential income are going to waste on late or failed corporate software projects, but application lifecycle management (ALM) vendors are promoting strategies to guarantee more effective collaboration among business application development teams and to make business managers more knowledgeable about project parameters and software performance. ALM providers cite a failure in the application development process, driven partly by market saturation and the commoditization of application development tools. IBM Rational, Borland Software, and other major vendors have been devoting more attention to integrated "platforms" in recent years in light of three industry trends: mounting evidence of application development's failure and the resulting waste of money on software-reliant business projects; increased outsourcing as a cost-cutting measure; and rising pressure from official and government regulations that prescribe business behavior. IBM Rational's solution applies the Rational Unified Process to application development via integrated tools that help individual developers model, architect, deliver, and manage the software encompassing companies' business processes. Borland's Software Delivery Optimization Strategy aims to supply a series of role-based environments that provide only the features needed by individual ALM team members, which are then combined to facilitate better communication and collaboration between ALM teams. Microsoft is also getting into the act with the upcoming launch of its Visual Studio 2005 Team System platform, whose features include application modeling and deployment by ALM team members via the Whitehorse environment, and a project management infrastructure via the Team Server centralized server.

  • "Stopping Spam"
    Scientific American (04/05) Vol. 292, No. 4, P. 42; Goodman, Joshua; Heckerman, David; Rounthwaite, Robert

    Software programmers and purveyors of junk email are locked in an ever-escalating arms race as the spread of spam threatens to compromise the integrity of Internet communications, write anti-spam experts and research collaborators Joshua Goodman, David Heckerman, and Robert Rounthwaite. However, smart software filters, email sender authentication schemes, legal restrictions, and other anti-spam efforts could hold back the tide of spam through widespread usage or enforcement. The authors propose a combination of spam filters with machine-learning capabilities and proof systems designed to make spamming computationally and/or financially unaffordable. Machine-learning systems can be thwarted by spammers who obscure their output's wording, but such filters can be trained to recognize and adapt to these tactics; an important component of the researchers' work is the employment of n-gram techniques that use subsequences of words to identify key words frequently associated with spam (a simplified sketch of n-gram-based filtering follows this item). Among the proof system options Goodman, Heckerman, and Rounthwaite investigate are human interactive proofs, which are puzzles or problems that humans can easily solve but computers cannot; computational puzzles that senders' email systems must unravel; and micropayment schemes in which spammers pay a small amount of money for each email, so that the cumulative cost becomes prohibitive. The authors also see reputation services that certify legitimate senders playing an important role in anti-spam efforts, and give high marks to the Sender ID Framework as a sender authentication scheme designed to combat email "spoofing." Goodman, Heckerman, and Rounthwaite think federal legislation can complement technological defenses against spam.
    Click Here to View Full Article
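    The Python fragment below is a toy illustration of n-gram-based spam filtering, not the authors' system: word unigrams and bigrams become features for a naive Bayes classifier, one common machine-learning approach to the filtering described above. It assumes scikit-learn is installed.

# A toy n-gram spam filter: unigram and bigram counts feed a naive Bayes
# classifier. The tiny training set is invented for illustration only.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

train_messages = [
    "cheap meds buy now", "win cash prize click here",       # spam
    "meeting moved to three", "draft of the paper attached",  # legitimate
]
train_labels = [1, 1, 0, 0]                                    # 1 = spam

vectorizer = CountVectorizer(ngram_range=(1, 2))               # unigrams and bigrams
features = vectorizer.fit_transform(train_messages)

classifier = MultinomialNB().fit(features, train_labels)

test = vectorizer.transform(["click here to win cash"])
print(classifier.predict(test))                                # -> [1], flagged as spam

    A production filter would train on millions of labeled messages and retrain continually, which is what lets it adapt as spammers obscure their wording.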


 