ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM.

To send comments, please write to [email protected].

Volume 5, Issue 582: Friday, December 12, 2003

  • "Profiling System Takeoff Delayed"
    Wired News (12/12/03); Singel, Ryan

    The launch of the Computer Assisted Passenger Prescreening System (CAPPS II) has been postponed as industry and federal agencies take a closer look at the project, which is designed to profile air travelers and determine whether they may be criminals based on information they are required to provide. A computer checks that information against consumer databases and watch lists of suspected terrorists and known criminals, but members of Congress, General Accounting Office (GAO) researchers, air travel industry organizations, and even the European Union have voiced concerns over CAPPS II. Congress recently barred CAPPS II from implementation until the GAO submits a report, due in mid-February, covering the program's effectiveness and privacy implications. The European Union has said that European airlines could be forced to break EU law because the United States mandates that airlines flying in from overseas supply full access to their traveler databases. Negotiations are underway between the EU's Data Protection Working Party, the U.S. Homeland Security Department, and the State Department to scale back the number of data elements airlines must disclose, guarantee the establishment of an independent privacy-protection office, and ensure that the data is used expressly for anti-terrorism purposes. The Air Transport Association also raised issues with the Transportation Security Administration over CAPPS II's financial ramifications: If the system discourages people from flying, it could damage an airline industry that is already in dire straits, while the cost of overhauling reservation databases to accommodate more information and of gathering missing data at check-in would run into the tens of millions of dollars. The government originally planned to deploy CAPPS II earlier this year, but the delay could push the initial launch to 2004, with full deployment the following year.
    Click Here to View Full Article

  • "Patenting Air or Protecting Property?"
    Washington Post (12/11/03) P. E1; Krim, Jonathan

    The deluge of new patents on Internet technologies and software is impeding innovation, according to industry experts and proponents of more traditional patents. A class of small companies is emerging that claim patents in key technological areas and then wait until another company commercializes a product using the patented technology or process: Acacia Research, for instance, is suing a number of Internet pornographers and on-demand hotel movie providers for infringing on its audio- and video-streaming patent; if successful, the company could go on to demand license payments from universities that use remote learning, cable providers, and streaming media firms such as RealNetworks. Intel Chairman Andrew S. Grove warns that the entire intellectual property system needs to be revamped to fit the information age. The FTC and the National Academy of Sciences have noted the need for reform in the patent system, but patent officials say they lack resources; increasing resources for the patent office would allow for better prior-art research and greater processing volume. DivX Networks CEO R. Jordan Greenhall says the entire idea of software patents is suspect, noting that software innovation was thriving before the introduction of software patents in the early 1980s; moreover, the 20-year lifespan of patents is far too long for software ideas, whose value is often exhausted within just a few years. Software and Internet technology patents protect the creative efforts of small companies, yet large firms are often the ones that use patents as a weapon, says Silicon Valley attorney Gary L. Reback. He relates how, in the 1980s, IBM lawyers forced Sun Microsystems to pay millions of dollars in licensing fees simply by arguing that some of IBM's 10,000 patents would have been violated by Sun.
    Click Here to View Full Article

  • "New Software Helps Supercomputers Think Fast"
    NewsFactor Network (12/11/03); Martin, Mike

    MPI for InfiniBand on VAPI Layer (MVAPICH) software developed by Ohio State University researchers is used to coordinate communication between the clustered desktops, or nodes, that make up supercomputers such as Virginia Tech's Macintosh-based machine, which claimed the No. 3 spot on the list of the world's 500 fastest supercomputers. OSU's Pam Frost reports that MVAPICH serves as a link between conventional supercomputing software and new networking architecture. "At some point, adding nodes to a cluster doesn't make the calculations go any faster, and researchers have to rely on software to manage communication between nodes effectively," notes MVAPICH research team leader Dhabaleswar Panda. "MVAPICH takes that software a step further by connecting it with the emerging 'InfiniBand' technology." Panda explains that he and Ohio Supercomputer Center researcher Pete Wyckoff developed MVAPICH to establish interoperability between message passing interface (MPI) software and InfiniBand. MVAPICH has also been used by Intel and InfiniBand developer Mellanox Technologies to create a supercomputer that performs teraflop-level computing. Panda expects that such a breakthrough will help pave the way for commoditized systems that will allow commercial firms and research facilities with limited budgets to take advantage of supercomputing.
    Click Here to View Full Article
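
    The message passing MVAPICH coordinates is the standard MPI interface; the sketch below uses the mpi4py Python bindings purely for illustration (MVAPICH itself is a C library, and mpi4py is not part of it) to show the node-to-node exchange such software manages:

      # Minimal MPI point-to-point exchange -- the communication pattern
      # that implementations such as MVAPICH accelerate over InfiniBand.
      # Run with, e.g.: mpirun -np 2 python mpi_demo.py
      from mpi4py import MPI

      comm = MPI.COMM_WORLD      # all processes (cluster nodes) in the job
      rank = comm.Get_rank()     # this process's ID within the communicator

      if rank == 0:
          # Node 0 hands a task to node 1 and waits for the result.
          comm.send({"task": "integrate", "bounds": (0.0, 1.0)}, dest=1, tag=11)
          print("node 0 received:", comm.recv(source=1, tag=22))
      elif rank == 1:
          task = comm.recv(source=0, tag=11)
          comm.send({"status": "done", "task": task["task"]}, dest=0, tag=22)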

  • "Automated Analysis of Bee Behavior May Yield Better Robots"
    ScienceDaily (12/11/03)

    Georgia Institute of Technology robotics experts led by Tucker Balch are using a computer vision system to automatically analyze the movements of social insects--bees especially--in the hopes of drawing new insights on animal behavior and applying those insights to the design of robots and computers. The system, which is part of the institute's BioTracking Project, is being employed to study data on the sequential movements that encode information--such as the waggling motions bees perform to tell each other where food sources are located--and boasts an 81.5 percent accuracy rate. That accuracy enables researchers to construct a follow-on system that can reliably ascertain a bee's behavior from its sequential movements, according to Balch. The researchers first videotape bees (some of which are marked) for 15 minutes, and software converts the footage of the marked bees into x- and y-coordinate location data for each bee within each frame. Researchers hand-label certain segments of this data and feed them into the automated analysis system as motion examples. Balch and his team plan to build a system that can infer executable models of animal behavior and run simulations of them, but accurately mimicking insect movements in robots means reconciling the differences between the motor and sensory systems of the robots and their biological counterparts. Meanwhile, Balch and researcher Zia Khan are analyzing primate movements at Emory University with a similar computer vision system in the hopes of extrapolating behavior models that can be computerized. The ultimate goal of Balch's research, which is funded by a grant from the National Science Foundation, is to design robots that use animal-like behavior to smoothly interact with people in noisy, dynamic, and unknown environments.
    Click Here to View Full Article
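
    The pipeline described--per-frame (x, y) coordinates, hand-labeled as motion examples--can be illustrated with a small sketch; the features below (per-step speed and turning angle) are an assumption chosen for illustration, since the BioTracking system's actual feature set is not detailed in the article:

      import math

      def motion_features(track):
          """Turn a list of per-frame (x, y) positions for one bee into
          per-step (speed, turn) features -- the kind of sequential motion
          data that hand-labeled segments would be drawn from."""
          features = []
          for i in range(2, len(track)):
              (x0, y0), (x1, y1), (x2, y2) = track[i-2], track[i-1], track[i]
              speed = math.hypot(x2 - x1, y2 - y1)   # pixels per frame
              h_prev = math.atan2(y1 - y0, x1 - x0)  # previous heading
              h_curr = math.atan2(y2 - y1, x2 - x1)  # current heading
              # Normalize the heading change to [-pi, pi)
              turn = (h_curr - h_prev + math.pi) % (2 * math.pi) - math.pi
              features.append((speed, turn))
          return features

      # A waggle-like zigzag alternates sharp left/right turns:
      print(motion_features([(0, 0), (1, 0.5), (2, -0.5), (3, 0.5), (4, -0.5)]))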

  • "In a Data-Mining Society, Privacy Advocates Shudder"
    Associated Press (12/10/03); Bergstein, Brian

    Concern is growing among privacy proponents and civil liberties advocates that personal privacy is being stripped away by the availability of databases listing personal information, as well as by the "bread crumbs" people leave behind through their activities in the networked world. With the spread of the Internet, it is also relatively simple for people to acquire and abuse this technically public data: Employers, for example, have fired employees upon uncovering some embarrassing or scandalous item about them in such databases, only to learn that the information was misleading or inaccurate. But civil libertarians are even more worried that technological improvements will make personal information accessible to governments, advertisers, and insurance firms to the point where all citizens' actions can be monitored. Activists such as Private Citizen founder Robert Bulmash are also concerned that terrorists or identity thieves could exploit such databases, and believe American companies should adopt the European model of obtaining permission from citizens before marketing or sharing their personal data. Controversial projects under development that critics fear could help pave the way for a "surveillance society" include the Computer Assisted Passenger Prescreening System to keep track of air travelers' personal information; the scanning and archiving of foreign visitors' fingerprints and facial images next year; and the multi-state Matrix law enforcement and terrorism database project, which bears a similarity to the Pentagon's defunct Total Information Awareness data-mining initiative. Meanwhile, the National Science Foundation recently kicked off a five-year, $12.5 million project to investigate whether Internet communication protocols and applications could be overhauled to take consumer safeguards such as copyright law and medical-privacy rules into account.
    Click Here to View Full Article

  • "E-Mail 'Cluster Bombs' a Disaster Waiting to Happen, Computer Scientists Say"
    Indiana University (12/10/03)

    The December 2003 issue of ;login: features a report by researchers at Indiana University Bloomington and RSA Laboratories in Bedford, Mass., warning that miscreants could use Web sites to bombard the inboxes of Internet users with hundreds or thousands of electronic messages in a short period of time. Such email "cluster bombs" would place a huge demand on the bandwidth of an Internet connection, making it difficult for victims to perform online activities or even access the Web; bombers could also apply the strategy to SMS (short message service) messages to paralyze cell phone users. IUB computer scientist Filippo Menczer and RSA Laboratories principal research scientist Markus Jakobsson say a bomber could use software agents, Web crawlers, and scripts to fill out thousands of subscription forms simultaneously, and then let the Web sites' automatic confirmation function flood a victim's inbox with messages asking whether the individual wants to subscribe. "We propose that Web forms be written so that the forms do not cause a message to be sent to subscribers at all," says Menczer. "Instead, the form would prompt subscribers to send their own emails confirming their interest in subscribing."
    Click Here to View Full Article
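
    Menczer's proposed fix shifts the sending of the confirmation from the Web server to the subscriber. One hypothetical way a form's response page might implement this is with a mailto link the subscriber must click, so a forged form submission generates no mail to the victim (the helper below is illustrative; the paper's exact mechanism may differ):

      from urllib.parse import quote

      def confirmation_mailto(list_address, list_name):
          """Build a mailto: link for the form's response page. The
          would-be subscriber, not the Web site, sends the confirming
          email, so an attacker who submits the form with a victim's
          address cannot flood that victim's inbox."""
          return f"mailto:{list_address}?subject={quote('subscribe ' + list_name)}"

      # The response page embeds this link instead of auto-sending mail:
      print(confirmation_mailto("requests@example.org", "technews"))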

  • "Buried Treasure?"
    Financial Times-IT Review (12/10/03) P. 1; London, Simon

    Economists have long dismissed the notion of a causal relationship between IT investment and productivity growth, but theorists such as MIT's Erik Brynjolfsson believe one exists, and over the last several years an unusual trend has unfolded--significant productivity gains in the United States despite sharp declines in business IT investment and economic growth--that appears to support such a theory. Brynjolfsson explains that it can take as long as five years for major IT investments to yield productivity gains because most expenditures cover other elements--business process redesign, consultation, training, etc.--that create intangible assets to support and complement the technological components. The MIT economist estimates that the ratio of this "support" spending to IT spending is roughly 10 to 1, based on a study across a spectrum of projects. Adding weight to Brynjolfsson's argument is a McKinsey Global Institute (MGI) study of productivity gains concluding that the step-change in American productivity growth in the 1990s was centered on the semiconductor, wholesale, securities, retail, telecommunications, and computer manufacturing sectors, where IT investment was accompanied by heated competition and little restraint on services, prices, products, and distribution. MGI finds that managers were forced to develop and implement IT-related innovations in response to these competitive pressures. Managers would do well to consider the contention borne out by the research of Brynjolfsson and MGI: that IT's influence on productivity growth is only felt when accompanied by major investments in human resources, innovation, and business process overhauls. MGI director Diana Farrell says researchers should now concentrate on understanding the "performance levers" relevant to each company, which will enable managers to choose IT projects that promise to yield the most productivity gains.

  • "Megabits and Multimedia Specs Await New Bluetooth Road Map"
    EE Times (12/10/03); Merritt, Rick

    The industry consortium supporting the Bluetooth short-range wireless standard is setting up a working group to plan the next iteration: The newly announced group is in charge of forecasting market demands and guiding Bluetooth along those lines. Bluetooth is becoming a mature technology and needs to address consumer needs more sensitively than in the past, when there were no real competing wireless options, says Bluetooth Special Interest Group (SIG) Chairman and Nokia senior manager Markus Schetelig. Bluetooth currently operates at 723 Kbps but will ramp up to either 1.4 Mbps or 2.1 Mbps under a medium-rate specification now undergoing approval within the consortium; the specification will provide different speeds depending on whether four- or eight-level phase-shift keying (PSK) modulation is used, and will allow for greater multimedia use, such as between MP3 players and headsets. The specification, set to undergo formal product testing in February, will also reduce power consumption by as much as two-thirds for some devices. Some vendors already sell Bluetooth chips based on the unofficial medium-rate specification, while others are ready for a firmware upgrade. A group of audio/video companies within the Bluetooth SIG is working on real-time streaming applications for Bluetooth, while PC industry players are working on Bluetooth transmission of real-time streaming based on Internet Protocol. International Data Corp. predicts that annual shipments of Bluetooth chipsets will reach 637 million by 2007, at which point up to 65 percent of mobile phones, 44 percent of PDAs, and 36 percent of laptop computers could be Bluetooth-enabled.
    Click Here to View Full Article
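
    The two medium-rate figures follow from the modulation levels: At Bluetooth's signaling rate of 1 megasymbol per second, four-level PSK carries 2 bits per symbol and eight-level PSK carries 3, and assuming--as an illustrative figure, not one from the specification--that roughly 70 percent of the gross air rate survives protocol overhead:

      \[
      \begin{aligned}
      \text{4-level PSK:}\quad & \log_2 4 = 2\ \text{bits/symbol}
        \;\Rightarrow\; 2\ \text{Mbps gross} \approx 1.4\ \text{Mbps effective},\\
      \text{8-level PSK:}\quad & \log_2 8 = 3\ \text{bits/symbol}
        \;\Rightarrow\; 3\ \text{Mbps gross} \approx 2.1\ \text{Mbps effective}.
      \end{aligned}
      \]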

  • "First Test of IPv6 Network Goes Well"
    IDG News Service (12/09/03); Gross, Grant

    The results of the first test of a next-generation Internet powered by Internet Protocol version 6 (IPv6) were very positive, according to the University of New Hampshire Interoperability Laboratory's Ben Schultz at the IPv6 Summit in Arlington, Va. "We only found small problems and small issues that need to be fixed," he declared, noting that FTP and HTTP ran over the network and Internet security measures functioned with no apparent hitches. The test constituted the first phase of the Moonv6 project, in which seven U.S. military sites were linked together via a network running on IPv6; Schultz said the second test, which will run from early February to mid-April, will concentrate on longer-term testing of Internet applications. The phase one test involved the participation of the U.S. Defense Department and 11 vendors connected to the Internet2 project, while the North American IPv6 Task Force and the University of New Hampshire Interoperability Laboratory also collaborated on the Moonv6 project. IPv6, which promises improved security and the ability to accommodate more IP addresses, is slated to run the Defense Department's Global Information Grid within five years, while the protocol is already being employed by certain Europe- and Asia-based companies. The Moonv6 network tests should make a significant contribution to the rollout of IPv6 products and services. U.S. Marine Major Roswell Dixon of the U.S. Joint Interoperability Test Command commented that it took longer than anticipated to build the Moonv6 network, and pointed out that the next test phase will give researchers plenty of time to study glitches in detail.
    Click Here to View Full Article
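
    For applications such as the FTP and HTTP tests, moving to IPv6 mostly means a different address family and longer addresses; a minimal sketch in Python (the address uses the 2001:db8::/32 documentation prefix and is illustrative, so it must be pointed at a reachable IPv6 host to actually connect):

      import socket

      # Open a TCP connection over IPv6 rather than IPv4; the application
      # logic above the socket layer is unchanged.
      sock = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
      sock.connect(("2001:db8::1", 80))   # illustrative IPv6 address
      sock.sendall(b"HEAD / HTTP/1.0\r\n\r\n")
      print(sock.recv(1024).decode("ascii", "replace"))
      sock.close()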

  • "Mining the Vein of Voter Rolls"
    Wired News (12/11/03); Zetter, Kim

    Registering to vote requires people to list personal information that by all rights should be kept confidential, yet election officials are offering this information to political parties, candidates, and data collectors for the purposes of marketing or other forms of exploitation. A 2002 study by the California Voter Foundation found that 22 U.S. states offer unrestricted access to voter lists. The president of the foundation, Kim Alexander, explains that voters fill out the registration forms believing their information will only be used to administer elections, and are surprised when they are courted by "campaign strangers" who have acquired that information. She adds that this is especially problematic if states do not specify that some of the information they ask for--email addresses, for instance--is optional, or fail to explain to voters why such personal data is necessary. There is no legal mechanism to prevent states from making voter information available; in fact, the Help America Vote Act passed by Congress last year makes retrieving such information easier than ever by requiring states to build a centralized voter-registration database. Though Jim Dempsey of the Center for Democracy and Technology believes political parties and candidates have the right to use voter-registration data for campaign purposes, making such data available to anyone else endangers the soundness of voter rolls. Alexander says state governments should alert voters on the forms that their data may be used for purposes other than election administration, while states should improve safeguards to prevent commercial exploitation of voter-registration information. But she also notes that "Politicians need to rein in the laws, yet they're the biggest consumers of [voter-registration] data."
    Click Here to View Full Article

  • "An Electronic Assist for Perilous Driving"
    New York Times (12/11/03) P. E6; Austen, Ian

    Electronic stabilization systems are designed to prevent rollovers and other skidding-related accidents: In-vehicle sensors gather data such as brake pressure, steering wheel orientation, wheel speed, and acceleration, and a microprocessor compares this information to determine whether the car is out of control. Philip Headley of Continental Teves attributes the emergence of electronic stability control systems primarily to the falling cost of computing in the 1990s. His company estimates that stability control systems are found in 8 percent of cars sold in the United States and around 50 percent of those sold in Europe. Antilock braking systems' (ABS) disappointing performance on actual roads, compared with their test-track performance, may account for low U.S. sales of stability control systems. Insurance Institute for Highway Safety President Brian O'Neill reports that stability control might offer advantages over ABS because "stability control...doesn't depend on the driver doing the right thing." Stability control systems are currently undergoing testing by the National Highway Traffic Safety Administration, and agency representative Rae Tyson notes that the technology appears to be broadly effective. Headley believes steering control will advance even further, to the point where cars will be able to steer themselves during accidents by connecting stability control to in-vehicle proximity radar, video cameras, and satellite-based mapping systems. "This is about taking an average driver and allowing them to perform like a skilled driver," explains Robert Bosch VP Robert Rivard.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
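
    The comparison such systems perform can be sketched with a linear "bicycle model" that estimates the yaw rate the driver is requesting from speed and steering angle, flagging a mismatch with the measured yaw rate; every constant and threshold below is illustrative, not any vendor's algorithm:

      def needs_intervention(speed, steer, measured_yaw,
                             wheelbase=2.7, k_understeer=0.0025,
                             threshold=0.15):
          """Compare the yaw rate implied by the driver's steering input
          (steady-state bicycle model) with the yaw rate the sensors
          measure; a large gap means understeer (plowing) or oversteer
          (spinning). Units: m/s, radians, rad/s. Illustrative only."""
          expected_yaw = speed * steer / (wheelbase + k_understeer * speed ** 2)
          return abs(measured_yaw - expected_yaw) > threshold

      # Gentle steering at 30 m/s should yield ~0.3 rad/s of yaw; the car
      # only rotates at 0.1 rad/s, so the system would brake individual
      # wheels to correct the understeer.
      print(needs_intervention(speed=30.0, steer=0.05, measured_yaw=0.1))  # True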

  • "Where In the World Is the Virtual IT Worker?"
    NewsFactor Network (12/09/03); Ryan, Vincent; Long, Mark

    Telecommuting has stepped out of the limelight at companies for a number of reasons, most having to do with the trend toward offshore outsourcing and the economic downturn: Companies no longer feel as pressed to lure the best new hires with perks such as telecommuting, and even the assumption that telecommuting saves money is being challenged. Still, Gartner found in a June 2003 survey that about one-third of executive respondents had outsourced or intended to outsource some of their IT functions--a variation on telecommuting, since the work is done remotely. Some of that work has gone overseas, where developers are paid less than 20 percent of what their U.S. counterparts receive. But Software Outsourcing Research executive director Marty McCaffrey says offshore outsourcing has many hidden costs, including the extra effort involved in coordinating a project remotely. Telecommuting, meanwhile, has also been touted as a money-saver, though those arguments may not hold up while corporate revenues remain depressed; lack of office space, for example, is no longer the issue it was during the tech boom because of years of downsizing. Still, the International Telework Association and Council found in September that the number of U.S. employees who work from home at least one day per month has increased 40 percent in the last three years, and that about 42 percent of those telecommuters work from home one day per week. IT staff are the most likely to telecommute because they can deal with technical issues better than employees who are not as tech-savvy, and Robert Half Technology's Jeff Markham says executives are another segment being offered telecommuting options. As the economy improves, telecommuting may prove a mixed bag for IT workers: It can make outsourcing a more likely option, though an offer to telecommute can also signal that an employer wants to retain a worker.
    Click Here to View Full Article

  • "Everyone Wants to Govern the Internet"
    Inter Press Service (12/11/03); Cariboni, Diana

    Discontent over Internet governance is growing as the world increasingly lays claim to a technology built by consensus. Begun as a U.S. military project called ARPANET in 1969, the nascent data network was soon transferred to academic hands and incorporated into the National Science Foundation Network. By the 1990s, the Internet had become a global project whose technical standards were often formed by the consensus of small groups of contributors, and sometimes by single individuals. As the idea of a free information exchange grew in popularity, corporate interests began to take notice and wanted a say in the Internet's governance, says ICANN board member and Mexican Internet expert Alejandro Pisanty. Today, Internet standards are governed by three groups: the Internet Engineering Task Force (IETF), made up of technical experts who work out standards based on wide consensus; the World Wide Web Consortium (W3C), whose member organizations set standards having to do with accessibility, user interface, and architecture; and ICANN, which handles the domain name system. The U.S. government established ICANN in 1998 as the body that would eventually assume technical management of the Internet. Major ICANN actions to date include the assignment of domain name registrars, the creation of new top-level domains, and the policing of unilateral actions such as VeriSign's recent Site Finder service. Critics of ICANN complain that the group lacks the accountability present in other organizations, such as corporations and non-profits. Pisanty says ICANN is moving to broaden participation, especially among Southern Hemisphere countries, which have agitated for more involvement; those countries prompted the U.N. secretary-general to set up a working group that will investigate new models for governing the Internet.
    Click Here to View Full Article

  • "IT Workers Feel Effects of the Long Downturn"
    Dallas Morning News (12/07/03); Godinez, Victor

    The downturn in the technology industry continues to have an enormous impact on information technology workers. According to experts across the industry, IT workers are overworked, concerned about job security, and unhappy with their careers. "People who have been able to hold on to a job or find a job have either taken a cut in salary or benefits," says Wanda Brice, president and chief executive officer at Computer Directions, a placement firm in Dallas. "If they have remained employed as a full-time employee, they have watched colleagues leave and get cut, and that's very demoralizing." Such sentiments were reflected in a survey released last month by the trade journal Computerworld, which found that 55 percent of respondents were less satisfied with their jobs than a year ago. Though IT workers complained about their workload and budget cuts, Computerworld editor in chief Maryfran Johnson said they expressed even more concern about offshore outsourcing: IT workers expect more jobs to be outsourced, and added that other openings may be filled by foreign workers in the United States participating in the H-1B or L-1 visa programs. Johnson says, "What causes us more concern is...worry [about] the future of the IT profession."
    Click Here to View Full Article

  • "Why Software Quality Stinks"
    CIO (12/01/03) Vol. 17, No. 5, P. 28; Surmacz, Jon

    The poor quality of software can only be improved by the deployment of an effective software quality assurance (SQA) program, yet 38 percent of developers polled by the Cutter Consortium across more than 150 software development organizations report that their companies lack such programs, while 31 percent say they have no SQA personnel at all. Seventeen percent claim no software quality problems, while 36 percent say they do not address quality until the very end of the development cycle. Another 36 percent attest that their companies boast an SQA team that is closely involved in most (if not all) software development projects, 29 percent report the presence of integrated SQA team members in each project, and 25 percent remark that most or all projects are the responsibility of an SQA manager. Fifty-three percent of respondents indicate that their senior management believes the company's software quality is reasonable but should be improved; 30 percent say management considers the software to be of consistently high quality; 11 percent note that senior management never discusses software issues with them; and 6 percent report that management is very dissatisfied with software quality. A five-step procedure for setting up an effective software quality team can be extrapolated from the survey's findings: First, getting senior management on the side of the SQA manager is critical. Second, a quality organization must be established with a manager "who has spent a few years in the trenches and has gotten products out the door," according to the Cutter Consortium's E.M. Bennatan. The three remaining steps are training developers as well as the quality assurance group, obtaining customer or user group feedback, and collecting metrics to track software quality improvements.
    Click Here to View Full Article

  • "Prepare to Be Scanned"
    Economist Technology Quarterly (12/06/03) Vol. 369, No. 8353, P. 17

    Despite unresolved issues of intrusiveness and cost, biometrics system rollouts are accelerating in the wake of the Sept. 11, 2001 terrorist attacks: Biometric security will be standard at U.S. airports and seaports starting early next year, foreign visitors will have to submit to biometric scanning under the US-VISIT program, and new passports containing biometric data will be issued to Americans starting in 2004. Yet the General Accounting Office (GAO), the National Academy of Sciences, and other organizations report that biometrics technology is still not ready for wide-scale implementation. Biometrics can be employed for identification or for verification, and even the most advanced systems are better equipped for the second purpose: Identification requires correlating identifying characteristics--be they fingerprints, hand geometry, faces, or irises--against databases containing millions of users, which makes the risk of false positives too great. Biometric technologies each offer different levels of effectiveness, with iris scanning generally considered the most reliable for identification; the least reliable method is face recognition, according to findings from America's Transportation Security Administration and the Face Recognition Vendor Test. A forthcoming paper from Michigan State University's Anil Jain proposes that "multibiometric" systems combining several biometric techniques are more effective than individual biometrics, and both Europe and the United States are looking at a multibiometric approach that integrates face recognition and finger scanning. Biometrics-based identity-verification systems may be more effective than those used for identification, but the GAO and other agencies do not think the benefits outweigh the costs: Establishing biometric systems at airports or border crossings could divert funds from other critical security areas, while the long-term effect of biometrics will be an erosion of personal privacy, especially once face recognition techniques are perfected.
    Click Here to View Full Article
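
    The identification-versus-verification gap is a matter of arithmetic: Verification makes one comparison per claim, while identification against a database of N enrollees makes N. With an illustrative per-comparison false match rate of p = 10^-5 and N = 10^6 users,

      \[
      E[\text{false matches per probe}] = Np = 10^{6} \times 10^{-5} = 10,
      \qquad
      P(\text{at least one}) = 1 - (1 - p)^{N} \approx 1 - e^{-Np} \approx 0.99995,
      \]

    so a matcher perfectly adequate for one-to-one verification would yield roughly ten false hits on every one-to-many search.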

  • "Toward a Brain-Internet Link"
    Technology Review (11/03) Vol. 106, No. 9, P. 30; Brooks, Rodney

    MIT scientist Rodney Brooks speculates that brain implants which connect people wirelessly to the Internet could be a reality by 2020, thanks to technological advances as well as military and medical research initiatives. Brooks cites State University of New York (SUNY) researchers' successful implantation last year of devices in rat cortexes that stimulate certain areas of the rodents' brains so that they turn in specific directions in response to commands from a laptop. Three years earlier, SUNY researcher John Chapin and Duke University's Miguel Nicolelis recorded neural signals in rats when they pressed a lever to induce a robot arm to release water, training a computer to recognize the signals so that the animals could eventually trigger the arm by thought alone; a similar experiment using primates soon followed. Machine-neuron link technology for humans is already in use: Hearing-disabled people are now using cochlear implants, and human trials are underway involving neural implants that give blind people a rudimentary perception of the surrounding environment, and quadriplegics crude mental control of computers. The Defense Advanced Research Projects Agency has also established a wireless brain-machine interface program with the initial goal of developing thought-controlled biomedical equipment. Brooks concludes that the technology must meet additional challenges before it can reach its full potential. "We need algorithms that can track the behavior of brain cells as they adapt to the interface, and we'll need better understanding of brain regions that serve as centers of meaning," he explains.
    Click Here to View Full Article
    (Access to full article available to paying subscribers only.)

  • "The Taming of the Internet"
    Business Week (12/15/03) No. 3862, P. 78; Baker, Stephen

    The proliferation of spam and the adoption of spamming methods by virus writers and online fraudsters have delivered a jolting reality check to Web authorities, and are driving a migration away from a free-for-all Internet toward a more regulated environment. Companies and individuals are implementing new solutions and establishing new guidelines for ethical Internet communication; a recent poll by the Pew Research Center finds that 70 percent of people online do not divulge their email addresses. This transformation is causing the stock of security companies to rise, and spurring AOL and Microsoft to invest research dollars into cutting-edge spam filters and child-protection measures, while Silicon Valley venture capitalists are backing startups that offer anti-spam software. However, this scaling back of the Internet's openness is bad news for academics, enthusiasts, and others who rely on peer-to-peer networks to access documents and content vital to research or entertainment, while startups without trustworthy credentials could be hamstrung by the filtering, blockage, or dismissal of their outgoing email. The spam problem is also engendering a rethink of online marketing: Businesses of all sizes are eliminating mass emails in favor of special offers, coupons, sweepstakes, and promotions. The highly selective white-list anti-spam solution, which shuts out most of the Internet populace, is causing online communities to become more cloistered--a trend that promotes safety but could balkanize the Web and inhibit the growth of new technologies. Analysts also expect online users to set up separate email accounts for friends, work, e-tailers, and even pornography. The development of anti-spam measures has forced spammers to change their strategies as well: Instead of simply sending unsolicited emails, many spammers now scam people by fooling them into thinking they are dealing with legitimate companies through spoofing, bogus Web sites, and other underhanded tactics.
    Click Here to View Full Article
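
    The white-list trade-off is visible in the mechanism itself; a minimal sketch (the addresses and list are illustrative):

      from email import message_from_string
      from email.utils import parseaddr

      ALLOWED = {"alice@example.org", "bob@example.net"}  # the white list

      def accept(raw_message):
          """Deliver only mail whose From address is already known:
          near-perfect against spam, but every stranger -- legitimate
          or not -- is shut out, which is what cloisters communities."""
          _, sender = parseaddr(message_from_string(raw_message).get("From", ""))
          return sender.lower() in ALLOWED

      print(accept("From: Alice <alice@example.org>\n\nhi"))          # True
      print(accept("From: New Contact <new@startup.example>\n\nhi"))  # False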


