
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 718: Friday, November 12, 2004

  • "Mostly Good Reviews for Electronic Voting"
    New York Times (11/12/04) P. A20; Schwartz, John; Zeller Jr., Tom

    Skepticism over the security and reliability of electronic voting systems continues to linger, despite assurances from election officials and experts that most e-voting machines performed smoothly on Election Day. Critics are still vocal about the systems' vulnerability to tampering and bugs, and their lack of a verifiable printed audit trail. Though some election officials acknowledge the need for more accountability and reliability, they argue against paper trails on the grounds that touchy printers could become a headache for poll workers. Maryland elections administrator Linda Lamone contends that a more reliable verification measure is necessary, and her state is collaborating with voting technology vendor VoteHere to develop such a system. Gartner analyst John Pescatore says the lack of major incidents with e-voting systems on Election Day is no indication that the machines are reliable, and argues that voting "should be so trustworthy that we expect everything to go right." A Nov. 9 report from the Caltech/MIT Voting Technology Project concluded that there was no evidence to suggest that e-voting machines were used to rig the election in favor of President Bush, but that has done nothing to allay fears of voting fraud. "If we don't fix our voting technology situation, we will have a serious and justifiable erosion of public confidence," warns Johns Hopkins University computer science professor Aviel Rubin. However, touchscreen voting's overall passing grade, along with the Help America Vote Act, will undoubtedly encourage increased use of the technology.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Future Watch: Supercomputing Technology to Keep Tabs On"
    Computerworld (11/11/04); Weiss, Todd R.

    Representatives of three major supercomputing efforts showcased at this week's SC2004 conference agree that supercomputer research has real business implications. One project, NASA's Columbia supercomputer, comprises 20 Silicon Graphics Altix machines, each with 512 Intel Itanium 2 processors, that collectively yield a peak speed of 42.7 trillion calculations per second; Columbia demonstrates that off-the-shelf technology can be assembled into enormous computing power in less than 120 days, an advantage that can also be applied to computationally intensive business IT operations, according to project manager William Thigpen. Furthermore, NASA is working to tackle the challenge of transferring large data files between two points, and Thigpen is confident that the technology used to address the problem will trickle down to corporate computing. Also spotlighted at SC2004 is the Internet2 project, an initiative to develop advanced Internet technologies coordinated by a consortium of over 200 academic institutions, with the participation of approximately 50 private companies. Internet2 executive director Laurie Burns says the project is of interest to businesses as a transformational tool: She notes that corporate members such as Johnson & Johnson see the potential for faster and more secure drug prototyping research, for example, through technologies developed via Internet2. SC2004's StorCloud project is on hand to highlight high-performance storage technologies whose performance approaches that of supercomputers, using commercially available components from diverse manufacturers. Bryan Bannister of the San Diego Supercomputer Center explains that StorCloud's strategy is often employed by IT users who blend products from multiple vendors and then figure out how to make them interoperate. StorCloud is designed to fulfill storage requirements that are growing exponentially as more computing power becomes available.
    Click Here to View Full Article

  • "Shared Awareness Key to Successful Computer-Supported Collaboration"
    Penn State Live (11/09/04)

    A Penn State research project funded by the National Science Foundation has yielded a paper, presented at this week's ACM Computer Supported Cooperative Work conference, that proposes an evaluation schema for activity-awareness-based computer-supported collaborative systems. The Penn State researchers explain that such a system could promote more successful team collaboration on complex, ill-defined problems because the software tools the teams employ would support members' shared comprehension of long-term objectives, plans, obstacles, and resource apportionment. "The more shared activity awareness among users of computer technologies, the more effectively a group will function," concludes Edward M. Frymoyer Professor of Information Sciences and Technology John M. Carroll. The activity awareness model the researchers created incorporates such factors as context, communication, coordination, and whether the work is structured in a way that requires frequent or infrequent communication. The Penn State team devised a suite of tools for augmenting activity awareness, including a concept-map interface and a timeline for document histories, deadlines, and project status. The research involved a two-year study of sixth- and eighth-grade students in two Virginia middle schools as they collaborated on science projects using a Java-based system that included an integrated chat tool and a real-time interaction editor. In the first year of the experiment, remote collaboration between students was difficult because of a lack of activity awareness between the teams.
    Click Here to View Full Article

  • "Panel Discusses Threat of Cyberterrorism at Workshop"
    Cornell Daily Sun (11/12/04); Holmes, Casey

    Separating the myths of cyberterrorism from the reality was the goal of a workshop sponsored by the Cornell Information Assurance Institute and the Peace Studies Program, noted moderator and postdoctoral associate of peace studies Giampiero Giacomello. Cornell Information Assurance Institute director Fred B. Schneider identified as a myth the idea that existing security measures could utterly thwart hackers or cyberterrorists. "The game is moving vulnerabilities around to places where it will do the least good for the attacker," he explained. Georgia Tech College of Computing professor Seymour Goodman said terrorist organizations are currently using cyberspace as a tool for enlisting new members, laundering money, researching potential targets, and passing encrypted messages, but warned against assuming that such entities lack, or will never recruit, people with the skills to execute a cyberattack. National Research Council scientist Herb Lin pointed out that there are dramatically fewer physical barriers in cyberspace than in the real world, and that increasing automation of computer systems raises the risk that a catastrophic terrorist attack could be carried out remotely by a single person. The presenters listed numerous factors that have contributed to the vulnerability of computerized U.S. infrastructure: security added as an afterthought in the original design of infrastructure software; computer software standardization; people's tendency not to update or upgrade their information security for the sake of convenience; and legacy software created before the advent of pervasive computer networking, when security was not prioritized or well understood. Goodman explored several cyberterrorism scenarios: One outlined a cyberspace-directed attack on institutional infrastructures, while another detailed a combined cyberattack and physical assault.
    Click Here to View Full Article

  • "Emotional Computing"
    Washington Times (11/11/04); Geracimos, Anne

    Researchers are helping computers recognize basic cues so they can respond to human emotion. So-called affective technology imbues computers with the ability to sense and convey emotion through aspects of communication that have previously been ignored in the computing realm, such as tone of voice, facial expressions, and even basic physiological reactions like sweaty palms. MIT affective computing researcher Rosalind Picard says prototype computers have been created with those capabilities, but they still cannot intelligently express emotion; Picard explored the subject of computer emotional intelligence in depth in the 1996 book "HAL's Legacy: 2001's Computer as Dream and Reality." Picard sees opportunities for emotionally aware computers in education and health care; computers, for example, could change tactics when children become bored or frustrated with the learning process. A patient undergoing psychological therapy could be monitored by a device such as a mobile phone, which would alert the therapist when the patient is under stress and needs counseling. MIT graduate student Carson Reynolds helped build an email program called Emotemail that transmits a writer's emotional state along with the email text: A camera photographs the author's facial expressions as they write, so that small photos are embedded alongside each paragraph, and text is given a different background color depending on how long it took to compose, so that readers know where to look for nuance. American Association for the Advancement of Science ethics director Connie Bertka is somewhat hesitant about computers catering to people's emotional states: Although she might compensate for the emotional state of a five-year-old child, she asks, would she want a computer to do the same for her?
    Click Here to View Full Article

  • "E-Mail Authentication Will Not End Spam, Panelists Say"
    Washington Post (11/11/04) P. E1; Krim, Jonathan

    Experts gathered at a recent Federal Trade Commission (FTC) forum said email authentication will not solve the problem of spam and online fraud. Authentication schemes work by matching email header information to the actual source of the email, but hackers and spammers working in tandem have already created vast networks of zombie computers that can generate spam messages whose sender addresses and Internet mail servers match (a simplified sketch of such a sender check follows this item). Symantec estimates that as many as 30,000 computers were drafted into zombie networks each day during the first half of this year. "We'll be lucky if we solve 50 percent of the problem," said MailFrontier Chairman Pavni Diwanji. Still, the gathered panelists agreed email authentication is a necessary building block in future solutions: MX Logic chief technology officer Scott Chasin said much of the Internet's underlying addressing architecture remains vulnerable to hackers because it is unencrypted. Industry and standards-setting organizations need to work faster to secure that infrastructure through projects such as email authentication, which is part of the reason the FTC decided to host the forum. Recently, Microsoft and other email authentication participants failed to work out licensing terms for a proposed standard, leading to the dissolution of the Internet group formed to create that solution. If the problem of spam and other illegal online activity is not solved soon, shaky consumer confidence could seriously damage the economy, warned Digital Impact vice president R. David Lewis, whose firm provides bulk email services. Chasin said hackers' technical capabilities are advancing quickly, even making computer operating systems themselves a platform for phishing attacks.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
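
    The header-to-source matching described above can be pictured with a toy sender-policy check, loosely analogous to path-based proposals such as SPF that were under discussion at the time. The domain names, IP addresses, and policy table below are invented for illustration; this is a minimal sketch, not any vendor's or standard's actual mechanism.

      # Toy illustration of path-based email authentication.
      # A receiving mail server looks up the sender's domain in a published
      # policy and checks whether the connecting IP is authorized to send
      # mail for that domain. All data below is hypothetical.

      SENDER_POLICIES = {  # domain -> IPs authorized to send its mail
          "example.com": {"192.0.2.10", "192.0.2.11"},
          "bank.example": {"198.51.100.5"},
      }

      def authenticate(envelope_from, connecting_ip):
          """Return 'pass', 'fail', or 'none' for an incoming SMTP connection."""
          domain = envelope_from.rsplit("@", 1)[-1].lower()
          authorized = SENDER_POLICIES.get(domain)
          if authorized is None:
              return "none"   # domain publishes no policy; cannot verify
          return "pass" if connecting_ip in authorized else "fail"

      # A forged sender relayed through a compromised "zombie" PC fails the
      # check, but a zombie sending under its own ISP's domain could still
      # pass -- which is why panelists called authentication necessary but
      # not sufficient.
      print(authenticate("[email protected]", "192.0.2.10"))   # pass
      print(authenticate("[email protected]", "203.0.113.77"))     # fail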

  • "Trying to Make the Pen as Mighty as the Keyboard"
    New York Times (11/11/04) P. E5; Ricadela, Aaron

    The viability of using a stylus to direct a computer is hampered by the need for users to learn a shorthand whose variations can confuse the machine, by the fact that writing on a screen takes longer than typing, and by the awkwardness of pointing at and clicking on onscreen icons. Research into pen-gesture systems could address the problem if the resulting technology opens up new avenues of human/computer interaction instead of merely substituting for a mouse, argues Brown University computer science professor Andy van Dam. In October, Dr. Shumin Zhai of IBM's Almaden Research Center released Shorthand-Aided Rapid Keyboarding (Shark) software on IBM's alphaWorks Web site; Shark replaces the onscreen keyboard typical of Tablet PCs with a stylus-friendly keyboard that arranges common letter combinations in a hexagonal grid. Tracing the stylus across the virtual keys lets users write words by forming shapes Zhai calls sokgraphs, which are displayed onscreen (a toy sketch of this kind of shape matching follows this item). Zhai says that, with enough practice, users become so familiar with the shapes that they can gesture them from memory; Shark maintains a lexicon of roughly 8,000 words and can learn new ones by scanning text files and emails. Meanwhile, Microsoft researcher Ken Hinckley's Stitching software lets palmtop and Tablet PC users exchange files by gesturing with their styluses toward each other's screens: His prototype StitchMaster software enables users to select photos on their Tablet PC by tapping or circling groups of them, then transfer the files wirelessly to another user's PC with a gesture. The goal of Hinckley's collaborative stitching project is to let a user send a document to several people beyond arm's length by dragging the file to the top of the screen, eliminating the discomfort some people may feel with the close physical interaction the prototype software requires.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)
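
    Shark's shape matching can be thought of as comparing the traced gesture against the path each dictionary word would make through the key layout. The sketch below uses invented key coordinates, a tiny lexicon, and a simple point-to-point distance; it only illustrates the general idea, not IBM's actual algorithm.

      # Toy sketch of sokgraph-style word matching: each word maps to the
      # polyline through its letters' key centers; a traced gesture is
      # scored against every word's polyline and the closest match wins.
      # Key coordinates and the lexicon are invented for illustration.

      KEY_POS = {  # hypothetical (x, y) centers of on-screen keys
          'c': (0, 0), 'a': (1, 0), 't': (2, 0),
          'o': (0, 1), 'r': (1, 1), 'e': (2, 1),
      }
      LEXICON = ["cat", "care", "rat", "tore"]

      def resample(points, n=16):
          """Linearly resample a polyline to n evenly spaced points."""
          out = []
          for i in range(n):
              t = i * (len(points) - 1) / (n - 1)
              j, frac = int(t), t - int(t)
              x0, y0 = points[j]
              x1, y1 = points[min(j + 1, len(points) - 1)]
              out.append((x0 + frac * (x1 - x0), y0 + frac * (y1 - y0)))
          return out

      def word_template(word):
          return resample([KEY_POS[ch] for ch in word])

      def match(trace):
          """Return the lexicon word whose template lies closest to the trace."""
          trace = resample(trace)
          def dist(word):
              return sum((tx - wx) ** 2 + (ty - wy) ** 2
                         for (tx, ty), (wx, wy) in zip(trace, word_template(word)))
          return min(LEXICON, key=dist)

      # A sloppy gesture near c -> a -> t should still come out as "cat".
      print(match([(0.1, 0.1), (0.9, -0.1), (2.0, 0.2)]))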

  • "Battling Hackers Is a Growth Industry"
    Wall Street Journal (11/10/04) P. B3B; Richmond, Riva

    Job opportunities and salaries for technology professionals have declined in recent years due to competition and lower technology spending, but a new IDC study reports that security specialists are operating in an expanding market thanks to an increase in malicious attacks on computer systems, new regulations, and the risks posed by new communications technologies. IDC analyst Allan Carey says organizations are looking particularly for security specialists with business expertise, and forecasts that the number of full-time information security professionals will rise almost 14 percent per year worldwide, surpassing 2.1 million in 2008. The study was commissioned by the International Information Systems Security Certification Consortium. Consortium board member Howard Schmidt says a possible national shortage of security professionals is worrisome, given that U.S. companies are becoming more reliant on their computer networks to do business. Many organizations report that the market for such talent is tight, so compensation tends to be better for security specialists; job security may also be better, since companies are often reluctant to offshore security functions, although security outsourcing does occur when those functions are taken over by specialized firms and big IT services companies. The highest security job growth over the next five years will occur in the Asia-Pacific region, at a rate of 18 percent a year, according to IDC; Europe's growth rate will average 11 percent during that time, while the U.S. rate will average 12 percent. However, the SANS Institute's Alan Paller says much of the growth is coming from within, as traditional systems administrator and auditor positions take on new security responsibilities.

  • "Computer Links Open Opportunities at UL"
    Lafayette Daily Advertiser (LA) (11/07/04); Sills, Marsha

    University of Louisiana (UL) researchers expect to take advantage of new research opportunities once the state is connected to the National Lambda Rail network and the link is distributed to state research institutions via the Louisiana Optic Network Initiative (LONI), which has been pledged $40 million over 10 years by Gov. Kathleen Blanco. LONI will establish links between UL and supercomputers at Louisiana State University (LSU) and elsewhere in the United States, with information being transmitted along fiber-optic lines at 40 Gbps. UL President Ray Authement believes research conducted through LONI will trickle down to industries, especially small companies that cannot afford equipment or research investments. UL computer science department head Magdy Bayoumi, director of UL's Center for Advanced Computer Studies, says the link "opens new windows for current research and it will create opportunities for new research that we don't think about right now." Also expected to play a major role in UL research through LONI is the Lafayette Economic Development Authority's $18 million Acadiana Technology Immersion Center, a visualization tech facility that Bayoumi sees as a key enabler for bioinformatics research. Meanwhile, UL Health Informatics Center director Phil Caillouet wants LONI to help create a health information network for the purpose of collecting and analyzing "non-identified data to get a better understanding of health problems and what services are needed." He notes, however, that some projects will be impeded by a lack of willing participants rather than infrastructure. In addition to UL and LSU, LONI will connect Louisiana Tech, Tulane University, Southern University, the University of New Orleans, and LSU Health Sciences Centers.
    Click Here to View Full Article

  • "Industrial Net Security Found Lacking"
    Investor's Business Daily (11/09/04) P. A5; Howell, Donna

    British Columbia Institute of Technology security researcher Eric Byres and Britain's PA Consulting Group have published a study indicating that industrial cybersecurity needs to be strengthened in the face of increasing outside cyberattacks. Byres has documented a tenfold increase since 2000 in the number of successful cyberattacks on computerized control systems, some of which have inadequate safeguards and some of which have no protections at all. The study began with research for one oil company, and the researchers built a database of incidents as other industrial organizations joined the effort. Activity has picked up this year and more attacks are being reported, Byres says; he believes that most of these attempts come from outside the organization, though only a minority target a particular organization. He says a firewall is not enough to protect supervisory control and data acquisition (SCADA) systems, and that many administrators do not even realize their systems are vulnerable to Internet-borne threats such as worms and viruses because they are unaware the systems are connected to the Internet at all. The need for remote management is what brought these systems online: such access is now a must-have for industrial control systems because experienced engineers and system vendors need to reach them. Although most current attacks on SCADA systems are opportunistic (they are attacked because they're there), Byres says someone with inside knowledge could cause a blackout.

  • "Group Aims to Create Hallmark of Security"
    CNet (11/08/04); Kawamoto, Dawn; Hines, Matt

    The Applications Security Consortium's goal is to establish "minimum criteria" for protecting Web-based applications, and the group plans to make its debut at the Computer Security Institute's annual conference. Consortium members include Teros, F5 Networks, NetContinuum, and Imperva, all of which have expertise in application firewalls. NetContinuum CEO Gene Banman says the participants agreed that the market needed clarification about what is required to protect Web-based applications. The consortium has invited Check Point Software Technologies, Symantec, Juniper Networks, McAfee, and Cisco Systems to test their security software and hardware products against the consortium's five criteria, with the testing monitored by TruSecure's ICSA Labs. The criteria are: detection and blocking of malicious executable commands, prevention of data insertion through the illicit control of format and type, prevention of cookie tampering, protection of application fields from modification, and protection of URL parameters (a small sketch of one common anti-tampering technique follows this item). The testing program is intended to improve protection for Web applications' underlying software protocols and application code; the consortium contends that many security companies do not live up to their claims in this area, which leaves customers at higher risk. Analysts say the five criteria are a good starting point for application firewalls.
    Click Here to View Full Article
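
    One of the listed criteria, prevention of cookie tampering, is commonly addressed by having the application (or an application firewall in front of it) sign cookie values so that any client-side modification is detected. The sketch below uses Python's standard hmac module with an invented key and cookie contents; it illustrates the general technique only, not any consortium member's product.

      # Illustrative cookie-integrity check: the server signs cookie values
      # with an HMAC before sending them, and rejects any cookie whose
      # signature no longer matches on the way back in. Key and cookie
      # contents are hypothetical.
      import hmac, hashlib

      SECRET_KEY = b"server-side-secret"   # hypothetical; never sent to the client

      def sign_cookie(value):
          mac = hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()
          return value + "|" + mac

      def verify_cookie(cookie):
          """Return the original value if untampered, else None."""
          value, _, mac = cookie.rpartition("|")
          expected = hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()
          return value if hmac.compare_digest(mac, expected) else None

      issued = sign_cookie("role=user")
      print(verify_cookie(issued))                           # "role=user"
      print(verify_cookie(issued.replace("user", "admin")))  # None: tampering detected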

  • "Home Is Where the Hand-Held Is"
    Far Eastern Economic Review (11/04/04) Vol. 167, No. 4, P. 38; Wagstaff, Jeremy

    Home automation technologies are slowly spreading across the Asia-Pacific region, enabled by fast and cheap Internet access. The foundation of a smart home begins with computers linked together by a wired or wireless network, after which other appliances such as stereos and televisions can be networked as well. Once the networking infrastructure is in place, entertainment applications will drive the adoption of home automation, says IDC Asia/Pacific analyst Claudio Checchia. Andrew Merritt, owner of Australia-based home networking hardware vendor Blakemore Integrated Automated Systems, is developing a prototype residence featuring networked entertainment, lighting, heating, and security controlled by a central computer or server via a universal remote. Meanwhile, Korea-based LG's HomeNet range uses the Internet to connect a fridge, washing machine, air conditioner, microwave, and projector TV. One of the biggest hurdles to overcome is the lack of standard communications between devices from different manufacturers: Software that can act as an intermediary between various gadgets is one solution being explored, while another skirts intermediaries altogether by reducing complexity; in this vein is the Internet Zero project at MIT's Center for Bits and Atoms, which proposes that intelligence be inexpensively embedded into home devices without introducing new standards. Speaking at a recent lecture, Center for Bits and Atoms director Neil Gershenfeld remarked, "Our experience has been that the killer app is...managing complexity rather than any one new service." Smart homes are not being designed strictly with luxury or security in mind: Efforts include spaces that monitor elderly inhabitants and provide caregiver services, while even more forward-looking visions include homes that anticipate owner needs and are capable of self-maintenance.

  • "In Search of Experts"
    InformationWeek (11/08/04) P. 36; Greenemeier, Larry; Foley, John

    The advantages of open-source deployments to enterprises, which include cost savings, independence from vendors, and standardization, are sometimes undermined by a lack of foresight among companies regarding the costs of building expertise and supporting implementations. Experts such as Yankee Group analyst Laura DiDio and Linux NJ.com consultant Faber Fedor concur that demand for experienced workers far outstrips supply, which means that skilled Linux administrators and other workers with open-source proficiency can demand salary premiums. A Yankee Group study finds that a complete migration from a closed-source system such as Windows to an open-source system such as Linux can cost three to four times as much as a Windows upgrade, and take three times as long to implement; similarly, a May Forrester Research study estimates that Linux training is 15 percent more expensive than Windows training, on average. IT workers experienced with Unix and C/C++ programming are better prepared to transition to Linux. One way to deal with a lack of available talent is to query the online open-source programming community, but most companies are more likely to prefer open-source support services offered by major IT vendors, although this option can increase total deployment costs. One strategy employed by Bob Ductile of KeyCorp's Key Technology Services division is to scour online open-source repositories such as SourceForge.net to find programmers engaged in projects that illustrate desirable skills. Robert Jones with Glacier Technology Services' HotLinuxJobs division predicts that open-source expertise supply and demand will equalize as colleges accelerate open-source programs and churn out graduates who will morph into the next generation of Linux experts.
    Click Here to View Full Article

  • "Going Down Fast"
    Computerworld (11/08/04) Vol. 32, No. 45, P. 51; King, Julia

    Massive job cuts, a hiring freeze, work-related stress, and frustration have combined to bring IT worker morale to an all-time low, according to research. Morale problems among IT personnel were reported by almost 75 percent of 650 companies polled by Meta Group in June, compared to two-thirds the year before. Experts are urging managers to measure other factors that can impact morale and job satisfaction even more than salary, benefits, or training; these factors include anger over less available resources, asinine policies and procedures, impractical expectations, managerial ignorance, and impossible demands. IT workers' biggest reported fear is being laid off as companies export software development, maintenance, and support jobs to less expensive outside contractors as a cost-cutting measure, while many working professionals are feeling overworked and burned out as a result of extensive staffing cuts. Such complaints carry an undertone of bereavement at the loss of the respect, wages, perks, and status IT workers once commanded. CDW IT director K.C. Tomsheck suggests that managers could assuage IT pros' concerns by devising transparent career development plans; CDW's IT staffers each possess a written personal development plan that includes tech-specific training goals and long-term performance objectives. CIOs of companies whose IT workers exhibit generally high levels of morale and job satisfaction cite the need for clear, honest, regular, and direct communication between management and IT personnel. "It's the unknown that causes anxiety and stress--not knowing if you're going to have a job next week or be outsourced," explains Minnesota Life Insurance CIO Jean Delaney Nelson.
    Click Here to View Full Article

  • "Self-Navigating Vehicle"
    EE Times Supplement (11/04) P. 41; Murray, Charles J.

    Many automotive experts believe the enabling technologies for vehicles that can automatically avoid collisions and communicate wirelessly with one another to stay abreast of traffic and road conditions are emerging, although a self-navigating car is unlikely to be realized for at least 20 to 30 years. The obvious advantages of such vehicles include a significant reduction in fatal crashes caused by driver negligence and less time wasted driving to and from work. The first three elements of self-driving vehicles--adaptive cruise control, lane-keeping, and collision-avoidance systems--are in various stages of development and deployment, but fully autonomous cars will need improved sensors; electronically controlled steer-, brake-, throttle-, and suspension-by-wire systems; faster and more powerful computers; and route-mapping. Drive-by-wire systems send messages to navigational controls--steering wheel, brakes, engine, and so on--by reading the movements of the driver's hands and feet with sensors embedded in the accelerator pedal, the brake pedal, and elsewhere; by-wire will be a key enabling technology for collision avoidance (a caricature of such a control loop follows this item). The by-wire systems will be controlled by the processors that serve as the vehicle's "brain." In addition, engineers say better software will be needed to give cars a clear understanding of the obstacles motorists face every day. The biggest impediment to the creation of self-navigating autos will be motorists' reluctance to cede control of their cars to computers; engineers think drivers will be conditioned to accept automated navigation through the gradual introduction of incrementally advanced technologies. The Defense Advanced Research Projects Agency is funding autonomous vehicle research with the goal of making a third of the military's transport vehicles driverless by 2015.
    Click Here to View Full Article
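
    The drive-by-wire loop described above (pedal sensors feed a processor, which commands the actuators electronically) can be caricatured as a short control loop. The sensor range, scaling, and safety clamp below are invented for illustration and bear no relation to any production system.

      # Caricature of a throttle-by-wire loop: read the accelerator-pedal
      # sensor, map it to a throttle command, apply a limit that a
      # collision-avoidance module could impose, and hand the result to the
      # actuator. All values and interfaces are hypothetical.
      from dataclasses import dataclass

      @dataclass
      class PedalSensor:
          raw: int   # hypothetical 0..1023 reading from the pedal position sensor

      def throttle_command(sensor, max_allowed=1.0):
          """Map a raw pedal reading to a 0.0-1.0 throttle command, clamped
          by whatever limit the collision-avoidance logic currently permits."""
          demand = min(max(sensor.raw, 0), 1023) / 1023.0
          return min(demand, max_allowed)

      # Normal driving: pedal half-way down -> roughly 50% throttle.
      print(throttle_command(PedalSensor(raw=512)))
      # Radar sees a slow car ahead: avoidance logic caps throttle at 20%.
      print(throttle_command(PedalSensor(raw=512), max_allowed=0.2))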

  • "New Styles in Storage Architecture"
    Bio-IT World (10/04); Salamone, Salvatore

    Storage performance is becoming vital as life science databases expand dramatically and computational resources grow more powerful, and life science organizations are now considering new storage architectures offered by upstart companies rather than industry heavyweights. One such company is Panasas, whose intelligent object-based storage system can dynamically distribute data across multiple storage blades and, through its file system, support parallel data transfers between storage blades and cluster servers, thus removing congestion (a miniature illustration of this striping idea follows this item); the product fulfilled the storage system criteria of Rockefeller University's Laboratory for Computational Genomics, which wanted a massive, inexpensive solution with high I/O. Also working in Panasas' favor are the credentials of CTO Garth Gibson, founder of the Parallel Data Lab at Carnegie Mellon University and a significant contributor to redundant array of inexpensive disks (RAID) technology. New-generation storage vendors combine leading-edge technology with flexibility, as demonstrated by their willingness to customize products to clients' preferences. Netezza, for example, modified its Netezza Performance Server (NPS) for life science with input from experts such as J. Craig Venter Foundation CTO Marshall Peterson; the resulting NPS bioinformatics data warehouse can store terabyte-scale genomic databases and process sequence-analysis SQL queries with dedicated hardware and software. Traditional storage systems providers have not been idle: Virtually all the major vendors have nearly doubled their systems' I/O throughput and augmented products to address regulatory compliance and data protection issues. But high-performance storage systems are useless unless they interact seamlessly with the computing systems. "The key is to have a [system] that is stable and has high availability," notes Michelle Butler of the National Center for Supercomputing Applications.
    Click Here to View Full Article
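
    The parallel-transfer idea behind object-based systems such as the one described above (spread an object's data across many storage blades so clients can read stripes from all of them at once) can be shown in miniature. The blade count, stripe size, and in-memory "blades" below are invented for illustration and are not Panasas' design.

      # Miniature illustration of striping an object across storage blades
      # so reads can proceed from all blades in parallel. Blade count and
      # stripe size are invented; real systems add parity and redundancy.
      STRIPE_SIZE = 4          # bytes per stripe (tiny, for the example)
      NUM_BLADES = 3

      blades = [dict() for _ in range(NUM_BLADES)]   # stand-ins for storage blades

      def write_object(name, data):
          """Round-robin the object's stripes across the blades."""
          for i in range(0, len(data), STRIPE_SIZE):
              stripe_no = i // STRIPE_SIZE
              blades[stripe_no % NUM_BLADES][(name, stripe_no)] = data[i:i + STRIPE_SIZE]

      def read_object(name):
          """Reassemble the object; each blade's stripes could be fetched in parallel."""
          parts, stripe_no = [], 0
          while True:
              blade = blades[stripe_no % NUM_BLADES]
              if (name, stripe_no) not in blade:
                  return b"".join(parts)
              parts.append(blade[(name, stripe_no)])
              stripe_no += 1

      write_object("genome.fa", b"ACGTACGTACGTACGTAC")
      print(read_object("genome.fa"))   # b'ACGTACGTACGTACGTAC'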

  • "Understanding Spyware: Risk and Response"
    IT Professional (10/04) Vol. 6, No. 5, P. 25; Ames, Wes

    The evolution of spyware may be outpacing that of spyware countermeasures, but a newly recognized profit potential in anti-spyware products should drive countermeasures to mature and close the gap between threat and solution, provided IT professionals become familiar with how spyware operates. Spyware comes in three flavors, each carrying its own level of user risk. The simplest, basic cookie identification, lets one specific Web site recognize returning users and associate them with the stored information they provided on their first visit. Associated cookies, the next step up the spyware pyramid, can record user activity and captured keystrokes and share that data among sites affiliated with a spyware data server, usually for targeted advertising (a toy model of this cross-site correlation follows this item); some of the data that can be collected--credit card information, passwords, and so on--is highly sensitive, and the collection is usually invisible to users. Application-based spyware carries the highest risk: Users cannot block it, it can commandeer the user's system, find and record any desired data, and transmit that data to an outside source, and it does not require the user to share data with a member site. Application-based spyware infiltrates systems by riding on popular or desired applications that users download; by offering built-in utility services such as the storage and retrieval of passwords, accounts, addresses, and phone numbers; or by executing as a Java or ActiveX Web site application, which hides the infiltration from the user entirely. Blending several of these methods can allow successful penetration even where user restrictions are in place. Possible indications of spyware include unexplained disk activity and CPU use, previously nonexistent software conflicts, slow response, and system failures.
    Click Here to View Full Article
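
    The "associated cookies" tier described above works because many affiliated sites embed content from the same tracking server, which sets one cookie in the browser and then sees that cookie again on every affiliated site. The toy model below (invented site and tracker names, no real network code) only illustrates how that correlation happens.

      # Toy model of cross-site tracking with an "associated" cookie: every
      # affiliated page embeds a resource from the same tracking host, so
      # the one cookie that host set is returned on each visit, letting it
      # build a browsing profile. Sites, tracker, and IDs are invented.
      import itertools

      class Tracker:
          def __init__(self):
              self._ids = itertools.count(1)
              self.profiles = {}                     # visitor id -> sites seen

          def hit(self, cookie, visited_site):
              """Called when a page on visited_site embeds the tracker's pixel."""
              if cookie is None:
                  cookie = "uid-%d" % next(self._ids)  # first sighting: set a new cookie
              self.profiles.setdefault(cookie, []).append(visited_site)
              return cookie                          # browser stores it for next time

      tracker = Tracker()
      browser_cookie = None                          # the browser's cookie jar for the tracker
      for site in ["news.example", "shop.example", "health.example"]:
          browser_cookie = tracker.hit(browser_cookie, site)

      # One identifier now ties together activity on all affiliated sites.
      print(browser_cookie, tracker.profiles[browser_cookie])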


 

 