
       HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM. To send comments, please write to [email protected].
Volume 5, Issue 570:  Wednesday, November 12, 2003

  • "Caught By the Act"
    Washington Post (11/12/03) P. E1; Ahrens, Frank

    The Digital Millennium Copyright Act (DMCA) is a source of frustration for companies and individuals targeted by copyright holders for allegedly breaking the law--specifically, the provision that bans people from bypassing anti-copying measures ostensibly put in place to protect intellectual property from piracy. These businesses and individuals argue that circumvention of such measures should be allowed under the terms of fair use, and legislators such as Reps. Rick Boucher (D-Va.) and Zoe Lofgren (D-Calif.) have introduced bills to rewrite the DMCA to reflect this concern. Among the well-publicized cases of copyright holders leveraging the DMCA to halt copy-control circumvention is Lexmark International's suit against Static Control Components, a business that remanufactures laser-printer toner cartridges. Lexmark alleged that, in order to enable the remanufactured cartridges to work, Static Control unlawfully copied computer code employed by chips in Lexmark cartridges that establish compatibility between the cartridges and printers. A court granted Lexmark an injunction against Static Control in February, while Static Control's request for a fair-use exception to the DMCA was rejected by the U.S. Copyright Office in late October. Static Control CEO Ed Swartz worries that Lexmark's triumph could set a precedent that would encourage draconian monopolization. Business Software Alliance attorney Emery Simon contends that such fears are exaggerated, arguing that the DMCA is set up to curb piracy, not trample fair-use rights. Nevertheless, the Electronic Frontier Foundation writes, "As an increasing number of copyright works are wrapped in technological protection measures, it is likely that the DMCA's anti-circumvention provisions will be applied in further unforeseen contexts, hindering the legitimate activities of innovators, researchers, the press, and the public at large."
    Click Here to View Full Article

  • "Yet Another Rendition of Linux"
    Wired News (11/11/03); Baard, Mark

    Bruce Perens, acting executive director of the Desktop Linux Consortium, told attendees at the organization's first meeting in Tyngsboro, Mass., that some of the biggest companies in the world are backing a new version of Linux designed to counter Red Hat's enterprise Linux and serve consumers locked out by Red Hat's recent announcement that it would halt retail sales of a consumer version of Linux. UserLinux, whose corporate sponsors Perens did not name, will be based on Debian GNU/Linux, which is supported by about 1,000 developers; the operating system will also be free for unlimited use and will come with certification from large computer manufacturers. In return for donating developers to the project, UserLinux participants will receive an operating system with limitless seats and options for paid technical support from many rival service providers. UserLinux should be available within half a year, and discs carrying a consumer version of UserLinux should follow not long after. Perens said that UserLinux sponsors will welcome an operating system that is cheaper, less virus-susceptible, less buggy, and easier to implement and maintain than Microsoft products, and that comes with less "odious" terms of use than commercial Linux versions. Anyone who contributes to the promotion of UserLinux will receive free membership in the Desktop Linux Consortium, consortium organizers declared. However, Ximian co-founder Nat Friedman is skeptical of many growth predictions made by Linux advocates and industry analysts. "The press likes to portray this as some kind of David and Goliath story, but that sets up unrealistic expectations," he contends. Novell's recent buyout of SuSE Linux and Red Hat's discontinuation of commercial consumer Linux sales have generated concern among Linux proponents.
    Click Here to View Full Article

  • "System Halts Computer Viruses, Worms, Before End-User Stage"
    Innovations Report (11/12/03)

    Washington University computer scientist John Lockwood led research on a hardware-based network protection system that guards against computer viruses and worms by scanning all incoming traffic. The Field-programmable Port Extender (FPX) technology uses field-programmable gate array (FPGA) circuits running in parallel to rapidly scan data packets and quarantine potential malware; a toy software analogue of this kind of signature scanning appears after this item. Because the chip arrays are programmable, administrators can update virus and worm definitions as needed and adjust the detection devices to focus on certain characteristics. FPX is capable of scanning traffic at a rate of 2.4 billion bits per second, and Lockwood says every individual byte is scanned. Existing network security, such as firewalls, cannot sufficiently defend against a virus or worm attack once computers inside the perimeter are infected, says Lockwood. The increasing speed of computers and the reach of the Internet also add to the threat of malware infection, as evidenced by the rapid spread of SoBig.F, which infected more than 1 million machines in the first 24 hours and reached more than 200 million computers in one week. In addition, today's security depends too much on end users, who often do not spend adequate time updating their software defenses. Such a system also places extensive burdens on IT departments, which spend considerable resources tracking down and fixing vulnerable software. Placing FPX devices at critical network junctions would create subnets and effectively protect large networks, says Lockwood. The FPX system is a rack-mountable device that can be installed in network closets and set either to quietly eliminate malware or to alert the end user through a pop-up message; administrators configure the system using a Web-based interface.
    Click Here to View Full Article
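
    A toy software analogue of this kind of signature scanning, written in
    Python. The byte signatures below are invented placeholders, and the real
    FPX performs the matching in parallel FPGA hardware rather than software:

        # Minimal sketch of signature-based payload scanning (illustrative only).
        SIGNATURES = {
            b"\x90\x90\x90\x90": "NOP sled (hypothetical)",
            b"EVIL_WORM": "demo worm marker (hypothetical)",
        }

        def scan_packet(payload: bytes) -> list:
            """Return the names of all signatures found in a packet payload."""
            return [name for sig, name in SIGNATURES.items() if sig in payload]

        def filter_traffic(packets):
            """Yield clean packets; quarantine (drop) any that match a signature."""
            for payload in packets:
                hits = scan_packet(payload)
                if hits:
                    print("quarantined:", ", ".join(hits))
                else:
                    yield payload

        clean = list(filter_traffic([b"GET / HTTP/1.1", b"xxEVIL_WORMxx"]))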

  • "It's a Vision Thing"
    Financial Times-IT Review (11/12/03) P. 1; Nuttall, Chris

    Information visualization is being heralded as a solution to the data overload faced by workers and consumers, and many companies are developing tools such as heat maps, expanding family trees, and animations that render information in less confusing, more navigable ways. This is of critical concern as the amount of stored data swells and the need to analyze data in or near real time grows. A new report from International Data Corp. (IDC) indicates that information visualization aids will one day be required by all industry professionals, and projects that the interactive data visualization tool market will be worth $2.2 billion in four years. The diagrammatic technique for establishing relationships between seemingly random pieces of data--a longtime tool of law enforcement--has been adapted into software for corporate use, one of the more recent examples being a desktop program from anacubis, which Bloor Research's Robin Bloor says can be especially helpful in planning corporate mergers. "The big hit from this type of analysis is that you can often see things that are not obvious just from perusing the data," explains Bloor; this ability to uncover new data relationships can help open up new areas of academic research as well. A visualization tool from Xrefer taps data from 150 scientific, literary, technological, and business-related works, using XML tags to generate linked clusters that form a clickable map (sketched in miniature below). Another visualization tool from Fractal:Edge depicts data as labeled, color-coded concentric circles. A major benefit of visualization tools for the enterprise is their ability to facilitate better collaboration between a business's various departments. "Information vendors have started to adopt XML--adding tags so that it's possible to add a visualization layer where before it was a real challenge," observes anacubis general manager Greg Coyle.
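
    A miniature Python sketch of why XML tags make a visualization layer
    possible: tagged records translate directly into the nodes and links a
    renderer would draw as a clickable map. The tags and entries below are
    invented for illustration:

        import xml.etree.ElementTree as ET

        doc = ET.fromstring("""
        <entries>
          <entry id="turing"><related>enigma</related><related>colossus</related></entry>
          <entry id="enigma"><related>colossus</related></entry>
        </entries>""")

        nodes, links = [], []
        for entry in doc.iter("entry"):
            nodes.append(entry.get("id"))
            for rel in entry.iter("related"):
                links.append((entry.get("id"), rel.text))

        print(nodes)  # cluster nodes: ['turing', 'enigma']
        print(links)  # edges for the linked-cluster map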

  • "Life After Batteries: What's Next for Notebooks"
    E-Commerce Times (11/11/03); Millard, Elizabeth

    Battery limitations have been a sore point for notebook computer users, who face the catch-22 of settling for bare-bones notebooks to maximize battery life or contending with shorter battery life in order to enjoy more notebook features. Notebook manufacturers such as Alienware often prioritize performance and user experience over battery life, since gamers and other users value those qualities more; but the battery constraints of mobile computers are a frequent source of frustration for travelers and on-the-go users. The IEEE has been working on a solution to the conflict between battery life and notebook features, and recently issued a draft of a new standard that boosts the reliability of rechargeable lithium-ion and lithium-ion polymer batteries for mobile computers. The standard, which is scheduled to be ready by the first half of next year, "seeks to improve user experience by addressing the entire system from individual cells to the overall device," explains Jeff Layton, who chairs the working group that developed the specification. Other radical battery concepts under development include Cymbet's design for a thin, flexible lithium rechargeable battery that CFO Brian Shiffman claims can be made "as small as a pencil point or as large as a table." The Cymbet battery's shape can be configured according to need, and it withstands as many as 70,000 recharge cycles. Meanwhile, Saint Louis University researchers recently announced the development of a battery that is driven by alcohol and enzymes, although such a device may not be ready for consumer use for several years. Shiffman believes the market for unique batteries is largely virgin territory--and its growth potential is enormous, considering how frustrated users are by notebook batteries' current restrictions.
    Click Here to View Full Article

  • "Wireless Mesh Networking Gathers Momentum"
    EE Times (11/10/03); Mannion, Patrick

    Wireless mesh networking was bolstered by two developments this month: the launch of a new startup called PacketHop, and Intel's new business partnership with Zensys for wireless home automation networking. Commercial wireless mesh networking is a by-product of military research and seeks to make wireless networks decentralized and resilient to failure in the same way the Internet is. Instead of relying on a centralized set of devices, each node acts as a network peer that relays traffic for other devices (the toy route-discovery sketch after this item illustrates the idea). The difficult part of designing wireless mesh networks lies in defining how to most efficiently route data, ensure security, and maintain quality of service; because such mesh networks sometimes rely on mobile devices, efficiency is an especially important issue. PacketHop, a spinoff from SRI International, claims its routing protocol provides the best method for mesh networking: The company is initially targeting the Wi-Fi market with its technology, which was first developed to dynamically recreate routes for fixed wireless networks and requires software to be embedded in client devices. Meanwhile, Zensys and Intel are jointly developing a universal home networking platform that would put mundane devices such as thermostats and alarms on the same control network as more advanced equipment such as home electronics, PCs, mobile phones, and personal digital assistants. Zensys' Michael Dodge says the technology is an alternative standard for low-cost home networking and is already viable, unlike the ZigBee standard, which has not yet been ratified.
    Click Here to View Full Article
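
    The core mesh idea--every node doubles as a relay, so a route is simply a
    path through the peer graph--can be illustrated with a plain breadth-first
    search in Python. The topology is invented, and this is not PacketHop's
    proprietary protocol:

        # Toy mesh route discovery: fewest-hop path through a peer-adjacency graph.
        from collections import deque

        def find_route(links, src, dst):
            """Return a hop-by-hop route from src to dst, or None if unreachable."""
            frontier, visited = deque([[src]]), {src}
            while frontier:
                path = frontier.popleft()
                if path[-1] == dst:
                    return path
                for peer in links.get(path[-1], ()):
                    if peer not in visited:
                        visited.add(peer)
                        frontier.append(path + [peer])
            return None

        # If a relay fails, rerunning the search over the updated graph finds a
        # new route--the decentralization and resilience described above.
        links = {"laptop": ["phone"], "phone": ["pda"], "pda": ["gateway"]}
        print(find_route(links, "laptop", "gateway"))  # ['laptop', 'phone', 'pda', 'gateway']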

  • "Can Robots Become Conscious?"
    New York Times (11/11/03) P. D9; Chang, Kenneth

    Whether robots can truly become conscious is a matter of debate, given the profound lack of elemental understanding about consciousness itself. The basic measurement of machine intelligence, the Turing test, has yet to be passed, although many think this goal will be reachable within a few decades. Dr. Hans Moravec of Carnegie Mellon University defines consciousness as a reflection of behavior, in which case people are little more than complex machines, and the development of a machine that can precisely emulate human behavior is therefore within the realm of possibility. Dr. David J. Chalmers of the University of Arizona's Center for Consciousness Studies considers consciousness to be an "irreducible" characteristic, yet makes no fundamental distinction between people and machines, arguing that human-machine conversations about subjects ranging from sports to philosophy will make people "certain [machines] are as conscious as other people." In his book "Shadows of the Mind," Oxford mathematician Dr. Roger Penrose takes a contrarian position, claiming that consciousness is the product of quantum-mechanical effects in people's brains that no computer can ever hope to match.
    Click Here to View Full Article

  • "Agenda Lacking as U.N. Info Society Summit Looms"
    Associated Press (11/08/03); Jesdanun, Anick

    Though more than 50 heads of state are set to take part in the upcoming U.N. World Summit on the Information Society (WSIS), the issues for discussion are not yet finalized pending further review at meetings this week. The interests involved have indicated that establishing basic governance standards will be difficult. Among the major issues set for discussion are whether ICANN should continue to handle key decisions about the Internet, to what extent the WSIS should support freedom of expression, whether the group should establish a distinct funding source to address the digital divide, and how to handle questions about security, spam, and the publishing of scientific journals. Nitin Desai, U.N. Secretary-General Kofi Annan's special adviser to the summit, says, "Probably what will happen is more a sketch of what needs to be done." Some critics are more pessimistic about the summit: Andrew McLaughlin, formerly of ICANN, says the meeting will be a "blabberfest that is not likely to produce results."
    Click Here to View Full Article

  • "Virtual TV Studio Gets Real"
    New Scientist (11/10/03); Fox, Barry

    German and Italian researchers have spent the past two years developing the Origami virtual TV studio with the BBC, which has established such a facility in Surrey, England, and plans to debut the system at the First European Conference on Visual Media Production in March. Origami is designed to allow flesh-and-blood actors to interact with artificial characters and environments in real time while eliminating the drawbacks of "chromakey" special effects. The traditional approach--in which an actor stands before a blank background studded with reflective beads while filmed by a stationary camera equipped with blue LEDs that bounce light off the background and silhouette the actor--prevents the actor and director from viewing the scene as it is filmed. The Origami scheme places the actor within an "acting space" watched by a dozen cameras that produce a 3D virtual simulation of the actor; computer-generated landscapes and characters are projected onto the walls and floor so the actor can better interact with them, and collision-detection software allows actors to realistically manipulate virtual objects. The scene is recorded through a primary camera that can change position during the shot, while a smaller camera directed toward the ceiling monitors the main camera's position and angle relative to a series of stationary reflective discs. This enables the computer to shift the perspective of the digital environment accordingly, and the director can view the results in real time. The BBC says that two Origami studios could allow actors separated by great distances to interact in the same artificial landscape.
    Click Here to View Full Article

  • "Scientists Seek to Plug Gaps in Computer Security"
    NZZ Online (11/10/03); Capper, Scott; Tonkin, Samantha

    The Swiss Federal Institute of Technology is setting up a new computer security center that will focus on developing better firewalls, detection systems, and cryptography. Students doing research at the new Zurich Information Security Center (ZISC) will work full-time to stay one step ahead of would-be hackers. Although some hackers attempt to break security for the challenge, others do so in order to inflict damage; in Switzerland, the Swiss Post was recently struck by a computer worm that shut down online services and cash points across the country, and the Swiss Federal Railways' Web site and ticket machines were attacked by a computer worm last month. There are fears that organized crime or terrorist organizations could exploit computer security flaws in the future. ZISC scientist Paul Sevinc says computer security research currently requires a reassessment of fundamental design tenets for security software. ZISC intends to build the most secure system components, such as unbreakable cryptography, but Sevinc says the only true test for any computer security system is in the public domain: If a security measure is compromised, ZISC researchers will look at how to improve on their original idea. The center is backed by Credit Suisse, IBM, and Sun Microsystems, but Sevinc says PC users will benefit as well through improved firewall and virus-scanner products.
    Click Here to View Full Article

  • "100,000 Ballots to Be Cast Online"
    Canadian Press (11/10/03); Horsey, Jen

    CanVote President Joe Church calls the Internet-based voting by residents of Ontario's easternmost areas in the province's municipal elections a success. "I believe we're the first to do a real full Internet election in North America," says Church, head of the eastern Ontario startup that developed the technology for casting ballots over the Internet; no conventional paper ballot was used. Registered voters in 11 area municipalities were given the opportunity to vote in the municipal elections using the Internet or the telephone, and about 100,000 voters in Prescott-Russell, Stormont, Dundas, and Glengarry counties registered to cast their votes online. Voters received a PIN with their registration card, and have been pointing their Web browsers to the CanVote Web site to cast their ballots since Nov. 5. More than 30 percent of eligible voters in the participating areas had voted by Nov. 9. The CanVote system uses security measures similar to those used in online banking and credit-card transactions. Church plans to make his system available for provincial and federal by-elections, and to introduce the technology in the United States.
    Click Here to View Full Article

  • "Bad Journalism, IPv6, and the BBC"
    CircleID (11/07/03); McLaughlin, Andrew

    Andrew McLaughlin asserts that, as a former Democratic lawyer in the U.S. House of Representatives, he saw that much of the information published as truth by the press was actually not accurate. McLaughlin contends that reporting on the Internet seems to suffer from the same trends today--and points specifically to a recent BBC story by Ian Hardy concerning IPv6 that concluded that Internet Protocol numbers would run out sometime in 2005. McLaughlin counters that no evidence supports the assertion that IPv4 addresses will be gone by 2005 and that RIPE NCC, the other regional address registries, and IANA all could offer information showing this is not the case; recent numbers released by the regional address registries and Geoff Huston's paper "IPv4--How Long Have We Got?" also counter the claim. McLaughlin also strongly objects to the BBC piece's assertion that experts are working to establish IPv6 in order to generate 64 billion additional IP addresses, because IPv6 implementation began four years ago and the protocol provides a vastly larger number of addresses than 64 billion (see the back-of-envelope arithmetic after this item). Other statements McLaughlin finds objectionable include that any person who logs onto the Internet will automatically get an IP address, that the global distribution of existing IP addresses is biased with most in the United States, that more than two-thirds of all IP addresses have been acquired by U.S. firms, that Level 3 Communications owns more addresses than all of Asia, that purchases of devices in Asia will make IP addresses unavailable, and that everyone will require a permanent IP address as 3G wireless computing becomes more widespread. These assertions, writes McLaughlin, range from slightly misleading to inaccurate. McLaughlin worries that the effect of such an article will be to needlessly alarm people, even though the real issues surrounding IPv6 are interesting and relevant.
    Click Here to View Full Article
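
    The back-of-envelope arithmetic behind McLaughlin's objection, in Python:
    IPv4 offers 2^32 addresses and IPv6 offers 2^128, a figure that dwarfs the
    BBC's "64 billion":

        ipv4_total = 2 ** 32    # 4,294,967,296 -- about 4.3 billion
        ipv6_total = 2 ** 128   # about 3.4e38
        bbc_claim = 64 * 10 ** 9

        print(f"IPv4 addresses:  {ipv4_total:,}")
        print(f"IPv6 addresses:  {ipv6_total:.3e}")
        print(f"IPv6 vs. IPv4:   {ipv6_total / ipv4_total:.3e}x")   # ~7.9e28x
        print(f"IPv6 vs. claim:  {ipv6_total / bbc_claim:.3e}x")    # ~5.3e27x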

  • "Rage Against the (Chess) Machine"
    Wired News (11/10/03); Kahney, Leander

    Chess champion Garry Kasparov is set to do battle with the world's preeminent chess program this month in New York in what has become an annual test of man versus machine. The Fritz chess program has been outfitted this time with a 3D interface, so that Kasparov, equipped with 3D glasses, will view a 3D version of the board and chess pieces on a monitor and give voice commands to move pieces and start and stop the clock. Despite the gimmick, chess writer Mig Greengard says the four-game match is another serious test of human skill against sheer computational power. Greengard says the Fritz program, a version of which is available for PCs, is the standard in computer chess and is used by the top chess players worldwide. The version Kasparov will face is running on a server with four Xeon processors capable of processing approximately 3 million moves per second. Fritz drew against world champion Vladimir Kramnik in the "Brains of Bahrain" match of 2002, while Kasparov drew against the Deep Junior program earlier this year. Despite the continuing software and hardware improvements made to chess-playing machines, Greengard cites evidence that human players are beginning to adapt and develop better strategies for playing against computers: ChessBase's Jeff Sonas analyzed matches between computer and human players and found the trend of humans losing to computers reversing. In Kasparov's match, his concentration may be hampered by the 3D interface, but on the other hand, the relatively short four-game format should not tax his stamina too much.
    Click Here to View Full Article

  • "The Science of 'He Shoots, He Scores!'"
    Vancouver Sun (11/10/03); Ward, Doug

    The Acquisition, Querying, and Prediction of Motion Trajectories project at the University of British Columbia aims to develop a computer system that can track the movement of National Hockey League players so that coaches can see how players respond and how these responses contribute to the success or failure of certain plays. The system, whose development is being funded by the Institute for Robotics and Intelligent Systems, could also prove useful for sports broadcasters, gaming and film industries seeking more realistic graphical renderings of people, and robots that mimic human movements. The system's software component was developed by Kenji Okuma, a Japanese graduate student at UBC who has studied hundreds of hours of videotaped hockey footage. The broadcasts, which afford only a side or end view of the rink, are digitized by the software into a top-down perspective (the sketch after this item shows one standard way to perform such a view conversion); Okuma explains that this point of view allows maneuvers to be analyzed and future patterns anticipated. "You can see that if the players move one way, other players react a certain way," he says. Okuma notes that computers, unlike humans, can track all players on the ice at the same time with little difficulty. Hockey's popularity in British Columbia and the existence of computerized movement-tracking systems for other sports were the reasons the sport was chosen for the UBC project, according to Okuma, who collaborated with Jim Little, David Lowe, Robert Woodham, and Raymond Ng on the system.
    Click Here to View Full Article
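
    One standard way to perform such a view conversion is a planar homography
    that maps rink landmarks seen in the broadcast frame onto a top-down rink
    diagram. The Python/OpenCV sketch below uses invented pixel coordinates,
    and the UBC system's actual method may differ:

        import numpy as np
        import cv2

        # Four rink landmarks in broadcast-image pixels (hypothetical values)...
        image_pts = np.float32([[120, 300], [520, 290], [600, 460], [40, 470]])
        # ...and the same landmarks on a 200x85-ft top-down rink plane.
        rink_pts = np.float32([[0, 0], [200, 0], [200, 85], [0, 85]])

        H = cv2.getPerspectiveTransform(image_pts, rink_pts)

        # Project a detected player position from the frame onto the diagram.
        player_in_frame = np.float32([[[330, 380]]])
        player_on_rink = cv2.perspectiveTransform(player_in_frame, H)
        print(player_on_rink)  # approximate (x, y) on the rink plane, in feet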

  • "Awareness of Computer-Security Threats Is Still Inadequate, Report Warns"
    Chronicle of Higher Education (11/14/03) Vol. 50, No. 12, P. A12; Carnevale, Dan

    Education-technology consortium Educause has released a new study offering recommendations that colleges can use to secure their computer networks. Titled "Information Technology Security: Governance, Strategy, and Practice in Higher Education," the report provides a benchmark of practices that colleges are using to protect their computer systems from hackers and viruses. "IT security, in the end, comes down to people's behavior," according to the report. "Most hazards facing higher education fall into the gray area of unintended mistakes made by colleagues within our institutional bounds." The report, which is based on a survey of 435 colleges, interviews with technology administrators, and case studies of four institutions, finds that colleges would do well to create awareness programs, which could remind computer users to log out of secure networks, and to embrace some practices used in the business world, such as scanning all PCs connected to a network. However, the report shows that some professors worry that stronger network security would hamper their research and their ability to share ideas--for example, if firewalls are designed to block transmissions from regions, such as Asia, that produce many viruses and worms. Although 62 percent of colleges require computers connected to campus networks to be free of security holes, only 33 percent provide security-awareness programs for the campus community.
    Click Here to View Full Article
    (Access for paying subscribers only.)

  • "FrankenPatch"
    CIO (11/01/03) Vol. 17, No. 3, P. 100; Berinato, Scott

    Shoddy patching is often blamed for outbreaks of Slammer and similar worms, when in fact patching's inherent complexity and fragility are responsible. Experts such as Cigital CTO Gary McGraw say the disclosure of software vulnerabilities is practically an invitation for hackers to exploit them, and hackers are more than likely to succeed because few people install patches. The rush to create patches after a flaw has been disclosed means there is little time to test the patches to make sure they work and do not conflict with other patches, while Shawn Hernan of the Software Engineering Institute's CERT Coordination Center notes that writing patches usually falls to entry-level maintenance coders who, in addition to being under the gun, lack the authority to audit code or check for recurrences. Patch infrastructure and patch management also suffer from a lack of standardization, which only serves to confuse customers further. "It would take an industry body--a nonprofit consortium-type setup--to create standard naming conventions, to production test an insane number of these things, and to keep a database of knowledge on the patches so I could look up what other companies like mine did with their patching and what happened," comments W.P. Carey CIO Mykolas Rambus. Although he suggests that the Slammer attack has spurred vendors to try to improve the patch process, he points out that it is still the customers who ultimately pay. There are two schools of thought about solving the patching muddle: One is to accelerate patching through automated measures such as patch management (PM) software; the second maintains that only a small percentage of vulnerabilities lead to attacks, so it is more effective for companies to institute best practices and hire third parties to determine which patches they need to deploy. Neither approach is fully satisfying, however--PM carries the risk of installing buggy patches, while the patch-less strategy uses an outdated form of risk analysis and generates feelings of insecurity.
    Click Here to View Full Article

  • "Securing Teleworker Networks"
    Business Communications Review (10/03) Vol. 33, No. 10, P. 28; Phifer, Lisa

    Teleworking offers convenience for employees and more productivity at lower cost for employers, but it also constitutes a security risk: The cost-cutting measure of having teleworkers use personal PCs instead of leased laptops results in a loss of corporate control over IT resources and reduced trustworthiness of remote nodes at the far end of the virtual private network (VPN) tunnel. Also adding to the risk is the growing likelihood that the node is a remote network, or at least part of one. Furthermore, international "war-driving" surveys reveal that at least two out of three wireless local area networks (WLANs) are insecure. A key corporate strategy for improving network security is increasing teleworkers' awareness of the risks inherent in always-on broadband, WLANs, network resource sharing, virus and worm infections, and Trojan horses, and educating them on measures to patch such vulnerabilities. Security threats and at-risk resources must be thoroughly assessed, while teleworker security policies must be reevaluated to see whether any updating is necessary, especially as new threats emerge. Implementing countermeasures such as wireless-aware SOHO firewalls and routers, airlink security, desktop security software (anti-virus scanners, enterprise-grade desktop firewall suites, etc.), and VPN clients can lower security risks. A teleworker network can be fortified by using MAC access-control lists, authentication, and encryption to thwart WLAN intrusions; other helpful strategies are to ignore NetBIOS probes, block sharing via the WLAN and the Internet, and share locally only while actively linked to the VPN. A company must balance cost and effectiveness by taking into account implemented measures, business requirements, and security goals.

  • "Everyone's a Programmer"
    Technology Review (11/03) Vol. 106, No. 9, P. 34; Tristram, Claire

    Intentional Software co-founder and billionaire Charles Simonyi proposes a radical rethinking of software, which has become too complex to understand--a disadvantage that has led to unfixable software failures, abandoned projects, and tens of billions of lost dollars. Simonyi believes the solution lies in a simple yet powerful coding methodology both programmers and users can understand. The first step is to examine the flaws in existing programming practice, and Simonyi thinks most failed software projects stem from developers having to do three jobs simultaneously: They must comprehend the clients' needs, which are often complicated; they must convert these requirements into computer-readable algorithms and interfaces; and they must produce flawless, bug-free code with machine-like precision--an impossibility, given that human beings are inherently error-prone. Simonyi seeks to strip the first two tasks down to the bare essentials and eliminate the third altogether by automating programming's drone-like elements and creating an intuitive programming interface, thus leaving programmers free to focus on program design. Key to this is the development of a software "generator" instructed through a simple interface or "modeling language"; Simonyi's goal is to tightly integrate the model and the programming so that users and programmers can shift between many different perspectives of the models and change the programs as they wish through slight modifications. The core infrastructure of intentional programming is aspect-oriented programming, a method that lets programmers quickly alter all instances of related commands (the sketch after this item gives a rough flavor of the idea). Simonyi says the new program-generator model resembles a PowerPoint interface, with which people can compose presentation slides by pasting text, images, or charts into specific areas of an intuitive virtual environment. Simonyi's colleagues and rivals are taking different approaches to software modeling: James Gosling of Sun Labs, for example, supports a technique to plug existing code into a graphical modeling interface. Simonyi predicts that Intentional Software will not roll out a commercial intentional-programming product for at least two years, but once such products are on the market, programmers will be able to build far more complex programs than current techniques allow.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
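
    Python decorators give a rough, hypothetical flavor of the aspect-oriented
    idea: one cross-cutting "aspect" (here, call tracing) is defined once and
    woven into every related operation, so changing it in one place changes
    all instances. The account functions are invented for illustration:

        import functools

        def traced(func):
            """The aspect: log every call to any decorated operation."""
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                print(f"calling {func.__name__} with {args}")
                return func(*args, **kwargs)
            return wrapper

        @traced
        def open_account(owner):
            return {"owner": owner, "balance": 0}

        @traced
        def deposit(account, amount):
            account["balance"] += amount
            return account["balance"]

        deposit(open_account("alice"), 100)  # both calls traced by the one aspect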

 
 