ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM.

To send comments, please write to [email protected].

Volume 5, Issue 547: Friday, September 19, 2003

  • "Study: IT Worker Unemployment at 'Unprecedented' Levels"
    Computerworld (09/17/03); Thibodeau, Patrick

    A new report from the nonprofit Commission on Professionals in Science and Technology (CPST) concludes that IT worker unemployment reached an all-time high of 6 percent this year, a figure consistent with the flood of applications IT managers report receiving in response to vacancy announcements. The IT unemployment rate was 4.3 percent last year and as low as 1.2 percent in 1997. The study estimates that approximately 150,000 IT jobs were lost in 2001 and 2002, with programming positions accounting for about two-thirds of the losses. Randy Rosenthal of Southwest Securities Group noted that there is no longer a dearth of qualified workers for certain jobs, as evidenced by the huge number of applicants with multiple degrees his company has seen. At an AFCOM conference in Dallas, several managers reported layoffs or outsourcing as a result of data center consolidation, and predicted that the emergence of improved automation and "self-healing" applications will also affect IT careers; one manager recommended that prospective IT professionals build up their management skills. Still, the CPST study finds that the number of IT jobs grew 300 percent between 1983 and 2003. That growth was accompanied by increased dependence on foreign workers, who currently account for one-fifth of all U.S. IT labor, according to the report; the rise in the foreign workforce was fostered by expansion of the H-1B and L-1 visa programs. Despite the destabilization of the IT job market, the CPST notes that in the immediate future, regular turnover by itself will create more opportunities for prospective IT professionals. However, the report cautions that though the United States does not lack competent people, "those people may not be willing to conclude that long-run demands for their services will be good enough to support IT as a sensible career choice."
    Click Here to View Full Article

  • "New Findings Shake Up Open-Source Debate"
    NewsFactor Network (09/18/03); Martin, Mike

    A study by Oxford University theoretical physicists Damien Challet and Yann Le Du analyzes a "microscopic model of software-bug dynamics" in cathedral, bazaar, and closed-source software initiatives, and finds that such projects can arrive at a bug-free state even with lesser-quality programmers working on them. The researchers identify the elements of software-bug dynamics that shape the interaction between a program, its users, and its programmers; the probabilities that drive the model reflect the project's size, the number of users, the number of programmers, the number of reportedly "buggy" parts, and the role of the "maintainer," who judges whether or not a patch improves the code. The model defines each project as either open or closed, and operates on the assumption that cathedral open-source projects and closed-source projects share the same dynamics. Challet and Le Du extrapolate two phases from the model: an early phase in which users detect and report many glitches and the reportedly buggy parts outnumber the programmers, and a second phase in which the average number of bugs declines and the debugging process slows down, which is where open-source projects pull ahead of closed-source projects. The researchers conclude that "The quality of bazaar open-source project programmers does not need to be as high as those working on closed-source projects in order to achieve the same rate of convergence to the bug-free state." Challet and Le Du note that closed-source software developers could trump open-source developers by heeding two surprising conclusions of the study: that closed-source programmers should disregard bug reports on code that has already been modified, and that the process of verifying the presence of bugs slows down the second phase. (A toy simulation in the spirit of such a bug-dynamics model is sketched below.)
    Click Here to View Full Article
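
    What follows is a heavily simplified, hypothetical toy model in the spirit of the bug dynamics described above. It is not the Challet/Le Du model, and every parameter name (n_parts, p_find, p_fix, and so on) is an illustrative assumption: parts of a program start out buggy, users report bugs with some probability, and programmers whose patches succeed with probability p_fix gradually drive the project toward a bug-free state.

        import random

        def steps_to_bug_free(n_parts=200, n_users=50, n_programmers=10,
                              p_find=0.05, p_fix=0.8, seed=0, max_steps=10000):
            """Rounds of report-and-patch until no buggy parts remain (toy model)."""
            rng = random.Random(seed)
            buggy = set(range(n_parts))        # every part starts out buggy
            reported = set()
            for step in range(1, max_steps + 1):
                # Users stumble across parts; buggy ones get reported with p_find.
                for _ in range(n_users):
                    part = rng.randrange(n_parts)
                    if part in buggy and rng.random() < p_find:
                        reported.add(part)
                # Programmers attempt to patch reported parts; a patch works with p_fix.
                for _ in range(n_programmers):
                    if not reported:
                        break
                    part = reported.pop()
                    if rng.random() < p_fix:
                        buggy.discard(part)
                    else:
                        reported.add(part)     # failed patch: the part stays reported
                if not buggy:
                    return step
            return max_steps

        if __name__ == "__main__":
            for p_fix in (0.9, 0.5):           # more skilled vs. less skilled programmers
                print("p_fix =", p_fix, "->", steps_to_bug_free(p_fix=p_fix), "rounds")

    In this toy version, lowering p_fix lengthens but does not prevent convergence to the bug-free state, which mirrors the study's qualitative claim that lesser-quality programmers can still reach the same end point.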

  • "H-1B Hearing: Companies Say Foreign Workers Needed"
    IDG News Service (09/17/03); Gross, Grant

    The H-1B visa cap will fall from its current limit of 195,000 to its pre-dotcom-boom level of 65,000 on Oct. 1, and industry representatives testified before Congress on Sept. 16 to argue against such a reversion. Intel and Ingersoll-Rand officials told the Senate Judiciary Committee that such visas, which allow U.S. employers to bring in foreign IT workers, are necessary because there is a lack of qualified U.S.-born talent to fill critical IT positions. On the other hand, Institute of Electrical and Electronics Engineers president-elect John Steadman reported that hundreds of thousands of "degreed and capable U.S. engineers" are out of work, and the H-1B program is denying them jobs. Elizabeth Dickson of Ingersoll-Rand insisted, "It is hard to displace U.S. workers when you don't have any U.S. workers to choose from," and warned that the American IT industry could lag behind other nations unless it can continue to bring in the most talented professionals from across the globe. Committee Chairman Sen. Orrin Hatch (R-Utah), who sponsored the legislation that raised the H-1B cap to its current level, argued that the reversion would result in a drop-off in revenue derived from H-1B application fees--revenue that has been used to fund American training and scholarship programs. Steadman found an ally in Sen. Dianne Feinstein (D-Calif.), who cited examples in which the H-1B and L-1 visa programs were abused, while Sen. Jeff Sessions (R-Ala.) said the United States wants to avoid a market glut and at the same time treat all talented IT workers fairly. Hatch, meanwhile, wondered whether H-1B abuses alone were responsible for the U.S. unemployment situation, and was skeptical that allegations of employers hiring cheap labor via the H-1B program align with actual data.
    Click Here to View Full Article

  • "Data Privacy, Emergency Response, Weather Prediction to Benefit From IT Advances"
    Newswise (09/17/03)

    The National Science Foundation's Information Technology Research awards this year doled out over $169 million to eight major projects and hundreds of smaller projects, to be carried out over the next several years; the projects focus on collaborative, multidisciplinary work and aim for high-risk, visionary goals. A privacy infrastructure program based at Stanford University seeks to unify privacy protection by placing sensitive data in a central database, which has strict rules governing the use of such data by different parties. The project will seek to protect privacy while at the same time providing access to law enforcement for legitimate purposes. The largest of a slew of homeland security proposals comes from the University of California, Irvine, where researchers intend to create a better emergency response system that provides responders on the ground and decision-makers with unified access to pertinent information. A team led by the University of Oklahoma is developing grid computing weather forecasting technologies that will be more adaptive to the specific type of weather system and other circumstantial factors than are current modeling technologies, while UCLA researcher William Kaiser received funding to pursue mobile, self-aware sensor networks that monitor an outdoor area while moving along a network of light cables. Carnegie Mellon University is leading an effort to create a next-generation broadband infrastructure providing millions of American homes with high-speed Internet access that is more secure, scalable, and reliable and supports future applications. The University of New Mexico is heading a collaborative effort to map the historical genetic blueprint of life on Earth using supercomputers, and the University of California is leading a separate push to better study proteins and complex molecules in living cells. Finally, Rice University will build better software tools for writing grid computing applications.
    Click Here to View Full Article

  • "Self-Policing Added to Spam Bill"
    Washington Post (09/18/03) P. E1; Krim, Jonathan

    A provision recently inserted into antispam legislation sponsored by Reps. Richard Burr (R-N.C.), W.J. Tauzin (R-La.), and F. James Sensenbrenner Jr. (R-Wis.) would make bulk emailers exempt from penalties if they agree to regulate themselves. The requirement would involve the formation of a self-regulatory organization that uses an independent third party to give "legitimate" senders of commercial email an electronic seal of approval, but certain consumer groups, legislators, antispam organizations, and state prosecutors balk at the prospect. "[Bulk emailers] are writing the law so that it places them where they think they belong: Above it," declared Jason Catlett of Junkbusters. The underlying bill requires bulk emailers to comply with consumer requests to stop receiving unsolicited commercial messages and criminalizes both the electronic "harvesting" of email addresses and the masking of spammers' locations. Tauzin representative Ken Johnson argued that the provision is an improvement because it allows individual consumers to direct complaints to an authorized body rather than attempting to contact law enforcement agencies that may not respond. But a representative of Rep. Heather A. Wilson (R-N.M.), who is pushing for stricter antispam legislation, declared that the provision "continues to protect spammers at the expense of consumers." The antispam bill was drawing criticism even before the addition of the self-regulation provision: Some lawmakers and consumer organizations are worried that the bill would displace more stringent state regulations and prevent consumers from launching civil suits against spammers.
    Click Here to View Full Article

  • "Government, Industry Debate the Value of Common Criteria"
    Government Computer News (09/18/03); Jackson, William

    Although the Common Criteria security software evaluation standards are useful, they cannot ensure that such software will be trouble-free, according to government and industry representatives at a Sept. 17 hearing of the House Government Reform Subcommittee on Technology, Information Policy, Intergovernmental Relations, and the Census. The Common Criteria, which are designed to assess security software to see whether it fulfills vendors' claims or user requirements, are applied by accredited private labs and recognized by 14 countries. In charge of the program is the U.S. National Information Assurance Partnership, a joint venture between the National Security Agency and the National Institute of Standards and Technology (NIST). Edward Roback of NIST's Computer Security Division reported that the standard is "not a measure of how much protection the claimed security specification provides, nor does it guarantee that the product is free from malicious or erroneous code." Eugene Spafford of Purdue University noted that Microsoft's Windows 2000 operating system is consistently breached by worms despite its Common Criteria certification. National security systems and the Defense Department are required to use security products evaluated under the Common Criteria, but most witnesses at the hearing did not recommend instituting such a mandate government-wide, arguing instead that its application should be decided case by case. Robert G. Gorrie of the Defense Department's Information Assurance Program said that his department is collaborating with Homeland Security to consider the wider application of Common Criteria certification. The value of Common Criteria assessment also depends on an organization's size and on how burdensome the current evaluation process is for it.
    Click Here to View Full Article

  • "Disputes Erupt Over Service for Poor Internet Typists"
    New York Times (09/18/03) P. C3; Olson, Elizabeth

    Critics are coming down hard on VeriSign, which just unveiled its Site Finder service--designed to send users to an advertising-supported site when they make errors in typing domain names into their browsers. The nonprofit Internet Software Consortium is already offering a software patch to counteract Site Finder, saying it was responding to complaints that VeriSign is commandeering "mistake" traffic in hopes that users will click on advertiser-paid sites instead. Some critics want VeriSign stripped of its franchise, and major Internet portals are also displeased, since they offer similar services. VeriSign's Brian O'Shaughnessy says that Site Finder is supposed to be a tool to help users surf more effectively. Critics say that Site Finder hinders some ISPs' filtering of spam, and People for Internet Responsibility co-founder Lauren Weinstein contends that the service compromises Internet privacy. Many say that VeriSign has abused its role as the quasi-official administrator of the .com and .net domains. "You don't make changes that fundamental on the Internet without consulting with those who are running it," says Carnegie Mellon University professor David Farber, who wants VeriSign's franchise revoked. (A simple way to check for this kind of wildcard redirection is sketched below.)
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
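
    As a rough illustration of how one can check whether a top-level domain's registry has added a Site Finder-style wildcard, the hypothetical sketch below resolves a random, almost certainly unregistered name; under normal conditions the lookup fails, whereas a registry wildcard answers with an address. This is only an assumption-laden diagnostic sketch, not the Internet Software Consortium's patch, which works inside the BIND name server itself.

        import random
        import socket
        import string

        def tld_has_wildcard(tld):
            """Return True if a random, unregistered name under the TLD resolves."""
            label = "".join(random.choices(string.ascii_lowercase, k=20))
            name = label + "." + tld
            try:
                addr = socket.gethostbyname(name)     # should fail with NXDOMAIN
            except socket.gaierror:
                return False                          # normal behavior: no wildcard
            print(name, "unexpectedly resolved to", addr)
            return True                               # a registry wildcard answered

        if __name__ == "__main__":
            for tld in ("com", "net", "org"):
                status = "wildcard detected" if tld_has_wildcard(tld) else "behaves normally"
                print(tld + ":", status)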

  • "Embedded Guru Advocates 'Bug-Free' Software"
    EE Times (09/16/03); Mokhoff, Nicolas

    Embedded system designer Jack Ganssle delivered a keynote address at this week's Embedded Systems Conference in Boston in which he blamed "benign negligence" for system failures attributed to shoddy software. He said three factors contribute to these failures: engineers writing poor code, software that is fundamentally unsound, and a basic disconnect between software programmers and managers who pressure them to ship products before they are truly ready. Ganssle pointed to many system failures that resulted from buggy embedded software, some of which had deadly consequences--he reported that pacemaker recalls rose steadily between 1990 and 2000, and that some patients were killed by the defective devices; meanwhile, NASA's Clementine and NEAR space missions suffered failures caused by nearly identical errors. "In both cases, a glitch in a sequence of events caused all the fuel of the thrusters to be dumped--all because of insufficient testing and the need to meet unreasonable schedules," Ganssle said. The space agency suffers from a disconnect between mission schedules and the intricacy of its missions, which means that more complicated systems are more likely to fail when they are built on tight timelines. Ganssle acknowledged that perfect systems are an unreachable goal, but argued that the software industry must get in the habit of learning from failures, sharing the data gleaned from those failures with others, and incorporating improved solutions into future projects. He also recommended that developers decide up front how much risk is acceptable.
    Click Here to View Full Article

  • "E-Voting Audit Ready for Public"
    Wired News (09/18/03); Zetter, Kim

    Maryland Gov. Robert Ehrlich called for an audit of touch-screen voting systems from Diebold Election Systems by Science Applications International Corp. (SAIC) in response to the disclosure of major software security holes, and the audit is now complete and ready to be released to the public. The report could become available on the Maryland state Web site by Friday or next week, with information that malicious hackers could exploit excised. Shareese DeLeaver of the governor's office says the state's Department of Budget and Management and the State Board of Elections are reviewing the 200-page report, while Board of Elections official Jim Pettit notes that all involved parties must settle "legal, procurement and technical" issues before the report is published. He does not say what specific recommendations the report makes, but acknowledges that all of them must be implemented before Maryland proceeds with a $55.6 million installation of Diebold e-voting systems throughout the state. Pettit adds that SAIC will probably re-assess the software after the changes have been implemented, which could happen by the time the March 2004 primary election rolls around. Maryland is required by law to have a statewide unified voting system in place by 2006. An alternate system will have to be selected by March if the Department of Budget and Management decides the Diebold machines are not up to the task, according to Pettit. The software flaws that prompted Gov. Ehrlich to commission the report were revealed by Johns Hopkins and Rice University researchers.
    Click Here to View Full Article

    For information about ACM's activities in regard to e-voting, visit http://www.acm.org/usacm/Issues/EVoting.htm.

  • "Breaking the Speed Barrier: The Frontside Bus Bottleneck"
    TechNewsWorld (09/17/03); Hook, Brian R.

    Computer processor makers Intel, AMD, Motorola, and others continue to find innovative ways to shuttle data between the main processor and other components via the frontside bus (FSB). Though the flow of data among these different pieces of computer architecture determines overall performance, computer makers have instead focused on processor speed in their marketing because it improves regularly. FSB performance, by comparison, has been lagging, though experts say chipmakers have done a fairly good job with new bi-directional buses and multiple data paths. Semico Research analyst Greg Fawson explained that system balance is also important, and that FSB speed needs to match memory bus speed. Part of Apple's claim that the Power Mac G5 is the world's fastest PC is based on its 64-bit bi-directional FSBs--one for each of the Power Mac's dual processors--which integrate clock signals with the data stream to achieve bus speeds of up to 1 GHz, or 8 GB per second of aggregate bandwidth per processor. Hewlett-Packard advanced technology initiatives manager David Heisey points out that every interface that carries data is a potential performance bottleneck, not just the FSB. Hewlett-Packard has pursued a multiple-paths approach linking the processor and memory. University of Southern California computer scientist John Granacki says recent work at his school on "smart memory," or processor-in-memory (PIM) technology, eliminates a large part of bus traffic by placing some processing functions on the memory chip; however, Granacki says the added cost, and the fact that current performance is adequate for most applications, would probably prevent widespread adoption. (A back-of-the-envelope bandwidth calculation is sketched below.)
    Click Here to View Full Article
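
    As a rough check on the figures above, peak bus bandwidth is simply bus width times clock rate (times the number of transfers per cycle for double-pumped designs). The short sketch below uses illustrative parameter names and reproduces the 8 GB-per-second figure for a 64-bit bus clocked at 1 GHz.

        def peak_bus_bandwidth_gb_per_s(width_bits, clock_ghz, transfers_per_cycle=1):
            """Theoretical peak bandwidth in gigabytes per second."""
            return width_bits / 8.0 * clock_ghz * transfers_per_cycle

        if __name__ == "__main__":
            # A 64-bit bus clocked at 1 GHz moves at most 8 GB/s.
            print(peak_bus_bandwidth_gb_per_s(64, 1.0))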

  • "A Sugar Cube, Please: I Need to Charge My Cellphone"
    New York Times (09/18/03) P. E6; Eisenberg, Anne

    University of Massachusetts researchers have developed a microbial fuel cell that feeds on sugar and releases electrons that can be converted into an electrical current. The biological component is Rhodoferax ferrireducens, an "iron-breathing" microorganism that can transfer over 80 percent of the electrons in the sugar, according to research leader and environmental microbiologist Dr. Derek R. Lovley. Other microbial batteries can only transfer around 10 percent of the electrons in sugar, and many boost their efficiency with a special mediator compound that enters the microbe, gathers the accumulated electrons, and conveys them to an electrode. "But Dr. Lovley's bug does the work all by itself, without the intermediate components we all put in to facilitate electron transfer," notes G. Tayhas R. Palmore of Brown University. Remote-area sensors, household devices that tap into sugar-based waste, and special military applications may be able to use such fuel cells, but University of Texas chemical engineering professor Adam Heller does not think they would be able to generate enough power for an entire grid. The microbe converts the sugar it consumes into carbon dioxide while producing electrons that are deposited on an electrode and flow along an external circuit to a second electrode. "We need only a small number of organisms because they gain energy and rapidly increase in number," explains Lovley. As the microbe replicated and proliferated over the surface of the electrode in the laboratory, it generated stable long-term energy for as many as 25 days while yielding a current strong enough to power a calculator; the organism has no trouble metabolizing glucose, sucrose, fructose, or xylose.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Information Technology Field Loses Diversity, Research Finds"
    Boston Globe (09/14/03); Lewis, Diane E.

    The Information Technology Association of America reports that the number of women and African-Americans working in the information technology industry declined between 1996 and 2002. The association's Blue Ribbon Panel on IT Diversity report reveals that women held 41 percent of IT jobs in 1996 but only 25.3 percent last year. The percentage of African-Americans in the IT industry fell from 9.1 percent in 1996 to 6.2 percent in 2002. The report says women hold 46.6 percent of jobs in the United States overall, while African-Americans represent 10.9 percent of the workforce. The study suggests that fewer women and African-Americans are pursuing tech-related degrees in college, but it did not analyze the impact that layoffs or the recession might have had on their participation in the industry. The association also found that the percentage of Asian professionals rose from 8.9 percent to 13.4 percent, and the percentage of Hispanics rose from 5.4 percent to 6.3 percent. Age is also a factor in IT employment: Americans over the age of 45 represent 37.6 percent of the U.S. workforce, but only 29.4 percent of the IT industry.
    Click Here to View Full Article

  • "At Rate Tech Is Going, No One Will Need Talent for Singing or Housework"
    USA Today (09/17/03) P. 3B; Maney, Kevin

    Technological developments that have the potential to transform everyday life include new labor-saving devices from iRobot, a car that can park without human intervention, and software that can dramatically enhance a person's singing voice. The company has already attracted attention with the rollout of the Roomba, a self-guiding vacuum cleaner with artificial intelligence software. Thirty iRobot engineers are developing other robotic household products, and the company announced that machines capable of more complex chores will debut within a year. Japanese drivers do not have to trouble themselves with the headache of parking if they drive the Toyota Prius, which features a self-parking option that uses sensors to measure parking spaces, video cameras that keep visual boundaries such as curbs and white lines in sight, and a computer that extrapolates the optimal "turn-in" point from the sensory input. Meanwhile, Purdue University professor Mark Smith has led the development of voice-enhancing software that converts vocals to mathematical models that are processed through algorithms that simulate the sound of great voices. The algorithms adjust the voice to produce the most desired qualities while eliminating unwanted elements such as warbling. London venture capitalist John Taysom argues that these developments constitute "the beginning of the transition from information technology applied to business process to information technology applied to the products of business." Kevin Maney writes that IT is following a transformative path similar to that of electric power 70 to 100 years ago.
    Click Here to View Full Article

  • "Testing Information Systems During Development Will Prevent Problems"
    EurekAlert (09/16/03)

    Penn State researcher Dr. Sandeep Purao has taken a systematic approach to applying the more than 350 existing software metrics to object-oriented systems. Purao, associate professor of information sciences and technology, and co-researcher Vijay Vaishnavi, professor of MIS at Georgia State University, grouped the metrics that apply to any IT development project using the object-oriented approach and found gaps and overlaps across the stages of project requirements, design specification, implementation, and operation. "No one had proposed a systematic approach to deciding when to apply specific metrics whether at an early stage, throughout the development or at the end," says Purao. Developers should be able to evaluate projects at various stages rather than having to wait until a project is completed to conduct a test. Knowing which metrics to apply at different stages of software development can lead to fewer problems. For example, a software problem in the Baltimore, Md., school district's new $16 million computer system is forcing the school system to pay about $500,000 a month to address it. Purao and Vaishnavi believe their research will help reduce the IT snafus that lead to cost overruns and delays in system implementation. (A small example of computing two classic object-oriented design metrics appears below.)
    Click Here to View Full Article
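
    The sketch below is offered purely as an illustration, not as the metrics Purao and Vaishnavi catalogue: it computes two classic object-oriented design measurements, depth of inheritance tree and a simple per-class method count, showing the kind of measurement that can be taken from a design or implementation well before a system is finished. The example classes are hypothetical.

        import inspect

        def depth_of_inheritance(cls):
            """Longest path from cls up to object (the classic DIT metric)."""
            if cls is object:
                return 0
            return 1 + max(depth_of_inheritance(base) for base in cls.__bases__)

        def method_count(cls):
            """Rough stand-in for weighted methods per class, with unit weights."""
            return len(inspect.getmembers(cls, predicate=inspect.isfunction))

        class Vehicle:
            def start(self): pass
            def stop(self): pass

        class Car(Vehicle):
            def park(self): pass

        if __name__ == "__main__":
            for cls in (Vehicle, Car):
                print(cls.__name__, depth_of_inheritance(cls), method_count(cls))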

  • "Security Standards Could Make Anti-Piracy Easier"
    New Scientist (09/16/03); Knight, Will

    New programming standards from the Trusted Computing Group (TCG) designed to boost PC security could also support the development of stronger anti-copying software. The standards, issued on Sept. 16, connect software to tamper-proof hardware modules that contain cryptographic keys and other tools. This development will "raise the bar significantly" for people attempting to circumvent Digital Rights Management (DRM) software, according to Julian Midgley of England's Campaign for Digital Rights. He adds that average users will find DRM systems built atop the new TCG standards to be impenetrable, as physically breaching the security chip would be the only recourse. The standards will be embedded in a 2004 version of Microsoft's Windows operating system, while IBM and Hewlett-Packard released the first computer systems to employ the hardware several months ago. Jane Price of the TCG acknowledges that DRM software could be based upon the standards, but TCG and Microsoft insist that preventing digital piracy was not the primary purpose of the system.
    Click Here to View Full Article

  • "Supercomputer-Based Neural Net to Mimic the Brain Planned"
    KurzweilAI.net (09/15/03)

    Artificial Development CEO Marcos Guillen announced at the Accelerating Change Conference on Sept. 14 that his firm will build CCortex, a "massive spiking neuron network emulation" designed to mimic the human cerebral cortex. The network will consist of 20 billion layered neurons and 2 trillion 8-bit links, running on a 1,000-processor supercomputer cluster with 1.5 TB of RAM and 4.8 TB of storage capacity. Conference attendee and Apex NanoTechnologies CEO Ramez Naam expressed doubt that the project will work, stating that an individual neuron does far more computation than the Artificial Development researchers assume. Another skeptic was artificial intelligence expert and computer scientist Dr. Ben Goertzel, who noted that no neuroscientists are currently involved in the project, which means the initiative suffers from a limited understanding of how the human brain works and of the brain's role in intelligence. Guillen explained that CCortex can "tune vast populations of neurons and the information they hold to complex spiking patterns, adding a much higher level of complexity to a highly realistic simulation." It does this, he said, through the addition of a time-sensitive, analog model of spike shapes. (A minimal spiking-neuron simulation is sketched below for illustration.)
    Click Here to View Full Article
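
    For readers unfamiliar with spiking neuron models, the sketch below simulates a single leaky integrate-and-fire neuron, the simplest member of the family. It is purely illustrative, far cruder than the analog spike-shape model Guillen describes, and every parameter name and value is an assumption.

        def simulate_lif(input_current=1.5, steps=200, dt=1.0, tau=20.0,
                         v_rest=0.0, v_threshold=1.0, v_reset=0.0):
            """Return the time steps at which a leaky integrate-and-fire neuron spikes."""
            v = v_rest
            spikes = []
            for t in range(steps):
                # Membrane potential leaks toward rest while integrating the input.
                v += (-(v - v_rest) + input_current) * (dt / tau)
                if v >= v_threshold:        # threshold crossed: emit a spike
                    spikes.append(t)
                    v = v_reset             # then reset the membrane potential
            return spikes

        if __name__ == "__main__":
            print("spike times:", simulate_lif())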

  • "What's in a Cybername? Plenty"
    National Journal (09/13/03) Vol. 35, No. 37, P. 26; New, William

    Many interests have been encouraging the U.S. Department of Commerce (DOC) to push for a clean-up of the Whois database as the department faces the expiration of its memorandum of understanding (MOU) with ICANN. However, Commerce General Counsel Theodore Kassinger recently told a House subcommittee of plans to extend the MOU, indicating that improving Whois data would not be a primary aspect of the agreement. At a recent hearing of the House Judiciary Subcommittee on Courts, the Internet, and Intellectual Property, Chairman Rep. Lamar Smith (R-Texas) argued, "Despite the demonstrated need and obligation of the Department of Commerce, ICANN, and registrars to provide access to accurate Whois data, there is an astonishing lack of enforcement of these contractual terms." Smith decries both the fact that registrars have not been denied accreditation when they have fallen short of their Whois obligations and the DOC's plan to evaluate ICANN against seven future "milestones" that would not explicitly address Whois, contract enforcement, or the protection of intellectual property. Cardozo School of Law professor Susan Crawford has wondered what authority oversees ICANN if the DOC does not. As comments by James Farnan of the FBI Cyber Division and John LoGalbo of the Justice Department's Computer Crime and Intellectual Property Section indicate, many groups depend on or utilize the Whois database. Other entities such as the Electronic Privacy Information Center have noted that the Whois database also serves as a source of information for other parties, including those interested in fraud and other illegitimate activity. Kathy Kleiman of the law firm McLeod, Watkinson and Miller has pointed out the benefits of anonymity to Internet users living under repressive governments. As the debate continues, ICANN's Governmental Advisory Committee is also working through the differing positions nations hold on the management of personal data. ICANN anticipates holding a Whois "best practices" workshop at its upcoming board meeting.
    Click Here to View Full Article

  • "WhereWare"
    Technology Review (09/03) Vol. 106, No. 7, P. 46; Pfeiffer, Eric W.

    Location-based computing, in which wireless mobile devices can keep track of their owners' whereabouts, could offer an array of applications and services that promise to fatten providers' coffers and enhance travel, safety, shopping, and convenience for consumers; emergency services' operations could also be augmented with the technology. But before these benefits can emerge, many issues must be addressed, including privacy and technical concerns, and the establishment of data format and network navigation standards. Qualcomm's Arnold Gum thinks the privacy conundrum could be solved if users are given the power to deactivate their devices' location-finding features. The pluses and minuses of various location-aware products and services are a reflection of their accuracy, price, and power efficiency. The most accurate existing outdoor tracking technologies are cellular networks and the Global Positioning System (GPS), but each comes with limitations: GPS relies on line of sight, which makes it harder to get accurate results in large urban areas, while cellular networks are too inaccurate for viable navigational services; ESRI manager Jonathan Spinney thinks assisted GPS will become a global standard because it overcomes the urban inaccuracy problem. On the other hand, neither assisted GPS nor cellular networks function well indoors, but Wi-Fi is a promising candidate in this regard, though it is not especially secure and it is vulnerable to signal interference and other factors. Ultrawideband offers even more promise. Establishing interoperability across wireless technologies and different classes of networks requires the deployment of software standards as well as the resolution of technological and business issues. The adoption of location-finding technologies is wider in Europe and Asia, where their convenience is emphasized, than it is in America, where safety is the primary driver.

 
                                                                             