
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 722:  Monday, November 22, 2004

  • "NSF Awards C.U. $1.6 M Grant"
    Cornell Daily Sun (11/22/04); Holmes, Casey

    The National Science Foundation awarded a $1.6 million grant to Cornell University computer science professors Andrew Myers, Ken Birman, and Fred Schneider to develop "trustworthy" computers under the aegis of its $32.2 million CyberTrust program. Information Assurance Institute director Schneider says their project was one of 33 proposals selected by the NSF after evaluation by a faculty panel, which based its decision on team credentials and idea quality, and Birman thinks Cornell's visibility as a leading center for security and fault tolerance over the last 15 years also played a part in winning the grant. Birman notes that the grant will fund graduate students to participate in the project, cover researchers' travel expenses, and support the development of new materials for computer science courses. Schneider says the team is attempting to build a system that can automatically secure data so that programmers do not have to specify exactly how the data should be protected, while Myers says another component of trustworthy computers is fault-tolerant programs that protect information without causing a complete network crash even if one network element is malfunctioning. Because good security and good fault tolerance tend to work against each other, the two objectives must be made compatible, and Myers says creating the underlying technology to support such compatibility is the goal of the Cornell research team. "Hopefully, in 10 to 15 years this will be the way people build secure systems," he projects. Birman wants Cornell to offer courses that focus on computer programming and cybertrust's ethical and technical ramifications.
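
    In spirit, securing data without case-by-case programmer effort is what the research community calls information-flow control: values carry security labels that the system propagates and checks automatically. The Python toy below sketches only that general idea; it is not the Cornell team's system, and every name in it is hypothetical.

      # Toy information-flow tracking (illustrative sketch, not the Cornell
      # system): values carry labels, operations propagate them, and output
      # channels refuse data whose label exceeds their clearance.
      SECRET, PUBLIC = "secret", "public"

      class Labeled:
          def __init__(self, value, label):
              self.value, self.label = value, label

          def __add__(self, other):
              # A combined value is as sensitive as its most sensitive input.
              label = SECRET if SECRET in (self.label, other.label) else PUBLIC
              return Labeled(self.value + other.value, label)

      def send_to_network(data):
          # The runtime, not the programmer, decides whether the flow is legal.
          if data.label == SECRET:
              raise PermissionError("refusing to release secret data")
          print("sent:", data.value)

      salary = Labeled(90000, SECRET)
      bonus = Labeled(5000, PUBLIC)
      send_to_network(bonus)               # fine: public data flows out
      try:
          send_to_network(salary + bonus)  # blocked: the sum is tainted secret
      except PermissionError as err:
          print("blocked:", err)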
    Click Here to View Full Article

  • "They've Made a Start, But Q&As With PCs Have a Long Way to Go"
    Wall Street Journal (11/22/04) P. B1; Gomes, Lee

    Search engines can answer questions posed in natural language on a rudimentary level, although ordinary people unaware of the effort researchers have put into such systems may not be very impressed. The answers search engines provide to such queries are typically canned, and the systems boast little actual intelligence beyond the narrow range of facts programmed into their databases. The National Institute of Standards and Technology's annual Text Retrieval Conference (TREC) showcases cutting-edge Q&A technology by having computers from all over the world compete to see which of them can best answer hundreds of natural-language questions based on thousands of newspaper articles. TREC administrator Ellen Voorhees says the participating systems perform best with simple "factoid" questions in which the information is often carried in a single sentence in an article; "list" questions requiring computers to draw facts from several sentences are much harder, while causality and reasoning are still far beyond their capabilities. Computer systems participating in the contest are typically designed to support a little knowledge about a lot of subjects, while another strategy is to select a subject and then enable the computer to "discuss it" in detail. The best-performing system at last year's TREC was owned by Language Computer, but it answered only 75 percent of even the simplest factoid questions correctly. Language Computer President and University of Texas-Dallas professor Dan Moldovan says some corporate clients employ the software to let workers inquire about human-resources manuals, for instance. The ultimate goal of a search engine that scans its entire repository of knowledge for the answer to any typed-in question will remain science fiction for a long time, even though companies such as Microsoft are attempting to rebuild enthusiasm for such an achievement.
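
    To see why single-sentence factoids are the tractable case, consider the crudest possible baseline: score each stored sentence by word overlap with the question and return the best match. The Python sketch below is a deliberately naive illustration, far simpler than any actual TREC entrant.

      # Naive "factoid" answerer (illustrative only): pick the article
      # sentence sharing the most words with the question. It works only
      # when one sentence carries the whole answer -- exactly Voorhees'
      # point about why "list" questions are so much harder.
      def answer_factoid(question, sentences):
          q_words = set(question.lower().split())
          return max(sentences,
                     key=lambda s: len(q_words & set(s.lower().split())))

      articles = [
          "The Eiffel Tower was completed in 1889.",
          "Paris hosted the 1900 Summer Olympics.",
      ]
      print(answer_factoid("When was the Eiffel Tower completed?", articles))
      # -> "The Eiffel Tower was completed in 1889."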

  • "Open Source's Next Frontier"
    CNet (11/22/04); LaMonica, Martin

    A growing number of open-source software services firms are offering enterprise application packages that resemble those sold by major vendors. The application stack offered by Gluecode includes portal and database software and an application server; Gluecode plans to make money from subscription services. French nonprofit consortium ObjectWeb has also packaged portal, content management, enterprise messaging, and grid software, and is working on Java server software as well. Though these offerings will help companies avoid hundreds of thousands of dollars in licensing fees, they require extensive integration work and compatibility-checking with other applications; in addition, these open-source enterprise application alternatives tend to offer plain-vanilla features, not the advanced features companies need for some mission-critical tasks. Established enterprise software vendors do not seem rattled by the introduction of open-source alternatives to their middleware and other enterprise software: IBM continues work on an open-source Java database project called Derby, and Sun Microsystems says it has considered open-sourcing some editions of its Java application server suite. Analysts say the success of these open-source enterprise application stacks depends on the amount of risk companies are willing to take in procuring services from small firms and possibly having to do substantial in-house integration work. Legal issues are also a worry for open-source software, something that Linux distributors Novell and Red Hat have addressed by offering indemnity to their customers. Forrester Research analyst Henry Peyret says these new open-source enterprise alternatives do not really compete directly with more advanced proprietary software because they will not be used for the same purposes.
    Click Here to View Full Article

  • "Computers as Authors? Literary Luddites Unite!"
    New York Times (11/22/04) P. B1; Akst, Daniel

    Although computer programs capable of writing fiction have been developed, novelist Daniel Akst points out that their chances of writing the "Great American Novel" are slim due to a number of hurdles, most notably the complexity of writing when one has no emotions or life experiences. Harvard University psychologist Steven Pinker also doubts that a novel written by a computer would be good when judged by human standards. Nevertheless, Akst thinks that fiction-writing computers can at least shed new light on the creation of literature, and he sees similarities between computers and humans engaged in the process of storytelling. The author reports that economist Herbert Simon's concept of "bounded rationality" applies to both humans and computers. Once a human writer has set down a first sentence, it is pointless to systematically consider every permutation--every character, phrase, plot twist, etc.--that might branch out from it; the writer is therefore forced to select whatever is good enough to continue the narrative, usually through serendipity. Likewise, a computer cannot create a narrative through brute-force computation of every alternative. Among the fiction-writing programs Akst points to is StoryBook, described by the computer scientists who created it as "an end-to-end narrative prose generation system that utilizes narrative planning, sentence planning, a discourse history, lexical choice, revision, a full-scale lexicon and the well-known Fuf/Surge surface realizer." Rensselaer Polytechnic Institute's Selmer Bringsjord, meanwhile, intends to make a computer write fiction that uses the problem of evil as its theme by developing a logical framework for the problem.
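
    A back-of-the-envelope calculation makes the bounded-rationality point vivid; the figures below are illustrative assumptions, not Akst's.

      # Even a crude model of story-space shows why brute force is hopeless
      # (both numbers are assumed for illustration).
      choices_per_sentence = 10    # suppose only 10 plausible continuations
      sentences_per_novel = 2000   # a short novel

      permutations = choices_per_sentence ** sentences_per_novel
      print(f"candidate novels: about 10^{len(str(permutations)) - 1}")
      # -> 10^2000 candidates, dwarfing the ~10^80 atoms in the observable
      #    universe, so writer and machine alike must settle for "good enough."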
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Cybernetics Combines Many Disciplines"
    Daily Bruin (11/19/04); Chou, Jeyling

    For over 30 years, UCLA has offered undergraduates a highly specialized, interdisciplinary major called cybernetics, which combines biology, math, and engineering through its emphasis on biological control and communication processes. Areas of focus include robots, mathematical representations of protein interactions, and cell movement explained through engineering. Only 26 students on the UCLA campus are currently enrolled in the cybernetics program, given the combination of skills and interests required. "It's an intense major, but the benefit is it's very small so you get a lot of personal attention," notes associate professor Steven Engel. Students must take life sciences and engineering prerequisites prior to selecting an area of concentration, and cybernetics majors and pre-majors must take more classes than students in any other program on campus. In the autumn of 2003, cybernetics department Chairman and UCLA professor Joseph DiStefano opened a Cybernetics Commons where students can confer, study, and network; he also runs a biocybernetics lab for graduate students, who mentor undergraduates. Cybernetics program student affairs officer Beth Rubin, who gives students advice on class scheduling and career paths, notes that the program attracts the interest of many companies at career fairs. "They look at the curriculum and see that it's broad-based and has very rigorous basic training in an interdisciplinary nature--this is what they want to see in students today," she explains.
    Click Here to View Full Article

  • "Nanomechanical Memory Demoed"
    Technology Research News (11/24/04); Smalley, Eric

    Physicists at Boston University have constructed mechanical memory using tiny ribbons of silicon that are flexed between 0 and 1 data states with a very small amount of electricity. Unlike other nanomechanical memory cells based on carbon nanotubes or buckyballs, the silicon device at Boston University resembles the silicon nanoscale oscillators that are currently being deployed as super-sensitive sensor and communications components. The implication is that Boston University's nanomechanical memory could be manufactured in high volumes using standard semiconductor fabrication equipment. The key element in the device is a silicon beam just 8,000 nanometers long that is slightly compressed and clamped at both ends; an electric current flexes the beam so that it is either convex or concave. The switching frequency with the 8,000-nm beam is 23.5 MHz, but it could be switched in the gigahertz range if the silicon beam were shortened to 1,000 nm (current magnetoelectric memory chips can operate at 400 MHz). The mechanical memory can store as much as 100 GB per square inch--125 times the capacity of normal memory technology. The device uses only femtowatts of energy to switch each memory cell, much less than the microwatts used in traditional memory chips, while another benefit is mechanical memory's resistance to the radiation and electromagnetic pulses found in extreme environments, such as space. Boston University assistant professor Pritiraj Mohanty says further work needs to be done to improve the read-write signal, shorten beam length, and boost performance, while integration with conventional electronics also needs to be investigated; practical application of the technology should come within two to five years, he says. Funding from the National Science Foundation, the Sloan Foundation, and the Defense Department supports the research.
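
    The reported figures are consistent with the textbook scaling for a doubly clamped beam, whose flexural resonance frequency varies roughly as 1/L^2 at fixed cross-section; a quick check in Python under that assumption:

      # Consistency check (assumes f ~ 1/L^2 scaling for a doubly clamped
      # beam of fixed cross-section; illustrative, not from the article).
      f_measured_hz = 23.5e6          # switching frequency at L = 8,000 nm
      length_now_nm, length_target_nm = 8000, 1000

      f_scaled = f_measured_hz * (length_now_nm / length_target_nm) ** 2
      print(f"predicted frequency at 1,000 nm: {f_scaled / 1e9:.2f} GHz")
      # -> ~1.50 GHz, matching the claim that a shorter beam would switch
      #    in the gigahertz range.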
    Click Here to View Full Article

  • "USC GamePipe Lab Plans to be R&D Player"
    USC Information Sciences Institute (11/17/04)

    The GamePipe Laboratory at the University of Southern California's Information Sciences Institute (ISI) will focus on increasing interactive games' usability, educational value, power, and verisimilitude while also speeding up and simplifying game development under the leadership of game creator Michael Zyda. He describes the lab's goals as "research, development, and education on the grand challenges of radically transforming the game production process--from dramatically shortening the production timeline, to developing the supporting technologies for increasing the complexity and innovation in produced games." Zyda, whose accomplishments include founding the MOVES Institute at the Naval Postgraduate School and creating the "America's Army" video game for the U.S. military, says the GamePipe facility has the potential to make USC an authority on the development and application of next-generation game technology. GamePipe R&D will emphasize four principal areas: infrastructure (next-generation software and hardware and enabling technologies for massively multiplayer online games); cognition and gaming (the modeling and simulation of game characters, story, and emotions, as well as story/pedagogy integration); immersion (advanced graphical, auditory, and haptic user interfaces and a theoretical framework for the game engagement process); and serious games (research on how games and interactive media can be used in education and training; new health, communication, and policy applications for games; and the development of such games). Zyda reports that game development is still sluggish and chaotic despite industry growth, and that the problems are becoming more pronounced as games grow more sophisticated.
    Click Here to View Full Article

  • "For Developers, It's Not All Fun and Games"
    CNet (11/18/04); Frauenheim, Ed

    Video game programmers have started to complain that their employers are demanding unreasonable work hours, and are clamoring for change. A blog posting last week implied that the video game industry typically demands that employees spend 60 hours per week at work, and pays little attention to helping them achieve a work/family balance. An earlier survey conducted by the International Game Developers Association (IGDA) concludes that game development "is all too often performed in crippling conditions that make it hard to sustain quality of life and lead too many senior developers to leave the industry before they have had time to perform their best work." A number of former Electronic Arts employees accuse the company of regularly goading them to put in 80 or more hours per week, and an alleged failure to pay overtime wages has made EA the target of a lawsuit as well. EA reports that it conducts a twice-yearly survey in which workers can anonymously suggest improvements, but one current EA employee says work schedules are worsening despite his and other co-workers' complaints about long hours in one such survey. IGDA program director Jason Della Rocca sees no sense in oppressive work schedules, arguing that happy, unstressed workers are more creative and more productive; he says the game industry's failure to deeply research a game's truly fun aspects at the beginning of a project leads to "crunch time" for workers as deadlines approach. Della Rocca also says the industry suffers from a shortage of management training to address the increasing complexity and cost of games and game development. The IGDA's board of directors published an "open letter" on Nov. 16 indicating that game developers must take some responsibility for working conditions, while last week's blog posting has apparently instigated discussions about game industry unionization.
    Click Here to View Full Article

  • "Nano Fabric May Make Computers Thinner"
    NewsFactor Network (11/18/04); Martin, Mike

    A team of British and Russian researchers led by University of Manchester professor Andre Geim has removed individual atomic sheets of carbon from graphite crystals, and Manchester's Jo Grady reports that this single-atom-thick nano-fabric exhibits stability, strength, and high levels of conductivity and flexibility. The material, known as graphene, could form the basis of ultra-fast-switching transistors that would dramatically ratchet up computer speed. "Ultimately, scientists envisage transistors made from a single molecule, and this work brings that vision ever nearer," notes Grady. Graphene functions as a transistor under ambient temperature and pressure conditions through an "ambipolar field effect" that the researchers demonstrated using standard micro-fabrication methods employed in silicon chip manufacturing. Geim believes that thousands of potential applications for carbon nanotubes can extend to graphene, since nanotubes are essentially rolled-up graphene to begin with. Physicist K.M. Novoselov supports the plausibility of inch-sized graphene wafers: "All the omens are good, as there are no fundamental limitations on the lateral size of carbon nano fabric," he explains. Manchester Innovation's David Glover says graphene has the potential to compete with current semiconductors, particularly in niche markets dominated by gallium arsenide. He suggests that graphene could gain an edge through its energy efficiency and high electron mobility.
    Click Here to View Full Article

  • "UH Research to Redefine Data Storage"
    The Daily Cougar (University of Houston) (11/19/04) Vol. 70, No. 64; Wolfford, Bryan

    University of Houston electrical engineering professors Jack Wolfe and Dmitri Litvinov are striving to create nano-patterned medium recording (N-PMR) with the help of a $1.1 million National Science Foundation grant. The resulting product would be able to store 1 terabyte of data per square inch, equivalent to storing the content of 1,500 CDs on a postage stamp-sized area. The technology could greatly benefit the magnetic storage industry, whose data storage capacity is approaching its physical threshold after five years of doubling each year: "The system that we are now developing will allow us to extend this limit by a factor of 10, maybe more," Wolfe explains. Seagate Technologies, Euxine Technologies, and Molecular Imprints are among the N-PMR project's corporate sponsors. N-PMR involves reducing the number of crystallites on which data is recorded; current recording involves between 50 and 100 crystallites, but Wolfe says his team should eventually shrink the recording site down to an area as small as a few nanometers. Key to this breakthrough is UH chemistry and engineering professor T. Randall Lee's work with self-assembling nanoparticles that can be employed in photolithography. "First we have to prepare the nanoparticles so that they are all the same shape and size and then coat them with a special material so that they assemble in a regular pattern," Lee notes. He adds that his team has already tackled the challenge of specifically tailoring self-synthesizing thin films of nanoparticles, and that eliminating defects is the only remaining problem.
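
    The 1,500-CD comparison checks out with round numbers (assuming a 700 MB CD and a roughly one-square-inch postage stamp):

      # Sanity check of the storage claim with assumed round figures.
      terabyte_mb = 1e6        # 1 TB is about 10^6 MB
      cd_capacity_mb = 700

      cds_per_square_inch = terabyte_mb / cd_capacity_mb
      print(f"CDs per square inch at 1 TB/in^2: {cds_per_square_inch:.0f}")
      # -> about 1,429, in line with the article's 1,500-CD figure.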
    Click Here to View Full Article

  • "Connection Blues"
    Scientific American (11/15/04); Grossman, Wendy M.

    Bluetooth devices are vulnerable to hacking techniques that take advantage of improperly implemented security configurations. The problems are not caused by weaknesses in the Bluetooth protocol itself, which encrypts transiting data and can be configured to communicate only with a list of approved devices; rather, some device manufacturers have left vulnerabilities in the way they use Bluetooth. AL Digital security researcher and Defcon co-organizer Adam Laurie uses utilities found on the Internet to hack Bluetooth-enabled mobile phones in an attack called Bluesnarfing, in which an attacker disguises his own device profile so as to gain illicit access. Laurie warns that although mobile phones today do not hold the wealth of data available on people's PCs, in the future they will likely carry information such as email, recordings, and other personal data. Salzburg Research security expert Martin Herfurt discovered an even more troubling attack than Bluesnarfing: The so-called Bluebugging attack connects to a device's serial-port profile, enabling the attacker to send "AT" commands such as those used over dial-up modems. Through such commands, the hacker can remotely manipulate the phone's functions, such as sending SMS messages or connecting to the Internet. In a spy-versus-spy situation, the hacker could place a call from the victim's phone to his own so as to listen in on nearby conversations. The vulnerabilities have been fixed by some of the affected device manufacturers, and Laurie is working with Bluetooth technical staff to help develop better security in future versions of the technology.
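
    What makes the serial-port profile so dangerous is that the commands it accepts are the ordinary GSM modem ("AT") set, so no exploit code is needed once a channel is open. The Python sketch below illustrates the idea using the pySerial package; the device path and phone number are hypothetical, and it assumes the attacker has already bound an unauthorized RFCOMM channel to the victim's phone.

      # Illustrative sketch only: standard GSM AT commands sent over a
      # hijacked RFCOMM serial channel (path and number are hypothetical).
      import serial

      phone = serial.Serial("/dev/rfcomm0", 115200, timeout=2)

      phone.write(b"AT\r")                      # plain Hayes handshake
      print(phone.readline())                   # a vulnerable phone answers "OK"

      phone.write(b"ATD+15555550100;\r")        # place a voice call
      phone.write(b'AT+CMGS="+15555550100"\r')  # begin composing an SMS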
    Click Here to View Full Article

  • "Love for Robots Conjures Dreams of Helping Others"
    Daily Texan (11/18/04); Stowell, Cindy

    University of Texas computer science professor Benjamin Kuipers is being funded by the National Science Foundation to develop Vulcan, a robotic wheelchair capable of navigation, collision avoidance, and execution of spoken directives. "The design of the wheelchair needs to increase rather than decrease the autonomy of the driver," explains Kuipers. UT graduate student Joseph Modayil says the wheelchair features a pair of range-finding lasers, infrared sensors, a Global Positioning System (GPS) sensor, and optical binoculars, while Kuipers notes that code written primarily in C++ allows Vulcan to interpret sensor input so that it can map out and navigate small rooms while evading large objects. Before Vulcan is ready for practical use, researchers will need to address a number of challenges, including the wheelchair's current inability to avoid descending staircases, its limited range-finding capability, and voice-interface issues. The device must also be programmed to choose the optimum route to a destination, follow directions, contend with moving objects, and learn to navigate in environments that have not been preprogrammed into it. Modayil is working with a pair of smaller, more resilient sensor-equipped robots in Kuipers' lab to research the detection of moving objects. His work will enable Vulcan to establish the location of doors and determine whether objects can be pushed. Other students on Kuipers' team are studying speech recognition, how people relay directions, and how to render that data as a model. "We hope to be doing prototype work with people with disabilities in the next three years," says Kuipers.
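
    Vulcan's C++ navigation code is not described in detail, but the core of steering from a laser range sweep fits in a few lines. The Python toy below is illustrative only and is not the project's algorithm.

      # Toy reactive steering from a laser sweep (illustrative, not Vulcan's
      # code): pick the heading whose sector has the most clearance, and
      # stop if nothing ahead is safe.
      def pick_heading(ranges_m, fov_deg=180.0, safety_m=0.75, window=2):
          n = len(ranges_m)
          best_angle, best_clearance = None, 0.0
          for i in range(window, n - window):
              # A heading's clearance is the worst reading in its sector,
              # so a single close obstacle vetoes the whole direction.
              clearance = min(ranges_m[i - window : i + window + 1])
              if clearance > best_clearance:
                  best_angle = -fov_deg / 2 + i * fov_deg / (n - 1)
                  best_clearance = clearance
          if best_clearance < safety_m:
              return None  # nothing safe ahead: stop and wait
          return best_angle

      # 19 readings across a 180-degree sweep, obstacle dead ahead:
      scan = [4.0] * 7 + [0.4, 0.3, 0.4] + [4.0] * 9
      print(pick_heading(scan))  # -> -70.0, the first fully clear heading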
    Click Here to View Full Article

  • "No Role for UN in ICANN"
    Australian IT (11/16/04); Hayes, Simon

    ICANN CEO Paul Twomey said the United Nations will not be granted a mandate to oversee the corporation once its contract with the U.S. Department of Commerce is up in 2006. Instead, ICANN will operate as a private organization after the contract expires, without having to answer to any international organization. "At the heart of the way the Internet works is that it grows quickly through the private-sector model," said Twomey. "It's not formulated by international treaty." In a 63-page strategic plan set to be released this week, ICANN projects a budget of $19.5 million for additional activities in the fiscal 2005-06 year, up from $15.8 million in 2004-05, an increase that would be funded by higher fees on registrars and country-code administrators. ICANN plans to encourage developing countries to assume greater control over their domain name country codes and will set aside a fund to encourage developing countries to participate in the Internet. ICANN will also create a fund for network security research, though Twomey says, "We don't see ourselves as a major funder of international research, but as an enthusiastic endorser of international initiatives." ICANN says it has already completed 10 of the 35 goals specified in its contract with the United States and is on pace to complete the rest by 2006.
    Click Here to View Full Article

  • "Filling the Supercomputer Gap"
    National Journal (11/20/04) Vol. 36, No. 48, P. 3545; New, William

    Computer industry experts want the government to broaden supercomputer access across all industries and research sectors, invest more heavily in supercomputing research and development, and outline a long-term "road map" to maintain the competitiveness of the United States' supercomputing efforts. Council on Competitiveness VP Suzy Tichenor reports that innovation--the linchpin of America's global economic leadership--cannot accelerate without high-performance computing. The federal-industrial technology R&D partnership needs to be refreshed, she says. The Council also conducted a poll of industry executives, who indicated that more powerful and simpler supercomputers could yield billions of dollars in savings. The Office of Science and Technology Policy coordinated the organization of a High End Computing Revitalization Task Force, which released a strategy and five-year road map for government investments. Office of Science and Technology Policy director John Marburger has promised to prioritize high-performance computing in the budget for fiscal 2006, while the Defense Advanced Research Projects Agency is sponsoring an initiative to close the divide between federal and commercial computing capabilities. In addition, a bill passed by the Senate this week authorizes $165 million over three years for supercomputing R&D, and Rep. Sherwood Boehlert (R-N.Y.) is expected to revive a measure requiring the White House Science Office to create a tech development road map and investment scheme for federal high-end systems. Marburger mandated in June that the President's Information Technology Advisory Committee (PITAC) evaluate federal computational science research, and a recently submitted PITAC progress report lists weather and climate as a major supercomputing application, and points to a disconnect between private-sector and federal-academic computing infrastructure requirements.
    Click Here to View Full Article

  • "Women in Software: Open Source, Cold Shoulder"
    Software Development (11/04) Vol. 12, No. 11, P. 40; Levesque, Michelle; Wilson, Greg

    Despite insistence from advocates that the free, libre, and open source software (FLOSS) community welcomes all genders and persuasions, evidence suggests a strong anti-female bias, according to University of Toronto undergraduate student Michelle Levesque and computer science professor Greg Wilson. The authors point out that nearly all successful FLOSS projects depend on a small group of core developers, yet they have found no female chief architects in any of the widely publicized FLOSS initiatives. There is evidence indicating that female FLOSS participants exist, but are masking their gender online in order to avoid salacious overtures; however, there are also signs that the factors contributing to the overall gender imbalance in computing, as identified by Carnegie Mellon University researchers Jane Margolis and Allan Fisher, are having an even greater effect in the FLOSS community. Such factors include women being less inclined to work in front of a computer for long periods of time because they have a broader range of interests and a greater desire for well-rounded lives than males, and a decline in confidence among women because men have more computing experience. But Levesque and Wilson think the most damaging aspect of the FLOSS community is its prevalent geek/hacker culture, which encourages antisocial behavior and sexism in the absence of an enforcement element. The authors contend that this cultural barrier is detrimental to the FLOSS community beyond the gender imbalance: It encourages concentration on "hard" technical issues at the expense of "human" issues such as user interface design, and supports a public image that discourages decision-makers from taking proponents seriously on legal and technical matters. Worst of all, Levesque and Wilson claim FLOSS' "boys' club" atmosphere limits participation, leading to a stifling of diversity and innovation.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)

  • "RFID's Security Challenge"
    InformationWeek (11/15/04) No. 1014, P. 49; Claburn, Thomas; Hulme, George V.; Sullivan, Laurie

    Radio-frequency identification (RFID) technology can dramatically improve supply-chain protection through its ability to precisely locate items in inventory and reduce the risk of insider theft, but adoption is hindered by security vulnerabilities at the RFID tag, network, and data levels. RFID tags can be easily hacked, and The Advisory Council reports that this weakness is partly attributable to a lack of point-to-point encryption and a public key infrastructure exchange. Among the solutions being considered to address tag insecurity and privacy are the creation of unique, product-specific EPC codes that would allow a hacker to uncover information for no more than one item; password protection and data encryption provided by the EPCglobal UHF generation 2 protocol standard; and "soft blockers" that enforce consumer-privacy preferences after an item's purchase. Unprotected wireless networks can also be exploited by introducing rogue readers or hijacking the data in transit between company readers and the repository, notes Forrester analyst Laura Koetzle. One solution is to authenticate all readers before they can send information to enterprise-system middleware and to encrypt data traffic between the reader and the back-end system, while "silent treewalking" can thwart eavesdropping on RFID reader emissions by referencing RFID tag numbers indirectly instead of having the reader broadcast them. Fixing the data's security vulnerabilities is critical as more companies broaden their supply-chain projects and start exchanging data among themselves, according to Forrester analyst Christine Overby; however, standards for securing data on the EPCglobal network have yet to be finalized. "The big issue that we face really is that the people driving the applications...don't really understand what level of security they want," or want to pay for, argues Texas Instruments' Tony Sabetti.
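
    For context on the product-specific EPC proposal: an EPC is a structured bit string whose serial field makes each tag item-unique, which is why a snooped tag reveals one item rather than a product line's whole inventory. The Python sketch below parses the commonly cited 96-bit general identifier (GID-96) layout; treat the field widths and sample values as assumptions.

      # Illustrative GID-96 parser (assumed layout: 8-bit header, 28-bit
      # manager number, 24-bit object class, 36-bit serial).
      def parse_epc96(epc):
          return {
              "header":       (epc >> 88) & 0xFF,
              "manager":      (epc >> 60) & ((1 << 28) - 1),
              "object_class": (epc >> 36) & ((1 << 24) - 1),
              "serial":        epc        & ((1 << 36) - 1),
          }

      tag = (0x35 << 88) | (12345 << 60) | (678 << 36) | 900000001
      print(parse_epc96(tag))
      # -> {'header': 53, 'manager': 12345, 'object_class': 678,
      #     'serial': 900000001}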
    Click Here to View Full Article

  • "The Future of IT: Industry Innovators Speak Out"
    Network Magazine (11/04) Vol. 19, No. 11, P. 24; Kapustka, Paul

    Internet co-founder and TCP/IP stack co-author Bob Kahn, Avici Systems co-founder Larry Dennison, Broadcom co-founder and CTO Henry Samueli, Precision I/O and Packet Design Chairman Judy Estrin, and Pulver.com Enterprises CEO Jeff Pulver concur that innovative thinking is essential to the realization of advanced future networking services, the securing of funding, and the establishment of new businesses in a world where capital and spending markets are more cautious. "People need to know what you can and cannot do with information on the Net," notes Kahn, whose Digital Object Architecture could become the cornerstone of a standards-based infrastructure for identifying and managing digital objects on the Web using unique persistent identifiers that recognize network resources regardless of location. Network maturation has spawned a need to reduce the severity of bottlenecks in server I/O subsystems and other areas through innovation, and Estrin is concerned with addressing the "first inch" problem of how packets move on and off processors. She predicts that solutions to network latency will become more applicable with the mainstream acceptance of voice over IP (VoIP), while innovative embedded sensors will be needed to accommodate communications between ubiquitous computers. Mainstream VoIP penetration is also a key area of interest to Pulver, who thinks industry should prioritize new services and features, such as VoIP over Wi-Fi and VoIP-enabled e911 dialing, over price; he also believes an industry outsider will spearhead mainstream VoIP commercialization. Dennison advocates designing scalability, reliability, and fast service provisioning into core system routers in order to support advanced services that benefit both businesses and customers. Samueli believes the convergence of networks to IP cannot be actualized without converged-functionality chips, a development he is eagerly anticipating.
    Click Here to View Full Article


 