
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 628:  Wednesday, April 7, 2004

  • "Cyberexperts and Engineers Wanted by FBI"
    Wall Street Journal (04/06/04) P. B1; Fields, Gary

    The Sept. 11, 2001, terrorist attacks have forced the Federal Bureau of Investigation to get more serious about its computer skills by increasing its IT hiring efforts and waiving some requirements to lure more IT professionals. The FBI created the Cyber Division two years ago as agency officials faced the pressing issues of monitoring potential terrorists and neutralizing the viruses and worms that plague cyberspace. Cybercrime now trails only counterterrorism and counterintelligence in priority for the FBI, which used the Cyber Division to bring all of its computer experts under one command. The FBI is actively recruiting people with backgrounds in computer science and electrical engineering to fill its new jobs, and the downturn in the IT industry has prompted some IT workers to leave the private sector for government service. The FBI is now a presence at high-tech job fairs and leading tech schools such as MIT and the University of Illinois, says Keith Lourdeau, the Cyber Division's deputy assistant director. Several agents at each of the FBI's 56 field divisions work on cybercrime, but the larger offices in New York and Los Angeles have dozens of cyber agents. Moreover, local police officers and representatives from other federal agencies participate in its 40 cybercrime task forces. The extensive training of agents who work in the cyber unit ranges from Cyber 101 to intense sessions provided by leading IT companies. The FBI also now requires 20 hours of computer training, covering skills such as reading email headers and return addresses, of all agent trainees.

  • "Dodgy Patents Rile Tech Industry"
    Wired News (04/05/04); Asaravala, Amit

    The software industry has helped dramatically increase the number of patent applications at the U.S. Patent and Trademark Office (USPTO), which recently awarded a number of controversial patents that could stifle innovation and ruin thousands of small companies. Among the more egregious patents issued recently by the USPTO: a Networks Associates technology patent for deleting "undesired data" from a computer, a spam and virus filtering process patent for antispam company Postini, a process patent for matching professionals to email addresses and Web subdomains, and an Amazon.com patent allowing the company to charge other Web sites for browser cookies storing data structures. The patent review process relies on the examiner finding prior art that would invalidate a claim, but critics of the current system say the USPTO is stretched far too thin to do a proper search. Examiners also often rely on previous patents in their search for prior art, a limiting factor for software patent examiners because their field is relatively new. The USPTO's workload has also doubled over the past decade to 355,400 patents examined last year, while resources have not been added at the same pace. Lawyer and former patent examiner Harold Wegner says examiners are also pressured to meet disposal quotas and so cannot spend resources to refute difficult claims. USPTO's Brigid Quinn says patent examiners are experts in their field and spend 30 hours researching applications. With more than 300,000 applications and about 25 claims per application, examiners make 7.5 million choices--some of which may need to be reversed later. Challenging patent claims is difficult under the current system, however, and many companies simply pay licensing fees in order to avoid costly litigation. Both Wegner and Quinn say the re-examination system should be changed, perhaps along the lines of the systems in Europe and Japan, where third parties can more easily request reviews.
    Click Here to View Full Article

  • "Sharing Spectrum the Smarter Way"
    CommsDesign (04/05/04); Mannion, Patrick

    Researchers are working on radical new radio designs that would allow wireless devices to share broad swaths of spectrum with licensed users such as TV broadcasters. Cognitive radio has emerged as the optimal solution, not only adjusting itself to new environments as software-defined radio (SDR) allows, but also learning from the environment and determining how best to serve users. The federal government is also getting involved in promoting new radio schemes; the FCC has moved relatively quickly in recent years to begin changing the existing spectrum allocation structure, or at least to allow new shared-spectrum applications such as ultrawideband. Intel wireless research director Jeffrey Schiffer says the FCC is even looking into using the 6MHz-wide licensed channels in the UHF band, currently allotted to TV broadcasters, for last-mile Internet access. While the data transmission speeds would not reach Wi-Fi levels, Schiffer says such shared spectrum use would be a good technical option for rural areas. FCC chief engineer Ed Thomas explains the technical reasoning for shared spectrum use, saying that only 5% to 10% of the radio spectrum up to 100GHz is in use at any one time, leaving roughly 90GHz for cognitive or software-defined radios to operate in. The technical foundations for cognitive radio are already established, and commercial applications could appear in five years, according to SDR Forum technical committee chairman Bruce Fette. For example, WLANs already use dynamic frequency selection and transmit power control; a simplified sketch of that kind of band selection appears after this item. Software engineering is required to enable filtering, band selection, and interference mitigation. Cognitive radios will not only need to recognize other devices, but also determine how and when to move out of the way of licensed users. Experts say selecting the protocols and communications channels for these functions will mean tough negotiations.
    Click Here to View Full Article
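
    The dynamic frequency selection mentioned above boils down to "listen before you transmit": measure each candidate channel, avoid any that appear occupied by a licensed user, and pick the quietest of the rest. The Python fragment below is a minimal, hypothetical sketch of that decision loop; the channel numbers, the threshold, and the simulated sense() measurement are illustrative stand-ins, not any standard's actual parameters.

    # A minimal, hypothetical sketch of "listen before transmit" band selection.
    # Channel sensing is simulated with random values; a real radio would measure
    # RF energy or decode beacons from licensed (primary) users.
    import random

    CHANNELS = [52, 56, 60, 64]          # example channel numbers
    PRIMARY_USER_THRESHOLD = -62.0       # dBm; above this, assume a licensed user is present

    def sense(channel):
        """Return a simulated received power level (dBm) for a channel."""
        return random.uniform(-95.0, -50.0)

    def pick_channel():
        """Choose the quietest channel whose power stays below the primary-user threshold."""
        readings = {ch: sense(ch) for ch in CHANNELS}
        free = {ch: p for ch, p in readings.items() if p < PRIMARY_USER_THRESHOLD}
        if not free:
            return None                  # every channel looks occupied; stay silent
        return min(free, key=free.get)   # least interference wins

    if __name__ == "__main__":
        ch = pick_channel()
        if ch is None:
            print("defer: all channels busy")
        else:
            print("transmit on channel", ch)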

  • "Workers Asked to Train Foreign Replacements"
    USA Today (04/06/04); Armour, Stephanie

    The effect of globalization on the U.S. IT sector is felt most acutely when workers are asked to train their foreign replacements, a practice that is fairly widespread, according to new survey results from the Washington Alliance of Technology Workers (WashTech). That survey reports that one in five IT workers has trained a foreign worker who took over their job or knows someone else who has done so. Companies say bringing foreign workers over to learn the tasks they will take over allows for a seamless transition, while some economists and policymakers say the issue is being exploited during an election year. Benefits consulting group Hewitt Associates, however, warns that public sentiment against offshoring, exacerbated by laid-off workers training foreign replacements, could be damaging for companies. A March survey showed 30% of employers believed offshoring hurt morale, while 11% believed offshoring damaged brand image. "You feel like you're the guy wearing the red shirt on Star Trek," says former J.P. Morgan Chase contract worker Scott Kirwin, referring to the superfluous characters who often died on the show. Kirwin says he willingly trained his Indian replacement so that he could eke out a few more paychecks. Now a computer consultant, Kirwin started the IT Professionals Association of America, which sells T-shirts on its Web site that read: "My job went to India and all I got was this stupid pink slip." Other workers say their predicament is much worse when faced with training foreign replacements. Former WatchMark-Comnitel software tester Myra Bronstein says that when managers told her and her colleagues about the offshoring of their jobs, it seemed as though she might be fired if she refused to train her replacement. Had she been fired or quit, Bronstein would have been ineligible for unemployment benefits. She says, "It was hideously awkward."
    Click Here to View Full Article

  • "Budgets, Mandates Slow Adoption of E-Voting"
    Washington Technology (03/31/04); Emery, Gail Repsher

    Supporters of electronic voting say security concerns have not stopped state and local governments from embracing e-voting systems. They maintain that most state and local government officials have been more concerned about the cost of the technology and whether they will be able to meet future regulatory mandates. For example, e-voting advocates say the mandate in state and federal legislative proposals that e-voting systems include voter-verified paper records has been counterproductive for e-voting implementation in many states. Voter-verified paper records provide a paper trail for verifying election results, but e-voting supporters say they do not improve the security of e-voting systems. Election officials from Lee County, Fla., and the state of Maryland say they test their e-voting systems before elections to ensure that elections are secure. Most problems involving e-voting systems are related to administrative issues, and the technology problems have an error rate comparable to that of older voting systems. "A lot of people will be surprised that...we have the exact same problems and the same technology people were so embarrassed about in 2000," says Harris Miller, president of the Information Technology Association of America. Around 12% of U.S. counties are expected to use e-voting systems in November.
    Click Here to View Full Article

  • "Helping Business Tool Up for Software Engineering"
    IST Results (04/05/04)

    Component-based software engineering (CBSE) is made easier, faster, and cheaper with the help of a new software toolbox called ECO-ADM, one of the European Commission's Information Society Technologies projects. The ECO-ADM project was coordinated by Centro de Calculo de Sabadell in Spain and is set for commercial release at the end of this year, according to project coordinator Joan Canal. ECO-ADM aims to make CBSE more manageable, providing tools for easy integration of commercial off-the-shelf components, future updates, and component reuse. CBSE is catching on among organizations as a more cost-effective approach to building applications. Under the CBSE discipline, organizations build and reuse software components, plugging them into a larger software framework to complete projects faster and at less cost than with traditional programming; a generic sketch of this plug-in idea appears after this item. Canal says ECO-ADM adds to those benefits by providing a methodology programmers other than the original designers can use to understand the application, as well as an easy way to maintain and update systems. Several European companies have already tested ECO-ADM with positive results, including transport group DHL, which used ECO-ADM to develop its cargo shipment tracing system in just half a day instead of four days. Systems integrator Eidos used ECO-ADM in conjunction with its existing CBSE approach to save 30% in development costs, with half the number of software bugs. Canal says ECO-ADM will be of most benefit to smaller firms that do not have the programming resources available to larger companies.
    Click Here to View Full Article
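
    The article does not describe ECO-ADM's actual interfaces, so the sketch below is only a generic illustration of the component idea it builds on: components implement a shared contract, and a small framework plugs them together so they can be reused or swapped without touching the rest of the application. All class names here are invented.

    # A generic, hypothetical illustration of component-based software engineering:
    # components implement a shared interface and a small framework plugs them into
    # a larger application. Nothing here reflects ECO-ADM's actual API.
    from abc import ABC, abstractmethod

    class Component(ABC):
        """Contract every pluggable component must honor."""
        @abstractmethod
        def process(self, data: str) -> str: ...

    class UppercaseFilter(Component):
        def process(self, data: str) -> str:
            return data.upper()

    class TrimFilter(Component):
        def process(self, data: str) -> str:
            return data.strip()

    class Pipeline:
        """The 'framework': composes reusable components without knowing their internals."""
        def __init__(self, components: list[Component]):
            self.components = components

        def run(self, data: str) -> str:
            for c in self.components:
                data = c.process(data)
            return data

    if __name__ == "__main__":
        app = Pipeline([TrimFilter(), UppercaseFilter()])
        print(app.run("  hello, component world  "))   # -> HELLO, COMPONENT WORLD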

  • "Electronics Makers, Holders of Copyright Fight Over 'Fair Use'"
    Investor's Business Daily (04/06/04) P. A6; Seitz, Patrick

    Consumer electronics makers and consumer advocates say fair use rights are being eroded as new digital technologies such as HDTV emerge. The movie industry and other content owners are allied with Microsoft in enforcing stricter copyright protection than was in place in the pre-digital world. Content owners are seeking to control device manufacturers and smaller companies such as 321 Studios, which is appealing a case brought against it by movie studios. The DVD X Copy product in question allows consumers to make copies of DVDs they own on their computer hard drives, but federal courts in California and New York ruled that the software violates the 1998 Digital Millennium Copyright Act (DMCA), which makes it illegal to distribute tools that circumvent copyright protections. DVD X Copy contains DeCSS code that unscrambles the CSS encryption used on DVDs. 321 Studios has argued its product is legal under fair use rules established in 1981 for VCR technology, and notes that copies made with its software cannot themselves be copied and contain disclaimers. "The people who get screwed are the next Hewlett and Packard, the next Steve Jobs, and the next TiVo," says Electronic Frontier Foundation senior intellectual property attorney Fred von Lohmann. Another battleground is HDTV, for which the Federal Communications Commission has mandated embedded broadcast flags that would limit playback and recording of HDTV shows. Home Recording Rights Coalition Chairman and Consumer Electronics Association head Gary Shapiro says the entertainment industry is worried about peer-to-peer file sharing, but the audio and video formats used for HDTV make shows too large to trade easily online. Some legislators in favor of fair use protection have proposed the Digital Media Consumers' Rights Act to counter previous legal encroachments, including the DMCA.

  • "Net Plan Builds in Search"
    Technology Research News (04/14/04); Patch, Kimberly

    Chinese university researchers have developed an Internet search framework that could one day lead to customized Internet search interfaces that are more effective than general search engines. Huazhong University of Science and Technology researchers say their Domain Resource Integrated System (DRIS) allows for more efficient searches of disparate data repositories, including those not regularly included in general search engine dragnets. The DRIS prototype will be implemented on the China Education Network, which links all Chinese universities, and will enable users to search Web and FTP resources as well as individual databases such as those run by university libraries. DRIS, which utilizes IPv6 features, functions on three levels: the individual domain, where a standard search engine crawls and indexes Web pages; a subnetwork level that indexes the individual domains and provides a search interface; and a top level that indexes the subnetworks and also provides a search interface (a minimal sketch of this tiered fan-out follows this item). This tiered scheme would eliminate network traffic caused by individual Web search engine crawlers and would also allow users to customize their front-end interface and receive tailored results. Results would also be more up-to-date and include far more resources than general Web search engines currently cover. Huazhong University researcher Wang Liang says DRIS may be extended to all of China if it proves successful on the China Education Network, though he has aspirations to see DRIS included in general Internet infrastructure. "The basic idea is that search should be [an] internal function of [the] Internet and everyone should have his own personal intelligent search engine," Liang says.
    Click Here to View Full Article
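
    The three-level structure described above amounts to a query fan-out: a top-level node forwards a query to the subnetwork nodes it indexes, each subnetwork forwards it to its domain-level search engines, and the results are merged on the way back. The Python sketch below illustrates that flow under assumed class names and a naive substring match; it is not the DRIS implementation.

    # A minimal, hypothetical sketch of tiered query fan-out: top level -> subnetworks
    # -> domain-level engines, with results merged on the return path.
    class DomainEngine:
        def __init__(self, name, documents):
            self.name = name
            self.documents = documents          # {title: text} for this domain

        def search(self, query):
            q = query.lower()
            return [(self.name, title) for title, text in self.documents.items()
                    if q in text.lower()]

    class SubnetworkNode:
        def __init__(self, engines):
            self.engines = engines              # domain-level engines it indexes

        def search(self, query):
            hits = []
            for engine in self.engines:
                hits.extend(engine.search(query))
            return hits

    class TopLevelNode:
        def __init__(self, subnetworks):
            self.subnetworks = subnetworks

        def search(self, query):
            hits = []
            for sub in self.subnetworks:
                hits.extend(sub.search(query))
            return hits

    if __name__ == "__main__":
        library = DomainEngine("library.example.edu", {"IPv6 primer": "IPv6 addressing basics"})
        ftp = DomainEngine("ftp.example.edu", {"mirror list": "open source mirrors"})
        campus = SubnetworkNode([library, ftp])
        root = TopLevelNode([campus])
        print(root.search("ipv6"))              # [('library.example.edu', 'IPv6 primer')]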

  • "Supercomputing's Latest Challenge: Keeping Cool"
    Government Computer News (03/31/04); Jackson, Joab

    Participants in the National High Performance Computing and Communications Conference in Newport, R.I., discussed the need to find new ways to keep supercomputers cool. Silicon Graphics chief technology officer Eng Lim Goh says that while processing power continues to soar, data centers are being tested in their ability to cool supercomputer processors that generate an enormous amount of heat. New supercomputer facilities "feel like wind tunnels," says Goh, adding that "air can only carry so much heat." Virginia Tech has taken a hybrid liquid-air approach to cooling its supercomputer, which uses 1,100 Apple G5 processors. Srinidhi Varadarajan, the architect of the Terascale Computing Facility at Virginia Tech, says space limitations prevented the data center from taking the traditional air-cooling approach to reduce heat. Instead, Virginia Tech sends 750 gallons of cooled liquid a minute through pipes underneath the raised floor of the facility, and fans blow the air off the pipes; a rough estimate of how much heat such a flow can carry appears after this item. The system costs less than a million dollars, compared with the $5 million price tag of an all-air cooling system. Japan's Earth Simulator Center uses blasts of air to help cool the supercomputer's 5,100 processors, which also have their own individual cooling units.
    Click Here to View Full Article
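
    As a sanity check on the Virginia Tech figure, the short calculation below estimates how much heat 750 gallons of coolant per minute can carry away. It assumes plain water and a 5 K temperature rise across the loop; neither assumption comes from the article, so treat the result as an order-of-magnitude estimate only.

    # Back-of-the-envelope heat budget for 750 gallons/minute of coolant.
    # Assumes plain water and a 5 K temperature rise across the loop (both assumed).
    GALLON_LITERS = 3.785
    flow_lpm = 750 * GALLON_LITERS          # liters per minute
    mass_flow_kg_s = flow_lpm / 60.0        # ~47 kg of water per second
    SPECIFIC_HEAT = 4186.0                  # J/(kg*K) for water
    delta_t = 5.0                           # assumed coolant temperature rise, K

    heat_watts = mass_flow_kg_s * SPECIFIC_HEAT * delta_t
    print(f"~{heat_watts / 1e6:.1f} MW of heat removed at a {delta_t:.0f} K rise")
    # prints roughly "~1.0 MW", i.e. on the order of a large cluster's power draw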

  • "Robo-Cars Make Cruise Control So Last Century"
    New York Times (04/03/04) P. 1; Hakim, Danny

    Automakers are increasingly adding electronics to cars in order to improve safety as more traditional measures yield diminishing returns. New technologies allow cars to sense their surroundings and avoid collisions with other cars, adjust vehicle settings for increased safety, or even park themselves. Industry executives and engineers say the next decade will endow cars with more and more autonomy in order to increase safety and reduce driving burdens. Car and Driver editor Csaba Csere says the new technologies could lead to a self-driving car in the future, with different technologies guiding the car, including GPS, radar, cameras, and transponders that can communicate with similarly equipped vehicles. Driving the new features are advanced sensory technologies that monitor mechanical components such as brakes, tire pressure, and even steering wheel rotation, as well as surroundings on the road such as driving lanes and other vehicles nearby. Many of the new technologies have already been introduced in luxury cars. The Lexus LS430 sedan comes with a radar hidden behind the grille, sensing possible collisions and tightening seat belts, lowering the suspension, and increasing braking pressure in response. Infiniti says its 2005 FX sport utility vehicle comes with a camera in the rearview mirror that will alert drivers when they drift out of their lane on the freeway; Honda offers a similar feature in Japan that actually steers the car back on course. With "adaptive" cruise control, luxury models adjust the cruising speed according to the speed of the car in front; a toy sketch of this behavior appears after this item. Not all the new technology is going into luxury cars, though, as Ford's Freestar minivan includes a flexible weight scale in the front passenger seat that turns off the airbag if a small child is seated there. With more parts and software on board, vehicle recalls are becoming more common than in the past, though automakers say the added safety is worth the cost.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
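
    Adaptive cruise control, as described above, keeps the driver's set speed until the gap to the car ahead shrinks below a safe following distance, then matches or undercuts the lead car's speed to restore the gap. The toy Python function below captures that logic; the time gap, the 0.5 m/s speed trim, and all numbers in the example are invented for illustration and are not any automaker's control law.

    # A toy sketch of adaptive cruise control: hold the set speed, but follow the
    # lead vehicle when the gap falls below a minimum time gap. Illustrative only.
    def adaptive_cruise_speed(set_speed, lead_speed, gap_m, own_speed, min_time_gap_s=2.0):
        """Return the target speed (m/s) for the next control step."""
        desired_gap = own_speed * min_time_gap_s
        if gap_m < desired_gap:
            # Too close: follow the lead car, shaving a little speed to reopen the gap.
            return min(set_speed, lead_speed) - 0.5
        if gap_m < 2 * desired_gap:
            # Comfortable but closing: do not exceed the lead car's speed.
            return min(set_speed, lead_speed)
        # Plenty of room: cruise at the driver's set speed.
        return set_speed

    if __name__ == "__main__":
        # Driver set 33 m/s (~120 km/h); lead car doing 28 m/s, 45 m ahead.
        print(adaptive_cruise_speed(set_speed=33.0, lead_speed=28.0, gap_m=45.0, own_speed=30.0))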

  • "The Myth of the Secure Operating System"
    TechNewsWorld (04/03/04); Halperin, David

    No operating system or software program is inherently more secure than another, says Yankee Group analyst Laura DiDio, who says all programs connected to the Internet are at risk. Nevertheless, recent bulletins from U.K.-based security consultancy mi2g Intelligence Unit suggest that there has been an increase in direct attacks--as opposed to worms or viruses--on Linux-based servers, while Microsoft platforms are the target of fewer direct attacks, and the open-source BSD (Berkeley Software Distribution) family and the Darwin-based Mac OS X are the safest and most secure online server operating systems. DiDio says administration problems are a big reason why Linux is having security issues, which she blames on a lack of training and knowledge; vulnerable third-party applications running on Linux servers are often problematic. DiDio says the human element of administering a fortress-like system correctly must be taken into consideration. However, security consultant Richard Forno, author and former chief security officer at Network Solutions, believes OS X has a safer architecture, in that installed applications do not patch the kernel at low levels, and because it has Unix underpinnings. Mi2g's findings raise the question of whether OS X and BSD appear safer simply because, being far less prevalent than Windows, they are not popular targets. The high level of connectedness is a greater concern for DiDio because a successful local attack can spread quickly to a wider area. DiDio notes that there are "a lot of interconnected networks globally," and that "in today's networked environment, the most important parameter is the popularity and connectivity of the operating system."
    Click Here to View Full Article

  • "The Pervasive Computing Community"
    P2PNet.net (04/01/04)

    In the future, people will interact with hundreds of tiny computers embedded in their environment in natural and convenient ways, say researchers from the Cambridge-MIT Institute (CMI), a strategic alliance between Cambridge University in the United Kingdom and the Massachusetts Institute of Technology's Computer Science and Artificial Intelligence Lab. Computer users today, however, are still slaves to their machines, forced to learn special languages, mannerisms, and artificial inputs such as the keyboard and mouse. Computers themselves are kept in air-conditioned rooms and require expert maintenance. The CMI's Pervasive Computing Community will bring together academic and industrial resources to help get from today's situation to a truly networked wireless world where computers genuinely enhance people's quality of life. The CMI has already invested 800,000 British pounds in pervasive computing research projects and is now putting 2.2 million British pounds toward establishing the Pervasive Computing Community, with more investment expected to come from future partners. Researchers involved say computers have already advanced rapidly in terms of miniaturization, power, and networking capabilities, but there are still significant barriers to overcome before computers can be used easily in everyday activities such as education, work, and play. Security and privacy will become even greater issues once networked computers are embedded in the environment, for example, and smaller, lower-power batteries will be needed to shrink devices further. With such tiny computers, new ways of interacting with them will have to be developed, such as speech recognition and computer vision. MIT researcher and Community leader Victor Zue says the research will enable a nomadic lifestyle where information is instantly available no matter the location.
    Click Here to View Full Article

  • "World Resists One-Size-Fits-All Web Laws"
    Toronto Star (04/05/04); Geist, Michael

    A new global survey of 277 companies indicates that the issue of Internet jurisdiction is considered a greater risk by businesses in North America than it is in Europe and Asia, writes Internet legal expert Michael Geist. Businesses based in Canada and the U.S. are increasingly changing their approaches to Internet usage in response to conflicting national laws governing Internet jurisdiction. In 2003, the American Bar Association's Cyberspace Law Committee worked with the International Chamber of Commerce and the Internet Law and Policy Forum to conduct an international study assessing the impact of Internet jurisdiction issues on business practices. They found that while many North American companies feel that Internet jurisdiction has become a problematic issue that will likely worsen in the next year, Asian and European companies generally reported a decline in jurisdiction issues and predicted further improvement by 2005. Companies that expressed concern over Internet jurisdiction said litigation risk was their primary fear, followed by regulation, taxation, and privacy issues. Some of these companies are now employing a variety of methods to mitigate these risks, such as using country-code domain names, including legal language on their Web sites, and avoiding certain jurisdictions that are likely to cause conflict. Companies are also employing technical approaches to thwart interactions with "risky" jurisdictions, including blocking access to Web sites or requiring registration; some firms use geo-location technologies, but that is still rare. Geist says the rise in risk-aversion practices points to big changes for the online businesses of tomorrow.
    Click Here to View Full Article

  • "WPFC Releases Position Paper on Internet Governance"
    World Press Freedom Committee (04/01/04)

    On March 10, 2004, the World Press Freedom Committee (WPFC) released a position paper on Internet governance in preparation for the second World Summit on the Information Society (WSIS), which will be held in Tunis, Tunisia, in November 2005. Some countries have proposed that the United Nations (UN) take over a measure of responsibility for Internet governance. The WPFC position paper advocates that ICANN retain some sort of role in any new governance system led by the UN. ICANN has refrained from weighing in on controversial Internet content such as hate speech, pornography, and fraud, but some governments seem to prefer the approach outlined by Europe's Cybercrime Convention. The WPFC position paper argues that the Internet should remain free and open, noting that the leadup to the first WSIS showed that the world's authoritarian governments would be more than happy to place international restrictions on Internet content. The position paper argues that any changes in the Internet governance system should be guided by five principles, the first of which prohibits content controls or changes to the Web's architecture that would allow censorship, and the second of which calls for a direct commitment to Article 19 of the Universal Declaration of Human Rights. Likewise, censorship should not be permitted under the guise of ethical considerations, and normal forms of communication, such as news, should be differentiated from subjects like fraud, pedophilia, hate speech, and conspiracy for terrorism. Lastly, all legal actions should be resolved in the jurisdiction of the dispute's origination, or in any forum agreed upon by the disputing parties.

  • "Frontline Defenders"
    Computerworld (03/29/04) Vol. 32, No. 13, P. 23; Verton, Dan

    Symantec manages the frontline computer defenses for 600 companies at its Security Operations Center (SOC) in Alexandria, Va.; dozens of analysts and security engineers rotate shifts, monitoring and investigating disturbances in global Internet activity and sending customers alerts within 15 minutes of a major new outbreak. SOC manager Grant Geyer says the main benefit of his company's offering is the ability to tap into such a large analytical operation and gain from the experience of other clients. Signing on for managed security services is like joining a neighborhood watch program, where suspicious activity observed in one area immediately puts everyone else on alert. Symantec not only monitors global Internet traffic, but also tracks network attacks directed against its customers in real time; if attacks penetrate security devices such as the firewall, Symantec employees send an alert to the customer with detailed information about the intrusion and specific recommendations on how to deal with it. "We bring in the source IP address, the source port, the destination IP address, destination port, protocol, and the rule option," says Symantec cybersecurity analyst Tim Hillyard, describing how each of the hundreds of daily attacks is detailed; a minimal sketch of such an alert record appears after this item. If closer inspection is required, analysts can pull up raw data captured by the customer's intrusion-detection device. Analysts and engineers at the SOC can also take action on behalf of customers, configuring customers' firewalls and intrusion-detection systems, for example. ID tokens are used to authenticate communication between Symantec and clients, including phone conversations, and global security analyst Corilynn Arnold says another benefit is getting advance notice of new threats weeks before software firms announce vulnerabilities. This is possible because Symantec scrutinizes aberrations in global Internet activity, looking at the forest as opposed to the individual trees of customer networks, says Arnold.
    Click Here to View Full Article
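
    Hillyard's list of fields maps naturally onto a small record type. The sketch below shows one plausible shape for such an alert; the class, field names, and example values are hypothetical and are not drawn from Symantec's systems.

    # A minimal sketch of the kind of alert record the analyst describes: source and
    # destination addresses and ports, protocol, and the rule that fired.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass
    class IntrusionAlert:
        timestamp: datetime
        src_ip: str
        src_port: int
        dst_ip: str
        dst_port: int
        protocol: str
        rule: str           # the rule option that matched on the customer's device

        def summary(self) -> str:
            return (f"[{self.timestamp:%Y-%m-%d %H:%M:%S}] {self.protocol} "
                    f"{self.src_ip}:{self.src_port} -> {self.dst_ip}:{self.dst_port} "
                    f"({self.rule})")

    if __name__ == "__main__":
        alert = IntrusionAlert(datetime.now(timezone.utc), "198.51.100.7", 4444,
                               "192.0.2.10", 80, "TCP", "WEB-IIS cmd.exe access")
        print(alert.summary())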

  • "IBM, Dutch Scientists to Explore First Moments of Universe"
    Enterprise Networks and Servers (03/04) Vol. 10, No. 3, P. 3

    IBM and Dutch astronomy organization ASTRON say they will use IBM's Blue Gene/L supercomputer technology in developing a new kind of radio telescope that will examine the beginnings of stars and galaxies not long after the formation of the universe. The Blue Gene/L system, expected to be one of the world's fastest supercomputers, should be completed by the middle of next year, and will provide ASTRON with the speed and flexibility it requires to collect and analyze data from its Low Frequency Array (LOFAR) software telescope network. LOFAR is essentially a large number of simple FM-band radio antennas harnessed by the Blue Gene/L computer as a single telescope. "The challenge of processing unprecedented data volumes requires us to raise our technology to a new level," says IBM director William Pulleyblank. "Blue Gene/L provides the flexibility and power to enable us to meet this grand challenge." ASTRON will harness over 10,000 radio antennas and interpret their signals through high-speed calculations, detecting radio waves that show the universe as it was 13 billion years ago; a toy illustration of how such an array can be combined in software appears after this item. ASTRON director Professor Harvey Butcher believes the collaboration will pave the way for more applications in geophysics and precision agriculture. The Netherlands Ministry of Education, Culture, and Science is supporting the development of the technologies for the telescope, and a wide variety of research groups, universities, and companies are participating in the project. Butcher says, "Discovery in astronomy goes hand in hand with innovation in technology."
    Click Here to View Full Article
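
    The article says only that the antenna signals are combined "through high-speed calculations." One textbook way to turn many simple antennas into a single steerable telescope is delay-and-sum beamforming, illustrated by the toy NumPy sketch below; the array geometry, frequency, and signal model are invented, and this is not ASTRON's actual processing pipeline.

    # Toy delay-and-sum beamformer: phase-align and sum the antenna signals so the
    # array "points" in a chosen direction. Illustrative of the general technique only.
    import numpy as np

    C = 3.0e8                      # speed of light, m/s
    FREQ = 60e6                    # 60 MHz, within LOFAR's low-frequency range
    N_ANT = 16
    SPACING = 2.5                  # meters between antennas in a line

    positions = np.arange(N_ANT) * SPACING

    def steer_weights(angle_deg):
        """Phase weights that align signals arriving from the given angle."""
        delay = positions * np.sin(np.radians(angle_deg)) / C
        return np.exp(-2j * np.pi * FREQ * delay)

    def beam_power(signals, angle_deg):
        """Sum the antenna signals with steering weights and return the average power."""
        combined = np.sum(steer_weights(angle_deg)[:, None] * signals, axis=0)
        return float(np.mean(np.abs(combined) ** 2))

    if __name__ == "__main__":
        # Simulate a plane wave arriving from 20 degrees off boresight.
        t = np.arange(1024) / 200e6
        true_delay = positions * np.sin(np.radians(20.0)) / C
        signals = np.exp(2j * np.pi * FREQ * (t[None, :] + true_delay[:, None]))
        for angle in (0.0, 10.0, 20.0):
            print(f"power toward {angle:>4.1f} deg: {beam_power(signals, angle):.1f}")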

  • "My Avatar, My Self"
    Technology Review (04/04) Vol. 107, No. 3, P. 50; Kushner, David

    Simulated worlds such as There and Linden Lab's Second Life are environments where people can socialize through digital avatars, but accommodating increasing numbers of users in these ever-expanding simulations is a constant challenge. There, which was put together by a clique of programmers, Hollywood animators, and Silicon Valley tycoons at a cost of about $37 million, provides seamless interaction among users via software that tracks all objects as a series of points in time, rendering user experiences on an individual basis to convince participants that they are synchronized; There's servers segment the virtual world into portions that are streamed to the user's computer, using population density as the model. Continuity in Second Life is provided by "thin client" software that subscribers download, while the rest of the data that goes into the simulation resides on a server grid in San Francisco. Linden Lab founder and CEO Philip Rosedale has invented a 3D streaming technology that facilitates real-time rendering of all objects in Second Life over the Internet, while a physics engine mimics real-world interaction between objects and avatars. Software partitions Second Life into 6.5-hectare tiles, each of which is maintained by a Pentium 4 computer running Linux; as a user moves from tile to tile, each individual tile's server provides data to smooth the transition (a toy sketch of this tile-to-server mapping appears after this item). Both There and Second Life employ technology that allows avatars to socialize, such as preprogrammed gestures to reflect the user's moods. The simulations also enable participants to make the most of the worlds, such as by designing and purchasing virtual clothes, toys, and accessories--with virtual money, of course--that their avatars can use to increase their status.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
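
    The tile scheme described above is easy to picture as a mapping from world coordinates to the server that owns each square. The toy sketch below uses a 256 m square (roughly 6.5 hectares) and an invented host-naming scheme; it illustrates the partitioning idea only and is not Linden Lab's code.

    # A toy sketch of tile-based world partitioning: each square tile is owned by one
    # server, and crossing a tile boundary hands the avatar off to the neighboring server.
    TILE_METERS = 256            # ~6.5 hectares per square tile

    def tile_for(x: float, y: float) -> tuple[int, int]:
        """Map a world coordinate (meters) to the tile that owns it."""
        return int(x // TILE_METERS), int(y // TILE_METERS)

    def server_for(x: float, y: float) -> str:
        tx, ty = tile_for(x, y)
        return f"sim-{tx}-{ty}.example.net"   # hypothetical host naming

    def handoff_needed(old_pos, new_pos) -> bool:
        """True when a move crosses a tile boundary and a neighboring server takes over."""
        return tile_for(*old_pos) != tile_for(*new_pos)

    if __name__ == "__main__":
        a, b = (250.0, 40.0), (260.0, 40.0)   # avatar walks east across a boundary
        print(server_for(*a), "->", server_for(*b), "handoff:", handoff_needed(a, b))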


