Association for Computing Machinery
Timely Topics for IT Professionals

ACM TechNews (HTML) Read the TechNews Online at: http://technews.acm.org
June 8, 2007


Welcome to the June 8, 2007 edition of ACM TechNews, providing timely information for IT professionals three times a week.


HEADLINES AT A GLANCE:

 

Could U.S. Repel a Cyberattack?
Christian Science Monitor (06/07/07) P. 1; Arnoldy, Ben; Lubold, Gordon

The two-week cyberattack against Estonia that flooded government Web sites, shut down a bank's online services, and slowed Internet services across the country provided U.S. defense officials with a real-life example of what could happen if the United States' Web infrastructure were attacked. While Estonia reacted well, experts say the U.S. may be more likely to suffer mass disruptions of banking, telecommunications, and government services due to a lack of coordination, funding, and centralized authority. Protecting the nation from a cyberattack requires extensive coordination between the government and the private sector and expensive research and preparation, but US-CERT, the small group within the Department of Homeland Security (DHS) that is responsible for such efforts, is underfunded and holds little authority, experts say. "The part of the U.S. government that has responsibility for this doesn't have the authority to command attention from within other parts of the government, and it doesn't have the money to get the work done that is on its plate," says cybersecurity expert Bill Woodcock, who traveled to Estonia to help during the attack. Jerry Dixon, acting director of the DHS' National Cyber Security Division, which runs US-CERT, says the situation is improving, citing the increased number of incident reports from the private sector and from government agencies reporting suspicious Internet activity, but that a great deal of work is still needed, particularly in developing state-level preparedness efforts and in preparing for a simultaneous attack against several major networks.


Computer Expert Urges Identity Verification Safeguards for Employee Eligibility Systems
AScribe Newswire (06/07/07)

At a Congressional hearing on Thursday focusing on security and privacy issues affecting efforts to verify employee eligibility, Peter G. Neumann, representing ACM's U.S. Public Policy Committee, testified that the systems requiring employers to submit identifying information on current and potential employees, as outlined in pending legislation, contain many risks. Neumann, an ACM Fellow and Principal Scientist in the Computer Science Laboratory at SRI International, urged Congress to develop incentives for system operators and employers to meet the goals of U.S. immigration laws mandating employee eligibility verification while simultaneously minimizing privacy and security risks to individuals. Employee Eligibility Verification System (EEVS) expansion is tied to several bills in the House and Senate proposing national systems to verify employment eligibility. Neumann said the computer database applications required by these bills are vulnerable and risk exposing the system and the data. Neumann presented a detailed set of recommendations to ensure the verification system is designed, constructed, and operated securely. "These potential pitfalls to security, integrity and privacy must be anticipated from the beginning and reflected throughout the design, implementation, and operation of the systems planned to implement the EEVS expansion," Neumann said. "We should not expect easy technological answers to inherently difficult problems." Neumann warned that information sent and stored in EEVS would include primary personal identifiers and that any compromise, theft, leak, destruction, or alteration would have severe consequences, including identity theft and impersonation. "Privacy and security are inextricably linked," Neumann said. "One cannot ever guarantee complete privacy, but the difficulties are severely complicated by systems that are not adequately secure."


Designers Pitch 'Wild and Crazy' Ideas at DAC
EE Times (06/06/07) Mokhoff, Nicolas

Several encouraging ideas for computer architectures and design methodologies that have not yet reached the technical-paper stage were proposed during the "Wild and Crazy Idea" session at this week's Design Automation Conference. Stanford University researcher Alex Solomatnikov discussed the prospects of having a chip multiprocessor generator act as a flexible, universal computing platform that goes beyond the microprocessor. Designers would configure and program the flexible computing framework to run applications at a desired performance level, after which the system would compile the program and configuration, adapting the original framework into a chip for the specific applications. "Thus, the user gets the reduced development costs of using a flexible solution with the efficiency of a custom chip," explained Solomatnikov. Rice University researcher Farinaz Koushanfar said EDA tools could be used to integrate unique identification keys into gate-level circuits as security protocols. "In the near future, the key design dilemma will be providing security solutions that would cover all aspects of the design--from design reuse methodology, to architecture and to implementation," said Koushanfar. Software piracy could be addressed by using secure IDs to make software that runs only on a specific IC. Also, Blaze DFM researcher Puneet Gupta said line-end shortening (LES) could lead to faster devices, and Columbia University researcher Stephen Edwards addressed the possibility of a "precision timed" (PRET) machine.


Goodbye Wires...
MIT News (06/07/07) Hadley, Franklin

Massachusetts Institute of Technology researchers have experimentally demonstrated a wireless power transfer method that they say could one day be used to charge laptops, cell phones, MP3 players, and other electronic devices. The solution the MIT researchers developed, known as WiTricity, used coupled resonant objects to safely and efficiently light a 60-watt light bulb using a power source seven feet away, with no physical connection between the source and the light. The MIT researchers focused on magnetically coupled resonators. The researchers were able to identify the strongly coupled regime in the system, even when the distance between the resonant objects was several times larger than the objects themselves. Magnetic coupling is particularly suitable for real-world applications because magnetic fields interact weakly with most common materials, including biological organisms. The design used two copper coils, each a self-resonant system. The coil attached to the power source fills the space around itself with a non-radiative magnetic field oscillating at MHz frequencies. The non-radiative field controls the power exchange with the receiving unit's coil, which is specially designed to resonate with the field. The resonant process ensures strong interaction between the two units and a weak interaction with the rest of the environment. The majority of the power not picked up by the receiving coil remains in the non-radiative field near the source, preventing that energy from being lost. Smaller resonant objects would have a shorter energy transfer range, but laptop-sized coils would be able to send power over room-sized distances efficiently and almost omni-directionally.
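In the standard coupled-mode picture behind this kind of scheme (the notation here is conventional, not taken from the article), the "strongly coupled regime" the researchers identify corresponds to the condition

```latex
\frac{\kappa}{\sqrt{\Gamma_s \Gamma_d}} \gg 1
```

where \kappa is the rate of energy exchange between the source and device coils and \Gamma_s, \Gamma_d are their intrinsic loss rates. When this ratio is large, energy moves between the coils faster than it dissipates in either one, which is what permits efficient transfer even when the separation is several times the coil size.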


Grant Applications to Attend SC07 Due June 29
HPC Wire (06/06/07)

SC07, to be held Nov. 10-16 in Reno, Nev., will include the Broader Engagement initiative, which aims to increase the involvement of individuals from groups that have traditionally been underrepresented in high performance computing and networking. The initiative, part of the conference sponsored by ACM and IEEE, provides grants to support participation in the SC07 Technical Program, which encourages technology submissions, and helps participants build professional networks through a formal mentoring program and informal contacts at the conference. Applications for grants to participate in the SC07 Technical Program through the Broader Engagement initiative will be accepted through June 29. Grant recipients will receive complimentary registration and reimbursement for lodging and travel expenses, up to an agreed amount. Applications are encouraged from anyone in computing-related disciplines, including those with backgrounds in research, education, and industry, but students in particular are encouraged to apply. Primary consideration will be given to applicants from demographics that have traditionally been underrepresented in high performance computing, such as African-Americans, Hispanics, Indigenous People, and women. Successful grant applicants will have a proven background related to high performance computing, networking, storage, or analysis, demonstrate keen interest in integrating innovative technologies into the curriculum or work environment, and commit themselves to taking full advantage of this unique opportunity to meet international leaders in the field of high-performance computing and networking.


Researchers Chart Internet's 'Black Holes'
Wired News (06/07/07) Singel, Ryan

University of Washington computer science Ph.D. candidate Ethan Katz-Bassett and his project partner Harsha V. Madhyastha said more than 10 percent of the Internet disappears every day. At a meeting of the North American Network Operator's Group in Bellevue, Wash., Katz-Bassett introduced Hubble, a network of deep cyberspace probes scattered across the Internet. Over the course of two weeks, Hubble sampled 1,500 Internet prefixes, small subsections of the Web, every 15 minutes and found that 10 percent of those prefixes could not be reached from certain areas of the Internet. Katz-Bassett said that sometimes certain areas of the Internet were completely unreachable, and at other times traffic originating from particular areas of the Internet disappeared into a "routing black hole." This phenomenon occurs when packets being sent from one computer to another, such as an email or a request for a Web page, are somehow diverted to the wrong location and are lost forever. "We've found a lot more reachability problems than we expected to see, with some prefixes being unreachable from several vantage points across multiple days," Madhyastha said. The two researchers hope to build a tool that will track these "black holes" in real time by monitoring the exchange that occurs between routers about the best route for traffic, and by building a permanent system of remote sensors that can send out signals, or pings, from various point across the Internet. "A single unresponsive ping is likely to mean there are widespread problems," Katz-Bassett said. The system that Katz-Bassett hopes to build over the summer would treat an answered ping as a canary in a coal mine, and instantly send multiple probes to the area.
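The partial-reachability signature described above can be sketched in a few lines. This is a hypothetical illustration, not the Hubble code: the function name, data layout, and vantage-point names are all invented here. A prefix that answers pings from some vantage points but not others is a candidate "black hole," while a prefix unreachable from everywhere may simply be offline.

```python
def find_black_holes(probe_results):
    """Flag prefixes with partial reachability.

    probe_results maps each Internet prefix to {vantage_point: bool},
    where True means a ping from that vantage point was answered.
    Returns (prefix, unreachable-vantage-points) pairs for prefixes
    reachable from at least one vantage point but not from all of them.
    """
    suspects = []
    for prefix, pings in probe_results.items():
        answered = [v for v, ok in pings.items() if ok]
        unanswered = [v for v, ok in pings.items() if not ok]
        if answered and unanswered:  # partial reachability: the black-hole signature
            suspects.append((prefix, sorted(unanswered)))
    return suspects

# Example round of probes (documentation prefixes, invented vantage points).
results = {
    "192.0.2.0/24":    {"seattle": True,  "berlin": False, "tokyo": True},
    "198.51.100.0/24": {"seattle": True,  "berlin": True,  "tokyo": True},
    "203.0.113.0/24":  {"seattle": False, "berlin": False, "tokyo": False},
}
print(find_black_holes(results))  # → [('192.0.2.0/24', ['berlin'])]
```

Note that the fully unreachable prefix is deliberately not flagged; as the article says, total unreachability is a different failure mode from traffic vanishing only along some paths.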


Online Shoppers Will Pay Extra to Protect Privacy, Carnegie Mellon Study Shows
Carnegie Mellon News (06/06/07) Spice, Byron; Watzman, Anne

A new Carnegie Mellon University study shows that online consumers are willing to pay more when shopping at an online retailer with an understandable and strong privacy policy. Participants in the study used a Carnegie Mellon search engine called Privacy Finder, which automatically evaluates a Web site's privacy policy and displays the results on a search results page. The study found that people were more likely to buy from online merchants with good privacy policies, as identified by Privacy Finder, and were willing to pay about 60 cents more on a $15 purchase when shopping at a site with a privacy policy they liked. The study, led by Lorrie Cranor, director of the Carnegie Mellon Usable Privacy and Security Lab, is the first to indicate that online consumers are willing to pay a premium to protect their privacy. "People can't act on information that they don't have or can't understand," Cranor says. Privacy Finder is a search engine Cranor and her students developed to address the problem. Privacy Finder uses the Platform for Privacy Preferences (P3P), a technical standard used to create machine-readable privacy policies. Cranor says that about 10 percent of all Web sites, more than 20 percent of e-commerce sites, and about one-third of the top 100 most-visited sites use P3P. Privacy Finder automatically reads and evaluates the policies of Web sites that use P3P, and displays the information as a series of colored squares that indicate if the site's policies match the user's preferences.


Patent Office to Test Peer Review of Computer Tech Patents
Computerworld (06/07/07) Rosencrance, Linda

The U.S. Patent and Trademark Office announced the launch of the Peer Review Pilot, a project that is intended to improve the process of examining applications for computer technology patents. The pilot project, scheduled to start June 15 and run for one year, allows computer technology experts to submit technical references related to the claims of a published patent application before an examiner reviews the application. "Studies have shown that when our patent examiners have the best data in front of them, they make the correct decision," says patent office director Jon Dudas. "Examiners, however, have a limited amount of time to find and properly consider the most relevant information. This is particularly true in the software-related technologies where the code is not easily accessible and is often not dated or well-documented." Technical experts will review and submit information on up to 250 published patent applications. Patent applicants who volunteer for the project will be required to give their permission for the patent office to collect comments, because the current law prohibits the public from submitting commentary without the permission of the applicant, according to a statement released by the patent office.


My Car Is Watching Me
Associated Press (06/06/07) Pearson, Ryan

Researchers and automakers are developing steering wheel and dashboard-mounted sensors that monitor a driver's face for insight into the driver's mood and awareness, in the hopes of preventing accidents. The system works by using a computer algorithm to map the body and facial features of the driver, primarily the eyes and mouth. The distances between those points change should the driver become sleepy, distracted, or even drunk; telltale physical changes include leaning backward, a slackening mouth, and widening or narrowing eyes. Sensors gather the information and combine it with data from the vehicle, including speed and hand position on the steering wheel. Jeremy Bailenson, director of Stanford University's Virtual Human Interaction Lab, says that with enough advance warning, either the car or the driver will hopefully be able to prevent an accident. Bailenson says his research found that a system of algorithms and sensors could increase the chance of predicting an accident by about 30 percent. Part of the research at Stanford examines how humans respond to and interact with speaking computers, such as the ones that may eventually warn drivers. Stanford professor Clifford Nass says that people treat technologies like people. "So you have to use the tricks that people would use with other people to help them do the right thing," he says. Nass says that by being able to recognize emotion and context through facial recognition technology, cars can respond in appropriate and helpful ways. For example, if a driver is drowsy, the car may start an audio program that teaches a language, forcing the driver to be more attentive. Nass predicts that such technology will be included in consumer-ready cars within five years, at least in Japan.
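The landmark-distance idea can be reduced to a toy heuristic. Everything in this sketch is invented for illustration (the function name, the 0.6 ratio, the three-frame window, the arbitrary units); it is not the researchers' algorithm, only the shape of one: track a distance such as eye opening against an alert-driver baseline and flag sustained drops.

```python
def drowsy_frames(eye_openings, baseline, ratio=0.6, window=3):
    """Return indices of frames where the measured eye opening has stayed
    below ratio*baseline for at least `window` consecutive frames.

    eye_openings: per-frame eye-opening measurements (arbitrary units).
    baseline: the same measurement taken while the driver is alert.
    """
    flagged = []
    run = 0  # length of the current below-threshold streak
    for i, opening in enumerate(eye_openings):
        run = run + 1 if opening < ratio * baseline else 0
        if run >= window:
            flagged.append(i)
    return flagged

# Eyes narrow for four straight frames starting at index 2; the alarm
# condition is met from the third low frame onward.
print(drowsy_frames([10, 9, 5, 5, 5, 5, 10], baseline=10))  # → [4, 5]
```

A real system would fuse several such signals (mouth shape, posture, steering-wheel grip, speed) rather than rely on one distance, which is exactly the combination of face and vehicle data the article describes.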


New Record for Quantum Cryptography
Technology Review (06/08/07) Savage, Neil

The dream of secure communications based on quantum cryptography came a little closer to realization when a team of European researchers successfully transmitted a quantum encryption key across a distance of 144 kilometers--a new record--from a laboratory on La Palma in the Canary Islands to an observatory on Tenerife. The scientists used the phenomenon of quantum entanglement, in which two photons are bound together so that they mirror each other's actions, to create the key. A powerful laser beam was focused through a crystal, resulting in the generation of two entangled photons for every photon that was injected into the crystal. One half of each entangled pair was bounced off a mirror to a light detector on La Palma, while the other photon was routed through a lens to be captured by a telescope on Tenerife and transmitted to another detector. If the signal's reach can be extended just slightly, the transmission of quantum-encrypted data around the world via satellite will become feasible. The researchers' work was detailed in the June 3 edition of the online journal Nature Physics. The scientists are members of SECOQC, a European consortium of about 20 groups developing secure quantum communication, and they are planning a test of a secure system in Europe in 2008.


Net Attack
Wall Street Journal (06/05/07) Mannes, Aaron; Hendler, James

University of Maryland Ph.D. student Aaron Mannes and Rensselaer Polytechnic Institute computer science professor James Hendler warn that the cyberwarfare era is upon us, as evidenced by numerous incidents that include a February assault on six of the 13 "root servers" at the heart of the Internet's domain name system. Such attacks threaten the global economy, and signify the pressing need to strengthen the Internet against criminals. The authors note similarities between various politically charged online attacks, such as the defacing or shuttering of prominent Estonian commercial and government Web sites that followed the relocation of a Soviet World War II memorial in April. These disruptions, as well as the strike against the Internet root servers, take the form of Distributed Denial of Service (DDoS) attacks, in which malware is installed on a computer and directed to swamp a targeted system with messages, which can be crippling when such floods are unleashed en masse by large networks known as botnets. DDoS attacks are becoming more frequent because the tools to launch them are easy to acquire and use, and they are difficult to trace given the global scope of botnet networks. Still, breaching a system to pilfer information or launching an assault that targets real-world infrastructure requires a hacker of substantially greater skill, and Mannes and Hendler note that the few publicly disclosed incidents in this vein have been perpetrated by insiders. But although botnets lack the means to technically hamstring the Internet, they are threatening its trustworthiness and openness through the dissemination of malicious software and spam. The authors point out that establishing international standards to address cybercrime while defending civil liberties is a continuing challenge, but even more formidable is coaxing countries to comply with these standards through the implementation and enforcement of anti-cybercrime laws.


Scientists Discuss Use of DNA and Information Technology
Memphis Commercial Appeal (TN) (06/08/07) Connolly, Daniel

The 13th International Conference on DNA Computing, taking place this week at the University of Memphis, featured discussions from computer scientists from around the world on the next generation of computers and medicines. The conference is based around the idea that nature is better at creating complex systems than human beings. DNA computing was created in 1994 at the University of Southern California when computer scientist Len Adleman wrote a paper outlining his efforts to use biological methods to solve the traveling salesman puzzle, which requires finding the shortest route between several cities. Adleman used a complex method utilizing snippets of DNA to solve the problem, and while scientists have concluded that DNA is not the most efficient way to solve mathematical problems, biological elements could still be used in a process called "self-assembly," according to University of Memphis computer science professor and event organizer Max Garzon. He says proteins and other biological items could be combined to create complex, tiny machines. "The holy grail, if you wish, is to build computational devices, intelligent devices, out of DNA molecules and other molecules," Garzon says. Duke University graduate student Urmi Majumder, who spoke at the conference about her research, says, "What you're doing is copying nature and building stuff from the bottom up."


UCF Researchers Studying How Virtual Reality Can Influence Fire Policies
University of Central Florida (06/06/07) Binette, Chad

University of Central Florida researchers are developing a study to examine whether interactive, virtual reality simulations of wildfires will make residents more willing to invest in fire prevention. The interactive simulation will depict a wildfire spreading through Volusia County, Fla. Participants will decide how much they want to invest in prescribed burns and insurance, and their choices will be compared to decisions made by people receiving only written information about the danger of wildfires. Economic researchers Glenn Harrison and Elisabet Rutstrom, electrical engineering and computer science researcher Charles Hughes, philosophy researcher Stephen Fiore, and the Institute for Simulation and Training hope the project will demonstrate how virtual reality can be used as an effective public policy tool that allows residents to experience the long-term effects of economic and political decisions. "This technology could empower ordinary citizens to make decisions that may be comparable in quality to experts' and save society from making bad decisions," Fiore says. The National Science Foundation provided $680,000 for the project, which is scheduled to start simulations within six months. The entire study will take about two years, but the first results should be available at the end of 2007. Participants will experience 30 years simulated over the course of an hour and will control how they view the environment, such as flying over the forest, walking through it, or being guided on a predefined path. Hughes says he believes these types of simulations may become common in museums, classrooms, and other venues, especially because the cost of the technology required for such simulations has dropped significantly in recent years.


Interactive Paper Sounds Exciting
New Scientist (06/04/07) Simonite, Tom

The next generation of paper will be interactive, according to researchers in Sweden. Mikael Gulliksson, a researcher at Mid Sweden University in Sundsvall, and colleagues have prototype billboards on display at the university that play clips of music and comedians' spoken dialogue when touched. The researchers used conductive inks containing silver particles to print touch sensors and speakers onto the billboards' paper material, and the interactive paper is said to be inexpensive to assemble and easy to recycle. "We've used the roll-to-roll methods used by industry to process paper materials," says Gulliksson. The two-meter-high billboards are built on a 3-centimeter-thick back layer of Wellboard; a sheet of paper screen-printed with conductive ink sits on that base, a middle conductive layer connects to the power supply and simple microelectronics, and a second sheet carrying the billboard design goes on top. The touch sensors are formed from a fine pattern of conductive lines that sense current flow, while the speakers are electromagnets printed in conductive ink and stretched over a cavity, like a speaker cone, behind the billboard. "The result looks and feels like paper but has electronic, interactive features," says Gulliksson.


Japanese Robot Likes Sushi, Fears President
Reuters (06/05/07)

A research team at the School of Science and Technology at Japan's Meiji University has built a robot that is able to make 36 different facial expressions when it hears certain words. Kansei can display expressions ranging from happiness to sadness to anger to fear; the robot can smile when it hears "sushi," for example, and appear disgusted when it hears "president." Kansei uses a program that makes word associations from a self-updating online database of 500,000 words. A silicon face mask covers the robot's 19 movable facial parts; on hearing a word such as "president," Kansei queries the online database for associated words such as "Bush," "war," and "Iraq," and then shows the appropriate facial expression. "What we are trying to do here is create a flow of consciousness in robots so that they can make the relevant facial expressions," says Meiji University professor and project leader Junichi Takeno. "I believe that's going to be a key to improving communication between humans and robots." Takeno says there are plans to give Kansei speech abilities and enable the robot to convey feelings.


Sounding Out the Future of SMC
IST Results (05/30/07)

The European Commission's Information Society Technologies (IST) research initiative has drafted a roadmap to establish Europe as a leader in the field of sound and music computing (SMC). Part of the Sound to Sense, Sense to Sound project (S2S2), the roadmap identifies, characterizes, and proposes strategies to approach the key research challenges in SMC over the next 10 to 15 years, including uniting currently fragmented efforts and creating a common research agenda for European output. Composer Nicola Bernadini, coordinator of the two-year IST-funded S2S2 project, says SMC research is trailing behind the music industry, and that the roadmap is intended to show what research into SMC could provide. The roadmap highlights five key challenges--designing better sound objects and environments; understanding, modeling, and improving human interaction with sound and music; training multidisciplinary researchers in a multicultural society; improving knowledge transfer; and addressing social concerns. Through three scenarios, the roadmap explains how SMC research will affect European society and economies in the future. Bernadini says the scenarios show how advances in SMC technology, such as sonic environments, interactive music devices, and expert music companions, will change our surroundings.


NCSA, Library Science Receive $1.2 Million Mellon Foundation Grant
NCSA (National Center for Supercomputing Applications) (05/31/07)

The National Center for Supercomputing Applications (NCSA) and the Graduate School of Library and Information Science (GSLIS) at the University of Illinois at Urbana-Champaign will use a $1.2 million grant from the Mellon Foundation to make it easier for humanities researchers to turn unstructured data into structured data. Michael Welge, the project's chief investigator and head of the NCSA's Data Intensive Technologies and Applications Division, says an automated tool is necessary because so much time is spent searching for information that is not proprietary and not in databases that can be easily searched. "Someone who wants to research 19th century novels or the works of Cervantes has a wealth of information available to them, but without tools to help them they'll spend a long time searching that haystack for their particular needle," says Welge. The NCSA/GSLIS team plans to build a Software Environment for the Advancement of Scholarly Research (SEASR), which will allow researchers to find, extract, analyze, and manage data. Pronounced "Caesar," the software will be something of an extension of NCSA's D2K software and IBM's Unstructured Information Management Architecture. The NCSA/GSLIS team will make SEASR modular and easy to use for humanities and social science researchers. The SEASR team believes other researchers in the sciences, engineering, and national defense could benefit from such software, and suggests it could be adapted to fit their needs down the road.


We Can See Clearly Now
Government Computer News (06/04/07) Vol. 26, No. 13, Marshall, Patrick

Though face recognition technology experienced a series of failures in the early 2000s, those setbacks prompted a focused research effort, the results of which are now evident. Scores from the 2006 Face Recognition Vendor Test (FRVT) were 10 times better than the scores from the previous FRVT test in 2002. In addition, the 2006 FRVT found the face recognition software to be more accurate than humans. Whereas face recognition algorithms were originally based on single, still images of faces, researchers today use 3D images, allowing algorithms to gather data on how features look under various lighting conditions and viewing angles, thereby generating more precise measurements. Microfeature analysis, the identification of patterns in skin texture, is another valuable development, thanks to new, higher-resolution cameras. Skin texture patterns are so unique that even identical twins differ, making microfeature analysis a "secondary signature" of the face, along with the geometric signature, says Joseph Atick of L-1 Identity Solutions. Patrick Flynn, one of the FRVT 2006 investigators and a professor at the University of Notre Dame, notes that FRVT only assessed the technology's performance in controlled, cooperative identification situations, casting doubt on whether it can function as well in uncontrolled conditions with uncooperative subjects. At a minimum, Flynn anticipates a growing adoption of face recognition technology in controlled surveillance situations, such as verifying employees at the elevator.


Applying Lean to Application Development and Maintenance
McKinsey Quarterly (05/07) Kindler, Noah B.; Krishnakanthan, Vasantha; Tinaikar, Ranjit

Application development and maintenance (ADM) consumes about 50 percent of the average business's IT budget, and a reduction of staff must be complemented by an increase in the productivity of the remaining personnel in order to lower ADM costs while raising efficiency. The authors estimate that applying lean manufacturing principles to ADM can yield a 20 percent to 40 percent upgrade in efficiency and enhance the quality and speed of execution. Such principles include flow processing, which matches input rhythm with production flow to lower overcapacity or excess inventory; standardization, which can be implemented to slash the waste yielded when requirements are classified in a makeshift manner; load balancing, which spurs the organization to deploy development staff across multiple sites, along with outside vendors; complexity-based project segmentation, which helps managers send projects to the appropriate resources and avoids needless overhead for simple jobs; and quality ownership that is broadened to include the business, designers, coders, and testers. "A lean transformation requires simultaneous changes in the technical systems (changes to tools, methodologies, standards, and procedures), the behavioral system (convincing staff of the value of these changes), and the management system (new roles, metrics, and incentives to encourage the shift)," the authors write. In the first transformation phase, the level of waste in ADM processes is evaluated diagnostically, using interviews and questionnaires to visualize the movement of information and materials through the system. Once waste is uncovered, its root causes are traced and opportunities to augment productivity are determined; major factors in ADM waste include the lack of a clear and effective project prioritization methodology, and an anarchic and inefficient process for defining project needs.
Executives identify the biggest challenges of a lean transformation as facilitating a shift in behavior, attaining a general rather than specific focus on principles, and establishing the proper metrics and incentives.


To submit feedback about ACM TechNews, contact: technews@hq.acm.org

To unsubscribe from the ACM TechNews Early Alert Service: Please send a separate email to listserv@listserv.acm.org with the line

signoff technews

in the body of your message.

Please note that replying directly to this message does not automatically unsubscribe you from the TechNews list.

ACM may have a different email address on file for you, so if you're unable to "unsubscribe" yourself, please direct your request to: technews-request@acm.org

We will remove your name from the TechNews list on your behalf.

For help with technical problems, including problems with leaving the list, please write to: technews-request@acm.org


News Abstracts © 2007 Information, Inc.


© 2007 ACM, Inc. All rights reserved. ACM Privacy Policy.
