Read the TechNews Online at: http://technews.acm.org
ACM TechNews
May 16, 2007

Learn about ACM's 2,200 online courses and 1,100 online books

Welcome to the May 16, 2007 edition of ACM TechNews, providing timely information for IT professionals three times a week.


HEADLINES AT A GLANCE:

 

H1-B Visa Reform Gains More Support
InternetNews.com (05/15/07) Mark, Roy

Senators Joe Lieberman (ID-Conn.) and Chuck Hagel (R-Neb.) on Tuesday introduced the Skilled Worker Immigration and Fairness Act of 2007, an H1-B visa reform bill that would retroactively increase the cap to 115,000 in 2007 and add a flexible adjustment mechanism allowing the cap to rise to as many as 180,000 visas, depending on market conditions. In April, the 2008 allotment of 85,000 H1-B visas was consumed in only one day. "To remain competitive, American companies need access to highly educated individuals," Lieberman said in a statement. "But today's system makes it difficult for innovative employers to recruit and retain highly educated talent, which puts the U.S. at a competitive disadvantage globally." The bill also contains measures to improve H1-B visa fraud prevention and enforcement, including prohibiting employers from advertising jobs as being exclusively open to H1-B visa holders, and it raises the H1-B petition fee by $500 to pay for enhanced enforcement and ensure the program is self-sustaining. In addition, the bill would exempt from the cap foreign nationals who hold U.S. graduate degrees, or non-U.S. graduate degrees in science, technology, engineering, or math. Earlier this year, Microsoft Chairman Bill Gates told a Senate panel it is unsound policy to restrict foreign-born, U.S. college graduates from working in the United States. "It makes no sense to tell well-trained, highly skilled individuals, many of whom are educated at our top universities, that they are not welcome here," Gates said. "We have to welcome the great minds of this world, not shut them out of our country."


Report: Tech's Gender Gap Widened by Uninviting Workplace
eWeek (05/14/07) Perelman, Deborah

The lack of women in technology-related fields is typically blamed on a shortage of women interested in pursuing computers and engineering, but the experience of women already working in such fields is rarely discussed. The "Women in Technology 2007" report, published by Women in Technology International, finds that the vast majority of women working in the field enjoy their jobs, and that of the 2,000 female respondents, 75 percent would encourage other women to pursue similar interests. However, the report found that many female tech workers have mixed feelings about their companies' climates: only 52 percent believed their company offers a favorable environment for women. "There is a kind of conventional wisdom that goes around that maybe women don't like technology. So, for us to learn through this research that they do like it and do find it to be a place where they can make a difference and would go as far as to recommend it to others is very telling," said Compel President and report co-author Patricia Schaefer. "What was very intriguing was that such a large percentage of women said that they didn't find their organizational climates to be very inviting to women. They're saying that they don't feel that their voices are heard and it causes them to question whether this is an environment that they wish to stay in." Almost half of respondents, 48 percent, felt that their views were not as acknowledged or welcomed as those of their male counterparts, and 44 percent said that women in their company were given fewer opportunities to participate in and lead large projects. While 73 percent of female tech workers felt they could influence their bosses, only 53 percent described themselves as broadly influential in the organization, and only a little over half said they felt in charge of their careers.


Worm Attacked Voter Database in Notorious Florida District
Computerworld (05/16/07) Friedman, Brad

A variant of the infamous SQL Slammer worm struck the computer database infrastructure of Florida's Sarasota County on Oct. 23, 2006, crippling the network by overwriting the system's administrative password. Oct. 23 was a day of early voting in the U.S. House race in the state's 13th Congressional district between Vern Buchanan (R) and Christine Jennings (D); the Republican candidate was ultimately declared the winner, but the electoral outcome has been contested, and questions remain about the worm's possible impact on the results and about whether the incident was disclosed to the parties challenging the election. An incident report filed by Sarasota County stated that the worm infected a server on the county's database system, which then "sent traffic to other database servers on the Internet, and the traffic generated by the infected server rendered the firewall unavailable." The report indicated that the server had never been patched against Slammer, and its operation during the election was something of an embarrassment because the machine had been slated for decommissioning, according to Suncoast Technology Center information security analyst Hal Logan. The manufacturer of the touchscreen voting systems used in Sarasota issued a bug warning that the county ignored, and submitted a series of stipulations to the county before releasing its source code to a panel of computer scientists organized to probe the unusually high number of undervotes recorded on the touchscreen machines in the District 13 congressional election. Both documents were withheld by the Sarasota Supervisor of Elections' office from the legal counsel representing the election's challengers. Logan acknowledged that a worm could conceivably be used to hack into a voting system, but strongly doubted that the Oct. 23 attack would have been successful had that been its goal. 
"Our network doesn't share copper or wire with the Supervisor of Elections' network," he explained.


Free Tool Offers 'Easy' Coding
BBC News (05/14/07) Fildes, Jonathan

The Massachusetts Institute of Technology's Media Lab has unveiled Scratch, a free programming tool for both Windows and the Macintosh aimed primarily at children that enables users to blend images, sound, and video to create their own animated stories, video games, and interactive artwork. Scratch requires no prior knowledge of computer languages and uses a simple graphical interface that allows programs to be assembled like building blocks. MIT professor Mitchel Resnick, one of the researchers at the Lifelong Kindergarten group and the inventor of Lego Mindstorms, says Scratch is a new, more accessible type of programming language that doesn't require technical expertise or experience. Objects and characters can be selected from a menu, created in a paint editor, or cut and pasted off the Web, and animated by snapping together different "action" blocks into stacks. "They don't have to worry about the obscure punctuation and syntax common in most programming languages," Resnick says. Scratch is inspired by the way DJs use other people's music to create new sounds. "We want people to start from existing materials--grabbing an image, grabbing some sound, maybe even bits of someone else's program and then extending them and mixing them to make them their own," Resnick says. British Computer Society President Nigel Shadbolt, a professor at the University of Southampton, says Scratch provides a good introduction to computational thinking and could inspire the next generation of computing professionals.


Colleges Pushing Computing Profession
The State (SC) (05/15/07) Hammond, James T.

University of South Carolina computing department chairman Duncan Buell said South Carolina students are turning down lucrative salaries by not studying computing, one of the highest-paying and most in-demand degrees for new college graduates. Enrollment in computing studies at USC has dropped 61 percent over six years, from 688 undergraduates in 2000 to 265 in 2006, and Clemson University has seen undergraduate enrollment in computing shrink by 42 percent, from 796 in 2000 to 466 in 2006. Meanwhile, demand for computer science graduates is extremely high: the American Electronics Association says the U.S. technology industry added 150,000 jobs in 2006, and new college graduates with a computing degree can earn about $52,000 a year. Buell said he is fighting the myth that America is outsourcing all of its high-tech jobs, and that he hopes to see enrollment rebound to about 500 students. Buell said there is $50 billion in computing salaries available in the United States, and while many lower-level programming jobs have been outsourced to low-wage countries such as India, demand for people with bachelor's degrees has been growing in recent years. He said most of the new jobs for people with computing degrees are in information management, and that people with knowledge of scientific computing, though such jobs are fewer in number, are in high demand. Buell also said that high school guidance materials and courses do not adequately prepare students entering college-level computing studies. To attract more students, USC is hosting two one-week camps for high school students, one in games programming and one in media computing and animation.


Walk Like an Egyptian - or a Roman
Engineering & Physical Sciences Research Council (05/15/07)

A project funded by the Engineering and Physical Sciences Research Council will bring together computer scientists and cultural heritage researchers to assess whether today's increasingly advanced 3D computer technology can be combined with the most recent historical evidence to create significantly improved visual reconstructions of churches, palaces, and other ancient sites. The collaborative project could help historians, students, and museum visitors gain a better perspective on how such sites were perceived and used by the people who inhabited them. Brighter, more vivid color and better contrast between light and dark areas can be used to create a much more realistic simulation, which could provide better insight into how people lived in such spaces. Another benefit of computer reconstructions is that they are less expensive than physical models, and it is far easier to store and update the data than to transport and change a physical model. These techniques could be used, for example, to conduct a chemical analysis of an ancient lamp to determine the type of fuel used, which could then be used to determine what type and how much smoke would have been produced, the amount of light in the space, and how objects would have looked under that light. This information could then be incorporated into the computer-generated simulation of the site, perhaps creating a different perception of how the space appeared. Within a few years, the techniques being assessed could provide the basis for 3D computer displays in museums that show how artifacts would have appeared in their original settings. The project is being conducted by researchers from the Warwick Manufacturing Group and the new Warwick Digital Laboratory at the University of Warwick.


Sizing Up the Coming Robotics Revolution
CNet (05/15/07) Lombardi, Candace

In an interview, director of MIT's Computer Science and Artificial Intelligence Lab (CSAIL) and iRobot co-founder Rodney Brooks discusses how robotics technology will advance in the coming years, and what breakthroughs are needed for real-world science to realize technologies that currently exist only in the realm of science fiction. He says the gap between reality and fiction in robotics stems from a misappraisal of how certain things function, which gives rise to unrealistic ambitions for AI. Brooks lists four objectives that scientists and engineers must achieve before an artificially intelligent and affordable major-domo can be created: Robots must be imbued with a two-year-old child's object recognition skills, a four-year-old's comprehension of language, a six-year-old's manual dexterity, and an eight-year-old's social understanding. The CSAIL director foresees a gradual ceding of control of automobiles to automated systems, noting that automakers are laying the groundwork for this trend by adding assistive technologies to their products. He says the most interesting work being undertaken in robotics, as far as the military is concerned, is making the machines rugged enough to withstand abuse, while the concentration in household robotics is on delivering satisfactory performance at an affordable price, which relates directly to the four challenges Brooks outlines. He thinks robots will become very popular for dangerous, drudgery-heavy jobs such as mining, as well as complex tasks that require exacting precision and control, such as brain surgery. Brooks posits that people are already surrounded by such cool AI applications as search algorithms and statistical machine learning, and he believes robotics will change people's expectations of their world.


Computer Guru Prepares to Grasp the Future with Both Hands
Vancouver Province (05/15/07) Jamieson, Jim

Toronto-based computer design-interface guru and Microsoft researcher Bill Buxton, speaking at the International World Wide Web Conference, predicted that a more versatile user interface will arrive sooner rather than later to replace the mouse and the keyboard. Buxton said the mouse has lasted as long as it has because it was such an excellent idea when it was first invented in the 1960s. "It's a good example of how the better the idea, the more difficult it is to displace," Buxton said. Creating a new interface is not about replacing the mouse, according to Buxton, but about creating a new tool that complements it. Buxton said Microsoft is developing the idea of interactive surfaces, where a computer can be a desk or wall or window. "As soon as you start getting larger surfaces, you start noticing you have four fingers and a thumb," Buxton said. "I can put 10 points on the screen and grasp and move them with gestures." Buxton said the current graphical user interface is built around a single point of interaction, adding, "You have the gesture vocabulary of a fruit fly."


Using 'Offensive Technologies' to Secure Networks
Network World (05/14/07) Brown, Bob

Stanford University computer science Ph.D. student Tal Garfinkel is a program chair for the First Usenix Workshop on Offensive Technologies (WOOT), which takes place in Boston on Aug. 6. Garfinkel says the term "offensive technologies" covers research into techniques for exploiting software weaknesses, reverse engineering, information gathering, evading detection, and similar activities. By understanding both offensive technologies and traditional defensive strategies such as intrusion detection, access control, and bug detection and prevention, one gains a better understanding of computer security. Garfinkel says many computer experts read "black hat" magazines and the code used in attack tools, but the editorial quality of offensive-technology journals is often low, there is little peer review, and the veracity of their claims is often questionable. This lack of quality writing on offensive technologies is a major reason for the WOOT conference, where people with different backgrounds and experiences with attack technologies can share their knowledge and expertise. Garfinkel says future malware will go after high-value targets such as business intelligence and intellectual property that can be sold offshore, where litigation and enforcement could be a challenge. He also says the "Wild West atmosphere" of massive numbers of botnets everywhere is bound to give out at some point, and that the most important thing people can do is influence CIOs and others who purchase software to put pressure on vendors to build more secure products.


Inventing the Future of Business Technology
InformationWeek (05/15/07) Claburn, Thomas

Accenture Technology Labs spends an average of $250 million a year at its four research labs trying to figure out how businesses can work better. Luke Hughes, director of research at Accenture's Palo Alto, Calif., lab, said Accenture will focus on eight significant business technology trends over the next five years: virtualized infrastructure; seamless IT interoperability; process-centric (modular); closed-loop analytics; fluid collaboration platforms; Web 2.0 as a mass-participation platform; mobility; and industrialized software development. Hughes said a more immediate project his researchers are working on is the Business Event Advisor, a program that attempts to automatically answer business intelligence questions by translating news events into actionable information using limited computer intelligence. For example, the resignation of a supplier's CFO might indicate looming financial trouble for that supplier, suggesting a change in supply arrangements may be needed, while a competitor's recall may provide an opportunity to capitalize on its predicted lost revenue. Hughes expects the system to reach the market in three years: while the underlying technology is available today, the data modeling must be built for each industry segment, and text analytics is still maturing. Among Accenture's other projects is the Virtual Corridor, a twist on traditional videoconferencing in which, instead of scheduled videoconferences in a meeting room, an always-on camera system links two locations, encouraging impromptu collaboration.


Robo-Quandary
Milwaukee Journal Sentinel (05/08/07) Johnson, Mark

Marquette University assistant philosophy professor Keith A. Bauer questions where humans will draw the line on how far we will allow technology to change our lives and our bodies. In the paper "Wired Patients," due to be published this year in the Cambridge Quarterly of Healthcare Ethics, Bauer describes a possible future where children are genetically engineered to be smarter and better behaved, adults live 20 years longer than today, wireless links connect our brains to email transmitters, and humans are given biological upgrades such as night vision. While these predictions lie in the future, current technology is not that far behind. Already, about 200 Americans have received an implant called the VeriChip that stores medical information that can be retrieved by a doctor with a scanner. Other technologies described in the paper include electrodes that can be implanted into patients' brains to help them regain functions lost to strokes and spinal cord injuries, implantable heart monitors that can collect information at home and send it to a doctor in the hospital, artificial hearts that prolong life, and so-called bionic limbs that replace those lost through war or accident. Meanwhile, many Americans are unaware of the debate over transhumanism, a movement that supports using new technology to expand the capabilities of the human mind and body. Supporters believe that humans have always desired to improve the species and that declining to use technology to do so is admitting defeat to the slow pace of evolution. Opponents believe that redesigning ourselves and our children will widen the gap between the privileged and the underprivileged, changing the lives of future generations and assuming God-like powers. Bauer expressed concern that these modifications may not only change individuals but alter the species, and that government efforts to control such technology lag behind the advance of science.


Quick on the Draw
Globe and Mail (CAN) (05/15/07) P. B6; Johne, Marjo

University of Toronto computer science professor Karan Singh has developed graphics software that could radically transform the architecture, engineering, and construction industry by allowing drafters to create three-dimensional, computer-aided renderings in a matter of minutes, instead of the days required with traditional computer-aided drafting. The software could not only accelerate the design and drafting process, but also reduce building errors and allow for more accurate budget planning. Paul Teicholz, a consultant and founder of the Center for Integrated Facility Engineering at Stanford University, says the industry is ready for a major change, as productivity in the construction industry has dropped at an average compound rate of 0.6 percent over the past 40 years, while other industries saw their productivity increase by almost 2 percent over the same period. The new software could help shorten the architectural phase of a project, says Sketch2 Corp. CEO Colin Graham, by eliminating the need for drafters to redo work the architect did in another medium and allowing architects to create renderings themselves. The software lets users draw with a stylus on a computer screen, much like a pen on paper. It starts with a blank floor plan; the user tells the program what type of room to make and can then sketch components of the space. The program recognizes patterns, such as a line for a wall, and suggests materials such as brick or drywall; the designer can then add furniture from a database of office products. Graham predicts the program could cut construction costs by 25 to 30 percent, largely through a more efficient system in which architects and designers focus on creating better and more cost-effective designs.


Microsoft Research Aims to Make Computing Ubiquitous
eWeek (05/15/07) Galli, Peter

Microsoft's chief research and strategy officer Craig Mundie said Microsoft is researching multiple opportunities to bring computing into every aspect of people's lives and improve the way people communicate. For example, to improve health care, Microsoft is experimenting with optical recognition technology that can help people ensure they are taking the correct medication at the correct time. Optical recognition software could also be used to enlarge the text of printed mail and documents to make them easier to read. Microsoft is also developing technology to help illiterate people perform computing tasks using video and icons, which could be particularly useful in regions where most people can only gain access to computing technology through cell phones. Microsoft is working to make such scenarios possible through its research and its Unlimited Potential initiative, which recently expanded to include the Student Innovation Suite, a software package available to governments and students in emerging countries around the world for just $3. Mundie said Microsoft is also working on heterogeneous multicore processors, but faces challenges in managing concurrency and complexity as systems become far more distributed, parallel, and asynchronous. "Many new technologies are going to have to be brought forward that are loosely coupled, asynchronous, concurrent, composable, decentralized and resilient. A great deal more constructive thinking is going to have to be done around how we build these systems in the future," Mundie said.


How to Avoid Software Black Holes
SD Times (05/01/07) No. 173, P. 20; Worthington, David

Salon.com founder and "Dreaming in Code" author Scott Rosenberg discusses successful software development in an interview, using insights derived from the problems encountered by Mitch Kapor's Chandler project to create an open source alternative to Microsoft Outlook. In Rosenberg's opinion, Chandler's problems did not necessarily stem from too many engineers struggling to contribute to its development, but rather from the fact that the project "was being led by someone who is not primarily a programmer, whereas open source projects are much more typically driven by programmers." Rosenberg contends that engineers' constant desire for specificity is a need that no project can fulfill, which is why miscommunication between engineers and business people is frequent; the Salon.com founder says regular communication between these two camps and close attention to what people are saying can help avoid this problem. Rosenberg recommends caution when it comes to vocabulary and terminology, in view of the possibility that programmers and nonprogrammers might not share the same meanings. He says the Chandler team devoted an excessive amount of time to the abstract, arguing that "Working from prototypes and dealing with things that are partially functioning, so that business people can have something to go on, provides more clarity [than an abstract product]." Rosenberg says the incremental principle that underlies any creative act applies to software development, and recommends that projects should start small with an achievable objective. By studying the Chandler project's progress, Rosenberg concludes that "the lesson [for open source developers] is to put out a small piece of a product that is useful enough to inspire people. Do it earlier, rather than later."


Compilers and More: Precision and Accuracy
HPC Wire (05/11/07) Vol. 16, No. 19; Wolfe, Michael

It is the job of numerical analysts to ascertain the accuracy and precision of a computed answer and to determine how best to formulate the computation to improve both qualities; as The Portland Group computer engineer Michael Wolfe writes, "The rest of us address the problem by going to double precision and hoping for the best." He notes that a compiler can alter the order of computations and thereby change the answers delivered, and observes that the code-generation scheme compilers use to vectorize summations has remained unchanged for three decades. Wolfe points out that floating-point operations are relatively slow, and says high-performance computing would benefit from lowering the multiply count, substituting addition for multiplication, or replacing division with multiplication. He recalls that IEEE led an initiative to standardize floating-point arithmetic in the 1980s, a scheme all current mainstream processors now follow. Wolfe observes that precision control can also affect performance, noting that numerical library efforts at Oak Ridge National Labs and the University of Tennessee exploit the fact that single-precision floating-point operations run considerably faster than double precision on many contemporary processors.
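A small Python sketch (not from the article) illustrates the reordering issue Wolfe describes: floating-point addition is not associative, so a compiler that reorders a summation can change the delivered answer, and compensated (Kahan) summation is one way to reformulate a computation for better accuracy without resorting to higher precision.

```python
import math

def kahan_sum(values):
    """Compensated (Kahan) summation: carry each step's rounding error forward."""
    total, c = 0.0, 0.0
    for v in values:
        y = v - c            # re-inject previously lost low-order bits
        t = total + y
        c = (t - total) - y  # recover what was rounded away in this addition
        total = t
    return total

# Floating-point addition is not associative, so evaluation order matters:
left = (0.1 + 0.2) + 0.3   # 0.6000000000000001 in IEEE double precision
right = 0.1 + (0.2 + 0.3)  # 0.6

naive = sum([0.1] * 10)        # accumulates rounding error: not exactly 1.0
exact = math.fsum([0.1] * 10)  # correctly rounded sum: exactly 1.0
```

The same effect is why a vectorized summation, which adds partial sums in a different order than a sequential loop, can produce a slightly different result from the same source code.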


Sensor Sensibility
Science News (05/05/07) Vol. 171, No. 18, P. 282; Klarreich, Erica

The ubiquitous proliferation of "smart dust"--tiny sensors that can self-organize into wireless networks--is considered an inevitability by engineers, but its practical application requires the construction of a global perspective from the data supplied by all the individual sensors. Researchers are using topological, or shape-based, methods to address this challenge and others. "Figuring out the structure of wireless sensor networks is the kind of problem topology was meant to solve," explains University of Illinois at Urbana-Champaign mathematician Robert Ghrist. Topology differs from geometry in that it considers features and properties that do not change when a shape is stretched and its geometry warped; examples include the Euler characteristic and homology, which can solve many problems associated with wireless networks. With a given sensor network, mathematicians can examine a theoretical shape called the Rips complex that diagrams how sensors communicate with each other, and from this form a picture of which sensors must remain operational in order to maintain full area coverage, and which sensors can be deactivated. The homology of a 10,000-sensor network can be computed in less than a second by a standard laptop, thanks to advances in computer speed and homology algorithms. Challenges associated with the management of data generated by sensor networks are also being tackled via topology, an example being the problem of obtaining an accurate count of objects in a covered area. In the real world, some objects are counted by more sensors than others, so topological techniques, particularly those involving the Euler characteristic, are being tapped to make each object contribute equally to the total.
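The Euler-characteristic counting idea can be sketched in a hypothetical one-dimensional setting (an illustration, not from the article): if h(x) is the number of sensors covering point x, then summing the Euler characteristic of each upper level set {h >= s} recovers the number of targets even when coverage regions overlap, because in 1-D the Euler characteristic of a union of intervals is simply its number of connected runs.

```python
def coverage_counts(targets, xs):
    """h(x): how many target intervals [a, b] cover each sample point x."""
    return [sum(1 for a, b in targets if a <= x <= b) for x in xs]

def euler_count(h):
    """Integrate h with respect to Euler characteristic on a 1-D grid:
    sum over levels s of chi({h >= s}), where chi = number of runs."""
    total = 0
    for s in range(1, max(h) + 1):
        inside = False
        for v in h:
            if v >= s:
                if not inside:
                    total += 1  # entered a new connected component of {h >= s}
                inside = True
            else:
                inside = False
    return total

# Three overlapping targets; no point is covered by all three at once,
# so simply reading off the maximum sensor count would undercount them.
targets = [(1, 3), (2, 5), (4, 6)]
xs = [i / 2 for i in range(13)]  # grid on [0, 6], fine enough to see the gaps
h = coverage_counts(targets, xs)
```

Here `max(h)` is only 2, yet `euler_count(h)` returns the true count of 3, which is the kind of overlap-insensitive aggregation the Euler-characteristic techniques provide.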


Software's Rules Have Changed
InformationWeek (05/07/07) No. 1137, P. 43; Weier, Mary Hayes

A McKinsey/Sand Hill Group poll of nearly 500 senior IT and business executives on current and future software trends, expectations, and spending habits indicates that business owners are demanding the more rapid rollout of innovative software, with an emphasis on usability and integration as defining characteristics of innovation. For the first time in a long while, software costs are not a major issue for companies, while IT organizations are being more permissive in terms of allowing business units to invest in smaller, more original software projects. The results of the survey also show that business users will circumvent the IT department to fulfill their software needs, if necessary. Respondents listed the software trends that carry the biggest implications for their businesses, and the most highly desired products from vendors, in descending order, as innovation, software as a service, service-oriented architecture, and open source software. However, 22 percent of respondents claimed industry innovation is already peaking, while 8 percent argued that software's innovation high point has come and gone. Sixty-nine percent of those surveyed reported that their software budgets are centrally controlled, and 40 percent said business units will have more control over the budgets in two years' time. A survey author says this shift will be a boon to small software vendors, and larger software companies must plan accordingly; 29 percent of polled IT executives thought large vendors can meet demands for innovation, while only 15 percent of non-IT executives felt the same way. A shift in the way companies purchase software is highly probable, with business technology executives expecting around 40 percent of their software business to be paid for via subscription pricing and other alternatives to traditional licenses over the next few years.


Mobiscopes for Human Spaces
IEEE Pervasive Computing (06/07) Vol. 6, No. 2, P. 20; Abdelzaher, Tarek; Anokwa, Yaw; Boda, Peter

Mobiscopes are networked sensing applications that rely on multiple mobile sensors to perform global tasks, and the authors contend that their emergence is being facilitated by the proliferation of affordable mobile devices capable of processing and sensing, along with the rapid expansion of ubiquitous network connectivity. By extending the traditional sensor network model, mobiscopes present challenges in data management, data integrity, privacy, and network system design. A number of existing applications fall within this category, but they offer customized one-time solutions to what are basically the same series of challenges. The authors argue that the time is ripe to consider a general mobiscope architecture that recognizes common challenges and offers a systematic process for future mobiscope design. Common requirements for mobiscope applications include ensuring data persistence even when sensing nodes exit the data collection or mobile nodes are absent; real-time information delivery; consideration of social constraints on system behavior, because the system exploits sensors and mobility sources already present in the environment; and the consistent applicability of metadata across networks. Heterogeneity is both an advantage and a disadvantage of mobiscopes: heterogeneous sensing systems are less vulnerable to the failure modes of individual sensing modalities and can be even more resilient against flawed, absent, or malevolent data sources than carefully designed homogeneous systems. "In general, because future mobiscopes' main goal will be information distillation from raw data, system designers will need theoretical foundations for obfuscating the raw data in a way that reconciles privacy requirements on individual measurements with the ability to compute certain aggregate properties of the collective," the authors write. 
A mobiscope architecture must introduce communication protocol and data management interfaces, as well as programming interfaces for in-network computing.


To submit feedback about ACM TechNews, contact: [email protected]

To unsubscribe from the ACM TechNews Early Alert Service: Please send a separate email to [email protected] with the line

signoff technews

in the body of your message.

Please note that replying directly to this message does not automatically unsubscribe you from the TechNews list.

ACM may have a different email address on file for you, so if you're unable to "unsubscribe" yourself, please direct your request to: [email protected]

We will remove your name from the TechNews list on your behalf.

For help with technical problems, including problems with leaving the list, please write to: [email protected]


News Abstracts © 2007 Information, Inc.


© 2007 ACM, Inc. All rights reserved. ACM Privacy Policy.