Read the TechNews Online at: http://technews.acm.org
ACM TechNews
June 7, 2006


Welcome to the June 7, 2006 edition of ACM TechNews, providing timely information for IT professionals three times a week.

Sponsored by Information, Inc.

http://www.infoinc.com/sponsorthenews/contactus.html



New Project Lets Real Users Gang Up on Software Bugs
UW-Madison (06/05/06) Mattmiller, Brian

The Cooperative Bug Isolation Project, led by University of Wisconsin computer science professor Ben Liblit, attempts to bring statistical precision to the murky process of post-deployment software debugging. Because software is growing more complex and interacts in countless ways with hardware, networks, other software, and especially humans, it is impossible to detect or anticipate every bug in a program, Liblit says. "That behavior is so dynamic that it becomes useful to look at [software programs] almost like they were some sort of organic system, whose complete behavior is unknowable to you," Liblit said. "But there are behavior trends you can observe in a statistically significant way." Liblit instruments binary code so that it collects a sparse random sample of behavior data from the thousands of copies of a program in use by real people, while ensuring that users' privacy is safeguarded. The feedback reports are funneled into a database, where Liblit uses statistical modeling techniques to identify the bugs that occur frequently enough to disrupt many users. He then prepares a bug report for the software engineers who can actually fix the problem. ACM last year recognized Liblit's doctoral dissertation on cooperative bug isolation as the best among the nominated computer science and engineering dissertations worldwide. IBM and Microsoft have expressed interest in Liblit's research, which has already been adopted by parts of the open-source community. Liblit believes the real value of his system is its potential to dramatically improve the currently ad hoc process of post-deployment debugging: because companies race to bring their products to market, some software applications are never subjected to thorough pre-deployment testing, leaving end users and tech support personnel to deal with the bugs.
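The statistical idea behind the approach can be illustrated with a short sketch: instrumented predicates are reported only at a low sampling rate in each run, and predicates that are disproportionately observed true in failing runs rise to the top of the suspect list. The Python sketch below is a simplified illustration, assuming made-up predicates, a 1 percent sampling rate, and a naive failure-correlation score; it is not Liblit's actual instrumentation or scoring model.

    import random
    from collections import defaultdict

    SAMPLE_RATE = 0.01  # report roughly 1 in 100 predicate observations

    def observe(report, predicate_id, value):
        """Sampled instrumentation: record a predicate outcome only occasionally,
        keeping the overhead (and privacy exposure) of each user's run low."""
        if random.random() < SAMPLE_RATE:
            report[predicate_id][bool(value)] += 1

    def rank_suspects(runs):
        """Rank predicates by the fraction of 'seen true' runs that failed.
        runs is a list of (report, run_failed) pairs collected from users."""
        seen_true = defaultdict(int)
        seen_true_and_failed = defaultdict(int)
        for report, run_failed in runs:
            for pid, counts in report.items():
                if counts[True]:
                    seen_true[pid] += 1
                    if run_failed:
                        seen_true_and_failed[pid] += 1
        return sorted(((seen_true_and_failed[p] / seen_true[p], p)
                       for p in seen_true), reverse=True)

    # Simulate 10,000 user runs of a program where predicate "p7" being
    # true is what triggers the crash, while "p3" is unrelated noise.
    runs = []
    for _ in range(10_000):
        report = defaultdict(lambda: {True: 0, False: 0})
        crashes = random.random() < 0.3
        for _ in range(200):  # 200 instrumented points executed per run
            observe(report, "p7", crashes)
            observe(report, "p3", random.random() < 0.5)
        runs.append((report, crashes))
    print(rank_suspects(runs)[0])  # "p7" surfaces with a score near 1.0

Even though each individual run reports almost nothing, the aggregate over thousands of runs isolates the failure-correlated predicate, which is the point of sampling across a large user population.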


Fed Plan for Cybersecurity R&D Released
Government Computer News (06/02/06) Wait, Patience

In its first step toward developing a coordinated focus on basic cybersecurity research, the government has released its Federal Plan for Cyber Security and Information Assurance Research and Development. The plan identifies several trends that are likely to exacerbate the need for new cybersecurity tools in the coming years, including the growing complexity of IT networks and systems, the migration of telecommunications infrastructure toward a unified architecture, and the increasing uptake of wireless technology. The plan does not address specific policy or budget issues, instead making 10 general recommendations. It suggests that research funding be allocated to projects that complement the long-term projects being developed by the private sector, and that funding be used to develop innovative tools that defend against the most pressing threats; it also recommends that cybersecurity and information assurance R&D become a higher interagency and budgetary priority. In addition to increased coordination among agencies, the plan calls for funding to ensure that developers make security a consideration from the outset, as well as an examination of the security implications of emerging technologies such as quantum and optical computing. The plan also calls for new analytical tools to measure systems' vulnerabilities. Under the plan, government and industry would partner to develop a cybersecurity roadmap and keep each other informed about their respective projects. A broad collaboration among government, the IT industry, the research community, and end users would also work to develop, test, and implement a next-generation Internet with improved security.


Unpacking Pecking Orders to Get the Gist of Web Gab
USC Viterbi School of Engineering (06/05/06) Mankin, Eric

Researchers at the University of Southern California's Information Sciences Institute (ISI) have developed a system that can statistically determine the content of online conversations by identifying the dominant parties. The system is among the first in natural language processing to account for the fact that online discussions are structured interactions among many users. The researchers hope the technology could be used to automatically compile reports and meeting summaries, as well as to give chat-room participants a statistical measure of their influence in a discussion. Culling statements from text is simple, says ISI's Eduard Hovy; it is much more difficult for a machine to understand the nuances of human interaction, such as temporal sequencing, references to previous comments, and contextual clues. Hovy's team used the Hyperlink-Induced Topic Search (HITS) algorithm, which ranks and classifies Web pages based on their relationships with one another. Substituting connections between conversation participants for Web links, the researchers used the algorithm to determine the best answer to questions discussed in a USC undergraduate computer science course. The HITS application, which combines speech act analysis, lexical similarity, and the trustworthiness of the poster, identified the best answer for 221 of 314 questions. The researchers first coded part of the data themselves, then trained the machine and gave it the same data, which it coded with 65 percent to 70 percent accuracy, a figure that is likely to improve as the technology matures.
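HITS itself is a short iterative computation: every node in a graph receives a hub score and an authority score, and the two are repeatedly recomputed from each other until they stabilize. Below is a minimal Python sketch applied to a made-up reply graph, where an edge from A to B means participant A responded to participant B; the graph, the interpretation of edges, and the uniform starting scores are illustrative assumptions, not details of the ISI system.

    from math import sqrt

    def hits(edges, iterations=50):
        """Compute hub and authority scores for a directed graph.
        edges: list of (source, target) pairs."""
        nodes = {n for e in edges for n in e}
        hub = {n: 1.0 for n in nodes}
        auth = {n: 1.0 for n in nodes}
        for _ in range(iterations):
            # Authority: sum of hub scores of nodes pointing at you.
            auth = {n: sum(hub[s] for s, t in edges if t == n) for n in nodes}
            norm = sqrt(sum(v * v for v in auth.values()))
            auth = {n: v / norm for n, v in auth.items()}
            # Hub: sum of authority scores of nodes you point at.
            hub = {n: sum(auth[t] for s, t in edges if s == n) for n in nodes}
            norm = sqrt(sum(v * v for v in hub.values()))
            hub = {n: v / norm for n, v in hub.items()}
        return hub, auth

    # Hypothetical reply graph: students reply to answers they found helpful.
    replies = [("alice", "bob"), ("carol", "bob"), ("dave", "bob"),
               ("bob", "carol"), ("alice", "carol")]
    hub, auth = hits(replies)
    print(max(auth, key=auth.get))  # "bob" emerges as the top authority

Substituting reply links for hyperlinks, a high authority score marks a participant whose contributions many others respond to, which is the intuition the researchers exploited for picking the best answer.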


Study: Don't Legislate DRM
IDG News Service (06/05/06) Kirk, Jeremy

A group of British lawmakers has recommended against enacting legislation that would mandate the use of digital rights management (DRM) technology. While the All Party Parliamentary Internet Group (APIG) is unaware of any European initiative to mandate DRM, some publishing groups favor such a law, which the lawmakers warn would restrict access to digital resources. Book and music publishers vigorously support the use of DRM to protect against piracy, while opponents say the technology is overly restrictive for legitimate users and impedes libraries' archiving efforts. APIG recommends that Ofcom, the U.K. communications regulator, publish guidelines warning companies about the legal ramifications of installing intrusive DRM technologies in their products; group leader Derek Wyatt cited the recent incident in which security researchers discovered that Sony had installed copy-protection software containing spyware on some 15 million CDs. The APIG report also calls for a U.K. Department of Trade and Industry investigation into the effect of DRM on market activity. Apple's iTunes, for instance, has different pricing structures in the U.K., Europe, and the United States, and Apple uses DRM to ensure that songs purchased from the iTunes store can only be played on its iPod. The heated debate over DRM boils down to the point at which content owners are willing to give up some of their rights for the sake of opening resources to the public, according to Lynne Brindley, chief executive of the British Library. The debate over DRM and copyright has also drawn in Google, which is being sued in the United States over its controversial Book Search project to scan the full text of books at five U.S. libraries and one in England. Google claims that by offering information on where to buy the books, it is actually generating book sales.


New Wireless Networking System Brings Eye Care to Thousands
UC Berkeley News (06/06/06) Greensfelder, Liese

Researchers from the University of California, Berkeley, and Intel have partnered with an Indian hospital to develop a low-cost wireless network that enables eye doctors to examine patients in remote clinics through high-quality video conferences. The project has been so successful that it will be expanded from five remote clinics to 50, which are likely to see half a million patients a year. Technology holds vast promise for developing nations, but most efforts have been too costly or complicated to have a practical impact in poor, rural areas, said Eric Brewer, professor of computer science at Berkeley. "What we've done here is develop a simple, inexpensive software and hardware system that can provide villages with a high-bandwidth connection to computer networks in cities as far as 50 miles away." Patients visiting one of the clinics meet first with a nurse trained in eye care and then consult with a doctor via a Web camera for about five minutes; if the doctor determines that further examination or an operation is necessary, the patient receives a hospital appointment. The technology addresses the shortage of doctors in remote areas and ensures that patients who visit one of the clinics will actually receive treatment if they have to go to a hospital. The effort is one of many under TIER, the Intel-Berkeley project on Technology and Infrastructure for Emerging Regions. Brewer and his team set out to create software that would overcome the distance limitations of today's Wi-Fi connections; by pairing their software with directional antennas and routers, they have obtained speeds of up to 6 Mbps at distances of up to 40 miles, roughly 100 times the reach of ordinary Wi-Fi. "If they can find a partner with high-speed networking within 50 miles, this is a great solution for communities around the world to get connected," Brewer said.
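The distance gain comes from replacing Wi-Fi's omnidirectional, short-range assumptions with point-to-point links and high-gain directional antennas. A back-of-the-envelope link budget suggests why such a link is plausible; all of the numbers below (transmit power, antenna gains, receiver sensitivity) are illustrative assumptions, not the project's actual hardware specifications.

    from math import log10

    def free_space_path_loss_db(distance_km, freq_mhz):
        """Friis free-space path loss in dB."""
        return 32.44 + 20 * log10(distance_km) + 20 * log10(freq_mhz)

    # Hypothetical 2.4 GHz point-to-point link, ~40 miles (64 km).
    tx_power_dbm = 20.0        # transmitter output
    antenna_gain_dbi = 24.0    # parabolic dish at each end
    sensitivity_dbm = -88.0    # typical receiver sensitivity near 6 Mbps

    loss = free_space_path_loss_db(64, 2400)
    rx_power = tx_power_dbm + 2 * antenna_gain_dbi - loss
    print(f"Path loss: {loss:.1f} dB, received: {rx_power:.1f} dBm, "
          f"margin: {rx_power - sensitivity_dbm:.1f} dB")
    # ~136 dB of loss leaves roughly -68 dBm at the receiver, about 20 dB
    # of margin -- radio-feasible, provided the software also relaxes the
    # short ACK timeouts that standard Wi-Fi assumes for nearby stations.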


Study Finds Offshoring's Impact Overstated
eWeek (06/05/06) Rothberg, Deborah

Employment opportunities in the United States for high-end IT positions continue to look good, according to American Sentinel University chief economist Jeremy Leonard. Since 2002, employers have had the greatest demand for network systems and data communication analysts, and jobs for computer software engineers, computer and information scientists, and computer systems analysts have grown considerably. Jobs for software engineers, for example, fell 4 percent as the overall job market began to slow, but openings have grown 25 percent in recent years. Leonard takes an in-depth look at the IT employment picture since the last recession in his study, "Offshoring of Information-Technology Jobs: Myths and Realities," and finds that most of the jobs that have been moved overseas were low-end positions that were labor-intensive, easy to duplicate, or had little need for face-to-face contact. Computer programming positions have taken the biggest hit, followed by losses in database administration, coding, and support. "Studies from consulting groups during the stagnant job growth years of 2002 to 2004 stoked very pessimistic views of the future for IT professionals," says Leonard in a statement. "Once the current economic expansion took hold, however, we found that the 2000-02 job losses had little, if anything, to do with jobs moving overseas."


Almaden Research Center Celebrates Two Decades of Whim, Wonder
San Jose Mercury News (CA) (06/04/06) Quinn, Michelle

Long a haven for quirky, cutting-edge innovations in storage technology, nanotechnology, and data management, IBM's Almaden Research Center in South San Jose recently celebrated its 20th anniversary. "We create more technology than anyone would know what to do with," said Mark Dean, the center's director. Almaden, one of eight IBM research centers around the world, has begun to explore services science, an emerging field devoted to helping organizations deal with the proliferation of digital content and improve workflow efficiency. The center has also created the Healthcare Information Infrastructure to examine how technology could help with the management and sharing of medical records. Almaden is looking to hire cognitive scientists, anthropologists, and business and computer experts to supplement its research staff of roughly 400; with a total staff of 600, it is IBM's second-largest research center in the United States. Dean says the center competes with the likes of Google, Yahoo!, and Microsoft for top talent in Silicon Valley. Inventors at Almaden have received more than 500 patents in the past five years, during which the center has partnered with Stanford University to create a spintronics facility, built a prototype machine that condenses data for applications in environments where space is short, and fashioned the world's smallest functioning computer circuit by triggering a cascade of individual molecules.


Robotics Seen as Growth Area for Defense Department
National Journal's Technology Daily (06/06/06) Davis, Michael

Panelists at a Heritage Foundation event warned that policy is falling behind technology as the Defense Department continues to explore the field of robotics. The department has been actively considering military applications for robotics technology, but fundamental obstacles remain, such as securing the necessary funding. "Mechanical engineering [for robots] is much more difficult than we thought," said NASA's Vladimir Lumelsky, a difficulty he says has contributed to a worldwide shortfall in funding for robotics projects. The panelists called for increased investment in basic university research, noting that robots could significantly improve U.S. troops' ability to maneuver in the Middle East. Unmanned vehicles, such as the RQ-4A Global Hawk, are among DARPA's central initiatives. The Urban Challenge, scheduled for Nov. 7, 2007, will draw teams from across the country whose robotic vehicles will compete in a race through traffic with only two commands: "start" and "stop." With an eye toward the Middle East, the panelists hope the race will prove the viability of unmanned vehicles in real-life settings. "The robotics technology that we have available today can save lives," said John Leonard, professor of mechanical and ocean engineering at MIT. Most research and funding in the field remains focused on the commercial market, though that will change if the Defense Department projects materialize.


Made in the USA: The World Wide Web
ABC News (06/05/06) Yeransian, Leslie

ICANN has come under fire for its rejection of the .xxx domain and for delays in integrating internationalized domain names (IDNs) into the domain name system, with several critics pointing a finger at the federal government and the sway it is perceived to have over the organization. In the most recent round of domain name applications, ICANN approved .jobs, .mobi, .aero, and .travel, while rejecting .xxx, .post, .mail, and about half of the proposed top-level domains (TLDs). Many in the industry believe more domains are needed to simplify navigation of the Web as it grows. There are now 18 ICANN-approved TLDs, one of which is .travel, run by Tralliance Corp. "Google is the first to admit it can only catalog 5 percent of the Internet," says Tralliance's Ron Andruff. "It's the logical expansion of the Internet...Cataloging will take you more efficiently and rapidly to the information you need." Andruff says the U.S. government fears losing control of the Internet, a fear particularly evident in the delays in implementing IDNs. "That's more of a political minefield," says Milton Mueller, partner in the Internet Governance Project, noting that the addition of non-Roman characters would give countries such as China and India a greater say in Internet governance. But that's a good thing, says Mueller: "It adds a little more diversity and a little more competition; it doesn't threaten anyone's control of the Internet." ICANN's Memorandum of Understanding with the federal government is up for renewal in September.


Morton Kondracke: New 'Report Card' Shows Congress Must Act on Science
Examiner.com (06/05/06) Kondracke, Morton

Sen. Lamar Alexander (R-Tenn.) hopes the latest national report card on science will convince his colleagues to quickly move President Bush's competitiveness initiative through Congress. Results of the National Assessment of Educational Progress (NAEP) were released last week, and the performance of U.S. students in science remains troubling, with no improvement in scores among eighth-graders from 1995 to 2005 and a decline for 12th-graders. The NAEP results "illustrate the urgency for Congress to pass comprehensive competitiveness legislation this year," says Alexander. However, the legislation has not been marked up in the Senate or the House, and floor time has yet to be scheduled by the leaders of either chamber. President Bush wants to offer more scholarships as a way of attracting another 10,000 science teachers a year, and to double the budget of the National Science Foundation and other U.S. research programs. U.S. competitiveness has become a concern, and while some interests want to turn to foreign scientists and graduate students, others say more can be done to attract U.S. children to the sciences. National Center for Women and Information Technology CEO Lucy Sanders says women are shunning information technology for other fields of science. "We've got to change its image from 'geeky' to 'challenging,'" says Sanders.


ELearning Is Music to Your Ears
University of Leicester (06/02/06)

Researchers in the United Kingdom continue to study "podcasting" as a potential tool for enhancing students' learning experience. E-learning experts at the University of Leicester say their pilot program for downloading audio onto personal MP3 players shows that students have embraced the idea of using podcasting to improve their educational experience. The researchers developed a podcast model with three parts, each lasting 10 minutes: a current news item relevant to the course that week, give-and-take on learning and collaborative team work during the week, and a light-hearted segment such as a joke or rap. The students listened between lectures, during commutes, and while performing other tasks, says Gilly Salmon, professor of e-learning at Leicester. Students added that they were able to study at their own pace, rewind whenever they wanted, and contact classmates while they studied. They also praised podcasting for making learning informal and for preventing them from missing anything. Salmon now heads a 12-month project called IMPALA (Informal Mobile Podcasting and Learning Adaptation), which includes researchers from the University of Gloucestershire and Kingston University, and will focus on using such technology to bring learning resources to students.


Supercomputers to Transform Science
University of Bristol News (06/06/06)

The University of Bristol has installed three supercomputers that at peak capacity can perform more than 13 trillion calculations a second, promising new advances in climate modeling, drug design, and other cutting-edge research areas. "This initiative puts Bristol at the forefront of high performance computing," said David May, professor of computer science. "The HPC impact will be enormous--right across all disciplines--turning data into knowledge." The largest of the computers is expected to rank among the 100 fastest machines of its kind in the world. Thanks to the HPC cluster, Bristol physicists will be among the first in the world to evaluate data from the Large Hadron Collider once it becomes operational, offering new insights into the origins of the universe and the structure of space and time. Bristol professors believe that high-performance computing is a critical piece of infrastructure that universities must have in place to remain at the forefront of scientific research, and that it can lend greater credibility to predictions in areas, such as climate modeling, where uncertainty is high. Professor Paul Valdes, a Bristol climatologist, says, "These HPCs will allow us to develop a new generation of numerical models that have a much more sophisticated representation of the climate system." Researchers will be able to access the computers throughout Bristol's campus research network.


MSpace Mobile: Exploring Support for Mobile Tasks
University of Southampton (ECS) (06/06/06) Wilson, Max L.; Russell, Alistair; Smith, Daniel A.

The authors compare the ability of the mSpace Mobile and Google Local Web application interfaces to support Web-based location discovery and planning tasks on mobile devices, both while stationary and while traveling, as detailed in the Proceedings of the 20th BCS HCI Group conference in cooperation with ACM. Google Local performed worse than mSpace Mobile under both fixed and mobile conditions, suggesting that mSpace Mobile does better because it transcends the current page model for presenting network-based content, enabling new and more powerful interfaces to support mobility. The findings point to conditions that play important roles in the effective execution of planning activities on mobile devices, especially when the user is moving: persistent displays of information, rapid data transfer, less need for text entry, and less need for activities in which a target must be acquired and then held. The researchers outline a non-page-based paradigm that foregrounds persistent domain overviews from which choices can be made by direct manipulation, and they recommend an exploratory user interface with a multipaned, zoomable focus+context view to optimize screen space. They conclude from their study that the mSpace Mobile interface beats many state-of-the-art Web applications that support similar planning tasks, especially on the move.


Xerox Funds 10 Research Projects at Leading Universities
Business Wire (06/07/06)

Xerox has awarded roughly $200,000 in grants to 10 academic research projects in the United States, Canada, and Europe this spring. The awards are part of the nearly $1 million that Xerox annually pledges to fund research and the $13 million that it contributes each year to educational and nonprofit initiatives. The 10 research projects include a project at the University of California, Berkeley, that will examine the core interactions between ink and printheads. Researchers at the University of California, Santa Barbara, will attempt to develop new algorithms to improve image quality and enable automated image processing. At the University of Maryland, scientists will compare different life cycle models for software development. A basic research endeavor at the University of Massachusetts will explore colloid surface science and engineering. Purdue University researchers will investigate the ways that a computer can classify images to improve its speed and performance. A University of Rochester project will examine the forces that influence consumer purchasing decisions. In Canada, the University of Toronto will continue a project studying how ink interacts with solid surfaces, and researchers at the University of Windsor will attempt to create new electrically conductive molecules for the next generation of electronics. At the University of Bremen in Germany, scientists will explore new tools for developing faster and cleaner software architectures. Finally, a project at the University of Lancaster in the United Kingdom will conduct research on the practical work of leadership with particular emphasis on the use of documents. "The full range of technology found in Xerox products is being studied at universities as well as in our research labs, and this grants program helps strengthen the bridges between our organizations and our people, while advancing science at the same time," said Gregory Zack, who chairs Xerox's University Affairs Committee.


Projects in the Microsoft Research Labs
Computerworld (06/05/06) Anthes, Gary

At Microsoft Research (MSR), hundreds of projects are under way across its six laboratories. One, the Dense Array of Inexpensive Radios (DAIR) project, is meant to aid in the management of corporate Wi-Fi networks. DAIR uses inexpensive USB wireless adapters to turn ordinary desktop PCs, with their abundant spare processor and disk resources, into sensors that monitor the performance of a corporate network and provide troubleshooting support. The sensors relay information about the network's behavior to an inference engine that evaluates it and sends out alerts or other responses. A preliminary version of the system has been deployed at MSR to scan for denial-of-service attacks and rogue connections; in the future, DAIR could be used for site planning, load balancing, and recovering from wireless access point failures. The makeshift sensors are more numerous than wireless access points and much less expensive than dedicated equipment such as spectrum analyzers. Another MSR project explores mobile note-taking, using the microphone, camera, and GPS sensors commonly found in cell phones to capture information on the fly; the Windows Mobile smart phone also responds to voice commands telling it to remind the user to run an errand or make a phone call. Microsoft researchers also ran an email-monitoring study at 42 companies and universities and found that one in 140 of the emails they sent simply disappeared. These "silent losses" can be attributed to disk crashes, botched server upgrades, unusually aggressive spam filters, and simple performance overload, among other factors. The SureMail system posts a tamperproof notification on the Internet each time a message is sent, enabling email recipients to periodically check for notifications of messages they never received.
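The SureMail idea can be sketched in a few lines: alongside each message, the sender posts a small, hard-to-forge notification to a shared store, and the recipient periodically compares the notifications addressed to it against the mail it actually received. The Python sketch below illustrates that mechanism only; the hashing scheme, the in-memory "board," the per-sender sequence numbers, and all names are assumptions, not Microsoft's implementation.

    import hashlib

    # Stand-in for a shared, append-only notification store.
    board = set()

    def token(sender, recipient, seq, secret):
        """Deterministic, opaque notification ID for the seq-th message from
        sender to recipient; onlookers learn nothing about the message."""
        data = f"{sender}|{recipient}|{seq}|{secret}"
        return hashlib.sha256(data.encode()).hexdigest()

    def send_message(sender, recipient, seq, secret):
        # ... hand the message to ordinary SMTP delivery here ...
        board.add(token(sender, recipient, seq, secret))  # post notification

    def check_for_losses(recipient, sender, received_seqs, secret, horizon=100):
        """Recipient-side poll: which sequence numbers were announced on the
        board but never showed up in the inbox?"""
        return [s for s in range(1, horizon + 1)
                if token(sender, recipient, s, secret) in board
                and s not in received_seqs]

    # Usage: two messages sent, the second silently lost in transit.
    send_message("alice", "bob", 1, "ab-secret")
    send_message("alice", "bob", 2, "ab-secret")
    print(check_for_losses("bob", "alice", received_seqs={1},
                           secret="ab-secret"))  # -> [2]

Because the notification locations are predictable to the two correspondents but opaque to everyone else, the recipient can detect a silent loss without the store learning anything about the mail itself.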


Phishers Could Start Using the Personal Touch
New Scientist (05/27/06) Vol. 190, No. 2553, P. 30; Biever, Celeste

Researchers at Indiana University in Bloomington have discovered a new way phishers could target unsuspecting Internet users. Markus Jakobsson and colleagues found that attackers can exploit the browser's record of recently visited Web pages to determine which sites a user has been to; the attacker could then send a phony email message that appears to come from the victim's own bank, asking for personal information. A phisher could spam thousands of users with an enticing link that actually leads to the attacker's server, where each visitor is handed a unique URL. Because the URL is unique, the click-through ties the browser to the email address the message was sent to, and the attacker's page then queries the browser's history; if the history shows a visit to a particular bank's Web site, the phisher knows which bank's identity to impersonate for that address. At the World Wide Web conference in Edinburgh in late May, Jakobsson encouraged banks and other online businesses to give customers secret URLs that phishers would not be able to guess. He also suggested that banks plant the URLs of other banks in the histories of browsers that visit their Web sites, as another way to confuse attackers about customers' actual surfing habits.
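The correlation step the researchers warned about can be sketched from the server side: each spammed address gets its own token, so a single click links a browser to an email address, after which a client-side history probe (in 2006, typically CSS/JavaScript tricks against visited-link styling, deliberately omitted here) reports which bank URLs the browser has seen. Everything below, from the domain name to the bookkeeping, is a hypothetical illustration of the mechanism, not code from the study.

    import secrets

    # Hypothetical attacker bookkeeping: token -> spammed email address.
    tokens = {}

    def make_unique_link(email, base="http://phish.example/t/"):
        """Generate the per-recipient URL embedded in the spam message."""
        t = secrets.token_urlsafe(8)
        tokens[t] = email
        return base + t

    def handle_click(token, probed_history):
        """Server side of a click-through.
        probed_history: bank URLs the (omitted) client-side history probe
        reported as 'visited'."""
        email = tokens.get(token)
        if email and probed_history:
            # The attacker now knows which bank to impersonate for this address.
            return email, probed_history[0]
        return None

    link = make_unique_link("victim@example.org")
    t = link.rsplit("/", 1)[1]
    print(handle_click(t, ["https://bank-a.example/login"]))

The defenses Jakobsson proposed attack both halves of this pipeline: secret per-customer URLs make the bank's pages unguessable to the probe, and seeding visitors' histories with other banks' URLs floods the probe's output with false positives.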


Data Ports in a Storm
Chronicle of Higher Education (06/09/06) Vol. 52, No. 40, P. A14; Glenn, David

Advocates of "network neutrality" contend that the domination of broadband service by a small number of phone and cable operators amounts to near-monopolization, and that the overseers of the Internet's gateways should therefore adhere to a pricing scheme in which all content providers and application developers pay the same rates. The argument goes that preserving neutrality will encourage inventors to devise more breakthrough online applications and Web sites, while such innovation would be choked off if broadband service providers could discriminate by setting arbitrary rates. The debate reached a flashpoint last summer with a Supreme Court decision that defined cable broadband offerings as "data services" rather than "telecommunications services," theoretically allowing providers to block rivals' offerings and enter into special agreements with content providers. The backlash spurred some members of Congress to propose legislation designed to ensure neutrality. Stanford law professor Lawrence Lessig and Columbia University's Timothy Wu share the concern that a lack of net neutrality would let major broadband providers turn the Internet into a medium of homogeneous content that users passively consume rather than actively create, destroying the Net's openness. Opponents counter that new technologies allowing households to access the Internet wirelessly or through their electric lines will multiply the number of broadband providers and make the market more competitive, discouraging price-gouging and anti-competitive behavior. Skeptics also frequently argue that enforcing net neutrality would require an unwieldy regulatory framework.


Cybersecurity: A Job for Uncle Sam
CIO (06/01/06) Vol. 19, No. 16, P. 46; Scalet, Sarah D.

Orson Swindle, a Republican appointed to the FTC by President Bill Clinton in 1997, is an ardent free marketeer, though he admits that when it comes to cybersecurity, some government oversight is necessary. Swindle discussed his thoughts on data protection, disclosure, and security in a recent interview. Though he acknowledges that the danger is very real, Swindle argues that media reports often overstate the severity of data breaches: a headline might declare that 40 million credit card numbers have been compromised, when in reality a much smaller number, if any, will actually be used fraudulently. Still, he has counseled business leaders who claim there is little return on investment in security that there is no way to calculate the intangible damage a highly publicized breach can do to a company's reputation, to say nothing of the possibility of legal action. Swindle believes companies should be required to notify their customers if their information has been compromised, but the timing remains a question: it is unreasonable to ask companies to immediately notify every customer whose information could fall into the wrong hands because of a missing laptop, particularly when the laptop could simply be misplaced and turn up a few days later. Swindle believes companies should be responsible for the personal information they store, just as banks are responsible for their customers' money. As more states follow California's lead and adopt mandatory notification laws, competing compliance requirements could emerge, which is why Swindle advocates a national disclosure law. As a civil law enforcement agency, however, the FTC often has difficulty enforcing existing laws and must seek help from the Justice Department to pursue a criminal case. Swindle praised the FTC for the $15 million fine against ChoicePoint, and called on current and future politicians to make technology and security high-profile issues.


Dependable Software By Design
Scientific American (06/06) Vol. 294, No. 6, P. 68; Jackson, Daniel

The reliability of software designs can be evaluated and ensured by powerful analysis tools such as Alloy, which combines a language that simplifies the modeling of complex software designs with an analysis engine that automatically searches deeply for conceptual and structural faults. Alloy was produced by the Software Design Group at the MIT Computer Science and Artificial Intelligence Laboratory. Alloy and similar tools use automated reasoning to treat software design problems as huge puzzles to be solved. Like model checking, Alloy considers all possible scenarios, but rather than stepping through whole scenarios one at a time, it searches for a failure-generating scenario by filling in each state automatically, bit by bit, in no particular order. A design-checking tool's usefulness lies in its ability to find counterexamples showing how a system could fail to behave as expected, thereby signaling a design flaw. Alloy's analysis engine is built on a SAT (satisfiability) solver that lets it run through all possible scenarios within a bounded scope in its search for counterexamples. To use Alloy, a software engineer writes a precise model of the system that spells out its mechanisms and specific behaviors, the facts that constrain the proper functioning of those components, and finally the assertions that are expected to follow from the facts; writing this out makes the design's limitations explicit and forces engineers to carefully consider the best abstractions. Improving software reliability is an increasingly critical issue as computer software plays a greater and greater role in daily life, while current software testing methods frequently miss basic design flaws.
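Alloy's key move, checking an assertion against every scenario up to a small size bound, can be imitated in a few lines. The Python toy below exhaustively enumerates all unary functions over a tiny universe and hunts for a counterexample to a plausible-sounding claim; it is a conceptual illustration of small-scope counterexample search, not Alloy's relational language or its SAT-based engine, and the claim being checked is invented for the example.

    from itertools import product

    UNIVERSE = range(3)  # small scope: the bet is that most design flaws
                         # already show up in tiny instances

    def all_unary_functions(universe):
        """Every function f: universe -> universe, encoded as a tuple
        where f[i] is the image of i."""
        return product(universe, repeat=len(universe))

    def check(assertion):
        """Search exhaustively for a counterexample within the scope;
        None means the assertion holds for every scenario checked."""
        for f in all_unary_functions(UNIVERSE):
            for x in UNIVERSE:
                if not assertion(f, x):
                    return f, x  # counterexample found
        return None

    # Plausible but wrong claim: "if applying f twice returns x, then f
    # must already fix x", written as an implication.
    claim = lambda f, x: (f[f[x]] != x) or (f[x] == x)
    print(check(claim))  # e.g. f = (0, 2, 1), x = 1: f swaps 1 and 2

A real SAT-backed tool searches the same bounded space symbolically rather than by brute force, which is what lets Alloy handle models far too large to enumerate, but the output is the same kind of artifact: a concrete scenario that falsifies the designer's assertion.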


To submit feedback about ACM TechNews, contact: [email protected]

To unsubscribe from the ACM TechNews Early Alert Service: Please send a separate email to [email protected] with the line

signoff technews

in the body of your message.

Please note that replying directly to this message does not automatically unsubscribe you from the TechNews list.

ACM may have a different email address on file for you, so if you're unable to "unsubscribe" yourself, please direct your request to: [email protected]

We will remove your name from the TechNews list on your behalf.

For help with technical problems, including problems with leaving the list, please write to: [email protected]


News Abstracts © 2006 Information, Inc.


© 2006 ACM, Inc. All rights reserved. ACM Privacy Policy.