       HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 5, Issue 564:  Wednesday, October 29, 2003

  • "Stored Data Doubles in Three Years"
    IDG News Service (10/29/03); Gross, Grant

    A study headed by professors Peter Lyman and Hal Varian of the School of Information Management and Systems at the University of California at Berkeley finds that the amount of new stored information has increased 100 percent in the past three years to total five exabytes, with 92 percent of that data stored on magnetic media. The amount of stored information is growing at a yearly rate of approximately 30 percent; this is ushering in a "real change in our human ecology," according to Lyman, who points out that all this stored data is not always accurate. EMC's Gil Press says that Lyman and Varian's study emphasizes the need for companies to structure and manage their information more intelligently. The amount of information that flowed through electronic channels such as the Internet, radio, TV, and the telephone in 2002 was 3.5 times greater than the volume of data that was stored; Lyman says the bulk of that flow was exchanged via voice telephone calls. The UC Berkeley professors estimate that the biggest share of information flow--17.3 exabytes if it were stored digitally--was produced by telephone, while email accounts for roughly 400,000 terabytes of new data annually, and the World Wide Web encompasses 172 terabytes of data on public pages. Lyman's team calculates that the number of terabytes of data committed to paper on a yearly basis rose by 36 percent between 1999 and 2001, while the amount of magnetically stored information increased 80 percent between 1999 and 2002. Lyman notes that there are considerable distinctions in the "accessibility and usability and trustworthiness" of information from various sources, but the UC Berkeley study does not focus on that. He says a follow-up study will concentrate on how people and enterprises consume such huge amounts of information.
    Click Here to View Full Article
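
    The study's headline figures can be sanity-checked with a little arithmetic; the short Python sketch below projects storage volume forward at the cited 30 percent annual rate and converts the Web's 172 terabytes into exabytes (the projection itself is only illustrative, not the researchers' method).

      # Back-of-the-envelope check of the study's growth figures (illustrative only;
      # the 30 percent rate, 5-exabyte total, and 172 TB Web figure come from the article).
      TB_PER_EB = 1_000_000          # 1 exabyte = 1,000,000 terabytes (decimal units)

      new_storage_eb = 5.0           # new stored information in 2002, per the study
      annual_growth = 0.30           # approximate yearly growth rate cited by Lyman

      for year in range(2003, 2007): # project the next few years at that rate
          new_storage_eb *= 1 + annual_growth
          print(f"{year}: ~{new_storage_eb:.1f} EB of new stored information")

      web_tb = 172                   # public Web pages, per the study
      print(f"Public Web: {web_tb} TB = {web_tb / TB_PER_EB:.6f} EB")

    At 30 percent a year, volume roughly doubles every two to three years (1.3 cubed is about 2.2), which is consistent with the study's finding that stored information doubled over the past three years.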

  • "Group Lobbies for Domain Buyers' Privacy"
    CNet (10/28/03); McCullagh, Declan

    The ACM joined a coalition of groups from around the world, including the American Library Association, the Australian Council for Civil Liberties, Electronic Frontier Finland, and the U.K. Foundation for Information Policy Research, in drafting a letter to ICANN President Paul Twomey in favor of allowing entities to register domain names without releasing telephone numbers, email addresses, and mailing addresses. ICANN now mandates the public listing of such information in Whois directories, but is weighing issues of privacy and accountability. Last month, Twomey publicized the fact that a Whois workshop had met in June, and ICANN is currently set to consider Whois accuracy and privacy concerns at its meetings this week in Tunisia. "The Whois database was originally intended to allow network administrators to find and fix problems to maintain the stability of the Internet," the draft letter from the coalition states. "Anyone with Internet access can now have access to Whois data, and that includes stalkers, governments that restrict dissidents' activities, law enforcement agents without legal authority, and spammers." A recent European Commission working group report classifies the Whois directory as subject to data protection provisions in the European Data Protection Directive, while the Electronic Privacy Information Center goes a step further in recommending permission for anonymous purchase of domain names.
    Click Here to View Full Article
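
    The Whois data at issue is served over a very simple protocol: a client opens a TCP connection to port 43 of a registry server, sends a query, and reads back free-form text. The minimal Python sketch below illustrates this; the server name and domain are examples of my choosing, and what contact details come back varies by registry, which is precisely the privacy question the coalition raises.

      # Minimal Whois lookup over TCP port 43 (the classic Whois protocol).
      import socket

      def whois_query(server: str, query: str, timeout: float = 10.0) -> str:
          with socket.create_connection((server, 43), timeout=timeout) as sock:
              sock.sendall((query + "\r\n").encode("ascii"))
              chunks = []
              while True:
                  data = sock.recv(4096)
                  if not data:          # server closes the connection when done
                      break
                  chunks.append(data)
          return b"".join(chunks).decode("utf-8", errors="replace")

      if __name__ == "__main__":
          # Example query against the .com registry's Whois server.
          print(whois_query("whois.verisign-grs.com", "example.com"))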

  • "Xerox Claims Chip Innovation"
    Rochester Democrat & Chronicle (10/29/03); Mullins, Richard; Rand, Ben

    Xerox researchers have announced a simple, inexpensive method for printing microchips, a breakthrough that could allow the company to enter the lucrative electronic display market and clear the way for flexible screens and perhaps ubiquitous electronics. At the core of the breakthrough is Xerox researcher Beng Ong's discovery of a way to produce a conductive plastic ink; at a Boston conference, he presented the idea that circuits could be drawn with this ink, even on flexible surfaces, using conventional printers. Raj Apte of Xerox's Palo Alto Research Center says his lab built such a printer, which employs tiny internal cameras to ensure that the ink droplets are deposited in the proper arrangement. The printer makes multiple passes during the process, each time laying down different circuits. "I see flexible displays on items that wouldn't be possible now because of the cost," notes Carl Schauffele of the University of Rochester, such as refrigerator doors that can display the items inside. Xerox intends to use the printed circuits as control mechanisms for displays manufactured by its Gyricon spinoff. Researchers also believe the chip-printing process could lower production costs for minuscule electronics that do not necessarily need high-powered chips in order to function, one example being transmitters used to tag groceries or other retail items. Researchers from Motorola Labs and Dow Chemical are working with Xerox on the project under a grant from the National Institute of Standards and Technology. Apte says, "I'm thinking about displays that come down from your bedroom ceiling and roll back up. That could be five years away."
    Click Here to View Full Article

  • "E-Vote Protest Gains Momentum"
    Wired News (10/28/03); Zetter, Kim

    When Diebold Election Systems demanded that a student at Swarthmore College take down controversial memos posted online on the grounds that they violated the Digital Millennium Copyright Act, fellow students launched a campaign of civil disobedience that has attracted support from other academic institutions across the United States. The memos indicate that Diebold was aware of security problems with its electronic voting machines for a considerable amount of time before it sold the units to states such as Maryland, Georgia, and California. Though Swarthmore Dean Robert Gross applauded the principles behind the students' protest, he felt the college was legally compelled to comply with Diebold's instructions, a move that earned him considerable enmity. "My concern and I think the concern of the students is to focus attention on electoral fraud," Gross explains. "The copyright stuff is a sideshow." Ivan Boothe of the Why War? student group, which helped launch the memo campaign, says his association received emails of support from Swarthmore alumni, as well as lawyers and professors from other schools; he is also urging more students and schools to post Web links to sites that feature the memos. Boothe's group is part of a nationwide movement to subject e-voting systems to closer inspection in order to avoid electoral fraud, most notably by ensuring that the machines produce a voter-verifiable receipt and requiring voting machine manufacturers to make the machines' source code available to the public. Both conditions are part of a bill introduced by Rep. Rush Holt (D-N.J.) that would revise the Help America Vote Act (HAVA), which promises money to states that modernize their voting systems to avoid the sort of problems that plagued the last presidential election. Holt says revisions to HAVA are necessary "if you want to have a democratic process that citizens have confidence in."
    Click Here to View Full Article

    To read more about ACM's activities regarding e-voting, visit http://www.acm.org/usacm/Issues/EVoting.htm.

  • "Scottish Universities Plan Speckled Computing Net"
    EE Times (10/27/03); Roman, David

    Developing a distributed asynchronous network of self-powered nodes about a cubic millimeter in size is the goal of five Scottish universities that received a $2.1 million grant from the Scottish Higher Education Funding Council on Oct. 1 to get the project off the ground. The universities teamed up to form the SpeckNet Consortium to flesh out the concept of "speckled computing," which will be applied to two projects initially: One will distribute thousands of nodes onto the chest of a patient with coronary heart disease for diagnostic purposes, and the other will tag individual components of a child's puzzle or an unassembled machine. SpeckNet director D.K. Arvind of the University of Edinburgh's School of Informatics reports that the computing specks will form the basis of ubiquitous network computing, while a major focus will involve how computer performance can be delivered in a resource-limited environment. The speckled computing project bears a certain resemblance to the U.S. Defense Advanced Research Projects Agency's "smart dust" project and Intel Research's ad hoc sensor network, but Arvind boasts that the speckled-computing network will be more sophisticated because it will enable in-situ computation and programmability. The first working demonstration of the speckled-computing network is scheduled to take place in four years, while the SpeckNet initiative itself is conceived as a 12- to 15-year project. "The amount of technology that needs to be developed is huge, in the areas of signal processing, operating systems, sensing technology, battery technology," explains Iain Thayne of the University of Glasgow. "There are high-tech challenges that have to be resolved, and this is one way of getting real neat solutions." The SpeckNet Consortium's member institutions include the universities of Edinburgh, Glasgow, St. Andrews, Strathclyde, and Napier.
    Click Here to View Full Article
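
    The article describes goals rather than an implementation, but the flavor of a "speckled" network--many tiny, asynchronous, resource-limited nodes computing in situ--can be suggested with a toy simulation. The Python sketch below is entirely hypothetical: the node behavior, the temperature readings, and the eight-sample memory window are all invented for illustration.

      # Toy simulation of asynchronous "specks" exchanging readings and computing locally.
      import asyncio
      import random

      class Speck:
          def __init__(self, name: str):
              self.name = name
              self.neighbors = []
              self.readings = []

          def receive(self, value: float) -> None:
              self.readings = (self.readings + [value])[-8:]   # assume tiny memory

          async def run(self, cycles: int) -> None:
              for _ in range(cycles):
                  await asyncio.sleep(random.uniform(0.01, 0.05))  # no global clock
                  reading = random.gauss(37.0, 0.5)   # e.g., a chest-temperature sample
                  for peer in self.neighbors:
                      peer.receive(reading)
                  if self.readings:
                      avg = sum(self.readings) / len(self.readings)
                      print(f"{self.name}: local average of neighbors = {avg:.2f}")

      async def main() -> None:
          specks = [Speck(f"speck-{i}") for i in range(5)]
          for s in specks:
              s.neighbors = [p for p in specks if p is not s]
          await asyncio.gather(*(s.run(cycles=3) for s in specks))

      asyncio.run(main())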

  • "UB Researchers Hope to Find the Key to Tiny Electronics"
    Buffalo News (10/27/03); Williams, Fred O.

    University of Buffalo Electronics Packaging Lab researchers are doing critical work in the field of electronics, figuring out how to overcome the packaging bottleneck that prevents computer chips from becoming even more useful and powerful. Although chips themselves rarely fail, the interconnects between them routinely fail due to mechanical stress caused by electrical current and intense heat. Lab director Cemal Basaran says one of his doctoral students recently discovered an important non-traditional way to prevent circuit failure due to current stress: Using a miniature heat sink, a thermal gradient running opposite the current can prevent atoms from moving and creating "voids," or weak spots in the atomic structure. Lab co-director Alexander Cartwright says his group is very important to the U.S. microelectronics industry and that there are only two other academic laboratories like it--at the University of Maryland and Georgia Tech. His lab has received $8.4 million in research funding from the likes of Intel, Motorola, and Delphi since it started in 1997, and produces public-domain research that chip companies use to create better products. Intel researcher Ravi Mahajan recently appealed to the National Science Foundation to direct more research dollars to the University of Buffalo lab, saying the technical advances made there increase U.S. competitiveness against foreign industry. Intel currently uses a laser imaging system and "nano-indenter" that prods circuits to test their durability, both of which were developed at the University of Buffalo lab. Basaran says the mathematical modeling tools being used in research are even more important than the physical testing equipment because they can be used to test virtual designs, and notes that prototype chips can cost as much as a half-billion dollars.
    Click Here to View Full Article

  • "Linux and the Consumer Electronics Industry"
    Linux Insider (10/28/03); Hook, Brian R.

    The CE Linux Forum (CELF), which currently consists of more than 75 companies, supports products for the consumer electronics industry based on the Linux open-source operating system. The forum is designed to be a place where member companies such as Sony, Matsushita, NEC, Hitachi, Samsung, Royal Philips, Toshiba, Sharp, and CACMedia can pool their resources to outline the requirements of emerging technologies, work together on development tactics, and promote the spread of Linux-based digital CE products. "Linux is already in use in...consumer electronics products as diverse as cell phones and personal video recorders--and in that regard Linux already has a place in the consumer-electronics space," notes CELF steering committee chairman Scott Smyers, who adds that the forum's mission is not to impede rival OS platforms. CACMedia, the latest CE company to join CELF, came aboard in order to share technology and breakthrough concepts with other major CE manufacturers, according to CACMedia founder and CEO Ken Nelson, who is convinced that Linux will remain the predominant OS platform. Stratego director Lloyd Switzer believes the low-margin business of consumer electronics will enable CELF to flourish. "If an operating system does not easily allow other devices to connect or third-party software to be written for a device, it will lose," he declares. Analyst Rob Enderle says CELF's chances of success at achieving its goal will depend on the level of funding the group receives, as well as whether the forum can maintain its concentration. He also notes that many CE vendors are averse to sharing their technology and innovations, which could be another stumbling block for CELF.
    Click Here to View Full Article

  • "Artificial Intelligence Pioneer Ponders Differences Between Computers and Humans"
    Stanford Report (10/29/03); Koch, Geoff

    Nils Nilsson, Kumagai Professor of Engineering, Emeritus, at Stanford University, maintains that fundamental distinctions will always exist between human beings and computers, but predicts that their intellectual and even creative differences will eventually shrink. In his early years at Stanford Research Institute (now known as SRI International), Nilsson's focus on neural networks led to the development of the A* algorithm, which today is used in Web-based mapping applications that furnish point-to-point driving directions online. A later project yielded one of the world's first autonomous robots, a device named Shakey that employed a TV camera, range finder, and sensors to collect data that allowed it to reach goals and follow routes. Breakthroughs in the field of AI, most notably the defeat of world chess champion Garry Kasparov by IBM's Deep Blue supercomputer, have given rise to fears that such technology is people-unfriendly and threatens to render humans obsolete in many capacities. This view is not shared by Nilsson, who thinks AI's labor-saving potential will ultimately boost the quality of life. He believes the key AI challenge of the next half-century will be imbuing computers with human-level capabilities, and says that David Cope of the University of California-Santa Cruz has undertaken very promising research. Cope has devised intelligent software that can imitate the style of famous composers and compose scores that, when performed live, even music enthusiasts are hard-pressed to recognize as the work of a machine. In an email interview, Cope explained, "I'm not sure that using computers as tools makes composing any...less a 'province of human beings.'" Nilsson also praises "university-style" research, but notes that funding is harder to obtain for such curiosity-driven work since agencies such as the National Science Foundation and DARPA increasingly seek to fund research with specific societal or technical objectives.
    Click Here to View Full Article
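
    The A* algorithm mentioned above is compact enough to show in full. The Python sketch below is a standard textbook formulation of A* search with an admissible heuristic, in the spirit of the mapping applications the article cites; the toy road network and the driving-time estimates are invented for illustration.

      # A* search over a weighted graph; returns the lowest-cost path and its cost.
      import heapq

      def a_star(graph, h, start, goal):
          """graph: node -> list of (neighbor, edge_cost); h(node): estimated cost to goal."""
          open_heap = [(h(start), 0.0, start, [start])]
          best_g = {start: 0.0}
          while open_heap:
              f, g, node, path = heapq.heappop(open_heap)
              if node == goal:
                  return path, g
              for nbr, cost in graph.get(node, []):
                  new_g = g + cost
                  if new_g < best_g.get(nbr, float("inf")):
                      best_g[nbr] = new_g
                      heapq.heappush(open_heap, (new_g + h(nbr), new_g, nbr, path + [nbr]))
          return None, float("inf")

      # Hypothetical road network: edge weights are driving minutes; the heuristic is
      # an optimistic straight-line estimate of the minutes remaining to "D".
      roads = {"A": [("B", 5), ("C", 10)], "B": [("D", 12)], "C": [("D", 4)], "D": []}
      estimate = {"A": 9, "B": 8, "C": 4, "D": 0}
      print(a_star(roads, estimate.get, "A", "D"))   # -> (['A', 'C', 'D'], 14)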

  • "Full-Featured PC Fits in Pocket"
    New Scientist (10/28/03); Graham-Rowe, Duncan

    Antelope Technologies plans to debut a pocket-sized device that functions as both a fully-featured PC and a handheld computer in early November. The core unit, which Antelope touts as the first modular computer in the world, can slide into a docking station so it works in tandem with a screen and keyboard as a desktop, or into a mobile "shell" with touchscreen technology that allows it to be used as a handheld. "Modular computers will change the way people use their computer," declares Antelope President Kenneth Geyer. The Modular Computer Core (MCC), which contains a 1 GHz microprocessor from Transmeta, 256 MB of RAM, and a 10 or 15 GB Toshiba hard drive, costs nearly $4,000. Jason Brotherton, a ubiquitous computing researcher at the University College London's Interaction Centre, says, "It's a revolutionary concept, but I'm not sure everyone is ready for it yet." Geyer says the device will initially target corporate users, and contends that the MCC is advantageous because it can eliminate the need to use a desktop, laptop, and personal digital assistant (PDA), as well as save users the time required to transfer data between these three devices. Reaction to the MCC and its promised capabilities has been mixed: Although some users are attracted to the idea of owning a compact, all-in-one mobile PC, others counter that the device offers less power than laptops and less portability than PDAs. Antelope licensed IBM's patented Meta Pad design in order to install all the required components into the small confines of the MCC.
    Click Here to View Full Article

  • "Queries Guide Web Crawlers"
    Technology Research News (10/29/03); Patch, Kimberly

    Contraco Consulting and Software, T-Online International, and Germany's Siegen University have collaborated on an algorithm that improves Internet search results by taking into account the specific objects of people's searches. Contraco partner Andreas Schaale explains that Vox Populi, as the algorithm is called, studies patterns in people's Web searching behavior in order to extrapolate trends, and then orders Web crawlers to comprehensively catalog relevant Web sites. He adds that Vox Populi "answers the question 'What are most of the people searching for?'" Web crawlers come in several flavors: Focused crawlers index pages connected to particular subjects, while adaptive crawlers rank uncrawled pages according to the appropriateness of crawled pages. Schaale says the rising cost of storing and handling data is driving the need for crawlers to be directed by query feedback. Vox Populi also utilizes techniques to curb the appearance of junk email or undesirable content, and Schaale says filtering plays a more important role than the algorithm in making the technique effective. Filippo Menczer of Indiana University acknowledges the potential for context-based approaches such as Vox Populi to greatly improve Internet searches, but points out that the mathematical scheme needs to be perfected. Schaale explains that Vox Populi can be combined with ranking techniques employed by search engines, and used in vertical information systems that query by subject as well as personalized searches that consider a user's points of interest.
    Click Here to View Full Article
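
    The article gives only a high-level account of Vox Populi, but the general idea of letting query feedback steer a crawler is easy to illustrate. The Python sketch below is not the Vox Populi algorithm; it simply keeps a crawl frontier ordered by how well each link's text matches the terms users have been searching for, with a made-up query log and made-up URLs.

      # Query-feedback crawl frontier: fetch first the pages whose link text
      # overlaps most with what people are actually searching for.
      import heapq
      from collections import Counter

      query_log = ["cheap flights", "flight deals", "linux kernel", "cheap hotels"]
      term_weights = Counter(t for q in query_log for t in q.lower().split())

      def score(link_text: str) -> float:
          terms = link_text.lower().split()
          return sum(term_weights[t] for t in terms) / (len(terms) or 1)

      frontier = []   # max-heap simulated with negated scores

      def enqueue(url: str, link_text: str) -> None:
          heapq.heappush(frontier, (-score(link_text), url))

      enqueue("http://example.com/flights", "cheap flights and flight deals")
      enqueue("http://example.com/kernel", "linux kernel changelog")
      enqueue("http://example.com/misc", "company picnic photos")

      while frontier:
          neg_score, url = heapq.heappop(frontier)
          print(f"crawl {url} (priority {-neg_score:.2f})")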

  • "As Silicon Valley Reboots, the Geeks Take Charge"
    New York Times (10/26/03) P. BU1; Lohr, Steve

    A number of small companies are thriving in Silicon Valley, staffed with experienced hardware and software engineers who understand that business plans are not as valuable as good technology. Tellme Networks' business is finally taking off after four years of searching for a market for its voice recognition technology: Whereas plans for a consumer information service failed with the dot-com bust, Tellme has signed on large business clients who use the technology to save millions in call center costs. Started by Netscape alumni, Tellme's original plan was to let people's natural-language queries be translated into Web searches over the phone, and the technology is now helping AT&T automate directory assistance operations, for example. InterTrust Technologies is another example of technical prowess trumping business aspirations--InterTrust vice president William Rainey says he had to fire 350 of the firm's 380 employees over the past few years as the company struggled to find a market niche for its digital rights management technology. Ultimately, the formerly public firm was bought outright, on the strength of its technology, by an investment group that includes Sony and Philips. VMware was largely shunned by Valley investors during the late 1990s because it was "an old-fashioned software company," according to CEO Diane Greene, who founded the firm together with her husband, Stanford University computer science professor Mendel Rosenblum. VMware today has a growing following among corporate clients, who use its virtual machine software to increase data center utilization rates. Scalix CEO Julie Hanna Farris said she could not resist starting another company after co-founding three successful firms--Portola, 2Bridge, and Onebox.com; Scalix takes advantage of neglected Hewlett-Packard OpenMail software to create a Linux-based corporate email application.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Hollywood to the Computer Industry: We Don't Need No Stinking Napsters!"
    Salon.com (10/27/03); Manjoo, Farhad

    Arguing that unchecked digital piracy will signal the end of free TV, the Motion Picture Association of America (MPAA) wants the government to require digital TV broadcasts to include a broadcast flag, thus requiring hardware makers to design machines that comply with certain copy-protection schemes in order to play these broadcasts. Critics slam this move, arguing that it will stifle innovation. The broadcast flag proposal the MPAA submitted to the FCC for consideration includes a list of safeguards computer manufacturers could incorporate into their digital-TV machines, an approach that, if mandated, would go against the self-certification model the tech industry relies on. "Under the original proposal, no [copy-protection] technology could be approved without at least two motion picture companies approving it," notes Electronic Frontier Foundation attorney Fred von Lohmann, who adds that the MPAA's scheme would unconditionally outlaw open-source digital TV. The MPAA insisted in its submission that the broadcast flag solution would not impede innovation or the development of open-source software, because the list of approved copy-protection technologies would be revised regularly according to market demand. Von Lohmann observes that thousands of digital TVs have already been sold, and these devices will simply ignore the broadcast flag, a situation that he says will make the flag "absolutely and completely useless." Meanwhile, critics are highly skeptical that HDTV trading will become widespread, as the MPAA asserts, because of the enormous amount of time it takes to download HDTV broadcasts, not to mention the huge storage requirements of HDTV files. Critics also point out that there is no guarantee that HDTV trading would hurt Hollywood's revenues--it could even boost the studios' bottom line, as the VCR, another technology the MPAA initially maligned, eventually did.
    Click Here to View Full Article
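
    The critics' point about download times is easy to sanity-check with rough numbers. The figures in the Python sketch below are my assumptions rather than the article's: an over-the-air HDTV broadcast runs at roughly 19.4 megabits per second, while a typical 2003-era DSL line downloads at about 1.5 megabits per second.

      # Rough estimate of HDTV file size and download time (assumed bitrates).
      broadcast_mbps = 19.4     # approximate ATSC broadcast bitrate
      dsl_mbps = 1.5            # assumed consumer broadband speed
      movie_hours = 2

      bits = broadcast_mbps * 1e6 * movie_hours * 3600
      print(f"File size: ~{bits / 8 / 1e9:.1f} GB")                      # ~17.5 GB
      hours = bits / (dsl_mbps * 1e6) / 3600
      print(f"Download time at {dsl_mbps} Mbit/s: ~{hours:.0f} hours")   # ~26 hours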

  • "The Next Wave"
    Age (AU) (10/28/03); Barker, Garry

    A desktop supercomputer is what aficionados claim the next stage of the Internet's evolution, the "grid," will be like, when vast computing resources will be available to anyone with a Web connection. "The notion here is that a grid of computers can create a flexible computing infrastructure for hosting centers so that users can be assigned as much or as little computing infrastructure as they need," explains Dr. Hugh Bradlow of Australia's Telstra Research Laboratories. Advanced applications the grid is expected to support include 3D holograms for gaming and retail, remote surgery, and virtual tourism, while Daniel Atkins of the U.S. National Science Foundation (NSF) is convinced that grid computing will open up new scientific research vistas. However, even the most enthusiastic grid proponents think that grid computing will most likely manifest itself as improved, faster, and cheaper services from public and private entities connected to the grid. Such services include more rapid development of pharmaceuticals, faster testing of structural integrity for engineering projects, and accelerated development of new construction materials. Peter Freeman of the NSF cited the need for grid computing in a recent report to Congress, arguing that "The calculations and the quantities of information that can be stored, transmitted and used are exploding at a stunning, almost disruptive rate." Although a considerable amount of the data swamping the Internet is malevolent or junk, much more is viewed as critical to engineering, science, and commerce. High-end computers are currently capable of processing terabytes of data each second, and that capability will soon be raised to the petabyte level; and there will soon be a need for machines that can process exabytes--a million trillion bytes--of data per second.
    Click Here to View Full Article

  • "Patchy Years Ahead for Software Users"
    IDG News Service (10/23/03); Pruitt, Scarlet

    Network administrators are finding most of their time taken up with deploying software patches to fix network vulnerabilities or upgrade features, and there appear to be few signs of relief on the horizon, despite announcements from patch vendors that they are aware of the problem and are working to simplify the patching process. Ecora CEO Alex Bakman estimates that applying a patch to each machine in a company's system takes half an hour on average, and notes that recent outbreaks of worms such as Slammer and Blaster have exacerbated the situation. He also says that many companies are not installing essential patches out of concern that they might "break" applications, and they refuse to deploy them during critical times in the fiscal year, such as prior to a major retail or holiday season. Gartner analyst John Pescatore declared at the Gartner Security Summit that patching on the desktop, and its associated problems, have at least two more years of life. Gartner analysts recommended in March that companies institute a patch management strategy in which the most critical security patches are prioritized and the patch installation requirements are thoroughly assessed. Gartner advised companies to test all patches before implementation and to define server and desktop configurations as standard and nonstandard so they can be patched according to their particular requirements; it was also recommended that enterprises only accept official patches and give the patch management infrastructure the same level of protection as their outward-facing Web and application servers. Users say the patching situation is symptomatic of wider software problems, in which new security flaws that must be patched are continuously discovered, adding to the total cost of ownership. Writing flawless software is an impossible goal, since human coders are inherently vulnerable to error.
    Click Here to View Full Article
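
    The prioritization Gartner describes lends itself to simple tooling. The Python sketch below is an illustrative triage script, not Gartner's methodology: the severity scale, the exposure bonus, and the fast-track threshold are assumptions, and only vendor-official patches are accepted, as the analysts recommend.

      # Rank pending patches by severity and exposure; accept only official patches.
      from dataclasses import dataclass

      @dataclass
      class Patch:
          name: str
          severity: int           # 1 (low) .. 5 (critical)
          internet_facing: bool   # affects outward-facing servers?
          vendor_official: bool

      def priority(p: Patch) -> int:
          return p.severity * 2 + (3 if p.internet_facing else 0)

      pending = [
          Patch("RPC security fix", severity=5, internet_facing=True, vendor_official=True),
          Patch("cosmetic UI update", severity=1, internet_facing=False, vendor_official=True),
          Patch("unofficial hotfix", severity=4, internet_facing=True, vendor_official=False),
      ]

      accepted = [p for p in pending if p.vendor_official]
      for p in sorted(accepted, key=priority, reverse=True):
          track = "fast-track test and deploy" if priority(p) >= 10 else "standard test cycle"
          print(f"{p.name}: priority {priority(p)} -> {track}")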

  • "The Decentralization Imperative"
    Technology Review (10/24/03); Guizzo, Erico

    MIT Sloan School of Management professor of information systems Thomas Malone says cheaper communications are transforming business organization. He says people overestimated how quickly technology would change business in the late 1990s and that the current disillusionment about business technology obscures the important changes underway. Malone expects communications tools such as email and the Web to radically change the way companies operate by giving all employees access to critical information; with this information, workers will be able to make important business decisions for the company and consequently show more creativity, motivation, and dedication in their work. These types of decentralized operations will have the economic benefits of large corporations, but the human benefits of smaller companies. Examples of this type of change can be seen in firms such as electric-power producer AES, which gives low-level employees the authority to make large business decisions since they can access pertinent information and human references throughout the company. In Spain, Mondragon Cooperatives gives every employee a right to vote on major company decisions and on the firm's leadership. Malone sees this type of decentralization replacing the hierarchical systems popular in the last 20 years. In addition, Malone says cheaper communications will lead to smaller, loosely formed groups of companies or even individual freelancers who can compete head-to-head with traditional firms in what he calls an "e-lance" economy.
    Click Here to View Full Article

  • "'Net Security Gets Root-Level Boost"
    Network World (10/27/03) Vol. 20, No. 43, P. 1; Marsan, Carolyn Duffy; Garretson, Cara

    The domain name system (DNS) is stronger than ever, one year after the massive distributed denial-of-service attack that clogged traffic flowing between several root servers and the Internet. Operators of the 13 root servers have been deploying mirror servers around the globe using Anycast technology, so that Internet requests to a particular root server's IP address block can be handled by whichever of those servers is geographically closest. Besides making DNS more resilient to attack, distributing DNS through Anycast has the added benefit of speeding address requests by ISPs. Nominum Chairman and DNS inventor Paul Mockapetris says DNS is "more resilient than it was a year ago by a factor of two," though some enterprise network administrators warn that no Internet security scheme is 100 percent effective. Though the DNS attack last October did not actually crash any root servers and its effect on general Internet performance was minimal, it was the largest such attack and an eye-opener for DNS operators; since then, four root server operators have mirrored their server data across the globe, starting with the Internet Software Consortium last November, which began spreading access to its F root to sites as far away as New Zealand, Canada, and Brazil. VeriSign has also mirrored its J root in six U.S. locations and two places in Europe, while Ken Silva, VeriSign's vice president of networks and information security, says the A root the company also operates still shares infrastructure with the Whois and InterNIC FTP systems. Once those legacy links are severed, VeriSign will use Anycast to distribute the A root as well; Silva says Anycast technology was tested for about a year before deployment and has not yet caused any disruptions.
    Click Here to View Full Article
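
    One reason Anycast mirroring is transparent is that a resolver still sends its query to a root server's single published IP address; routing, not the client, decides which mirror answers. The Python sketch below hand-builds a minimal DNS query for the root zone's NS records and sends it to the F root's address, 192.5.5.241, the server the Internet Software Consortium has mirrored worldwide; it is a bare-bones illustration, not a real resolver.

      # Send a raw DNS query (root zone, NS records) to the F root server over UDP.
      import socket
      import struct

      def build_query(labels, qtype=2, qclass=1):      # qtype 2 = NS, qclass 1 = IN
          header = struct.pack(">HHHHHH", 0x1234, 0x0100, 1, 0, 0, 0)
          qname = b"".join(bytes([len(l)]) + l.encode("ascii") for l in labels) + b"\x00"
          return header + qname + struct.pack(">HH", qtype, qclass)

      query = build_query([])                          # an empty label list means the root "."
      with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
          sock.settimeout(5)
          sock.sendto(query, ("192.5.5.241", 53))      # f.root-servers.net
          response, _ = sock.recvfrom(4096)

      answer_count = struct.unpack(">H", response[6:8])[0]
      print(f"Received {len(response)} bytes with {answer_count} answer records")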

  • "The Future of Software Bugs"
    Computerworld (10/27/03) Vol. 31, No. 49, P. 32; Thibodeau, Patrick

    The continuing threat of software bugs stems from a variety of factors, including software vendors and in-house development teams that rush testing and sacrifice quality so they can rapidly move products to market; academic computer science programs that place more emphasis on development than testing; and legislation that absolves developers of the blame for damages users suffer as a result of faulty products. A major problem is the fact that "Most software projects have not been designed for quality," according to Herb Krasner, director of the University of Texas at Austin's Software Quality Institute. A long-term initiative, partly spurred by the software glitch that led to the destruction of the Mars Polar Lander in 1999, aims to define software quality standards that encompass the properties of reliability, usability, efficiency, maintainability, and portability. The development of such standards is one of the missions of Carnegie Mellon University's Sustainable Computing Consortium (SCC), and SCC director William Guttman says the measurement of security, dependability, and other traits will enable users to make quality-based software purchases. Meanwhile, a project underway at MIT is concentrating on the development of automated software testing processes through the creation of algorithms for generating "inputs," or software instructions. Florida Institute of Technology computer science professor Cem Kaner chiefly blames the dearth of quality software on a lack of legal liability among vendors for defective products, an issue that has become especially urgent in light of highly damaging virus outbreaks that target flawed software. The National Institute of Standards and Technology reported in 2002 that users and vendors spend $60 billion annually because of insufficient software testing.
    Click Here to View Full Article
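
    The flavor of automated input generation is easy to convey, even though the MIT work itself is only sketched in the article. The Python example below is a generic illustration, not the MIT technique: it generates a thousand random inputs for a deliberately buggy clamp function and counts how many of them expose the planted off-by-one error.

      # Generated-input testing: compare a buggy function against a trusted reference.
      import random

      def clamp(x: int, lo: int, hi: int) -> int:
          # Intended behavior: limit x to the range [lo, hi].
          if x < lo:
              return lo
          if x >= hi:          # planted bug: should be x > hi ...
              return hi - 1    # ... and should return hi
          return x

      def run_generated_tests(trials: int = 1000) -> None:
          random.seed(1)
          failures = 0
          for _ in range(trials):
              lo, hi = sorted(random.randint(-50, 50) for _ in range(2))
              x = random.randint(-100, 100)
              if clamp(x, lo, hi) != min(max(x, lo), hi):   # reference result
                  failures += 1
          print(f"{failures} of {trials} generated cases exposed the bug")

      run_generated_tests()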

  • "No Need to Shut Down, Just Pull the Plug..."
    New Scientist (10/18/03) Vol. 180, No. 2417, P. 23; Ananthaswamy, Anil

    Stanford University's Armando Fox believes that building computers to crash would make them much more reliable and robust tools. Fox, who heads the university's software infrastructure group, is working to design a system that lets people shut off their computers simply by pulling the plug, with recovery software attending to any problems the next time the machine is turned on. Fox notes that most operating systems effectively duplicate their recovery routines: before shutting down, they must write files to disk to save information, which takes time, and when the computer is turned on again they check those files and repair any problems anyway. Fox says preliminary tests on Windows XP and Red Hat Linux show that allowing systems to crash and recover later saves time. A radical change in software design, the crash-only Web servers Fox is building with an Internet services company could even replicate user session information across computers that communicate frequently with each other, and redirect the user of a crashed machine to another. In addition, faulty individual components within servers can be detected, crashed, and rebooted. "We can repair parts of the application on the fly, without having the whole thing come crashing down," says Fox.
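
    A minimal sketch can make the crash-only idea concrete, under assumptions of my own rather than Fox's design: every update is appended to a log and flushed immediately, there is no shutdown routine at all, and starting up and recovering from a crash are the same code path.

      # Toy crash-only key-value store: append-and-flush on write, replay on start.
      import json
      import os

      LOG = "store.log"

      def put(key: str, value: str) -> None:
          with open(LOG, "a", encoding="utf-8") as f:
              f.write(json.dumps({"k": key, "v": value}) + "\n")
              f.flush()
              os.fsync(f.fileno())    # pulling the plug loses at most the write in flight

      def recover() -> dict:
          state = {}                  # startup and crash recovery are one code path
          if os.path.exists(LOG):
              with open(LOG, encoding="utf-8") as f:
                  for line in f:
                      try:
                          rec = json.loads(line)
                      except json.JSONDecodeError:
                          break       # a torn final record from a crash is simply dropped
                      state[rec["k"]] = rec["v"]
          return state

      put("session:42", "page=checkout")
      print(recover())                # same result whether the process exited cleanly or was killed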

  • "Server Consolidation Using Performance Modeling"
    IT Professional (10/03) P. 31; Spellmann, Amy; Erickson, Karen; Reynolds, Jim

    The most successful server consolidation projects are those in which performance is predicted before any changes are made, and this can be done through performance modeling. Performance modeling yields a deeply abstracted set of performance metrics to define a baseline system, identification of an infrastructure's sharing capacity, qualitative analysis of consolidation options, and a complete series of metrics to monitor and evaluate how successful the consolidation is. A new technique couples performance modeling with stepwise refinement, a process that addresses how to employ performance modeling to efficiently and effectively assess consolidation alternatives and performance uncertainties. Consolidation opportunities can be assigned to one of three general categories: Centralization, physical consolidation, and data and application integration. Stepwise refinement classifies a trio of abstraction levels--system scalability, software scalability, and application optimization--that each gather data in the areas of application, execution environment, workload, and resource usage. Centralization can be predicted through system scalability analysis if the current production environment is characterized on the tier level, while analysis and modeling should concentrate on identifying changes in user response times, network connectivity, and application chattiness. The performance impact of a physical consolidation can be anticipated through software scalability analysis, with performance modeling focusing on performance impacts for targeted applications and the optimal server configuration. Finally, predicting the performance impact of data and application integration can be accomplished via application optimization, with emphasis on finding a way to ensure service delivery when applications share technology and recognizing scalability limitations and necessary server sizings.
    Click Here to View Full Article
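
    Before building a full performance model, consolidation planners often start with a crude utilization check of the kind sketched below in Python; the utilization numbers, the relative-capacity ratio, and the 70 percent headroom threshold are illustrative assumptions, not figures from the article.

      # Can these workloads share one target server, with headroom to spare?
      def fits_on_target(workloads, capacity_ratio, headroom=0.70):
          """workloads: list of (name, CPU utilization on the current server);
          capacity_ratio: target server capacity relative to the current one."""
          combined = sum(util for _, util in workloads) / capacity_ratio
          return combined, combined <= headroom

      workloads = [("order entry", 0.35), ("reporting", 0.20), ("intranet", 0.15)]
      combined, ok = fits_on_target(workloads, capacity_ratio=1.5)
      verdict = "fits" if ok else "needs a larger server or fewer applications"
      print(f"projected utilization {combined:.0%} -> {verdict}")

    A real model goes further, because response times degrade nonlinearly as utilization approaches saturation, which is why the authors emphasize modeling response times, connectivity, and application chattiness rather than raw utilization alone.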

 
                                                                             

 