ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM.

To send comments, please write to [email protected].

Volume 5, Issue 579: Friday, December 5, 2003

  • "Help Fix Cyber-Security or Else, U.S. Tells Industry"
    Washington Post (12/04/03) P. E2; Krim, Jonathan

    Top homeland security officials warned tech companies at yesterday's National Cyber Security Summit that they will face government regulation unless they make a major commitment to improving the security of the U.S. cyberinfrastructure. The Department of Homeland Security's Robert P. Liscouski commented, "We are not going to let anybody who operates in this space dodge their responsibility." Homeland Security Secretary Tom Ridge and Amit Yoran, chief of the DHS' National Cyber Security Division, advised CEOs of all businesses in the United States to formally evaluate their companies' cybersecurity measures. Three major industry trade groups used the summit to unveil a strategy, co-developed with the University of Southern California, to regularly survey and monitor U.S. businesses' progress in bolstering cybersecurity. The tech industry hopes to stave off government regulation by increasing awareness of security threats, accelerating response times after cyberattacks, studying ways to improve software development, and making security a corporate-governance priority. Some security experts, however, have criticized this effort as an attempt to sidestep measures that software and security firms are reluctant to adopt, such as regular security audits and security breach disclosures, the sharing of security data between firms, and imposing liability on software makers for intrusions. SANS Institute director Alan Paller lamented that few representatives of the user community, which usually bears the brunt of cyberintrusion costs, were invited to speak at the summit. "The challenge is to skew the marketplace...to push either with regulations or the threat of regulations, liability or the threat of liability, standards or the threat of standards," said Solutionary chief security counsel Mark D. Rasch.
    Click Here to View Full Article

  • "Symposium to Promote Voting Machine Clarity"
    Phoenix Online (12/04/03); Sheldon-Coulson, Garth

    A Swarthmore College symposium on voting machines and electoral transparency on Saturday will bring together top thinkers from across the nation, according to organizer and Swarthmore student Steven Bhardwaj. The symposium follows a suit filed in early November by two Swarthmore students against voting machine vendor Diebold to contest its claims of copyright infringement; the students had posted online documents suggesting Diebold knew about insecure software on its voting machines. Bhardwaj says the symposium was initially conceived to investigate the memos, but the ensuing controversy has broadened its scope into a general discussion of how to ensure a more transparent and accurate electoral process. Harvard research fellow, CACM columnist, and e-voting expert Rebecca Mercuri says at least 3% of all votes go unaccounted for in every voting system, and that many systems are designed to hide that fact and save face for vendors and officials. A range of opinions is expected at the symposium, says Mercuri, from those who would return to pencil and paper to advocates of completely electronic systems that produce definitive results quickly. In 2000, electronic voting machines in Florida were suspected of erasing 16,022 votes for presidential candidate Al Gore, but there was no way to verify the claims because no voter receipts were produced and stored. Mercuri herself says e-voting accompanied by verifiable paper ballots is the best recourse. The symposium attendees, who include academics, industry representatives, government officials, and a broad range of ordinary citizens, will vote on recommendations based on the proceedings, says Bhardwaj, who adds that national television coverage is likely. Also attending will be Barbara Simons, former ACM president and current co-chair of ACM's U.S. Public Policy Committee.
    Click Here to View Full Article

    For information about USACM, visit http://www.acm.org/usacm.

  • "IT Stymied in Terror War: Panel"
    InternetNews.com (12/02/03); Mark, Roy

    The second annual homeland-defense report from the Markle Foundation's Task Force on National Security in the Information Age concludes that acrimonious debates about privacy and civil liberties are preventing the U.S. government from fully leveraging technology to combat terrorism. The private panel of IT and national security specialists that produced the report urges President Bush to set up rules delineating the security interests of private data mining programs, and to deploy oversight to ensure that such programs do not needlessly impinge on individual privacy. The panel is disappointed that Congress, under pressure from civil liberty and privacy advocates, has prohibited data mining research and development. Markle Foundation President Zoe Baird comments that government rules for data sharing and privacy safeguards must be defined through open debate if a network that uses information about U.S. residents is to gain public trust. The report calls for the government to exploit privately held information, provided that civil liberties are adequately protected, but cautions that routine access to personally identifying data--even data that is widely accessible to the public--should not be a government privilege. "If government is to sustain public support for its efforts, it must demonstrate that the information it seeks to acquire is genuinely important to the security mission and is obtained and used in a way that minimizes its impact on privacy and civil liberties," states the task force. The guidelines covering access to and use of information held by the private sector should weigh two key considerations, according to the report: how valuable the data is to the government, and how sensitive it is in terms of individual privacy and other civil liberties. The task force also calls for President Bush to better define the individual roles and responsibilities of the myriad government agencies charged with gathering and analyzing domestic terrorism information.
    Click Here to View Full Article

  • "Analysis: Nano Bill Promises Real Results"
    United Press International (12/03/03); Choi, Charles

    President Bush's Dec. 3 approval of a four-year, $3.7 billion federal commitment to nanotechnology research and development is being heralded as a revolutionary development by nanotech advocates. "It makes nanotechnology the highest priority funded science and technology effort since the space race," boasts NanoBusiness Alliance executive director F. Mark Modzelewski. The National Science Foundation (NSF) estimates that nanotech applications could inject over $1 trillion into the worldwide economy within just 10 years, while National Nanotechnology Initiative director Mihail Roco reports that global investments in nanotech development have risen by more than 50 percent annually over the last three years, to over $1 billion. Roco expects 95 percent of the almost $4 billion authorized by the nanotech bill to be devoted to scientific research and development, with academic efforts receiving about 60 percent and government labs 35 percent. The bill passed with little conflict and enjoyed wide Democratic and Republican support, as well as the backing of the NanoBusiness Alliance, ACM, and the IEEE, among others. Modzelewski believes nanotech R&D will yield significant breakthroughs in hydrogen storage, cheap photovoltaics for solar power, and early cancer detection in a relatively short time, while longer-term innovations could include organ replacement technology and quantum computing. The funding bill also calls for the establishment of a center dedicated to the study of nanotech's societal and ethical implications.
    Click Here to View Full Article

    To review a summary of the bill (S. 189) enacted into law, visit http://www.house.gov/science/press/108/S189_summary.htm.

  • "NSF Seeks Theoretical Limits of Computation"
    Government Computer News (12/02/03); Jackson, Joab

    The National Science Foundation (NSF) is launching a new effort to discover the theoretical limits of computation and network data transfer, and is seeking grant proposals for the $32 million the agency has allocated for the work. That money would be divided among 90 to 100 agency, university, and corporate research laboratories, and involve the fields of computer science, scientific computing, communications, signal-processing theory, and mathematics. NSF Division of Computing and Communication Foundations director Kamal Abdali says the research will build upon the fundamental information-transfer rules established in 1948 by Bell Labs mathematician Claude Shannon (a representative formula is sketched below). By defining the limits of computation and information transfer, Abdali expects to help discover models for optimal performance within those limits. Different types of computing may yield different limits; parallel and distributed computing may be able to handle larger computations, for example. Work will also be done on finding new algorithms and their applications, with Abdali noting that algorithm advances have actually moved the computing sector further along than hardware advances. Each NSF grant is expected to be about $125,000 per group, and the application deadline is March 4.
    Click Here to View Full Article
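
    The best-known of Shannon's 1948 results is the noisy-channel capacity theorem, which bounds how fast data can move over a given channel. The formula below is offered only as an illustration of the kind of limit the NSF program targets, not as material from the article.

      % Shannon capacity: the maximum error-free data rate C (bits per second)
      % over a channel of bandwidth B (Hz) with signal-to-noise ratio S/N.
      C = B \log_2\!\left(1 + \frac{S}{N}\right)

    For example, a 3-kHz voice line with a 30-dB signal-to-noise ratio (S/N = 1,000) tops out near 3,000 x log2(1,001), or roughly 30 Kbps, no matter how clever the modem.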

  • "IT Careers That Will Bounce Back"
    E-Commerce Times (12/03/03); Weisman, Robyn

    IT jobs that utilize vertical industry expertise or important creative skills are going to grow in demand, according to industry analysts who say that companies are focusing more on value creation after years of paring back costs. Though job cuts in the IT field are down 50 percent from one year ago, according to Challenger, Gray & Christmas CEO John Challenger, the types of jobs available today differ from those in demand at the peak of the boom years. Offshore outsourcing has eliminated many posts that dealt with more routine tasks such as application and infrastructure development. Companies can hire skilled programmers for those tasks in places such as India and Eastern Europe, paying salaries far lower than those demanded by U.S. IT workers. But Yankee Group analyst Carrie Lewis says that the IT field in the United States is bouncing back as businesses look to retrain their existing workers for higher-value tasks and to hire new workers who can create value for the company. Business process engineering and other jobs that affect a company's core business are expected to grow in demand, as are positions requiring specialized vertical industry expertise, such as in healthcare and financial services. Challenger says other jobs, such as Web design, will be split: creative talent remains in the United States while routine Web development and site work is completed offshore. Similarly, computer operations are likely to remain in-house, where companies can get fast response times and workers can collaborate with business users more effectively; cultural and communication difficulties often keep collaborative tasks from being offshored. Finally, computer security and wireless networking look to be strong growth areas across the board for now.
    Click Here to View Full Article

  • "Open-Source Practices May Help Improve Software Engineering"
    Newswise (12/03/03)

    Open-source software development has many advantages over by-the-book corporate software development, according to university researchers. UC Irvine Institute for Software Research senior research scientist Walt Scacchi says open-source development projects hold a wealth of information about the software development process that would not be available from studying a traditional in-house system. Scacchi and colleagues from the University of Illinois and Santa Clara University analyzed hundreds of thousands of bug reports and other data available in transparent open-source development projects such as the Linux kernel. By mining this data (a toy example of such mining is sketched below), the researchers were able to determine what benefits bug reporting confers on software quality, for example, and how to apply those lessons to traditional in-house software development. The study targeted several areas of open-source software development: network games and game mods; open-source scientific applications; Internet and Web infrastructure projects such as Linux, Apache, and Mozilla; and corporate-sponsored projects such as Sun Microsystems' NetBeans and IBM's Eclipse. Among the benefits of open-source development are faster development times, continuing improvement, wide distribution of knowledge and skills, and the collective resources of those involved. Scacchi says open-source development may not be the best option for many projects, including niche applications such as air-defense radar software. Another aim of the study is to discover why some open-source projects have been so successful in garnering wide support while others falter and die off. The studies are supported by the National Science Foundation.
    Click Here to View Full Article
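
    The kind of bug-report mining described above can be illustrated in a few lines of code. The sketch below is hypothetical--its field names and records are invented, and it is not the researchers' tooling--but it shows the basic move of turning public tracker data into a quality metric such as median time-to-close per component.

      # Minimal sketch of mining public bug-report data; records and field
      # names are hypothetical, not the researchers' actual dataset or method.
      from collections import defaultdict
      from datetime import datetime
      from statistics import median

      bugs = [
          {"component": "kernel", "opened": "2003-01-10", "closed": "2003-01-14"},
          {"component": "kernel", "opened": "2003-02-01", "closed": "2003-02-20"},
          {"component": "apache", "opened": "2003-03-05", "closed": "2003-03-06"},
      ]

      days_to_close = defaultdict(list)
      for bug in bugs:
          opened = datetime.strptime(bug["opened"], "%Y-%m-%d")
          closed = datetime.strptime(bug["closed"], "%Y-%m-%d")
          days_to_close[bug["component"]].append((closed - opened).days)

      for component, days in sorted(days_to_close.items()):
          print(component, "median days to close:", median(days))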

  • "Linux Security Expert Defends Debian"
    InternetNews.com (12/04/03); Wagner, Jim

    Linux expert Jay Beale defended the response of Debian Project leaders to a security breach that shut down three Debian servers and halted open development activities. Debian is a Linux operating system variant popular with Linux enthusiasts. Most of the Debian Project services run by the three servers--such as Web search functions, bug tracking, mailing lists, and security updates--were mirrored elsewhere within three days of the initial break-in. Completely securing Internet-connected servers is impossible, especially for machines that run a large open-source operation such as Debian, says Beale, who is lead developer for the Bastille Linux project and a consultant at JJB Security Consulting & Training; the large number of developers with access to the servers is a tremendous liability. The attacker obtained login information from one Debian developer using a sniffer program, then hid from routine security inspection with TESO's Burneye encryption program. The attacker also exploited a recently discovered integer-overflow vulnerability in the Linux kernel to gain control over kernel memory space; Linux 2.4.23, released late last month, patches that hole. Debian Project administrators say Advanced Intrusion Detection Environment (AIDE) programs installed on two of the machines were able to see through the SucKIT root kit that launched malware in the kernel, and notified administrators with a log warning (the file-integrity idea behind such tools is sketched below). Though the attacker had root-level access to the machines, the Debian code in the archives was not changed. Beale says the steps now being taken to make the system secure again, such as disabling every developer account and conducting forensic analysis on the compromised servers, are inconvenient but necessary.
    Click Here to View Full Article
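
    Tools such as AIDE rely on file-integrity checking: record cryptographic checksums of important files while the system is known to be clean, then re-check them after a suspected intrusion. The sketch below is a minimal illustration of that idea, not AIDE's actual implementation; the watch list is hypothetical.

      # Minimal file-integrity sketch in the spirit of AIDE: hash files in a
      # known-good state, re-hash later, and flag anything that changed.
      # Illustrative only; this is not AIDE's implementation.
      import hashlib
      import json

      def snapshot(paths):
          """Return {path: SHA-256 digest} for each file."""
          digests = {}
          for path in paths:
              with open(path, "rb") as f:
                  digests[path] = hashlib.sha256(f.read()).hexdigest()
          return digests

      watched = ["/bin/ls", "/bin/ps"]           # hypothetical watch list
      with open("baseline.json", "w") as f:      # taken while system is trusted
          json.dump(snapshot(watched), f)

      # ...later, after a suspected break-in...
      with open("baseline.json") as f:
          baseline = json.load(f)
      changed = [p for p, d in snapshot(watched).items() if baseline.get(p) != d]
      print("Modified files:", changed or "none")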

  • "Duke Researchers Will Reap Benefits From Cluster Farm"
    Triangle Tech Journal (12/02/03)

    Duke University is building a cluster supercomputing farm that will relieve its researchers of administrative burdens and provide extra computing resources when necessary. Researchers at Duke commonly rely on computer clusters, which provide cheap and flexible computing power compared with monolithic mainframe computers. The new program is voluntary, and researchers will retain control over their resources, according to Center for Computational Science, Engineering, and Medicine (CSEM) director John Harer. He says the cluster farm will allow researchers to take on projects they might otherwise have shied away from and to take full advantage of the huge amounts of technical data collected at the university. Additionally, the cluster will attract key talent to Duke, says biostatistics and bioinformatics professor Tom Kepler. Duke CIO Tracy Futhey says computer clusters have become standard fare at research labs, but complaints kept mounting over administrative and maintenance costs; the new cluster farm will provide top-notch cooling and hardware support, as well as limited virtualization of memory and processing resources so that researchers can tap more processing power when colleagues' projects are idle (a rough sketch of this sharing model appears below). Researchers will be able to use grant money to buy processors and memory in the cluster farm, analogous to plots of land in a real farm, and apply those resources to particular applications, analogous to specific crops. Kepler expects the system to grow from its current 200 nodes to 400 nodes by 2004, dramatically increasing potential performance.
    Click Here to View Full Article
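
    The resource-sharing model described--groups own their slice of the farm but can borrow colleagues' idle capacity--can be captured in a few lines. The sketch below is purely hypothetical: it is not CSEM's software, and the group names and node counts are invented.

      # Hypothetical sketch of lending idle cluster nodes between groups;
      # not Duke/CSEM's actual scheduler, just the sharing idea described.
      owned = {"genomics": 40, "physics": 60}        # nodes each group bought
      queued_jobs = {"genomics": 0, "physics": 12}   # pending work per group

      def available_to(group):
          """A group's own nodes plus any other group's currently idle nodes."""
          idle_elsewhere = sum(nodes for g, nodes in owned.items()
                               if g != group and queued_jobs[g] == 0)
          return owned[group] + idle_elsewhere

      print(available_to("physics"))    # 100 while genomics' nodes sit idle
      print(available_to("genomics"))   # 40 while physics still has work queued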

  • "With Roadside Data, Better Forecast for Snow Removal"
    New York Times (12/04/03) P. E6; Austen, Ian

    The state of Iowa was sufficiently impressed with last winter's trial run of the Winter Road Maintenance Decision Support System to carry out a second round of tests this winter with an upgraded version. The software system integrates specialized weather forecast data, readings from roadside weather stations, and pavement sensor measurements to predict local road conditions and recommend when snow removal crews should be dispatched. It also suggests how the roads should be treated under specific conditions (a simplified decision rule of this kind is sketched below), and William P. Mahoney III of the National Center for Atmospheric Research says this feature can be very useful in light of the impending retirement of experienced snowplowing and ice-clearing staff. The Iowa Department of Transportation's Dennis Burkheimer notes that the system has already demonstrated its potential to cut costs, improve road safety, and reduce environmental damage from salt and other chemicals. Mahoney reports that in terms of effective highway maintenance, the United States lags far behind Northern Europe and Japan; local National Weather Service forecasts for rural areas usually cover a region approximately 40 miles on a side, while highway maintenance crews need forecasts for roughly 10 to 15 miles of road. Roadside weather stations that can read pavement temperature and moisture are tasked with collecting that more localized climate and road-condition data. But despite the advantages promised by the system, Burkheimer bemoans the state of weather forecasting in general: he observes that the National Weather Service gives little consideration to light rain or snowfall, and its snowfall measurements are often highly inaccurate.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
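
    The treatment recommendations the system produces amount to decision rules over forecast and pavement-sensor inputs. The sketch below shows what such a rule might look like; the thresholds, parameter names, and actions are hypothetical and are not drawn from the actual Decision Support System.

      # Hypothetical treatment-recommendation rule combining a forecast with
      # pavement-sensor readings; thresholds are illustrative only and are not
      # the Winter Road Maintenance Decision Support System's logic.
      def recommend(pavement_temp_c, precip_rate_mm_per_hr, precip_type):
          if precip_rate_mm_per_hr == 0:
              return "no action"
          if precip_type == "snow" and pavement_temp_c <= 0:
              if precip_rate_mm_per_hr > 2:
                  return "dispatch plows and pre-wet salt"
              return "apply salt brine"
          if precip_type == "rain" and pavement_temp_c <= 1:
              return "anti-ice before freezing rain bonds to the pavement"
          return "monitor roadside sensors"

      print(recommend(pavement_temp_c=-3, precip_rate_mm_per_hr=4,
                      precip_type="snow"))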

  • "New Electronics Recycling Method Converts Gray to Green"
    NewsFactor Network (12/03/03); Martin, Mike

    Purdue University industrial engineering professor Julie Ann Stuart has developed a software method that makes electronics recycling operations more efficient and profitable by refining plant scheduling. E-cycling efficiency will be critical, as about 3 billion consumer electronics units and around 1 billion units of computer gear could be scrapped by the end of the decade, according to the latest report from the International Association of Electronics Recyclers. Stuart says the rules used to schedule manufacturing operations do not apply to recycling: deadlines are not a major priority, and the speed with which workers extract the final "products"--raw materials such as steel and copper in the case of e-waste--is less critical. "In recycling, you have a different objective when you schedule jobs than you do in manufacturing, and you need different key measurements to achieve that objective," notes the Purdue researcher, who along with research partner Vivi Christina devised key recycling measurements and defined a new objective. A primary consideration in e-cycling is maintaining enough short-term storage space, given the unpredictability of discarded-equipment shipments; a lack of staging space is costly, either because incoming shipments must be turned away or because trailers must be rented to hold the surplus. Recycling companies' current strategy for maximizing staging space is to first move the equipment that can be most rapidly disassembled, but Stuart found a more efficient approach in which the biggest products are moved out first; moving the most valuable or most quickly dismantled products first is not as effective (a toy comparison is sketched below). Stuart says the method she and Christina devised, when simulated, "showed that using our scheduling policy could lower the required maximum staging volume by as much as half."
    Click Here to View Full Article
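
    The intuition behind the largest-first policy can be checked with a toy simulation. The model below--one worker and invented shipment data--is purely illustrative and is not Stuart and Christina's simulation; it simply compares the peak staging volume required when the largest staged item is disassembled first versus when the fastest-to-disassemble item goes first.

      # Toy comparison of disassembly scheduling policies; a single worker and
      # invented shipment data, not Stuart and Christina's model.
      def peak_staging(items, policy_key):
          """Return the maximum staged volume seen while working through items."""
          arrivals = sorted(items, key=lambda i: i["arrives"])
          pending, t, peak = [], 0.0, 0.0
          while arrivals or pending:
              while arrivals and arrivals[0]["arrives"] <= t:
                  pending.append(arrivals.pop(0))    # shipment hits the dock
              if not pending:
                  t = arrivals[0]["arrives"]         # idle until next shipment
                  continue
              peak = max(peak, sum(i["volume"] for i in pending))
              nxt = max(pending, key=policy_key)     # choose the next item per policy
              pending.remove(nxt)
              t += nxt["hours"]                      # disassemble it
          return peak

      shipments = [
          {"arrives": 0, "volume": 30, "hours": 2},  # console TV
          {"arrives": 0, "volume": 5,  "hours": 1},  # desktop PC
          {"arrives": 1, "volume": 25, "hours": 2},  # copier
          {"arrives": 2, "volume": 6,  "hours": 1},  # monitor
      ]

      print("largest first:", peak_staging(shipments, lambda i: i["volume"]))
      print("fastest first:", peak_staging(shipments, lambda i: -i["hours"]))

    With these invented numbers the largest-first policy peaks at a staged volume of 36 versus 55 for fastest-first, echoing the direction of the article's finding; real shipment data would of course behave differently.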

  • "Deadwood Finds an Eternal Electronic Life on the Web"
    Associated Press (11/28/03)

    In spite of the Internet's power to present information quickly and often, the World Wide Web is glutted with "deadwood"--abandoned and badly outdated sites. A recent study by Perseus Development found that of 3,634 surveyed Web sites and Web journals--known as blogs--started by individuals, two-thirds had not been updated for two months or longer, and 25 percent had not been updated since their first day. Many sites were abandoned because their creators became too busy to keep them going; others languished because the specific event they served, such as a political campaign or the arrival of the new millennium, came and went. Sites are also abandoned because they cost money to operate: unless they use a free service such as GeoCities or know someone willing to lend space, Web site developers must pay for hosting and domain names. Few developers are like Alan Porter and Anand Ranganathan, who are willing to shell out $14 annually to retain the domain name Votexchange2000.com, which three years ago allowed users in one state to trade their presidential vote with somebody in a different state. The site runs off a computer at Ranganathan's desk at work. Porter says they are keeping the site as a historical archive, although he concedes it cannot last indefinitely as technology advances; parts of the site, for example, no longer work with newer browsers.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "North American ISPs Trial IPv6"
    Network World Newsletter (12/01/03); Marsan, Carolyn Duffy

    U.S. ISPs are starting to test services that support IPv6, but it may take until 2007 before U.S.-based multinationals are ready to deploy production IPv6 networks, say advocates of the protocol. IPv6 offers a vastly larger 128-bit addressing scheme (illustrated below), easier administration, and better security, but it has been slow to catch on because it requires an expensive upgrade of the Internet's backbone and edge systems. AT&T, NTT, and Sprint are supporting a nationwide test network, Moonv6, with two more ISPs joining up. "Most U.S. enterprises do not have IPv6 testbeds yet," says North American IPv6 Task Force Chairman Jim Bound. "Starting next year we're going to be reaching out to the automotive industry, electronics, banking and manufacturing." Corporate network managers should insist on IPv6 compatibility when they buy network equipment and software to ensure they will be able to make the transition, recommends Yanick Pouffary, a member of the IPv6 Forum's Technical Directorate.
    Click Here to View Full Article
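
    For a concrete sense of how much larger the IPv6 address space is, the snippet below uses Python's standard ipaddress module with the documentation prefixes reserved for examples; it is an illustration only, not code from the Moonv6 test bed.

      # IPv6's 128-bit address space versus IPv4's 32 bits, shown with
      # Python's standard ipaddress module and reserved documentation prefixes.
      import ipaddress

      v4 = ipaddress.ip_network("192.0.2.0/24")    # IPv4 documentation block
      v6 = ipaddress.ip_network("2001:db8::/32")   # IPv6 documentation block

      print(v4.num_addresses)   # 256 addresses in an IPv4 /24
      print(v6.num_addresses)   # 2**96 addresses under a single IPv6 /32
      print(2**128 // 2**32)    # IPv6 space is ~7.9e28 times the size of IPv4's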

  • "ITU and ICANN Clash May Be Delayed"
    Computer Business Review (12/03/03); Murphy, Kevin

    The International Telecommunication Union (ITU) is holding a meeting next week at the World Summit on the Information Society (WSIS) in Geneva and is expected to seek consensus on Internet governance, with international governments adopting a Declaration of Principles for global access to information technology and a Plan of Action on how to achieve that access. Some nations want the ITU to take over governance of the Internet's IP address space and domain name system instead of letting ICANN run it, while others--including the United States--want to limit government input on public policy and keep management in the private sector. A third proposal calls for the ITU to define the problem first, establishing a working group to define Internet governance and identify related public policy issues. This proposal is the most likely to be adopted, and would require findings by 2005, when the WSIS meets again in Tunis. The compromise proposal calls for participants to "develop a common understanding for respective spheres of responsibility among governments, existing intergovernmental and international organizations and other forums." Register.com's Elana Broitman says, "Terms left in the Plan will be symbols of the continuing encroachment of the ITU into the domain name system," while others say the compromise merely delays the inevitable clash between the ITU and ICANN.
    Click Here to View Full Article

  • "The Guts of a New Machine"
    New York Times Magazine (11/30/03) P. 78; Walker, Rob

    Since its market debut two years ago, Apple's iPod portable digital music player has become the company's highest-volume product, and Apple CEO Steve Jobs credits its rise to iconic status to its seamless and simple design. The iPod features a white front and a stainless steel back, while instrumentation is minimal--just a thumb-controlled wheel to scroll through songs and adjust volume, a large central button and four smaller ones to select and play tracks, and an LCD screen to display the song menu. The device also comes with headphones and a port for a FireWire cable used to recharge the battery and transfer digital songs from computers via iTunes software. The iPod's inner workings, like its aesthetics, are simple: its guts comprise a slim rechargeable battery, a 1.8-inch hard disk that stores up to 5 GB (about 1,000 songs), and a circuit board that converts digitally encoded music files into audio, supports the FireWire connection, and coordinates the components' activities via a central processing unit. Apple's VP of industrial design, Jonathan Ive, attributes the product's appeal to its "overt simplicity," and says Jobs wanted the design approach to be "about being very focused and not trying to do too much with the device--which would have been its complication and, therefore, its demise." The apparent seamlessness of the iPod and its components' integration is echoed in Apple's development process, in which designers and engineers work as a coherent, collaborative whole. RealNetworks CEO Rob Glaser, on the other hand, says the iPod is yet another example of Apple flying in the face of commercial logic in order to promote its "ideology": the device does not play tunes in formats employed by any other digital music vendor, and music bought through Apple's online music store does not play on any competing device. Glaser expects Apple's share of the digital music player market to dwindle as the iPod is measured against rivals who generally support one another's products and services, though Jobs insists the iPod will maintain its leadership through regular upgrades.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "New Weapons of Information Warfare"
    Computerworld (12/01/03) Vol. 31, No. 54, P. 41; Strassmann, Paul A.

    Self-aware computer attack programs will overwhelm and outmaneuver current computer defenses in the near future, writes information warfare expert Paul Strassmann. The October 2003 issue of Communications of the ACM contains an article by robotics pioneer Hans Moravec discussing when computers will reach the intelligence levels of various living organisms; by that reckoning, computers could have capabilities comparable to those of a mouse as soon as 2005, and could be on a par with humans by 2020. Malicious software harnessing such self-aware capabilities will be able to quickly and devastatingly exploit today's static computer defenses, mutating and changing strategy to maximize damage and alerting other attacking programs to new vulnerabilities. The only way defenders can stymie these attacks is to disrupt and outpace the attackers' learning cycle. Recognizing this threat is imperative for today's businesses, since estimates from the Security Intelligence Products and Systems organization put cumulative computer security damages this year between $170 billion and $203 billion. With the cost of deploying computer security included, these costs make up a significant and growing share of total business computing costs; and because business computing spending is rising relatively slowly, the rapidly growing share devoted to security leaves less money for the value-added applications that grow the industry's baseline. Instead of relying on defensive tactics as obsolete as Cold War-era military strategy, computer security needs new active search-and-destroy techniques that can disrupt attackers before they have a chance to find and exploit vulnerabilities. Strassmann also argues that punitive liability for the human authors of malicious programs needs to be increased, as does the CIO's role within organizations.
    Click Here to View Full Article

  • "Bioinformatics Moves Into the Mainstream"
    Industrial Physicist (11/03) Vol. 9, No. 5, P. 14; Ouellette, Jennifer

    The enormous amount of data being generated by bioinformatics projects such as gene mapping and proteomics research is driving a shift away from project-specific software and toward commercially available products. "Researchers need smart software that can understand the biological complexity of the experiment and automate the routine analysis and data mining that need to take place," explains Rosetta Biosoftware general manager Doug Bassett. Eric Jakobsson of NIH's Center for Bioinformatics and Computational Biology splits bioinformatics into three general areas: atomic- and molecular-level simulation of biological systems from the axioms of physics and chemistry, dynamical systems modeling, and pattern analysis. Proteomics, which focuses on proteins and protein interactions, is a promising bioinformatics application. "We see a lot of potential in the proteomics arena in identifying gene and protein expression biomarkers that can help scientists diagnose a disorder, determine a patient's prognosis, or whether a patient will respond to a particular drug," notes Bassett; but current computational power is insufficient to model the kinetics of protein folding. IBM's Life Sciences business unit aims to improve proteomics research through the development of Blue Gene, a supercomputer that promises a hundredfold increase in computing speed over current models through its cellular architecture. Another bioinformatics application with potential is medical-imaging analysis, which is an arduous, error-prone procedure when done manually; Badrinath Roysam of Rensselaer Polytechnic Institute has devised algorithms that condense raw image data into Excel spreadsheets, allowing researchers to distinguish normal tissue from test samples through statistical analysis (a schematic example of such a comparison is sketched below). Bassett says the future of bioinformatics resides in systems biology, in which all biological system components are analyzed to establish and integrate their relationships to one another.
    Click Here to View Full Article
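
    The statistical comparison described--does an image-derived feature differ between normal tissue and a test sample?--can be as simple as a two-sample test once the images have been reduced to numbers. The sketch below is schematic: the feature values are invented, and this is not Roysam's analysis pipeline.

      # Schematic comparison of an image-derived feature (e.g., cell counts per
      # field) between normal tissue and test samples; invented values, not
      # Roysam's actual algorithms or data.
      from scipy import stats

      normal_counts = [102, 98, 110, 105, 99, 101]   # hypothetical per-image counts
      test_counts   = [131, 140, 127, 138, 133, 129]

      t_stat, p_value = stats.ttest_ind(normal_counts, test_counts)
      print(f"t = {t_stat:.2f}, p = {p_value:.5f}")   # a small p-value suggests
                                                      # the groups genuinely differ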

  • "The Superheroes of Technology"
    Government Executive (11/03) Vol. 35, No. 16, P. 68; Harris, Shane

    Supercomputers carry out enormous calculations for fields as diverse as weather forecasting, genome decoding, nuclear blast modeling, and astrophysics, and their movement toward clustered architectures is being driven by the growing sophistication of off-the-shelf components. The National Weather Service uses a 7,000-square-foot supercomputer built by IBM to process around 100 million daily climate "observations," according to National Centers for Environmental Prediction CIO Kevin Cooley; the machine's 2,800 fast processors are split equally into two systems, Frost and Snow, programmed respectively to make meteorological calculations and to refine forecast accuracy and detail. The supercomputer performs 450 billion calculations per second and has 42 TB of storage capacity. The Weather Service generates more than 5 million weather products used by forecasters every day, and Cooley notes that the timely availability of meteorological data is a major concern for climate-sensitive industries such as agriculture, energy, and construction. In the mid-1990s, traditional supercomputing companies faced an eroding market as more and more consumers purchased PCs, while government researchers were churning out computer code that was too taxing for proprietary supercomputers. Cluster computing, in which machines are built out of commercially available components to save money, was seen as a solution to both problems, and the continuous advancement of commercial processors translates into greater computing power and speed. Analyst Tim Keenan thinks supercomputers are ideal for pattern recognition and link analysis, which are employed in intelligence operations and could be used to chart the path of disease epidemics and bioterrorism attacks. IBM is developing two new supercomputers for the Energy Department that promise even greater processing power, says IBM's Tom Burlin, who adds that future systems "will be thinking more like the human mind" and will be able to function with little need for human assistance by becoming self-healing.
    Click Here to View Full Article


