 
ACM TechNews is sponsored by Thunderstone. Learn more about Texis, the text-oriented database providing high-performance search engine features combined with SQL operations and a development toolkit, which powers many diverse applications, including Webinator and the Thunderstone Search Appliance.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Thunderstone or ACM. To send comments, please write to [email protected].
Volume 7, Issue 822: Friday, July 29, 2005

  • "Senate Moves Toward New Data Security Rules"
    CNet (07/28/05); Broache, Anne; McCullagh, Declan

    Security breach and data safeguard legislation was a key issue on Capitol Hill yesterday, as three distinct congressional committees mulled over similar proposals. The Senate's Commerce Committee unanimously accepted Sen. Gordon Smith's (R-Ore.) bill to give the FTC the authority to develop an information security program that offers "administrative, technical and physical" safeguards as well as establishes guidelines for notifying potential victims of a data security intrusion. Within a package of revisions accepted by the committee was one from Sen. Barbara Boxer (D-Calif.) requiring companies to alert individuals of an intrusion no more than 45 days after the incident. Another accepted revision from Sen. Bill Nelson (D-Fla.) called for a ban on the sale and publication of Social Security numbers except in special situations. The Senate Judiciary Committee voted to postpone its decision on three personal data security proposals, including the Personal Data Privacy and Security Act from Sens. Arlen Specter (R-Pa.) and Patrick Leahy (D-Vt.). Whereas all three bills require swift notification of security breaches, expansion of the federal government's regulatory powers, and establishment of minimum data security standards, the Specter-Leahy bill stands out by imposing stiff prison sentences on those who intentionally conceal information about a security breach or who penetrate systems maintained by data brokers. The House's Energy and Commerce subcommittee discussed data protection legislation of its own on Thursday.
    Click Here to View Full Article

  • "SIGGRAPH 2005: Light Clouds, Camera Arrays and Speedier Rendering"
    Jacobs School of Engineering (UCSD) (07/28/05); Ramsey, Doug

    Four research papers to be presented at SIGGRAPH 2005 were written or co-written by researchers from UCSD computer science professor Henrik Wann Jensen's Computer Graphics Lab at the Jacobs School of Engineering, with Jensen himself contributing to three of the papers. One paper details Jensen and Ph.D. candidate Craig Donner's efforts to digitally mimic the lighting of multi-layered translucent materials such as paint, paper, or human skin. A second paper co-written by Jensen, UCSD grad student Wojciech Jarosz, Lund University's Tomas Akenine-Moller, and Petrik Clarberg, who is affiliated with both Lund and UCSD, demonstrates a new method of importance sampling that uses wavelets to efficiently model scene illumination that may change over time. Jarosz says the new technique offers a more than tenfold increase in efficiency over the best previous methods. A third paper Jensen co-authored with research assistant Anders Wang Kristensen and Akenine-Moller presents a new technique for real-time relighting of scenes where illumination is provided by local light sources via unstructured light clouds. Jensen says the method facilitates real-time rendering of models with dynamic cameras, moving lights, global illumination, and glossy materials, and adds that the system can efficiently manage indirect lighting for models with upwards of 10,000 triangles. Ph.D. student Neel Joshi, another member of Jensen's team, co-wrote the fourth paper, which discusses the use of large camera arrays to generate high-performance imaging. All four papers will be published in the Proceedings of ACM SIGGRAPH 2005.
    Click Here to View Full Article

    For more information on SIGGRAPH 2005, or to register, visit http://www.siggraph.org/s2005/.
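
    The efficiency claim around importance sampling is easier to see with a toy example. The sketch below is a generic Monte Carlo importance-sampling estimator for a simple one-dimensional "illumination" integrand; it is not the wavelet-based method in the paper, and the integrand and proposal density are invented for illustration. Samples concentrated where the integrand is large give the same answer with far less variance, which is the basic effect such methods exploit.

        # Generic importance-sampling sketch (illustrative only; the SIGGRAPH
        # paper works with wavelet products over environment maps, not this toy).
        import math, random

        def radiance(x):
            # Toy "illumination" integrand on [0, 1]: a narrow bright peak.
            return math.exp(-200.0 * (x - 0.3) ** 2)

        def uniform_estimate(n):
            # Baseline: uniform sampling over [0, 1].
            return sum(radiance(random.random()) for _ in range(n)) / n

        def importance_estimate(n, mu=0.3, sigma=0.07):
            # Draw from a proposal density peaked near the bright region and
            # reweight each sample by radiance(x) / p(x).
            total = 0.0
            for _ in range(n):
                x = random.gauss(mu, sigma)
                p = math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))
                total += radiance(x) / p if 0.0 <= x <= 1.0 else 0.0
            return total / n

        random.seed(1)
        print("uniform   :", uniform_estimate(1000))    # noisy estimate of ~0.125
        print("importance:", importance_estimate(1000)) # much tighter estimate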

  • "Revenge of the Nerds--Again"
    BusinessWeek (07/28/05); Elgin, Ben

    Search companies' meteoric ascendance in the tech domain is reflected in the defection of top software engineers to Google and Yahoo!. Recent hiring coups at Google include former Microsoft researcher Kai-Fu Lee and former eBay advanced technology research director Louis Monier, while new Yahoo! talent includes Larry Tesler, formerly of Amazon.com, and algorithm expert and former Verity CTO Prabhakar Raghavan, currently editor-in-chief of the Journal of the ACM. Engineers of such caliber are attracted to search companies primarily for the formidable technical challenges these firms must contend with, which encourages the pursuit of creative solutions. Another potent lure is the opportunity to work among such distinguished company as Unix operating system co-inventor Rob Pike and XML programming pioneer Adam Bosworth, both of whom are now at Google. "Engineers want to work on the coolest problems with the smartest people," says Moreover Technologies CEO James Pitkow. In addition, search companies promote a work culture that rewards and celebrates geekdom: Games are often held during downtime, gourmet meals are served in the company cafeteria, and engineers are encouraged to pursue personal projects. Concurrent with the rise of search companies is the Internet's supplanting of the PC as the tech arena's nexus of bleeding-edge innovation.
    Click Here to View Full Article

  • "GAO to Study National Plan to Recycle Computers"
    InformationWeek (07/27/05); Chabrow, Eric

    The increasing volume of used electronics will be harmful to human health and the environment unless properly managed, John Stephenson of the Government Accountability Office (GAO) told a Senate panel on July 26; he cited EPA estimates that fewer than 6 million computers out of roughly 50 million units that became obsolete in 2003 were recycled. E-waste recycling suffers from inconvenience, high costs, and a lack of federal standards, and Stephenson quoted a 2003 report from the International Association of Electronics Recyclers indicating that the costs associated with recycling are higher than the revenue received from reselling recycled goods. Although the GAO did not suggest any remedial actions, it did promise to more deeply probe the situation and present recommendations at a later time, and Stephenson implied that his office would seriously consider a national recycling strategy. "It is becoming clear, though, that in the absence of a national approach, a patchwork of potentially conflicting state requirements is developing, and that this patchwork may be placing a substantial burden on recyclers, refurbishers, and other stakeholders," he noted. Stephenson pointed to a United Nations University study concluding that up to 80 percent of the energy spent throughout the course of a computer's life can be conserved via reuse. In addition, he cited U.S. Geological Survey estimates that one metric ton of computer circuit boards contains 40 to 800 times the concentration of gold found in gold ore and 30 to 40 times the concentration of copper found in copper ore.
    Click Here to View Full Article

  • "Rapid Results Without a Rugby Scrum"
    Financial Times-IT Review (07/27/05) P. 2; Baxter, Andrew

    As many companies become increasingly fed up with traditional software development methods, Scrum has become an appealing alternative; Scrum is a method of developing software in which the project is broken down into small pieces divided among autonomous teams, with each piece yielding a discernible benefit upon completion. Each day of the Scrum begins with a brief meeting where the teams are brought together and expected to report on what they accomplished since the last meeting, what they will work on until the next meeting, and what obstacles kept them from being more productive. Scrum developers Ken Schwaber and Jeff Sutherland say the emphasis is on collaboration, where traditionally isolated designers are brought together to produce a piece of software as a team. Many criticize the traditional waterfall method of software development, in which one team hands off one leg of a project to another team for the next phase of development; developing a project in this incremental fashion can drag on over time, often long enough for the customer's needs to change, and it also runs the risk of one underperforming team derailing a project altogether. Schwaber is adamant that Scrum is a free technique that anyone can adopt, and he has refused all invitations to commercialize it. Corporate cultures pose one of the greatest roadblocks to Scrum's acceptance, as Schwaber admits it can be a seismic shift for a company to move toward a climate based on small, collaborative teams, and that only a third of the companies that try to implement Scrum succeed. Leadership is also pivotal in facilitating any kind of structural change, as many workers are inherently reluctant to give up what is familiar to them.
    Click Here to View Full Article

  • "Fingernails Store Data"
    Technology Research News (08/03/05); Patch, Kimberly

    Japanese scientists have demonstrated that data can be written into and read from a human fingernail through the use of a laser. Dot patterns were written into a fingernail by striking it with a laser that ionized molecules and caused a miniature explosion that decomposed the keratin protein molecules in the targeted areas. These areas can be read thanks to their ability to absorb and emit light at a higher rate than the untouched fingernail material. Three layers of dots were recorded and read at 40, 60, and 80 microns below the surface of the nail; the dots can be read using an optical microscope because they are visible in blue light. The dots are small enough to allow 2 GB of data to be written per cubic centimeter of fingernail, which means 5 MB could be stored within a fingernail recording area of 5 millimeters by 5 millimeters by one-tenth of a millimeter deep. The proof-of-concept samples were still readable 172 days after recording, which is probably the practical limit of fingernail storage since a nail has grown enough to be replaced after six months. University of Tokushima professor Yoshio Hayasaki says the technique could be applied to personal authentication, and dovetail with biometric authentication such as vein patterns and fingerprints. He believes the technique could be practical within a few years, once a safe, cheap method for writing data is found.
    Click Here to View Full Article
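
    The capacity figures quoted above are internally consistent; here is a quick check using only the numbers in the article (2 GB per cubic centimeter over a 5 mm by 5 mm by 0.1 mm recording volume):

        # Sanity check of the storage figures quoted in the article.
        density_gb_per_cm3 = 2.0                  # 2 GB per cubic centimeter
        volume_mm3 = 5 * 5 * 0.1                  # 5 mm x 5 mm x 0.1 mm recording area
        volume_cm3 = volume_mm3 / 1000.0          # 1 cm^3 = 1,000 mm^3
        capacity_mb = density_gb_per_cm3 * volume_cm3 * 1000.0
        print(f"{capacity_mb:.0f} MB")            # prints 5 MB, matching the article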

  • "Pittsburgh Unveils Big Ben the Supercomputer"
    Pittsburgh Supercomputing Center (07/20/05)

    The Cray XT3 supercomputer, also known as Big Ben, is now online at the Pittsburgh Supercomputing Center (PSC), where it will support nationwide scientific research as part of the National Science Foundation's TeraGrid. Big Ben consists of 2,090 AMD Opteron processors with an overall peak performance of 10 teraflops, and was acquired last September through a $9.7 million NSF grant. The supercomputer will be the successor to LeMieux, a 3,000-processor, 6-teraflop machine currently serving as PSC's lead system. Big Ben can perform better than LeMieux thanks to its superior inter-processor bandwidth, and research domains that stand to benefit from this upgrade in processor communication speed include protein dynamics studies, design of new materials, nanotechnology, severe storm prediction, and earthquake soil-vibration simulation. Big Ben's processors reside in 22 cabinets, while 6.5 Gbps bidirectional inter-processor communication is sustained by the Cray SeaStar interconnect. ClusterFileSystems' Lustre File System serves as the XT3's parallel file system, and the supercomputer boasts 245 TB of local disk storage and 200 additional TB of global disk. Carnegie Mellon University President Jared Cohon described computational science as a key pillar of the U.S. economy, and noted that PSC has been a leading national computational science resource for almost two decades.
    Click Here to View Full Article
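
    Dividing the quoted peak performance by the processor count gives the per-processor peak implied by the article's numbers; the result is consistent with a 2.4 GHz Opteron retiring two floating-point operations per clock, though the clock rate itself is an assumption not stated in the summary.

        # Per-processor peak implied by the Big Ben figures above.
        peak_tflops = 10.0              # quoted overall peak performance
        opterons = 2090                 # quoted processor count
        per_cpu_gflops = peak_tflops * 1000.0 / opterons
        print(f"{per_cpu_gflops:.2f} GFLOPS per Opteron")   # about 4.78 GFLOPS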

  • "'Shadow Walker' Pushes Envelope for Stealth Rootkits"
    eWeek (07/28/05); Naraine, Ryan

    HBGary director of engineering Jamie Butler and University of Central Florida Ph.D. student Sherri Sparks disclosed a new method for concealing malware with their demonstration of the Shadow Walker stealth rootkit at the Black Hat Briefings conference. Analysts say such research is timely, given the discovery of rootkit-like features in common spyware programs. Sparks described Shadow Walker as a proof of concept for fourth-generation rootkits that can thwart current rootkit detectors by hiding themselves within kernel memory with negligible performance impact. She said an in-memory rootkit could avoid disk detection if installed by a kernel exploit. "If we can control a scanner's memory reads, we can fool signature scanners and make a known rootkit, virus or worm's code immune to in-memory signature scans," Sparks reasoned. "We can fool integrity checkers and other heuristic scanners which rely upon their ability to detect modifications to the code." Internet security practitioners attending Black Hat were chilled by Shadow Walker's implications, and Sparks suggested that anti-virus vendors re-conceptualize rootkit scan methodology. She recommended a hardware memory scanner with access to read physical memory as the most effective measure for detecting rootkits such as Shadow Walker.
    Click Here to View Full Article
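
    Sparks' remark about controlling a scanner's memory reads is easiest to follow against the defensive baseline she is describing. The sketch below is a minimal in-memory signature scan over a plain byte buffer; the signature and buffers are invented for illustration, and real scanners read process or kernel memory rather than local variables. Shadow Walker's trick is to serve clean-looking pages to exactly this kind of read while the hidden code continues to execute.

        # Minimal in-memory signature scan: the kind of check Shadow Walker
        # defeats by returning sanitized pages to memory reads (illustrative).
        KNOWN_SIGNATURES = {
            "demo-rootkit": bytes.fromhex("deadbeefcafebabe"),   # hypothetical pattern
        }

        def scan_memory(image):
            # Return the names of any known signatures found in the buffer.
            return [name for name, sig in KNOWN_SIGNATURES.items() if sig in image]

        clean = bytes(64)
        infected = bytes(16) + bytes.fromhex("deadbeefcafebabe") + bytes(16)
        print(scan_memory(clean))      # []
        print(scan_memory(infected))   # ['demo-rootkit']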

  • "Not Playing to Women"
    Associated Press (07/22/05); Sandoval, Greg

    The video game industry needs more female programmers if it wants to expand its appeal beyond its core audience of men; IDC analyst Schelley Olhava says 70 percent of console game players are male, but female players could comprise a lucrative market--if their views were considered by the gaming industry. "There's no question that we need more diversity," says International Game Developers Association executive director Jason Della Rocca. "We're saying that we need to grow the business and broaden the audience, and yet the game creators are still mostly young, white males." This dominant male presence has helped cultivate a perception that the industry is primarily interested in violent action games where female characters are usually voluptuous sexpots, and this is discouraging for female game designers and programmers, according to insiders. Midway Home Entertainment software engineer and MIT graduate Tammy Yap says the marketing campaigns of software companies appear to reflect this attitude, as demonstrated by the scantily clad vixens adorning the covers of game magazines. Women in the gaming industry must also frequently contend with the loneliness of being in a boys' club. Anthony Borquez of the University of Southern California's Integrated Media Systems Center notes that there are typically no female students in the school of engineering's video game programming classes, while a new game programming course at Great Britain's University of Derby attracted 106 applicants, all of them male. However, an Electronic Arts-sponsored summer programming camp at USC for female high school students attracted eight students; last year none applied.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

    For information on ACM's Committee on Women in Computing, visit http://www.acm.org/women.

  • "The Unity of Linux"
    ZDNet (07/28/05); Murphy, Paul

    Despite the inclination of some to see it as a competitor to Unix, Linux actually carries the Unix flag, and counts as just one of several variants of the open source method that stands in opposition to the Windows model, according to IT consulting industry veteran and author Paul Murphy. Unix represents a set of ideas more than a brand, and remains true to its historic commitment to free access to code, though critics mistakenly accuse Unix of discontinuity and claim its focus is fragmented. Linux is joined by the BSDs and Solaris as the most popular and influential forms of Unix available today, and a core knowledge of Unix processes serves as a springboard from which system engineers can move to almost any of these variants with minimal acclimation difficulty. Programmers with MCSE certification, for example, bring a more limited skill set to Unix systems than their counterparts with Red Hat certification. Compared with Unix, Microsoft has actually been erratic in its focus, offering little continuity throughout the different iterations of Windows, Murphy says. However, the comparison is faulty on a fundamental level, as Unix is more a core belief that can support many brands, while Windows is itself one company's brand. Regardless of which form of Unix is in use, it is the embrace of open source programming that benefits the entire community, Murphy concludes.
    Click Here to View Full Article

  • "GILS Could Soon Get the Boot"
    Federal Computer Week (07/25/05); Sternstein, Aliya

    The National Institute of Standards and Technology (NIST) is considering commercial alternatives, such as Google and Yahoo!, to replace its 10-year-old Global Information Locator Service (GILS) standard that indexes electronic information available to the public. Despite a federal mandate to adhere to the standard under the Paperwork Reduction Act, GILS has been largely ignored. Endorsers of GILS had hoped the standard would retrieve data from disparate systems irrespective of syntactical differences, as well as steer future historians toward the most relevant information through searches based on title, author, subject, date, and location. A recent survey conducted by the General Services Administration (GSA) omitted GILS as it cast around for a new FirstGov search engine in line with W3C specifications. The moves by the NIST and GSA come as a disappointment to members of the library community, many of whom had embraced GILS. Developers of FirstGov claim they will partner closely with the library community to ensure interoperability. The GSA has appealed to vendors to use W3C standards so that government and private-sector systems operate compatibly. "The federal government is moving toward the adoption of commercial standards," said the GSA's Keith Thurston.
    Click Here to View Full Article
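
    The summary only names the fields GILS searches were meant to run over (title, author, subject, date, and location). As a rough illustration of field-based retrieval over locator records, here is a small sketch with invented records and field names standing in for real GILS syntax:

        # Illustrative locator records and a field-based search; the records and
        # field names are made up and do not follow actual GILS record syntax.
        records = [
            {"title": "Annual Energy Review", "author": "Energy Information Administration",
             "subject": "energy statistics", "date": "1995", "location": "Washington, DC"},
            {"title": "Coastal Zone Reports", "author": "NOAA",
             "subject": "ocean and coastal data", "date": "1997", "location": "Silver Spring, MD"},
        ]

        def search(field, term):
            # Case-insensitive substring match on a single metadata field.
            return [r for r in records if term.lower() in r.get(field, "").lower()]

        print(search("subject", "energy"))   # returns the first record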

  • "U.N. Internet Summit Draws Rights Groups' Fire"
    MSNBC (07/27/05); Moran, Michael

    The United Nations' International Telecommunications Union (ITU) is being criticized by several rights organizations, including the Freedom to Publish, International Publishers Association, and Reporters Without Borders, for its decision to hold the second part of its World Summit on the Information Society (WSIS) in Tunis, the capital of Tunisia. The organizations are recalling UN Secretary-General Kofi Annan's speech during the first half of the WSIS, in which he said governments should use the Internet to empower their citizens, not to censor and control them. In its 2004 "Freedom in the World" report, international advocacy group Freedom House classified Tunisia as a restrictive country, noting the government's tight hold on broadcasting and publications. "It is a cruel irony that Tunisia will host the World Summit on the Information Society in November 2005," says Reporters Without Borders. Over a dozen free-speech groups have banded together to pressure the United Nations and the Tunisian government to make significant changes or to hold the WSIS somewhere else. UN spokesman Brenden Varma says the ITU member states have the right to decide where to hold their meetings, but Annan is currently urging the ITU to make changes that will prevent decisions that "don't fit with the values of the United Nations" in the future.
    Click Here to View Full Article

  • "Really Open Source"
    Inside Higher Ed (07/29/2005); Jaschik, Scott

    Efforts to make college course materials freely available on the Web, such as MIT's OpenCourseWare and Rice's Connexions programs, are reshaping the way educational content is distributed. Connexions founder Richard Baraniuk, a professor of electrical and computer engineering at Rice, says the current model of selling textbooks is broken, and that information should be absolutely free. Connexions, created in 1999, has drawn source material from around the world, with professors in all manner of disciplines posting material that Baraniuk claims has all been of high quality. He has incorporated an Amazon-style rating tool into the program, which he hopes will use community feedback to steer users toward the best material. Baraniuk believes Connexions is especially useful for community colleges, which rely heavily on adjunct professors who typically do not have the time to develop elaborate curricula. In addition, he says the cost of textbooks in certain fields is too inflated. This week Connexions begins a partnership with the National Council of Professors of Educational Administration, a scholarly body that seeks to popularize the site among the academic community, calling on its 1,000 members to generate material. Adhering to the theme of offering information without delay, Baraniuk insists that peer review of Connexions content will be welcomed from all users after the material is posted, unlike the traditional academic journal model, where the reviewing process often substantially delays publication.
    Click Here to View Full Article

  • "Big Problems? Call Out the Big Iron"
    Sydney Morning Herald (Australia) (07/26/05); Carey, Adam

    Australia's leading supercomputers are being used for academic as well as commercial ventures, while new hardware investments are helping the country regain its status as a major supercomputing center. The most powerful Australian supercomputer is housed at the Australian Partnership for Advanced Computing's (APAC) national facility at Australian National University (ANU). The SGI Altix 3700 Bx2, which the latest Top500 benchmark ranked the 26th fastest supercomputer in the world, is a 10-teraflop system that uses 1,680 Intel Itanium2 processors with shared 32-processor memory nodes connected by SGI's proprietary NUMAlink network. The APAC facility boasts a 1,200 TB mass data storage system, and is used by academic researchers for scientific research such as the simulation of ions' passage through cell membrane channels, which Shin-Ho Chung of ANU's Biophysics Group says could be applied to drug design. The second- and third-fastest Australian supercomputers, a pair of IBM Blade Center machines with Intel Xeon chips, are used by the Animal Logic visual effects and animation company to create digital effects for movies, and animation and title sequences for TV shows. Animal Logic IT manager Monique Bradshaw says, "The amount of processing power that we need to produce high quality images is very similar to what is needed for high end research," the difference being that her company generates those images for entertainment. She says the computational muscle provided by supercomputers is used to add detail to the animator's work. Petroleum Geo-Services employs Linux cluster-based supercomputers to construct maps of the ocean floor with a high level of detail for the purposes of oil exploration, as well as model the depletion of reservoirs over time.
    Click Here to View Full Article

  • "Creepy Crawlies to Explore Other Worlds"
    New Scientist (07/23/05) Vol. 187, No. 2509, P. 24; Dartnell, Lewis

    Many scientists believe the next step in the evolution of unmanned space exploration is biomimetics, in which nature serves as the inspiration for more robust and versatile robot probes. Roger Quinn of Case Western Reserve University's Biologically Inspired Robotics Laboratory has developed robots that adopt the locomotive mechanics of insects to traverse uneven terrain: Achievements include the six-legged, pneumatically-driven Robot V walker, which is modeled after a cockroach. A more advanced model, Whegs II, is lighter and less complex than Robot V thanks to a hybrid leg-wheel mechanism, while antennae sensors aid navigation and a flexible middle section facilitates climbing over large obstacles. A team of researchers led by Frank Kirchner at Germany's University of Bremen is developing Scorpion, an eight-legged walker that controls the motors driving each leg joint via a rhythmic signal from electronic circuits modeled after central pattern generator neurons in animals. In Britain, Alex Ellery of the University of Surrey's Surrey Space Center is collaborating with Julian Vincent of the University of Bath's Center for Biomimetic and Natural Technologies on a proposal for a Martian explorer that can sample the red planet's soil for signs of life using a self-contained, burrowing drill modeled after the woodwasp's ovipositor. Biomimetics is currently a hard sell for NASA engineers, who claim such design strategies have yet to be sufficiently polished or rigorously evaluated. However, Ellery predicts that in about 15 years wheeled explorers will be incapable of fulfilling missions' scientific requirements, at which point biomimicry will be more widely embraced.
    Click Here to View Full Article
    (Access to full article is available to paid subscribers only.)
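
    The central-pattern-generator idea behind Scorpion's leg control can be illustrated with two coupled phase oscillators. The sketch below is a generic toy model, not the Bremen team's actual circuitry, and the coupling constants are chosen arbitrarily; the point is that a stable rhythm with a fixed half-cycle offset between the two legs emerges on its own and could be used to drive the joint motors.

        # Toy central pattern generator: two coupled phase oscillators settle
        # into anti-phase, like a pair of alternating legs (illustrative only).
        import math

        def simulate(steps=2000, dt=0.001, freq=1.0, coupling=2.0):
            phase = [0.0, 0.1]          # start the two legs nearly in phase
            for _ in range(steps):
                # Each oscillator is nudged toward a half-cycle offset from the other.
                d0 = freq + coupling * math.sin(2 * math.pi * (phase[1] - phase[0] - 0.5))
                d1 = freq + coupling * math.sin(2 * math.pi * (phase[0] - phase[1] - 0.5))
                phase[0] = (phase[0] + d0 * dt) % 1.0
                phase[1] = (phase[1] + d1 * dt) % 1.0
            return phase

        left, right = simulate()
        print("phase offset:", round((right - left) % 1.0, 2))   # converges to 0.5
        # A motor command for each leg would then be a smooth function of its
        # phase, for example math.sin(2 * math.pi * phase).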

  • "Whose Work Is It, Anyway?"
    Chronicle of Higher Education (07/29/05) Vol. 51, No. 47, P. A33; Carlson, Scott

    The Copyright Office is holding a series of hearings this week concerning whether copyright law should be amended to permit scholars, artists, historians, and others to use "orphan works"--copyrighted content whose creators cannot be identified--more liberally. Various proposals for the disposition of orphan works are being floated: Some groups suggest that changes in law should only pertain to domestically published orphan works; music-licensing organizations would like exemptions for orphan works to be inapplicable to music; and visual artists are against any orphan-work exemptions. Columbia University law professor Jane Ginsburg is concerned that any change in copyright law would be a blow to individual artists and independent writers who lack the clout or financial resources to promote ownership of their work. American University law professor Peter Andrew Jaszi encouraged his students at the Glushko-Samuelson Intellectual Property Law Clinic to propose a solution to the orphan works problem, which many libraries and publishers now consider a model scheme. The proposal defines an orphan work--published or unpublished--as any whose owner cannot be identified. Use of the work is permitted after "reasonable effort" to find the owner has been made, and the law should not dictate the specifics of such an effort; and an owner who materializes after an apparently orphan work has been used is entitled to a small amount of compensation, but not entitled to statutory damages, attorneys' fees, or injunctions. Meanwhile, Stanford University law professor Lawrence Lessig suggests that producers of creative works be required to register their material within 25 years of publication, while software developers would get five years.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)

  • "The 100-Year Archive Dilemma"
    Computerworld (07/25/05) P. 39; Mearian, Lucas

    To address the costs of storing an ever-growing body of data, and to comply with federal regulations demanding that more content be stored for a longer time, companies are in search of new methods for long-term digital storage. Currently, transferring the information from one medium to another is the only way to extend data's lifespan, though alternatives are in the works. Some turn to plain-text formats, such as ASCII and Unicode, which offer compatibility but do not support enhanced features such as graphics. Alternatively, Adobe's PDF/A offers long-term data storage without the backward-compatibility issues. The technologies with the most potential are XML-based data storage designs. For media, the Storage Networking Industry Association (SNIA) is addressing the challenge of the 100-year archive in its search for a format that will always be readable. Others are not betting on one particular format, instead opting for expansive disk arrays that ensure the data are readable and available, but even they do not solve the long-term problem. Central to any archival project, such as the open-systems management center the state of Washington recently created, is the backup of data. Washington paid considerable attention to metadata to aid future searches. Each document is tagged with information specific to its creation, such as its author, the location and time of its creation, and even the computer used to produce it. The system also standardized its formats: All Word documents are converted into PDF files, and all images are turned into TIFF files. Long-term data storage is still evolving, though, and there is minimal continuity at this point. "There aren't what we'd call standards for long-term archiving--only best practices," said Strategic Research's Michael Peterson, who also serves as a program director for the SNIA Data Management Forum.
    Click Here to View Full Article
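
    Washington's approach as described, per-document creation metadata plus normalization to a small set of archival formats, can be sketched roughly as follows. The field names, format mappings, and helper function are hypothetical and stand in for the state's actual schema and conversion tooling.

        # Rough sketch of the archival intake described above; field names and
        # format mappings are hypothetical, and the conversion itself is omitted.
        import datetime, pathlib, socket

        # Normalization targets mentioned in the article: Word to PDF, images to TIFF.
        FORMAT_MAP = {".doc": ".pdf", ".jpg": ".tiff", ".gif": ".tiff"}

        def intake(path, author, location):
            # Build an archival metadata record for one document.
            p = pathlib.Path(path)
            return {
                "source_file": p.name,
                "archival_format": FORMAT_MAP.get(p.suffix.lower(), p.suffix.lower()),
                "author": author,
                # Stand-in for the document's real creation time from the source system.
                "created_at": datetime.datetime.now().isoformat(timespec="seconds"),
                "created_where": location,
                "created_on_host": socket.gethostname(),   # the computer used to produce it
            }

        print(intake("minutes.doc", author="J. Example", location="Olympia, WA"))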

  • "Technology Roadmap: Trusted Computing Architectures"
    Network Magazine (07/05) Vol. 20, No. 7, P. 53; Dornan, Andy

    The computer industry is hyping the Trusted Platform Module (TPM) as an important defense against malware, one that allows users to check that their PCs have not been infected and enables networks to confirm the identity and status of remote machines. The TPM is a PKI chip featuring a cryptographic coprocessor, secure storage for encryption keys, and a random number generator. However, few customers turn on the TPM or employ its encryption software, partly due to the technology's lack of readiness. Another disadvantage is that the TPM may permit the service provider or vendor to enforce unfair software licenses or inhibit interoperability. Other trusted computing technologies include Intel's Active Management Technology, which allows PCs to be controlled by the IT department even when they are turned off; Microsoft's Secure Startup, which encrypts and signs the entire hard disk with keys on the TPM, but can hinder performance and cannot preserve data if the TPM is damaged; and built-in virtualization from AMD and Intel that enables the simultaneous operation of multiple operating systems. This last option supports secure administration without user awareness or intervention, although each OS still needs its own memory, disk space, and other resources. Slow progress in the area of trusted computing has a positive side, in that the emerging trusted computing architecture seems poised to deliver more security and less vulnerability to vendor debasement than the originally proposed architecture.
    Click Here to View Full Article
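
    The remote-machine status check mentioned above rests on the TPM's ability to accumulate measurements that cannot be quietly rewritten: each value is folded into a register by hashing, so the final value attests to the whole chain. The sketch below shows that chaining rule conceptually, using SHA-1 in plain Python; it is not a TPM driver and does not reflect how Secure Startup is implemented.

        # Conceptual measurement chain in the style of a TPM register "extend":
        # new value = hash(old value || hash(measurement)). Sketch only.
        import hashlib

        def extend(register, measurement):
            return hashlib.sha1(register + hashlib.sha1(measurement).digest()).digest()

        register = bytes(20)                           # registers start at all zeros
        for component in (b"firmware", b"bootloader", b"kernel"):
            register = extend(register, component)

        print(register.hex())   # changes if any measured component changes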

  • "Bolstering U.S. Supercomputing"
    Issues in Science and Technology (07/05) Vol. 21, No. 4, P. 28; Graham, Susan L.; Snir, Marc; Patterson, Cynthia A.

    Current government policies and spending levels will not adequately provide the supercomputing resources needed to bolster U.S. defense and national security, write Susan Graham, Marc Snir, and Cynthia Patterson, who all participated on the National Research Council committee that produced the report, "Getting Up to Speed: The Future of Supercomputing." "The federal government must provide stable long-term funding for supercomputer design and manufacture and also support for vendors of supercomputing hardware and software," the authors contend. Graham, Snir, and Patterson attribute supercomputing's lamentable state to its commoditization, which is reflected in the preponderance of cluster systems built from off-the-shelf components in the TOP500 list; the market value of custom systems has shrunk along with demand, concurrent with dwindling federal support for the development and acquisition of custom supercomputers. Yet custom systems' computational speed is essential for climate modeling, intelligence analysis, nuclear stockpile management, and other critical national domains. The authors suggest that federal agencies reliant on supercomputing assume responsibility for strengthening the U.S.'s supercomputing infrastructure and funding the community development of a long-term supercomputing roadmap, while the government must also support hardware and software vendors either through federal funding of R&D expenses, regular acquisitions, or both. Graham, Snir, and Patterson also recommend the government guarantee that the most powerful supercomputers are available to scientists with the most pressing computational needs. In addition, science communities that use supercomputers should also wield significant influence in the provision of sufficient supercomputing infrastructure.


 