ACM TechNews is sponsored by Thunderstone. Learn more about Texis, the text-oriented database providing high-performance search engine features combined with SQL operations and a development toolkit, which powers many diverse applications, including Webinator and the Thunderstone Search Appliance.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Thunderstone or ACM. To send comments, please write to [email protected].
Volume 7, Issue 816: Friday, July 15, 2005

  • "ACM Past President Maria Klawe and Bill Gates to Address Importance of Academia/Industry Collaboration"
    ACM (7/15/05); Gold, Virginia

    ACM Past President Maria Klawe, Dean of Engineering and Applied Science at Princeton University, will join Microsoft Corp. Chairman and Chief Software Architect Bill Gates on a live webcast at the sixth annual Microsoft Research Faculty Summit on Monday, July 18, in Redmond, WA. Also participating will be Rick Rashid, senior vice president of Microsoft Research and former chairman of the ACM Software System Awards Committee. Before an audience of more than 400 faculty researchers from 175 universities worldwide, Gates, Klawe, and Rashid will discuss the opportunities and challenges facing academic researchers and the technology industry, and the importance of collaboration between academia and industry. Topics include the challenges facing computer science, such as dwindling government funding for research, declining numbers of students entering U.S. computer science programs, and efforts to increase the participation of women and minorities in computing careers. The Faculty Summit is an annual conference that provides a lively forum for members of the global academic community and Microsoft researchers to explore collaborative opportunities and showcase their innovative work in diverse areas of computing. The live webcast (from 9 - 9:45 a.m. PDT) will be available at varying connection speeds from these links on Monday, July 18:
    http://metahost.savvislive.com/microsoft/20050718/gates_msr_faculty_20050718_56.asx
    http://metahost.savvislive.com/microsoft/20050718/gates_msr_faculty_20050718_100.asx
    http://metahost.savvislive.com/microsoft/20050718/gates_msr_faculty_20050718_300.asx

    For more information, visit http://campus.acm.org/public/pressroom/special/klawe_gates_webcast.cfm

  • "Open-Source P2P Projects Keep Swapping"
    CNet (07/15/05); Borland, John

    Independent open-source file-swapping projects have not been muzzled by the Supreme Court's recent ruling that commercial file-trading companies which encourage copyright infringement by their users can be held liable for digital piracy, although the case has been a topic of discussion among open-source programmers. Freenet founder Ian Clarke says commercial peer-to-peer (P2P) software developers are likely to be the ones most affected by the court's decision. "Even then, the impact is really to make them more careful about what they say both within their companies and externally about aspirations for their software," he notes. Many open-source programming projects are decentralized and do not generate revenue, although the file-swapping software they produce is similar in function and popularity to that distributed by commercial P2P companies. Intellectual property lawyer Jeffrey Neuburger argues that open-source developers could be deemed legally responsible for copyright infringement if they are not careful. Few open-source programmers seem willing to halt their projects, although some are taking pains not to appear as though they are endorsing copyright infringement. Still, it is hard to deny that many of the most prominent P2P concepts and products, such as BitTorrent, sprang from open-source projects. BitTorrent is a popular file-swapping tool, but its use for sanctioned activities, such as the distribution of open-source operating system files, is increasing.
    Click Here to View Full Article

  • "Gap Emerges Between High-Performance Computing Hardware, Software"
    Computerworld (07/14/05); Thibodeau, Patrick

    A new report from IDC indicates a widening gap between the capabilities of high-performance computing (HPC) systems' hardware and software: Hardware vendors can assemble HPC systems with hundreds or thousands of processors in parallel, while independent software vendors are producing HPC applications that employ just 12 or 16 processors, on average. IDC analyst Earl Joseph says there are not enough users who want to scale their systems across hundreds or thousands of processors to justify the high costs for rewriting and testing applications on large systems. Companies that test and design products via simulation and virtualization see HPC as an essential tool for shortening the product development cycle and speeding up time-to-market. The IDC study was issued on July 13 to coincide with the Council on Competitiveness' High Performance Users Conference, and was commissioned by the council and the Defense Advanced Research Projects Agency. There was general agreement among conference attendees that carrying out product development in a simulated environment has tremendous business value, but HPC needs to be more widely embraced. Fluent's Paul Bemis expressed his belief that making HPC accessible to smaller businesses will broaden its adoption. Chevron CTO Donald Paul said research should be government's chief responsibility, while "the key role for industry is to connect into that research."
    Click Here to View Full Article

  • "Weka's Winning Ways"
    Scoop (NZ) (07/12/05)

    The ACM Special Interest Group on Knowledge Discovery and Data Mining (SIGKDD) will honor Waikato University's Computer Science Machine Learning Group with its highest honor, the 2005 SIGKDD Data Mining and Knowledge Discovery Service Award, at the 11th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining on August 21 in Chicago. SIGKDD is recognizing the group for its development of the Weka (Waikato Environment for Knowledge Analysis) data mining software, and the accompanying data mining and machine learning textbook, "Data Mining: Practical Machine Learning Tools and Techniques." Available for free, the software has attracted 200,000 downloads on SourceForge since April of 2000. "It is currently downloaded at a rate of 10,000 a month," says professor Ian Witten. "The Weka mailing list has over 1,100 subscribers in 50 countries, including subscribers from many major companies." There have been 15 major projects that make use of Weka.
    Click Here to View Full Article
    For more on the upcoming SIGKDD conference, or to register, visit http://www.acm.org/sigs/sigkdd/kdd2005/

  • "E-Voting: Paper Trail Versus Transparency"
    NewsForge (07/13/05); Lyman, Jay

    Johns Hopkins University computer science professor Avi Rubin and VerifiedVoting's David Dill discuss the relative value of transparent e-voting systems and a verifiable voter paper receipt (VVPR), the latter of which has been receiving most of the attention. Dill ranks publicly available voting machine code as the second most important requirement for reliable and trustworthy e-voting, after VVPRs and more liberal recount laws or random audits to check paper ballots against electronic counts; his argument for assigning less priority to disclosure is that voting machines can have exposed source code yet still be untrustworthy. Still, Dill supports full disclosure of e-voting system software and hardware, though he thinks companies should be permitted to scrub their programs prior to release to avoid the exposure of vulnerabilities. His attitude to open source development of voting systems is to wait and see if such a goal can be successfully reached. Rubin says software should be open from a project's outset, but he sees no legislative attempts to make transparent e-voting technologies a requirement; he contends that publicly available code is essential, but by itself inadequate for reliable, trustworthy e-voting. Dill reports that open source e-voting systems and software are not a federal requirement, while Rubin says few legislators have pushed for such measures. Rubin and Dill concur that the major, proprietary vendors of e-voting systems are resisting the open-source concept, mostly through willful ignorance. "Now is the time to fix the [e-voting] system, when people are less worried about getting through the next election," Dill says.
    Click Here to View Full Article
    For information on ACM's activities involving e-voting, visit http://www.acm.org/usacm

  • "DHS Reorganization Creates New Cybersecurity Position"
    IDG News Service (07/13/05); Gross, Grant

    A restructuring of the Department of Homeland Security (DHS) announced by DHS Secretary Michael Chertoff on July 13 will establish the position of assistant secretary for cyber and telecommunications security. IT groups and several bills introduced in Congress this year have advocated such a move, arguing that it would make cybersecurity a higher DHS priority. ITAA President Harris Miller recently said an assistant secretary will possess the clout to set policy and facilitate communication between the government and the private sector. He said private industry is often hit by cyberattacks sooner than government agencies, which raises the necessity for a "sophisticated, real-time, highly trusted" way for the government and private companies to share information. IT organizations also think that a more authoritative, higher-level position would reduce turnover among federal cybersecurity directors. Ounce Labs CEO Jack Danahy said an assistant secretary could combine several inter-government cybersecurity initiatives, but some security experts are skeptical that such an officer would have sufficient authority to significantly improve cybersecurity. James Lewis of the Center for Strategic & International Studies admitted in a recent interview that a higher-level position could symbolically give cybersecurity issues a bigger profile, but doubted it would make a major difference in practice.
    Click Here to View Full Article

  • "Computer Science Enrollment Lulls After End of Tech Boom"
    Diamondback (07/14/05); Campbell, Kate

    In an effort to boost computer science enrollment, the University of Maryland has developed several outreach programs to champion the field. The university has seen an overall decline in enrollment in its College of Computer, Mathematical, and Physical Sciences of about 33 percent since 2002, while enrollment of women is down 39 percent and minority enrollment has fallen 47 percent. Department chairman Larry Davis believes those numbers merely reflect a natural correction of the elevated interest fueled by the booming tech economy and Y2K concerns around the turn of the century. The outreach programs include summer seminars that bring matriculating high school students to campus and efforts to spur minority enrollment in graduate school. Computer science professor Nelson Padua-Perez has also used his Java Passport program, which drew a healthy representation of women and minorities from area high schools, as a platform to address the image of the computer scientist. "If you look here in [the department] you see very balanced people," he says. Padua-Perez has geared elements of his program to cater to women, and other programs at Maryland have emerged to advance minority involvement in the field. He is applying for a National Science Foundation grant to house high school students on campus during the program and to develop an advanced curriculum for participants with prior experience.
    Click Here to View Full Article

  • "UN to Propose Global Internet Council"
    Computer Business Review (07/14/05); Murphy, Kevin

    The United Nations' Working Group on Internet Governance (WGIG) is set to release a report next week that will recommend the creation of a new governing body to replace ICANN, which is affiliated with the U.S. Department of Commerce. The report, which has yet to be published but was obtained by ComputerWire, recommends that "no single government should have a pre-eminent role in relation to international Internet governance" and suggests that control be transferred to "a new space for dialogue for all stakeholders on an equal footing on all Internet governance-related issues." The report outlines four possibilities for this forum, three of which are linked to the United Nations (UN) and three of which keep ICANN somewhat intact. The first proposal calls for the creation of a Global Internet Council composed of government representatives with UN oversight; it would take over the Department of Commerce's current role in ICANN and set international domain name policy. The second model would resemble the status quo, while the third would establish an International Internet Council, which would take hold of Commerce's oversight powers. The broadest recommendation calls for three bodies -- the Global Internet Policy Council, a Global Internet Governance Forum, and World ICANN (WICANN) -- that would work together to coordinate and oversee various Internet functions. In what seems to some a response to the forthcoming WGIG report, the U.S. government's National Telecommunications and Information Administration (NTIA), which currently oversees ICANN, issued four "U.S. Principles on the Internet's Domain Name and Addressing System" last week that pronounce the country's intention to retain its current role in Internet governance.
    Click Here to View Full Article

  • "Study Shows Promise of Entry-Level IT Jobs for Low-Wage Workers"
    UC Berkeley News (07/14/05); Maclay, Kathleen

    Karen Chapple, an assistant professor of city and regional planning at the University of California, Berkeley, has found that despite all the warnings about offshore outsourcing, there are many IT jobs available for entry-level, low-wage workers. Her research suggests that such workers enjoy steep wage increases of up to 56 percent within three years of landing an initial position in IT. AnnaLee Saxenian, dean of Berkeley's School of Information Management & Systems, believes the U.S. IT industry is emerging from a recession marked by a loss of jobs to India, with unexpected growth in industries such as health care, agriculture, and biotechnology. Chapple found that large cities with substantial entry-level job markets tend to be the most susceptible to entry-level IT job losses. She also determined that offshoring typically appeals only to larger IT companies, and that the more than 1 million entry-level IT jobs in the United States cannot be filled exclusively by foreign workers or those with four-year degrees, particularly as those jobs grow by 5 percent annually over the next eight years. Her study, funded jointly by the National Science Foundation and the UC Institute for Labor & Employment, found that the industry is moving more toward part-time and contract labor, fueled partly by layoffs and baby-boomer retirements. With federal funding for training programs declining, private nonprofit programs are emerging that Chapple has found to be more attuned to the requirements of the IT job market.
    Click Here to View Full Article

  • "Panama Gets Software to Assess Students"
    Associated Press (07/12/05); Kaczor, Bill

    The Florida Institute for Human and Machine Cognition is giving its Cmaps concept-mapping software to schools and teachers in Panama and elsewhere as a learning enhancement tool. The software was created partly as a knowledge preservation system; it represents information as a diagram of concepts linked by verbs or phrases (a minimal sketch of such a structure in code follows this item). Cmaps can be used to evaluate student knowledge, structure information for writing projects, promote thinking and problem-solving, and help educators flesh out new curricula. Concept maps can also be connected to Web sites, and support collaboration between students in different classrooms, schools, and even countries. Concept mapping is one part of Panama's Get Connected initiative, which Panamanian secretary of governmental innovation Gaspar Tarte describes as an effort to migrate education from a repetitive, memorization-based model to a dynamic paradigm. The Florida Institute started training Panama's teachers in concept mapping in February, and many educators will work out their concept maps on paper until computers are installed in all participating Panamanian schools. Although Brown-Barge Middle School in Pensacola, Fla., has an integrated curriculum course that employs Cmaps, Florida Institute associate director Alberto Canas says concept mapping's adoption by U.S. schools has been sluggish, mainly because of bureaucratic hurdles.
    Click Here to View Full Article
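
    To make the data structure concrete, the short C sketch below models a concept map as a graph whose nodes are concepts and whose edges carry linking phrases. The type names and sample propositions are invented for this digest; it is not code from the Florida Institute's Cmaps software.

        /* Sketch only: a concept map as a graph whose nodes are concepts and
         * whose edges carry linking phrases. Names and propositions invented. */
        #include <stdio.h>

        #define MAX_LINKS 16

        struct link { const char *phrase; const char *target; };

        struct concept {
            const char *name;
            struct link links[MAX_LINKS];
            int n_links;
        };

        static void add_link(struct concept *c, const char *phrase, const char *target)
        {
            if (c->n_links < MAX_LINKS)
                c->links[c->n_links++] = (struct link){ phrase, target };
        }

        int main(void)
        {
            struct concept water = { .name = "Water" };
            add_link(&water, "can change into", "Vapor");
            add_link(&water, "is needed by", "Plants");

            /* Each edge reads as a proposition: concept -> phrase -> concept. */
            for (int i = 0; i < water.n_links; i++)
                printf("%s -> %s -> %s\n", water.name,
                       water.links[i].phrase, water.links[i].target);
            return 0;
        }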

  • "Simulated Society May Generate Virtual Culture"
    New Scientist (07/14/05); Knight, Will

    Researchers participating in the New and Emergent World Models Through Individual, Evolutionary and Social Learning (NEW-TIES) project are developing a society of virtual characters that can eat, reproduce, communicate, and learn through interaction. The scientists hope the virtual people's ability to communicate will spawn sophisticated cultural activities similar to those found in human societies. The project will place about 1,000 "agents" in a simulated environment hosted on a computer network distributed across two U.K. universities, two Dutch universities, and one Hungarian university. Each agent will be programmed to move throughout the simulation, build simple structures, survive through eating, and learn from its environment; agents will also be able to mate with members of the opposite gender and produce offspring that receive a random collection of hereditary traits from both parents (a toy sketch of this kind of trait inheritance follows this item). Moreover, the virtual characters will concoct their own language by pointing to objects and employing randomly generated "words." University of Surrey scientist Nigel Gilbert says the experiment could be particularly incisive if the simulated people develop ritual practices or learn to use non-functional objects symbolically. Indiana University's Edward Castronova is skeptical that NEW-TIES will generate important insights into human society and culture, arguing that "Inferences from an entirely artificial system are always going to be weakened by the artificiality." He says it is more sensible to study real human societies that mature within virtual fantasy environments.
    Click Here to View Full Article
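
    As a toy illustration of the inheritance described above, the C sketch below gives each offspring trait a 50/50 chance of coming from either parent. The trait encoding, the trait count, and the use of uniform crossover are assumptions made for this digest, not details of the NEW-TIES system.

        /* Toy illustration: each offspring trait is copied from one randomly
         * chosen parent. Encoding and crossover scheme are assumptions. */
        #include <stdio.h>
        #include <stdlib.h>
        #include <time.h>

        #define N_TRAITS 8

        typedef struct { double traits[N_TRAITS]; } agent;

        static agent reproduce(const agent *a, const agent *b)
        {
            agent child;
            for (int i = 0; i < N_TRAITS; i++)
                child.traits[i] = (rand() % 2) ? a->traits[i] : b->traits[i];
            return child;
        }

        int main(void)
        {
            srand((unsigned)time(NULL));
            agent mother = { { 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8 } };
            agent father = { { 0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2 } };
            agent child  = reproduce(&mother, &father);

            for (int i = 0; i < N_TRAITS; i++)
                printf("trait %d: %.1f\n", i, child.traits[i]);
            return 0;
        }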

  • "Reading Phone Text One Word at a Time"
    CNet (07/13/05); Kanellos, Michael

    Stanford University researchers have tapped the concept of Rapid Serial Visual Presentation (RSVP) to create BuddyBuzz, software that makes reading text on cell phones easier. BuddyBuzz flashes just one word at a time in the center of the display briefly before proceeding to the next word, making each word easier to read than in standard presentations; the demo version of BuddyBuzz lets users adjust the speed of the scrolling text. BuddyBuzz also follows syntax rules: Proper names are flashed for a longer length of time than prepositions, and pauses are introduced between stories as well as for periods and commas (a rough sketch of this kind of word timing follows this item). Stanford undergraduate research specialist Dan Eckles says the reading method is particularly well suited to short text bursts such as news articles or blog postings. RSVP was originally conceived in the 1970s, but the idea may finally find an enthusiastic market in cell phone users. The Stanford researchers presented the BuddyBuzz demo at IBM's New Paradigms in Using Computers conference this week, where other concepts for cell phone text presentation were also highlighted. Motorola researchers unveiled Screen3, a system that scrolls headlines across a cell phone's display; the system presents a paragraph-long summary of a story when a user clicks a button once, and the entire story when the button is clicked twice. Meanwhile, the India-based Rediff.com portal is developing ideas for reorganizing news stories so mobile device users can absorb them more easily.
    Click Here to View Full Article
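
    The C sketch below illustrates the general RSVP timing idea described above: one word is flashed at a time, capitalized (proper-name-like) words are held longer, and punctuation adds a pause. The specific durations and heuristics are assumptions for illustration, not BuddyBuzz's actual rules.

        /* RSVP timing sketch: flash one word at a time in a fixed spot. */
        #include <ctype.h>
        #include <stdio.h>
        #include <string.h>
        #include <unistd.h>            /* usleep(); POSIX */

        static void show_word(const char *word, int wpm)
        {
            unsigned ms  = 60000u / (unsigned)wpm;   /* base time per word */
            size_t   len = strlen(word);

            if (isupper((unsigned char)word[0]))
                ms = ms * 3 / 2;                     /* hold proper names longer */
            if (len > 0 && strchr(".,;:!?", word[len - 1]))
                ms *= 2;                             /* pause at punctuation */

            printf("\r%-20s", word);                 /* flash in place */
            fflush(stdout);
            usleep(ms * 1000);
        }

        int main(void)
        {
            const char *story[] = { "Stanford", "researchers", "demo",
                                    "one-word", "reading", "on", "phones.", NULL };
            int wpm = 300;                           /* user-adjustable speed */

            for (int i = 0; story[i]; i++)
                show_word(story[i], wpm);
            printf("\n");
            return 0;
        }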

  • "Poker-Playing Robots Battle for $100,000 Pot"
    InformationWeek (07/12/05); Greenemeier, Larry

    Participants in the World Series of Poker Robots say programming a robot to play mathematically perfect poker is easy, but the tournament also demands new levels of artificial intelligence research to address issues such as luck and reading an opponent, as well as the obstacle of coping with incomplete and often misleading information. The University of Alberta's Computer Poker Research Group has created a computer gaming program with a human dimension. "The problem with poker, or any domain where you're working with unknown information, is that there's no one way to do it," says computer science professor Jonathan Schaeffer. The robots playing in this week's tournament compile databases of opponents' moves to better predict future play, just as human players do (a toy frequency-tracking sketch follows this item). In seeking a real-world application, Schaeffer likens the last 14 years of U.S.-Iraqi relations to a poker game in which the invasion of Kuwait was the ante, the initial U.S. military action constituted the first raise, and so on. Schaeffer believes that artificial intelligence could prove useful if it can aid humans in solving problems with only incomplete information to work with. The winning robot of the tournament will move on to play professional poker player Phil Laak for a $100,000 prize, but Schaeffer is concerned that the tournament format will not include enough hands to effectively neutralize the luck factor; in testing a program, he says, between 40,000 and 50,000 hands are dealt, whereas this tournament will likely see only a few hundred hands dealt in each four-hour elimination session.
    Click Here to View Full Article
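
    The C sketch below shows the simplest form of the frequency-tracking idea mentioned above: tally an opponent's observed actions and turn the counts into probability estimates. It is only an illustration of the concept, not the University of Alberta group's program.

        /* Toy opponent model: count fold/call/raise actions and estimate
         * probabilities with a +1 prior so unseen actions are never ruled out. */
        #include <stdio.h>

        enum action { FOLD, CALL, RAISE, N_ACTIONS };
        static const char *names[N_ACTIONS] = { "fold", "call", "raise" };

        struct opponent_model { unsigned counts[N_ACTIONS]; unsigned total; };

        static void observe(struct opponent_model *m, enum action a)
        {
            m->counts[a]++;
            m->total++;
        }

        static double estimate(const struct opponent_model *m, enum action a)
        {
            return (m->counts[a] + 1.0) / (m->total + N_ACTIONS);
        }

        int main(void)
        {
            struct opponent_model m = { {0}, 0 };
            enum action history[] = { CALL, RAISE, CALL, FOLD, CALL, RAISE, CALL };

            for (size_t i = 0; i < sizeof history / sizeof history[0]; i++)
                observe(&m, history[i]);

            for (int a = 0; a < N_ACTIONS; a++)
                printf("P(%s) ~ %.2f\n", names[a], estimate(&m, (enum action)a));
            return 0;
        }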

  • "Robo-Soccer Teams Shoot for Big Goals"
    ABCNews.com (07/12/05); Eng, Paul

    The development of technologies and methodologies that enable automated teamwork between robots is the goal behind RoboCup, the annual international contest in which teams of robots compete against each other in soccer tournaments. The performance of RoboCup teams has improved substantially, due in part to improvements in computing power and capabilities, according to researchers. Carnegie Mellon University computer science professor Manuela Veloso says the Sony AIBO robot dogs her RoboCup team is using in this year's competition are distinctly better than older models: The machines boast sturdier legs, improved joints and motors, and more advanced sensors and built-in wireless communications equipment. Carnegie Mellon's CMDash05 team has also augmented the AIBOs with artificial intelligence routines that enable them to receive balls, use different formations against different teams, and perform other sophisticated soccer maneuvers. The concept of cooperative teams of robots has potential applications in fields that include search and rescue. Challenges that must be overcome before effective robot teamwork is truly realized include the possibility that excessively powerful AI routines could cause robots to waste time analyzing and talking with each other when they should be responding to surrounding situations. Veloso says the additional research and funding needed to solve such problems is a challenge in itself, as the United States devotes significantly fewer resources to robot R&D than countries such as Japan. Fortunately, interest in robotics is increasing among the American public as well as the research sector.
    Click Here to View Full Article

  • "Argonne Wins Four R&D 100 Awards for Scientific, Technological Innovation"
    Argonne National Laboratory (07/08/05)

    The MPICH2 software from the U.S. Department of Energy's Argonne National Laboratory has been recognized with an R&D 100 Award from R&D magazine. Application developers will be able to use MPICH2 to program parallel computers, including clusters of laptops and workstations working together on massive problems, with the same code (a minimal example of the kind of message-passing program MPICH2 runs follows this item). MPICH2 takes a layered architecture approach to distributing mathematical workloads, enabling vendors and researchers to change the lower layer for proprietary networks and use the portable upper layer to comply with community standards and state-of-the-art computational algorithms. The open-source implementation of the MPI-2 international message-passing standard is now available for free. Aircraft engines have been designed at Pratt & Whitney using MPICH2, while other applications include materials science, combustion simulation, astrophysics, climate modeling, and bioinformatics. William Gropp, Ewing Lusk, Robert Ross, Rajeev Thakur, and Brian Toonen developed the software.
    Click Here to View Full Article
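
    For readers unfamiliar with MPI, the short C program below is a conventional example of the message-passing style that MPICH2 implements (standard MPI calls only); the same source can run on a cluster of workstations or a large parallel machine, which is the portability point made above. It is a generic illustration, not code from the Argonne team; MPI programs are typically built with mpicc and launched with mpiexec, though exact commands vary by installation.

        /* Generic MPI example: every non-zero rank sends one integer to rank 0. */
        #include <mpi.h>
        #include <stdio.h>

        int main(int argc, char **argv)
        {
            int rank, size;

            MPI_Init(&argc, &argv);
            MPI_Comm_rank(MPI_COMM_WORLD, &rank);    /* this process's id   */
            MPI_Comm_size(MPI_COMM_WORLD, &size);    /* number of processes */

            if (rank == 0) {
                /* Rank 0 collects one integer from every other rank. */
                for (int src = 1; src < size; src++) {
                    int value;
                    MPI_Recv(&value, 1, MPI_INT, src, 0, MPI_COMM_WORLD,
                             MPI_STATUS_IGNORE);
                    printf("rank 0 received %d from rank %d\n", value, src);
                }
            } else {
                int value = rank * rank;             /* stand-in for real work */
                MPI_Send(&value, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
            }

            MPI_Finalize();
            return 0;
        }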

  • "'Hard Fun' Yields Lessons on Nature of Intelligence"
    EE Times (07/11/05) No. 1379, P. 1; Brown, Chappell

    Co-director of MIT's Future of Learning Group David Cavallo says revolutionary insights about human intelligence can be produced through "hard fun" projects that apply technology creatively, methodically, and assiduously. He explains that early artificial intelligence research did not align well with intelligence modeling, since it followed a simplistic vision; but the computer can now enable researchers to better account for the complexity of intelligence. "What's really been rich in AI, what's really rich in the computer, and what has helped us to understand minds better was trying to build models of minds," Cavallo says. Rich computing can be applied to education to make learning a more active and dynamic experience for students. Cavallo notes that arts educators tend to be more accepting of active learning projects than technical teachers, since people with an arts background are familiar with mixed media and are eager to embrace new concepts and forms of expression. He cites a project in which Brazilian students were challenged to design an ideal city as a model for the kind of "hard fun" classes his group develops: The project begins with a brainstorming session where each participant works out what specific urban element he is going to model, and then thinks through the various pitfalls associated with that element; the overall goal of the project is to give students, teachers, and administrators new ideas about making learning different. Cavallo says his team hopes that future learning will involve the use of the computer to support thinking, creativity, and the realization of imagined concepts. "We're not just looking at the computer as an information-delivery device or a communication device: It's a dreaming and making device," he concludes.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Software Under Scrutiny"
    CIO Australia (07/07/05); Bushell, Sue

    Software inspection processes are the best way to lower development costs, abbreviate delivery schedules, and make operational software products more trustworthy in the field. Software inspection expert Edward Weller III says experience and discussions with industry colleagues indicate that management is more open to software solutions when developers present them in a feasible, credible way that satisfies decision makers' information needs. One of the most renowned software inspection methodologies is the Fagan Inspection Process, which is designed to reduce the number of defects users must contend with, eliminate defects prior to testing, and instill measurability and manageability within software development projects. The data captured through software inspection can help managers ascertain the return on investment of any software process improvement action by quantifying incurred costs and achieved savings, while technical practitioners can use the data to determine defect detection rates and detected bug types in order to improve software practice and products (a back-of-the-envelope sketch of such calculations follows this item). IBM's Andrew Bowey says CIOs can be very resistant to inspections, so it should be clearly communicated to them that the organization will enjoy greater savings when bugs are spotted and patched earlier in the development process. John Salerno of Dedicated Systems reports that "companies that have been burnt are more likely to use [inspections] because what we find is that inspections are valuable for this reason: that it finds bugs where they are injected, which is usually by the engineers themselves in the development team." IBM's Joji Vergara says the benefits to be gained through the inspection process depend on how clearly each participant is told why his participation was solicited.
    Click Here to View Full Article
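
    The C sketch below illustrates the kind of arithmetic inspection data supports: defect-detection effectiveness and return on investment. All figures in it are placeholder assumptions for this digest, not numbers from the article.

        /* Placeholder figures: effectiveness is the share of known defects
         * caught by inspection; ROI compares late-fix cost avoided with the
         * cost of inspecting. */
        #include <stdio.h>

        int main(void)
        {
            double found_in_inspection = 40.0;
            double found_later         = 10.0;    /* in test or the field  */
            double inspection_cost     = 8000.0;  /* staff hours x rate    */
            double cost_per_late_fix   = 1000.0;  /* average late-fix cost */

            double effectiveness = found_in_inspection /
                                   (found_in_inspection + found_later);
            double savings = found_in_inspection * cost_per_late_fix;
            double roi     = (savings - inspection_cost) / inspection_cost;

            printf("detection effectiveness: %.0f%%\n", effectiveness * 100.0);
            printf("estimated ROI: %.0f%%\n", roi * 100.0);
            return 0;
        }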

  • "Weather Forecasters Turn to High Technology"
    Military & Aerospace Electronics (06/05) Vol. 16, No. 6, P. 24; Ames, Ben

    Weather forecasters are embracing advanced data fusion, software, and computationally efficient technologies in order to make faster and more accurate forecasts, which are critical to both military and civilian operations. The "nowcast" data-fusion system being developed by the U.S. Naval Research Laboratory is designed to allow forward-deployed battle groups to automatically collate meteorological data to scan the battlespace, establishing unified situational awareness for warfighters through continuous updates. Another data-fusion initiative, coordinated by the National Oceanic and Atmospheric Administration (NOAA), is the development of the Advanced Weather Interactive Processing System (AWIPS), which will let meteorologists capture sensor, model, and observation data from various modeling systems, and then process and share that data for weather prediction and tracking via interactive software and graphic displays. Data overload is a major concern in weather forecasting, and engineers are exploring several options to address this issue. One such option is FuzzyTAF, an automated version of the National Weather Service's Terminal Aerodrome Forecast (TAF) that removes the need for human forecasters through fuzzy logic, which expends little computational muscle (a generic illustration of the fuzzy-logic idea follows this item). FuzzyTAF can operate on a garden-variety dual-processor desktop PC, and downloads data over the Internet or satellite at a maximum rate of 60 MB per hour, creating forecasts for 1,400 U.S. locations that are updated hourly. Meanwhile, engineers at the U.S. Army's High Performance Computing Research Center are trying to shed the computational costs of forecasting by running the Air Force Weather Agency's MM5 operational model at finer resolution and nesting forecast domains.
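
    The C sketch below is a generic illustration of the fuzzy-logic idea mentioned above: a simple ramp membership function grades how strongly a wind speed belongs to the category "strong" instead of applying a hard cutoff, and it costs almost nothing to evaluate. The thresholds and the single rule are assumptions for illustration, not FuzzyTAF's actual rule base.

        /* Generic fuzzy-membership illustration, not FuzzyTAF code. */
        #include <stdio.h>

        /* Membership rises linearly from 0 at `lo` to 1 at `hi`. */
        static double ramp_up(double x, double lo, double hi)
        {
            if (x <= lo) return 0.0;
            if (x >= hi) return 1.0;
            return (x - lo) / (hi - lo);
        }

        int main(void)
        {
            double winds_kt[] = { 5.0, 12.0, 18.0, 25.0, 35.0 };

            for (int i = 0; i < 5; i++)
                printf("wind %4.1f kt -> membership in 'strong': %.2f\n",
                       winds_kt[i], ramp_up(winds_kt[i], 10.0, 30.0));
            return 0;
        }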

  • "The Fading Memory of the State"
    Technology Review (07/05) Vol. 108, No. 7, P. 44; Talbot, David

    Archives such as the National Archives and Records Administration (NARA) will need to be staffed by librarians who are proficient in metadata, computer scientists who are skilled in storage technologies, and experts with a historian's eye in order to preserve the electronic records of today. NARA, which preserves all of the retired records of the federal government, faces the difficult challenge of dealing with digital records that rot much faster than paper documents and that are stored in thousands of incompatible data formats. Harris Corporation and Lockheed Martin have offered to deliver on what would be a miracle -- a permanent Electronic Records Archive (ERA) that can handle the 16,000 software formats used by the federal government and any future file-reading software and hardware, and ensure that stored records are authentic, available online, and secure from hackers and terrorists. Approximately $136 million has been authorized and proposed for the system, which NARA wants to roll out in stages from 2007 to 2011. Commercial developments in "virtual storage" technologies should help, as should the achievements of other governments, such as Australia's digital archives program, which relies on the open-source software XENA (XML Electronic Normalizing of Archives) to convert records to a standard format that could be read by future technologies. At the same time, NARA and the National Science Foundation are funding initiatives at research sites such as the San Diego Supercomputer Center, which is focused on extracting data from old formats, among other issues. Meanwhile, Robert F. Sproull, the Sun Microsystems computer scientist who chairs the National Academy of Sciences panel advising NARA, believes the electronic-records program is too ambitious and that a more pragmatic, incremental approach should be pursued.
    Click Here to View Full Article


 