
       HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 5, Issue 543: Wednesday, September 10, 2003

  • "Copyright Directive 'Could Be Europe's DMCA'"
    New Scientist (09/09/03); Knight, Will

    A Sept. 8 report from the UK's Foundation for Information Policy Research (FIPR) warns that the European Union Copyright Directive (EUCD), whose wording and goals are similar to those of the United States' Digital Millennium Copyright Act (DMCA), has the potential to stifle academic freedom and technological creativity, as well as restrict the rights of computer users. The EUCD restricts the circumvention of technology designed to block the duplication of digital content, with the aim of preventing piracy, but critics claim the same rules can also be used to undermine the efforts of computer scientists, end users, and business rivals. "The EU copyright directive uses very similar language [to the DMCA] and you can see the same things happening," states FIPR director Ian Brown. The directive could conceivably give software developers the power to stop competitors from releasing compatible programs, and Brown argues that EU member states should add legislation that allows consumers to make copies of digital content for personal use. Research on copy protection technology is supposed to be exempt from the EUCD's restrictions, but the FIPR reports that only two of the five countries that have implemented the directive--Finland and Germany--have augmented their legislation with robust safeguards. A proposed EUCD implementation was issued by the British government in an August 2001 consultation paper, but opponents were so incensed that a revision was commissioned. The revision is expected to be released in October, while the European Commission will study the EUCD's effectiveness and consider amendments next year.
    Click Here to View Full Article

  • "Security Executive Stresses Trade-Offs"
    Investor's Business Daily (09/10/03) P. A5; Krey, Michael

    In his book, "Beyond Fear," Counterpane Internet Security CTO Bruce Schneier emphasizes the need to deeply understand security and acknowledge the associated trade-offs. He implies that the U.S. frittered away the international good will that accrued in the wake of the Sept. 11 tragedy; he also notes that the progress the country made in upsetting terrorist networks was offset by the impact the security push has had on civil liberties. Schneier is opposed to the use of facial recognition and other biometrics technologies for security purposes because of their propensity to generate more false positives than successful identifications. National ID cards are an especially sore point: Their implementation entails massive infrastructure and maintenance costs, and Schneier is uncertain as to what problem the cards will solve. Schneier also does not expect the computer virus situation to improve--in fact, he believes things will worsen. A chief reason he cites for this trend is the lack of liability among software companies that make insecure products. "Microsoft...wants us all to think that viruses just happen, like the weather," Schneier contends. He also thinks electronic voting is inherently flawed and too susceptible to fraud. Schneier says his book makes the case that security and its trade-offs must be rationally considered if effective strategies are to be formulated.

  • "New Chemistry Software Automatically Generates Computer Code"
    Ohio State Research News (09/08/03); Gorder, Pam Frost

    The Tensor Contraction Engine (TCE) promises to relieve chemists, physicists, and materials scientists of a significant burden by automatically generating the computer code needed to simulate the structure and interaction of intricate molecules; the tool could also shave time off research projects at supercomputer facilities and national laboratories throughout the U.S. The research areas likely to benefit the most from the TCE software are computational chemistry and computational physics, which revolve around large-scale atomic and molecular interactions. These interactions are modeled with complicated multidimensional arrays--tensors--composed of tens of millions to billions of elements; simulation relies on dozens to hundreds of tensor contractions, generalized matrix multiplications that sum over shared indices (a toy example is sketched after this item), which can take months to program by hand. The TCE promises to carry out this work in mere hours by creating a parallel program that uses the least amount of computer memory and supports speedy communication between parallel supercomputer processors, according to Ohio State University computer and information science professor Ponnuswamy Sadayappan. He adds that the time is ripe for potential TCE users to join the consortium that developed the tool so its functionality can be refined. The consortium, which Sadayappan leads, includes Ohio State University, the University of Waterloo, Oak Ridge National Laboratory, Louisiana State University, and the Pacific Northwest National Laboratory. A TCE prototype was presented at the national meeting of the American Chemical Society on Sept. 7. The TCE was developed with funding provided through the National Science Foundation's Information Technology Research program.
    Click Here to View Full Article
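
    To make the term concrete, the sketch below shows what a single tensor contraction looks like in plain NumPy: a multi-index generalization of matrix multiplication that sums over shared indices. The tensor names, index labels, and dimensions are invented for illustration; expressions of this kind are what a tool like the TCE starts from when it generates optimized parallel code, and the snippet is not output from the TCE itself.

```python
# Illustrative tensor contraction (not TCE-generated code):
# R[a,b] = sum over c,d of T[a,b,c,d] * V[c,d]
import numpy as np

rng = np.random.default_rng(0)
A, B, C, D = 8, 8, 12, 12                      # small, made-up dimensions
T = rng.standard_normal((A, B, C, D))          # 4-index tensor
V = rng.standard_normal((C, D))                # 2-index tensor

# einsum states the contraction declaratively, much as it would be written
# on paper; code generators turn such expressions into memory-efficient
# parallel implementations.
R = np.einsum('abcd,cd->ab', T, V)

# The same contraction as explicit loops, to show exactly what is summed.
R_loops = np.zeros((A, B))
for a in range(A):
    for b in range(B):
        for c in range(C):
            for d in range(D):
                R_loops[a, b] += T[a, b, c, d] * V[c, d]

assert np.allclose(R, R_loops)
```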

  • "The Future of Force-Feedback Technology"
    TechNewsWorld (09/10/03); Koprowski, Gene J.

    The penetration of force-feedback, or haptics, technology into the mass market will require a significantly larger corporate/consumer marketing effort than the industry can support at its current size, but U.S. universities are making considerable advances in the field through research and development financed by the National Science Foundation--advances that could find commercial application in a few years. A team of researchers led by Thenkurussi Kesavadas of the State University of New York at Buffalo has developed sympathetic haptics technology that allows a person wearing a virtual reality data glove to feel, via an Internet transmission, the tactile sensations experienced by another person. Such a breakthrough could conceivably allow both blind and sighted users to learn sculpture, surgery, and other skills remotely. Meanwhile, a haptics project at the University of Southern California enables people to virtually feel sculpture and other artwork, providing a new level of art appreciation without the risk of damaging the work itself. Rutgers University researchers have filed a patent for a PC-based virtual reality system designed to facilitate the rehabilitation of stroke victims, who wear sensor-outfitted gloves to manipulate virtual piano keyboards and other onscreen graphics. The scientists note that patients who used the keyboards experienced a 140% improvement in their hands' range of motion. In order to augment the Internet with force-feedback technology, databases containing the force control algorithms used to render tactile sensations will probably need to be distributed so that local copies of the data are stored at various network nodes. This would allow the kinetic and non-kinetic properties of the manipulated objects to be accessed faster.
    Click Here to View Full Article

  • "Making a Video Screen Out of Thin Air"
    Reuters (09/10/03); Sorid, Daniel

    Military training centers, product showrooms, and museums could be augmented with the advent of walk-through displays that generate video imagery without requiring a solid surface. The Tampere University of Technology's FogScreen, which was showcased at this year's ACM SIGGRAPH conference, projects images onto a cloud of water vapor in mid-air. Commercial production of the FogScreen is expected to begin before 2004, and FogScreen CEO Mika Herpio says the machine's price--which could run as high as $100,000 right now--may fall as the device is manufactured in volume. Meanwhile, MIT graduate student Chad Dyner's Heliodisplay is a prototype display that incorporates a video projector that can generate a two-dimensional, 27-inch screen. The images projected on the display can be manipulated by hand and viewed from multiple angles. Dyner says, "This is something that people have been dreaming about for a long time," comparing his technology with the message display system used by R2D2 in Star Wars. Still, such advanced computer displays are not expected to render traditional desktop displays obsolete; furthermore, Chris Chinnock of Insight Media cautions that the technology, though impressive, may not necessarily catch on with the public.
    Click Here to View Full Article

  • "In Computer Security, a Bigger Reason to Squirm"
    New York Times (09/07/03) P. BU4; Koerner, Brendan I.

    The recent SoBig.F worm outbreak indicates that computer systems are no more secure than they were three and a half years ago, when the I Love You worm ran rampant throughout the Internet. "The whole problem here is not just having antiviral products and using antiviral updates, but a lack of computer knowledge among users," explains Steven Sundermeier of the antivirus company Central Command; he urges users to be more alert to security problems, update their virus scanners often, and suspect dubious email attachments. Security carelessness can extend to people familiar with computer technology, according to a document that Roelof Temmingh of SensePost Information Security presented at a security conference in July. There are also indications that viruses are becoming increasingly versatile, able to infect more than one operating system. The North American Electric Reliability Council disclosed that the Slammer worm, in several documented cases, disrupted bulk electric system controls, and recommended that utility companies isolate the computers controlling their power grids from corporate networks. Meanwhile, the speed at which viruses propagate is overtaking the efforts of even the most vigilant and aggressive patchers, while many expert system architects do not have the budget to effectively fix security holes. Outbreaks such as SoBig.F and Blaster have brought a new focus to "trusted computing," in which software is rendered virus-proof with cryptographic signatures; but for trusted computing to take hold, privacy issues must be addressed while software and hardware providers must reach an accord. Some experts think legislation that makes software companies liable for releasing shoddy products is another solution, but no one in Congress thus far supports the concept.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "IT Jobs That Belong Overseas"
    NewsFactor Network (09/05/03); Hill, Kimberly

    With offshore IT outsourcing having become a fact of life, experts such as Challenger, Gray & Christmas executive VP Rick Cobb and Deloitte & Touche's Maria Grant observe that some jobs are well suited for overseas migration while others are not. Cobb sees little difficulty in moving software development, data collection, and analysis operations offshore, provided the project environment is well-established and the only required communication is intermittent staff reports. "These are more pure math, and they can be moved to the cheapest place they can be done," he explains. Cobb says that outsourcing team-oriented projects can increase the complexity of IT-management procedures and practices, while both language and cultural barriers can hinder such large-scale initiatives. Transactional and routine jobs such as software debugging, network maintenance, and coding for well-defined software modules lend themselves very well to outsourcing, according to Grant. Programming, network administration, and help desk functions can also be outsourced, as long as security and confidentiality issues are addressed. Grant dissuades enterprises from offshoring "strategic tasks" that involve business-process decisions, which play a key role in the implementation and customization of customer relationship management and enterprise resource planning suites. She also notes that modifying software to line up with an organization's business processes is a task best left to domestic staff.
    Click Here to View Full Article

  • "Senator Questions RIAA Crusade"
    IDG News Service (09/10/03); Gross, Grant

    Senate Judiciary Committee Chairman Sen. Orrin Hatch (R-Utah) said he agreed with many of the arguments presented by a Verizon lawyer during a committee hearing on Tuesday, especially the argument that the Digital Millennium Copyright Act (DMCA) dangerously allows subpoenas to be issued without involving a judge. Verizon general counsel William Barr reiterated his company's position that the subpoena power is too broad and will inevitably be abused by people falsely claiming copyright violations. Verizon lost two court battles with the Recording Industry Association of America (RIAA) before identifying four of its subscribers accused of trading music over peer-to-peer networks. But despite his sympathy, Hatch said the DMCA issue had not yet "ripened enough" for Congress to step in; he asked the RIAA, Verizon, and other parties for three updates on potential DMCA subpoena abuses over the next six months. Hatch said a mutually beneficial compromise could eventually be worked out, and he agreed that the RIAA should take some action to prevent piracy of its members' content. U.S. Copyright Office register of copyrights Marybeth Peters called the DMCA subpoenas "appropriate and constitutional," while RIAA President Cary Sherman reminded the committee that the subpoena provision was included as a compromise to protect ISPs from being sued directly.
    Click Here to View Full Article

  • "Wi-Fi and 3G May Come Together"
    CNet (09/09/03); Shim, Richard

    New 802.11b Wi-Fi chips from Broadcom and Royal Philips Electronics signal a likely complementary convergence of Wi-Fi and 3G cellular technologies. Analysts previously forecast that the two technologies would be used together in handheld devices once the price, size, and power requirements of Wi-Fi chips reached a certain point. The new chips are up to 80% smaller than Wi-Fi chips currently on the market, such as Intel's Centrino, and should cost between $12 and $13 each in 10,000-unit batches once production ramps up near the end of this year or the beginning of next. The new Broadcom AirForce One Chip integrates three main Wi-Fi components for reduced size and cost, and features power management functions that limit power consumption to just 6 milliwatts in standby mode--80% less than previous Broadcom Wi-Fi chips. The Philips chip, while not as integrated as Broadcom's, uses just 3 milliwatts in standby and precedes an even smaller chip planned for release next year. Deutsche Bank analyst Ross Seymore says the new chips are the first real opportunity for hardware makers to try out their wireless convergence ideas. Adding Wi-Fi to cellular handsets would allow high-speed connections in metropolitan hot-spot areas while falling back on slower cellular data networks for in-between coverage. Wireless service providers such as Wayport and Cometa, as well as cellular carriers such as T-Mobile, AT&T Wireless, and Sprint, are all racing to set up Wi-Fi networks regionally and nationwide. Meanwhile, the 802.11b standard is giving way to the faster 802.11g standard on PCs, which do not have the stringent power and design requirements of portable devices; Forward Concepts analyst Will Strauss says the new chips show 802.11b Wi-Fi is still viable, but in new uses and platforms.
    Click Here to View Full Article

  • "Chic Gear to Suit Net Generation"
    BBC News (09/10/03); Twist, Jo

    Designers are readying the next generation of wearable computing, integrating fashion and electronics. Glasses with computer displays embedded in the lenses and Bluetooth headsets that let people use their cell phones hands-free are already available. But Motorola researcher Joseph Dvorak says today's wearable computing equipment is bulky and obvious, although continued miniaturization and improved speech-recognition tools will make wearable computing inconspicuous. Dvorak predicts that within five years, wearable computing fashions will blend with people's different tastes in style without attracting attention. Textile firm Softswitch touts a fabric that provides an electronic interface to computer devices and can help bridge the gap between fashion and electronics engineering. The company's product will be used in the Burton Amp jacket, which will allow snowboarders and skiers to control their Apple iPods with a textile-based control pad on the jacket's sleeve. October will see a second brand-name jacket using Softswitch technology, this one touted as a full "telecoms jacket." Softswitch's Dr. Dianne Jones predicts that 20% of all clothing will contain some basic circuitry within 10 years.
    Click Here to View Full Article

  • "Brave New Skies"
    Salon.com (09/04/03); Manjoo, Farhad

    The Transportation Security Administration's (TSA) second-generation computer-assisted passenger prescreening system (CAPPS II) promises to protect air travelers from terrorists and reduce the scrutiny passengers face at airports, but critics charge that it threatens to support government monitoring of all citizens and will actually increase scrutiny for certain people. CAPPS II aims to use the computerized reservation systems (CRSes) of Amadeus, Galileo International, Sabre Holdings, and Worldspan to keep tabs on virtually all passenger records, and critics hypothesize that these four corporations would seek to recoup the cost of overhauling their infrastructure for CAPPS II by selling passengers' travel data to outsiders. San Francisco travel agent Edward Hasbrouck estimates that the travel industry will have to spend billions to integrate passenger data that is not normally captured--names, home addresses, phone numbers, and birth dates--but adds that "The key impact of the [CAPPS II] proposal would be that it would enable the CRS to correlate previously separate reservations for trips into a life-long history of your travel." Not only would this make it easy for CRSes to exploit travel histories for financial gain, but it would give the government easy access to those records. The government's motivation for CAPPS II is to find a solution to the fear and frustration Americans feel toward air travel--one that has the high security standards of Israel's El Al airline, but is invisible, automatic, and interferes with only a small portion of travelers. CAPPS II advocate Capt. Steve Luckey says the system will devote most of its resources to the most likely terrorist threats, but critics counter that automated profiling can miss vital clues that allow true suspects to board without being detained, while focusing on less relevant clues that single out innocents for detention. Even a 1% to 2% false positive rate would result in millions of people being misidentified as possible terrorists annually (a back-of-the-envelope check follows this item). The first proposed version of CAPPS II aroused so much anger for its vagueness that the TSA revised the proposal to quell critics, but the new draft, which detailed the system's data-mining procedures, only fueled further criticism.
    Click Here to View Full Article
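
    The "millions misidentified" figure follows from simple arithmetic. A minimal sketch, assuming roughly 600 million U.S. passenger enplanements per year (an assumed figure for illustration, not one given in the article); note that it counts flagged screenings rather than unique travelers.

```python
# Back-of-the-envelope check of the false-positive claim.
# Assumption (not from the article): ~600 million U.S. passenger
# enplanements per year, circa 2003. Counts screenings, not unique people.
enplanements_per_year = 600_000_000

for false_positive_rate in (0.01, 0.02):
    flagged = enplanements_per_year * false_positive_rate
    print(f"{false_positive_rate:.0%} false-positive rate -> "
          f"about {flagged:,.0f} travelers misflagged per year")
```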

  • "Hard Drive Industry Gets More Respect"
    SiliconValley.com (09/08/03); Diaz, Sam

    Hard disk drives have long taken a back seat to speedy processors, memory, and software, but both consumers and manufacturers of consumer electronics have developed a new respect for the technology, as evidenced by significantly elevated revenues and rising stock prices. Consumers want voluminous hard drives to store the digital content--movies, photos, music, and so on--that they treasure, and the average price of a 56 GB hard drive is $73 today, whereas a 4.4 GB drive cost about $142 five years ago--roughly a 25-fold drop in cost per gigabyte (see the calculation after this item). The major hard drive makers control product cost and the pace of technology advancement because they build their own components rather than purchase them from outside suppliers. J.P. Morgan analyst William Lewis notes that price alone does not give hard drive makers an advantage with customers; far more attractive are flexibility, responsiveness, and consistently high product quality. Mike Cordano of Milpitas-based hard drive manufacturer Maxtor says the industry has received a significant boost from consolidation, which has stabilized the economic environment; the number of market players has fallen from 80 to just seven over the last 18 years. Hitachi Global Storage Technologies CEO Jun Naruse notes that, unlike a processor, a hard drive holds users' data, which cannot simply be replaced if lost. Manufacturers are attempting to advance hard drive technology even further: IBM's Microdrive could be a potential replacement for tape and flash memory, while disk drive pioneer Paul Gilovich has patented a technology for drives with multiple actuators to speed up data retrieval and delivery.
    Click Here to View Full Article
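
    The cost-per-gigabyte comparison can be checked quickly from the two prices quoted in the summary; this is a simple illustrative calculation, not data from the article beyond those two figures.

```python
# Cost per gigabyte implied by the figures quoted in the summary.
price_now, capacity_now = 73.0, 56.0      # dollars, GB (2003)
price_then, capacity_then = 142.0, 4.4    # dollars, GB (circa 1998)

per_gb_now = price_now / capacity_now     # ~$1.30 per GB
per_gb_then = price_then / capacity_then  # ~$32 per GB

print(f"Then: ${per_gb_then:.2f}/GB  Now: ${per_gb_now:.2f}/GB  "
      f"(~{per_gb_then / per_gb_now:.0f}x cheaper per gigabyte)")
```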

  • "Paper Trail Not Dead Yet"
    Wired News (09/09/03); Batista, Elisa

    Rochester Institute of Technology professor Frank J. Romano told Seybold conference attendees on Sept. 8 that although today's office is not paperless, contrary to what outfits such as Wang Labs predicted long ago, there is a noticeable decline in the need for printed material, which should spur the print industry to make the appropriate adjustments to ensure its survival. Romano told his audience that most offices have replaced fax machines with networking technology, while email or an intranet is often workers' primary mode of communication. Many people are making their transactions online, and even bulk mail is moving away from paper thanks to the growth of spam. Less paper also characterizes what Romano terms the "digital generation"--young people who maintain a constant Internet connection and rarely read printed publications. Romano believes wireless devices will drive the move toward a less paper-reliant office: Offices will use such technology to do business rather than pay rising postage costs for snail mail, and this trend will lead to a fall in handheld computer prices. Romano estimated that a $1,500 laptop will cost $300 by 2015, while the price of tablet PCs, personal digital assistants (PDAs), electronic book readers, pocket TVs, and Web-enabled cell phone/e-book readers will have declined by a factor of 50. "I predict that someday Time magazine will give you a PDA for free, in exchange for a four-year subscription," Romano declared.
    Click Here to View Full Article

  • "Passwords Multiply as Users' Rage Rises"
    Baltimore Sun (09/07/03) P. 1A; Dang, Dan Thanh

    As more online services and digital products require passwords for security, users are beginning to balk. Password policies often stipulate the complexity of a password--a minimum number of characters interspersed with numbers and special symbols, and no dictionary words. For example, passwords composed of six single-case letters have roughly 308 million possible combinations, while eight-character passwords drawn from the full set of upper- and lowercase letters, digits, and special characters have about 6,095 trillion possible combinations (a short calculation after this item reproduces both figures). But with an increasing number of such passwords to remember, many people have actually compromised security by writing down their passwords or reusing the same password for different logins. Princeton enterprise infrastructure services director Dan Oberst decided to replace users' multiple system passwords with a single password that adheres to strict rules increasing its complexity, and he says the method has reduced the number of help desk calls. University of California at Irvine Center for the Neurobiology of Learning and Memory director James L. McGaugh says the problem with numerous computer passwords is not human memory capacity, which he says is virtually unlimited, but rather interference that confuses people's memory and causes them to forget. Password Crackers President Bob Weiss offers his company's services to people who have forgotten or otherwise cannot find the passwords to files belonging to them, and he says many passwords are easily guessed. Software programs can crack simpler passwords in just seconds, while new biometric security technology promises to make access control both more rigorous and simpler for users.
    Click Here to View Full Article
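
    The keyspace figures above are straightforward exponentiation: character-set size raised to the password length. The sketch below reproduces both numbers; note that matching the 6,095 trillion figure requires the full 94-character printable ASCII set (letters, digits, and punctuation), an inference from the number itself rather than something the article spells out.

```python
# Reproduce the keyspace figures quoted above.
import string

six_lowercase = 26 ** 6   # six letters, one case only
print(f"6 single-case letters: {six_lowercase:,} combinations")   # 308,915,776

# 94 printable ASCII characters: 52 letters + 10 digits + 32 punctuation marks
charset_size = len(string.ascii_letters + string.digits + string.punctuation)
eight_mixed = charset_size ** 8
print(f"8 chars from a {charset_size}-character set: {eight_mixed:,} combinations")
# 6,095,689,385,410,816  (~6,095 trillion)
```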

  • "Software Patents: A Clicking Bomb"
    Economist (09/06/03) Vol. 368, No. 8340, P. 55

    Controversy is brewing in Europe over a proposed pan-European directive covering patents on software and Internet business methods. Supporters of such patents claim that they set up an incentive system that fosters innovation, while opponents contend that they actually impede innovation by allowing major companies to drive smaller competitors out of business. The directive seeks to establish a unified system giving computer-implemented inventions the same legal status throughout the European Union while avoiding the drawbacks of the American patent system, such as a greater willingness, grounded in case law, to issue patents for business methods and algorithms. According to the directive, an invention can qualify for a patent only if it makes "a contribution to the state of the art in a technical field which is not obvious to a person skilled in the art." Though this criterion satisfies the larger members of the Business Software Alliance, smaller software companies and open-source lobby groups are incensed and are demanding that the directive be clarified to remove its potential for allowing American-style patents. The European Parliament's Arlene McCarthy has suggested a patentability test in which an invention must demonstrate a new way to employ "controllable forces of nature" and possess an "industrial application." The criticism that has erupted in the wake of the directive's proposal has forced the Parliament to postpone the first reading of the directive several times.
    Click Here to View Full Article

  • "The Once and Future IT"
    Computerworld (09/08/03) Vol. 31, No. 42, P. 28; Hamblen, Matt

    Autonomic software promises to significantly boost labor efficiency and save users money by automating and speeding up routine IT tasks such as remedial operations. Forrester Research analyst Laura Koetzle projects that average server utilization rates could climb from 20% to 80% through the deployment of autonomics (a quick calculation after this item shows what that implies for server counts). The technology's cost-effectiveness stems from its potential to replace expensive IT employees currently handling prosaic jobs, but Koetzle does not foresee a fall-off in IT positions--rather, she expects skilled workers to be reassigned to development and planning operations. "My concern in 10 years is there won't be enough people with knowledge of the automated process, and that that could lead to downtime," says Bayer HealthCare senior process engineer John Freeman. Gartner anticipates the emergence of 22 autonomics-related technologies over the next 10 years. Automatic high-availability/fail-over capability, network load balancing, resource chargeback to user groups, hardware partitioning, and in-house massively parallel processing (MPP) grids are already available or will be in less than two years. Autonomic computing technologies that Gartner expects within two to five years include self-healing hardware and software, MPP grids external to organizations, IT service provisioning, and root-cause discovery and correction. The research firm believes general-purpose grid computing, service billing, service governing, and service policy managing systems will arrive between 2008 and 2013.
    Click Here to View Full Article
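
    A rough sketch of what the projected utilization jump means for hardware counts, under the simplifying assumption that total sustained load and per-server capacity stay fixed (an assumption for illustration, not a claim from the article):

```python
# What raising average utilization from 20% to 80% implies for server
# counts, assuming total sustained load and per-server capacity are fixed.
total_load = 100.0                          # arbitrary units of sustained work
util_before, util_after = 0.20, 0.80

servers_before = total_load / util_before   # servers needed at 20% utilization
servers_after = total_load / util_after     # servers needed at 80% utilization

print(f"{servers_before:.0f} -> {servers_after:.0f} servers "
      f"(a {servers_before / servers_after:.0f}x reduction)")
```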

  • "Is Small the Next Big Thing?"
    U.S. News & World Report (09/08/03) Vol. 135, No. 7, P. 29; Pethokoukis, James M.

    Tying a company or product to nanotechnology may raise its esteem, but the sensible approach for investors, government officials, and venture capitalists is to look beyond the hype and carefully study the technology's actual value. Current commercial nanotech applications--stain-resistant pants from Eddie Bauer and more durable tennis balls, for example--may be useful, but they are not exactly revolutionary. Industry is more excited about nanotech breakthroughs that could yield new chip fabrication techniques or cheap solar cells, while National Nanotechnology Initiative director Mihail Roco foresees microscopic drug delivery systems and increased life expectancy. But Lux Capital analyst Josh Wolfe advises startups to avoid sporting the "nano" prefix, given that few of the companies that boast it are truly involved with nanotech. He believes the largest beneficiaries of the enthusiasm surrounding nanotech will be companies such as Veeco Instruments and FEI, which offer products used to manufacture nanoscale materials. An oft-cited 2001 National Science Foundation study predicts a $1 trillion nanotech market within 10 years, but Freedonia Group analyst Mike Richardson says a more realistic, albeit still speculative, estimate is $35 billion in 2020. Draper Fisher Jurvetson's Steve Jurvetson thinks truly revolutionary nanotech developments could be ushered in through a federal program similar to the 1960s initiative to send men to the moon.
    Click Here to View Full Article

  • "Morphing the Mold"
    Software Development (08/03) Vol. 11, No. 8, P. 38; Poppendieck, Mary

    The iterative concurrent development approach that has worked so well for Japanese auto makers can be extended to software development, in which design decisions are held back as long as possible in order to maximize the design's robustness and flexibility. Sequential development, on the other hand, focuses on freezing design decisions at the outset, which can lead to design brittleness, a lack of product adaptability, and the increasing likelihood of making a major error in critical structural decisions. Over 50% of software system development occurs after production or shipment has begun, and the impetus for following a sequential approach has been the contention that the cost of fixing software problems escalates dramatically after delivery. Until the 1990s, there was little consideration that the high escalation ratio was directly attributable to the sequential process itself. Delaying design decisions to what the Lean Construction Institute terms the last responsible moment cuts down on the number of high-stakes limitations, boosts the chances of making the right high-stakes decisions, reduces the need for change by deferring most of the decisions, and significantly lowers the cost-escalation variable for most changes. Making decisions at the last responsible moment is a challenging proposition that incorporates a number of special strategies, including the sharing of incomplete design information; organization of a direct, worker-to-worker collaborative scheme; development of a sense for critically valuable domain features and well-timed decision-making; and cultivation of a rapid response capability.

  • "What's Next for Technology Policy??"
    Issues in Science and Technology (08/03); Branscomb, Lewis M.

    Lewis Branscomb of Harvard University's John F. Kennedy School of Government writes that the formulation of the U.S. Technology Policy suggested a clear understanding by the White House of the importance of commercially promising research into high-tech innovations. Current studies indicate the government has warmed to investment in research to further economic growth: Federal policy implicitly acknowledges the employment of public-private collaborations as the favored approach to the creation of new technology, while the National Science Foundation rewards grantees in nanotechnology and other high-tech fields for scientific breakthroughs as well as the invention of prototype devices with commercial potential. Branscomb notes, "Sponsoring agencies seem to have learned how a practical, even urgent, public need, such as medical progress or understanding climate change, can best be pursued through highly creative efforts in basic scientific understanding." The Government Performance and Results Act, which requires federal research projects to document the societal benefits of their outcomes, is an indication that understanding such benefits is a critical issue on Capitol Hill. The economic downturn has resulted in a scarcity of seed capital to nurture new business opportunities, and Branscomb thinks that federal R&D policy must bridge the divide between the science and technology sector and the management and investment sector. Branscomb observes that "There has been a shift toward encouraging vigor in private markets, through support of ATP and other experimental programs, as well as support for efforts to develop promising new technological infrastructures; and there has been a growth in support for mission-justified basic research." The next step, in his view, is to develop a policy to tackle the terrorist threat.
    Click Here to View Full Article

 
                                                                             