
ACM TechNews sponsored by Parasoft    Automated Software Analysis - by Parasoft:
Error prevention for unit testing, coding standards analysis,
functional/performance testing, monitoring and reporting.
http://www.parasoft.com/products


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Parasoft or ACM. To send comments, please write to [email protected].
Volume 6, Issue 725:  Wednesday, December 1, 2004

  • "Anything But IT"
    Computerworld (11/29/04); Verton, Dan

    Job market uncertainties are fueling a dearth of interest in computer science among students and a decline in the number of advanced IT degree holders, which could threaten U.S. leadership in the world IT market, experts warn. Many students intend to pursue business rather than technical degrees in graduate school because they believe business expertise will reduce their chances of getting outsourced. Another factor is a disconnect between the IT skills schools provide and the skills required by the marketplace, according to Purdue University Department of Computer and Information Sciences Chairman Mathew Palakal. "A traditional computer science curriculum prepares students for academic pursuit and not necessarily for the business world," he explains. However, demand for college graduates remains low and only recently has hiring begun to pick up once more, reports Kendall Placement Group President Mike Kendall. Some firms acknowledge the gap between IT education and workforce needs, and are launching programs to bring new hires up to speed; one such company is ThoughtWorks, which sends entry-level employees to a three-month overseas boot camp. "Our people must be a cultural fit as well as being extremely savvy technical individuals," says ThoughtWorks U.S. senior recruiter Sonia Muhaimeen. Outsourcing is also pressuring students to seek advanced degrees--Dartmouth College computer science and engineering student Katherine Farmer expects domestic demand for high-level design and development skills to rise as more low-level jobs are offshored.
    Click Here to View Full Article

  • "Eyeing the Future of Ubiquitous Computing"
    IST Results (12/01/04)

    Ubiquitous computing hinges on the development of advanced yet unobtrusive sensor networks, and tackling the challenges inherent in this task is the goal of the IST program-funded EYES project, which is due to wrap up in February. Such challenges include enabling energy efficiency and reconfigurable architecture in sensor nodes. EYES project coordinator Paul Havinga of the University of Twente says the nodes must consume the least possible amount of battery power in order to sustain operations that could last anywhere from six months to several years, and the EYES partners' solution was to create protocols that deactivate unused nodes but allow them to reactivate in response to anomalous events or changes in what they are monitoring. Another key factor is the sensors' adaptability to fluctuations in the operating environment and their ability to maintain the network even if one or more nodes become defective or the number of nodes in a specific location changes. The potential everyday applications of the EYES protocols and architecture were demonstrated by a system of sensor nodes affixed to dairy cows that gauges the animals' health by monitoring their location and movement. "We chose the application area because it demonstrates the many possibilities for sensor networks, given that in this instance the sensors had to be dynamic, mobile and robust, they had to provide location information and had to constantly adapt to their environment," notes Havinga. Also under development is a prototype sensor network designed for generic reconfigurability and adaptability, which will be spotlighted at the IST 2004 conference in The Hague. Havinga expects increasing use of small wireless networks in the short term, and the wide proliferation of sensor networks within 15 years.
    Click Here to View Full Article
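
    The energy-saving behavior Havinga describes amounts to duty-cycling: a node sleeps by default, wakes on a timer, and powers its radio only when a reading departs from what it expects. The Python sketch below illustrates that idea with invented thresholds and stub sensor/radio functions; none of the names or values come from the EYES project.

      import random
      import time

      WAKE_INTERVAL_S = 30.0     # hypothetical periodic wake-up
      ANOMALY_THRESHOLD = 5.0    # hypothetical deviation that forces a report

      def read_sensor():
          """Stub for a physical measurement (temperature, movement, etc.)."""
          return 20.0 + random.gauss(0, 3)

      def transmit(value):
          """Stub for the radio; in a real node this dominates energy use."""
          print(f"reporting {value:.1f}")

      def node_loop(baseline=20.0):
          while True:
              time.sleep(WAKE_INTERVAL_S)                # CPU and radio idle while asleep
              value = read_sensor()
              if abs(value - baseline) > ANOMALY_THRESHOLD:
                  transmit(value)                        # power up the radio only for anomalies
              baseline = 0.9 * baseline + 0.1 * value    # slowly track normal conditions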

  • "Mutating Bots May Save Lives"
    Wired News (12/01/04); Sandhana, Lakshmi

    Researchers around the world are developing robots that can change their shape to suit unpredictable situations and perform numerous jobs. University of Pennsylvania scientist Mark Yim's robo-centipede, designed for search and rescue work, can morph into different configurations to crawl through tight spaces and look for survivors in collapsed buildings; Palo Alto Research Center's Hansel and Gretel map out unknown environments using radio and ultrasound; and a team led by the Center for Robot-Assisted Search and Rescue's Robin Murphy is working on robots capable of determining their own shapes and selecting the safest configuration for the situation at hand. The majority of these reconfigurable devices require human operators, but enabling the machines to think will allow them to function independently, although roboticists are divided on the best way to achieve this breakthrough: "There are people who are trying to develop systems which have entirely distributed intelligence--all the intelligence is distributed over every module," notes Yim. "There's another camp which says that you need to have some kind of brain or some kind of hierarchy." The hierarchical model also needs software that can internally identify, represent, and comprehend the machine's environment and its potential interactions with it; carry out self-analysis and self-repair when necessary; and recognize the type of form circumstances call for and morph into that shape. Current demonstrations have mostly concentrated on reconfiguring locomotion, such as transforming from a multilegged mode to a snake-like mode, while other groups have focused on machines that can reconfigure themselves into practical shapes. Existing shape-shifting robots select their configurations from a preprogrammed toolbox, but researchers are developing a way for systems to automatically examine the required job and generate the optimal shape to perform the task.
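
    The "preprogrammed toolbox" approach can be pictured as a lookup from sensed constraints to a stored configuration, as in the toy Python sketch below; the configuration names, widths, and selection rule are invented for illustration and are not taken from any of the cited projects.

      # Hypothetical configurations a modular robot already knows how to assume.
      CONFIGURATIONS = {
          "snake":     {"max_width_cm": 10, "speed": 1},
          "centipede": {"max_width_cm": 30, "speed": 2},
          "rolling":   {"max_width_cm": 60, "speed": 3},
      }

      def choose_configuration(passage_width_cm: float) -> str:
          """Pick the fastest stored shape that still fits the sensed passage."""
          fitting = [name for name, spec in CONFIGURATIONS.items()
                     if spec["max_width_cm"] <= passage_width_cm]
          if not fitting:
              return "snake"   # narrowest fallback shape
          return max(fitting, key=lambda name: CONFIGURATIONS[name]["speed"])

      print(choose_configuration(40.0))   # prints "centipede"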

  • "Wearable Computing for the Commons"
    Technology Review (12/01/04); Garfinkel, Simson

    Georgia Tech professor Thad Starner's Contextual Computing Group discovered through a study of many students and the tools they use to keep track of appointments that memory remains the simplest tool, while the employment of wearable and mobile technologies hinges on how easy the devices are to use. Starner himself is living proof of this principle: He carries a handheld Twiddler keyboard for jotting down quick notes, and has been using the device more often since he started affixing it to his ever-present shoulder bag with a strip of Velcro rather than storing it in a holster or within the bag. The professor recently published a paper concluding that speed of access is a key determinant of whether a mobile information device will be employed for occasional or routine chores, and estimated that such a device is not used if it takes more than 10 seconds to retrieve it. Starner finds the Twiddler to be superior in terms of speed, accuracy, and flexibility; a 2003 paper by him and his team notes that people can master the Twiddler faster than a QWERTY keyboard, and type more than 60 words per minute after just a few hours of practice. Starner believes the hearing-impaired, who "have adopted wireless texting as a convenient means of communication within the community," can benefit from the Twiddler by linking the keyboard to a cell phone. Another possibility, for those who find the Twiddler too cumbersome or inconvenient, is "dual-purpose speech," a voice recognition system that triggers functions in wearable computers--raising an electronic calendar to schedule an appointment, for instance--in response to words often repeated in conversation. Starner envisions a system the user can fine-tune to respond to specific words with a push-to-talk button.
    Click Here to View Full Article

  • "Conversational Engagement Tracked"
    Technology Research News (12/08/04); Patch, Kimberly

    Researchers from Palo Alto Research Center (PARC) and the University of Rochester are working on voice communication systems that respond to the way people talk and can automatically adapt voice channels on the spot by evaluating the level of conversational engagement, according to PARC scientist Paul Aoki. He also says the system could facilitate other unique computer adjustments to users: "If your computer can detect that you are deeply engaged in conversation with another person, whether on the telephone or [in] the same room...it might defer a loud announcement that you have new email, or it might set your instant messaging status to busy," Aoki explains. The system measures engagement, starting with individual speakers' prosody, which is piped into a first-level module that can identify specific emotional states through pattern recognition; a second-level module tracks emotional states as time passes as well as the states of the other speaker. Five levels of engagement can be gauged by the system. In tests with recorded conversations, the first-level module alone yielded an accuracy rate of 47 percent, compared to 20 percent resulting from random choice. Measuring second-level emotion over time resulted in a 61 percent accuracy rate, while emotion tracking of the other speaker added two more percentage points. Chen Yu, assistant professor of psychology and cognitive science at Indiana University, says that developing accurate emotional state categorization techniques that function well across different speakers was a formidable challenge. The PARC researchers are attempting to enable real-world tests by adding the conversational engagement tracking software to their existing voice communication system, and Yu thinks the approach could be practically applied in three to six years.
    Click Here to View Full Article
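
    The two-level design can be sketched as a small pipeline: a first stage maps prosodic features of each speech segment to an emotional-state label, and a second stage smooths those labels over time and across both speakers before reporting one of five engagement levels. The features, centroids, and weights in this Python sketch are placeholders, not PARC's actual model.

      from dataclasses import dataclass

      # First-level module: map a segment's prosody to a coarse emotional state.
      # Centroids of (pitch Hz, energy, syllables/sec) are invented for illustration.
      STATE_CENTROIDS = {
          "engaged":    (220.0, 0.8, 5.5),
          "neutral":    (180.0, 0.5, 4.0),
          "disengaged": (150.0, 0.2, 2.5),
      }
      STATE_SCORE = {"engaged": 1.0, "neutral": 0.5, "disengaged": 0.0}

      def classify_segment(pitch, energy, rate):
          features = (pitch, energy, rate)
          def distance(centroid):
              return sum((a - b) ** 2 for a, b in zip(features, centroid))
          return min(STATE_CENTROIDS, key=lambda s: distance(STATE_CENTROIDS[s]))

      @dataclass
      class EngagementTracker:
          """Second-level module: smooth both speakers' states over time."""
          smoothed: float = 0.5
          alpha: float = 0.3

          def update(self, state_a, state_b):
              combined = (STATE_SCORE[state_a] + STATE_SCORE[state_b]) / 2
              self.smoothed = (1 - self.alpha) * self.smoothed + self.alpha * combined
              return 1 + round(self.smoothed * 4)   # one of five engagement levels

      tracker = EngagementTracker()
      a = classify_segment(pitch=230, energy=0.9, rate=5.8)
      b = classify_segment(pitch=160, energy=0.3, rate=3.0)
      print(a, b, tracker.update(a, b))   # e.g. "engaged disengaged 3"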

  • "Blazing a New Data Speed Record"
    CNet (11/30/04); Kanellos, Michael

    A team of researchers from the California Institute of Technology, CERN, Fermilab, England's University of Manchester, and elsewhere won this year's Supercomputing Bandwidth Challenge by achieving a sustained data transfer rate of 101 Gbps--equivalent to transmitting the full contents of the Library of Congress in a quarter of an hour--between Pittsburgh and Los Angeles. Caltech physics professor and team leader Harvey Newman wrote in an email that significantly higher throughputs are possible and could be sustained for hours, rather than minutes, in subsequent tests. The experiment's main objective is to develop technology that will allow globally distributed physicists to collaborate on massive, data-intensive projects using far-flung computers, while the fields of bioinformatics, astronomy, global-climate modeling, and geosciences also stand to benefit. The latest speed record outpaces the same group's throughput of 23.2 Gbps, which won last year's Bandwidth Challenge, as well as the sum of all the other marks in the previous two years. Caltech professor Steven Low's Fast TCP protocol, which offers better congestion control than standard TCP, is partly responsible for the higher throughput. Also contributing to the speed record was a fortified hardware infrastructure that included several 10 Gbps connections, four dedicated wavelengths of the all-optical National LambdaRail academic network, Web services software, and other diverse technologies. Newman said the experiment shows that "There are...profound implications for how we could integrate information sharing and on-demand audiovisual collaboration in our daily lives, with a scale and quality previously unimaginable."
    Click Here to View Full Article
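
    Standard TCP backs off only after packets are lost; Fast TCP instead adjusts its congestion window from measured queueing delay, steering toward a point where roughly a fixed number of the flow's packets (the alpha parameter) sit in buffers along the path. A simplified Python sketch of that window update, with parameter values chosen only for illustration:

      def fast_window_update(w, base_rtt, rtt, alpha=200.0, gamma=0.5):
          """One Fast TCP-style congestion window update (simplified).

          w        -- current window in packets
          base_rtt -- smallest observed RTT (propagation delay estimate)
          rtt      -- current RTT, including queueing delay
          alpha    -- target number of this flow's packets queued in the network
          gamma    -- smoothing factor
          """
          target = (base_rtt / rtt) * w + alpha
          return min(2 * w, (1 - gamma) * w + gamma * target)

      # Toy trace: as queueing delay grows, window growth slows and levels off.
      w, base_rtt = 100.0, 0.050
      for rtt in (0.050, 0.055, 0.060, 0.065, 0.065):
          w = fast_window_update(w, base_rtt, rtt)
          print(round(w, 1))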

  • "Avoiding Identity Crises"
    Financial Times-IT Review (12/01/04) P. 5; Perkin, Julian

    The need for government transparency and data-sharing is driving adoption of information standards such as handles and Digital Object Identifiers (DOIs). Handles are persistent identifiers, a sort of industrial-strength Web link, while DOIs build on the Handle System to identify the content itself rather than just its location. Organizations such as the European Union, the Organization for Economic Cooperation and Development (OECD), and national governments are investing in both technologies to strengthen their data-sharing capabilities and make sure their digital content investments are protected. In government operations, especially, it is critical to have identifiers that recognize unique information in a way that is universally understood and not subject to change, says Robin Wilson of The Stationery Office (TSO), which provides publishing services to the U.K. government but operates as a private company. Standard information identifiers enable interoperability between government systems and also help meet legislative requirements. Britain's TSO and the Office for Official Publications of the European Communities in Luxembourg are both DOI registration agencies and distribute identifiers free of charge. DOI accommodates numbering systems such as ISBN/ISSN and is flexible enough to allow changes of ownership. The OECD uses TSO-distributed DOIs and plans to use them with its electronic charts: Users who want to view the chart information in finer granularity can click through the DOI link and receive more information. OECD Publishing also plans a Stat Link service, employing DOIs, that will allow users to overlay data from different charts and spreadsheets in order to enhance analysis.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)
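
    In practice, a DOI is resolved much like a handle: a resolver looks up the identifier and redirects to wherever the content currently lives, which is what keeps the link stable when locations or owners change. A small Python sketch of following that redirect through the public DOI proxy; the identifier used is the one commonly cited for the DOI Foundation's own handbook.

      import urllib.request

      def resolve_doi(doi: str) -> str:
          """Return the URL that a DOI currently resolves to via the public proxy."""
          request = urllib.request.Request(
              f"https://doi.org/{doi}",
              method="HEAD",                     # only the redirect target is needed
              headers={"User-Agent": "doi-example/0.1"},
          )
          with urllib.request.urlopen(request) as response:
              return response.geturl()           # final URL after redirects

      # Example identifier: the DOI Handbook's own DOI.
      print(resolve_doi("10.1000/182"))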

  • "A Few Steps Forward, a Few Back"
    Government Computer News (11/22/04) Vol. 23, No. 33; Jackson, William

    The government's cybersecurity practices and infrastructure will take years if not decades to update, due in part to entrenched bureaucracy, says former cybersecurity czar Amit Yoran. "You have to be realistic about how long it is going to take to produce a more secure infrastructure," he says. During his tenure, Yoran guided the establishment of a startup capability and successfully identified 5,700 network blocks in an attempt to map government IP addresses. A project identifying the exposure of each network block is ongoing. Despite limited successes the federal government has fallen far short of the revolutionary effort needed to fix the situation, Yoran says. Much of the problem lies in ignorance about the true state of IT security in the federal government. Another problem is the review process for discovering lapses in security. Vendors are undergoing inadequate, multimillion-dollar reviews that only serve to validate original vendor claims, and Yoran calls the process too time-consuming and too dependent on paperwork. In order to make faster progress, the new government cybersecurity head will have to effectively maneuver through bureaucracy. Yoran says the task of securing cyberspace is daunting, and requires both short-term and long-term approaches. Long-term solutions include research into new software development tools that automate the quality assurance development process; the goal is to end the cycle of software patches to fix new vulnerabilities. Yoran says, "DHS cannot by itself change the way software is developed, but we can encourage and help in the adoption of some of the new tools." Yoran also says the Common Criteria evaluation process for software testing and certification is only a slight improvement over the previous Orange Book standards it replaced and "doesn't encourage government to work with industry" in the development of reliable software.
    Click Here to View Full Article

  • "ICANN to Tackle Internet Policy"
    ITWeb (South Africa) (11/26/04); Vecchiatto, Paul

    When ICANN meets this week in Cape Town, South Africa, for its third meeting this year, delegates will publicly recognize the organization's newly approved strategic plan for the next three years as well as the establishment of the Regional Internet Registry for Africa (AfriNIC) and the re-delegation of the .za country-code domain for South Africa. The strategic plan is an attempt to initiate various reforms within ICANN, which acts as administrator for the Internet's Domain Name System under a memorandum of understanding from the U.S. Department of Commerce. ICANN has been criticized by some, mainly governments, for its "control" of the Internet, and the United Nations has proposed sharing some of the responsibility for such areas as domain name administration and policy development through its International Telecommunication Union. Several forums scheduled to take place during the Cape Town meeting will address these and other issues pertaining to ICANN's policies and processes. At ICANN's July meeting in Kuala Lumpur, Malaysia, Chairman Vinton Cerf stated that the "UN was off track in its discussions on whether government officials should set Internet policy." However, during that same meeting Cerf did indicate that there was a role for the UN to play. "What we do need and do not have are parts of the UN to look at issues such as electronic commerce, the question of digital signatures, tax, fraud, and enforcement," Cerf said.
    Click Here to View Full Article

  • "Searching Smarter, Not Harder"
    Wired News (11/30/04); Gartner, John

    The federal government and other entities are indexing their data with topic maps in order to boost the relevancy of database search queries. Patrick Durusau with the Organization for the Advancement of Structured Information Standards explains that topic maps can assign context to terms with more than one meaning; "The payoff [of topic maps] from the user standpoint is that you are no longer confronted with everything in the world that is known about the subject," he notes. Topic maps classify terms based on their relationships with other things; InfoLoom President Michel Biezunski says the maps are constructed using a combination of computer automation and human intervention. The data is organized into a precursory map with an artificial intelligence application, while relationships between terms are defined by human experts. Biezunski was a consultant on the first topic map deployment, which was undertaken by the IRS to index tax forms and improve agency representatives' handling of phone calls, create the small-business CD sent to taxpayers, and compare the agency's data with the Social Security Administration's. Other topic map projects are being implemented by U.S. Defense Department agencies, and Biezunski thinks the legal and pharmaceutical sectors will probably be the next industries to embrace topic maps. Innodata Isogen President George Kondrach notes that intelligence agencies are working with topic maps to address regional variations in spelling so that linkage to data about suspected terrorists, for example, will not be blocked due to different spellings of their names. LexisNexis software engineer Eric Freese, who contributed to the development of the World Wide Web Consortium standard for building XML documents that can be embedded in topic maps, reports that interest in topic maps is high in Europe, where work is underway to devise commercial applications.
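
    The disambiguation Durusau describes falls out of the topic map model itself: a term is not a bare string but a typed topic, and named associations between topics carry the context a search can filter on. A tiny Python sketch of that structure; the subjects and association types are invented examples rather than data from any deployment mentioned here.

      # Minimal topic-map-like structures: typed topics plus named associations.
      topics = {
          "jaguar-animal":   {"name": "Jaguar", "type": "species"},
          "jaguar-carmaker": {"name": "Jaguar", "type": "company"},
          "big-cats":        {"name": "Big cats", "type": "category"},
          "automobiles":     {"name": "Automobiles", "type": "category"},
      }

      associations = [
          {"type": "belongs-to", "member": "jaguar-animal",   "category": "big-cats"},
          {"type": "belongs-to", "member": "jaguar-carmaker", "category": "automobiles"},
      ]

      def find(name, context):
          """Return only topics whose associations match the requested context."""
          return [tid for tid, topic in topics.items()
                  if topic["name"] == name
                  and any(a["member"] == tid and a["category"] == context
                          for a in associations)]

      print(find("Jaguar", context="automobiles"))   # ['jaguar-carmaker']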

  • "Privacy's Random Answer"
    CNet (11/24/04); Kanellos, Michael

    Michael Kanellos is impressed by IBM's work with data randomization as a possible solution to the increasingly contentious issue of consumer privacy, which is fueled by consumers' growing outrage with how companies and organizations collect, exchange, and distribute their personal information. The idea is that data randomization would use indecipherable mathematical calculations to effectively scramble consumer data such as age, income, past purchases, or medical information while allowing back-end systems to discern patterns within the customer base. The randomization system uses Bayesian probability to determine the relationship between different values, so that consumers do not have to falsify their data, which is randomized before being transmitted to the corporate server. The back-end computer attempts to ascertain the randomizing calculations employed to conceal the original values, so that accurate customer base simulations can be extrapolated. "I think the key insight was that you don't have to have access to precise information to build good models," explains IBM senior fellow Rakesh Agrawal, who is directing the data randomization project. In several trials, there was a mere 2 percent to 3 percent difference between the curve plotted by the original data and the reconstructed curve. Among the areas Agrawal believes could benefit from data randomization technology is hospital services, which could provide records about disease epidemics without fear of litigation. In addition, large businesses could share their data without revealing customer lists, while network security could be bolstered.
    Click Here to View Full Article
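
    The randomization Agrawal describes perturbs each value with noise from a known distribution before it leaves the client, and the server reconstructs only the aggregate distribution, never the individual records. A rough Python/NumPy sketch of the idea using additive uniform noise and an iterative Bayesian re-estimate of an age histogram; the bin width, noise range, and iteration count are arbitrary choices for illustration, not IBM's parameters.

      import numpy as np

      rng = np.random.default_rng(0)

      # Client side: true ages are never sent; each is perturbed with known noise.
      true_ages = rng.normal(40, 10, size=5000).clip(18, 80)
      randomized = true_ages + rng.uniform(-20, 20, size=true_ages.size)

      # Server side: recover the *distribution* of ages, not individual values.
      bins = np.arange(0, 101, 5)
      centers = (bins[:-1] + bins[1:]) / 2

      def noise_density(delta):
          """Density of the known uniform(-20, 20) noise."""
          return np.where(np.abs(delta) <= 20, 1 / 40.0, 0.0)

      estimate = np.full(centers.size, 1.0 / centers.size)   # flat starting guess
      for _ in range(50):
          likelihood = noise_density(randomized[:, None] - centers[None, :])
          posterior = likelihood * estimate[None, :]
          posterior /= posterior.sum(axis=1, keepdims=True)   # Bayes rule per record
          estimate = posterior.mean(axis=0)                   # updated age distribution

      for center, share in zip(centers, estimate):
          if share > 0.02:
              print(f"age ~{center:.0f}: {share:.1%}")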

  • "Increasing Public Sector Efficiency Through Knowledge Management"
    IST Results (11/24/04)

    European researchers have successfully tested a collaborative knowledge-sharing tool for the public sector, under the European Commission's Information Society Technologies program. The KIWI platform combines mobile computing with collaborative group software to overcome the traditional geographic and organizational barriers that prevent government employees from working more efficiently. A KIWI test in Milan, Italy, teamed 45 regional immigration workers, police, and the Interior Ministry to jointly implement a new family reunification law that required processing thousands of petitions from immigrants; KIWI enabled the three government agencies to share a single workflow process and allowed immigration workers to process requests faster, resulting in both cost savings and increased quality of service. In Finland, several municipalities provided 30 home care workers with KIWI-loaded devices so they could remotely access support functions and provide better care to 130 elderly people receiving public assistance. The remotely accessed applications included tools for more accurate medical diagnosis as well as care management tools. The KIWI platform consists of a Web-services knowledge warehouse that manages necessary information and user profiles, and a knowledge-management toolset that facilitates collaboration between government workers, including messaging functions, group networking, and community services; KIWI is also a multimedia platform and supports images, video, and audio. The public sector generally lags the private sector in adoption of knowledge management technology, but KIWI provides a platform that addresses the organization and processes of government agencies. Other European and Asian governments are inquiring about the platform.
    Click Here to View Full Article

  • "A Robot for the Masses"
    New York Times Magazine (11/28/04) P. 94; Goldman, Francisco

    WowWee's Robosapien, touted by creator Mark Tilden as "the first real mass-marketed humanoid robot," may be representative of the current state of robot technology, in that the most significant advances are coming from the consumer entertainment sector rather than academic or government laboratories. The battery-powered, remote-controlled Robosapien can perform simple tasks such as picking up and carrying objects or guarding closets, although its performance depends on the operator's skill. Tilden opted to design Robosapien without any digital processing, using analog circuits that generate continuous waves to drive simple components imitating the pendulum-like motion of animal locomotion. As Tilden recounted in a 2000 interview: "We started out years ago on the assumption that rather than try to evolve a robot from some sort of intelligence, which has always seemed to have failed, we would evolve robots by themselves, starting out with the very simplest of devices and then working our way up, to see if we could eventually try and build humanoid-like devices that would interact with us on our own particular level." Combining the robot's ability to carry out mundane chores and act as a sort of companion with an affordable price of $99 makes the device marketable across many age levels. However, Robosapien could not be mass-produced until a digital equivalent to Tilden's analog controllers was developed through an intensive brainstorming session between Tilden and engineers at a manufacturing plant in China. The key to Robosapien's appeal is its "play depth," the degree to which users can explore the device's functions once they have mastered the basic operations covered in the manual. Tilden also noted that Robosapien was designed with hacking in mind, to encourage users to experiment with the machine and broaden its capabilities.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Managing Complexity"
    Economist (11/27/04) Vol. 373, No. 8403, P. 71

    A 2003 study by the National Institute of Standards and Technology estimates that software errors account for $59.5 billion in yearly losses to the U.S. economy, and these errors stem from the inability of software development applications to keep pace with the millions of lines of code needed for increasingly complex software projects. Reliable tools that can handle both the massive code volume and changing project requirements must be created; driving the evolution of software development is a growing awareness of the need to concentrate harder on software lifecycles, a move toward automated software testing, and the advent of open-source code, all of which converge in the "agile" programming movement. The software lifecycle traditionally follows a sequential "waterfall" model, but IBM software development director John Swainson considers an "iterative" model, in which the organization continually cycles through the developmental steps, to be more sensible. Upcoming products from Borland, IBM, and Microsoft embrace the iterative model, although a paper by Penn State researchers in the February issue of ACM Queue argues that the waterfall paradigm is still more commonly followed in practice. Software testing falls into two categories, unit testing and more complex functional testing, but Mercury Interactive estimates that roughly 60 percent of functional testing can be automated, enabling developers to implement far more test cases in a shorter amount of time than they could manually, facilitate reusable tests (a critical advantage when software is revised), and ease the switch from a waterfall model to an iterative model. The open-source ethos is being embraced by numerous companies, some of which use more restrictive licensing terms than others. Agile programming's central tenet is that programmers must interact frequently with each other as well as the business people who set project requirements, and coupling this practice to the rapid delivery schedules agile programmers aim for accelerates the iterative development cycle.
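
    The reusability point is easiest to see with a concrete automated test: once written, a test like the Python sketch below runs unchanged against every revision of the code it covers, so regressions surface immediately. The function under test is an invented example.

      import unittest

      def monthly_payment(principal, annual_rate, months):
          """Invented function under test: fixed-rate loan payment."""
          if annual_rate == 0:
              return principal / months
          r = annual_rate / 12
          return principal * r / (1 - (1 + r) ** -months)

      class MonthlyPaymentTest(unittest.TestCase):
          def test_zero_interest_spreads_principal_evenly(self):
              self.assertAlmostEqual(monthly_payment(1200, 0.0, 12), 100.0)

          def test_positive_interest_costs_more_than_principal(self):
              self.assertGreater(monthly_payment(1200, 0.06, 12) * 12, 1200)

      if __name__ == "__main__":
          unittest.main()   # rerun automatically as part of each build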

  • "Outgoing IETF Chair Reflects, Looks Ahead"
    Network World (11/22/04) Vol. 21, No. 47, P. 14; Marsan, Carolyn Duffy

    Cisco Fellow Harald Alvestrand says his four-year tenure as Internet Engineering Task Force (IETF) chairman was marked by administrative restructuring to ease workloads and the development of Session Initiation Protocol (SIP) and iSCSI specifications. When Alvestrand took his job in 2000, there were 2,900 people in attendance at the December meeting, compared to just 1,300 attendees at the most recent meeting in November this year. Alvestrand says the expectation of unrealistic growth is gone and this has allowed the IETF to focus on evolutionary technological advances, such as iSCSI; still, the IETF has faced an increasingly heavy workload and pressure to serve market forces. Alvestrand says an agreement with the Internet Society to assume administrative duties such as contracting will ease the burden for incoming leaders and allow them to focus on protocol work; several high-profile members of the IETF's leadership resigned under Alvestrand, citing overwork. The IETF has not been able to address the issue of spam concretely because it is not a problem that is solved with a single technical solution, as was explained at a recent Federal Trade Commission hearing in November, though Alvestrand suggests that IETF working groups addressing the spam issue could reemerge if there is enough interest. In the last four years, the scaling challenge has been lessened by the advent of much faster routers, but mobility and secure out-of-the-box device configurations are still needed. There are 600 million people on the Internet today, but the aim is to drive further adoption by making the Internet more mobile. Alvestrand points out that even IETF standardization is of little use when it does not serve the purposes of vendors, as evidenced by the incompatibility of the Jabber instant messaging software used at IETF meetings with MSN instant messaging software.
    Click Here to View Full Article

  • "Linux Unchained"
    Computerworld (11/22/04) Vol. 32, No. 47, P. 37; Pratt, Mary K.

    The cost savings promised by Linux are causing demands for enterprise Linux deployments to surge, but the savings could be offset by the scarcity of Linux expertise, which allows skilled Linux administrators to command a premium. Skills cited by Linux specialists and proponents as critical to Linux system work include programming, documentation, file editing, source code modification, and management, though Worcester Polytechnic Institute professor Michael Ciaraldi reports that finding people with expertise in networking and graphics is difficult; he also notes that Linux experts must be willing to seek help from others, using resources such as Web sites and user groups. Analyst Laura DiDio says, "What you're basically looking for is that eclectic network administrator or software developer from circa 1988--someone who knows lots of different things." Some contend that a dearth of in-house staff skilled in Linux is the reason why companies are so hesitant to embrace Linux, while others say IT departments can find the needed talent with little hardship or extra pay--Ciaraldi, for one, thinks personnel experienced with other operating systems will have little difficulty transitioning to Linux. However, Linux presents challenges for companies that prefer versatile IT staffs skilled in both Linux and Windows. Furthermore, experts assert that companies desire people with business experience in addition to Linux skills. Though some firms are trying to deal with the shortfall of skilled Linux workers by turning to vendors, DiDio cautions that there are not enough tech support personnel among even the major Linux distributors to satisfy demand. Another approach Linux-minded companies are taking is to opt for Linux training and certification from vendors, as well as recruit new people chiefly for their Linux expertise.
    Click Here to View Full Article

  • "Smart Software"
    EE Times Supplement (11/04) P. 83; Turley, Jim

    Software, rather than hardware, has become the key differentiator for consumer electronic devices, but the success of future products depends on designers resisting the tendency toward "creeping featurism" by developing user-friendly software that delivers obvious, unambiguous features. The inherent variability of consumer devices negates the need for a universal software standard, while the growing complexity of device-to-device communication can only be addressed by software. The most likely way to solve this problem is through the adoption of standard guidelines by rival vendors. For instance, smarter software and industry standards will be critical for the support of a household entertainment network consisting of single-purpose boxes that talk to one another wirelessly. The highly successful TiVo user interface is a yardstick against which other remote controls are measured and usually found wanting, given their identically colored and shaped buttons and unreadable text; at the same time, standardization is not considered a wise move for user interfaces--at least for now--as their diversity is a primary differentiator. The complexity of consumer electronics can be effectively concealed with the right amount of software, but achieving this goal will require careful planning and additional collaboration between vendors. The expansion or rethinking of existing standards is a necessity for consumer electronics firms, although this carries the risk of making current products obsolete or incompatible.
    Click Here to View Full Article

  • "Spectrum R&D 100"
    IEEE Spectrum (11/04) Vol. 41, No. 11, P. 61; Goldstein, Harry; Hira, Ronil

    Factors likely to have a dramatic impact on research and development over the long run, according to executives and analysts participating in IEEE Spectrum's third annual R&D survey, include more and more R&D resources being devoted to software development, systems engineering, and consulting; a migration of basic research from corporate labs to government-backed academic efforts; and globalization's effect on industrial research. UC Berkeley's Henry Chesbrough argues that the last trend has significantly increased the population of available talent for hire as well as driven industrial researchers to become "knowledge brokers" who use the Internet to find potentially lucrative areas for R&D. "Those two coming together has created a global research environment where there is very little unique leadership that you could point to in any one [technical] area that isn't very closely followed by someone else in another location," remarks Intel CTO Patrick Gelsinger. Over the past 40 or so years, industry spending has overtaken federal spending for R&D, resulting in a slackening in basic research and a greater corporate reliance on universities for R&D breakthroughs. The situation is unlikely to improve soon because of promised federal R&D budget cuts, and some experts worry that financial incentives such as patent rights are encouraging academia to move further away from basic research. The survey registers the most R&D spending gains in the software and services sector, indicating a prevalence of software development and less hardware differentiation. The move to a decentralized research model in which innovations are sought outside the company is reflected in changes to corporate missions and structures, with companies hiring researchers to integrate existing products into new configurations for clients rather than committing R&D resources to improving those products.
    Click Here to View Full Article


 