
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM. To send comments, please write to [email protected].
Volume 7, Issue 766:  Wednesday, March 16, 2005

  • "Want to Increase Retention of Your Female Students?"
    Computing Research News (03/05) Vol. 17, No. 2; Werner, Linda L.; Hanks, Brian; McDowell, Charlie

    National Science Foundation-funded research conducted by a team of four University of California, Santa Cruz professors and one Fort Lewis College professor suggests that pair programming in introductory programming courses can aid the retention of female students in computer science (CS)-related majors and play a key role in closing the CS gender gap. Pair programming is a technique in which two programmers--a "driver" who enters program code and a "navigator" who looks for errors and provides advice--collaborate on a project, exchanging roles regularly. Traditional introductory programming courses generally require that students work by themselves, a situation that can give them the mistaken impression that software development is a field characterized by social isolation. Pair programming, by contrast, mirrors the collaborative way non-trivial software projects are developed in the working world. The research team introduced pair programming to introductory CS courses with over 500 students, and concluded that paired students showed greater confidence in and enthusiasm for their programming assignments than students who worked alone, and had a higher probability of finishing and passing the course. Paired students' performance on individually taken final exams matched that of solo students, and they were equally likely to pass subsequent programming courses where pair programming was not used. Moreover, the percentage of paired programmers, both male and female, who went on to declare a CS-related major one year later was significantly higher than the percentage of students who had programmed alone.

    For information on ACM's Committee on Women in Computing, visit http://www.acm.org/women

  • "Innovation: The Next Big Thing"
    Computer Weekly (03/15/05); Bradbury, Danny

    A 2004 Grand Challenges conference yielded seven computing milestones that U.K. researchers will strive to realize over the next several decades, and these milestones were recently disclosed in the British Computer Society's Grand Challenges in Computing Research report. They include science for global ubiquitous computing and scalable ubiquitous computing, both of which concern distributed systems that support expandable networks of vast numbers of pervasive sensors drawing power from their environment. BT Exact futurologist Ian Pearson says achieving this milestone will fundamentally change the telecom industry, shifting it away from charging for individual calls and toward an "all you can eat" model. Another milestone involves explorations into non-classical computation that deviate from the von Neumann model of sequential computing, examples of which include quantum computing and quantum cryptography. The dependable systems evolution milestone emphasizes verifiable systems whose security and reliability can be mathematically confirmed before they are put into operation, a challenge that will become more formidable as systems grow in complexity. An absolutely provable system, a Holy Grail of IT, may not be possible, says Eugene Spafford, chair of ACM's U.S. Public Policy Committee, especially when the human element is factored in. However, he says "there are many fields where appropriate stochastic measures let us make risk decisions." The in vivo-in silico milestone focuses on the simulation of organic systems, a key step toward bio-inspired computing that Pearson says could yield radical technologies such as self-growing computers. He envisions first-generation systems composed of neurons, and second-generation systems composed of protein clusters built from self-assembling DNA. ACM President David Patterson cautions that biology should not take precedence over IT, which he calls the "foundation of all research." He says, "We have this ability to simulate rather than perform physical experiments and that is a pretty powerful notion."

    For more information on ACM's U.S. Public Policy Committee, visit http://www.acm.org/usacm.

  • "Schneier: Secure Tokens Won't Stop Phishing"
    IDG News Service (03/15/05); Roberts, Paul

    Strict government regulation will do more for e-commerce security than technology solutions, says Counterpane Internet Security founder Bruce Schneier in an interview. Schneier's article in the April issue of Communications of the ACM argues that two-factor authentication and other end-user technology solutions will not be enough to thwart determined hackers. He says online fraud is becoming more active and immediate; multi-factor authentication is useless when Trojan programs monitor plain text and keystrokes, or when man-in-the-middle attacks dupe users into entering information on fake Web sites. Two-factor authentication is useful in some applications, such as securing internal access to company servers, but not for e-commerce. Schneier says a more effective solution to e-commerce fraud is to make banks liable for it, just as credit card companies bear most of the cost of credit card fraud. Once the credit card industry was regulated, those companies began clamping down on fraud with detection technology in their own databases instead of focusing on how customers use their cards; Schneier believes the banking industry will similarly take steps to identify and stop online fraud if its bottom line is threatened. In the battle against online fraud, absolute security is impossible because security is a continuum--the aim is to manage risk well enough that commerce can continue. Security tokens issued by U.S. Bancorp, e-Trade, and America Online will provide improved security against some e-commerce threats, but the benefits of multi-factor security will eventually diminish as hackers shift tactics, says Schneier.
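
    To see why real-time man-in-the-middle attacks defeat even time-based hardware tokens, consider the toy sketch below (our own illustration, not from Schneier's article; the secret and the code scheme are invented and are not a real OTP algorithm). A one-time code proves possession of the token, but says nothing about which site the user typed it into, so a phishing site that relays the code to the real bank within its validity window authenticates successfully.

        /* Toy time-based one-time code: NOT a real OTP algorithm. */
        #include <stdio.h>
        #include <time.h>

        #define SECRET 0x5EC12E7UL  /* hypothetical token/bank shared secret */

        /* The displayed code changes every 30 seconds, as hardware tokens do. */
        unsigned long one_time_code(time_t now)
        {
            return (SECRET ^ (unsigned long)(now / 30)) % 1000000UL;
        }

        int main(void)
        {
            time_t now = time(NULL);
            unsigned long code = one_time_code(now);  /* user reads this... */
            printf("token displays: %06lu\n", code);
            /* ...and types it into a phishing site, which forwards it to the
               real bank inside the same 30-second window. The bank's check
               passes, because the code only proves token possession: */
            printf("bank accepts relayed code: %s\n",
                   one_time_code(now) == code ? "yes" : "no");
            return 0;
        }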

  • "Key Open-Source Programming Tool Due for Overhaul"
    CNet (03/14/05); Shankland, Stephen

    The widely used GNU Compiler Collection (GCC) will receive new optimization capabilities that should boost the performance of open-source software compiled with the tool, including Linux, Firefox, OpenOffice.org, and Apache. GCC 4.0 will add optimization technologies that enable the compiler not only to optimize local portions of a program, but also to take the program's overall structure into account; using a technique called scalar replacement of aggregates, data structures spread over a large amount of source code can be broken up, allowing object components to be stored directly in on-chip memory instead of main memory, for instance. A framework called Tree SSA (static single assignment) will also be available for optimization plug-ins written later, says CodeSourcery "chief sourcer" and GCC 4.0 release manager Mark Mitchell. The upgrades should result in more efficient and effective translation of code from high-level languages such as C into the binary code computers understand, and open-source programs built with GCC should also run faster. Mitchell says GCC, part of the original GNU's Not Unix effort and first released in 1987, is becoming more professional and commercial, along with other open-source software. About 10 core programmers are now dedicated to GCC's development, while Intel, which offers its own high-quality compilers for x86 software, also contributes to GCC development because so much GCC-compiled software runs on the x86 chip platform. PathScale also offers a performance-oriented open-source compiler derived from Silicon Graphics' Open64 compiler for scientific programs. PathScale's Len Rosenthal says the goal is to make Open64, which is compatible with GCC 3.3, the default compiler for the x86 platform.
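
    As a rough sketch of what scalar replacement of aggregates (SRA) does (our own example, not from the article), consider the function below: the local struct never needs to exist in memory, so the optimizer can split it into independent scalars and keep its fields in registers.

        /* A local aggregate that SRA can decompose into two scalars. */
        #include <stdio.h>

        struct point { int x; int y; };

        int squared_distance(int ax, int ay, int bx, int by)
        {
            struct point d;       /* aggregate local variable */
            d.x = ax - bx;
            d.y = ay - by;
            /* With SRA, d.x and d.y become two independent values that can
               live in registers; the struct is never materialized in memory. */
            return d.x * d.x + d.y * d.y;
        }

        int main(void)
        {
            printf("%d\n", squared_distance(0, 0, 3, 4));  /* prints 25 */
            return 0;
        }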

  • "Crack in Computer Security Code Raises Red Flag"
    Wall Street Journal (03/15/05) P. A1; Forelle, Charles

    A flaw in the "hash function" techniques used to secure online data has been uncovered by a team of Chinese researchers at Shandong University, raising alarms in the computer security industry because it casts doubt on the supposed impenetrability of hash function-based cryptography. The researchers found the vulnerability in the SHA-1 hash algorithm, a federal standard circulated by the U.S. National Institute of Standards and Technology (NIST) that is considered cutting edge and is the most widely employed hash function. The Shandong team learned that "collisions," in which two different chunks of data yield the same hash, can be found in SHA-1 far faster than previously thought. Cryptographers say exploitation of the flaw, though seemingly impractical today, could affect applications involving authentication, theoretically enabling a hacker to erect a bogus Web site with convincing security credentials and steal data sent to it by unsuspecting users. Counterpane Internet Security CTO Bruce Schneier confirms the existence of the SHA-1 flaw, which the Chinese researchers have not publicized. NIST is advising federal agencies to keep SHA-1 out of any new applications, and urging them to devise plans for eliminating it from existing ones. Recently demonstrated vulnerabilities in other hash functions such as MD4 and MD5--on whose design SHA-1 is based--have also made cryptographers nervous. Even without revelations about hash functions' vulnerability, concerns about information security are at an all-time high, most recently thanks to break-ins at data aggregators LexisNexis and ChoicePoint.
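
    To make the notion of a "collision" concrete, the toy sketch below (our own illustration) substitutes a deliberately weak 8-bit checksum for SHA-1: because it merely sums bytes, transposing two characters yields a different message with an identical hash. The Shandong result shows that collisions in SHA-1's 160-bit output, while enormously harder to find, can also be found faster than the design intended.

        /* Toy hash: an order-insensitive byte sum, so any transposition of
           characters collides. Real functions such as SHA-1 are designed to
           make collisions computationally infeasible to find. */
        #include <stdio.h>

        unsigned char toy_hash(const char *s)
        {
            unsigned char h = 0;
            while (*s)
                h += (unsigned char)*s++;
            return h;
        }

        int main(void)
        {
            const char *a = "pay 100 dollars to Alice";
            const char *b = "pay 100 dollars to Alcie";  /* different message */
            printf("hash(a) = %u\nhash(b) = %u\ncollision: %s\n",
                   toy_hash(a), toy_hash(b),
                   toy_hash(a) == toy_hash(b) ? "yes" : "no");
            return 0;
        }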

  • "Creative Commons Is Rewriting Rules of Copyright"
    Washington Post (03/15/05) P. E1; Cha, Ariana Eunjung

    Stanford University law professor Lawrence Lessig's Creative Commons licensing scheme, which permits authors and artists to distribute their works online while retaining "some rights reserved" rather than "all rights reserved," has gained a substantial advocacy base. Lessig is concerned that creativity is being stifled because many works remain outside the public domain thanks to the extension of copyright laws, and he envisions the Creative Commons as a repository of universally accessible "artifacts of culture." Over 10 million works, including music, videos, and course materials, have thus far been released under Creative Commons licenses, and even hard-line opponents of unauthorized online file-sharing such as former Motion Picture Association of America (MPAA) President Jack Valenti have been won over. "The Internet is a machine for making copies, and artists need to come to grips with that," argues author Cory Doctorow, who has distributed several novels under Creative Commons licenses. He adds that new distribution models have always provoked howls of outrage that they threaten art, and that such fears are unfounded. MPAA Washington general counsel Fritz Attaway sees no reason why Creative Commons-licensed works and traditionally copyrighted works cannot coexist. However, he contends that major film studios and record labels already reap plenty of profit from works released under the traditional copyright system, which makes wide distribution of their content under a Creative Commons scheme highly unlikely.

  • "Programming Wizards Offer Advice at Developers Confab"
    eWeek (03/15/05); Schindler, Esther

    This week's Software Development Expo is an event where programmers can pick each other's brains about effective strategies for developing quality code, and selecting methodologies that work for developers and their teams has been an overarching theme. Ronin International consultant Scott Ambler noted that most software development projects come in behind schedule, and every project participant is aware that the schedules are unworkable; he believes a methodology such as Scrum, in which iterative development is funded in very brief time frames of no more than a month, is an effective solution, one that cuts bureaucracy and lets developers learn up front which features matter. Gerald Weinberg, a 50-year software development veteran and author of "The Psychology of Computer Programming," said developers' habit of misrepresenting the progress of projects to management has not changed in the last half century. "[Management doesn't] know what you're doing, so they reward the appearance of work," he remarked. Although the primary goal behind developers' adoption of methodologies is a desire to produce quality code that does what customers want, they are also aware of the challenges of managing projects; Sun Microsystems Java architect David Heskell said it is often mistakenly believed that methodologies that work well on one kind of system are universally applicable, an assumption negated by the reality of dynamic project needs, technical complexity, and people issues. In a panel discussion on agile methods, Google principal engineer Joshua Bloch recommended that developers pay particular attention to variable naming; his advice was to make code read as close to English as possible.
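
    A small before-and-after sketch (our own example, not Bloch's) of what "reads as close to English as possible" means in practice:

        #include <stdio.h>

        /* Before: the reader must decode f, n, a, and t from context. */
        int f(int n, const int *a)
        {
            int t = 0;
            for (int i = 0; i < n; i++)
                t += a[i];
            return t;
        }

        /* After: the declaration and each call site read almost as English. */
        int sum_of_scores(int score_count, const int *scores)
        {
            int total = 0;
            for (int i = 0; i < score_count; i++)
                total += scores[i];
            return total;
        }

        int main(void)
        {
            int scores[] = { 90, 85, 77 };
            printf("%d\n", sum_of_scores(3, scores));  /* prints 252 */
            return 0;
        }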

  • "Local Research Team Works Alongside Homeland Security"
    Daily Illini (03/15/05); DeAvila, Liz

    Ten schools, including the University of Illinois at Urbana-Champaign, are involved in the Institute for Information Infrastructure Protection (I3P) consortium's two-year, $8.5 million program to identify security flaws in the supervisory control and data acquisition (SCADA) systems that help manage America's electric power, water, and oil and gas infrastructure. University of Illinois electrical and computer engineering professor David Nicol and Information Trust Institute director Bill Sanders will participate in the effort, which is funded by the National Institute of Standards and Technology and the Homeland Security Department. I3P assistant director for research and analysis Eric Goetz says Illinois researchers "are helping identify and evaluate vulnerabilities in existing SCADA designs, modeling and testing SCADA systems in order to develop more secure technologies in the future." Nicol says he and Sanders are well versed in system assessment, and have recently focused on the evaluation of security systems. He is confident they can supply a formal mathematical foundation for determining a system's reliability under a series of threats. Nicol and Sanders will put in "people time" in the coming weeks to evaluate the data they receive from team members, and that information will inform efforts to redesign SCADA systems to be more secure against both physical and online threats. Each institution participating in the I3P program will focus on different elements of the SCADA security problem.

  • "Looking at Open Source Software Through Arabeyes"
    Islam Online (03/16/05); Noronha, Frederick

    A group of Arabic-speaking volunteers is working to create open source tools that will make computing more accessible to the hundreds of millions of people who use Arabic lettering; the Arabeyes project aims to establish itself as the nexus of Arabization efforts, which previously were initiated by overseas students and stopped when those students finished their studies. Besides Arabic, the effort will also benefit languages such as Urdu, Pashto, and Farsi, which use Arabic-derived scripts, and even Hindi. In all, volunteer Mohammed Sameer of Cairo estimates that between 225 million and 400 million people will directly benefit, and he sees the core group of about 13 daily contributors as the Arabic-speaking equivalents of Richard Stallman or Linus Torvalds. Top Arabeyes contributors include volunteers from Sudan, Algeria, Iraq, and Lebanon. So far, Arabeyes has produced translated versions of the Gnome and KDE Linux desktops, the OpenOffice.org 1 suite, and the multi-platform text editor Vim; ongoing projects include OpenOffice.org 2, the Firefox browser, and an all-new software layer called Akka that allows Arabic to be used on text-based Linux and Unix consoles, while other Arabic-specific projects include the Duali Arabic spelling checker and the Web dictionary interface QaMoose. Arabeyes.org has more than 500 registered users and focuses on programs for Linux. Sameer welcomes people who want to help with localization projects, and suggests they first think about the language's needs and look for gaps they can fill. Sameer is also a member of the Egyptian Linux user group responsible for promoting Linux in that country.

  • "Open-Source Movement Now In Hands of Hired Guns"
    Investor's Business Daily (03/15/05) P. A4; Brown, Ken Spencer

    Corporate programmers have largely supplanted volunteers as developers of core open-source software. IBM committed $1 billion to the development and promotion of the open-source Linux operating system four years ago, and has since made over 500 software patents and 30 software applications freely accessible to open-source programmers. "As Linux goes mainstream, the market gets bigger and the dollars available around the world grow, it becomes a great business opportunity," notes Open Source Development Labs CEO Stuart Cohen. Many companies devote their developers' time to improving Linux in hopes of ensuring that the OS is compatible with their hardware and software, while Cohen says some firms are gambling that rising demand for Linux will in turn lift sales of related products. Corporate involvement benefits Linux by adding industrial-grade features that volunteer programmers would have taken years to develop. Linux creator Linus Torvalds is not concerned that any one company could dominate Linux development and thereby gain a competitive advantage over rival Linux firms, because open-source development's democratic model ensures that only the best ideas prevail. In addition, improvements to the software are available to anyone under Linux's open licensing scheme. Andrew Morton, one of Torvalds' chief deputies, maintains that most programmers, even commercial ones, develop a loyalty to Linux that is stronger than corporate fealty.

  • "Sign Language"
    Government Technology (03/14/05); Newcombe, Tod

    Pennsylvania State University scientists are working on a computer that uses gestures, voice input, cognitive engineering, natural language technology, and geographic information systems (GIS) to help people pull disparate pieces of data together into a coherent whole, facilitating better decision making in crisis management. Project investigator and Penn State School of Information Sciences and Technology professor Michael McNeese says the biggest problems crisis teams currently face are information overload and their inability to organize and study spatial data using GIS. Penn State's GeoCollaborative Crisis Management (GCCM) project focuses on developing technology that allows government crisis teams to amalgamate critical data without extensive and costly GIS training, through a three-year analysis of teams' collaborative crisis response tactics. At the heart of the project is the Dialogue Assisted Visual Environment for GeoInformation (DAVE_G), a multimodal interface that interprets and carries out user commands by combining gesture and voice recognition. "This technology will make the information that's essential for crisis management more accessible to the people who need it," says Penn State geography professor Andrew MacEachren. GCCM is funded by a $400,000 Digital Government Research Program grant from the National Science Foundation. Penn State is pursuing the project with the help of federal agencies such as the Homeland Security Department, the Federal Emergency Management Agency, and the EPA, as well as state partners such as the Florida Division of Emergency Management and Pennsylvania's Department of Environmental Protection.

  • "Video Games--A Girl Thing?"
    CNet (03/15/05); Winegarner, Beth

    Sony Online Entertainment senior game designer Sheri Graner Ray is a crusader for increasing women's presence in the video game developer and player communities, and was honored with the International Game Developers Association's Community Contribution award at last week's Game Developers Conference. Ray, whose contributions include the seminal textbook "Gender Inclusive Game Design: Expanding the Market" and the volunteer organization Girls in Games, says in an interview that the game industry now acknowledges the reality of female gamers and is pursuing this market segment more actively. Ray's advocacy began with her curiosity about why most women--even those in the industry--are uninterested in video games, and she says the reasons for this lack of interest will be better understood once women make up half of all gamers as well as half of all developers. A lack of female characters or "avatars" to play is one barrier to women's participation in video games, and Ray believes this issue will be addressed. She says women want female avatars to exhibit heroic traits such as youth and strength, but without the exaggerated sexuality that is usually on display. As a consultant for other game developers and software firms, Ray has often been confronted with the question, "How do we make games for girls?," but she thinks the question that should be asked is, "How do we get more women to play our games?" Ray observes that more women play massively multiplayer games than standalone PC or console titles, partly because multiplayer games accommodate a broader range of play styles. She also notes that women often play a key role in holding multiplayer game communities together because of their tendency to internalize the game.

    To learn more about ACM's Committee on Women in Computing, visit http://www.acm.org/women.

  • "Electrical Engineer Receives $400,000 to Evaluate Maximum Capacity of Wireless Networks"
    University of Texas at Austin (03/08/05); Rische, Becky

    The National Science Foundation has awarded a $400,000 Early Career Development grant to University of Texas at Austin electrical engineer and Wireless Networking and Communications Group professor Sriram Vishwanath, who will use the money to determine the maximum capacity of wireless networks. "The problem is there is a lack of appropriate tools within any one field to tackle the problem in its entirety," remarks Vishwanath, who plans to develop a new methodology for finding the maximum amount of information a wireless network can carry by combining information theory, optimization theory, and coding theory. Vishwanath will start by modeling simple networks of multiple connected computers or phone systems, then analyze how data streams through these systems using information theory. The next step will be to ascertain the maximum information flow capacity via optimization theory and mathematical optimization models, and the final step will apply coding theory to interpreting the results and verifying real networks' maximum capacity. With this method, principles derived from assessing simple networks will be generalized to more complex ones, so that the transfer of data between network components can be easily accomplished. The resulting theory would let any network participant know their network's maximum information flow capacity, as well as the best way to maximize throughput. Vishwanath acknowledges that developing such a theory is the most formidable challenge of the project.
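
    The information-theoretic starting point for such an analysis is Shannon's capacity formula for a single point-to-point link, sketched below with hypothetical numbers (our own illustration; extending such results from one link to an arbitrary network is precisely the open problem Vishwanath describes).

        /* Shannon capacity of one link: C = B * log2(1 + SNR).
           Compile with: cc capacity.c -lm */
        #include <stdio.h>
        #include <math.h>

        int main(void)
        {
            double bandwidth_hz = 20e6;  /* hypothetical 20 MHz channel */
            double snr = 100.0;          /* signal-to-noise ratio (20 dB) */
            double capacity_bps = bandwidth_hz * log2(1.0 + snr);
            printf("max error-free rate: %.1f Mbit/s\n", capacity_bps / 1e6);
            return 0;
        }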

  • "Experts Look to Digital IDs to Boost Net Security"
    IDG News Service (03/11/05); Pruitt, Scarlet

    Experts at the CeBIT computer trade show in Germany said digital IDs, enabled by multi-factor authentication, would help restore flagging consumer confidence in e-commerce. Rampant identity theft, spyware infections, and other cyberthreats are even causing some companies to scale back their Internet activities, said RSA Security CEO Art Coviello. At Credit Suisse, customers are asking more questions about online security, a sign that cybersecurity issues are starting to undermine trust in the bank, said IT security architecture head Beat Perjes. Digital IDs would increase e-commerce security and enable other applications, such as digital passports. Individuals would be given something only they possess, possibly a security token or biometric measurement, which would have to be authenticated in addition to a password or other more standard identification component. Experts said digital IDs could be used across a number of processes, but warned against a centralized data repository that hackers could pilfer. Sun Microsystems chief technology officer Hellmuth Broda touted the Liberty Alliance Project, a coalition of more than 150 companies seeking to create a federated identity system; vendors need to coordinate their efforts if they are going to solve common problems, he said. Coviello suggested that a federated system could institute privacy firewalls so that companies could share customer data without revealing the specific identities of the individuals involved. Broda warned that no system provides perfect protection, but said such efforts could make it much more difficult for hackers to break in.

  • "IETF Leaders Urge Detente With Rivals"
    Network World (03/14/05) Vol. 22, No. 10, P. 1; Marsan, Carolyn Duffy

    At a plenary session of the Internet Engineering Task Force (IETF), recently appointed chairman Brian Carpenter promised better outreach to competing standards bodies such as the World Wide Web Consortium and the International Telecommunication Union (ITU), arguing that such cooperation is vital to maintaining the IETF's relevance as the leading Internet standards organization, especially in the face of declining participation and network industry consolidation. Restructuring the task force's financial and administrative architecture has been the primary focus of IETF leaders for the last several years, and this effort will soon reach its conclusion. Shinkuro engineer Allison Mankin, a director of the IETF's Transport Area, says the task force's seven areas are attempting to inject new life into their individual efforts by "developing strategic plans not so much for bringing in attendance but for looking to the future of our areas." Carpenter admitted that the Applications Area in particular needs to reconsider its initiative, but repudiated suggestions of discord between the IETF and other standards bodies over Web services and other emergent areas. There has been ample evidence of cooperation between the task force and other standards organizations in recent weeks: The IETF and the Open Mobile Alliance convened a March 9 meeting that participants hailed as a "breakthrough," and on March 10 the task force hosted a panel on Voice over IP developments that was transmitted as a live Webcast to the Spring 2005 VON Conference. A joint IETF-ITU workshop on next-generation Internet architecture is also planned for May.

  • "Outsourcing Innovation"
    BusinessWeek (03/21/05) No. 3925, P. 84; Engardio, Pete; Einhorn, Bruce; Kripalani, Manjeet

    Western companies are outsourcing product research and development to overseas centers, a trend underscored by a growing consensus among CEOs that more innovation is critical, but that current R&D investments are not delivering sufficient returns. This is causing corporations to reconsider their structure, and has prompted most leading Western companies to adopt an innovation model that relies on global networks of design partners. The benefits of this model include increased production speed and volume, but the flipside is the cultivation of new rivals, the erosion of brand-name companies' incentive to sustain new technology investments, and waning investor confidence; to avoid such outcomes, executives say, corporations must protect some sustainable competitive advantage. The continuing R&D outsourcing trend will likely strengthen the role of low-wage countries that churn out engineering graduates as providers of intellectual property. India's contract R&D revenues are expected to surge from $1 billion annually to $8 billion in three years with help from the country's software development sector. "A Whole New Mind" author Daniel Pink projects that Western nations such as America will retain their competitive edge through "right brain" work involving "artistry, creativity, and empathy with the customer that requires being physically close to the market." Flextronics CEO Michael Marks expects Western tech conglomerates to dramatically retool R&D; he observes that about 80 percent of the tasks product engineers currently perform can easily be moved overseas, that most core technologies in today's digital gadgets are universally available, and that the growing integration of technologies into semiconductors is simplifying circuit boards. The evidence suggests that the key to maintaining leadership in the face of R&D offshoring lies in organizing and managing global networks of innovation.

  • "Managing Next-Generation IT Infrastructure"
    McKinsey Quarterly (02/05); Kaplan, James M.; Loffler, Markus; Roberts, Roger P.

    The complexity of today's IT infrastructures can be traced to the build-to-order mindset traditional in most IT organizations, where infrastructure is custom made for each application, write McKinsey & Co.'s James M. Kaplan, Markus Loffler, and Roger P. Roberts. As the IT world shifts toward distributed computing and Web-centered and client/server architectures, organizations are gaining the ability to replace this build-to-order approach with a more "off-the-shelf" model focused on particular service requirements. Rather than specifying a particular set of hardware and a particular configuration to handle a business application, developers specify a service requirement, and infrastructure groups respond with reusable services delivered by a streamlined, automated "factory." Organizations must act on three main fronts to create a next-generation infrastructure: They must segment user demand, create standardized and "productized" reusable services, and develop the shared factories that will deliver those services. As CIOs build the new organization, the key roles are those of the product manager, who defines products and product portfolios, and the factory architect, who designs the shared processes needed to deliver them. IT leaders should concentrate on five areas to ensure that new or improved infrastructure works successfully: demand forecasting and capacity planning; funding and budgeting; product-portfolio management; release management; and supply and vendor management. An example of a successful next-generation infrastructure is that of Deutsche Telekom's fixed-network division T-Com, which outsources most IT operations to the sister company T-Systems. T-Com and T-Systems work together to manage supply and demand across applications, with T-Com paying only for the capacity it uses, which in turn pressures T-Systems to use capacity efficiently.

  • "U.S. High-Tech Economy Slipping"
    Physics Today (03/05) Vol. 58, No. 3, P. 28; Dawson, Jim

    The Task Force on the Future of American Innovation, which counts industrial, scientific, and academic groups among its members, released a set of benchmarks last month to convince Capitol Hill that "the U.S. government is falling behind in its commitment to basic physical sciences research, which is a critical part of our competitive future," according to task force industry member Doug Comer of Intel. Association of American Universities President Nick Hasselmo says the coalition's findings clearly indicate that fewer U.S. citizens are entering science and technology even as overseas competition in those fields increases. The benchmarks show that the U.S. share of science and engineering papers worldwide fell from 38 percent in 1988 to 31 percent in 2001. U.S. funding for the physical sciences has been declining for three decades, while China, Taiwan, and South Korea collectively boosted their gross R&D investments by roughly 140 percent from 1995 through 2001, compared with America's 34 percent increase. Meanwhile, far more European and Asian students than American students are earning science and engineering degrees, and between 1994 and 1998 the number of Chinese, Taiwanese, and South Korean students pursuing PhDs in their home countries dramatically overtook the number pursuing PhDs at American universities. The benchmarks also note that America is lagging behind other nations in nanotechnology, fusion energy, and nuclear power research. The benchmarks' release followed the Bush administration's scaling back of government funding for basic research, as recommended in the fiscal year 2006 budget proposal. Nobel physicist Burton Richter blamed "bipartisan shortsightedness" for the funding crisis, warning that the policy will harm America's economic leadership in the long run.

  • "Open-Source Users Find Rewards in Collaborative Development"
    Application Development Trends (02/05) Vol. 12, No. 2, P. 27; Joch, Alan

    Collaborative open-source software development is attractive to corporate IT groups--particularly those in the financial community--as a vehicle for developing cleaner code and more innovative applications. However, open-source experts caution that protective measures must be incorporated to sufficiently shield intellectual property and prevent the unwitting distribution of destructive code. Jim Herbsleb, a professor in Carnegie Mellon University's School of Computer Science and a member of its Institute for Software Research International, outlines the requirements for successful commercial open-source development: Among them are face-to-face communication between programmers, consideration of the end-user needs the software must satisfy, and an autocratic development structure in which a small group or a single individual dictates what code goes into new releases. "I've found that most successful open-source projects are run with an iron fist," Herbsleb notes. For companies, this development model promises the advantages of high-quality, easily alterable code that is free of licensing fees. Vijay Gurbani of Lucent Technologies/Bell Labs says the development process Herbsleb advocates is not wholly autocratic; customers' needs are paramount, which makes the decision of what code to include and what to leave out less a technical issue and more a management one. Despite the lack of licensing fees, open-source software entails other costs in the form of responsibilities, reports Steve Howe, Dresdner Kleinwort Wasserstein's global head of open-source initiatives. "Even when someone sends you a bug fix, you have to test to make sure there really is a bug, and, if there is, that the fix actually works," he explains.


 