Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either Thunderstone or ACM. To send comments, please write to technews@hq.acm.org.
Volume 7, Issue 791: Friday, May 13, 2005

  • "Tech Leaders Call for More Government IT Research"
    IDG News Service (05/12/05); Gross, Grant

    Despite claims from the Defense Advanced Research Projects Agency (DARPA) and the White House Office of Science and Technology Policy (OSTP) that computer science R&D spending has increased, House Science Committee Chairman Rep. Sherwood Boehlert (R-N.Y.) declared at a May 12 hearing that current budgets are not in line with "broad consensus" over long-term IT and cybersecurity research funding requirements. Akamai Technologies chief scientist Thomson Leighton and National Academy of Engineering President William Wulf said DARPA's focus has moved away from long-term IT research toward short-term projects emphasizing incremental enhancements to existing technology; DARPA director Anthony Tether countered that the agency's current emphasis on cognitive computing and artificial intelligence research should not be construed as short term. OSTP documents indicate that President Bush's 2006 budget request for the Networking and Information Technology Research and Development Program (NITRD) is slightly higher than the fiscal 2005 budget. However, the total cross-agency NITRD budget fell by about $100 million from 2005 to the 2006 request, while DARPA NITRD funding experienced a decline of almost $100 million from fiscal 2002 to the 2006 request. Tether says such figures do not account for new IT-related projects DARPA is funding outside of NITRD's aegis. Rep. Dana Rohrabacher (R-Calif.) argued for the Bush administration's short-term, results-oriented research strategy, but Wulf said the fate of results-oriented research is intertwined with that of long-term research. "If one stops filling the pipeline, the effect on industry will not be immediately visible as it drains the pipeline...but that there will be an impact is an inescapable lesson of history," he reasoned.
    For more information regarding ACM's involvement in the May 12 hearing, visit: http://campus.acm.org/public/pressroom/press_releases/5_2005/cs_research_future_5_13_2005.cfm

  • "W3C Plots Better Browsing for PDAs"
    InternetNews.com (05/11/05); Boulton, Clint

    The World Wide Web Consortium's (W3C) Mobile Web Initiative (MWI), launched on May 11 under the supervision of W3C director Tim Berners-Lee, aims to improve the Web browsing capability of mobile devices, which has been hampered by a dearth of interoperability, technology, and standards. The consortium reports that Web sites are difficult to access on the small screens of smart phones or personal digital assistants, while content providers are hard-pressed to build sites suitably accessible for Web-enabled phones because an appropriate development standard does not exist. Hewlett-Packard strategic technology director Evan Strouse says additional improvements in device usability and mobile services will be facilitated through the MWI. The initiative will be complementary to other mobile W3C projects in the open Web space, such as the development of multimodal interaction standards and mobile device profiles. The MWI involves a pair of core groups: The "best practices" group will devise guidelines and checklists Web content developers can use to ensure that content is suitable for mobile devices, while the "device description" group will log a database of descriptions that allows content authors to shape content for a specific device. "We believe the MWI will accelerate the development of rich media content services and will be a catalyst for the next generation of engaging communications experiences," Strouse says.

  • "National ID Battle Continues"
    Wired News (05/12/05); Zetter, Kim

    Despite the Senate's unanimous passage of the Real ID Act on May 10, citizens, government organizations, and even congressmen who signed off on the measure are gearing up to protest provisions authorizing the creation of a standardized national driver's license. In addition, several states are promising to ignore the legislation because of unreasonably high compliance costs and the burden it will place on motor vehicle department employees. The Real ID Act was slipped into a larger Iraq spending appropriations bill, giving senators little choice but to accept the legislation even if they opposed parts of it. The bill's backers claim the measure would prevent terrorists and undocumented immigrants from acquiring legitimate documentation that would allow them to move throughout the country unmolested; the Real ID Act was formulated in accordance with recommendations from the 9/11 Commission to tighten control over government-issued IDs. The bill would force states to produce tamper-proof driver's licenses with machine-readable, encoded data, which critics say would essentially create a national ID card and a de facto national database. Drivers could not obtain or renew their licenses without providing multiple documents of identification, which DMV workers would have to verify against federal databases, as well as store along with a digital photo of the card holder in a database. This would add up to higher costs and longer lines at the DMV, according to critics. The Electronic Privacy Information Center's Marc Rotenberg is particularly critical of the lack of debate over the driver's license provisions prior to the bill's passage by the Senate, and he says more than 600 organizations are against the legislation, including ACM.
    For background information, and to read about ACM's stand on the Real ID issue, visit: http://www.acm.org/usacm/weblog/index.php?cat=8

  • "Open-Source Divorce for Apple's Safari?"
    CNet (05/12/05); Festa, Paul

    A leading Apple browser developer has suggested abandoning the KHTML open source rendering engine for the Safari Web browser, confirming the de facto divorce between the KHTML developer community and Apple, once seen as a white knight for the project. In an email dated May 5 and sent to KHTML architects, Apple engineer Maciej Stachowiak said Apple's own WebCore KHTML version should be the standard code base; "One thing you may want to consider eventually is back-porting [WebCore] to work on top of [KDE], and merging your changes into that," he wrote. The email is an acknowledgement of how far apart the Apple and KHTML efforts have diverged since Apple announced support for the open-source project two years ago. At that time, KHTML developers believed Apple's engineers would revitalize the effort, and Apple's move was also seen as an approval of KDE's lithe browser engine at a time when Mozilla still suffered from serious code bloat. But conflicting procedures and priorities strained the relationship, such as non-disclosure agreements required when viewing Apple code, code audit requirements, and bug reports Apple did not want shared. KHTML developers say the strains eventually led to estrangement between the two groups, even though Apple has adhered to licensing rules. Not all partnerships between corporate and open-source development groups turn out badly, and analyst Gary Hein says the Apple-KHTML case is an exception rather than the rule. In fact, the Macintosh operating system is based on the Darwin open-source operating system and Netscape's Jeremy Liew says his company's return to the Mozilla project has worked out well.

  • "SIGGRAPH 2005 Selects 98 Outstanding Papers From 461 Submissions"
    Press Release (05/11/05)

    Ninety-eight papers out of 461 submissions will be presented at SIGGRAPH 2005, and SIGGRAPH 2005 Papers Chair Markus Gross from the Swiss Federal Institute of Technology says the high number of accepted papers "clearly demonstrates the large body of excellent research in computer graphics." He says the selections reflect three key research trends: The increasing computerization of reality, more sophisticated physics simulation, and advanced image and video processing methods. The first trend Gross mentions includes increasingly "data-driven" lighting and shading models, which could facilitate more photorealistic alteration and simulation of human faces. Papers illustrating the second trend cover physically based simulations of the complex interaction of media--liquids, smoke, solids, etc.--that could enhance visual effects and future video games; contributors include Stanford University, Intel, and Industrial Light & Magic. Other papers detail a resurgence of ray-tracing algorithms and architectures that has yielded prototypes that encourage graphics hardware designers to reconsider their definition of the graphics pipeline. Papers indicative of the third trend noted by Gross present advanced techniques for panoramic video stitching, rendering 3D photos, and intelligent and user-friendly video editing. Gross expects the public to rapidly adopt these techniques as tools for editing home videos. SIGGRAPH 2005 takes place July 31-August 4, 2005, in Los Angeles.
    For more information on ACM's SIGGRAPH, and to register, visit: http://www.siggraph.org/s2005/

  • "Breaking Into the Boys' Club"
    Daily Vanguard (05/13/05); Meyer, Elisabeth

    Far more men than women receive degrees in science and engineering, raising concerns of a disparity between opportunities for men and women in the science fields. "Equal opportunity in math and science will benefit not just the women who enter the professions, but all Americans through our technological leadership and our national security," argued Sen. Ron Wyden (D-Ore.), calling for the enforcement of Title IX in math and science. Title IX bans gender-based discrimination in any institution that receives federal funding. Female students at Portland State University (PSU) earned 27 bachelor's degrees from the Maseeh College of Engineering and Computer Science in 2004, compared to 165 degrees awarded to male students; women earned 56 master's degrees last year, compared with 158 earned by men. PSU officials say the school wishes to promote women in the sciences without specifically resorting to Title IX enforcement. "We try to make sure we do the kind of advising and recruitment that draws women, and give out scholarships to women in math and science," says PSU College of Liberal Arts and Sciences Dean Marvin Kaiser. He says the university is actively attempting to interest female middle-schoolers and high-schoolers in the science disciplines through programs such as MESA and Saturday Academy. Maseeh College's Marcia Fischer says attracting women to hard sciences is a bigger issue than a dearth of opportunities, noting that the shortage of female engineers has remained consistent despite the huge amounts of money the National Science Foundation pumps into the promotion of engineering opportunities for women.
    For information on ACM's Committee on Women in Computing, visit: http://www.acm.org/women

  • "Aging U.S. Challenges Tech R&D"
    Investor's Business Daily (05/12/05) P. A4; Vallone, Julie

    Tech companies are beginning to recognize the potential rewards they stand to reap from the aging baby boomer population, a segment whose needs were largely ignored until recently. "Because there is now a significant number of older adults using the technology, and in a position to buy products, manufacturers and designers are starting to pay attention to them," says Florida State University psychology professor Dr. Neil Charness. Bentley College's Design and Usability Testing Center founder William Gribbons says few industries realize the benefits they can gain from designing products for seniors, with the exception of financial services and health care. Financial service firms are attempting to enhance access for older customers by increasing the user-friendliness of customer service systems, while the health care industry plans to drive costs down by augmenting health care management via technology; Gribbons lists doctors checking up on patients daily through email as one example. Other companies developing and marketing more senior-friendly technology include IBM, which has devised a computer mouse adaptor that eliminates excessive cursor movements for users with hand tremors. For people who are incapable of even moving a mouse, NaturalPoint has developed SmartNav, a system that employs an infrared camera that tracks a reflective dot worn on the user's forehead, thus allowing users to control a cursor and navigate a computer non-manually. And Elia Life Technology is working on a new raised alphabet that vision-impaired users can read faster than Braille or the raised Roman alphabet, along with a tactile keyboard, a printer that generates raised text, and an easily refreshed tactile display that uses protracting and retracting pixels.

  • "Gamers to Rule Their Own Virtual Worlds"
    New Scientist (05/12/05); Knight, Will

    Users of massive multiplayer online role-playing games must link to a centralized server owned and maintained by the game company, but researchers at France Telecom believe allowing players to store portions of virtual worlds on their own computers via peer-to-peer (P2P) networking will make the games more expandable, robust, and engaging. France Telecom's Solipsis project is a simple role-playing game in which users interact within a virtual space that is hosted on their own machines. "The idea is to have an infinitely scalable world," says Solipsis developer Joaquin Keller. He admits that designing the P2P virtual world so that communications from increasing numbers of players did not overwhelm the network was a challenge; the solution was to have the system exchange messages locally rather than broadcast them. Users increase Solipsis' scalability by installing the software, and though only a 2D interface for user interaction is currently available, Keller says a 3D graphical interface is on the horizon. Meanwhile, the Open Source Metaverse Project enables users to construct visually sophisticated 3D environments that can be connected to each other online. Julian Dibbell, co-editor of the Terra Nova gaming blog, says maintaining a game's appeal to users could be challenging if control is decentralized, adding that unauthorized duplication of digital artifacts could become especially problematic. One potential solution offered by Linden Lab's Second Life game is to permit users to add copy controls to items they create.
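    The locality idea Keller describes can be sketched in a few lines. The model below is purely illustrative (the class names, coordinates, and "awareness radius" are invented, not taken from Solipsis): a peer's update is delivered only to peers near it in the virtual space, rather than broadcast to everyone.

```python
import math

# Toy model of local message exchange in a P2P virtual world: a node
# forwards an update only to neighbors within a fixed "awareness radius"
# instead of broadcasting it to every peer on the network.
# All names and numbers are illustrative, not from Solipsis itself.

class Peer:
    def __init__(self, name, x, y):
        self.name = name
        self.pos = (x, y)
        self.inbox = []

def neighbors(sender, peers, radius=10.0):
    """Peers close enough to the sender to care about its updates."""
    sx, sy = sender.pos
    return [p for p in peers
            if p is not sender
            and math.hypot(p.pos[0] - sx, p.pos[1] - sy) <= radius]

def send_update(sender, peers, message, radius=10.0):
    """Deliver a message only to nearby peers (local exchange, no broadcast)."""
    for p in neighbors(sender, peers, radius):
        p.inbox.append((sender.name, message))

peers = [Peer("alice", 0, 0), Peer("bob", 3, 4), Peer("carol", 100, 100)]
send_update(peers[0], peers, "moved north")
```

    Because traffic stays proportional to local population density rather than total player count, adding players (and their machines) grows the world without overwhelming any single link, which is the scalability property the Solipsis team is after.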

  • "Shared Computing Grid Cuts Data Mountains Down to Size"
    UW-Madison (05/10/05); Devitt, Terry

    The Grid Laboratory of Wisconsin (GLOW) aids in processing the reams of data generated by University of Wisconsin-Madison projects that range from cancer research to chemical engineering to the simulation of high-energy particle accelerator experiments. GLOW taps the unused processing cycles of hundreds of computers on campus through Condor, a computing template that accumulates all idle processing power. The grid splits large research operations into smaller, more manageable tasks and hands them off to unemployed computers on the network. Processors are contributed by participating teams, while a $1.2 million gift from the National Science Foundation and a matching grant of $500,000 from the university have facilitated the assembly of additional racks of processors managed by the individual projects. "Each group has full control over its own resources, but when they are not being used locally, they must be available for use by the GLOW community," says Condor's developer, UW-Madison computer scientist Miron Livny. He says GLOW tries to follow a generic design in order to support disparate applications. There is close collaboration between Livny's team and GLOW participants to ensure that applications can be adapted to exploit GLOW and other Condor resources. Livny says convincing faculty that participating in GLOW can be advantageous to their projects is a formidable challenge, since the grid is a limited resource despite the steady addition of processors to the hundreds that are already in use.
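    The split-and-dispatch pattern described above can be sketched with a simple work queue. In this hedged toy version, threads stand in for idle campus machines and summing a chunk of numbers stands in for real analysis; Condor's actual matchmaking and checkpointing are far more sophisticated, and every name here is invented for illustration.

```python
from queue import Queue
from threading import Thread

# Illustrative sketch of opportunistic grid scheduling: one large job is
# split into small independent tasks and handed to whichever workers are
# idle. Threads model campus machines; sum() models real analysis.

def split_job(data, chunk_size):
    """Break one large input into independently processable chunks."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

def worker(tasks, results):
    while True:
        chunk = tasks.get()
        if chunk is None:           # poison pill: no more work to claim
            tasks.task_done()
            return
        results.append(sum(chunk))  # stand-in for the real computation
        tasks.task_done()

data = list(range(100))
tasks, results = Queue(), []
for chunk in split_job(data, 10):
    tasks.put(chunk)

threads = [Thread(target=worker, args=(tasks, results)) for _ in range(4)]
for t in threads:
    t.start()
for _ in threads:
    tasks.put(None)                 # one poison pill per worker
for t in threads:
    t.join()

total = sum(results)
```

    The key property, as in GLOW, is that tasks are independent: any idle worker can claim the next chunk, and the final answer is assembled from partial results regardless of which machine produced them.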

  • "Linux Powering UCF/DARPA Grand Challenge Vehicle"
    NewsForge (05/10/05); Reilly, Rob

    Linux software plays a key role in the University of Central Florida's (UCF) entry for the Defense Advanced Research Projects Agency's (DARPA) Grand Challenge, a 140-mile to 160-mile race between autonomous, self-navigating vehicles in the Southwest United States slated for October 2005. The UCF vehicle is a Subaru Outback 4x4 outfitted with various antennas, a camera, a SICK laser scanner, batteries, and a quartet of onboard computers. The main computer routes the sensor data to other client machines; another unit analyzes scanner and video camera input; the third actuates the car's throttle; and the fourth coordinates the safety and vehicle emergency control systems. Chief software developer Remo Pillat programs the vehicle via a wireless laptop, and he said Linux was selected to manage the vehicle because the source code is available and modifiable. Moreover, Linux allows a much faster response time than proprietary operating systems such as Windows, and the software helped Pillat deploy a multi-threaded architecture across machines. The challenge of combining all sensor inputs as well as the other outputs, relays, and switches is met with a 16-port serial-to-Ethernet converter. Pillat said the university employed Linux with serial ports rather than programmable logic controllers (PLCs) because it fulfills the computational power requirements for a vehicle traveling at a peak speed of 20 mph to 25 mph, and eliminates any software upgrade or change issues that might crop up with PLCs or other specialized gear.

  • "Go Forth and Multiply, Little Bot"
    Wired News (05/12/05); Leahy, Stephen

    Scientists at Cornell University's Computational Synthesis Lab claim to have built a robot that can replicate itself. The machine consists of four modules outfitted with microprocessors, sensors, and electromagnets that are stacked on a metal plate; the robot is capable of bending, reconfiguring, and manipulating other modules. Conducting an electrical current through the metal plate causes the modules to grab new modules from a "feeding" station and assemble a duplicate robot in a few minutes. Every module contains a complete blueprint of the robot and its location in the robot, says Cornell professor of mechanical and aerospace engineering Hod Lipson; self-replication depends on the robot being "fed" new modules at the right time and place. Lipson describes self-replication as "more of a continuum ranging from high dependence on the environment, like our robot, or lower dependence, such as a rabbit, which can seek out food and mates." For now, the robot's sole function is reproduction, and developing more sophisticated machines requires thousands of identical, very small modules. Swarm or amorphous computing could help achieve such a breakthrough, although Lipson believes evolutionary robotics is a more likely candidate. The professor foresees machines capable of replacing malfunctioning modules or even reconstructing themselves in extreme environments such as space or nuclear reactors.
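    The replication scheme, in which every module carries the full blueprint and a copy grows only when modules are fed in at the right time and place, can be modeled as a toy simulation. This sketch is entirely illustrative (the four-module blueprint and all names are invented, not Cornell's design): assembly consumes fed modules in blueprint order and fails if the feed is wrong or runs dry.

```python
# Toy model of blueprint-driven self-replication: each module knows the
# complete plan, and a replica is assembled only if fresh modules arrive
# from the "feeding" station in the order the blueprint demands.
# The blueprint and module kinds are hypothetical, for illustration only.

BLUEPRINT = ["base", "mid", "mid", "top"]

class Module:
    def __init__(self, kind):
        self.kind = kind
        self.blueprint = list(BLUEPRINT)   # every module carries the plan

def replicate(feed):
    """Assemble a copy by consuming fed modules in blueprint order."""
    replica = []
    for needed, module in zip(BLUEPRINT, feed):
        if module.kind != needed:
            raise ValueError(f"fed {module.kind!r}, needed {needed!r}")
        replica.append(module)
    if len(replica) != len(BLUEPRINT):
        raise ValueError("feeding station ran out of modules")
    return replica

feed = [Module(k) for k in ["base", "mid", "mid", "top"]]
copy = replicate(feed)
```

    The model captures Lipson's point about dependence on the environment: the robot's "reproduction" works only because the environment supplies the right raw material at the right moment, unlike a rabbit that forages for its own.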

  • "India Eyes Own Open-Source License"
    CNet (05/11/05); Kanellos, Michael

    Indian Institute of Technology professor Deepak Phatak wants to develop an open-source license that allows developers to share ideas while preserving their rights to their own software alterations. Phatak hopes his Knowledge Public License program will help facilitate a reconciliation between the open-source software community and proprietary software companies, which have traditionally been at cross-purposes; however, he acknowledges that "we have to move very carefully because the Americans have a tendency to sue anybody for anything." India has become a major software development hub, thanks to the combined power of the outsourcing industry and the country's vast population of computer science and engineering students. Furthermore, numerous sources confirm that collaboration between Indian universities and the private sector is increasing. Phatak's Ekalavya program, for instance, encourages open-source ideas by allowing students to submit concepts to a collaborative portal, which brings authors of worthy proposals in contact with industry mentors. The professor says, "Today, India is a net taker in the open-source community. In four years, I want the world to recognize India as a net giver, and that is entirely possible." The explosion of open-source licenses in recent years has prompted complaints from members of the open-source community, and spurred attempts to curb license growth.

  • "Interdisciplinary Move in Research"
    Australian (05/11/05); Cooper, Dani

    Today's science students need to be prepared for an interdisciplinary future where breakthrough solutions emerge at the intersections of biology, chemistry, environmental science, computer science, and other fields. In the medical and biological sciences fields, computers are needed to make sense of the enormous amount of information that has been gathered in recent years, says Mike Wallach with the University of Technology, Sydney's (UTS) Institute for the Biotechnology of Infectious Diseases. Successful scientists will need to quickly incorporate a wide field of knowledge and stay flexible in terms of concepts and technologies, says Wallach. University of New South Wales science dean Mike Archer believes universities will increasingly collaborate, allowing students to access experts from different institutions and pursue more hybrid degrees; he says environmental science will be a much more important field in the future as more natural resources are consumed and the human population grows. Governments need to invest in risky efforts such as the Australian Quantum Computing Center, which has already generated spinoff knowledge in the last five years to justify the $40 million-plus investment, says Archer. University of Western Australia (UWA) computer science and software engineering school head Nick Spadaccini believes future computer science graduates will have to study at least one other discipline, and UWA computer science students already are required to spend half their three-year study in another field. Disruptive technology such as nanotechnology or quantum computing could spur even more radical changes: UTS Institute for Nanoscale Technology director Mike Cortie says almost anything is possible with nanotech, but possibilities are limited by economics. Cleaning up pollution with nanotech, for example, may be possible but not always economically feasible.

  • "Developing With Open Source Tools"
    LinuxWorld (05/11/05); Gedda, Rodney

    Open source development tools provide a strong alternative to proprietary products because of their low cost, ease of use, wide developer base, and opportunity for customization. The so-called LAMP platform consists of Linux, Apache, MySQL, PHP, Perl, and Python; these technologies are relatively well understood and pose a low barrier to entry for new developers, which lets more people contribute and speeds the conversion of business logic into code, says Solutions First director David Kempe. In addition, the non-proprietary nature of these tools ensures they are available and supported despite business irregularities such as mergers or the fate of a software vendor, while maintenance costs are also usually lower than proprietary tools because of community resources, such as the Comprehensive Perl Archive Network that offers prefabricated solutions for things like backups. LAMP technologies are useful for creating a wide range of business applications for which there are often base implementations that can be customized, while open source tools are also available to a wider range of employees via the Web. Some back-end applications remain insulated from open source development; for instance, open source accounting functions are sparse. Sydney University computer science expert and software engineer Hamish Ivey-Law says open source software often matches programmers' requirements more closely because the tools are created for the sake of programming and not from market analysis. Programmers can use LAMP technologies with many different programming styles and different development environments, including KDevelop and Eclipse.
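    The "business logic to code" point can be made concrete with a few lines of scripting against a database, the core of most LAMP applications. In this hedged sketch, Python's built-in sqlite3 stands in for MySQL so the example is self-contained, and the table and business rule are invented for illustration.

```python
import sqlite3

# Minimal LAMP-style sketch: a small business rule expressed as a script
# plus a SQL query. sqlite3 substitutes for MySQL purely so this runs
# standalone; the schema and data are hypothetical.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("acme", 120.0), ("acme", 80.0), ("globex", 40.0)])

def big_customers(conn, threshold=100.0):
    """Business rule: customers whose combined orders exceed the threshold."""
    rows = conn.execute(
        "SELECT customer, SUM(total) FROM orders "
        "GROUP BY customer HAVING SUM(total) > ?", (threshold,))
    return dict(rows.fetchall())

result = big_customers(conn)
```

    The low barrier to entry Kempe describes is visible here: the rule reads almost as plainly as its English description, which is what allows newcomers to contribute quickly.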

  • "Opposing Views: The Debate Over the H-1B Visa Program"
    InformationWeek (05/09/05) No. 1038, P. 22; Chabrow, Eric

    Responding to pressure from IT industry leaders, Congress agreed to raise the H-1B program's cap by an additional 20,000 visas. H-1B advocates argue that a looming scarcity of domestic IT talent and declining enrollment in college and university IT programs necessitate bringing in talent from abroad or branching out overseas to make up for the shortfall; proponents such as Gerald Cohen further contend that employers should be allowed to hire the best qualified personnel irrespective of nationality, provided they pay the prevailing wage. Visa backers claim offshore outsourcing is more damaging to American workers because such workers receive much lower salaries than they would in the United States. Critics, on the other hand, say the H-1B program allows employers to go after younger employees willing to work for less while leaving older, more experienced U.S. workers in the lurch. A 2003 Government Accountability Office study found that a 31- to 50-year-old American systems analyst/programmer with an advanced degree earned 40 percent more, on average, than a younger H-1B worker. Meanwhile, the Bureau of Labor Statistics reports that 131,000 people who regard themselves as computer professionals were out of work during the four quarters ended in March. A study conducted by Deloitte Touche Tohmatsu concludes that developing nations will remain a source of talented IT professionals, whether they are imported through immigration programs or are paid at home through outsourcing agreements. "Business leaders will press their national governments to loosen restrictions on foreign immigration, while continuing to shift more and more work offshore," the report predicts.

  • "High-Tech Vision"
    Government Security (04/05) Vol. 4, No. 3, P. 22; Fickes, Michael

    South Florida's coastline is being watched by intelligent monitoring systems developed by the Department of Homeland Security's Homeland Security Advanced Research Projects Agency (HSARPA). The $8 million, 24-month Hawkeye program is meant to test Automated Scene Understanding (ASU) technology, which fuses diverse imaging technologies, seismic sensors, audio detection, and radar to monitor large security zones such as port facilities or chemical plants; ASU will not be able to replicate a human's ability to recognize events, but will help humans deal with huge volumes of data generated by sensors and video feeds. HSARPA program manager Peter Miller says human monitors have a limited capacity to focus on incoming data, but with ASU, human security personnel would be able to view consolidated reports and receive alerts in sensitive situations. Additionally, ASU technology will be able to correlate events over a long period of time, effectively identifying terrorist planning in its early stages. ASU is one of more than a dozen fast-track HSARPA projects under development; others include economical detector systems that would sniff out dirty bombs at airports and other points of entry, technology for identifying car bombs, and radiation-detection systems that distinguish between controlled nuclear materials and nuclear weapons. HSARPA has about $320 million to spend on projects this year out of the roughly $1.2 billion allocated to the Homeland Security Science and Technology Directorate, while other money goes toward long-term research and advanced operational programs. The department has outlined 15 "national planning scenarios" that help define the country's preparedness. HSARPA projects reflect those scenarios, including attacks on chemical plants, bio-weapons attacks on the food supply, and conventional nuclear bomb explosions.

  • "Service-Oriented Science"
    Science (05/06/05) Vol. 308, No. 5723, P. 814; Foster, Ian

    Service-oriented science enabled by distributed networks of compatible services could potentially boost scientific productivity on both an individual and collective level by making powerful information tools universally available to facilitate widespread automation of data analysis and computation. Ian Foster of Argonne National Laboratory writes that interface uniformity is central to the success of service-oriented science, and lists issues that must be addressed in order for the field to fully prosper: Such issues include interoperability, scale, management, quality control, and incentives. Foster says the construction and implementation of a service necessitate expertise and resources in three areas: the domain-specific content to be shared; the domain-independent software functions critical to service operation, management, and community access; and the physical resources required for hosting content and functions. The second and third capabilities can be outsourced to specialist providers via grid architectures and software. Groups pursuing service-oriented science are investigating three distinct approaches to the problem of scale, each focusing on one or several of these schemes. They include a "cookie-cutter" strategy involving the creation of dedicated domain-specific infrastructures where uniformity is enforced across the board; a more bottom-up strategy whereby researchers devise service ecologies featuring agreements on interfaces that enable the provision of content and function in accordance with participants' desires; and a general-purpose infrastructure that provides resources or functions independent of disciplines. The broad adoption of service-oriented science relies on a scaling scheme that incorporates aspects of all three approaches, Foster concludes.

  • "I Attended This Hacker Conference and All I Got Was All the Data on Your Hard Drive"
    Popular Science (05/05) P. 70; Mejia, Robin

    The Def Con hacking event is notorious for drawing malicious "black hat" hackers as well as "white hat" hackers-turned-security-consultants, who participate in contests to find vulnerabilities in networks and wreak mischief in cyberspace with techniques that have potentially damaging real-world applications. People's growing reliance on electronic data exchanges and wireless communications is a windfall for both black hat and white hat hackers. The applications competing Def Con teams use often run on services that most people employ every day without realizing it, and that can be exploited with serious consequences. Though participating hackers and their attitudes may seem wild, the execution of their hacking strategies is marked by extreme discipline and focus, and the fact that this year's winning teams are organized around graduate programs may indicate that classroom gaming experience is a significant factor. Def Con attendees frequently discover many of the thousands of vulnerabilities that crop up in major software releases every year. Electronic "capture the flag" games and other strategies employed at Def Con are being applied to government and academic security training. A 2004 poll of businesses conducted by CSO magazine, Carnegie Mellon University's CERT center, and the U.S. Secret Service found that 125 out of 500 respondents admitted that their companies had lost money due to e-crimes. A separate CERT report verified that "vendors continue to produce software with vulnerabilities, including vulnerabilities where prevention is well understood."

  • "Research in Development"
    Technology Review (05/05) Vol. 108, No. 5, P. 32; Fitzgerald, Michael

    IBM Research director Paul Horn believes a new perception of research is necessary as IBM makes the transition from a product-oriented to a services-oriented company. Services are expected to account for roughly two-thirds of IBM's annual revenues now that the company has shed its PC business, and Horn must answer the question of whether corporations can conduct useful services research. IBM consultant Roland Rust with the University of Maryland's Robert H. Smith School of Business says the economy's movement away from goods makes the corporate pursuit of services research essential. Henry Chesbrough of the University of California, Berkeley's Center for Open Innovation thinks IBM's role in promoting services research could benefit services in the same way its promotion of computing benefited the academic field of computer science. Horn's campaign to support services dovetailed with IBM cognitive scientist Paul Maglio's campaign to "humanize" research by focusing on user requirements for technology instead of on the technology itself. Thus far the bulk of the work on services may be credited to IBM Research's mathematics branch. In the two and a half years since IBM began its services science initiative, IBM Research has enjoyed 250 direct consulting engagements and produced the WebFountain and Center for Business Optimization practice areas. WebFountain consists of a series of processes for configuring and analyzing massive sets of diverse data, while the Center for Business Optimization is devoted to making operations more efficient through offerings such as the Pharmaceutical Production Refactoring Tool.