
ACM TechNews sponsored by Parasoft
Automated Software Analysis - by Parasoft:
Error prevention for unit testing, coding standards analysis,
functional/performance testing, monitoring and reporting.
http://www.parasoft.com/products
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Parasoft or ACM. To send comments, please write to [email protected].
Volume 6, Issue 711:  Wednesday, October 27, 2004

  • "E-Vote Vendors Hand Over Software"
    Wired News (10/26/04); Zetter, Kim

    Federal officials announced on Oct. 26 that five voting machine manufacturers agreed to turn their e-voting software over to the National Software Reference Library, an archive established by the U.S. Election Assistance Commission (EAC) that election officials can reference in order to ascertain whether their voting software has been altered or tampered with. The vendors' actions earned praise from EAC Chairman DeForest Soaries, who had asked the largest voting companies to submit their software back in June. Ultimately, counties and states will be able to use the archive to confirm that the software on their voting machines has been certified, although the library was not set up in time to analyze software that had already been loaded onto locked voting systems, meaning that election officials cannot verify its certification prior to the Nov. 2 election. However, the library, which will be maintained by the National Institute of Standards and Technology (NIST), can be used to probe any doubts about voting systems' authenticity that come up after an election by comparing the hash files of the archived software with those of the software on the machines. Soaries confirmed that the NIST library will have to work in conjunction with other regulations in order to secure elections and voting systems. Related issues the EAC must address include determining who should be authorized to check hashes before an election when local election officials lack knowledgeable personnel; settling on optimal patching and upgrading procedures; and handling false positives. Other EAC-driven initiatives designed to bolster election integrity focus on new voting machine standards that make security a requirement, national standards for election procedures to physically secure voting machines, and the deployment of voter fraud countermeasures.
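
    The verification step described above boils down to recomputing a cryptographic digest of the deployed software and comparing it with the digest stored in the reference library. The following is a minimal illustrative sketch in Python, not the NSRL's actual tooling; the file name and reference digest are made up, and SHA-1 is assumed as the hash algorithm.

      import hashlib
      from pathlib import Path

      # Hypothetical reference digests, standing in for the archived library's hash set.
      REFERENCE_HASHES = {
          "ballot_tabulator.bin": "3f786850e387550fdab836ed7e6dc881de23001b",
      }

      def sha1_of(path):
          # Hash the file in chunks so large binaries do not have to fit in memory.
          digest = hashlib.sha1()
          with open(path, "rb") as f:
              for chunk in iter(lambda: f.read(8192), b""):
                  digest.update(chunk)
          return digest.hexdigest()

      def matches_reference(path):
          # True only if the deployed file's digest equals the archived digest.
          expected = REFERENCE_HASHES.get(Path(path).name)
          return expected is not None and expected == sha1_of(path)
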
    Click Here to View Full Article

    For information on ACM's activities involving e-voting, visit http://www.acm.org/usacm.

  • "Bush and Kerry Debate Technology"
    IT Management (10/25/04); Stoodley, Kate

    In response to a questionnaire prepared by the Computing Technology Industry Association, presidential rivals John Kerry and George W. Bush both indicated that the IT industry is essential to the U.S. economy and the nation's competitiveness in the global economy. Bush cited the tens of thousands of investors, small business owners, and high-tech leaders backing his agenda for the IT sector. Kerry noted the need for government/private sector collaboration to promote IT applications that can improve the quality of life by surmounting legal and regulatory obstacles to IT adoption, supporting research and development and private projects, making the government use new technologies more intelligently, and setting up longer-term national goals on IT use. Kerry further promised that his cabinet would craft an "Innovation Agenda" committed to the promotion of digital opportunities, the institution of greater government openness, responsiveness, and efficiency, the transformation of the U.S. health care system, and the empowerment of disabled people. On the subject of cybersecurity, both Bush and Kerry agreed that protecting the national information infrastructure must involve the joint participation of government and business: Bush touted the National Strategy to Secure Cyberspace, whose priorities include establishing a national security response system and programs in vulnerability reduction, awareness and training, and government cybersecurity; Kerry promised to deploy global cybersecurity standards and best practices. Bush's solution to the decline of American IT workers' competitiveness is to provide loans for certification training and a $250 million high-growth industry training program that taps the skills of the nation's technical and community colleges. Kerry, meanwhile, wrote that he plans to invest in K-12 math and science education, give schools incentives for boosting science and engineering graduates, and develop online learning technologies so that U.S. workers can broaden their skills at their own pace.
    Click Here to View Full Article

  • "Citris Allows 3-D Interactions"
    Daily Californian (10/27/04); Rosenberg, Erica

    UC Berkeley's Center for Information Technology Research in the Interest of Society (CITRIS) has developed a technology that could facilitate three-dimensional interactions between people situated in virtual environments. The brainchild of CITRIS director Ruzena Bajcsy, the technology stitches together photos of a laboratory subject into a 3D image that can be inserted into one of three computer-generated worlds. Bajcsy says the technology could be used to gain insights on how people behave and trust one another in virtual environments, and as a tool for manual physical training. CITRIS is enlisting multidisciplinary researchers to explore such challenges in a controlled setting. Bajcsy collaborated on the project with UC Center of Humanities director David Goldberg, who is working with anthropologists, art historians, and archeologists to realize a virtual museum featuring 3D objects that can be manipulated. Funding has been difficult because of holey images and the intricate interactions needed between computer scientists to fulfill the project's real-time graphics, computer vision, and computer networking requirements. The technology is also far from perfect: A user's appearance in the artificial environment can be distorted or cropped if his attire is too similar in hue to the virtual decor, for instance; transferring the voluminous data between the two sites currently using the system requires the presence of an expensive cable; and Carnegie Mellon University's Takeo Kanade points out that digitizing the many cameras used to produce the 3D images requires a huge amount of computational muscle.
    Click Here to View Full Article

  • "U.S., International Space Organizations Turn to Open Source"
    InformationWeek (10/26/04); Greenemeier, Larry

    Space agencies throughout the world, most notably NASA and the Indian Space Research Organization (ISRO), are embracing open source to manage operations on and off the Earth. ISRO has enabled India to operate its own satellite communications networks, and email has become an essential tool because ISRO engineers often travel to other countries to take advantage of outside knowledge and expertise and to evaluate developing technology. Internet-based email programs were deemed a security risk, says ISRO scientist Ramani Narayanaswamy, who is developing and implementing an information security initiative for ISRO; the organization eventually opted for the open-source Secure Mail Suite from Guardian Digital. ISRO started using Linux in 1998, and has since incorporated the Apache Web server, the MySQL database, OpenOffice productivity applications, and the Mozilla Web browser. The ISRO Satellite Center's Unix environment was also enhanced by Linux-based mechanical structural analysis software, and the organization is currently attempting to standardize the use of open source software across its facilities. The defection of technical personnel to the private sector has been a factor in ISRO's transition to open source. Meanwhile, NASA is running Linux on SGI Altix systems for its Project Columbia supercomputer, while the Livingstone2 reusable artificial intelligence software system at NASA's Ames Research Center is also open source. Livingstone2 is designed to help spacecraft, chemical plants, life support systems, and other complicated systems function with a minimum of human oversight, while Project Columbia is dedicated to space exploration, aerospace engineering, and global warming research.
    Click Here to View Full Article

  • "Life Lessons to Drive Solutions to Global Woes"
    EE Times (10/25/04); Mokhoff, Nicolas

    The theme of this year's PopTech conference, which ended on Oct. 23, was the need to reinvigorate the application of technology to address global problems. "The world needs to spend 1 percent of its GDP on reversing potential catastrophic consequences of continuing to live the way we do," argued Pennsylvania State University's Richard Alley, who insisted that available technology can reduce air, water, and environmental pollution. PopTech presenter Tom Daniel of the University of Washington noted that engineering can significantly benefit from the study of how animals move, citing his own work with computer programs and engineering models in order to develop improved robot designs. He said engineers can construct biological systems that store, process, and distribute information by observing motor signal interactions, but cautioned that researchers' primary emphasis on microbiology is hindering rather than accelerating progress in this area. Naturalist writer and biomimicry advocate Janine Benyus focused on the application of biological designs, processes, and tenets to hardware, estimating that current methods for creating new materials and processes yield only 4 percent usability. Her contention was that new forms, processes, and ecosystems can be generated by learning and imitating life's ability to infuse matter with information through the use of energy. Benyus offered a dozen biologically-inspired "big ideas" that could boost hardware efficiency, such as self-assembly; mimicking naturally resilient and self-healing systems by studying how creatures surrounded by bacteria in water keep clean; and the use of sensing and responding mechanisms in collision avoidance systems through observation of locust swarm behavior.
    Click Here to View Full Article

  • "IBM Supercomputing Goes Retro"
    CNet (10/25/04); Shankland, Stephen

    A 2000 meeting between IBM and the National Energy Research Scientific Computing Center (NERSC) sought to address the poor fit between commodity-based computing roadmaps and the needs of scientific computing, and it led to the notion of cheaply augmenting commodity processors with additional capability. That meeting sowed the seeds of IBM's Virtual Vector Architecture (ViVA) project, which aims to close the gap between vector computing and the more widespread scalar computing using its new Power5 processors. ViVA would enable the 16 processor cores of a scalar server to be harnessed so they could function in the manner of a single vector processor. Power5 designer Ravi Arimilli explains that such a system would boast 32 floating-point units, and with enough demand IBM could construct the software tools needed to develop bigger systems. Steve Scott, a chief architect for Cray's X1 vector machine, says, "I'm not worried that IBM's ViVA processors will give true vector processors a run for their money, but I do think it's a good idea." NERSC general manager Bill Kramer reports that the follow-up ViVA-2 should be able to accommodate all Cray functions, and his center has proposed to deploy a machine capable of 50 trillion calculations per second using Power6+ processors and ViVA-2. IDC analyst Chris Willard notes that scalar computers cannot match vector computers' ease of programming, their ability to handle mathematical operations involving matrices, and the "gather/scatter" technology that lets vector processors easily read and write data to widely scattered memory locations--an issue that Kramer says ViVA-2 will resolve. Though Cray thinks vector supercomputers will sustain their value, it has also undertaken scalar computing projects, such as the Red Storm cluster Cray is building for Sandia National Laboratories.
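
    As a rough software analogy for the gather/scatter capability mentioned above (and not a model of ViVA, Power5, or Cray hardware), the sketch below uses NumPy indexing to pull values from scattered memory locations into a dense vector, operate on them in one step, and write the results back.

      import numpy as np

      memory = np.arange(100, dtype=np.float64)   # stand-in for main memory
      indices = np.array([3, 17, 42, 88, 5])      # widely scattered locations

      gathered = memory[indices]   # "gather": collect scattered data into a dense vector
      gathered *= 2.0              # one vector operation applied to every element
      memory[indices] = gathered   # "scatter": write the results back in place
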
    Click Here to View Full Article

  • "U.S. Scientists Enjoy Big Bandwidth Boost"
    New Scientist (10/26/04); Biever, Celeste

    The deployment of the National Lambda Rail (NLR) fiber-optic network in the United States will enable researchers to exchange a greater volume of data at speeds that surpass the Internet's file transfer rates. The American research community entirely owns NLR, which can deliver transfer speeds of 10 Gbps to each user; the U.S. academic community's Internet2 network, by comparison, diffuses its 10 Gbps among all its users. Although both NLR and Internet2 use Wave Division Multiplexing to concurrently transmit different light wavelengths or "lambdas" along an optical fiber and eliminate interference, the former commits 40 lambdas to its community while the latter dedicates one. "NLR is another landmark in the progression towards ubiquitous high-speed computing, which is essential for our research," notes California Institute of Technology scientist Julian Bunn, who plans to use the network to receive terabytes of data generated by the Large Hadron Collider in Switzerland, which is slated to go online in 2007. NLR will also allow surgeons to remotely monitor operations via the real-time streaming of high-resolution video, as well as permit researchers to perform "destabilizing" experiments that the commercial Internet may be too fragile to support. Such experiments could include attempting new protocols, exploring the optimal approach for routing data between two locations, and evaluating the performance of decision-making software installed at the network's core or its periphery. NLR CEO Tom West will detail the network's advantages at the Supercomputing Conference 2004 on Nov. 6.
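
    To put the dedicated 10 Gbps figure in perspective, a back-of-the-envelope calculation (assuming ideal throughput with no protocol overhead) shows how transfer times scale for terabyte data sets like those Bunn expects from the Large Hadron Collider.

      def transfer_hours(terabytes, gigabits_per_second):
          # Idealized transfer time: total bits divided by line rate.
          bits = terabytes * 1e12 * 8
          return bits / (gigabits_per_second * 1e9) / 3600

      print(transfer_hours(1, 10))    # ~0.22 hours for 1 TB on a dedicated 10 Gbps lambda
      print(transfer_hours(1, 0.1))   # ~22 hours for the same data on a 100 Mbps link
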
    Click Here to View Full Article

  • "Web Accessibility Best Practice for the Classroom"
    IST Results (10/26/04)

    The goal of IST's IDCnet project was to develop recommendations on how to apply Web accessibility guidelines promoted by organizations such as the World Wide Web Consortium (W3C) to mainstream university curricula, so that future designers and engineers will embed Design for All (DfA) considerations into their products; the project chiefly serves as a support vehicle for the work of the European Secretariat for the Design for All e-Accessibility Network. "We tried to set up a way of incorporating resources like the W3C Web Accessibility Guidelines into mainstream university teaching, so that students when they leave university are aware of these issues," notes project coordinator Carlos Velasco of Germany's Fraunhofer Institute. IDCnet established categories of existing accessibility teaching, and the content of those categories was then typed according to the disciplines covered, target audiences, and delivery methods. Content then became IDCnet's focus, with emphasis on classifying types of knowledge and locating potential sources of new content. IDCnet partners concentrated on the areas of content, the tools that create content, and browser accessibility prescriptions, with the goal of making overarching guidelines applicable to all Web accessibility devices. Incorporating industry requirements into current teaching strategies was addressed at a 2003 IDCnet-hosted workshop, while a second workshop held this year delved into approaches for effecting such changes. The project's final recommendations, which are publicly available online, cover such issues as the primary knowledge and skill-sets of specific university curricula and which specific guidelines can be incorporated into them.
    Click Here to View Full Article

  • "System X Faster, But Falls Behind"
    Wired News (10/26/04); Kahney, Leander

    Virginia Tech's System X supercomputer, which was recently upgraded from 1,100 Power Mac G5s to 1,100 custom-built Apple Xserve servers, is unlikely to be ranked as one of the top five fastest machines in the world in the next top 500 list, despite the approximately 20 percent performance gain facilitated by the upgrade. System X's InfiniBand communications backbone, in combination with the Xserves, has yielded a 12.25 teraflop Linpack benchmark score. System X lead architect Srinidhi Varadarajan attributes about 15 percent of the performance boost to the Xserves, while optimized software accounts for 5 percent. Varadarajan expects an additional performance gain between 10 percent and 20 percent in the next few months. System X currently holds the record for the fastest academic supercomputer and was ranked the third fastest system in the world in last November's top 500 list, but Varadarajan anticipates that bigger and costlier supercomputers will knock System X off its perch. He expects System X to make it into the top 10, and to retain its price-performance superiority: System X was built for roughly $5.8 million, compared to rival supercomputers' price tags of $20 million and higher. Varadarajan forecasts the world's top five supercomputers to be, in descending order, IBM's Blue Gene/L, Japan's Earth Simulator, NASA's Columbia, Lawrence Livermore's Thunder, and Los Alamos' Asci Q. System X, which usually runs several projects concurrently, is being employed by Varadarajan, Virginia Tech scientists, and outside groups for weather and molecular modeling research.
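
    The article's performance figures can be cross-checked with a little arithmetic: a 15 percent hardware gain plus a 5 percent software gain on the pre-upgrade score accounts for the roughly 20 percent overall improvement behind the 12.25 teraflop result.

      new_score = 12.25                  # Linpack teraflops after the Xserve upgrade
      old_score = new_score / 1.20       # implied pre-upgrade score, about 10.2 teraflops
      hardware_gain = old_score * 0.15   # portion attributed to the Xserve hardware
      software_gain = old_score * 0.05   # portion attributed to optimized software
      print(round(old_score + hardware_gain + software_gain, 2))   # recovers ~12.25
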
    Click Here to View Full Article

  • "Researchers Study Wi-Fi Weaknesses"
    NewsFactor Network (10/25/04); Martin, Mike

    As wireless networking becomes more important to businesses and individual users, security experts warn against new types of attacks that take advantage of Wi-Fi technology weaknesses. Stevens Institute of Technology Professor Suzanne Wetzel says two forms of stealth attack could either disrupt or divert information flow in an ad hoc network such as one that automatically connects to mobile visitors. In the first case, hackers send a glut of information requests to a node, causing it to waste power in responding and eventually disappear from the network; Stevens Institute wireless network security director Paul Kolodzy says this is akin to denial-of-service attacks on the wider Internet, where a barrage of information requests keeps the target from discerning and responding to legitimate requests for information. Another wireless vulnerability happens when attackers modify information routing so that traffic is sent out of the range of the intended user and instead diverted through corrupted nodes to the attacker's machine. Both types of attacks are very difficult to trace, but could be stymied by implementing trusted relationships among nodes based on "reputation tables" that store information on other routers and identify reliable sources. On the broader Internet, many users take the authenticity of Google's home page or the privacy of their email for granted, and a similar type of trust should be possible on wireless networks with trusted relationships between wireless entities. Wireless technology should allow nodes and other equipment to store and exchange reputation-based information, and use this to determine what messages are legitimate.
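
    A minimal sketch of the reputation-table idea is shown below; the scoring rule and trust threshold are illustrative assumptions, not the scheme Wetzel's group proposes.

      class ReputationTable:
          # Each node keeps a trust score for its neighbors and prefers routes
          # through nodes whose past behavior has been reliable.
          def __init__(self, trust_threshold=0.5):
              self.scores = {}                     # neighbor id -> score in [0, 1]
              self.trust_threshold = trust_threshold

          def record(self, neighbor, delivered):
              # Nudge the score up on a successful delivery, down on a failure.
              current = self.scores.get(neighbor, 0.5)
              delta = 0.1 if delivered else -0.2
              self.scores[neighbor] = min(1.0, max(0.0, current + delta))

          def trusted_next_hops(self, candidates):
              # Keep only candidate next hops at or above the trust threshold.
              return [n for n in candidates
                      if self.scores.get(n, 0.5) >= self.trust_threshold]
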
    Click Here to View Full Article

  • "Technology in the Third Dimension"
    Belfast Telegraph (10/20/04); Arthur, Charles

    Despite dramatic gains in processing power, computer users still rely primarily on 2D data visualization such as graphs and solid window panels. Better options have been prototyped before, such as a 3D system created by Nottingham University professor Steve Benford about a decade ago: That system placed files on a 3D grid according to their attributes, with different attributes determining where files were placed on the X, Y, and Z axes; when the user changed the attributes, the files were rearranged accordingly. Links showed the relationships between files, and the user could navigate through the 3D grid using a standard two-button mouse. Perhaps the most popular 3D desktop technology today is Apple's Mac OS X operating system, which provides translucent windows that users can partially see through and that cast shadows on window panels behind them--but though this sounds and looks nice, it actually hampers usability because users cannot easily read front-panel content when text is showing through from behind. Another lesser-known effort is Sphere (www.hamar.sk/sphere/), which places the user in the middle of a sphere of files they can turn to and manipulate, but the concept falters because humans are strongly front-oriented, even when playing characters in 3D game environments. Sun Microsystems is working on another 3D desktop technology, called Project Looking Glass, that creates a gallery of related documents for fast browsing and allows users to annotate window panels. Academic research by the University of Maryland's Student Human-Computer Interaction online Research Experiments (Shore) shows that 3D systems do not let users accomplish tasks faster. The researchers tested users of different skill levels, including themselves, and found that 2D systems were more efficient in every case.
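
    The attribute-to-axis mapping behind Benford's prototype can be sketched in a few lines; the particular attributes and scaling below are made up for illustration, not his actual design.

      from datetime import datetime

      def place_file(size_bytes, modified, owner_id, axes=("size", "age_days", "owner")):
          # Map three file attributes onto X, Y, and Z coordinates.
          values = {
              "size": size_bytes / 1e6,                        # megabytes
              "age_days": (datetime.now() - modified).days,    # file age
              "owner": owner_id,                               # owner bucket
          }
          return tuple(values[a] for a in axes)

      # Changing the axes tuple re-derives every file's position, which is the
      # "rearranged accordingly" behavior described above.
      print(place_file(2_500_000, datetime(2004, 1, 1), 7))
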
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)

  • "ESA Joins European Effort to Create Digital Libraries for Science"
    European Space Agency (10/26/04)

    The Internet and digital technologies allow scientists from various disciplines and geographic areas to collaborate, sharing large data sets and common pools of processing power. The European Commission is leading a new project aimed at integrating existing digital library and scientific grid computing systems, called DILIGENT (Digital Library Infrastructure on Grid Enabled Technology). The European Space Agency (ESA) is contributing its large store of satellite data, to which hundreds of gigabytes are added daily; ESA has long used grid computing to distribute its data, so that users can access large sets of information without having to download and process the entire collection locally. Now, virtual research organizations are emerging around these grid computing and digital library technologies. DILIGENT project scientific coordinator Donatella Castelli says the new European effort will allow users such as ESA to easily create new digital library collections and leverage the European Commission's Enabling Grids for E-science in Europe grid computing capability: "The objective of DILIGENT is to create an infrastructure for the creation of digital libraries on demand, so a new one could be created each time a virtual organization needs it as a supporting instrument for its activity," she explains. The program will foster broader use of digital content and more in-depth analysis. Partner organizations include EU member states, non-governmental groups such as the World Wildlife Fund, and academic bodies; each group will contribute its unique data so that a user would be able to create a map of the Mediterranean overlaid with several sets of data from different organizations, for example. DILIGENT formally launched in September and is slated to run for three years.
    Click Here to View Full Article

  • "Fighting Fire With Fire: Designing a "Good" Computer Virus"
    InformIT (10/15/04); Peikari, Cyrus

    Modeling anti-malware measures after biological vaccines may be an effective strategy, argues Cyrus Peikari, who offers suggestions for designing and testing a live, attenuated computer virus vaccine using real-world simulation. The network worm simulation system employed by the researchers offers potentially more realism than other modeling techniques because it executes actual virus code, while its object-oriented design permits simple customization of vaccines for each experiment. The virus vaccine must be designed to confer prolonged immunity against infection, conserve network resources (bandwidth) and reduce morbidity, self-replicate to facilitate easy distribution to more than 90 percent of the target population, and be cost-effective. The vaccine Peikari tested was based on the Code Red virus, but was modified to patch the exploited vulnerability and attenuated to consume less overall bandwidth than a full-strength vaccine. The results of simulations run using increasingly attenuated vaccines indicate that the vaccines spread more slowly and consume less network bandwidth with greater attenuation, but in all scenarios they control and ultimately eradicate the outbreak. The least bandwidth-consumptive strategy yielded by the experiments is to release an attenuated, "prophylactic" vaccine that immunizes a network before a virus ever appears. Peikari writes that opposition to releasing a live, self-replicating vaccine on the Internet stems from the likelihood that the vaccine would damage a certain number of critical systems, but designing the vaccine with the same benefit-to-risk ratio as that of biological vaccines would address the issue. The author recommends that computer vaccines be open source and developed under the aegis of an Internet equivalent of the World Health Organization.
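
    The attenuation trade-off can be illustrated with a toy discrete-time outbreak model; the mechanics and parameters below are assumptions for illustration and are much cruder than the worm simulator Peikari describes.

      import random

      def simulate(hosts=10_000, worm_rate=3.0, vaccine_rate=3.0,
                   attenuation=0.5, steps=50, seed=1):
          # One infected host and one vaccinated host seed the population; both
          # spread by random contact, but attenuation slows the vaccine down.
          random.seed(seed)
          state = ["susceptible"] * hosts
          state[0], state[1] = "infected", "vaccinated"
          for _ in range(steps):
              for s in list(state):
                  if s == "susceptible":
                      continue
                  rate = worm_rate if s == "infected" else vaccine_rate * attenuation
                  contacts = int(rate) + (1 if random.random() < rate - int(rate) else 0)
                  for _ in range(contacts):
                      target = random.randrange(hosts)
                      if state[target] == "susceptible":
                          state[target] = s   # vaccinated hosts are patched and stay immune
          return state.count("infected"), state.count("vaccinated")

      print(simulate(attenuation=1.0))    # full-strength vaccine spreads fastest
      print(simulate(attenuation=0.25))   # heavier attenuation spreads more slowly
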
    Click Here to View Full Article

  • "Open Source Project 'GForges' Ahead"
    InternetNews.com (10/26/04); Kerner, Sean Michael

    Developers hope the latest version of the GForge open source software project development and management tool will give commercial products a bigger run for their money courtesy of new features that include role-based access controls, a time task tracking manager, numerous charts and graphs, and support for the Subversion code versioning application. GForge version 4.0, which was released this week, is an offshoot or "fork" of the SourceForge.net (SF.net) application that drives the SourceForge.net open source software repository. The first version of GForge branched from the last open source iteration of SourceForge.net 2.6, while the most recent SF.net enterprise iteration, SourceForge Enterprise Edition (SFEE) 4.1, is a wholly proprietary product owned by VA Software. SourceForge co-founder and GForge Group President Tim Perdue says GForge is already being used by NASA, the U.S. military, the electronics industry, and other sectors. Citing statistics collected from publicly established GForges, Perdue estimates that there are a minimum of 100,000 registered GForge users in existence. "The trick now is to improve the software, so it's genuinely useful and contributing to software development," he explains, adding that his ultimate objective is to turn GForge into something similar to "faceless" network applications such as Apache. Meanwhile, the Free Software Foundation's GNU Savannah Free Software repository is powered by Savane, an SF.net fork distributed by the GNU Project.
    Click Here to View Full Article

  • "Thumb Twiddling on Cybersecurity"
    CNet (10/21/04); Lofgren, Zoe

    Critical infrastructures such as the power grid are interconnected, to the point where a successful cyberattack on one point could disrupt major services and damage economic activity, writes Rep. Zoe Lofgren (D-Calif.), the ranking member of a House subcommittee on cybersecurity. The Department of Homeland Security was to assume responsibility for cybersecurity, but did nothing for months; eventually the National Cyber Security Division was created, but it has limited effectiveness and its director has little power. To help deal with this lack of action, Lofgren has helped introduce legislation in the House of Representatives to create the position of assistant secretary of cybersecurity within the Homeland Security Department. The legislation passed as part of the 9/11 Recommendations Implementation Act, but has not yet made it through the Senate. The position would let the cybersecurity director coordinate with senior Homeland Security personnel, other agencies, and the private sector, and would sit alongside an assistant secretary for physical infrastructure protection, thus integrating protection for both cyber and physical structures. Lofgren writes that "securing our networks must be a priority in our country's overall homeland security strategy...We cannot wait for an 'electronic 9/11' to happen before the Department of Homeland Security focuses its attention on preparing for and responding to threats to our computer networks."
    Click Here to View Full Article

  • "Video Games With a Political Message"
    Chronicle of Higher Education (10/29/04) Vol. 51, No. 10, P. A32; Foster, Andrea L.

    Politically motivated video games distributed online will become a staple of political campaigns in five years' time, says Georgia Tech assistant professor Ian Bogost. Games that require users to work out complex issues are an effective way of persuasion, and Bogost has recently become a celebrity in the emerging field of campaign-related video games, the majority of which are without nuance and simply let supporters bash the opponent. A health-care management game created for the Illinois House Republican Organization, for example, requires players to allocate funds between hospitals and research, and to set policy regarding malpractice damages; if the caps on damages are set too high, hospitals become less effective because doctors leave and the population's health suffers. The game is meant to support a core Republican campaign message in Illinois and help win a majority in that state's House of Representatives. Bogost has also created prominent Democratic games, such as one that educated Howard Dean campaign volunteers working in Iowa on the ins and outs of literature distribution, sign-waving, and door-to-door visits. A game designed for the Democratic Congressional Campaign Committee addresses a real-life problem of enlisting and managing 10,000 activists for different campaigns. Stanford University researcher Henry Lowood says games created by Bogost and colleague Gonzalo Frasca offer insight into political issues similar to editorial pieces. The Democratic campaign game also tracks the demographic information entered by players, which could add analytical value to the campaign effort. Bogost teaches classes in computing and digital media at Georgia Tech, while Frasca is a member of the Center for Computer Games Research at the IT University of Copenhagen.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)

  • "Salary Survey: Nearing the Boiling Point"
    Computerworld (10/25/04) Vol. 32, No. 43, P. 49; Collett, Stacy

    Computerworld's 18th Annual Salary Survey of almost 9,000 IT workers concludes that modest pay and bonus increases, along with record levels of stress, are bringing workers closer and closer to the boiling point. The poll finds that meager salary hikes continued for the third consecutive year, with an average gain of just 3 percent in 2004, compared to a 4 percent national average, according to the Bureau of Labor Statistics. Sixty-five percent of respondents report an increase in base salary over the past year, but 35 percent say their salaries either remained flat or were reduced; bonuses were up by an average of just 1 percent, and 70 percent of polled IT workers do not expect their bonus compensation to change from last year. The survey finds that pay raises were better for IT workers residing in the South Atlantic region, while salaries for workers in the Northeast and on the West Coast remained flat; North Central IT workers, meanwhile, experienced slightly lower than average raises. Eighty-eight percent of respondents--6 percent more than last year--say budget cuts and bigger workloads are causing stress, while 25 percent feel their salaries do not fairly reflect their job duties. Still, 51 percent claim to be satisfied with their pay, while 80 percent say they still derive satisfaction from their career choice. Analyst David Foote says offshore outsourcing was responsible for the 19 percent to 21 percent decline in salary for noncertified application programmers and enterprise application developers that his company has tracked over the past few years. Gartner analyst Linda Pittenger notes that companies are phasing out the "peanut butter method" of spreading bonuses evenly across the enterprise in favor of awarding bonuses to the best-performing staffers; she projects that the mass retirement of IT professionals, along with falling numbers of MIS and computer science graduates, will lead to a slight IT job and pay increase starting in 2008.
    Click Here to View Full Article

  • "In Pursuit of Agility"
    Intelligent Enterprise (10/16/04) Vol. 17, No. 15, P. 20; Cummins, Fred A.

    The enterprise service bus (ESB) can supposedly enable enterprise agility by adjusting processes and organizational architecture without expensive, long, and risky information system overhauls; EDS Fellow and consultant Fred A. Cummins examines the business needs and technical requirements of an ESB, as well as current and future ESB adoption trends. ESB is generally defined as a decentralized function that uses XML technology and Web services while upholding a service-oriented architecture (SOA). Cummins calls ESB a "flexible integration infrastructure capability" that facilitates loosely coupled system interaction on a network through the exchange of messages; the ESB supplies Internet-analogous any-to-any connectivity while providing reliable, secure access to events and delivery of messages via the incorporation of supporting services. Cummins describes the ESB architecture as a collection of standards-based services and middleware, while the core ESB network is usually TCP/IP-based. Business and IT requirements necessary to fulfilling an ESB's basic business need include interoperability with a company's existing enterprise application integration (EAI) facilities; an SOA; authentication and authorization control over the acceptance of messages; business process integration; an adaptive interface provided by middleware; and legacy system adapters, among others. Cummins reports that ESB adoption is in its infancy for a number of reasons, including already existing EAI investments among potential customers and an emphasis on cost reductions and fast return on investment among most corporate IT organizations. For an ESB architecture to reach its full potential, Cummins writes that it will be necessary to devise industry standards for application programming interfaces for the ESB middleware and protocol elements, event registry service, and ESB management protocols.
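
    The loosely coupled, message-based interaction Cummins describes can be reduced to a very small sketch: services subscribe to message types on a bus and never call one another directly. This is an in-process toy, not an ESB product; reliable delivery, security, XML/Web services bindings, and legacy adapters are all left out.

      from collections import defaultdict

      class ServiceBus:
          def __init__(self):
              self.subscribers = defaultdict(list)   # message type -> handlers

          def subscribe(self, message_type, handler):
              self.subscribers[message_type].append(handler)

          def publish(self, message_type, payload):
              # Deliver the message to every registered handler; the sender
              # needs no knowledge of who consumes it.
              for handler in self.subscribers[message_type]:
                  handler(payload)

      bus = ServiceBus()
      bus.subscribe("order.created", lambda order: print("billing saw", order))
      bus.subscribe("order.created", lambda order: print("shipping saw", order))
      bus.publish("order.created", {"id": 42})
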
    Click Here to View Full Article

  • "Getting the Big Picture"
    Science & Technology Review (10/04) P. 6; Walter, Katie

    Clustered computing has dramatically lessened the cost of supercomputing for most applications, but the standalone design of PC graphics cards has prevented commodity clusters from breaking into the graphics and visualization markets. An open-source program from Lawrence Livermore National Laboratory and a group of university collaborators has helped bridge the gap between clustered computing and graphics applications: The Chromium software package allows both serial and parallel graphics applications to generate images in parallel using the OpenGL programming interface, an industry standard for creating two- and three-dimensional graphics. Chromium tackles the all-important issue of scalability, the key element for clustered computing applications, and scales well in terms of data quantity, rendering performance, and the number of pixels in a single image. Since its release in August 2003, the program has been downloaded more than 21,000 times, used to handle data sets of up to 23 trillion bytes, and powered displays as large as 60 million pixels. The software is used in major parallel-computing visualization projects, including IBM's Deep View Visualization, the Deep Vision Display at Boston University, the Cluster Rendering project at Indiana University, and the University of Kentucky's powerwall autocalibration software. Chromium works by intercepting an application's calls to the OpenGL library, so the system is entirely transparent to the graphics application and requires no modification, unlike other parallel graphics products. And because it uses modular stream processing, Chromium has spawned numerous other parallel graphics operations not foreseen by the original developers, including simultaneous remote rendering operations for different devices. The program can also act as a debugger tool for graphics applications or an accelerator for desktop applications.
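
    Chromium itself works at the level of the C OpenGL library, but the interception idea can be conveyed with a short Python analogy: a proxy object stands in for the graphics library and fans each call out to several back-end renderers, so the application needs no modification. The class and method names below are invented for illustration.

      class FanOutProxy:
          # Stand in for a graphics library and forward every intercepted call
          # to each back-end renderer (in a real sort-first/sort-last scheme,
          # only to the nodes that actually need the work).
          def __init__(self, backends):
              self.backends = backends

          def __getattr__(self, name):
              def call(*args, **kwargs):
                  return [getattr(b, name)(*args, **kwargs) for b in self.backends]
              return call

      class FakeRenderer:
          def __init__(self, node):
              self.node = node

          def draw_triangle(self, *vertices):
              return "node %d drew %d vertices" % (self.node, len(vertices))

      gl = FanOutProxy([FakeRenderer(0), FakeRenderer(1)])
      print(gl.draw_triangle((0, 0), (1, 0), (0, 1)))
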
    Click Here to View Full Article


 