HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.

ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 5, Issue 503: Wednesday, June 4, 2003

  • "Telling of Terrorist-Tracking Tech Tools"
    Medill News Service (06/02/03); Wenzel, Elsa

    The Defense Advanced Research Projects Agency (DARPA) recently furnished Congress with a report on its Terrorism Information Awareness (TIA) program, which has come under fire from critics as a tool that would be used to spy on innocent Americans and erode citizens' right to privacy. The report, which DARPA needed to provide in order to reverse a congressional TIA funding freeze (which is still in effect), states that only suspected terrorists would be subject to investigations, which would involve mining various government and business databases of financial, medical, educational, and travel records. Furthermore, the report indicates that cybersecurity would be implemented to ensure that data is only accessible to authorized parties, while TIA operations would be required to observe the Constitutional right to due process and the First Amendment. DARPA's Jan Walker says the Defense Department "intends to continuously monitor and assess emerging potential privacy and civil liberties issues as part of oversight of the TIA program," while agencies that wish to use TIA applications would be subject to a legal review. However, privacy and civil liberties advocates remain unconvinced of the report's sincerity, alleging that DARPA is merely paying lip service to their concerns. Lee Tien of the Electronic Frontier Foundation says the report makes "broad generalizations" about privacy that hardly address civil liberties. The ACLU's Jay Stanley contends that some of the software TIA would incorporate, already common in the private sector, would be abused by federal agencies. Other points of criticism include the TIA's potential for incorrectly identifying innocent people as possible terrorists, due to the lack of a central authority tasked with making sure the information the system exploits is accurate.

    For information regarding USACM's reaction to TIA, visit http://www.acm.org/usacm/Letters/tia_final.html.

  • "Overseas Tech Jobs Proliferate"
    San Francisco Chronicle (06/01/03) P. I1; Zuckerman, Sam; Kirby, Carrie

    The attraction of transferring technology operations overseas where labor is cheaper, especially during a recession, is transforming Silicon Valley and eroding its role in low-end software development. Forrester reckons that 3.3 million service-sector positions--approximately 473,000 computer industry jobs among them--will migrate to nations such as India, the Philippines, China, and Russia by 2015. Marc Hebert of Sierra Atlantic predicts that in several years' time 50% of all Silicon Valley software firms will keep only 20% of their technical personnel in the United States, while the 80% that handle software support and maintenance will move offshore. Although saving money is the primary reason companies are moving tech projects overseas, another major lure is the increasing proficiency and productivity of foreign computer scientists and engineers. Opponents of offshore outsourcing such as Marcus Courtney of the Washington Alliance of Technology Workers argue that the practice is "a serious economic threat to American workers," while advocates counter that outsourcing is an inevitable consequence of the global economy, one that gives American businesses room to expand both inside and outside the United States. Experts expect companies to retain advanced development work in the United States--especially critical software projects that rely on heavy collaboration--while more mundane maintenance and testing chores are exported. Meanwhile, Ravi Kumar of the University of Southern California attributes most current outsourcing projects to "a fad mentality."
    Click Here to View Full Article

  • "Honors Program Showcases IT's Role in Making a Better World"
    Computerworld (06/03/03); Thibodeau, Patrick

    Winners of Tuesday's Computerworld Honors awards stressed their dedication to improving people's lives through information technology. E! Entertainment Television received an award in the Media, Arts, & Entertainment category for a digital asset management system that enables users to search and view over 400,000 videotapes in a low-resolution format. Accepting the award, E!'s Patrick Bennett told attendees, "I see this less as a crowning achievement than as a kind of challenge of what we can do to help people through technology and live up to all those folks who have given their lives that we might innovate and offer technological solutions to help society." American Express Corporate Travel Solutions won the Transportation honor for developing a centralized, Internet-based system that facilitates real-time collaboration between travelers and agents at any location; CTO Michael Laughlin noted that the system not only boosts convenience and enhances the organization of travel itineraries, but can keep track of travelers, an important consideration in the event of emergencies. The 9/11 attacks and their aftereffects became a catalyst for some honorees: The 2001 USA Patriot Act spurred Sumitomo Mitsui Banking to devise an Internet-based system to search for people on government terrorism lists and to boost its own ability to detect fraud. The winner in the Business & Related Services category was Wireless & Satellite Networks, which deployed a wireless network that provides cheap Wi-Fi access to residents of Zamora, Spain. Japan's Earth Simulation Center earned an Achievement award in the Environment, Energy, & Agriculture category for a supercomputer designed to model natural phenomena such as earthquakes and man-made global changes.
    Click Here to View Full Article

  • "Security Standards Could Bolster File-Sharing Networks"
    New Scientist (06/03/03); Knight, Will

    Harvard University researchers postulate that anti-copying safeguards suggested by the Trusted Computing Platform Alliance (TCPA), ostensibly to curb digital piracy, could actually make peer-to-peer (P2P) file-sharing programs stronger. The TCPA proposes, among other things, that hardware and software incorporate cryptographic signatures that could form the basis of new audio and video schemes and systems that limit the duplication of digital content. However, Michael Smith and his Harvard colleagues say such signatures could act as a user authentication tool that fortifies P2P networks by effectively barring outsiders from disrupting network traffic or eavesdropping on users' file-sharing. The researchers explain that a small group of users will always be able to copy content directly from a computer's output, a critical factor in the control of content distribution via P2P networks. Microsoft has announced plans to issue a version of Windows that will work in concert with the TCPA copy-control technology, but has promised that the standard shall remain open. This would enable software engineers to write P2P software that confirms the trustworthiness of all network users, according to Smith. Chris Lightfoot of the Campaign for Digital Rights cautions that TCPA technology could place restrictions on users' computer capabilities, and result in privacy infringement, an inhibition on open standards, and a cessation of innovation.
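    The attestation the TCPA proposes is hardware-based and asymmetric, but the authentication idea the Harvard researchers describe--admitting only traffic that provably comes from trusted peers--can be illustrated with a deliberately simplified symmetric stand-in. The shared key and function names below are hypothetical, standing in for a hardware-attested credential:

    ```python
    import hashlib
    import hmac

    # Hypothetical shared credential; in a TCPA-style design this would be
    # a private key held in hardware, not a secret distributed to peers.
    NETWORK_KEY = b"trusted-peer-credential"

    def sign_message(key: bytes, message: bytes) -> bytes:
        # Authentication tag over the message; only key holders can produce it.
        return hmac.new(key, message, hashlib.sha256).digest()

    def accept(key: bytes, message: bytes, tag: bytes) -> bool:
        # Constant-time comparison rejects forged or tampered traffic
        # before it can disrupt the network or eavesdrop on sharing.
        return hmac.compare_digest(sign_message(key, message), tag)
    ```

    The point of the sketch is only that unauthenticated packets are rejected outright; a real attestation scheme would have each peer present a signature verifiable against a public key rather than a shared secret.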

  • "Pentagon's Super Diary Project Could Put Powerful Software in Private Hands"
    Associated Press (06/03/03); Sniffen, Michael J.

    Pentagon documents state that the goal of the Defense Advanced Research Projects Agency's (DARPA) LifeLog project is to develop software that deduces behavioral patterns from monitoring people's daily activities, and DARPA officials say the initiative could be used to improve military training as well as the memory of military commanders. LifeLog volunteers would be equipped with cameras, sensors, and microphones to record everything they feel, everything they do, and everywhere they go; the research is not classified, which means that LifeLog software could eventually be made available to private companies. According to the Pentagon documents, the LifeLog software would not just file geophysical and vital readings, but also emails, instant messages, phone calls, voice mails, snail mail, faxes, and Web-based transactions, as well as links to every radio and TV broadcast the subject hears and every publication, Web site, or database he or she sees. The Center for Democracy and Technology's James X. Dempsey is concerned that such a tool could impact privacy: he notes that the government can easily get hold of the voluntarily collected information with a search warrant, as well as take such data from third parties via request or subpoena. There are also unanswered questions about how data culled from LifeLog software would be interpreted by government agencies and private organizations, not to mention whether the system will include adequate safeguards to shield Americans from errors. DARPA insists that LifeLog will not be used for clandestine surveillance, and the agency's Jan Walker says there is no relationship between LifeLog and the Pentagon's Terrorism Information Awareness project.
    Click Here to View Full Article

  • "Imagine Machines That Can See"
    Wired News (06/04/03); Baard, Mark

    Biomimetics researchers are designing robots programmed to operate the same way biological systems do, and a major biomimetics push involves the development and refinement of robot vision. Boston University's Active Perception Lab is working on a system that mimics the small movements of the human eye to enhance depth perception and improve robot navigation and object manipulation, explains lab director Michele Rucci. She and researcher Gaelle Desbordes employed computers and an eye-tracking device to verify that visual sensitivity as well as 3D information collection are augmented by eye jitter; they discovered, for example, that visual sensitivity falls as much as 20 percent when small eye movements are eliminated. Meanwhile, Iguana Robotics CEO M. Anthony Lewis is attempting to give robots a more natural response to the obstacles they come across. His company is developing an ambulatory machine that uses artificial neurons and a sight-directed navigation system. For instance, if the robot stumbles over an object in its path, it associates the collision with an image of the object so it can remember to bypass the obstacle the next time. "Getting limbs to behave without conscious thought and under visual guidance, as they do in humans, remains a challenge," notes Lewis, who hopes Iguana's research will yield a machine capable of spontaneous adaptation to changing environmental conditions. Such a breakthrough will require systems with the capacity for chaotic rather than deterministic cogitation.
    Click Here to View Full Article

  • "Corporate Computing Tries to Find a New Path"
    Financial Times-IT Review (06/04/03) P. 1; Waters, Richard

    The IT explosion and its promised transformation of business have been victims of their own hype, as demonstrated by today's unwieldy, overcomplicated corporate information systems. "It's almost as if the technology took over," observes IBM's Irving Wladawsky-Berger, who notes that the arrival of client/server architecture in the form of the desktop PC has empowered individual users and IT managers but done little for the overall enterprise. Contributing to the current situation is a lack of interoperability between applications and a failure to bring technology in line with business management. As a result, corporate IT projects have an extremely high failure rate, the costs of corporate information system maintenance are exorbitant, and companies are not fully tapping their IT assets. Corporate leaders have started to doubt that technology truly offers businesses a competitive advantage, particularly if the technology is also available to their rivals. The technology industry believes that the enterprise will truly be transformed once corporate IT systems are perfected and seamlessly integrated. This will involve the development of next-generation IT systems characterized by self-maintenance, interoperability via industry standards, digital asset virtualization, and utility computing. Making such systems a reality will require tech companies to convince IT buyers that they will deliver on their promises.
    Click Here to View Full Article
    (Access for paying subscribers only.)

  • "New Software Helps Teams Deal With Information Overload"
    EurekAlert (06/04/03)

    Collaborative Agents for Simulating Teamwork (CAST), a software program co-developed by John Yen of Penn State University's School of Information Sciences and Technology, is designed to improve teams' decision-making process and augment cooperation between members by focusing on relevant data. "CAST provides support for teams by anticipating what information team members will need, finding commonalities in the available information, and determining how that information should be processed," explains Yen. He believes the customizable software could help military officers handle the hundreds of thousands of ground-sensor and satellite readings they receive every hour, and more rapidly adjust to changing battlefield situations. Yen adds that disease epidemics and potential terrorist threats could also be monitored with CAST. The software incorporates what Yen calls "shared mental models" about team goals, team process and structure knowledge, and postulations about problems. "A computer program that acts as a team member may be more efficient in processing information than a human teammate," he notes. The software's development has been financed under the Department of Defense's Multidisciplinary Research Program of the University Research Initiative, which splits the grant between Penn State, Texas A&M, and Wright State University. Yen, who started working on CAST when he was at Texas A&M, continues to refine the software with Texas A&M collaborators Michael Miller and John Volz.

  • "Pentagon Launches Internet Voting Effort for Overseas Americans"
    MSNBC (06/03/03); Boyle, Alan; Meeks, Brock N.

    American civilians and military personnel living overseas will be able to vote electronically in the 2004 elections through the Secure Electronic Registration and Voting Experiment (SERVE), according to a Pentagon announcement on Monday. Voters will receive digital signatures that will confirm their identity as they cast their votes over the Internet. Polli Brunelli of the Federal Voting Assistance Program (FVAP), which is overseeing SERVE, declared in a written statement that the project involves close collaboration between the Pentagon and state and local election officials. "They have to buy in on this thing, because they're the ones doing the counting and [have to] make sure it's 100 percent secure," explained the Pentagon's Glenn Flood. The eligibility of overseas Americans to use the e-voting system will depend on whether their home states and counties are participating in the program. Residents of Arkansas, Hawaii, North Carolina, South Carolina, Pennsylvania, Utah, Florida, Washington, Ohio, and Minnesota are expected to be eligible, although Washington is among several states that still need to enact legislation before their SERVE participation is approved. Flood said that 6 million voters could potentially participate in SERVE, although actual figures are difficult to estimate. SERVE builds on the government's public key infrastructure (PKI) program, which provides digital signatures to federal employees.

  • "Industrial Evolution"
    CNet (06/02/03); Frauenheim, Ed

    The life sciences field is investing heavily in IT equipment and services in order to handle a flood of biosciences data as well as make the pharmaceutical development process more efficient. International Data (IDC) estimates that the biosciences industry's IT expenditures will skyrocket from $12 billion in 2001 to $30 billion in 2006, while IDC's Debra Goldfarb predicts that the next 10 years will witness "the complete transformation [of the industry] to very database-intensive as opposed to wet-lab intensive." Dr. Michael Marron of the National Institutes of Health says that data generated by biomedical research has been growing 100 percent every 15 months since the human genome was decoded; meanwhile, the pharmaceutical industry has been under the gun to develop profitable new drugs since Lehman Brothers and McKinsey & Co. issued a report two years ago anticipating an unprecedented number of drug patent expirations by 2011. Conventional scientific experimentation is expected to transition to a mostly computerized methodology in which experiments are run "in silico." IT vendors large and small are entering the biosciences arena to reap profits: IBM has announced two grid computing products geared toward the biomedical field, while Intel is working on computer architectures to accommodate increasing amounts of research data. Customers of Structural Bioinformatics can purchase a database of 3D protein structures produced by labwork and computer simulation, as well as software designed to manage this data; the company is also devising algorithms that can mine medical data in order to recognize the onset of diseases. The advances needed to realize the biosciences field's goals may be hampered by the economic slump and industry consolidation. A public backlash based on the ethical questions surrounding such advances could also be a formidable barrier.

  • "Superhero Server Takes War on Hackers to Mythic Level"
    NewsFactor Network (06/03/03); Martin, Mike

    The Hydra server operating system is impervious to electronic attack by viruses or hackers, according to Bodacion Technologies. Former Motorola software engineers Eric Uner and Eric Hauk say Hydra is written completely from scratch and has several innovative technologies that make it impossible for hackers to crash. The Hydra kernel loads from embedded flash memory and checks for viruses each time it is loaded, while the RAM is continually monitored for viruses and other corruptions. Uner says other operating system kernels such as Linux, BSD, Unix, and Windows are built using fundamentally weak security models. Accenture analyst Denise Fillo notes that Bodacion's Hydra server is undergoing preliminary testing at NASA, the Federal Aviation Administration, and the National Security Agency. Uner explains that Hydra employs a mathematical model based on chaos theory--which also governs random, organic growth in living things--so that no session ID, customer ID, or other digital identifier can be deduced by a hacker. In an open challenge to hackers last year, Bodacion offered $100,000 to anyone who could compromise Hydra's security. None of the 200,000-plus participants could penetrate the system, though Steve Chapin of the Syracuse University Center for Systems Assurance says the most talented hackers do not reveal their secrets by taking part in advertised hacks. Still, Chapin, an associate professor of electrical engineering and computer science, says that "it's possible what they say is correct," although he says further evidence is needed to prove Bodacion's claims.
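    Whatever one makes of the chaos-theory framing, the requirement it addresses--session and customer identifiers that no observer can deduce--is conventionally met by drawing IDs from a cryptographically secure random source rather than any predictable sequence. A minimal Python sketch of the contrast (the function names are illustrative, not Bodacion's):

    ```python
    import secrets

    # Predictable scheme: an attacker who sees "session-41" on the wire
    # can trivially guess the next identifier.
    def next_sequential_id(counter: int) -> str:
        return f"session-{counter + 1}"

    # Conventional fix: 128 bits from a cryptographically secure generator,
    # so observing any number of past IDs reveals nothing about future ones.
    def new_session_id(n_bytes: int = 16) -> str:
        return secrets.token_hex(n_bytes)
    ```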

  • "In Future, Foot Soldier Will Be Plugged Into a Massive Network"
    Associated Press (06/02/03); Regan, Michael P.

    The "Scorpion ensemble" under development at the U.S. Army Soldier Systems Center is intended to increase mobility by reducing the amount of equipment troops must carry while boosting both safety and effectiveness in the field through numerous technological enhancements wired directly into the uniform. The ensemble will connect the wearer to the Future Combat System, a planned network of satellites, unmanned planes, and robot vehicles that provides soldiers with reconnaissance data in order to determine their location and coordinate military operations. Dennis Birch of the Army center's Objective Force Warrior technical program reports that developers are trying to enable soldiers to interact with the system through voice command, although a control panel integrated into the sleeve is also under consideration. The Scorpion ensemble will incorporate a sensor-studded undershirt to monitor the wearer's vital signs; built-in tourniquets that could expand or contract via remote control; an armored load carriage that stores circuitry, batteries, water, and ammunition; and a helmet equipped with cameras to capture concealed enemies, screen attachments to display camera images as well as Future Combat System telemetry, and a laser-engagement system that can distinguish between friends and enemies. Birch notes that new devices can be added to the ensemble as they are developed, and future enhancements in the planning stages include advanced camouflage. Meanwhile, Paula Hammond of MIT's Institute for Soldier Technologies notes that projects are underway to develop exoskeletons that boost soldiers' speed and strength, and apparel that can protect wearers from chemical agents or treat injuries.
    Click Here to View Full Article

  • "Clash Over Java Standard Heats Up"
    CNet (05/30/03); LaMonica, Martin

    JBoss Group, a small firm whose open-source server software and programming tools capitalize on the Java 2 Enterprise Edition (J2EE) standard, is being accused by Sun Microsystems of abusing the J2EE brand. Sun says the open-source, aspect-oriented JBoss application server is billed as supporting the J2EE specification, but in fact has not been tested for compliance; companies that adopt JBoss, Sun warns, will eventually end up with applications that are not portable to other systems. JBoss President Marc Fleury claims Sun has deliberately stalled the company's efforts to buy the testing suite license, though he admits another hindrance is the cost of the tests, which involves programming for features that are hardly used. JBoss' "Beyond J2EE" strategy aims to allow programmers to code Java aspects, or applications, without having to know details about the J2EE specification or having to write their own Java application programming interfaces. The software is proving popular with cost-conscious companies that want to move away from Sun ONE or BEA Systems WebLogic products, notes Fleury. In version 4.0, only about 20 percent of JBoss code is actually derived from J2EE, he says. Forrester Research analyst Ted Schadler says the dispute is of more legal import than technical consequence, since J2EE compliance, though important, is just one factor buyers consider when acquiring new technology.

  • "Net Attack Overwhelms Computers With Complexity"
    New Scientist (06/02/03); Knight, Will

    Rice University researchers Scott Crosby and Dan Wallach have outlined a form of cyberattack that can put Web-connected computers out of commission by sending carefully crafted data packets whose contents all collide in a program's internal hash tables, degrading lookups to their worst case and eating up almost all of the machine's processing power. The researchers tested their technique on commercially available programs, and were able to successfully knock them out with a dial-up modem connection. Wallach says the only real defense against such an assault is to use more agile hashing algorithms that "are taught only in advanced computer science theory classes and rarely appear in commercial software." However, Phil Huggins of Information Risk Management explains that programmers may not be easily convinced to use advanced algorithms because such functions may be less efficient when it comes to processing regular data. Crosby and Wallach will deliver a paper on the attack methodology at the Usenix Security 2003 conference in August. Although Wallach realizes that the disclosure may spur hackers to exploit the method, he stresses that making people aware of the problem is more important. "The only way to fix a problem that affects so many different systems in different ways is to widely publicize the problem," he insists. Wallach also notes that the software engineers responsible for the programs the Rice researchers used in their attack tests were alerted to the vulnerability several months ago.
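    The mechanics can be sketched in a few lines: a chained hash table with a predictable hash function lets an attacker precompute keys that all land in one bucket, turning near-constant-time lookups into linear scans. The toy table below is illustrative, not taken from the Crosby-Wallach paper:

    ```python
    from itertools import permutations

    class ChainedHashTable:
        def __init__(self, n_buckets: int = 1024):
            self.buckets = [[] for _ in range(n_buckets)]

        def _hash(self, key: str) -> int:
            # Deliberately weak, predictable hash: sum of byte values.
            # An attacker who knows it can precompute colliding keys.
            return sum(key.encode()) % len(self.buckets)

        def insert(self, key, value):
            bucket = self.buckets[self._hash(key)]
            for i, (k, _) in enumerate(bucket):
                if k == key:
                    bucket[i] = (key, value)
                    return
            bucket.append((key, value))

        def lookup(self, key):
            # Cost is proportional to bucket length: O(1) on average,
            # O(n) when every key is forced into one bucket.
            for k, v in self.buckets[self._hash(key)]:
                if k == key:
                    return v
            return None

    def colliding_keys(n: int):
        # Permutations of the same letters share a byte sum, so they
        # all hash to the same bucket under the weak function above.
        return ["".join(p) for p in list(permutations("abcdefgh"))[:n]]

    table = ChainedHashTable()
    attack_keys = colliding_keys(1000)
    for k in attack_keys:
        table.insert(k, True)
    # Every crafted key sits in a single bucket; each lookup now scans
    # up to 1,000 entries instead of roughly one.
    worst_bucket = max(len(b) for b in table.buckets)
    ```

    The "agile" defense Wallach alludes to is universal (keyed) hashing: mixing a secret random seed into the hash function so an outsider cannot predict which inputs collide.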

  • "The Hidden Cost of Software"
    EarthWeb (05/29/03); Chou, Tim

    Estimating software's total cost of ownership (TCO) for a company can be a challenging proposition, given the many factors involved, writes Tim Chou. Maintenance costs, which can often surpass the purchase price of the software itself, also need to be gauged, studied, and reduced through an industry-standardized model. According to a recent Commerce Department study cited by Chuck Phillips of Morgan Stanley, labor costs account for the bulk of software TCO, with 37 percent spent to support internal IT personnel and 33 percent going to outside implementation consulting, while software license expenditures constitute a mere 30 percent. Companies also spend heavily on system integrators and consultants to handle upgrades and malfunctions that are beyond the scope of in-house staff, with upgrading especially critical for competing in a global economy. Nevertheless, Chou notes that IT groups are reluctant to upgrade because updating one software component can sometimes entail a complete system reconfiguration. Outsourcing to independent software vendors (ISVs) is becoming more and more palatable to firms that need to resolve security issues as well as cut costs, enhance services, and focus exclusively on core business. ISVs are particularly advantageous because they boast unmatched software knowledge, brand reputation, and the ability to manage a large user base via self-service, universal patches, and homogenized processes; however, Chou observes that outsourcing deals are often one-off agreements that take control away from customers and set up a conflict of interest. The Capability Maturity Model (CMM) co-developed by Carnegie Mellon University and a software makers' organization offers a template for evaluating and measuring a company's software management, writes Chou. CMM supports a four-step improvement scheme in which a software process is made repeatable and consistent; specialization is attained by establishing a system of checks and balances; standardization is reached by making specialized processes repeatable; and this leads to the fourth step, automation.
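    The cost split Phillips cites implies that license fees understate total spend by more than a factor of three. A back-of-the-envelope sketch (the $300,000 license figure is hypothetical, not from the article):

    ```python
    # Shares of software TCO from the Commerce Department study cited above.
    LICENSE, CONSULTING, INTERNAL_IT = 0.30, 0.33, 0.37

    license_cost = 300_000  # hypothetical license outlay
    total_tco = license_cost / LICENSE  # license is only 30% of the whole
    breakdown = {
        "software license": license_cost,
        "outside consulting": total_tco * CONSULTING,
        "internal IT staff": total_tco * INTERNAL_IT,
    }
    ```

    A $300,000 license thus implies roughly $1 million in total ownership cost, with $700,000 of it going to labor rather than software.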

  • "The Next Ethernet"
    IDG News Service (06/02/03); Lawson, Stephen

    Even as Ethernet pioneers celebrate the technology's 30th anniversary, quiet talk is going on about its ambiguous future. Ethernet is now available at speeds up to 10 Gbps, and past trends have indicated a tenfold speed increase every four years, according to Bob Grow, chair of the IEEE 802.3 working group that oversees Ethernet standards. Still, Grow says no formal call for interest has been issued for 100-Gigabit speeds, and even 10-Gigabit Ethernet is not widely deployed today. Burton Group analyst David Passmore says other parts of the infrastructure, such as OC-768 WAN links, would match 40-Gigabit Ethernet better than 100-Gigabit Ethernet, and that prior work on those specifications would help in the design of new equipment. Cisco Systems' Luca Cafiero estimates 40-Gigabit Ethernet could be ready in two years. Atrica's Nan Chen, who helped write the Fast Ethernet standard in the 1990s, counters that groups needing 40 Gbps could combine four 10-Gigabit Ethernet connections through a link aggregator rather than wait for a new interface; for that reason, he expects IEEE work to begin on 100-Gigabit Ethernet next year, with products released by 2006. But while aggregating lines makes some sense in an office network, it becomes more difficult in the places where Ethernet is increasingly deployed, such as WANs and data carrier networks. Server interconnect technology is only now catching up to 10-Gigabit Ethernet, but a faster Ethernet could find a ready market in carrier infrastructure and at the core of campus networks. In any case, Grow expects sharp debate over how big a step the next Ethernet should take.

  • "Enclosing the Digital Commons"
    Information Today (05/03) Vol. 20, No. 5, P. 37; Poynder, Richard

    Experts are worried that the Internet's development and benefits are endangered by people trying to enforce patents on widely used Web methods, such as hyperlinking and the GIF file algorithm. "There is a tension between the view that says that the Internet should be free, and one that holds that anyone who enhances the Web should be able to patent their work and obtain revenue from it," observes John Collins of Marks & Clerk. Some experts claim that the threat posed by Internet-related patents is overblown: For instance, patent holders' demands for remuneration are often dismissed in court because their ownership is usually found to be invalid. But the fear of privatization extends to all categories of Web-related intellectual property, such as domain names and, perhaps more importantly, copyrighted content. Critics charge that the value of the hyperlink will seriously decline if such developments continue. Also distressing for critics is copy-control legislation such as the Digital Millennium Copyright Act (DMCA) and copyright extensions that could block the release of much copyrighted material into the public domain, including publicly funded research documents. Some publishers are trying to appropriate not the free information itself, but rather the metadata and other tools needed to access it, which can be especially frustrating for users if the data is unavailable elsewhere in either digital or non-digital format. On the other hand, Collins believes the Internet is simply evolving into an architecture that offers a core tier of free data and a second tier of premium features and services.

  • "Are You Ready for Social Software?"
    Darwin (05/03); Boyd, Stowe

    A Working Model managing director Stowe Boyd predicts that social software will effect dramatic changes in businesses' marketing strategies and customer interactions, and transform internal and external communication and collaboration. Boyd expects the purpose of social software to be the reverse of that of conventional groupware and other project- or organization-centered collaborative tools: social software will support individuals' desire to combine into groups to pursue personal interests. Groupware and traditional software follow a top-down model that imposes a larger system (an organization or project) on individuals, while social software follows a bottom-up approach oriented around individuals, who establish relationships with others based on their personal goals, preferences, and connections; this intercommunication sets the foundation for a network of groups. Boyd defines social software as that which supports conversation between individuals or groups, social feedback, and social networks. The second capability enables a digital reputation to be built through group ratings of individual contributions, while the third allows people to digitally express their personal relationships and lay the groundwork for new relationships. Boyd reasons that social software is now poised to take off because low-cost, high-bandwidth tools such as blogs and networking systems like Ryze and LinkedIn are available. Boyd writes that this availability, "when coupled with the critical mass of millions of self-motivated, gregarious and eager users of the Internet, means social software is certain to make it onto 'the next big thing' list."

  • "Maximize Color and Contrast in Multimedia Images"
    Photonics Spectra (05/03) Vol. 57, No. 5, P. 74; Connolly, Christine

    The goal of color management is to develop methodologies that allow multiple digital devices to accurately reproduce colors. The International Color Consortium (ICC), founded 10 years ago by Eastman Kodak, Apple, Adobe, Microsoft, Sun Microsystems, Silicon Graphics, and Agfa, was established to devise a global standard for digital color classification, and the ICC profiling system is used by many professionals today. ICC profiling involves the use of tables to transform images to a profile-connection space; image rendering can be altered through profile editing. The ICC Web site offers information about color management's theoretical and practical aspects, while ICC member companies supply a range of color-management products for professionals. A number of software programs--ColorFlow from Eastman Kodak, for instance--provide ICC profile customization. Because equipment behavior changes with age, ICC profiling makes it the user's responsibility to calibrate devices regularly and to use appropriate profiles so that acceptable color levels are maintained. sRGB, which is deployed on the Internet and within practically all computer input and output devices, was developed for the consumer market as an inexpensive default color-management scheme. Other products, QuarkXPress and Photoshop among them, use the profiles to facilitate interaction between input and output devices. Such tools enable color laboratories to improve the quality of color reproduction, particularly when images are rendered in variable media.
    Click Here to View Full Article
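    The sRGB default mentioned above is defined by a published transfer curve (IEC 61966-2-1), which a color-management pipeline applies when converting between display-encoded values and the linear light used in a profile-connection space. A sketch of the two directions, with channel values assumed to lie in [0, 1]:

    ```python
    # Piecewise sRGB transfer function: a short linear segment near black
    # (avoiding infinite slope at zero) joined to a gamma-2.4 power curve.
    def srgb_to_linear(c: float) -> float:
        if c <= 0.04045:
            return c / 12.92
        return ((c + 0.055) / 1.055) ** 2.4

    def linear_to_srgb(c: float) -> float:
        if c <= 0.0031308:
            return c * 12.92
        return 1.055 * (c ** (1 / 2.4)) - 0.055
    ```

    A pipeline converts to linear light before blending or resampling, then back to sRGB for display, which is why the two directions must round-trip exactly.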
