Volume 5, Issue 455: Friday, February 7, 2003
- "Bush Orders Guidelines for Cyber-Warfare"
Washington Post (02/07/03) P. A1; Graham, Bradley
Bush administration officials say the president has signed a secret order outlining a strategy for cyber-strikes on the computer networks of America's foes. When asked whether such a strategy would be employed if military action in Iraq moves forward, one official said only that "all the appropriate approval mechanisms for cyber-operations would be followed." Last month, White House officials convened a meeting at MIT to consult with experts outside the federal government, but several attendees worried that an American-directed cyberattack would invite retaliation in kind, which could be devastating given the country's reliance on computer networks. One key issue a cyberattack strategy would need to account for is the possibility of collateral damage, in which an enemy nation's civilians are affected by the assault, noted former White House cybersecurity czar Richard A. Clarke. "There also is an issue, frankly, that's similar to the strategic nuclear issue which is: Do you ever want to do it? Do you want to legitimize that kind of weaponry?" he noted. Maj. Gen. James David Bryan of the Joint Task Force on Computer Network Operations, which was organized to formulate military cyber-warfare and cyber-defense strategies, says the purpose of his group is threefold--to gain an understanding of cyber-weapons and their effects through experimentation; to incorporate such weapons into the U.S. arsenal; and to train a corps of cyber-soldiers. Since the effects of many cyber-weapons are still not fully known, the Pentagon's Joint Staff has issued classified "rules of engagement" under which a cyberattack cannot be launched without top-level authorization. Four years ago, the Pentagon's legal counsel released a document suggesting that cyber-warfare should follow the "law of war" principles of proportionality and discrimination.
- "NSF Panel Recommends $1B Annually For Cyberinfrastructure"
Datamation/IT Management (02/05/03); Shread, Paul
A report from the National Science Foundation's (NSF) Blue Ribbon Advisory Panel on Cyberinfrastructure recommends that the organization spend $1 billion a year on cyberinfrastructure development so that it can take advantage of a "once-in-a-generation opportunity" to revolutionize science and engineering. "We're at a new threshold where technology allows people, information, computational tools, and research instruments to be connected on a global scale," observes Dan Atkins of the University of Michigan School of Information, who acts as chair of the advisory committee. The Panel calls for the establishment of an Advanced Cyberinfrastructure Program that would be tasked with building, implementing, and applying a national cyberinfrastructure in order to "radically empower" scientific and engineering research. Cyberinfrastructure promises to transform the abilities and broaden the skills of engineers and scientists, according to the report. Paving the way for this cyberinfrastructure-driven revolution are grass-roots efforts in the science and engineering research community backed by NSF and other agencies, examples of which include the Grid Physics Network (GriPhyN) and the Network for Earthquake Engineering Simulations (NEES). The Panel also argues that more basic computer science and engineering research is critical, because off-the-shelf technology is not sufficient to build the required cyberinfrastructure. The Advanced Cyberinfrastructure Program would be an interagency effort coordinated internationally, but the report urges the NSF to act quickly, or face fundamental technological incompatibilities that could hinder interdisciplinary collaboration, data archiving, and other critical cyberinfrastructure projects.
- "Cascading Failures Could Crash the Global Internet"
NewsFactor Network (02/06/03); Martin, Mike
Arizona State University scientists Adilson Motter and Ying-Cheng Lai report that the global Internet could be brought down by hackers who target specific network nodes, triggering a cascade of overload failures. Although the Internet is composed of millions of computers, most of the data traveling across the Web is transmitted by a few thousand machines, according to Motter; these computers are highly important nodes because of the exceptionally large amount of information they are loaded with, so attacking them could conceivably cause a complete network failure. "Eliminating those [central] nodes is likely to cause subsequent failures and generate a cascade, while eliminating peripheral nodes will have little effect," posits Columbia University sociology professor Duncan Watts. Motter and Lai explain that key nodes are especially vulnerable to attack because of the Internet's highly heterogeneous load distribution. For hackers, identifying the higher-load network nodes is simply a matter of measuring network behavior, says Motter. He adds that extra protection should be deployed for these nodes, as should intelligent operations that can redistribute load if the nodes fail. Los Alamos National Laboratory researcher Zoltan Toroczkai notes that networks can be made more homogeneous and less vulnerable to hackers if nodes support a limited number of links. Motter and Lai's findings are presented in a paper published in Physical Review E.
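The dynamic described above can be illustrated with a toy load-redistribution cascade. This is a deliberate simplification of the Motter-Lai model (which computes loads from shortest-path betweenness); here node degree stands in as the load proxy, and the network shape, tolerance parameter, and all names are illustrative:

```python
import random

def build_hub_network(n_hubs=3, n_leaves=60, seed=1):
    """Heterogeneous network: a few interconnected hubs, many peripheral leaves."""
    rng = random.Random(seed)
    edges = set()
    hubs = list(range(n_hubs))
    for leaf in range(n_hubs, n_hubs + n_leaves):
        edges.add((rng.choice(hubs), leaf))   # each leaf attaches to one hub
    for i in hubs:
        for j in hubs:
            if i < j:
                edges.add((i, j))             # hubs interconnect
    return edges

def cascade(edges, start, tolerance=0.2):
    """Remove `start` and redistribute its load to surviving neighbors; any node
    pushed past its capacity (1 + tolerance) * initial_load fails in turn.
    Returns the total number of failed nodes."""
    neighbors = {}
    for a, b in edges:
        neighbors.setdefault(a, set()).add(b)
        neighbors.setdefault(b, set()).add(a)
    load = {v: float(len(nb)) for v, nb in neighbors.items()}  # degree as load proxy
    capacity = {v: (1 + tolerance) * l for v, l in load.items()}
    failed = {start}
    frontier = [start]
    while frontier:
        nxt = []
        for v in frontier:
            survivors = neighbors[v] - failed
            share = load[v] / max(1, len(survivors))
            for u in survivors:
                load[u] += share
                if load[u] > capacity[u]:
                    failed.add(u)
                    nxt.append(u)
        frontier = nxt
    return len(failed)

edges = build_hub_network()
hub_damage = cascade(edges, start=0)    # attack a central hub
leaf_damage = cascade(edges, start=10)  # attack a peripheral node
```

With these parameters, knocking out the hub drags down its overloaded neighbors, while removing a peripheral node leaves the rest of the network standing, mirroring Watts' point about central versus peripheral nodes.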
- "Bush Data-Mining Plan in Hot Seat"
Wired News (02/06/03); Scheeres, Julia
The Total Information Awareness (TIA) project, which would use data-mining technology to search public and private databases as well as the Internet for signs of terrorist activities, has spurred grass-roots organizations to mobilize and call for more oversight of the initiative. Representatives of these organizations, which range from the left-wing ACLU to the right-wing Eagle Forum, held a press conference on Feb. 5 to promote legislation for a moratorium on TIA funding until the program's potential for abuse has been thoroughly investigated. Critics have charged that the TIA is little more than a surveillance tool designed to spy on innocent U.S. citizens through their financial, medical, travel, and educational transactions. The Senate unanimously passed an amendment from Sen. Ron Wyden (D-Ore.) last month to halt TIA funding until the Bush administration furnishes a detailed study addressing how the project could affect civil liberties. The Feb. 5 press conference also took the opportunity to criticize a proposed central database that would encompass Americans' personal information and transactions. "The mere gathering of this information is a risk," declared the Association for Computing Machinery's Barbara Simons, who added that such a database would be an invitation to hackers as well as terrorists hoping to commit identity theft. The ACM submitted a letter to Congress last month that raised doubts about the TIA's effectiveness at preventing terrorist acts.
Barbara Simons is co-chair of ACM's U.S. Public Policy Committee. For more
information about USACM, visit http://www.acm.org/usacm.
- "What Are the Chances?"
New York Times (02/06/03) P. E1; Schiesel, Seth
Evaluating the risk of "low-probability, high-consequence events"--natural disasters, nuclear accidents, and spacecraft catastrophes, for example--lies at the core of probabilistic risk assessment. Thanks to the availability of conceptual and computing tools, the technique is now used by mathematicians, engineers, insurance executives, businesses, and federal agencies, and recent gains in computing power have boosted users' confidence in these methods. Probabilistic risk assessment relies on mathematics to measure the odds of a specific outcome using what is known or estimated about the myriad variables that might contribute to that outcome. A NASA consultant used probabilistic risk assessment in 1995 to determine that the odds of a catastrophic space shuttle failure were 1 in 145; similar techniques are used to make nuclear labs safer, gauge the health risks posed by toxic-waste sites, determine how safe and reliable cars and aircraft are, estimate insurance rates, and weigh the odds of terrorist attacks. For example, insurance companies employ probabilistic modeling to simulate how a hurricane might behave based on historical data in which a dozen variables--frequency, size, intensity, etc.--are involved. The 5,000 to 10,000 potential storm patterns that emerge are tested randomly on models of the properties insured by a specific firm, a process known as Monte Carlo analysis. Using probabilistic risk assessment in industrial situations is even more complicated, because the variables can number in the thousands, tens of thousands, or even hundreds of thousands; rather than referring to a historical database, an engineer, for example, must use a computerized model to assess the physical and electromagnetic traits of each component in the machine he is designing prior to probabilistic analysis. The best application for industrial probabilistic models is in the design phase rather than after the machine or product has been put into operation.
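A stripped-down sketch of the Monte Carlo step described above: sample many random "storm years" and count how often losses exceed a threshold. The storm-frequency and intensity distributions here are invented stand-ins for an insurer's historical data, not real actuarial figures:

```python
import random

def simulate_annual_loss(rng, portfolio_value=100.0):
    """One Monte Carlo trial: draw a random storm count and random intensities,
    return the year's total insured loss (arbitrary units)."""
    n_storms = rng.choice([0, 0, 0, 1, 1, 2])  # crude stand-in for frequency data
    loss = 0.0
    for _ in range(n_storms):
        intensity = rng.random() ** 3          # most storms weak, a few severe
        loss += intensity * portfolio_value * 0.1
    return loss

rng = random.Random(42)
trials = [simulate_annual_loss(rng) for _ in range(100_000)]
expected_loss = sum(trials) / len(trials)
p_severe = sum(l > 5.0 for l in trials) / len(trials)  # odds loss exceeds threshold
```

The point of the method is that `p_severe` converges on the true tail probability as trials accumulate, which is why cheap computing power made the technique practical.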
- "Bush Database Plan Raises Privacy Concerns"
IDG News Service (02/06/03); Gross, Grant
President Bush's proposal for a Terrorist Threat Integration Center designed to mine federal databases for terrorists and terrorist activity is already drawing criticism from privacy advocates and could also run into trouble with Congress. The plan, which Bush announced in his state of the union address last week, calls for the center to be run by the CIA, which will fuse its data with that of the FBI, the Homeland Security Department, and other federal divisions. The plan appears to involve data mining through government databases only, as opposed to the Defense Department's Total Information Awareness (TIA) project, which would also carry out searches for suspicious activity through private databases; however, Electronic Privacy Information Center President Marc Rotenberg says the center could still be used to carry out domestic intelligence gathering, and should therefore be carefully examined by Congress and the public. "Are we seeing here a commitment by the administration to the kinds of data-mining fishing expeditions that we associate right now with Total Information Awareness, but packaging it somewhat differently?" asks Electronic Frontier Foundation staff attorney Lee Tien. "TIA is sort of an easy target, because its announced and declared purpose is so all-encompassing...and then you hit people with something much more limited, and they say, 'Compared to TIA, that's not so bad.'" A spending bill amendment recently passed by the Senate would restrict TIA and other government data-mining projects to overseas operations, while Sen. Ron Wyden (D-Ore.) has vowed to support the Bush center if it provides a database of known or suspected terrorists, or oppose it if it is used to carry out domestic spying. Tien says the chief concern with the plan revolves around how the collected data is used, how suspects are identified, and how those results can be questioned in the event of false positives.
- "Spam Deluge Leads to Search for Silver Bullet"
InfoWorld.com (02/03/03); Pruitt, Scarlet
Spurred by dire warnings that spam will soon overwhelm legitimate email, experts are considering a number of solutions, ranging from legislation to existing filtering tools to a "silver bullet" that can effectively demolish spammers' business model. Some believe the laws designed to protect consumers from market fraud cannot deal with spam, and Junkbusters President Jason Catlett favors legislation that allows spam recipients to sue spammers for $50 to $500. There are also several antispam bills that got as far as committee approval in the House and Senate, but are currently in limbo despite the added support of the Direct Marketing Association (DMA), which sees spam as a detriment to legitimate marketing. Some ISPs offer spam filters designed to reduce unintended consequences, such as legitimate or desired email being blocked or deleted: EarthLink and Yahoo! both use filters that block spam messages and file them in folders where users can check them later. Many filter vendors and ISPs are allowing clients to determine for themselves what they consider to be spam. Meanwhile, last month's MIT Spam Conference was a rallying point for researchers and programmers trying to devise a single tool powerful enough to shrink the spam response rate to so low a level that spamming becomes unprofitable. Conference organizer Paul Graham thinks Bayesian filters are the answer. Mitsubishi Electric Research Laboratories' William Yerazunis claimed at the conference that he has devised a Bayesian filter that can block 99.9 percent of spam, based on his CRM114 programming language. A spam workshop hosted by The Global Internet Project earlier this month attracted Internet experts who recommended that the fight against spam should be multi-tiered, combining the adoption of new antispam technologies, end-user education, and stringent enforcement of proposed fraud regulations.
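The Bayesian approach Graham advocates can be sketched as a naive Bayes classifier over word counts. This toy version uses whitespace tokenization and simple Laplace smoothing for illustration; it is not Graham's exact weighting scheme, nor Yerazunis' CRM114:

```python
import math
from collections import Counter

class BayesianSpamFilter:
    """Minimal naive Bayes text classifier in the spirit of Bayesian spam
    filtering: learn word frequencies from labeled mail, then score new mail
    by the combined log-odds of its words appearing in spam vs. legitimate mail."""

    def __init__(self):
        self.spam_words, self.ham_words = Counter(), Counter()
        self.spam_msgs = self.ham_msgs = 0

    def train(self, text, is_spam):
        words = text.lower().split()
        if is_spam:
            self.spam_words.update(words)
            self.spam_msgs += 1
        else:
            self.ham_words.update(words)
            self.ham_msgs += 1

    def spam_probability(self, text):
        # accumulate log-odds, with add-one smoothing to avoid zero counts
        log_odds = math.log((self.spam_msgs + 1) / (self.ham_msgs + 1))
        spam_total = sum(self.spam_words.values()) + 2
        ham_total = sum(self.ham_words.values()) + 2
        for w in text.lower().split():
            p_w_spam = (self.spam_words[w] + 1) / spam_total
            p_w_ham = (self.ham_words[w] + 1) / ham_total
            log_odds += math.log(p_w_spam / p_w_ham)
        return 1 / (1 + math.exp(-log_odds))  # convert log-odds to probability

f = BayesianSpamFilter()
f.train("free money click now winner", is_spam=True)
f.train("cheap meds free offer click", is_spam=True)
f.train("meeting agenda for project review", is_spam=False)
f.train("lunch plans and project notes", is_spam=False)
```

Because the filter is trained on each user's own mail, it also delivers the personalization mentioned above: two users who label mail differently end up with different word statistics and hence different filters.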
- "Computers Driving Shuttle Are to Be Included in Inquiry"
New York Times (02/07/03) P. A22; Lohr, Steve
The on-board computers were driving the Columbia space shuttle when it descended into the Earth's atmosphere on Saturday, Feb. 1, and ordered the ship to steer slightly to the right to compensate for drag registered on the left side. The shuttle's computers are still of the same basic design as when they were developed by IBM in the 1970s, but the software has never been faulted for safety reasons and is considered one of the best examples of robust code. Although relatively weak in processing power, the on-board computers, named AP-101, are extremely efficient at calculating and adjusting to readings coming in from sensors all over the ship, and work using a specialized programming language called high-order assembly language/shuttle (HAL/S). The programming on the shuttle is rated Level 5 by Carnegie Mellon University's Software Engineering Institute for its reliability and development process, a rating achieved by only a few projects worldwide. Before the crash, NASA was considering replacing the aged systems with newer versions for budgetary reasons, since finding component replacements was increasingly difficult. However, new shuttle software could introduce bugs, because the original software and hardware were designed in tandem. Europe's Ariane 5 rocket blew up in 1996 because of a small software error that came from porting code from the previous Ariane 4 rocket without adequately testing it in the new system. Although no one has yet blamed the AP-101 for the shuttle disaster, at least one engineer has said the computer may not have had all the data needed to do its job. Richard Doherty, a member of the investigative committee for the space shuttle Challenger, says that NASA officials could have sent updates to the AP-101 to account for damage to the left wing during lift-off, which may have caused extra drag during the descent.
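The Ariane 4 code that doomed Ariane 5 converted a 64-bit floating-point horizontal-velocity value to a 16-bit signed integer without a range check; the value never exceeded the range on Ariane 4's trajectory, but did on Ariane 5's. The actual code was Ada, so this Python sketch only illustrates the failure mode, with illustrative function names:

```python
def to_int16(value):
    """Unchecked conversion to a 16-bit signed integer: out-of-range values
    wrap around silently, producing a garbage result instead of an error."""
    return ((int(value) + 2**15) % 2**16) - 2**15

def to_int16_checked(value):
    """The same conversion with the range check the reused code lacked."""
    if not -2**15 <= int(value) < 2**15:
        raise OverflowError(f"{value} does not fit in a 16-bit signed integer")
    return int(value)

safe = to_int16(1000.0)        # in range: converts correctly
garbage = to_int16(40000.0)    # out of range: wraps to a negative number
```

The lesson for the shuttle question above is the same one the article draws: code proven safe in one hardware envelope can fail when its operating assumptions change, which is why retesting ported software against the new system matters.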
- "'Slammer' Attacks May Become Way of Life for Net"
CNet (02/06/03); Lemos, Robert
The SQL Slammer worm that infected corporate servers at an unprecedented rate last month was able to severely affect customer-facing systems such as ATMs and email, something that few other viruses or worms have been able to do. Computer security experts say Slammer shows that the current strategy of trying to make systems ironclad is not viable, since it takes just one weak point for a worm to break in and begin proliferating. Slammer was just 376 bytes in size--compared to the 4 KB Code Red worm and the 60 KB Nimda--meaning it could infect a server with just one data packet, without searching for open systems first. Its small size meant it was able to infect 90 percent of all vulnerable servers in just 10 minutes, overwhelming security scrambles at large tech-savvy companies such as Siebel Systems, Bank of America, and Microsoft. FBI InfraGard national executive board chairwoman Phyllis Schneck says Slammer tied up database traffic, which made its effects more visible to customers even though it infected just 200,000 Microsoft SQL servers, or half the number of machines hit by Code Red. Microsoft CIO Rick Devenuti says security failed when Slammer was able to send itself through ports connecting different buildings on the corporate campus. Like Microsoft, nearly every company has a difficult time keeping up with security patches, but some are making headway with systems that prioritize and even automate patch installation.
- "The Theory and Practice of the Internet"
Newsweek Online (02/04/03); Rogers, Michael
The overwhelming volume of data stemming from the history of the rapidly evolving commercial Internet has people clamoring for a simple theory that will help them cope and function better in the Web environment, writes Michael Rogers. Although such a theory has yet to be formulated, a trio of recently published books provide good starting points for one. Albert-Laszlo Barabasi's "Linked: The New Science of Networks" establishes that mechanisms of the Web's evolution can be partly explained through mathematical theory. The emergence of hubs, which are sites on networks that grow into portals with many more links than practically all of the surrounding sites, is one phenomenon that the book dissects, but Barabasi acknowledges that real people and situations need to be added to the architecture of network science if his theory is to explain the Internet. "Smart Mobs" by Howard Rheingold posits how the Web might impact societal development; early examples are "smart mobs" of people that are becoming cohesive political forces or new social units through wireless communication technology, and Rheingold applies this principle to the root causes of cooperation and altruism, as well as to his own theory of network science. In his book, "Small Pieces Loosely Joined: A Unified Theory of the Web," David Weinberger argues that in certain ways the idea of the Web has greater significance than its underlying mechanism, and projects that the Web's biggest impact will be the transformation of society through online connections and virtual experience. Rogers writes that these books confirm some of his own personal observations about the Web, including the assumption that anyone who claims to have a bead on the Internet's future does not truly understand it.
- "Quantum Computers Go Digital"
Technology Research News (02/05/03); Smalley, Eric
In an attempt to build a solid-state quantum computer, researchers at the University of Wisconsin at Madison have devised a method for reducing the number of errors produced when computing with qubits, or quantum bits. Traditional digital devices calculate each possibility sequentially, but quantum computers can process every possible calculation at once because of unusual physics at the quantum level. Unlike digital bits, which are easily determined to be either ones or zeros, qubits are measured by their spin and rotate between on and off states, easily introducing errors. University of Wisconsin physics professor Robert Joynt says his team's method reduces the number of errors by two orders of magnitude. Once the system is properly housed on a silicon chip, thousands of electrons would be manipulated to slide past one another instead of colliding. Joynt says this technique will allow a more definite, pseudo-digital reading of qubit states, depending on whether electrons are well separated or in close range to each other. However, serious obstacles remain for the University of Wisconsin effort, and the method has only been tested in simulation. "The real issue is fabrication of quite complicated nanostructures," says Joynt. The team is designing qubits made from a layered semiconductor material and a gate structure that controls the qubit's state. A quantum measurement device placed on the chip would also be necessary. Joynt predicts that demonstration quantum computers will be available in 10 years, with full-scale quantum computing in 20 years.
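The probabilistic readout problem the Wisconsin team is attacking can be seen in a minimal simulation of the standard textbook qubit model (two complex amplitudes and Born-rule measurement); this illustrates why qubit readings are statistical, not the team's silicon scheme:

```python
import math
import random

def measure(alpha, beta, rng):
    """Measure a qubit with amplitudes (alpha, beta), |alpha|^2 + |beta|^2 = 1.
    Collapses to 0 with probability |alpha|^2, else to 1 (the Born rule)."""
    p0 = abs(alpha) ** 2
    return 0 if rng.random() < p0 else 1

# an equal superposition: each measurement yields 0 or 1 with probability 1/2,
# so a single reading tells you almost nothing about the underlying state
alpha = beta = 1 / math.sqrt(2)
rng = random.Random(7)
samples = [measure(alpha, beta, rng) for _ in range(10_000)]
frac_ones = sum(samples) / len(samples)
```

A scheme that pushes qubit states toward well-separated, nearly classical configurations, as the article describes, makes each individual reading far less ambiguous than the coin-flip statistics above.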
- "Pervasive Computing: You Are What You Compute"
HBS Working Knowledge (02/03/03); Silverthorne, Sean
Panelists at the recent Cyberposium 2003 focused on pervasive computing, and took the opportunity to note their respective companies' and institutions' advances in that area. Stephen Intille of MIT commented that researchers there are investigating how minuscule sensors distributed throughout a house can biometrically monitor the health of its residents, which could be a very useful--and cheaper--alternative to conventional health care. It is estimated that there are 7.5 billion microcontrollers worldwide; these sensor and controller chips are used for a wide array of operations, such as heating, ventilation, and air conditioning. Ember, represented on the panel by CTO Robert Poor, is developing wireless, self-healing networks to interconnect these myriad chips. Axeda's Richard Barnwell talked about how his company sells real-time performance tracking devices that help anticipate equipment failures, maintain machines remotely, and show manufacturers how their products are used by customers. When asked how their data-gathering devices could affect personal privacy, Intille replied that MIT's home sensors would be used only with user permission, and cited cell phones with GPS tracking capabilities as a more worrisome concern. Barnwell said that the use of monitoring devices would be strictly regulated in certain spaces, such as the health care sector. One attendee asked who will be responsible for replacing depleted batteries in such devices, and MIT professor and panelist Sandy Pentland suggested batteries that draw power from radio signals and other technologies as one solution.
- "New Chapter in Success Story"
Financial Times (02/05/03) P. 6; Merchant, Khozem
India's IT market is soaring as weakness in mature economies such as the United States drives software programming and routine business processing overseas. Indian firms deliver quality code: at Infosys, for example, programmers average fewer than two errors per 1,000 lines of code. Meanwhile, last year's fears about an imminent war between India and Pakistan have dissipated, re-igniting growth in the Indian IT market. Infosys CEO Nandan Nilekani says, "Every twist and turn overseas in recent years...has been good for Indian IT." But although revenues and employment rolls are growing rapidly, Nasscom President Kiran Karnik says companies need to differentiate themselves on aspects other than cost. Major IT firms in India--Wipro, Tata Consultancy Services, and Infosys--are ramping up their services arms in an effort to offer broader solutions, even as global IT service providers such as Accenture and Cap Gemini Ernst & Young build up their own Indian components. Bombay-based software interest Mastek has taken a different route, partnering with Deloitte Consulting in a joint venture. Other concerns worrying Indian firms include increased scrutiny of overseas IT outsourcing in the United States, as well as the rise of eastern Europe as a low-cost competitor. Tata Consultancy Services' S Ramadorai notes that legal concerns about data protection have arisen in the United States after Sept. 11, 2001.
- "Hollywood and Silicon Valley: Together at Last?"
Salon.com (01/15/03); Mieszkowski, Katharine
Representatives from the music and technology industries agreed to reject government legislation on the use of copyright protection measures and devise their own solutions to curb digital copyright infringement, according to a recent accord between the Recording Industry Association of America (RIAA), the Business Software Alliance (BSA) and the Computer Systems Policy Project (CSPP). However, Electronic Frontier Foundation (EFF) senior staff attorney Fred von Lohmann says the agreement may not make a significant difference to the so-called digital pirates the entertainment industry wants to clamp down on. Among the flaws he sees in the agreement is the omission of fair use issues and the Digital Millennium Copyright Act (DMCA), and the lack of consultation and inclusion of other industry and consumer advocacy groups, such as the EFF, the Motion Picture Association of America (MPAA), and the Consumer Electronics Association. Von Lohmann adds that the agreement will not stop or slow down the deployment of digital rights management technology and other copyright safeguards that are accelerating the erosion of the public's fair use rights. Furthermore, he observes that the government is already too deeply involved in the digital rights issue, mainly because content providers lobbied for legislation such as the DMCA and are still clamoring for new mandates to be passed, such as Rep. Howard Berman's (D-Calif.) proposal for government-authorized interference with peer-to-peer networks. Von Lohmann also notes that the high-tech industry groups signing the agreement do not represent the entire tech industry. "There's a lot at stake here, and I don't think that we can leave it to inter-industry negotiations to decide the fate of our digital rights," he concludes.
- "Workin' on the Brain Gang"
Canadian Business (02/03/03) Vol. 76, No. 2, P. 108; Holloway, Andy
The majority of Canada's IT employees work less than 40 hours per week, and only 10 percent work overtime, according to a report commissioned by Human Resources Development Canada and the Software Human Resource Council. Such findings contradict the popular belief that most techies are overworked, a belief that holds true primarily for those in product development, says SHRC President Paul Swinwood. Typically, IT workers have 9 to 5 jobs at banks, stores, and manufacturing firms, and even at technology firms such as Unisys and IBM, the study shows. Canadian IT workers earn an average of $50,000 annually for writing code, repairing computers, and manning help desks. Some 420,000 people in Canada work in the IT field, and 10,000 full-time IT jobs have been created since the beginning of 2000. Swinwood says the industry as a whole is now maturing and considered a mainstream part of the job market. The study also found that 23 percent of Canadian IT workers are women, and 20 percent of managers work more than 40 hours per week, compared to 10 percent of lower-ranking employees.
- "House and Senate Committees Unveil High-Tech Priorities"
Hill (02/05/03) Vol. 10, No. 5, P. S6; Dufour, Jeff
In both the U.S. House and Senate this year, congressional committees will push for a variety of initiatives in technology and telecommunications. Ken Johnson, the majority spokesman for the House Energy and Commerce Committee, says the committee will search for ways to boost "broadband competition" and the migration to digital TV. In addition, the committee plans to re-introduce a bill presented last year by Rep. Fred Upton (R-Mich.) that would allow spectrum to be auctioned off, and also plans to address several digital rights management issues. Meanwhile, Upton's subcommittee on Telecommunications and the Internet plans to work on anti-spam legislation, the E911 initiative (which lets emergency workers locate cell phone users who call 911), and finalizing the .kids Web suffix with ICANN. Similarly, Sen. Conrad Burns (R-Mont.), who heads the Senate Communications Subcommittee, plans to work on spectrum restructuring, anti-spam laws, and the E911 system before summer. In the spring, Burns wants to examine the role of ICANN, wireless privacy laws, and Internet privacy laws. By the end of 2003, Burns plans to pursue the growth of high-speed Internet access in non-urban areas in the United States, a Universal Service Fund to guarantee low-cost telecom service for rural citizens, digital parity between the United States and Asia, and bringing the legislative process to the Internet. Finally, the House Science Committee will oversee funding for cybersecurity research and development initiatives, which will receive $98 million under the law passed last year that created the Office of Homeland Security, says committee spokeswoman Heidi Tringe.
- "Chaos, Inc."
Red Herring (01/03) No. 121; Waldrop, M. Mitchell
Agent-based computer simulations based on complexity science are being used by companies to improve their bottom lines. Complexity science promotes the theory that all complex systems have common characteristics: They are massively parallel, consisting of adaptive, quasi-independent "agents" that interact simultaneously in a decentralized configuration. By following this theory, agent-based simulations can map out system behavior that spontaneously stems from many low-level interactions. The simulations are appealing to company executives because they are easier to understand than the highly abstract and mathematical underpinnings of conventional modeling programs, while Fred Siebel of BiosGroup notes that they allow "what if" scenarios to be played out over a much larger canvas. For instance, Southwest Airlines hired BiosGroup to model its freight delivery operations in order to make them more efficient; agents that represented freight handlers, packages, planes, and other interactive elements were developed and put through their paces, after which the rules of the system were changed and tested to find the most efficient behavioral pathway. By following the strategy outlined by the simulation, Southwest was able to shave as much as 85 percent off the freight transfer rate at the busiest airports, and save $10 million over five years. Other sectors that are taking an interest in agent-based simulation include the U.S. military, which is using it to coordinate the flights of unmanned reconnaissance aircraft, and the insurance industry, which wants to employ better risk management strategies. However, the agent-based simulation industry primarily consists of a handful of struggling startups, and one of the current drawbacks of their services is that the technology may be too state-of-the-art for most businesses, according to Assuratech CEO Terry Dunn.
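A minimal agent-based "what if" experiment in the spirit described above (a toy model with invented arrival rates and shipping rules, not BiosGroup's actual Southwest simulation): each agent follows a simple local rule, and changing that rule shifts a system-level outcome, here the total number of freight transfers:

```python
import random

class Handler:
    """An agent modeling a freight handler who each round either ships the
    packages in its local queue or holds them to consolidate a larger batch."""

    def __init__(self, threshold):
        self.threshold = threshold  # minimum queue size worth a transfer
        self.queue = 0

    def step(self):
        self.queue += random.randint(0, 3)   # packages arriving this round
        if self.queue >= self.threshold:
            shipped, self.queue = self.queue, 0
            return shipped
        return 0

def run(threshold, rounds=1000, n_agents=20, seed=3):
    """Run the simulation and return (total transfers, total packages shipped)."""
    random.seed(seed)
    agents = [Handler(threshold) for _ in range(n_agents)]
    transfers = shipped = 0
    for _ in range(rounds):
        for a in agents:
            s = a.step()
            if s:
                transfers += 1
                shipped += s
    return transfers, shipped

# the "what if": replay the same world under two local policies and compare
eager_transfers, _ = run(threshold=1)    # ship whenever anything is waiting
batched_transfers, _ = run(threshold=6)  # hold freight until a batch accumulates
```

The system-level result (far fewer transfers under batching) emerges from nothing but the agents' local rules, which is the property that lets executives test policy changes before imposing them on a real operation.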
- "Transforming IT"
Optimize (01/03) No. 15, P. 20; Allen, Bruce
IT is essential to business productivity, yet many corporate IT departments have not properly deployed the processes and metrics needed to optimize their IT efforts. To change this, some enterprising CIOs are following a three-year, seven-step transformation model with an iterative strategy. The first step is to focus on products and pricing in order to build a list of key services representing IT's value proposition; the second step of process refinement involves recognizing the processes to be included in a catalog, thus mapping out all operational factors; the third step is the creation of centers of excellence (COEs), which initially revolves around identifying processes that have the strongest relationship and need the tightest integration. The fourth phase, metrics requirements, will be driven by COEs, based on their needs and those of process, product, and service fulfillment. Three types of metrics--business results, human capital, and unit cost--and four types of performance results--financial, maturity, performance, and project--must be measured. Rapid assimilation, the fifth step, will allow the IT department to deal with unexpected projects and workloads and minimize operational disruption by deploying a formal structure, while the sixth step, organization, relies on the identification of COEs, products and services, and processes. The seventh and final step is the development of a game plan, which should yield a clear idea of the requirements for IT transformation as well as a solid foundation for future initiatives. Continuous improvement should be implemented across all levels of the transformation model, while CIOs can align their transformational strategies with their human capital through the establishment of human-capital management centers of excellence.