Volume 5, Issue 519: Monday, July 14, 2003
- "Funding for TIA All But Dead"
Wired News (07/14/03); Singel, Ryan
A 2004 defense spending bill that the U.S. Senate might pass as early as next week could effectively kill the Terrorism (formerly Total) Information Awareness program through a defunding provision. The bill states, "No funds appropriated or otherwise made available to the Department of Defense...or to any other department, agency or element of the Federal Government, may be obligated or expended on research and development on the Terrorism Information Awareness program." TIA, which has a projected 2004 budget of $169 million, aims to uncover evidence of terrorist activity by mining a vast database of Americans' personal records, but both left-wing and right-wing critics oppose the program on the grounds that it would be used to spy on innocent American citizens. Following the Senate vote, a joint committee will have the final say on the TIA defunding provision, and seek to align the Senate bill with a House version that does not explicitly block TIA funding. Both bills indicate that the use of TIA on American citizens is dependent on congressional approval, a provision requested by Sen. Ron Wyden (D-Ore.), who amended an earlier bill to deny TIA funding unless the Defense Advanced Research Projects Agency (DARPA) furnished Congress with a detailed report on the program. DARPA submitted such a report in May, but privacy proponents such as the Electronic Frontier Foundation declared that the study merely paid lip service to TIA's privacy implications.
For more information on, and ACM's reaction to, TIA visit
- "Led by Intel, True Believers in Wi-Fi Say It Will Endure"
New York Times (07/14/03) P. C1; Markoff, John
Despite the popular notion that Wi-Fi is ruined as a business opportunity, some media and technology leaders gathered by investment banker Herbert Allen seem to think otherwise. Among the business luminaries were Intel's Craig R. Barrett and Andrew S. Grove, who have made Wi-Fi a central platform for their company's mobile computing business after earlier promoting a standard called Home RF. Barrett acknowledges Wi-Fi has succeeded in part because of its grass-roots advocacy; Intel is building Wi-Fi circuitry into all of its mobile computing chip sets, including the Centrino brand the company plans to spend nearly all its advertising budget on this year. In addition, Intel is going ahead with a pilot of WiMax, a wireless protocol that transmits data up to 30 miles at maximum speeds of 70 Mbps. Intel plans to use WiMax to close the "last-mile" gap between end users and high-speed applications such as digital high-definition television, and intends to use existing Wi-Fi antennas to pick up on the WiMax signal. Wayport CEO Dave Vucina says that even standard Wi-Fi access business plans show promise, given the right distribution point: He says subscription Wi-Fi and free Wi-Fi are complementary and not competing, pointing out that coffee drinkers still go to Starbucks even though they have free coffee at the office. Still, Wireless Internet and Mobile Computing publisher Alan Reiter says there is serious debate about whether to offer Wi-Fi for free or not from commercial establishments, given early evidence that free Wi-Fi boosts mainline business at hotels and restaurants. And then there are the grass-roots advocates, who continue their construction of entirely free Wi-Fi mesh networks in urban areas such as San Francisco; the largest free Wi-Fi groups in that area recently joined up to link 12 nodes, and plan to add another 20 nodes this month.
(Access to this site is free; however, first-time visitors must register.)
- "Job Exports May Imperil U.S. Programmers"
Associated Press (07/13/03); Konrad, Rachel
American programmers are worried their jobs are being endangered by the offshore outsourcing of highly skilled tech positions to India, China, Russia, and other developing countries where the workforce is cheaper. Furthermore, outsourcing opponents are concerned that U.S. tech leadership could be shattered by this trend, which reportedly provides nations such as India and China with the support needed to bolster their own tech infrastructures. A November 2002 Forrester Research study estimates that the number of tech jobs farmed out overseas will balloon from about 27,000 in 2000 to 472,000 by 2015 if the current rate of outsourcing continues. Cap Gemini Ernst & Young reckons that the average wage for an Indian computer programmer, including benefits, is $20 per hour, while an American programmer with commensurate expertise earns an average of $65 per hour. Executives argue that outsourcing's advantages extend beyond cheaper salaries: Convergys' Jean-Marc Hauducoeur notes that Indian workers display greater staying power, thus saving a great deal of money in employee recruitment and training. The approximately 350,000 college engineering graduates India produces annually are also attractive, note many U.S. executives. Boeing, Dell, Motorola, and Intel have all opened Russian branches, while Microsoft is exporting software development jobs to its India Development Center in Hyderabad. Executives claim that programming skills alone do not guarantee American workers job security, and say the really secure positions are found in nanotechnology, biotechnology, and other emerging fields.
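The hourly rates cited above can be turned into a rough annual comparison; the 2,000 billable hours per year below is an illustrative assumption, not a figure from the article:

```python
# Back-of-the-envelope annual cost comparison for the cited hourly rates.
# 2,000 hours/year is an assumed round number for a full-time position.
HOURS_PER_YEAR = 2000

def annual_cost(hourly_rate, hours=HOURS_PER_YEAR):
    """Annual fully loaded cost at a given hourly rate."""
    return hourly_rate * hours

us_cost = annual_cost(65)     # U.S. programmer, including benefits
india_cost = annual_cost(20)  # Indian programmer, including benefits

print(us_cost, india_cost, us_cost - india_cost)  # 130000 40000 90000
```

On these assumptions the gap is $90,000 per programmer per year, which is the scale of savings driving the outsourcing decisions described above.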
- "Gladiator-Style 'Wars' Seek Out Weak Programs"
New Scientist (07/09/03); Knight, Will
The ClusterWorld conference held in San Jose in June pitted 236 computer programs developed by universities, software firms, and government research departments around the globe against each other in a gladiator-style tournament where the winning algorithm would gain control over a parallel computer comprising 2,500 processors. The competition, nicknamed Grid Wars II, "gave me the opportunity to test my algorithm," explains Mark Wenig of NASA's Goddard Space Flight Center. "It is a perfect environment to test and compare different approaches." Wenig's program, dubbed Rogue, was derived from genetic algorithms and modeled after the process of natural selection to build the optimum fighting code through evolution. Control of processors can be wrested away from occupying programs if a rival program can strike the other program three times using virtual "bullets." Programs can also communicate between occupied processors to launch a coordinated assault on other programs or put up defenses against attack. The contest eventually came down to Rogue versus Cobra, a manually-written program devised by a Moscow State University computing student. Cobra scored an eleventh-hour victory, which Matt Oberforger of Engineered Intelligence said may have been due to the program's ability to effortlessly communicate and proliferate rapidly.
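The article does not describe Rogue's internals, but the kind of genetic algorithm it was derived from can be sketched in a few lines. Everything below (the bit-string genome, population size, and stand-in fitness function) is an illustrative assumption, not Rogue's actual design:

```python
import random

def evolve(fitness, genome_len=16, pop_size=30, generations=60,
           mutation_rate=0.05, seed=1):
    """Tiny genetic algorithm: tournament selection, one-point
    crossover, and per-bit mutation. Returns the fittest genome found."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]

    def tournament():
        # Pick two candidates at random; the fitter one reproduces.
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, genome_len)     # one-point crossover
            child = p1[:cut] + p2[cut:]
            # Flip each bit with small probability (mutation).
            child = [bit ^ (rng.random() < mutation_rate) for bit in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Toy fitness stand-in: count of 1 bits ("aggressiveness" genes).
best = evolve(sum)
print(sum(best))  # typically close to the maximum of 16
```

In a real Grid Wars entrant, the fitness function would instead score how many processors a candidate strategy captured in simulated battles.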
- "The Future of Optical Computing, Now"
TechNewsWorld (07/11/03); Koprowski, Gene J.
Optical computing applications are arriving in the form of biometric identifiers and network switching equipment. Actual computers are not far behind, say those in the optical computing field. Driving demand for optical computing, which relies on quantum mechanics theory developed by Albert Einstein, are the physical limits faced by traditional electronic computing and Internet infrastructure. Demonstrations at the University of Rochester have shown optical computing allows near-instantaneous computation, and NASA researchers have built logic gate circuits entirely from optical components. Real-world applications of optical computing are seen in networking equipment from smaller firms such as Chiaro Networks, Procket, and Caspian Networks, all of which are challenging entrenched vendors that use mostly conventional packet switching technology. Chiaro Networks founder Eyal Shekel says his company's optical phased array and router design allow capacities up to 1,000 times those provided by conventional technology. Optical computing technology is also paving the way for holographic and biometric identification applications similar to those depicted in the movie "Minority Report," according to A4Vision CEO Grant Evans. His company is working on a facial-recognition system using optical technology that the U.S. government could use to screen travelers entering the country. "There are many applications that will become available, a year or two from now, based on optical switching," proclaims Shekel. "The future of optical computing is now arriving."
- "India's Tech Industry Defends H-1B, Outsource Roles"
EE Times (07/10/03); Krishnadas, K.C.
India's National Association of Software and Service Companies (Nasscom) issued a report to quell concerned American and Indian parties' worries about outsourcing and the H-1B visa program, which critics claim are taking jobs away from American workers. The report estimates that the number of H-1B visas approved for Indian engineers declined by more than 50 percent between 2001 and 2002, and is anticipated to fall even further this year. In an effort to forestall a ban on outsourcing, Nasscom contends that the practice has enabled some U.S. companies to avoid mass layoffs: The association claims that transferring jobs to India allowed U.S. banks, insurance companies, and financial-services firms to save $6 billion over the past four years, and channel part of those savings into the creation of 125,000 additional jobs. Nasscom estimates that Indian IT firms employed almost 60,000 people in the United States in 2001, while roughly 170 such companies possess U.S. branches. To reassure people that outsourcing to India will not end, Nasscom released a McKinsey & Co. report projecting that Indian software and service exports to the United States will total $8.5 billion in 2003-04, while the United States will save between $10 billion and $11 billion through outsourcing to India in the same period. Meanwhile, the United States is expected to export $3 billion in high-tech products and services to India. India's software export services industry could be hit hard by H-1B visa restrictions, given that Germany recently decided to set limits on the number of annual green card allotments to foreign professionals. Meanwhile, outsourcing restrictions are particularly worrisome for India's IT-enabled services sector.
- "Thousands to Cast Ballots by Internet in 2004 Elections"
Associated Press (07/12/03); Hananel, Sam
U.S. military personnel and American citizens living overseas will have the opportunity to cast absentee ballots for the 2004 elections over the Internet, courtesy of the Pentagon's $22 million Secure Electronic Registration and Voting Experiment (SERVE), which Federal Voting Assistance Program director Polli Brunelli envisions as an alternative for people who cannot vote by mail. Voters participating in SERVE can register to vote and cast ballots from any Internet-accessible, Microsoft Windows-enabled computer terminal; local election officials will employ the SERVE system to process registration applications, transmit absentee ballots, and instantaneously receive voted ballots. SERVE participants will include eligible voters with homes in Hawaii, South Carolina, and certain Arkansas, Florida, Minnesota, Ohio, North Carolina, Utah, Washington, and Pennsylvania counties. One of online voting's perceived advantages is the elimination of arduous absentee ballot counting, but critics such as Harvard University's Rebecca Mercuri say there is no reason to think that Internet voting is any more immune to hacking, computer viruses, and other forms of electronic tampering than other supposedly "secure" systems. Brunelli claims that her office is implementing exceptional security measures to guarantee the SERVE system's protection, including redundant firewalls, intrusion detection tools, and attempted break-ins by "white hat" hackers. Advanced encryption technology will be utilized to scramble messages containing the ballots while digital signatures will be used to confirm voter identity.
Click Here to View Full Article
For information regarding ACM's work in the area of e-voting, visit
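SERVE's actual protocol is not described in the article, but the ballot-integrity idea behind it can be sketched. This is a deliberately simplified stand-in: real systems like SERVE would use public-key digital signatures tied to a voter's identity, whereas the sketch below uses a shared-key HMAC (Python standard library) purely to show how tampering in transit becomes detectable:

```python
import hashlib
import hmac

def sign_ballot(ballot: bytes, key: bytes) -> str:
    """Attach an integrity tag so any modification in transit is detectable."""
    return hmac.new(key, ballot, hashlib.sha256).hexdigest()

def verify_ballot(ballot: bytes, key: bytes, tag: str) -> bool:
    """Recompute the tag; constant-time comparison resists timing attacks."""
    return hmac.compare_digest(sign_ballot(ballot, key), tag)

# Hypothetical per-voter secret issued at registration time.
key = b"per-voter secret issued at registration"
ballot = b"president=candidate_a"

tag = sign_ballot(ballot, key)
print(verify_ballot(ballot, key, tag))                    # True
print(verify_ballot(b"president=candidate_b", key, tag))  # False
```

A tampered ballot fails verification because its tag no longer matches; a real deployment would additionally encrypt the ballot so its contents stay secret in transit.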
- "U.S. Firms Scramble to Get the Lead Out"
Investor's Business Daily (07/11/03) P. A4; Alva, Marilyn
In anticipation of a pan-European prohibition on high-tech exports that contain six toxic substances--lead, mercury, cadmium, hexavalent chromium, polybrominated biphenyls, and polybrominated diphenyl ethers--several major American electronics manufacturers are racing to eliminate these materials from their products through the institution of new standards and the refinement of their design, equipment, and manufacturing processes. Among the components known for containing lead are printed circuit boards, which are incorporated into nearly every consumer electronic product made in America. Though American companies have already embarked on efforts to jettison most of the other banned materials, it is the ban on lead that promises to have the most wide-ranging effects on the U.S. electronics sector. Motorola, Intel, IBM, Texas Instruments, Agilent Technologies, Hewlett-Packard, and Celestica are working to establish industry standards for environmentally safe manufacturing through the National Electronics Manufacturing Initiative (NEMI). NEMI believes lead should be replaced in electronic products by a composite material made of tin, copper, and silver, with the trade-off being higher cost and potential reliability downgrades. Among the companies that have made notable progress in eliminating lead from their products is Solectron, which was prompted to do so in order to satisfy the needs of clients such as Sony and Olympus Optical. However, Sanmina-SCI's Eamon O'Keefe notes that some customers do not want environmentally friendly products unless they have no other choice. The European Union-approved toxic substances ban goes into effect on July 1, 2006, and it will be preceded by the August 2005 enactment of Europe's Waste Electrical and Electronic Equipment Directive, which calls for electronics manufacturers to recycle obsolete products.
- "Privacy Rights Under Threat by Lawmakers"
SiliconValley.com (07/13/03); Gillmor, Dan
Dan Gillmor writes that California legislators are displaying contempt for the public good by failing to institute privacy legislation because such actions benefit their financial supporters rather than their constituencies. Gillmor cites the political death of SB 1, a bill that would have given people more control over how financial giants handle their personal information, as an example of this perfidy: Most legislators on a key committee did not vote on the measure, which in their view demonstrates that they were not taking an anti-privacy stance. Gillmor hopes that voters will make such "craven" politicos pay for such behavior by ousting them from office and passing the California Financial Privacy Initiative next year. Following the enactment of the financial privacy law, Gillmor believes elected officials and the public should then focus on radio frequency identification (RFID) tags, an emergent technology that could make supply-chain management cheaper and more efficient, but could also be exploited as a surveillance tool. He argues that the RFID industry is not being honest with the public about how the technology could affect privacy: Embedding the tags in products could make them easier to track in a supply chain, but could also be used to keep tabs on the consumers who purchase such products long after they have left the store. Gillmor contends that the RFID threat should be dealt with on a federal level, but concedes that state legislation may be the only answer, given the current "pro-business tilt" on Capitol Hill. He suggests that people who shop at Wal-Mart, a major RFID advocate, should warn the company that deploying RFID tags would make them alter their shopping behavior. Gillmor recommends that voters concerned about privacy contact their U.S. representatives and senators and urge them to either refuse to weaken already-enervated federal legislation further or institute solid privacy provisions.
Click Here to View Full Article
- "Quantum Deep"
ZDNet Australia (07/10/03); Kidman, Angus
Quantum computing, thought by many to be the next logical evolutionary step in information processing, is supposed to harness the unique nature of quantum physics, which allows for the superposition of quantum states. "Quantum computing begins where Moore's Law ends--about the year 2020, when circuit features are predicted to be the size of atoms and molecules," proclaims MIT academic and former IBM Almaden Research Center director Isaac L. Chuang. However, commercial applications of quantum computing remain elusive: The technique is mainly fueling interest among researchers struggling to solve complex mathematical problems that conventional systems are ill-equipped to handle, such as the factorization of ever-larger numbers into primes. IBM has made strides, first with a five-qubit machine that could carry out order-finding for a given function, and later with a seven-qubit system that successfully factored the number 15; however, Chuang contends that a practically useful quantum computer will have to manipulate several dozen, if not thousands, of qubits. One of the immediate barriers to a successful quantum computer is the difficulty in controlling atomic behavior so that the system remains effective, and most quantum computer models feature error correction systems that can reduce the theoretically available processing power. In addition, quantum computers may not be necessary if processing power continues to increase and more economical technologies such as grid computing flourish and mature. One of quantum computing's most intriguing potential applications is quantum encryption, which directly relates to number factorization. Quantum encryption rests on the theory that quantum computing could be used to provide crack-proof data exchange as well as break existing cryptographic systems.
Click Here to View Full Article
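The factoring of 15 mentioned above can be mimicked classically to show the number-theoretic core of Shor's algorithm: find the period r of a^x mod n, then derive factors from a^(r/2). Only the period-finding loop is what a quantum computer accelerates; for tiny n a brute-force search suffices:

```python
from math import gcd

def factor_via_period(n, a):
    """Classically mimic the core of Shor's algorithm: find the period r
    of a^x mod n, then extract factors of n from gcd(a^(r/2) +/- 1, n)."""
    # Period finding -- the step a quantum computer performs
    # exponentially faster; brute force works for small n.
    r = 1
    while pow(a, r, n) != 1:
        r += 1
    if r % 2:
        return None  # odd period: the method requires trying another a
    half = pow(a, r // 2, n)
    p = gcd(half - 1, n)
    if 1 < p < n:
        return p, n // p
    return None

print(factor_via_period(15, 7))  # (3, 5)
```

With a = 7 and n = 15 the period is r = 4, so 7^2 mod 15 = 4, and gcd(3, 15) = 3 recovers the factorization 15 = 3 x 5, matching the result of IBM's seven-qubit experiment.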
- "Feds Far From Securing Cyberspace"
PCWorld.com (07/09/03); Roberts, Paul
Speaking at a recent forum in Boston for chief security officers, former White House cybersecurity advisor Richard Clarke said the U.S. government is not yet a leader in cyberspace security. In fact, many U.S. agencies have received unsatisfactory IT security ratings in assessments by the General Accounting Office and other groups. Clarke said that both public and private organizations must strive to reduce product flaws. This includes educating software programmers to write more reliable code and eliminate such security flaws as buffer overflow vulnerabilities. Clarke also believes that software development and distribution need to be overhauled. He advocates greater cooperation among the following corporate departments: IT security, physical security, human resources, and the legal division. Companies that do this will be in a better position to predict and take action against security threats, he said. Clarke added that although the president's National Strategy to Secure Cyberspace is a positive step, the agency in charge of cyberspace and threat assessment--the Department of Homeland Security--still needs to harmonize a total of 22 disparate departments, which could take five to seven years.
- "And Now, the Weather Report for Your Neighborhood"
New York Times (07/10/03) P. E8; Eisenberg, Anne
Several companies, IBM among them, are developing weather simulation programs that tap the number-crunching power of supercomputers to provide precise, high-resolution local forecasts. IBM's effort, Deep Thunder, collects forecast data culled by the National Centers for Environmental Prediction's network of satellites and other equipment and narrows its focus to supply detailed 3D graphical visualizations of expected weather patterns in local areas. One forecast from the federal agency splits North America into a grid of squares that measure 12 kilometers on each side, with future weather conditions simulated at the corner of each square. Deep Thunder scales down the boxes to one square kilometer, narrowing estimates accordingly, and then lets the simulation model generate the 3D visualization. Deep Thunder produces forecasts with 1-kilometer resolution for the New York City metropolitan area every three seconds, and 4-kilometer forecasts for a region extending from Hartford, Conn., to Philadelphia every 12 seconds. The IBM researchers generate two high-resolution forecasts per day, as a demonstration to prospective customers and as a way to check the precision of the simulation models as they boost Deep Thunder's processing power. IBM researcher and Deep Thunder development leader Lloyd Treinish notes that the system predicted with astonishing accuracy the severity of a Presidents' Day snowstorm for Westchester County and northern New Jersey nine hours before the National Weather Service released an official forecast. However, some meteorology experts see problems in high-resolution weather simulation: Clifford F. Mass of the University of Washington cautions that slight errors in initial conditions can lead to major forecasting mistakes.
Click Here to View Full Article
(Access to this site is free; however, first-time visitors must register.)
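The cost of refining the federal model's 12-kilometer squares down to Deep Thunder's 1-kilometer squares can be sketched with simple arithmetic; the 120 km x 120 km metro region below is a hypothetical illustration, not a figure from the article:

```python
def gridpoints(region_km2, cell_km):
    """Approximate number of simulation points needed to cover a region
    at a given horizontal grid spacing (one point per grid cell)."""
    return region_km2 / (cell_km ** 2)

region = 120 * 120  # hypothetical 120 km x 120 km metro area, in km^2

coarse = gridpoints(region, 12)  # national model: 12-km squares
fine = gridpoints(region, 1)     # Deep Thunder: 1-km squares

print(coarse, fine, fine / coarse)  # 100.0 14400.0 144.0
```

Each 12-fold refinement of horizontal spacing multiplies the point count by 144 (and the time dimension must usually be refined too), which is why this kind of local forecasting demands supercomputer-class hardware.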
- "Sizzling Storage? Just Wait"
CNet (07/02/03); Oltsik, Jon
Technology trends point toward a revolution in the storage market by 2006, when today's leading vendors will either have surrendered their proprietary bastions or become obsolete, writes Jon Oltsik. Cheaper hardware components are making advanced storage features available at low price points, while Ethernet and Internet Protocol (IP) continue their advance on the esoteric Fibre Channel standard. Ethernet and IP mean storage networking technology opens up to far more technology workers familiar with those popular protocols, and also allows new application capabilities such as backup, remote mirroring, and content distribution. In addition, network intelligence means data does not have to be kept at end points, but can be stored on the network itself--a far more scalable model. J2EE and Web services programming tools can be used for storage applications once network intelligence is introduced, replacing vendor APIs. Currently, storage industry leaders such as EMC are slowly moving from their proprietary technologies, and do so at their own peril. Just as IBM was unable to slow the advance of client/server computing, today's storage vendor leaders need to adopt cheaper, network-based technology or lose out to smaller, more innovative firms. As this shift continues, corporate buyers would do well to postpone large storage initiatives until the industry landscape adapts.
- "Internet Can be Upgraded 'Without Boosting Backbone'"
ZDNet UK (07/07/03); Sharwood, Simon
Tom Leighton, the chief scientist of Akamai, recently spoke at the Fifth International Congress on Industrial and Applied Mathematics about his belief that operation of the Internet is hindered by inadequate use of the Border Gateway Protocol and the Domain Name System, as well as peering relationships that interfere with efficient data routes and the technique of serving content from a central location. Arguing that additional backbone capacity is unnecessary, Leighton says, "You've got 70 million users on an unreliable, overused network." He says instead of scaling the Internet to support more centrally-hosted applications, applications must move to the edge of the network using a distributed framework that brings content closer to users over existing routes. Akamai has enlisted a number of mathematicians to help devise more streamlined methods of content distribution using combinatorial optimization, which has allowed Akamai to release new traffic-direction algorithms from time to time. Leighton says, "The best we can hope for is an overlay to hide [the Internet's] worst deficiencies."
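Akamai's actual traffic-direction algorithms are proprietary; the greedy assignment below is only a toy stand-in for the combinatorial-optimization idea of steering each user to a nearby edge server that still has spare capacity (all server names and latencies are invented):

```python
def assign_users(users, servers, capacity):
    """Greedy edge-server selection: send each user to the lowest-latency
    server that has not yet reached its capacity."""
    load = {s: 0 for s in servers}
    assignment = {}
    for user, latencies in users.items():
        # Try candidate servers in increasing order of measured latency.
        for server in sorted(latencies, key=latencies.get):
            if load[server] < capacity:
                assignment[user] = server
                load[server] += 1
                break
    return assignment

users = {
    "u1": {"sfo": 10, "nyc": 80},  # latencies in milliseconds
    "u2": {"sfo": 12, "nyc": 70},
    "u3": {"sfo": 15, "nyc": 60},
}
print(assign_users(users, ["sfo", "nyc"], capacity=2))
# u1 and u2 fill sfo; u3 overflows to nyc
```

Production systems solve a much harder version of this problem (shifting demand, link failures, thousands of servers), which is where the mathematicians mentioned above come in.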
- "New Software Allows You to Log on By Laughing"
New Scientist (07/09/03); Nowak, Rachel
Computer scientists at Monash University in Melbourne, Australia, have developed a program that will automatically log someone onto the nearest computer at the sound of their voice or laughter. In an effort to make it easier for their staff to log onto networked computers, the researchers have created SoundHunters, software that makes use of sound recognition and intelligent agent technology. With the aid of microphones on computers, SoundHunters is designed to recognize a voice, and then use strategically placed intelligent agents to determine the location of the individual in relation to the nearest computer. Intelligent agents can listen to the footsteps of a person moving throughout an office. "Once the agents have worked out the direction the person is going, they would even be able to stay one or two steps ahead," says researcher Arkady Zaslavsky. Although the researchers have more work to do to improve SoundHunters' ability to distinguish voices, marrying sound recognition and intelligent agent technology has become a reality.
Click Here to View Full Article
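The article does not give SoundHunters' localization math, but a common technique for this task is time-difference-of-arrival between microphones, sketched here in a deliberately simplified one-dimensional form (all numbers are invented for illustration):

```python
SPEED_OF_SOUND = 343.0  # meters/second in air at room temperature

def locate_1d(mic_gap_m, delta_t_s):
    """Estimate a sound source's offset from the midpoint between two
    microphones, given the difference in arrival times. Simplified 1-D
    model: the source is assumed to lie on the line through both mics."""
    path_difference = SPEED_OF_SOUND * delta_t_s
    if abs(path_difference) > mic_gap_m:
        raise ValueError("arrival-time gap too large for this mic spacing")
    # Positive offset means the source is closer to the first microphone.
    return path_difference / 2

# Sound reaches mic A 2 ms before mic B, with the mics 3 m apart:
print(round(locate_1d(3.0, 0.002), 3))  # 0.343 m from the midpoint
```

Real 2-D or 3-D localization intersects such measurements from several microphone pairs, which fits the article's picture of agents on multiple networked machines pooling what their microphones hear.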
- "White House: 5 Priorities to Guide R&D Funding in FY 2005"
White House Weekly (07/08/03) P. 5; Nance, Scott
A June 5 White House memo sent to federal agency directors indicates that President Bush's proposed research and development budget for fiscal 2005 will have five priorities: Counterterrorism, nanotechnology, networking and IT, environment and energy, and understanding life processes on the molecular level. The anti-terror R&D priority advises agencies to focus on research into improving the detection and response to chemical, biological, and nuclear attacks; the development and implementation of technology designed to help first responders; and new technologies that can turn huge volumes of intelligence data into "actionable knowledge." The nanotech priority reaffirms the National Nanotechnology Initiative's importance to the current administration, as it relates to homeland security, biomedicine, and material science. The networking and IT R&D goals stress the importance of cybersecurity, network innovation, and high-end computing, while the memo says environmental/energy R&D is key to economic growth as well as national security. The memo, authored by White House science advisor John H. Marburger and former White House Budget Director Mitchell Daniels, also detailed criteria that would be used to assess the effectiveness of federally funded R&D programs. The criteria, which are a component of the Program Assessment Rating Tool (PART), focus on three areas: Relevance, quality, and performance. In terms of relevance, federal R&D projects are required to be clearly outlined, in keeping with national priorities and customer needs, and pertinent to the dedication of taxpayer resources; quality assurance requires the use of a clear and justifiable technique for allotting the lion's share of R&D programs' funding; and guaranteeing performance depends on programs setting and adhering to multi-year goals marked by yearly performance metrics and milestones.
- "Researchers Spawn Innovative Technologies"
Network World (06/30/03) Vol. 20, No. 26, P. 27; Dubie, Denise
Since the inauguration of MIT's Project Oxygen roughly four years ago, researchers have churned out approximately 20 advanced communications technologies designed to "[promote] human interaction, more natural interaction between people and machines," according to MIT computer science professor Randall Davis. Examples of Project Oxygen innovations, which were available for public perusal when MIT recently opened up its Computer Science and Artificial Intelligence Laboratory, include the Cricket Indoor Location System, a network of beacons and sensors that can be distributed throughout a room to track the location of tagged objects within a few centimeters using radio frequency and ultrasonic technology. Scalable Location-Aware Monitoring (SLAM), another Project Oxygen technology, allows tag systems or sensor data to be integrated with distributed software applications. Researchers note that the SLAM architecture could be used to assist the fire department in getting people in a burning building to safety, as well as enhance inventory tracking for retailers in conjunction with radio frequency identification technology. The project's Mobile Teleconference application is designed to facilitate adaptive teleconferences by assessing the available lines of communication and choosing the optimum communications link between user handhelds. The Resilient Overlay Networks application works along similar lines by selecting the fastest route between IP addresses, cutting packet loss and latency. Sponsors of the five-year Project Oxygen initiative, including Nokia and Hewlett-Packard, have collectively donated $50 million and could employ technologies yielded from the project within their own products.
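Cricket's centimeter-scale ranging follows from the speed gap between radio and ultrasound: a beacon emits both at once, the radio signal arrives effectively instantly, and the listener's measured delay before the ultrasonic pulse encodes the distance. A minimal sketch of that calculation (the 10-millisecond delay is an invented example):

```python
SPEED_OF_SOUND_CM_S = 34300.0  # speed of ultrasound in air, cm/second

def range_cm(delay_s):
    """Cricket-style ranging: distance from the beacon, given the
    listener's measured RF-to-ultrasound arrival delay in seconds
    (radio propagation time is treated as zero)."""
    return SPEED_OF_SOUND_CM_S * delay_s

print(round(range_cm(0.01), 1))  # 343.0 cm for a 10-ms delay
```

Because sound covers only about 3.4 cm per 100 microseconds, even modest timing precision at the listener yields the few-centimeter accuracy the article cites.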
- "IT Does So Matter!"
Computerworld (07/07/03) Vol. 31, No. 33, P. 36; Melymuka, Kathleen
A number of leading IT experts repudiate Nicholas G. Carr's assertion that IT's strategic value has been lost to commoditization, although Andrew McAfee of Harvard Business School thinks Carr's stance has had one positive effect--it brings the IT debate into sharp relief. Cutter Consortium analyst Tom DeMarco says Carr makes a poor case for his argument by presenting three graphs for railroads, electric power, and IT; though all three graphs follow similar curves to indicate commoditization, DeMarco says that Carr makes the mistake of using computers to symbolize IT in general. Harvard Business School's Rob Austin elaborates on the flaw in Carr's reasoning, saying Carr believes that IT's chief problem is its ubiquity, when in fact a paucity of ability to innovate with IT is the real problem. NASA acting CIO Paul A. Strassmann argues that how an organization uses IT, rather than just buying IT, is the critical issue. DeMarco declares, "you can't count on a single-shot competitive advantage, but you can gain a continuing advantage by being a continuing innovator in IT." Carr postulates that CIOs will revert from strategists to bean counters, but McAfee says the role a CIO plays depends on circumstances, and notes that major organizations will require CIOs to play many parts, including strategists, cost minimizers, business needs assessors, and organizational change specialists. Austin adds that a CIO must be a visionary, and be able to communicate his vision clearly to less technologically inclined people. DeMarco concludes that this "element of vision...can't be commoditized."
Click Here to View Full Article
- "Building Blocks for Converged Applications"
Business Communications Review (06/03) Vol. 33, No. 6, P. 38; Hettick, Larry
Applications convergence is expected to revolutionize the enterprise, and key to this development is a "logical enterprise" model based on an array of building blocks. These fundamental elements include IP telephony or Voice over IP (VOIP); unified messaging; Computer Telephony Integration (CTI); Extensible Markup Language (XML); VoiceXML (VXML); Session Initiation Protocol (SIP) and SIP-supported "presence"; and the Speech Applications Language Tags (SALT) protocol. VOIP will emerge as a complex application, while SIP and presence will allow users to be contacted according to preference. XML and VXML, in concert with text-to-speech, speech-to-text, and other technologies, will interface with legacy unified messaging and call-center programs. This will clear the way for the integration of enterprise communications applications and end-user and business applications. The logical enterprise model is split into a data layer, an applications layer, and a customized user portal--however, some of the model's building blocks will be commercially unavailable for a few years. The maturation of XML, VXML, and SALT, combined with the increased cost-effectiveness of multiple applications accessing a common data set, will allow data storage to become more and more independent of applications. Parts of the applications convergence model are currently being deployed by vendors such as Cisco, Siemens, Avaya, Nortel, and Alcatel. "This technology shift allows us to rethink our approach to communications in terms of interactions between people and resources," notes Chris Vuillaume of Alcatel.