Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published every week on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 6, Issue 653:  Monday, June 7, 2004

  • "Finding Out What's Bugging Computers"
    The Star-Ledger (06/06/04); Coughlin, Kevin

    One solution to the rampant problem of buggy software is to test the code thoroughly before its public release, which is a much more complex endeavor than it sounds. "There's not enough time in the life of the universe to check one million lines of code," explains Bell Labs director of systems software research Howard Trickey, whose Orion project aims to streamline the testing process by flagging only genuine errors. A 2002 National Institute of Standards and Technology study attributes software's general shoddiness to an accelerated time-to-market and a lack of liability among vendors. "We as buyers don't put the market pressure on enough to push for bug-free software," comments Stevens Institute of Technology computer scientist Rebecca Wright, while Eugene Spafford of Purdue University reports that the problem is complicated by vendors adding more and more sophisticated features to software. He further points out that the likelihood of bugs cropping up is rising now that software is being increasingly geared for computer neophytes, while colleges are caving to employer demands for people skilled in the latest computer languages rather than older ones with greater reliability. Orion is designed to be an improvement over other programs by lowering the occurrence of false positives, and Trickey and his research team assert that they successfully reduced the number of false positives in checking a 369,000-line program by 50 percent. The Orion project has received a four-year grant of $640,000 from NASA and the National Science Foundation. Computer security analyst and author Michael Erbschloe argues that Orion and similar approaches will not eliminate the need to test software in real-life scenarios, and explains that this task must be carried out by rotating pools of test users of varying competence.
    Click Here to View Full Article
    (Access to this site is free; however, you will need to enter your zip code, year of birth and gender to gain access to the site.)

    Eugene Spafford is co-chair of ACM's U.S. Public Policy Committee (http://www.acm.org/usacm).

  • "Human Responses to Technology Scrutinized"
    Washington Post (06/07/04) P. A14; Vedantam, Shankar

    Studies show that people make a strong emotional connection to technology, and tend to anthropomorphize it. Rosalind Picard, director of Affective Computing Research at the MIT Media Lab, stated at a recent American Association for the Advancement of Science lecture, "The way people interact with technology is the way they interact with each other." Such observations are encouraging affective computing research, which is viewed as a potential revenue-generating stream by marketers--but it is also bringing strong ethical concerns to the fore. Picard's research demonstrates that people often grip the mouse tighter and assume a tenser sitting position when they grow irritated with the computer, while other studies indicate that many people react violently, inflicting physical or verbal abuse upon the machines. "Emotionally intelligent" computing solutions are being investigated by researchers in several ways: One encourages users to attach social attributes to the machines, which can actually intensify the user's frustration when such an approach is done incorrectly. The method also uses technology designed to appeal to the user's pride; Stanford University professor Clifford Nass cites a spell-checker that compliments users when they successfully spell difficult words as an example. In another solution, a person's posture, mood, and stress levels are read by sensors and processed by the computer, which then uses software to respond in an appropriate manner in an effort to relax the user if he or she is frustrated. Nass believes that computerized learning can be enhanced by software that establishes a virtual "teacher" and a virtual "student" so that the real student does not feel that he or she is being singled out all the time.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Building a Better Bot"
    Ottawa Citizen (06/05/04); Staples, Sarah

    Automated machines that work in teams are being developed for such tasks as building scientific outposts on the moon, studying a planet's makeup for signs of past life, and paving the way for space colonization. NASA Jet Propulsion Laboratory computer scientist Terry Huntsberger is leading researchers in the development of Cliffbots, two-wheeled cooperative devices that work in teams of three: Two Cliffbots attach themselves to the edge of a canyon and lower a third down cliffs; the robots could be employed to establish a permanent lunar base within 10 years. The Defense Advanced Research Projects Agency has developed Centibots, three-wheeled machines that are being trained to perform military reconnaissance and search-and-rescue missions, but whose applications could extend into space. Cooperation between Centibots is coordinated by NASA's CAMPOUT software, which sets up a decision-making framework in which each individual unit makes decisions that can be precluded by other robots higher up the chain of command. CAMPOUT could not have been created without research from the University of British Columbia's Laboratory of Computational Intelligence, whose pioneering work in cooperative robot technology includes training robots to play soccer. In a few months, NASA will unveil a mockup of a lunar habitat that would be constructed by robots coordinated by Earth-based operators, while researcher William L. Whittaker is working on small robots designed to fix spacecraft by scuttling over the surface by molecular tension. Palo Alto Research Center scientists have created PolyBots, tiny machines equipped with sensors, computers, and motors that can link into long chains to perform various functions. Such machines could ultimately lead to robotic building blocks that can morph into different practical objects as well as repair themselves.
    Click Here to View Full Article

  • "First Quantum Cryptography Network Unveiled"
    New Scientist (06/04/04); Biever, Celeste

    BBN Technologies' Quantum Net (Qnet) was inaugurated on June 3 with the first transmission of quantum-encrypted data packets across the network. The Cambridge, Mass.-based Qnet consists of a fiber-optic cable network threaded 10 kilometers from BBN to Harvard University, while the network's half-dozen servers can be integrated with conventional servers and clients on the Internet. The data piped along Qnet is encrypted using keys extracted from the exchange of a set of individual, polarized photons, and any attempt to intercept the photons changes their quantum state and tips off the network to the presence of eavesdroppers. Qnet, whose development is funded by the Defense Advanced Research Projects Agency, is the first network of more than two nodes to use quantum encryption. The network's inventors say the deployment of additional nodes in banks and credit card companies could make the exchange of sensitive data over the Internet even safer. However, BBN's Chip Elliott reports that quantum cryptography is not a totally secure solution, as the real-world implementation of quantum keys still leaves room for hackers to eavesdrop without being detected: For instance, a hacker could potentially exploit a laser's tendency to occasionally generate more than one photon by siphoning off the surplus photons and cracking the quantum key. Quantum encryption-capable computer systems tend to be large and costly, while their photon transmissions become disrupted by noise across distances that exceed 50 kilometers, though the distance problem could be mitigated by the air transmission of quantum keys via tiny, aligned telescopes.
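    The key-sifting and eavesdropper-detection scheme described above can be illustrated with a toy simulation of a BB84-style exchange (a sketch of the general protocol, not BBN's implementation; the function and variable names are invented for illustration):

```python
import secrets

def bb84_error_rate(n_photons=4096, eavesdrop=False):
    """Toy BB84-style exchange: Alice encodes random bits in random bases,
    Bob measures in random bases; only matching-basis positions survive
    'sifting'. An eavesdropper who measures in the wrong basis disturbs
    the photon, which shows up as errors in the sifted key."""
    alice_bits  = [secrets.randbelow(2) for _ in range(n_photons)]
    alice_bases = [secrets.randbelow(2) for _ in range(n_photons)]  # 0 = rectilinear, 1 = diagonal
    bob_bases   = [secrets.randbelow(2) for _ in range(n_photons)]

    bob_bits = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        photon_basis = a_basis
        if eavesdrop:
            e_basis = secrets.randbelow(2)      # Eve guesses a basis
            if e_basis != photon_basis:
                bit = secrets.randbelow(2)      # wrong guess randomizes the bit
            photon_basis = e_basis              # photon is re-sent in Eve's basis
        if b_basis == photon_basis:
            bob_bits.append(bit)                # matching basis: faithful readout
        else:
            bob_bits.append(secrets.randbelow(2))  # mismatched basis: random outcome

    # Alice and Bob publicly compare bases and keep only the matches.
    kept = [i for i in range(n_photons) if alice_bases[i] == bob_bases[i]]
    errors = sum(alice_bits[i] != bob_bits[i] for i in kept)
    return errors / len(kept)
```

    Without an eavesdropper the sifted keys agree exactly; with one, roughly a quarter of the sifted bits disagree, which is the statistical tip-off the article describes.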
    Click Here to View Full Article

  • "What Is Google's Secret Weapon? An Army of Ph.D.'s"
    New York Times (06/06/04) P. BU3; Stross, Randall

    Google, the next challenger to Microsoft's dominance in personal computing, may have an advantage in that its entire culture is infused with research lust: The company famously hires Ph.D.'s and encourages them to pursue independent projects alongside their everyday work at the company. Though Google has not recently updated its numbers of Ph.D. employees, there are probably more than 100 working for the 1,900-employee company, including its CEO Eric Schmidt. Among the revelations Google made in its Securities and Exchange Commission filings was that it gave employees a number of workplace-friendly perks, including on-site doctor visits and washing machines. Google is trying to create an atmosphere where innovation comes with day-to-day work, while Microsoft, in contrast, counts 700 employees in its separate research group and last month said it was cutting benefits for workers. University of Washington computer science professor Ed Lazowska says Google has captured as many computer science Ph.D.'s from the Seattle university as has Microsoft, which is about 30 times larger and based nearby in Redmond, Wash.; he notes that Google's practice of allowing employees one day per week to pursue individual projects is a big reason for the disparity. Not all technology companies that focus on Ph.D. talent succeed, however: The Xerox Palo Alto Research Center and Steve Jobs' NeXT Computer created laudable computer designs but failed in the marketplace, and the same could happen to Google if it succeeds at innovation but falls short in business acumen. Microsoft is sticking with its practice of hiring bachelor's and master's degree holders, according to company college recruiting head Kristen Roby, who says, "We're huge believers in hiring potential." She adds that Ph.D. holders are less likely to fit into Microsoft's demanding production schedule, which is much different from Google's continuous online updates and beta tests.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "ACM's CHI 2004"
    Wave Report (05/28/04); Latta, John

    The CHI 2004 conference showcased ideas that will transform how people interact with technology, including practical demonstrations and cutting-edge concept presentations; the conference was hosted by the ACM Special Interest Group on Computer-Human Interaction (SIGCHI), and included demonstrations by major consumer electronics players. CHI 2004 revolved around several main threads: The shift from desktop computing to ubiquitous computing environments, the display and mobile technologies that support this shift, and the ambient intelligence concept that shapes this environment. Sony highlighted a number of ongoing projects, including using pen devices for data manipulation across multiple machines. Sony Computer Science Laboratories researcher Jun Rekimoto described the Pick and Drop concept where a pen device is assigned a portion of data and can then move it from one computer to another; his research is important in that it involves principles of direct manipulation taken from the real world and applies them to human-computer interfaces. Intel demonstrated new RFID research that focused on cashier-less retail checkout, and the study showed possible complications and the importance of learned user patterns and user confidence in the checkout environment. Hewlett-Packard studied how ubiquitous technology could aid working parents by enabling them to coordinate important tasks from their car, such as assigning a trusted individual to pick up their child from school if they were running late, for example. Fraunhofer Institute AMBIENTE research head Robert Steitz posed this question at CHI 2004: With the disappearance of the desktop as we know it, will people even need buildings? He said the answer was yes, but that buildings would have to focus on facilitating cooperative group activities and spontaneous communication. Samsung also presented research on gesture-based controls for television at the conference.
    Click Here to View Full Article

  • "Once More Please, With Feeling"
    IST Results (06/02/04)

    Brigitte Krenn, coordinator for the Information Society Technologies program's Net Environments for Embodied Emotional Conversational Agents (NECA) project, believes the Internet's appeal could be significantly expanded through more "human" computer interfaces. Her project aims to make embodied conversational agents (ECAs), or avatars, more credible as persons by imbuing them with "multimodal" means of communication and personality through the employment of technologies such as speech synthesis and situation-based natural language generation. "We simulate embodied conversation, by generating stories involving animated characters who work with each other," Krenn explains. NECA's Web site features a pair of research demos: One of them, eShowroom, involves a simulated dialogue between a car salesman and a customer designed to engage the visitor while simultaneously relaying useful data. Each avatar's personality--its friendliness, politeness, etc.--must be programmed prior to the beginning of the conversation, while the visitor chooses the level of relevance for various automobile features. The characters use facial expressions, posture, gestures, and voice modulation to communicate their emotional states. The second NECA demo employs the same conversational animation technology, but applies it in a different manner by using the existing virtual community of Spittelberg. "Our community has real people in it, each represented by an avatar, with email and chatroom facilities," notes Krenn.
    Click Here to View Full Article

  • "Access Patterns Organize Data"
    Technology Research News (06/09/04); Patch, Kimberly

    Old Dominion University researchers have devised a method that mimics the brain's ability to order information so that connections can be automatically established between digital objects; such a technique could one day enable information repositories to self-organize based on the way users access data. "We want to put the information itself in the driver's seat where possible, and give it the capabilities to adapt and maintain its own integrity," explains Old Dominion researcher Aravind Elango. The technique was tested using 150 "buckets" that supplied data about a music band and were set up with random connections; the connections were changed as users traversed the system in accordance with Hebb's Law of Learning, which posits that the linkage between two neurons grows stronger as the neurons are triggered in rapid succession. Elango says that each individual bucket tracks the two most recently visited buckets and sets up new connections to those buckets according to users' travels, so that the more relevant links are ranked higher in each bucket. "The system showed that after significant user traversal, each bucket contains a set of links [that] are more relevant to the music band," states Elango, who adds that this configuration took place despite the fact that the users were a mixed group and not enthusiasts of any specific musical genre. The demonstration serves as proof that digital objects can yield dynamic recommendations for users, if enough intelligence is added. Elango says the buckets' ability to contain any information allows the system to be used with Web pages. The technique could not only be diversely applied to Internet recommender systems, but be employed to tailor the examination of the Internet's characteristics according to users' preferences.
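    The Hebbian update the researchers describe--each bucket linking back to the most recently visited buckets, with frequently co-visited links ranked higher--can be sketched as follows (an illustrative model with invented bucket names and a symmetric update rule, not Old Dominion's actual code):

```python
from collections import defaultdict, deque

class Bucket:
    """An information 'bucket' that strengthens links to buckets visited
    close together in time, per Hebb's Law of Learning."""
    def __init__(self, name):
        self.name = name
        self.links = defaultdict(float)   # neighbor name -> link weight

    def ranked_links(self):
        # most strongly co-activated buckets first
        return sorted(self.links, key=self.links.get, reverse=True)

def record_traversal(buckets, path, history=2):
    """Replay a user's path; each visit reinforces links between the
    current bucket and the `history` most recently visited ones."""
    recent = deque(maxlen=history)
    for name in path:
        for prev in recent:
            if prev != name:
                buckets[prev].links[name] += 1.0   # prev -> current grows stronger
                buckets[name].links[prev] += 1.0   # symmetric co-activation
        recent.append(name)

buckets = {n: Bucket(n) for n in ("bio", "tour", "albums", "lyrics", "shop")}
record_traversal(buckets, ["bio", "albums", "lyrics", "albums", "tour"])
# "albums" now ranks "lyrics" (co-visited twice) above "bio" and "tour"
```

    After enough traversals, each bucket's highest-weighted links point to the pages users actually visit together, which is the self-organizing behavior the article reports.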
    Click Here to View Full Article

  • "Survey Finds Older IT Employees"
    Federal Computer Week (05/28/04); Michael, Sara

    A new report by the CIO Council's Workforce and Human Capital for IT Committee suggests that the information technology workforce of the federal government is aging, and may need an upgrade in skills. The approximately 20,000 responses to the Clinger-Cohen Assessment Survey reveal that the typical government IT worker is between the ages of 46 and 50, has reached the general schedule 13 level, has worked in government for more than 20 years, and does not have much experience working in the private sector. Moreover, the typical IT worker will probably move to another agency within the next three years, and retire within 10 to 20 years. "Although the workforce is aging, it appears that those closest to retirement do not plan on retiring when they are eligible," according to the survey analysis. The survey also finds that federal IT workers were proficient in IT-related skills such as word processing, email, and spreadsheet software but were lacking technical competencies such as artificial intelligence and embedded computers, and that general competencies such as interpersonal skills, problem solving, and contracting and procurement were needed the most. The survey recommends that the CIO Council's workforce committee collaborate with the Office of Personnel Management on using career development and guidelines on training, education, certification, and experience to improve the quality of the IT workforce. The groups also are encouraged to focus on government-wide strategic workforce planning and workforce elements in federal enterprise architecture.
    Click Here to View Full Article

  • "Researchers Build What They Envision as Wearable Computers"
    East Valley Tribune (AZ) (06/01/04); Taylor, Ed

    Frederic Zenhausern of Arizona State University's Applied NanoBioscience Center and University of Arizona professor Ghassan Jabbour have teamed up to create two prototype "biometric bodysuits" that incorporate microelectronics. The Scentsory Chameleon Bodysuits were showcased last month at NextFest 2004 in San Francisco. One prototype is a "personal wellness garment" fashioned from clear vinyl and white plastics: Such apparel could be employed in the future as a disease diagnosis or drug-delivery system, as well as a tool to monitor the wearer's vital signs and engage the user in interactive games and other leisure activities. Zenhausern noted that the fabric's appearance could become tunable by downloading colors and patterns from the Internet. The second prototype is a military uniform enhanced with sensors to detect chemical or biological agents in the air, low-temperature fuel cells to power the soldier's equipment, and a flexible electroluminescent screen that the user can access for information. Institute for Global Futures CEO James Canton said the military will likely be the initial adopter of biometric garments, but civilians could embrace the technology in light of personal security concerns. Visiting professor at the Harvard University Graduate School of Design Sheila Kennedy noted that the technology could be applied to buildings, one example being window shades embedded with organic light emitting diodes that tap sunlight to generate electricity.
    Click Here to View Full Article

  • "'NIP' and Tuck?"
    Tech Central Station (06/03/04); Basulto, Dominic

    The federal budget plan for fiscal year 2005 has added more fuel to the debate over innovation, writes Dominic Basulto. The proposed budget would cut funding for research and development for 21 of the top 24 federal agencies, while advocates of innovation are concerned that basic research in areas that are not the most exciting or immediately promising financially will suffer. Although the Bush administration has plans to increase R&D spending by another $5.45 billion, $4 billion would go to R&D that would be of interest to the Pentagon at a time when Defense-related R&D already represents more than 55 percent of all federal R&D. The benefits of R&D spending on innovation are obvious to advocates of a national innovation policy (NIP), who can point to the impact of Defense Advanced Research Projects Agency (DARPA) funding on the development of the Internet. The Bush administration suggested that it backs important technology in a series of specific policy measures issued April 26 such as universal broadband Internet access and better health care IT, but it will let the private sector carry most of the burden of bringing technologies to the marketplace. Now may be the time to be proactive about innovation, considering U.S. competitiveness appears to be on the decline and companies are moving R&D offshore. The industry-sponsored Task Force on the Future of American Innovation says innovation in the U.S. has reached a "tipping point" while competition from Asia increases.
    Click Here to View Full Article

  • "A SuperMap for Special Ops--Or Business Travelers"
    USC Information Sciences Institute (05/28/04); Mankin, Eric

    A team of researchers led by computer scientist Craig Knoblock of the University of Southern California's Viterbi School of Engineering Information Sciences Institute developed HeraclesMaps, a program that aggregates many years' worth of geographical data and presents it as a simple interface that both special operations forces and business travelers can use. Knoblock noted earlier that data culled from numerous sources may possess variable deviations: "The applications that integrate information from various geospatial data sources must be able to overcome these inconsistencies accurately, in real time and for large regions." He explains that HeraclesMaps absorbs satellite imagery and mapping data, and boasts an interface that anticipates inquiries so that users can access the full range of the data rapidly and intuitively. Examples of such inquiries include instructions to find a route between two locations that soldiers can take without being observed by enemy forces. HeraclesMaps, which was developed in collaboration with special operations forces veteran consultants, stems from the Heracles project, which focuses on the extraction and dynamic rearrangement of information gathered from disparate sources through artificial intelligence and other technologies. The addition of data from Web-based or other sources could be supported by other Heracles applications, which would allow people to find all-night restaurants that serve a particular cuisine, for instance.
    Click Here to View Full Article

  • "Hacking Sparks Need for Complex Passwords"
    Associated Press (06/01/04); Jesdanun, Anick

    Regular passwords combined with a variable physical component, dubbed two-factor authentication, are becoming more popular as scammers get more clever about stealing passwords. Those who use static passwords often use combinations that are easy to guess, or use the same password for multiple sites. Scammers can use keystroke recorders, phishing emails, software, and viruses to gather passwords, and analysts say that insecure passwords are responsible for corporate network hacking, unauthorized financial transfers, and privacy breaches. Vasco Data Security International provides small devices that accept a regular password and issue a second unique code, based on the time and the unit; that code is entered into a Web site to complete a transaction. MasterCard International has been testing similar systems, using smart chips in credit cards and one-time passwords. Two-factor authentication has remained limited in the United States because companies are fearful of inconveniencing their customers, explains American Bankers Association senior policy analyst Doug Johnson. U.S. businesses are trying to keep regular passwords strong, rejecting Web site passwords that are easy to guess, for example. For two-factor authentication to spread, either password-generating devices must be cheaper, or biometric readers must be standard on computers. RSA Security's Jason Lewis believes that companies will have to create services so that one device can work at many sites. Scandinavian banks are working with government agencies and utilities to develop universal two-factor authentication systems, while the U.S.'s Liberty Alliance Project is developing identity management standards.
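    The device-generated "second unique code, based on the time and the unit" works by mixing a shared per-device secret with the current time window, so each code is valid only briefly. A minimal sketch of the general idea (the HMAC-based construction here is illustrative, not Vasco's proprietary algorithm):

```python
import hmac, hashlib, struct, time

def one_time_code(secret: bytes, t=None, step=60, digits=6):
    """Generic time-based one-time code: HMAC the current time window
    with a shared secret and truncate the result to a short decimal
    code that the server, holding the same secret, can recompute."""
    counter = int((t if t is not None else time.time()) // step)
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                              # pick 4 bytes from the MAC
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return f"{code % 10**digits:0{digits}d}"
```

    Two computations inside the same 60-second window yield the same code; once the window rolls over, a stolen code is useless, which is why the second factor defeats keystroke recorders that capture only static passwords.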
    Click Here to View Full Article

  • "Your Next Computer"
    Newsweek (06/07/04) Vol. 143, No. 23, P. 50; Stone, Brad; Flynn, Emily

    The evolution of cell phone technology is characterized by a shrinking form factor coupled with increasing intelligence, speed, and connectivity, and this trend is inspiring advocates to forecast the eventual replacement of PCs by mobile phones. This concept is sharply criticized by PC adherents who describe cell phones' Internet linkage as too slow to effectively perform operations comparable to those within the capabilities of current computers. "Hundreds of millions of people are not going to replace the full screen, mouse and keyboard experience with staring at a little screen," argues Intel's Sean Maloney. Yet proponents such as PalmOne CTO and Palm Pilot inventor Jeff Hawkins believe future technological breakthroughs will allow mobile phones to transcend their current limits. "One day, 2 or 3 billion people will have cell phones, and they are all not going to have PCs," Hawkins boasts. A growing number of cell phones can browse the Web, send email, and even take pictures, while forthcoming products feature built-in digital cameras and keyboards, memory slots, Global Positioning System antennas, and Wi-Fi hotspot access. Industry observers expect the next few years will witness the emergence of handsets with 1 GB of flash memory, if not more, as well as "location-based services" facilitated through mobile phones' geolocation ability. Adding usable keyboards to cell phones is a key challenge that scientists are hoping to crack through products such as Canesta's "projection keyboard," or circumvent altogether via speech-recognition technology.
    Click Here to View Full Article

  • "New Threat, New Defense"
    Aviation Week & Space Technology (05/31/04) Vol. 160, No. 22, P. 46; Fulghum, David A.

    The threat of an attack by missiles laden with chemical or biological materials is spurring the development of network-centric warfare solutions, among them a battle management, command and control (BMC2) component for next-generation E-10A ground surveillance aircraft, which the Pentagon calls the cornerstone of cruise missile defense. J. Michael Borky, who is leading a team of Raytheon and Lockheed Martin researchers to develop such a system, says the goal is to significantly automate and enhance the quality of the myriad decisions operators must make to select targets based on intelligence reports. BMC2 would be based on a variety of technologies, including multi-frequency sensors, automatic target recognition, machine-to-machine links, automated decision-making, and sophisticated horizontal communications. Borky says the Air Force is seeking an "intuitive" process in which "I put a cursor over a target, and down the sides of my head-up display comes everything I wanted to know." Any net-centric operation must fulfill two key criteria: A unified perspective of the battlefield and rapid data flow. It takes the Air Force about 10 minutes to destroy a target, but the researchers say they can potentially cut that time by 50 percent. The Multi-Platform Radar Technology Insertion Program will also be essential. Additional components likely to have important roles in cruise missile defense and E-10 BMC2 include an open architecture; access to the synchronized database via a template or subscription; "engagement folders" abstracted from objects of interest to enable the smoother movement of data around the network; automated, machine-to-machine connections between intelligence and targeting; software radios; transformational communications technologies such as broadband; and multi-level classification.
    Click Here to View Full Article

  • "Send in the Swarm"
    Fortune (06/14/04) Vol. 149, No. 12, P. 52; Brown, Stuart F.

    iRobot is developing robots that exhibit group behavior modeled after insect swarms for military applications such as reconnaissance and land-mine disposal with funding from the Defense Advanced Research Projects Agency. The Software for Distributed Robots project involves the development of tiny wheeled machines dubbed SwarmBots that are equipped with electric motors, rechargeable batteries, a microprocessor, a "bump skirt" to detect and avoid collisions, a small color camera to recognize simple objects, and light-sensitive sensors. An array of infrared transmitters and receivers are used to facilitate communications between the robots. Some military strategists envision a scenario in which ground-based swarm robots coordinate their activities with unmanned aerial combat drones. In a SwarmBot demo designed to simulate the exploration of an unfamiliar interior environment, the machines boast multicolored lights that help people monitor their actions: A blue light represents a robot moving into unknown territory; a red light indicates that a robot is trying to maintain uniform spacing from its neighbors; a green light signals when a robot is returning to the charging dock to replenish its power supply; and the simultaneous flashing of all three lights means that robots have encountered and are "guarding" an object. Robots flash blue on the leading edge of the swarm, indicating their function to expand the frontier, while behind them are robots that attempt to maintain equidistant positions from each other. When the robots head back home, some robots elect to serve as "bread crumbs" that maintain line-of-sight communications. iRobot researcher James McLurkin notes, "If you have a whole lot of robots and they spread out evenly across an area, you no longer need to map the area. You can simply map the robots."
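    The "red light" behavior described above--each robot trying to hold uniform spacing from its neighbors--amounts to a simple local rule that needs no map or central controller. A toy dispersion model (an illustration of the even-spacing idea only, not iRobot's actual controller; all names are invented):

```python
import math

def disperse(positions, spacing=1.0, gain=0.1, steps=300):
    """Each robot nudges itself so every neighbor ends up at the desired
    spacing: too-close neighbors push it away, too-far neighbors pull
    it in, like springs set to a rest length of `spacing`."""
    pts = [list(p) for p in positions]
    for _ in range(steps):
        moves = []
        for i, (xi, yi) in enumerate(pts):
            dx = dy = 0.0
            for j, (xj, yj) in enumerate(pts):
                if i == j:
                    continue
                d = math.hypot(xj - xi, yj - yi) or 1e-9
                f = gain * (d - spacing) / d     # spring toward the set distance
                dx += f * (xj - xi)
                dy += f * (yj - yi)
            moves.append((dx, dy))
        # all robots move simultaneously, as in a real swarm
        pts = [[x + mx, y + my] for (x, y), (mx, my) in zip(pts, moves)]
    return pts

# three clustered robots settle into an equilateral triangle of side ~1
robots = disperse([(0.0, 0.0), (0.1, 0.0), (0.0, 0.1)])
```

    Running the rule on many robots spreads them evenly over an area, which is the point of McLurkin's remark: once the robots are evenly distributed, mapping the robots is as good as mapping the space.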
    Click Here to View Full Article

  • "The Eyes of the Machine"
    EDN Magazine (05/27/04) Vol. 49, No. 11, P. 65; Cravotta, Robert

    EDN technical editor Robert Cravotta offers up machine-vision systems as an example of real-world data recognition systems, and explains that many factors have to be considered when choosing the best system. Users must select the kind of illumination or signal source that best accentuates and contrasts the key elements of target objects or materials; this signal source could emanate from the object, consist of natural light reflected off the object, or originate from light sources specifically associated with the machine-vision-design arrangement reflected off the object. Users can make data-processing requirements less burdensome by lowering or eliminating signal noise and variability and augmenting the contrast between desirable features. Cravotta cites stereoscopic data-capture systems employed by autonomous vehicles and in robot-assisted surgery as examples of machine-vision systems that use multiple sensors to view overlapping data in order to perceive depth accurately. Choosing the best camera component for the machine-vision system requires users to know what features must be highlighted so that they can be accurately and reliably extracted from the produced image: Traits of the features to be considered include the texture, color, size, quantity, markings, contours, and orientation of the target object, as well as the illumination requirements, space limitations, the impact of shadows and bright reflections, and environmental control. Cravotta notes that opting for a high-resolution system generates very precise images but increases the bandwidth and data-processing load, which can be reduced by lowering the frame rate. When image processing is assigned to the PC processor, users can avail themselves of a wide selection of third-party vision-software products; however, they have less latitude with non-PC-based vision systems that relegate image processing duties to DSPs and FPGAs. Cravotta writes that the application determines the optimal image-processing configuration.
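    The resolution-versus-bandwidth trade-off is easy to quantify: the raw data rate grows with pixel count and frame rate, so halving the frame rate recovers half of what a higher-resolution sensor consumes. A back-of-the-envelope helper (assuming uncompressed 8-bit monochrome pixels; the resolutions below are illustrative examples):

```python
def raw_bandwidth_mbps(width, height, fps, bytes_per_pixel=1):
    """Raw data rate of an uncompressed video stream, in megabits per
    second -- the processing load a higher-resolution sensor imposes."""
    return width * height * fps * bytes_per_pixel * 8 / 1e6

# quadrupling the pixel count at half the frame rate still roughly
# doubles the data the vision processor must handle
vga  = raw_bandwidth_mbps(640, 480, 30)    # ~73.7 Mbit/s
sxga = raw_bandwidth_mbps(1280, 1024, 15)  # ~157.3 Mbit/s
```

    Numbers like these drive the choice Cravotta describes between a general-purpose PC processor and dedicated DSPs or FPGAs for the image-processing stage.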
    Click Here to View Full Article

  • "10 Best Bet Technologies"
    Network Magazine (05/04) Vol. 19, No. 5; Wittmann, Art; Greenfield, David; Dornan, Andy

    Network Magazine pinpoints 10 technologies likely to transform corporate networking, all of which share common traits: they are horizontal, able to benefit the bottom line by boosting productivity and reducing costs, in a nascent stage of development, and likely to remain both radical and practicable for the next two to five years. All of the technologies are also attuned to the overall vision of corporate networking's maturation into a user-centric process characterized by intelligent infrastructure, ubiquitous access, and a service-oriented platform. Federated identity management will supply a unified authentication platform for companies to securely identify participants in Web-based transactions, resulting in a slimming down of their internal business processes. Virtual collaboration and interaction between distributed workgroups will be transformed by presence technology, which tags online identities with availability markers and makes resources and people easier to track; trusted computing will facilitate the widespread implementation of Public Key Infrastructure by embedding a security and identity processor into all PCs, a development that will beef up protection against malware and bugs. UltraWideband radios and adaptive mobile networks will converge to create low-power, next-generation wireless that will revolutionize network infrastructure by incorporating everything within radio range into the user's personal network, while the replacement of InfiniBand and Fibre Channel connections with high-speed Ethernet via the deployment of the Ethernet/IP data center will lower the cost of rolling out a tiered, service-oriented infrastructure. XML switching will enable numerous automated services by serving as the platform for business-to-business transaction processing, and service-based management will simplify, accelerate, and lower the cost of services delivery by allowing IT to coordinate network equipment platforms. The maintenance and administration of WANs will be streamlined with Ethernet in the First Mile and Ethernet Virtual Connections, which will also cut costs and offer higher access speeds, while several proposals and initiatives are on the boards to revise TCP's windowing to support faster data transfers, which would allow transfer-intensive applications to tap existing networks for high-speed transport. Finally, virtualization technology will replace networking products, increase security, cut costs, and supplant proprietary management systems with standardized ones.
    Click Here to View Full Article