Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published every week on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either HP or ACM. To send comments, please write to technews@hq.acm.org.
Volume 5, Issue 580:  Monday, December 8, 2003

  • "Nations Chafe at U.S. Influence Over the Internet"
    New York Times (12/08/03) P. C1; Schenker, Jennifer L.

    World governments are frustrated at the singular control the United States exerts over Internet governance, and officials at the United Nations are readying a new Internet governance structure that would make it a more multilateral endeavor. Critics of the current system say the U.S.-established Internet Corporation for Assigned Names and Numbers (ICANN) needs to be opened up to international influence. U.S. influence in Internet structure is obvious in many cases, such as the fact that MIT has more available Internet addresses than China, which will be home to more than 50 percent of all Internet users by 2007, according to some estimates. Under current proposals, ICANN's oversight committee would have permanent representatives from several U.N. organizations and elected representatives from each continent. The United States representative would permanently hold the oversight committee presidency, said United Nations Information and Communication Technology Task Force vice chairman Talal Abu-Ghazaleh. Among the invited attendees at U.N. meetings to discuss future Internet governance are European IT commissioner Erkki Liikanen, U.N. Secretary General Kofi Annan, and technology experts Tim Berners-Lee, Esther Dyson, and Nicholas Negroponte. Global corporate representatives from companies such as Boeing, Siemens, Alcatel, Vodafone, and Microsoft will also be present at scheduled meetings this month. ICANN President Paul Twomey, whose role has been severely diminished in the discussions, argued that ICANN does not prevent the international community from addressing many of the stated non-technical concerns, such as how to manage spam, pornography, and the growing digital divide between nations.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "For Some Older IT Workers, Job Prospects Improve"
    EarthWeb (12/03/03); Gaudin, Sharon

    Older IT professionals have had to face not just the economic recession, shrinking IT budgets, and increased offshore outsourcing, but age discrimination as they struggle to find work. Over-50--sometimes even over-30--IT workers are often cast aside by younger hiring managers as unimaginative or out of step with current work trends and technology. Yet many tech companies, smarting from the dot-com meltdown and depressed economy, can boost their survivability by employing the experience and business acumen such workers possess. Unemployed IT worker Allan Shapiro, 61, says that younger managers feel threatened by older subordinates because of their advanced knowledge, and are unwilling to consider how their experience can be leveraged to help the company. Shapiro adds that older workers are also more willing to work longer hours. A recent report from Challenger, Gray & Christmas indicates that the value of older IT workers has begun to rise from the doldrums: The average job search time for out-of-work managers and executives age 50 and up is declining 10 times faster than that of younger job seekers; over-50 search times fell 19 percent from 4.9 months in the fourth quarter of 2002 to 4 months at the end of the third quarter in 2003, while under-50 search times experienced only a 1.8 percent falloff to 3.8 months over the same period. Challenger, Gray & Christmas CEO John Challenger reports that younger job seekers could soon find their job-search times outpaced by older IT job seekers. "Companies have recognized that experience is an essential piece of the puzzle," he declares.
    Click Here to View Full Article

  • "NASA: Looking Back at Earth"
    TechNewsWorld (12/06/03); Halperin, David

    NASA conducts a number of programs designed to model how the Earth works, and develops new computer technologies to help in its task. NASA operates 18 satellites that gather raw data such as ocean currents, temperatures, and the spread of invasive species for analysis back on Earth. The analyses yield a number of insights into geology, oceanography, and climatology studies, among others. James Fischer, project manager of NASA's Computational Technology Project, says his group's work involves finding outside research groups to develop advanced technology platforms, especially software. The Earth System Modeling Framework, for example, is meant to allow different research organizations conducting earth modeling projects to interchange data and model components. NASA also pioneers hardware advances, demonstrated by its SGI Altix supercomputer with 512 processors running a Linux-based system. The machine runs the most detailed Earth model to use real-world data, called Estimating the Circulation and Climate of the Oceans (ECCO). The single system image afforded by the supercomputer allows for a simpler programming development environment and faster interconnections, says Bob Ciotti of NASA Ames Research Center's Terascale Application Group. Another Ames project is the Information Power Grid, which will link major NASA computer resources together for both better utilization and capacity, something Advanced Supercomputing division head Walt Brooks says will save significant amounts of money for the agency.
    Click Here to View Full Article

  • "Wi-Fi to Face Interoperability Challenges"
    eWeek (12/04/03); Hachman, Mark

    Industry executives gathered at the recent Wi-Fi Planet show in San Jose stressed that establishing interoperability among an "alphabet soup" of IEEE 802.11 technologies will be a major challenge as the industry embarks on a project to implement multimedia services across Wi-Fi starting in 2004. "Here we have an established base of interoperability among Wi-Fi manufacturers, but it will be more critical as we move into the next phase," commented Wi-Fi Alliance technical director Greg Ennis. The development of 802.11e will directly affect voice-over-IP networks, which will have to be upgraded at least twice in order to use the protocol to its full potential, according to Texas Instruments' Ian Sherlock. The first half of next year will see the rollout of the Wireless Multimedia Extensions (WME) substandard, which uses a less complex contention-based scheme known as Enhanced Distributed Channel Access, explained Sherlock; his company discovered in simulated tests that a WME-based network can accommodate approximately 50 handsets. Sherlock said the second 802.11e phase, Wireless Scheduled Multimedia (WSM)--which is currently undergoing definition by an 802.11e subgroup--should debut in the latter half of 2004. WSM will use Hybrid Controlled Channel Access (HCCA) to initiate exchanges between access points by occasionally "polling" handsets, and Sherlock noted that HCCA is less vulnerable to data loss under load, adding that the protocol can probably support about twice as many handsets as WME. The IEEE intends to finalize the specification for 802.11h, a Europe-specific addition to the MAC layer for 5 GHz wireless LANs, by the end of 2003; the Japanese 802.11j spec and the 802.11i security standard should be finalized in 2004. The 802.11n spec, which should exceed 100 Mbps, will be fleshed out by the IEEE in 2005.
    Click Here to View Full Article

  • "Linux Guru: Move Quickly to New Kernel"
    CNet (12/05/03); Shankland, Stephen

    Marcelo Tosatti, who maintains the 2.4 Linux kernel by the authority of Linux developer Linus Torvalds, announced via a Dec. 1 posting to the Linux Kernel Mailing List that the 2.6 kernel is sufficiently mature to form the basis of new projects; he declared that he will accept some changes and support for specific new hardware in version 2.4.24, but versions 2.4.25 and up will only be released to address critical issues such as security holes. Some programmers are unhappy with Tosatti's decision: Jan Rychter aired his concerns on Dec. 3, noting that 2.4 is being transferred into a "pure maintenance" mode while people are under advisement to migrate to 2.6. "While people slowly start using 2.6, Linus starts 2.7 and all kernel developers move on to the really cool and fashionable things," he speculated. "2.6 bug reports receive little attention, as it's much cooler to work on new features than fix bugs." D.H. Brown analyst Tony Iams defended Tosatti's decision as sensible, arguing that both commercial and open-source software releases follow such a model. The 2.6 Linux kernel maintainer Andrew Morton said last month that the kernel should be released in December, while Tosatti predicted in his posting that the kernel could debut either this month or next month. Monday's debate about 2.4 and 2.6 was spurred by Silicon Graphics (SGI) programmer Nathan Scott's request that Tosatti include SGI's XFS file system software in the 2.4 kernel, which Tosatti refused; XFS was originally designed for SGI's Unix-based Irix operating system, but Iams said a change in emphasis to the Linux-based Altix Itanium server could work to SGI's benefit. However, the Fermi Accelerator Laboratory's Dan Yocum sent a posting in which he stated his organization would not adopt 2.6 until it is truly stable.
    Click Here to View Full Article

  • "Robotics Revolution"
    PC Magazine (12/03/03); Ulanoff, Lance

    Lance Ulanoff writes that robotics technology showcased at Comdex was impressive enough to make him consider that the field could one day become the savior of the tech industry. He was particularly taken with Paro, a Japanese robot that resembles a baby seal and reacts to petting as a therapeutic tool for the sick and infirm; Paro developer Takanori Shibata said the machine features sensors and voice recognition, so it can learn its name and differentiate between gentle and more aggressive stroking. Ulanoff also witnessed a demonstration of many robots, both autonomous and remote-controlled, at a nearby restaurant as part of a conference backed by VIA Technologies, whose low-power Mini-ITX motherboard was employed by all the machines. The VIA motherboards make the machines bot-PC hybrids, which could aid in the accelerated development of new robots. One of the companies showcased at the VIA conference, Robodynamics, showed off its Personal Droid Assistant, which can reportedly be used for security, entertainment, and telepresence. Robotics company representatives such as Robodynamics CEO Fred Nikgohar believe that robotics' movement into the consumer space is being driven by the maturation of enabling technologies such as wireless connectivity, sensors, microchips, and speech recognition, while Ulanoff writes that nanotechnology and imaging also play a significant role. Evolution Robotics' ER Vision--a software program that learns to recognize people and objects--was especially interesting. Paolo Pirjanian of Evolution Robotics said his company is creating the core components of a robotics platform that consists of object recognition, localization and mapping, and architecture, and commented that future robots will need to learn and "become capable of not just one application, but many--true multipurpose [robots]."
    Click Here to View Full Article

  • "Hackers Steal From Pirates, to No Good End"
    New York Times (12/08/03) P. C2; Schwartz, John

    Designers of rogue "Trojan horse" programs are adopting a decentralized peer-to-peer (P2P) network model that threatens to make the spread of their malware impossible to halt, warn computer experts. LURHQ computer specialist Joe Stewart found evidence of such a development when disassembling the Backdoor.Sinit program, which has been wending its way throughout the Net since late September. Machines taken over by Sinit cohere into a decentralized P2P network similar to Kazaa and other programs usually used to exchange music files. Until now, Trojan networks were vulnerable to shutdown because they relied on a single site for downloading instructions and software updates--but Sinit has eliminated this reliance. Worse, Stewart observes in a research paper that Sinit's presence on machines commandeered to serve pop-up ads and to download "porn-dialer" programs indicates that the program was primarily designed to be a profit-generating tool. "This Trojan is...further evidence that money, not notoriety, is now the major driving force behind the spread of malware these days," he contends. Counterpane Internet Security founder Bruce Schneier agrees with Stewart that the discovery is proof of a migration in hackers' motivation away from thrill-seeking and towards turning a profit from their malicious activities. Sophos development director Jesse Dougherty estimates that at least one third of all online spam is distributed by PCs contaminated by malware, and adds that the MiMail program, a Trojan that causes infected machines to attack the systems of anti-spam organizations, is further proof that Trojans are employed in spamming.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Smart Assistant Will Cut Driver Distraction"
    New Scientist (12/07/03); Graham-Rowe, Duncan

    Laws that ban drivers from using certain devices, passed to reduce the number of accidents attributed to distraction, have not halted the flood of in-vehicle electronics; in response, BMW and Robert Bosch engineers, partially funded by the German government, are developing a system that determines when the driver is too busy to handle incoming emails, phone calls, text messages, and other activities. The system employs an array of sensors to detect the layout of the road, other vehicles, and the driver's actions, and processes this data to anticipate upcoming situations and assign them a complexity rating. Should the rating pass a specific threshold, all distracting functions are turned off and incoming messages are diverted until the complexity level drops to acceptable limits. "We not only take into account the current situation, but also the oncoming road situation," observes Walter Piechulla of the University of Regensburg. He tested the system on a dozen motorists in Germany, and found that the tool significantly reduced their mental stress, no matter what their level of driving experience. In its final form, the system would alert drivers that gadgets are deactivated by means of an audio signal or a dashboard light, while a "danger earcon" would sound during a phone call to make the parties on either end aware that the call is distracting the driver from more critical functions. Paul Green of the University of Michigan's Transportation Research Institute remains unsure that such electronic aids are effective solutions, but notes that in-vehicle electronic device manufacturers are being pressured by the cell phone industry to make such technologies work. A U.K. Department for Transport representative says the British government would welcome additional in-car gadgets, as long as they do not distract motorists.
    Click Here to View Full Article
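
    The gating logic the article describes (fuse sensor readings into a complexity rating, then hold back distractions above a threshold) can be sketched in a few lines. This is an invented illustration: the weights, threshold, and sensor fields below are assumptions, not the actual BMW/Bosch formula.

```python
from dataclasses import dataclass

# Hypothetical weights and threshold; the article gives no actual numbers.
WEIGHTS = {"curvature": 0.4, "nearby_vehicles": 0.3, "driver_action": 0.3}
THRESHOLD = 0.6  # above this, distracting functions are suspended

@dataclass
class Situation:
    curvature: float        # 0 = straight road, 1 = sharp bend
    nearby_vehicles: float  # 0 = empty road, 1 = dense traffic
    driver_action: float    # 0 = cruising, 1 = overtaking or hard braking

def complexity(s: Situation) -> float:
    """Combine the sensor readings into a single complexity rating."""
    return (WEIGHTS["curvature"] * s.curvature
            + WEIGHTS["nearby_vehicles"] * s.nearby_vehicles
            + WEIGHTS["driver_action"] * s.driver_action)

def route_message(s: Situation, inbox: list, deferred: list, msg: str) -> None:
    """Deliver a message now, or defer it until the road quiets down."""
    (deferred if complexity(s) > THRESHOLD else inbox).append(msg)
```

    A real system would recompute the rating continuously and flush the deferred queue once the rating falls back below the threshold.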

  • "Gartner Sees 'Massive Disruption' to IT Workers"
    CNETAsia (12/04/03); Lui, John

    By 2008, the IT industry is expected to experience a "massive disruption to the IT workforce," according to Gartner, which predicts huge changes to the industry over the next few years. Software code will become more modular and reusable by 2008, as more companies tie their business processes to IT within a Web services framework. "Software code will be generated for a Web services environment," says Ian Bertram, vice president at Gartner Asia Pacific. "I can get a new service, and suck that code into my environment, rather than re-coding and paying for a coder." Large programming teams that keep month-long schedules will be replaced by smaller teams of business analysts using more robust data-modeling and code-creation tools. There will be more mergers, acquisitions, and bankruptcies involving technology suppliers, and the IT outsourcing trend will continue to grow. Today's programmers should expect that "pieces of their jobs will go away," says Bertram, and prepare for that change by acquiring broad business skills.
    Click Here to View Full Article

  • "Japan's Robot Developers Go Linux"
    Linux Insider (12/03/03); Krikke, Jan

    Japanese companies say the robotics industry could outpace the PC industry: In anticipation of this development, firms such as Mitsubishi are designing robots for practical use that are also convenient, inexpensive, ubiquitous, and powered by Linux software so that they may run on a standard platform. The incentive for developing practical robots is an economic one, since Japan's elderly population is increasing, birthrates are declining, a labor shortage is on the horizon, and the country's domination in the consumer electronics and household appliance industries is losing ground to other Asian competitors. Among the robots about to be rolled out or already in use are Sanyo's Hopis, an on-site physician's aide that can diagnose eye disease, take digital thermometer readings, obtain glucose levels of diabetics, question patients via a speech synthesizer, and transmit medical information wirelessly; Fujitsu's Maron-1, an electronic sentry that can notify owners of intrusions and operate household appliances, and is remotely controlled by an Internet-enabled phone; and Wakamaru, a machine from Mitsubishi that handles many of the duties a human nurse performs. Wakamaru's software platform is based on MontaVista Software's embedded Linux distribution and tool suite, which was selected on the basis of its sophistication, networking capabilities, and reliability. A major issue for robot developers is real-time computing, which along with reliability is essential for factory automation. Honda Research Institute's Tino Lourens notes that RTLinux combines quality with affordability, but "does not guarantee absolute time restrictions." Linux's emergence as Japan's standard robot platform is not assured--it faces competition primarily from Microsoft's Windows CE and Wind River Systems' VxWorks.
    Click Here to View Full Article

  • "That 1994 Feeling"
    Salon.com (12/04/03); Rosenberg, Scott

    Like HTML in 1994, RSS technology today promises to revolutionize the way people receive and distribute information. While RSS has yet to receive an easily understood moniker, such as HTML's World Wide Web title, it is slowly gathering grass-roots support in the same way HTML did. RSS is a "push" technology that allows users to subscribe to different feeds and receive content update notices, among other things. Currently, the most common application is update notices for blogs, news sites, and whatever other resources people want to keep current on. But because the RSS format uses structured XML data, developers have many options to create new types of RSS-based applications in the future. One obstacle holding back faster RSS adoption is the lack of intuitive user interfaces and simple subscription methods. For example, clicking on the orange "XML" buttons representing RSS links on many sites only produces pages of raw XML data, which is very confusing for non-technical people; it takes some special knowledge to realize that the proper way to use an RSS link is to copy its URL. Despite these relatively simple hurdles, a large and growing body of RSS users is already experiencing a new type of Web, where users create "human filters" for their information and do not have to work so hard to find the most current news and resources. RSS aggregators, available for free download online, allow users to easily tap hundreds of news and resource sites they favor--doing away with browser bookmarks. RSS technology also allows the user to easily become a publisher and distributor of online content, and will fulfill the blogosphere vision where "everyone will be famous for 15 people."
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)
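
    The raw XML that confronts users who click an orange "XML" button is straightforward for an aggregator to consume. As a minimal sketch (the feed content below is invented, and real aggregators also handle Atom variants, encodings, and HTTP polling), Python's standard library can pull headlines out of an RSS 2.0 document:

```python
import xml.etree.ElementTree as ET

# A minimal, invented RSS 2.0 feed -- the kind of raw XML a reader sees
# after clicking an orange "XML" button on a weblog.
FEED = """\
<rss version="2.0">
  <channel>
    <title>Example Weblog</title>
    <item><title>First post</title><link>http://example.com/1</link></item>
    <item><title>Second post</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

def headlines(rss_text: str) -> list:
    """Extract (title, link) pairs from an RSS 2.0 document."""
    channel = ET.fromstring(rss_text).find("channel")
    return [(item.findtext("title"), item.findtext("link"))
            for item in channel.iter("item")]
```

    An aggregator does little more than run this extraction on every subscribed URL on a schedule and show the reader which (title, link) pairs are new since the last poll.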

  • "Software Paraphrases Sentences"
    Technology Research News (12/10/03); Patch, Kimberly

    Cornell University researchers are developing computer programs that can automatically paraphrase sentences, a capability that could prove useful in machine translation, technologies to help disabled people, and computer processing of natural language. The technique borrows from computational biology and was applied to online journalism. The researchers gathered stories about an ongoing news topic covered by Reuters and Agence France-Presse, in this case Middle East violence, and used that body of work as source material for paraphrasing. MIT researcher Regina Barzilay, who recently worked on the Cornell project, says sentence patterns were found, as well as key facts and arguments; these basic elements are similar to the evolutionary traces common between genes, and are deduced using the same techniques used in computational biology. The software program also uncovered journalistic bias: It sometimes paraphrased "suicide bomber" as "Palestinian suicide bomber," and assumed people killed were Israelis. Barzilay says the difficult part of developing the program lay in determining which variances between reports are due to entirely different subjects and which are due to paraphrasing. The paraphrase software is part of the Columbia Newsblaster project, which aims to automatically summarize news reported online without human aid, and the next step is to paraphrase entire documents instead of just sentences. Eventually, the software will be able to paraphrase language in ways easily understandable to humans, and likewise understand what humans write or say. The Cornell researchers' project was underwritten by the Sloan Foundation and the National Science Foundation.
    Click Here to View Full Article
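
    The alignment idea can be caricatured with a generic sequence matcher: where two comparable sentences agree except in one slot, the differing spans are paraphrase candidates. This is a toy using word-level diffing, not the alignment techniques of the actual research, and the example sentences are invented:

```python
from difflib import SequenceMatcher

def paraphrase_pairs(sent_a: str, sent_b: str) -> list:
    """Align two comparable sentences word by word; the non-matching
    spans sandwiched between matching context are paraphrase candidates."""
    a, b = sent_a.split(), sent_b.split()
    pairs = []
    for tag, i1, i2, j1, j2 in SequenceMatcher(None, a, b).get_opcodes():
        if tag == "replace":  # same slot in both sentences, different wording
            pairs.append((" ".join(a[i1:i2]), " ".join(b[j1:j2])))
    return pairs
```

    Run over many sentence pairs drawn from two wire services covering the same events, frequently recurring slot pairs would surface as candidate paraphrases, and systematic one-sided substitutions would surface as the kind of bias the researchers observed.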

  • "Technology Facilitates Data Gathering"
    Daily Pennsylvanian (12/03/03); Khetan, Sameer

    Noun-phrase coreference resolution was the subject of a speech at the University of Pennsylvania delivered by Cornell University computer science professor Claire Cardie. A process for identifying nouns or phrases that refer to the same entity, noun-phrase coreference resolution allows researchers to collect data in a more comprehensive manner. For example, noun-phrase coreference resolution would enable a researcher to find the various references to "New Jersey," such as "the Garden State," in hundreds or thousands of different documents. The method would be able to identify other synonyms of "New Jersey" in documents after identifying "the Garden State" as a reference to the state. Members of the Computer and Information Science Department at Penn have used noun-phrase coreference resolution to assist with medical research. "There are many articles on medical science, and researchers cannot keep up with them," says CIS Department Chairman Fernando Pereira. "Noun-phrase coreference resolution allows us to mine huge texts for specific medical information." Cardie says noun-phrase resolution is part of an effort to "combine computing with trying to understand language."
    Click Here to View Full Article
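
    Once coreference links such as "the Garden State" to "New Jersey" have been established, the text-mining step Pereira describes reduces to an alias lookup. The sketch below hard-codes links that a real system would learn from data; the alias table and documents are invented:

```python
# A dictionary-based stand-in for coreference resolution: real systems
# learn these links automatically, but the lookup they enable looks like this.
ALIASES = {"New Jersey": {"new jersey", "the garden state", "n.j."}}

def mentions_of(entity: str, documents: list) -> list:
    """Return the indices of documents that mention the entity
    under any of its known aliases."""
    names = ALIASES.get(entity, {entity.lower()})
    return [i for i, doc in enumerate(documents)
            if any(name in doc.lower() for name in names)]
```

    The payoff is recall: a plain keyword search for "New Jersey" would miss every document that only says "the Garden State," while the alias-aware lookup finds them all.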

  • "Section 508: Built Into Business"
    Washington Technology (11/24/03) Vol. 18, No. 17, P. 1; Emery, Gail Repsher

    Government agencies, contractors, and vendors have normalized Section 508 accessibility requirements as part of their business, said policymakers at the Interagency Disabilities Educational Awareness Showcase (IDEAS) conference last month. White House Domestic Policy Council associate director Chris Kuczynski said that as a blind person himself, he has experienced the effects of Section 508 on government Web sites and other technology firsthand over the last two years. General Services Administration (GSA) acting deputy administrator David Bibb noted that the IDEAS conference drew 1,250 attendees this year, twice the 2002 number. Nearly every government agency now has at least an accessibility program manager, while the GSA and other government groups are working on tools to help agencies specify which accessibility standards apply in their requests for proposals, a level of detail for which a blanket certification requirement is often currently substituted. IBM's Brad Westpfahl warned that a lack of specificity could hamper accessibility efforts on a larger scale, as states and other nations each begin to look at their own standards. U.S. Patent and Trademark Office IT coordinator Fred DiFiore said that once standards are in place, enforcing them strictly is the easiest way to avoid problems in the future. FAA Section 508 coordinator Deborah Douglas Slade says another task is simply creating awareness, something the FAA is doing through a training program.
    Click Here to View Full Article

  • "Blueprint for Web Services"
    InfoWorld (12/01/03) Vol. 25, No. 47, P. 32; Knorr, Eric

    Web services are being planned by companies eager to establish a service-oriented architecture (SOA) that turns applications into services and rolls them into other applications, supporting application reusability and low-cost integration. What is difficult to imagine is how a given organization's suite of Web services--which expands constantly--will work together in an SOA, while developing enterprise-wide rules for writing the Web Services Description Language interfaces that define each Web service's role is also a tough challenge. To set up an SOA, an organization must re-evaluate its business processes and flesh out how they can be described in software. The first step in the SOA roadmap is to ensure that everyone possesses the correct tools, including the most recent integrated development environments and Web services-enabled application servers. Step two is to identify the first several applications that have enterprise-wide usability and expose them as Web services, while step three is to set up a proprietary or Universal Description, Discovery, and Integration directory to publish and subscribe to Web services as they proliferate, and implement portals to supply a presentation layer for services. The final step is to mesh existing middleware with the Web services infrastructure. Web services' reliance on text-based XML documents has raised performance and security issues, and Forrester research director Ted Schadler contends that "You have to learn the [message-based security] principles, which of course include encryption and having a way to authenticate without opening up the entire message." He notes that organizations stand to save a lot of money through the ubiquitous deployment of Web services, which support practical automation of many manual processes.
    Click Here to View Full Article
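
    The roadmap above rests on applications exchanging text-based XML messages. As a rough illustration of that idea only (real stacks wrap calls in SOAP envelopes against WSDL-defined interfaces; the element names, operation, and dispatch scheme here are invented), a service call can be serialized, dispatched, and answered like this:

```python
import xml.etree.ElementTree as ET

def make_request(operation: str, **params) -> str:
    """Serialize a service call as an XML document -- the kind of
    text-based message Web services exchange over the wire."""
    root = ET.Element("request", operation=operation)
    for name, value in params.items():
        ET.SubElement(root, name).text = str(value)
    return ET.tostring(root, encoding="unicode")

def handle_request(xml_text: str, services: dict) -> str:
    """Dispatch an XML request to a registered service implementation
    and wrap its result in an XML response."""
    root = ET.fromstring(xml_text)
    params = {child.tag: child.text for child in root}
    response = ET.Element("response")
    response.text = str(services[root.get("operation")](**params))
    return ET.tostring(response, encoding="unicode")
```

    With a registry such as {"greet": lambda name: "hello " + name}, handle_request(make_request("greet", name="SOA"), registry) yields <response>hello SOA</response>; a UDDI directory plays the role of that registry dictionary at enterprise scale, and the text-on-the-wire format is also why the article's performance and security concerns arise.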

  • "A Look at Telematics"
    EDN Magazine (11/27/03) Vol. 48, No. 26, P. 51; Cravotta, Robert

    Telematics is the integration of location-based information services with two-way wireless communication, and telematics systems have wide applications in automotive services such as customized infotainment, location-based safety, security, notification, and tracking. Telematics encompasses a location or positioning module (which can be GPS- or cell-based depending on coverage needs), a user interface (which is usually hands-free and incorporates speech-recognition technologies), and a chassis-level network that supplies a portal into the vehicle-command, -control, and -diagnostics and infotainment systems. A successful telematics system offers persistent two-way data transfers in which users receive relevant information wherever and whenever they need it, but this vision faces several challenges: For one thing, such services are scant in remote areas, while a lack of interoperability and interface standards also complicates matters. The first problem could be solved via reliance on multiple communications technologies and the expansion of transmission range, while the adoption of modular architectures by telematics-system integrators could help alleviate the second. System costs are another challenge, and many telematics-platform providers are trying to meet it by offering highly integrated platforms whose various components are derived from multiple suppliers. Telematics designers must also address obsolescence, taking into account the discrepancies between the life cycles of consumer electronics and automotive products; enabling platform independence through the deployment of Java-based software is one possible answer. Telematics systems will probably be kept separate from critical vehicle operations, given the liability risk to vehicle makers, but security and privacy issues need to be addressed through wireless upgrades. Additional concerns stem from battery depletion while the vehicle is idle.
    Click Here to View Full Article

  • "Malcode Melee"
    Scientific American (11/03) Vol. 289, No. 5, P. 20; Gibbs, W. Wayt

    Last August's Welchia worm looked like a "white hat" worm to some because it automatically patched computers and deleted the Blaster worm, released just one week earlier. And despite causing Air Canada flights to be cancelled and disrupting the U.S. Navy-Marine Corps Intranet, Welchia contained a subroutine set to self-destruct the worm on Jan. 1, 2004. It seemed as though a well-intentioned hacker wanted to repair the harm done by Blaster, even though it did cause some negative side-effects. CERT/CC incident-handling team leader Marty Lindner says Welchia is a bad worm like others, except that it is better suited to survive in the Internet ecology, where increasing network connections mean heightened competition between worms and viruses. By blocking the hole through which it came and eliminating Blaster, the Welchia worm strengthened its hold over the host machine, which Blaster set to continuously reboot. Lindner also points to the file transfer server function opened by Welchia, which is like a backdoor left propped open, even after the worm self-deletes on New Year's Day. Machines wiped clean of Welchia would still have this secret backdoor open to the hacker. But University of Illinois malware researcher Michael Liljenstam says that if the Internet ecology scenario is true, then good worms could prevent harmful infections, in the same way that farmers release beneficial insects in a field to keep out harmful ones.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only [pay-per-article option available].)

  • "What Do Users Want From Converged Multimodal Communications?"
    Business Communications Review (11/03) Vol. 33, No. 11, P. 44; Rosenberg, Art

    The deployment of unified messaging and unified communications has become possible through the development of a converged voice/data network framework, while the market movement toward instant messaging and IP telephony is driving the vision of converged multimodal communications closer to reality. Meanwhile, the growing number of enterprise end users who require wireless handheld modality is bolstering the need for communications convergence. Modality switching offers users the ease and flexibility to maintain communications across changing circumstances, while cross-network, multimodal capability that dynamically coordinates the requirements and priorities of both communicating parties will provide the intelligence needed to keep confusion in establishing contact with others to a minimum. Enterprise users are expected to welcome emerging converged and mobile communications alternatives, but their precise desires will remain vague until they use the technology. Enterprise managers are chiefly concerned with communications technology development and maintenance expenses, while user worries and requirements are markedly different: Their needs include the initiation of successful outbound contacts; personalization of incoming and outgoing communication management; ease of response to messaging; dynamic conferencing initiation and modality management; wired and wireless portability of communication devices; automated "intelligence"; privacy control for end-user access and communication content; and integration of information with communication contacts. Addressing these needs will require new designs for devices and user interfaces, client software, and server functionality, while mobile, converged communications will demand an additional layer of interface design complexity and modality control. Of paramount concern to users is how fast they can elicit a response to communications they initiate, rather than how fast they can respond to others. Communication initiators will be more productive if they can achieve a successful contact with only one try.
    Click Here to View Full Article