Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 6, Issue 670:  Monday, July 19, 2004

  • "Software Group Enters Fray Over Proposed Piracy Law"
    New York Times (07/19/04) P. C8; Lohr, Steve

    Supporters of a controversial copyright bill that seeks to establish legal liability for anyone who "intentionally aids, abets, induces, counsels or procures" a copyright violation are using an IDC report commissioned by the Business Software Alliance as ammunition. The report pegs the annual losses from software piracy at $29 billion--$13 billion more than previous estimates--although IDC research director John Gantz says he would have preferred to describe the amount as "the retail value of pirated software." Critics of the bill claim the proposal only furthers intellectual property owners' goal of securing their copyrights by instituting even more draconian regulations, and warn that if enacted the measure would stifle innovation. Over 40 companies and organizations fired off a letter to bill co-sponsor and Senate Judiciary Committee Chairman Sen. Orrin Hatch (R-Utah), urging him to set up a hearing on the bill, and Judiciary Committee press secretary Margarita Tapia reports that a hearing is slated for July 22. A staff counsel for the Judiciary Committee confessed that the bill's wording is worrisome, but insisted the legal liability standard was based on behavior and the "totality of your conduct" rather than the technology or equipment provided by a company or organization. Stanford Law School professor Lawrence Lessig thinks the legislation's bipartisan supporters, which include many Senate leaders, were misled as to the bill's true intentions. The BSA study has been faulted by Computer and Communications Industry Association President Edward Black and Consumer Electronics Association President Gary Shapiro, who claim in a letter to the alliance that the findings are distorted and incorrectly assume that every instance of unauthorized software duplication translates into lost sales revenue.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "IT May Help Clean a Polluted Sea, Say Researchers"
    NewsFactor Network (07/16/04); Martin, Mike

    NOAA oceanographer Christopher Sabine reported in a recent edition of Science that almost half of all carbon dioxide emitted by the consumption of fossil fuels is being absorbed by the world's oceans, but computer science researchers at the Informatics and Telematics Institute Center for Research and Technology in Greece claim in a report that software and IT may be just as valuable as anti-pollution hardware in correcting the problem. Ioannis Athanasiadis and Pericles Mitkas describe "a multi-agent system for monitoring and assessing air quality attributes" by analyzing meteorological data from multiple sensors. Their report was published in an issue of Management of Environmental Quality. The monitoring of the environment by software agents is part of an "enviromatics" movement that studies how IT can be applied to environmental science, surveillance, evaluation, administration, and protocol. The Greek research team wants to reduce the human factor in environmental monitoring in order to eliminate obstacles to rapid and unprejudiced decision-making through the application of the O3RTAA multi-agent intelligent software system. Mitkas notes in the study that the system boasts a distributed-agent architecture to support the monitoring of both meteorological and air pollutants, the assessment of air quality, and the triggering of environmental damage warnings; he writes that the system extracts knowledge through the use of machine-learning algorithms and data-mining techniques. Athanasiadis and Mitkas posit that O3RTAA "improves existing environmental-monitoring systems by adding customized intelligence." Agentis CTO David Kinny notes that practical software agents require the automation of a wide spectrum of decision-making behaviors, and though automation methodologies exist, transferring them from research facilities to industry is a difficult proposition.
    Click Here to View Full Article
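    The agent-based monitoring idea above can be illustrated with a toy sketch. Note that the class, thresholds, and pollutant names below are invented for demonstration and are not taken from the O3RTAA system, which uses machine-learning algorithms and data-mining techniques rather than a fixed rule.

```python
# Illustrative sketch only: a toy "monitoring agent" in the spirit of the
# multi-agent approach described above. The threshold and sensor values
# are hypothetical, not drawn from the O3RTAA system.
from statistics import mean

class AirQualityAgent:
    def __init__(self, pollutant, threshold):
        self.pollutant = pollutant      # e.g. "ozone" (hypothetical)
        self.threshold = threshold      # alert level, arbitrary units
        self.readings = []

    def observe(self, value):
        """Collect one sensor reading."""
        self.readings.append(value)

    def assess(self):
        """Average recent readings and decide whether to raise a warning."""
        if not self.readings:
            return None
        level = mean(self.readings)
        return {"pollutant": self.pollutant,
                "level": level,
                "alert": level > self.threshold}

agent = AirQualityAgent("ozone", threshold=80.0)
for reading in (72.0, 85.0, 91.0):      # simulated sensor data
    agent.observe(reading)
report = agent.assess()
```

    A real distributed-agent architecture would run many such agents over live sensor feeds and replace the fixed threshold with learned models.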

  • "Open-Source Is More Than Just Linux"
    Associated Press (07/19/04); Fordahl, Matthew

    Experts say open source software is fundamentally changing the computer industry, forcing businesses to rethink strategies and enabling new companies to upset established ones. Technology publisher Tim O'Reilly says companies able to understand and act on this shift will reap the benefits, similar to how Intel and Microsoft profited from open hardware standards in the 1980s. Many of the most famous open source projects got started in the last decade and have roots in academia: The Apache Web server emerged from the National Center for Supercomputing Applications at the University of Illinois, as did the Mosaic Web browser and many of the early Netscape Communications developers; the institution made the server source code available to anyone, and Brian Behlendorf and seven other programmers started the Apache project after sharing patches for the server via an email discussion group. Open source software usually does not indemnify its end users from possible patent infringement, and companies such as SCO Group have sued several large users of Linux. Sendmail, the email server claiming about 40 percent of the worldwide market, incorporated into a company that does provide legal protection to its users. Although Sendmail adheres to open source principles, co-founder and Chairman Greg Olson says the company's allegiance to open source is not political, but based on business realities. He says open source provides innovation and a good standards process, while MetaGroup analyst Corey Ferengul says the conversion of many companies to open source is not altruistic, but the result of software commoditization. Companies want to sell more products, and find open source platforms serve as a basis for their other offerings.
    Click Here to View Full Article

  • "Loose Clicks Sink Computers"
    Baltimore Sun (07/19/04) P. 6A; Stroh, Michael

    Stray signals discharged from an electronic device can unintentionally reveal sensitive data, a phenomenon known as "compromising emanations" that has long been an attractive area of study for civilian computer researchers. In one experiment, Cambridge University computer scientist Markus Kuhn intercepted radio waves emitted by laptop video connectors, and he says that "There are probably a half-dozen or dozen exciting phenomena yet to be discovered." In another experiment, Kuhn was able to rebuild the image on a computer screen by analyzing its reflected glow on a nearby wall, while Lockheed Martin Space Systems' Joe Loughry and Auburn University's David Umphress learned that the patterned blinking of light emitting diodes embedded in hardware components can give hints about the information passing through the machine. The exploitation of compromising emanations has been a longstanding tradition, and about four decades ago the U.S. military started a highly classified project run by the National Security Agency to develop hardware that could sense and block such signals. Electromagnetic radio waves have long been the most worrisome kind of compromising emanations, but more subtle electronic signals have been uncovered in recent years. A pair of IBM researchers, for example, developed a relatively inexpensive technique to figure out what a person is typing by training neural network software to translate the unique sound waves produced when keys strike a membrane between the keyboard and its base; the use of a parabolic microphone allowed the experimenters to listen in from a distance of almost 50 feet. Meanwhile, Eran Tromer of the Weizmann Institute revealed at a May conference that encrypted data could theoretically be cracked by monitoring high-frequency noise emitted by Intel Celeron microprocessors.
    Click Here to View Full Article
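    The keystroke-eavesdropping idea reduces to matching an observed sound "signature" against known per-key profiles. The IBM work used neural networks on real audio; the sketch below substitutes a far simpler nearest-neighbor match on invented feature vectors, purely to illustrate the principle.

```python
# Toy illustration of acoustic keystroke recognition: match an observed
# sound signature to the closest known key profile. The feature vectors
# (e.g. energy in three frequency bands) are invented for demonstration;
# the real attack trained a neural network on recorded audio.
import math

profiles = {
    "a": (0.9, 0.2, 0.1),
    "s": (0.3, 0.8, 0.2),
    "d": (0.1, 0.3, 0.9),
}

def classify(sample):
    """Return the key whose stored profile is nearest to the sample."""
    return min(profiles, key=lambda k: math.dist(profiles[k], sample))

observed = (0.85, 0.25, 0.15)   # a noisy recording of some keystroke
guess = classify(observed)
```

    Each key's slightly different mechanical sound is what makes such profiles separable in practice.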

  • "E-Voting Suit Highlights Legal Lag"
    InternetNews.com (07/16/04); Kuchinskas, Susan

    Linda Soubirous, who lost a recent election for a seat on California's Riverside County Board of Supervisors, has filed a lawsuit against Riverside Registrar of Voters Mischelle Townsend for allegedly refusing to disclose information pertaining to a recount Soubirous requested when her campaign aides reported witnessing employees of e-voting system vendor Sequoia Voting Systems tampering with the central tallying computers in one precinct. Any voter can request and review all relevant documentation relating to a recount under California law, but Soubirous claims in her suit that Townsend said the electronic information she asked for--including the audit logs, redundant memory stored in the e-voting machines, "logic and accuracy" test results, and chain-of-custody records for the system components--was irrelevant. Soubirous says the relevance of the internal information is indisputable, as it establishes whether the votes were registered and tallied properly within the machine. Electronic Frontier Foundation (EFF) legal director Cindy Cohn thinks the suit is the first in the United States to study the issue of what information citizens are entitled to in electronic voting systems, and believes that the ruling will set a new legal standard in California and possibly the rest of the country. Co-plaintiffs in Soubirous' suit include the Verified Voting Foundation; the EFF is providing funding for the suit and acting as a friend of the court. Attorney Gregory Luke, who is handling the case pro bono, says, "This case raises critical questions about whether elections officials can effectively erase whole chapters of the Election Code that guarantee all citizens the right to request a meaningful recount." The suit asks the judge to rule that all future Riverside elections produce a voter-verifiable paper trail and comply with security regulations instituted by the California Secretary of State.
    Click Here to View Full Article
    For information on ACM's e-voting activities, visit http://www.acm.org/usacm

  • "Quantum Crypto Network Debuts"
    Technology Research News (07/21/04); Smalley, Eric

    A collaborative effort between researchers at Harvard University, Boston University, and BBN Technologies has yielded a six-node quantum cryptography network that operates without interruption to facilitate the exchange of secure keys between BBN and Harvard, which are separated by a distance of approximately 10 kilometers. A failing link or node will not compromise the quantum cryptography because any node in the network, dubbed the Defense Advanced Research Projects Agency (DARPA) Quantum Network, can serve as a relay to link two other nodes. BBN scientist Chip Elliott observes that the organizations can split the costs of the fiber infrastructure because the network is switched, and says that one of the network nodes will soon be transferred to BU; "BBN and Harvard can both talk to BU by sharing a single fiber, rather than each requiring its own fiber," he notes. Two of the nodes are connected using wireless optics while the remaining four are linked with fiber-optic cable. The network is currently constrained to urban areas, and increasing its range will depend on the development of quantum repeaters that transfer the quantum state of one photon to another via quantum entanglement or photon interactions. The network currently generates photons with heavily filtered lasers, which are dim and sometimes emit multiple photons, limiting efficiency and weakening the security of quantum bits against eavesdropping. The BU researchers are developing better and brighter photon sources that would release entangled photon pairs and eliminate the danger of multiple photons. The network interoperates with Internet protocols and builds a virtual private network, which would in principle secure communications even if an eavesdropper can tap into a line.
    Click Here to View Full Article
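    Quantum key exchange of the kind described here follows BB84-style protocols, in which sender and receiver each choose random measurement bases and afterward keep only the bit positions where their bases matched (the "sifting" step). The simulation below sketches only that classical bookkeeping step; the article does not name the protocol, and real nodes of course exchange photons, not Python values.

```python
# Minimal simulation of BB84-style key sifting: keep only the bits for
# which sender and receiver happened to choose the same basis ("+" or
# "x"). This is an illustrative sketch of the bookkeeping, not of the
# quantum physics.
import random

random.seed(42)                 # deterministic for the example
n = 32
sender_bits  = [random.randint(0, 1) for _ in range(n)]
sender_bases = [random.choice("+x") for _ in range(n)]
recv_bases   = [random.choice("+x") for _ in range(n)]

# With no eavesdropper and matching bases, the receiver reads the sent
# bit correctly, so those positions become shared secret key material.
sifted = [b for b, sb, rb in zip(sender_bits, sender_bases, recv_bases)
          if sb == rb]
```

    On average about half of the transmitted positions survive sifting, which is one reason raw photon rate matters so much to these networks.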

  • "Homeland Security: High-Tech Tool Improves Incident Planning and Response for Emergency Management Officials"
    Georgia Institute of Technology (07/15/04); Sanders, Jane

    The Geographic Tool for Visualization and Collaboration (GTVC) developed by the Georgia Tech Research Institute (GTRI) is a collaborative mapping tool designed to aid law enforcement and emergency management officials so they can improve their coordination of event and incident planning and real-time response. The GTVC can render high-resolution imagery at 1-meter resolution statewide, while the maps are scalable and retain all the markings added to them. The system, which was originally designed as a military application, was tweaked over a nine-month period in preparation for the G-8 Summit in June; the GTVC's reliability and robustness was improved, secure encryption for communications was inserted, and maps with six-inch resolution for areas of G-8 interest were added. The GTVC's development and implementation is being funded by the Georgia Emergency Management Agency (GEMA), and GEMA's Ralph Reichert notes that the system extended the monitoring and tracking capabilities of law enforcement teams at the G-8 Summit so that they were always a step ahead of protestors, and enabled consequence-management staff to check the availability of key resources. "Furthermore, and probably most importantly, command staff could immediately get a snapshot of what was going on without relying solely on traditional voice communications," Reichert explains. The G-8 test yielded valuable lessons for GTVC developers, who plan to incorporate a plethora of upgrades into the next version of GTVC software, such as the display of real-time GPS-based tracking of personnel and vehicles; the improvement of information reporting capabilities through the addition of icons and text; and easier network connectivity. "We believe the new version of the GTVC software will provide an easier interface for users who only want to observe situations, rather than enter information into the system," comments GTRI research engineer Kirk Pennywitt.
    Click Here to View Full Article

  • "Polite Computers Win Users' Hearts and Minds"
    New Scientist (07/14/04); Biever, Celeste

    Jeng-Yi Tzeng of the National Tsing Hua University in Taiwan believes that computers would be less intimidating to users if software and operating systems were programmed to respond apologetically to errors. He theorized that people would be more inclined to forgive a courteous computer, and tested his theory by writing brusque and apologetic versions of a computer-based guessing game, which were tried out by over 250 students. Players who used the contrite software were more likely to think the game was fun, and 60 percent of the players said their enjoyment of the game was enhanced by the apologetic feedback. On the other hand, 25 percent said apologies made no difference, while 12 percent got the impression that the game was rigged. Eric Horvitz of Microsoft's Adaptive Systems and Interaction Group says such reactions are in keeping with his expectations: "Arrogant software rubs people up the wrong way just like an arrogant person would," he observes. Jonathan Klein, a builder of robotic toys at Massachusetts-based iRobot, cautions that apologies will eventually lose their sincerity if they are repeated too frequently, and he thinks a much more effective solution is to employ artificial intelligence that can craft empathetic responses. Tzeng says this solution will remain impractical until AI becomes capable of determining users' emotions. Jakob Nielsen with the Nielsen Norman Group sees value in apologetic systems, as the insensitive responses of current systems can make users feel stupid and discourage them from using computers to their full potential.
    Click Here to View Full Article

  • "IBM Tool Has an Eye for the Blind"
    InternetNews.com (07/15/04); Wagner, Jim

    Available for download on IBM's alphaWorks emerging technologies Web site is aDesigner, a 4.6 MB Java-based application that shows how a site appears and sounds to visually impaired users. ADesigner, which was developed in the IBM Tokyo Research Lab, assesses sites on their colors and font choices, their ability to change them upon request, how well they comply with accessibility guidelines, alternate text for images, and link navigation. The program splits the browser into a five-pane window: One pane displays the site as it would appear to most people, a second pane depicts the site as it would appear or read for people with poor vision, and the remaining three panes list and outline problematic details of the site. The application ranks the site's compliance, navigability, and listenability on a scale of 1 to 100, and assigns the site an overall grade. Braille and Technology Center for the National Federation of the Blind manager Steven Booth says site accessibility to visually impaired users is being steadily improved by the developer community, although hurdles still remain: "Labeling [text with images] is a big problem, as well as having too many links per page--it's too confusing to get through--or have forms that aren't labeled so you can't tell what field you're in," he notes. ADesigner is only supported on Windows 2000 and XP, although IBM officials say they are considering adding support for other platforms in the future. IBM's Jim Chou says that "anything you can do" to make using computers and the Web easier "is going to help a lot of people." Chou says few such tools are currently available, but for now IBM has no plans to commercialize the product.
    Click Here to View Full Article
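    One of the checks described above, verifying that images carry alternate text, can be sketched with Python's standard html.parser module. aDesigner itself is a Java application and performs far deeper analysis; this is only an illustration of the category of check.

```python
# Illustrative sketch of an accessibility check: flag <img> tags that
# lack an alt attribute, one of the problems cited in the article.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Record the src of every <img> missing an alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "img" and "alt" not in attributes:
            self.missing.append(attributes.get("src", "(no src)"))

page = '<p><img src="logo.png" alt="Company logo"><img src="chart.png"></p>'
checker = AltTextChecker()
checker.feed(page)
```

    A full tool would also count links per page, check form labels, and score color contrast, then roll the results into the 1-to-100 grades the article describes.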

  • "The Internet Funnel"
    Baltimore Sun (07/15/04) P. D1; Bishop, Tricia

    The automatic collection of information from multiple Web sites based on users' interests is the principle behind RSS (really simple syndication), an information management tool whose use is spreading rapidly. RSS alerts people to the most recent data about subjects they have chosen and sends it to their computers; eWeek.com editor Sean Gallagher postulates that the technology could potentially become "as big as the Web itself." Syndication's appeal stems from its ability to relieve the user of the burden of hunting for content, which can be a painstaking, time-consuming process. RSS software--a lot of which is free for download--enables users to program "readers" to troll the Web, automatically capturing updated site content and delivering it to their systems; advocates believe the technology will eventually enable people to construct personalized TV networks as the industry makes the transition to digital, search for and automatically install software updates, and perhaps even service a computer-equipped automobile on the road. University of Maryland senior Anthony Casalena explains that RSS technology puts the subscriber, not the publisher, in control. Bloggers were the first enthusiastic adopters of RSS, but now more mainstream media such as Yahoo! and newspapers support RSS feeds, although the technology is still not very well known or in use among the majority of Internet users. Online PR company owner Marci De Vries thinks RSS feeds and readers have value for casual users, particularly as replacements for corporate "blast emails" that could reduce the spam factor dramatically. "Publishing new information is critical to getting people back to their site and making them seem like real resources on the Web," she says.
    Click Here to View Full Article
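    The core of what an RSS "reader" does, parsing a feed and extracting fresh items, can be sketched with Python's standard xml.etree module. Real readers fetch feeds over HTTP on a schedule; here the feed is an inline string (with invented content) so the example is self-contained.

```python
# Minimal sketch of RSS parsing: pull item titles out of a feed. The
# feed content below is invented for illustration.
import xml.etree.ElementTree as ET

feed = """<rss version="2.0"><channel>
  <title>Example Feed</title>
  <item><title>First headline</title><link>http://example.com/1</link></item>
  <item><title>Second headline</title><link>http://example.com/2</link></item>
</channel></rss>"""

root = ET.fromstring(feed)
headlines = [item.findtext("title") for item in root.iter("item")]
```

    A reader repeats this for every subscribed feed and shows the user only what has changed since the last poll, which is what relieves the user of hunting for content.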

  • "Copyright Scheme Could Fuel Transition to High-Definition Systems"
    EE Times (07/15/04); Merritt, Rick

    Disney, IBM, Intel, Panasonic, Sony, Toshiba, and AOL Time Warner have announced that they are developing a copyright protection system for high-definition DVDs that is stronger and more flexible than the current DVD copy protection scheme. The product of a year-long effort, the Advanced Access Content System (AACS) will provide rules on the sharing of content between systems and over home networks, and the group of film, consumer electronics, and PC companies believes the new system will spur the growth of the HD market. "A lot more business models could flourish from this work than just today's Blockbuster and pay-per-view models," says analyst Richard Doherty. AACS will make use of 128-bit AES encryption, a stronger software renewal and revocation scheme, a new media key block strategy for matching keys on a DVD disk with keys on a DVD player, and network authentication tools, and will require authentication of hard disk drives. With Disney and AOL Time Warner controlling about 40 percent of the home video market, AACS could force changes to manufacturers' hardware and software. Nonetheless, DVD makers, studios, and end users still could reject AACS. The group expects AACS to be available for licensing by the end of 2004.
    Click Here to View Full Article

  • "IT Workers Stay Put in Less-Promising Careers"
    InformationWeek (07/12/04) No. 999, P. 62; D'Antoni, Helen

    The unemployment rate for information technology professionals remains the same as last year, at 3 percent, even though the industry is said to be losing jobs to outsourcing, reveals a new report by InformationWeek Research. The 2004 National IT Salary Survey shows that the majority of IT professionals appear to be satisfied with their current positions, and are not actively looking for a new job. Half of 5,321 IT staffers were found to be completely satisfied with their jobs, while 56 percent of IT managers felt the same way. Nonetheless, the general satisfaction with current jobs comes at a time when tech workers acknowledge that a career in IT has lost some of its luster over the past five years. IT professionals added that they are working longer. IT staffers put in an average of 66 hours per week, including hours on call, while IT managers devoted 72 hours per week to their jobs. Staffers and managers said the combination of their long hours and high demand for their skills has raised the stress level of their work environment over the past 12 months.
    Click Here to View Full Article

  • "Radio Sans Frontieres"
    New Scientist (07/10/04) Vol. 183, No. 2455, P. 24; Daviss, Bennett

    Software defined radio (SDR) that can autonomously select optimal radio signals based on surrounding conditions and frequency activity promises to eliminate interference and redefine ownership and regulation of airwaves, much to the chagrin of license holders. SDR requires the presence of an antenna and amplifier, but these are not beholden to specific signals, while an analog-to-digital converter builds the digital approximation of a seamless signal; the software processes this approximation to scan for any signal in range, and can alter its search to look for different signals on the spur of the moment. "What we're talking about is virtually limitless wireless bandwidth," asserts David Reed of the MIT Media Laboratory, who explains that radio signals do not actually interfere with each other: The information in each signal remains intact, and the distortion generally referred to as interference is actually caused by poorly designed receivers. License holders are resisting SDR, given how much they have paid for exclusive access to portions of the available radio spectrum, but Reed estimates that most of the licensed spectrum is in use only about 10 percent of the time--a fact that is galling to regulators. Regulators and licensees are also concerned about SDR's potential for abuse when it is made accessible to the public. Three strategies to avoid such a scenario are being explored by regulators: A scheme in which hardware devices authenticate the software's origin, the scanning of downloaded software by the receiving device to guarantee that it is behaving appropriately, and the design of devices that cannot physically run software that transgresses certain rules. The opinion of most observers is that regulators will favor a shared-frequency architecture, although the transition will be a slow process. 
The growing sophistication of software radio technology has also partly spurred the "open spectrum" movement, which calls for the rescinding of all regulations and the opening of the entire radio spectrum to all users.
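    The software side of SDR, scanning digitized samples for active signals, can be sketched with a discrete Fourier transform. In the toy example below the "stations," sample rate, and frequencies are arbitrary and far below real radio frequencies; the point is only that, once the antenna output is digitized, choosing a signal is a software decision.

```python
# Toy illustration of software signal selection: digitize a composite
# signal, then use a DFT to find which frequencies are active. The
# frequencies and sample rate are arbitrary; real SDR front ends
# digitize actual radio-frequency bands.
import cmath
import math

rate = 128                      # samples per second (arbitrary)
t = [i / rate for i in range(rate)]     # one second of samples
# Two "stations" broadcasting at 10 Hz and 30 Hz, the second stronger
samples = [math.sin(2*math.pi*10*x) + 2*math.sin(2*math.pi*30*x) for x in t]

def dft_magnitude(samples, freq):
    """Magnitude of one DFT bin: correlate with a complex exponential."""
    n = len(samples)
    acc = sum(s * cmath.exp(-2j*math.pi*freq*k/n)
              for k, s in enumerate(samples))
    return abs(acc)

# Scan the usable bins and "tune" to the most active frequency
strongest = max(range(1, rate // 2), key=lambda f: dft_magnitude(samples, f))
```

    Because the receiver is just software, the same hardware could equally well be told to extract the weaker 10 Hz signal instead, which is the flexibility that worries license holders.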

  • "Analysis Tools Aren't Static"
    Software Development Times (07/01/04) No. 105, P. 27; Schindler, Esther

    Code analysis tools have come a long way since the first code analyzers were developed, and today are used for rapid technology adoption, the code acceptance process (particularly for enterprises that are outsourcing software development), corporate acquisitions, and new software product purchases, among other things. The chief advertised benefit of code analysis tools is their ability to reduce development time and detect flaws earlier, thus saving companies a lot of money; these advantages impact application development by speeding up programmer productivity and boosting confidence in the code's quality. Code analysis products can generate reports with valuable metrics, which can facilitate process changes that affect a development team's efficiency positively. A potential downside is that management may require developers to strictly conform to measured expectations, though Smart Bear CEO Jason Cohen observes that such a policy could breed more careful programmers, and by extension more quality code. Developers and even vendors warn against overreliance on code analysis tools, a common pitfall. It is also critical that the proper tools be applied at the proper points in the development process, and human analysis should not be left out of the equation. @stake research and development VP Chris Wysopal cautions that using code analysis tools can expose a company to legal liability. This is especially true for security tools, where a company's conscious decision not to resolve issues uncovered with existing tools can lead to charges of willful negligence if security bugs crop up later.
    Click Here to View Full Article
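    The kind of check a static analyzer performs can be sketched with Python's standard ast module. The rule below, flagging bare "except:" clauses that silently swallow every error, is one invented example; commercial tools apply hundreds of rules plus deeper data-flow analysis.

```python
# Illustrative static-analysis sketch: walk a program's syntax tree and
# flag bare "except:" clauses, a common implementation-level defect.
import ast

source = """
try:
    risky()
except:
    pass
"""

findings = []
for node in ast.walk(ast.parse(source)):
    # An ExceptHandler with no exception type is a bare "except:"
    if isinstance(node, ast.ExceptHandler) and node.type is None:
        findings.append(f"line {node.lineno}: bare 'except:' hides errors")
```

    Running such checks on every commit is how these tools catch flaws earlier than testing alone would.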

  • "Ready to Buy a Home Robot?"
    Business Week (07/19/04) No. 3892, P. 84; Edwards, Cliff; Rowley, Ian; Petty, Andrew

    Future Horizons reports that the electronics industry is nearing a watershed in which labor-saving domestic robot devices will proliferate into a worldwide market worth $59.3 billion and 55.5 million units by the end of the decade. Entertainment robots such as Wow Wee's Robosapien and Sony's robot dog, Aibo, are popular toys, although distinguishing between people and objects is beyond their capabilities. Japan is developing more sophisticated robots such as Honda's bipedal, walking ASIMO and Sony's QRIO, which can perform complex motions such as jogging and dancing. Appliance robots such as iRobot's Roomba vacuum cleaner are appealing not only for their labor-saving abilities but for their "cuteness" or "fun" factor, while home security is another application these machines are useful for. Assistive robots are expected to be highly desirable as caregivers for aging populations, while other examples under development include wearable strength-enhancement exoskeletons and autonomous robots that can transport important items to battlefield commanders over treacherous terrain and scope out enemy positions. Immobots are mostly stationary devices with embedded software programs and Internet connections, and that work in teams to communicate labor-saving commands to each other: For instance, when it is time to wake up, an alarm clock might signal the coffee maker to activate and alert the medicine cabinet to check the weather and pollen count, which would then inform the wardrobe closet to suggest appropriate apparel. The apotheosis of robot technology, in the public's mind, is androids, or humanoid robots that people expect to have human emotions and responses, according to MIT researcher Cynthia Breazeal. One of the biggest challenges is inventing software that enables robots to adjust to unanticipated obstacles.
    Click Here to View Full Article
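    The immobot scenario above, an alarm clock cueing other appliances, is essentially publish/subscribe messaging between devices. The toy event bus and the device names below are invented for illustration.

```python
# Toy publish/subscribe sketch of immobot-style coordination: one
# device publishes an event and the others react. Device names and
# topics are invented for this example.
class EventBus:
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, topic, handler):
        """Register a device's handler for a topic."""
        self.subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic, message):
        """Deliver a message to every device subscribed to the topic."""
        for handler in self.subscribers.get(topic, []):
            handler(message)

bus = EventBus()
log = []
# The coffee maker and medicine cabinet both listen for the wake-up cue
bus.subscribe("wake_up", lambda msg: log.append("coffee maker: brewing"))
bus.subscribe("wake_up", lambda msg: log.append(f"cabinet: checking {msg}"))
# The alarm clock fires the event when it is time to wake up
bus.publish("wake_up", "pollen count")
```

    Loose coupling is the point: the alarm clock never needs to know which appliances exist, which is what lets new devices join the team.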

  • "Exploiting Software: The Achilles' Heel of CyberDefense"
    CyberDefense Magazine (06/04) Vol. 2, No. 6, P. 20; McGraw, Gary; Hoglund, Greg

    Analyzing real software attacks and understanding how they occur is the only true way to devise effective countermeasures. Software has become a tool for modern espionage, and attacks can range from exploitation of existing vulnerabilities (design flaws, programming bugs, etc.) to the insertion of subversive code that can carry out any mix of data collection, stealth, covert communication, and command and control operations. Few available books on computer security detail exploitation from a technical programmer's point of view, nor do they teach software practitioners to recognize new exploits. Tackling the software security problem is becoming all the more urgent as system complexity, built-in extensibility, and connectivity--the "trinity of trouble"--increase, and these same factors are also responsible for the growing ease of cyberattacks. Software vulnerabilities are classified as either bugs (easily repairable implementation level problems that exist only in code) or flaws (subtler, more deeply embedded problems that can be present at both the code and design levels). The importance of specific vulnerabilities is in a state of constant flux because of the growing sophistication of attacks, and determining the presence of design-level flaws in a program is a complex task that is especially difficult to automate. Because software exploitation techniques are fairly specific and relatively small in number, new software exploits can often be uncovered by applying common techniques, and hacking is best learned by becoming familiar with standard methods and attack patterns (attack blueprints that outline vulnerabilities and show the hacker how to exploit the targeted system) and seeing how they are deployed in specific attacks. 
Exploits are defined as an attack pattern designed to penetrate a specific piece of target software; the act of conducting an exploit is known as an attack; and attackers are those who facilitate an attack through an exploit, although their intentions may not always be malevolent.
    Click Here to View Full Article
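    The implementation-level "bug" category described above can be made concrete with a small example. The toy calculator below, its hostile input, and the whitelist fix are all invented for illustration; the attack pattern it instantiates, injecting code where data is expected, is one of the standard patterns the article refers to.

```python
# Toy example of an implementation-level bug and its exploit pattern:
# evaluating untrusted input. The calculator is invented for this sketch.
import ast

def calc_buggy(expr):
    # BUG: eval executes arbitrary Python, not just arithmetic
    return eval(expr)

def calc_fixed(expr):
    # Fix: whitelist the parse-tree node types arithmetic actually needs
    tree = ast.parse(expr, mode="eval")
    allowed = (ast.Expression, ast.BinOp, ast.UnaryOp, ast.Constant,
               ast.Add, ast.Sub, ast.Mult, ast.Div, ast.USub)
    if not all(isinstance(node, allowed) for node in ast.walk(tree)):
        raise ValueError("arithmetic only")
    return eval(compile(tree, "<calc>", "eval"))

# Benign input behaves identically in both versions
assert calc_buggy("2 + 3") == calc_fixed("2 + 3") == 5

# The attack pattern: code injected where data is expected. It would
# run inside calc_buggy but is rejected by calc_fixed.
hostile = "__import__('os').getcwd()"   # stand-in for a harmful payload
```

    Note that the flaw category is harder to illustrate this way: a design-level problem, such as trusting client input at all, cannot be patched by scanning one function.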

  • "WiMax Hits the Road"
    Business Communications Review (06/04) Vol. 34, No. 6, P. 30; Finneran, Michael

    WiMax marks the advent of next-generation wireless data technologies that are unencumbered by the limited range and data orientation of wireless local area networks, and has the added advantage of a flexible infrastructure that meets the needs of both fixed and mobile users operating in licensed or unlicensed bands, delivering both consistent- and variable-delay services while running in a carrier-scale environment. Some backers of WiMax expect the technology will enjoy success comparable to that of Wi-Fi, although the latter has had more time to gain a following, and is targeted at end users rather than carriers and equipment providers. WiMax Forum President Margaret LeBrecque says her organization expects the rollout of WiMax to take place in three stages: The deployment of fixed location private line services or hot spot back-haul; the implementation of broadband wireless access and wireless digital subscriber lines; and the delivery of mobile WiMax services. This last deployment is anticipated to be especially turbulent, given that Cisco and Motorola support the competing Mobile-Fi standard. The 802.16a standard issued last January serves systems that operate between the 2 GHz and 11 GHz frequency bands, which support non-line-of-sight operation. The most lucrative bands in this range are licensed 2.5 GHz Multichannel Multipoint Distribution Service, licensed and unlicensed 3.5 GHz, and the 5 GHz Unlicensed National Information Infrastructure band. WiMax boasts a media access control protocol that enables the radio channel to be used by hundreds of users while delivering quality of service, and a Request/Grant mechanism does away with inbound collisions and supports consistent- and variable-delay data services.
    Click Here to View Full Article
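    The Request/Grant idea can be sketched in miniature: stations ask the base station for bandwidth, and the base station hands out non-overlapping slot grants, so inbound transmissions never collide. The slot counts and station names below are invented, and the real 802.16 MAC scheduler is far more elaborate (it must also honor quality-of-service classes).

```python
# Toy sketch of a Request/Grant bandwidth allocator: the base station
# grants non-overlapping upstream slots, so stations never collide.
# Slot counts and names are invented for illustration.
def grant_slots(requests, total_slots):
    """Allocate slots in request order until the frame is full."""
    grants, remaining = {}, total_slots
    for station, wanted in requests:
        given = min(wanted, remaining)  # never promise more than is left
        grants[station] = given
        remaining -= given
    return grants

requests = [("station-A", 4), ("station-B", 3), ("station-C", 5)]
grants = grant_slots(requests, total_slots=10)
```

    Contrast this with contention-based access, where stations transmit and back off on collision; centrally granted slots are what make the consistent-delay services described above possible.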