
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM. To send comments, please write to [email protected].
Volume 7, Issue 815:  Wednesday, July 13, 2005

  • "The Battle for Control of the Internet"
    Toronto Star (07/11/05); Geist, Michael

    Michael Geist of the University of Ottawa's Faculty of Law expects the conflict over Internet governance to reach a climax at ICANN's final 2005 meeting in Vancouver. ICANN was originally set up with the promise that the United States would eventually relinquish control of the domain name root servers to the organization, but that hand-off has not taken place because ICANN failed to reach the needed milestones. Internet governance is a significant issue for many national governments, especially as it pertains to their national country-code domains; their sense of powerlessness and displeasure with ICANN prompted the U.N.'s International Telecommunication Union (ITU) to organize its own Internet governance proposal, which is slated for release later this month. Certain observers believe the report will advocate international control over the domain name root servers and legitimize national sovereignty over country-code domains, but U.S. officials issued a policy statement earlier this month indicating that they no longer plan to hand the root servers over to ICANN and will instead retain their "historic role in authorizing changes to the authoritative root zone file." One of the more unsettling projected outcomes of the ICANN-ITU showdown is that national governments could decide country-code sovereignty is inadequate without internationalized root server control, leading to the creation of an alternate, turmoil-generating Internet. Geist concludes that the public should take an interest in Internet governance, as domain name system administration policies carry implications for free speech, privacy, and other issues that directly affect the public.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Keeper of Expired Web Pages Is Sued Because Archive Was Used in Another Suit"
    New York Times (07/13/05) P. C9; Zeller Jr., Tom

    The Internet Archive, a searchable online repository of defunct Web sites and other multimedia content, has been targeted by a copyright infringement suit filed by Healthcare Advocates, which claims that access to its old Web pages in the archive was unlawful and unwarranted. Named as co-defendant in the suit is the firm of Harding Earley Follmer & Frailey, which searched the Internet Archive two years ago to unearth Healthcare Advocates' old Web pages as material for its defense of Health Advocate, a similarly named company that Healthcare Advocates was suing for trademark violation. The Internet Archive periodically and automatically copies publicly accessible sites using Web-crawling "bots," and stores the copies on its servers for later retrieval through the Wayback Machine, the archive's search tool. The latest lawsuit argues that Harding Earley representatives should not have been able to access the old Healthcare Advocates Web pages because the organization, shortly after filing the suit against Health Advocate, placed a robots.txt file on its own servers that was supposed to tell the Wayback Machine to disallow public access to the historical versions of the site. The suit states that the Digital Millennium Copyright Act was breached when a member of Harding Earley requested old versions of Healthcare Advocates' Web site hundreds of times using the Wayback Machine, and that in almost 100 instances the robots.txt file failed to block the requests. The Internet Archive's failure to honor the robots.txt file constitutes a breach of contract and fiduciary duty, according to the lawsuit. However, lawyer William F. Patry contends that no such contract exists because the robots.txt convention is a wholly voluntary system.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
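
    For context, the robots.txt mechanism at the center of the suit is purely voluntary: a site publishes a plain-text list of directives, and well-behaved crawlers consult it before fetching. The minimal sketch below, using Python's standard-library parser, shows directives of the kind Healthcare Advocates would have needed; the Internet Archive's crawler has historically identified itself as "ia_archiver," and the URLs here are illustrative.

      import urllib.robotparser

      # Directives a site could publish to ask the Internet Archive's
      # crawler to stay away; nothing in the protocol enforces them.
      rules = ["User-agent: ia_archiver",
               "Disallow: /"]

      parser = urllib.robotparser.RobotFileParser()
      parser.parse(rules)

      # A compliant crawler checks before fetching each URL.
      print(parser.can_fetch("ia_archiver", "https://example.com/old/"))   # False
      print(parser.can_fetch("SomeOtherBot", "https://example.com/old/"))  # True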

  • "New E-Mail Authentication Spec Submitted to IETF"
    eWeek (07/11/05); Roberts, Paul F.

    The Internet Engineering Task Force (IETF) has received a new email authentication specification from a coalition of technology companies that includes IBM, Yahoo!, Cisco Systems, and Microsoft. DomainKeys Identified Mail (DKIM), which merges Yahoo!'s DomainKeys technology with Cisco's Identified Internet Mail technology, enables receiving domains to identify authentic senders and filter out spam and phishing emails that use forged addresses by tagging messages with digital signatures based on public key cryptography. Email domain owners will create a public and private cryptographic key pair and publish the public key in their Domain Name System record, storing the private key on their email servers. Yahoo!'s Miles Libbey says Identified Internet Mail's header-signing technology will be used to sign emails, and notes that a DKIM-enabled software plug-in will need to be installed on mail servers. Sendmail CTO Eric Allman says the DKIM standard will be released as an IETF Internet Draft through the IETF Web site, while the authoring companies intend to license DKIM for free and possibly release it to the open-source community. Cisco's Jim Fenton reasons that DKIM could be widely adopted as a standard for protecting email communications and halting phishing attacks and email counterfeiting. DKIM will be discussed at the 63rd IETF conference scheduled to begin on July 31, while top mail experts are convening in New York City this week to promote the deployment of email authentication.
    Click Here to View Full Article
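
    The signing scheme described above can be reduced to a short sketch. The Python fragment below (using the third-party "cryptography" package) shows only the cryptographic core of the DKIM idea: a domain signs header data with its private key, and the receiver verifies against the public key the domain would publish in DNS. Real DKIM adds canonicalization rules, selector records, and a defined signature header; the addresses and header fields here are illustrative.

      from cryptography.hazmat.primitives import hashes
      from cryptography.hazmat.primitives.asymmetric import padding, rsa

      # Domain owner: create a key pair; the public key would be published
      # in a DNS TXT record for the signing domain.
      private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
      public_key = private_key.public_key()

      # Sending server: sign the covered header fields.
      headers = b"from: [email protected]\r\nsubject: hello\r\n"
      signature = private_key.sign(headers, padding.PKCS1v15(), hashes.SHA256())

      # Receiving server: verify the signature; mail claiming the domain
      # but lacking a valid signature can then be filtered.
      public_key.verify(signature, headers, padding.PKCS1v15(), hashes.SHA256())
      print("signature verified")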

  • "Computer Scientists Focus on Developing Programs That Can Learn Game Rules"
    Stanford Report (07/13/05); Madden, Kendall

    Stanford Logic Group computer science professor Michael Genesereth and doctoral student Nathaniel Love contend in AI Magazine's summer 2005 issue that more intelligent game-playing computer programs should be capable of winning more games as well as providing programming insights. Genesereth says general game playing (GGP) is beyond the scope of programs such as IBM's Deep Blue because of their inability to learn and understand rules, while the relative intelligence of GGP programs can be compared by pitting them against each other. At the core of the philosophy behind GGP is the belief that a program should be able to adapt to new situations, a skill that is distinctly human. Programs with such a capability could extend to business management and many other fields outside of gaming. A course led by Genesereth culminated in a competition between student-designed GGP programs, and Love notes that the more complex, deeper-reasoning programs often proved less stable in the contest. GGP programs revive an old concept of computer systems that can reach independent decisions by synthesizing a multitude of diverse inputs; designing such a system is substantially more complex than relying on individual programs with particular functions, says Genesereth. The Stanford Logic Group hosted a GGP contest at the 2005 American Association for Artificial Intelligence conference.
    Click Here to View Full Article
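
    A toy sketch can make the GGP distinction concrete: below, the rules of Nim arrive as data, and the player contains no Nim-specific logic, choosing moves by generic random playouts. Stanford's actual framework uses a formal Game Description Language and far stronger reasoning; this fragment only illustrates the separation of rules from player.

      import random

      nim = {                                   # rules supplied as data
          "legal": lambda s: [m for m in (1, 2, 3) if m <= s],
          "next": lambda s, m: s - m,
          "terminal": lambda s: s == 0,         # taker of the last stone wins
      }

      def playout(rules, state, mover):
          # Play random legal moves to the end; return the winner.
          while not rules["terminal"](state):
              state = rules["next"](state, random.choice(rules["legal"](state)))
              mover = 1 - mover
          return 1 - mover                      # the player who just moved won

      def generic_player(rules, state, me, trials=200):
          # Pick the legal move whose random playouts win most often.
          def wins(move):
              after = rules["next"](state, move)
              return sum(playout(rules, after, 1 - me) == me for _ in range(trials))
          return max(rules["legal"](state), key=wins)

      print("from 10 stones, take:", generic_player(nim, 10, me=0))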

  • "Laptops Are Hot; Maybe Too Hot"
    Wired News (07/13/05); Gain, Bruce

    Though laptops are generally cooler than they were five years ago, making allowances for more powerful components that generate additional heat is an increasingly formidable challenge for designers and manufacturers. Hewlett-Packard mechanical-development engineer Jeff Lev says CPU temperatures are on the rise again as customers demand faster, more powerful laptops. Most laptop cooling systems distribute heat away from the chipset via liquid coolants, while fans can control temperature to a certain degree. Heat is often a bigger problem for thinner and lighter notebooks, since smaller laptops have less space for ventilation and heat dissipation. But analyst Rob Enderle says that while larger models can control heat more easily, the device's bottom, keyboard, or wrist rest often becomes uncomfortably hot even with state-of-the-art processors. Processors from Intel and AMD can now measure demand and gate off CPU clock cycles more efficiently when full power is unnecessary; Intel's SpeedStep and AMD's PowerNow technologies support multiple processor clock speeds and lower them to match the computing task, thus reducing heat levels and conserving battery life. Intel and AMD say thermal output per processor will not necessarily climb with the wide introduction of dual-core x86 PC processor architectures in 2006. More intelligent BIOS software created by laptop PC makers is also designed to reduce heat and boost power efficiency by lowering the brightness of displays.
    Click Here to View Full Article
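
    The clock-scaling idea is simple enough to caricature in a few lines. The sketch below imitates a SpeedStep/PowerNow-style policy: run at the lowest frequency step that can cover the current load, so heat and dynamic power drop when demand is light. The steps, thresholds, and power model are invented for illustration.

      FREQ_STEPS_MHZ = [800, 1200, 1600, 2000]

      def pick_frequency(load):
          # Choose the lowest step able to serve the load (0.0 to 1.0 of peak).
          for f in FREQ_STEPS_MHZ:
              if load <= f / FREQ_STEPS_MHZ[-1]:
                  return f
          return FREQ_STEPS_MHZ[-1]

      # Dynamic power scales roughly with frequency (voltage effects ignored).
      for load in (0.1, 0.5, 0.9):
          f = pick_frequency(load)
          print(f"load {load:.0%}: {f} MHz, ~{f / FREQ_STEPS_MHZ[-1]:.0%} of peak power")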

  • "Tapping Into Tinkering"
    Washington Post (07/12/05) P. D1; Musgrove, Mike

    Some electronics companies are starting to reevaluate those who tinker with their products, treating these enthusiasts as a source of new ideas; some even promote their modifications or incorporate them into their wares. Institute for the Future director Paul Saffo says watching what tinkering users do is a sensible business strategy, arguing that it is the "fanatics and renegades and people in garages" who are truly responsible for dramatically boosting a product's value with their innovations. However, such activity makes many consumer-electronics companies nervous for various reasons, including its potential to void a product's warranty or threaten the company's bottom line. Sony is trying to discourage hackers from reprogramming its PlayStation Portable (PSP) by issuing software that owners must install to play the latest games and that destroys unauthorized modifications. Sony's argument is that the same tinkering that lets the PSP perform seemingly innocuous functions, such as sending instant messages, also enables more sinister activities, such as game piracy. Make magazine editor Phillip Torrone thinks this approach could ultimately prove ineffective, as attempts to deter hackers often strengthen their resolve. He believes that "the really smart companies should release their products to the alpha geeks for six months and let the alpha geeks play around with them," a strategy that would cut research and development costs as well as increase the companies' chances of releasing much better products.
    Click Here to View Full Article

  • "Apache Falls Victim to OASIS Shelter"
    ZDNet (07/11/05); Berlind, David

    The Organization for the Advancement of Structured Information Standards (OASIS) operates under the pretense of open standards, though in reality OASIS specifications often render services unwieldy and inaccessible, writes David Berlind. Many open source promoters staged a boycott of OASIS this year to protest its misleading policies, and the Apache Foundation recently encountered licensing obstacles when it tried to implement WS-Security. Apache thought the Web service specification was open, but it learned that licenses were required from both IBM and Microsoft. Definitions of open source vary: some advocate reasonable and non-discriminatory (RAND) discretion for patent holders, while others believe that RAND clauses undermine open source altogether. Even when they honor the letter of open source, some license agreements violate its spirit with non-transferability clauses, Berlind notes. Also, some open source license agreements, such as the one between IBM and the Eclipse Foundation, sanction the theft of intellectual property from the licensee with no threat of reprisal; in IBM's case, the Common Public License dictated that were the aggrieved party to sue for misappropriation, it would lose its license. As it functions now, OASIS is a patent shelter, protecting its clients through coercive licensing agreements and impeding open source development, Berlind concludes.
    Click Here to View Full Article

  • "Turning Your Life Into Bits, Indexed"
    Los Angeles Times (07/11/05) P. A19; Hiltzik, Michael

    Vannevar Bush's vision of Memex, an electronic archive of a person's entire life, is becoming a reality some 60 years after the science advisor penned his provocative article "As We May Think" in a 1945 issue of the Atlantic Monthly. Microsoft distinguished engineer Gordon Bell has developed MyLifeBits, a storage project that he estimates can house the substance of an entire lifetime, including letters, telephone conversations, home movies, and photographs. Capitalizing on cheaper data storage and increased digitization, MyLifeBits, soon to be renamed Memex in an homage to Bush, will be able to house all this data on a 1 TB hard disk. Bell launched the project when he received several boxes of his old papers, and set out to digitize all of them before turning to other memorabilia from his life; in total, Bell found that his life amounted to 16 GB. Bell and research associate Jim Gemmell devised a cross-referencing index system that links different formats of data to mirror the way our mental associations work. As the project has evolved, it now includes 84,300 emails Bell has sent and received, and 53,400 Web pages he has visited. Bell and Gemmell have pitched their idea to neuroscientists for applications such as helping Alzheimer's patients. So far the response has been muted, though the researchers believe that widespread adoption of the MyLifeBits concept is inevitable. Gemmell says, "It's like bucking the trend toward literacy. That reduced our ability to remember things orally, but gave us the permanency of books."
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
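
    The cross-referencing index Bell and Gemmell describe can be pictured as a graph over heterogeneous items. The sketch below is hypothetical (not MyLifeBits' actual schema): items of different kinds are linked by association, and a recall query walks outward from one memory to its neighbors.

      from collections import defaultdict

      items = {
          "email:1042": {"kind": "email", "subject": "trip photos"},
          "photo:0007": {"kind": "photo", "caption": "harbor at dusk"},
          "web:9310": {"kind": "webpage", "title": "ferry timetable"},
      }

      links = defaultdict(set)                  # undirected association links

      def associate(a, b):
          links[a].add(b)
          links[b].add(a)

      associate("email:1042", "photo:0007")     # the photo was attached to the mail
      associate("email:1042", "web:9310")       # the page was cited in the mail

      # Recall by association: everything linked to one remembered item.
      for neighbor in links["email:1042"]:
          print(neighbor, "->", items[neighbor])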

  • "New-Age Keyboard: Trace, Don't Write"
    CNet (07/11/05); Kanellos, Michael

    IBM Almaden Research Center scientist Shumin Zhai promoted the experimental Shorthand-Aided Rapid Keyboarding (Shark) system at the New Paradigms for Using Computers conference on July 11. Shark is a pen-based shorthand technique whereby users enter words into mobile devices by tracing them letter by letter on a virtual keyboard. "It uses geometric patterns to represent words," Zhai explained. Shark operates on the principle that users start to commit common words and word elements to memory fairly quickly, which lowers their reliance on visual guidance; the user's final pattern is evaluated, interpreted as a word in the database, and converted into on-screen text by the computer. IBM says Shark circumvents many of the difficulties associated with speech and handwriting recognition, and the keyboard system aligns well with natural English language patterns. Complex scribbles do not overstress the system, since the average word is 4.7 letters long. Shark interoperates with numerous keyboard types, although QWERTY and alphabetically arranged keyboards are not well-suited to the system. IBM is experimenting with Shark using a tweaked version of its Atomik keyboard, which maximizes letter associations through its unique key arrangement.
    Click Here to View Full Article
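
    The geometric matching Zhai describes can be approximated crudely: each vocabulary word defines an ideal trace through its letters' key positions, and an input trace is assigned to the nearest template. The grid layout, resampling, and distance measure below are illustrative only; IBM's recognizer is far more sophisticated.

      KEY_POS = {c: (i % 10, i // 10)           # toy keyboard grid
                 for i, c in enumerate("qwertyuiopasdfghjklzxcvbnm")}

      def template(word, samples=20):
          # Resample the word's key-to-key path to a fixed number of points.
          pts = [KEY_POS[c] for c in word]
          out = []
          for k in range(samples):
              t = k * (len(pts) - 1) / (samples - 1)
              i, frac = int(t), t - int(t)
              j = min(i + 1, len(pts) - 1)
              out.append((pts[i][0] + frac * (pts[j][0] - pts[i][0]),
                          pts[i][1] + frac * (pts[j][1] - pts[i][1])))
          return out

      def match(trace, vocabulary):
          # Return the word whose template path lies nearest the trace.
          def dist(word):
              return sum((x1 - x2) ** 2 + (y1 - y2) ** 2
                         for (x1, y1), (x2, y2) in zip(trace, template(word)))
          return min(vocabulary, key=dist)

      print(match(template("shark"), ["the", "and", "word", "shark"]))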

  • "DHS Information Security Plans Lacking, GAO Says"
    GovExec.com (07/11/05); Pulliam, Daniel

    A new Government Accountability Office (GAO) report determined that the two-year-old Department of Homeland Security (DHS) has struggled to comply with the Federal Information Security Management Act of 2002 and still lacks an adequate information security program. The 36-page report, requested by Sen. Joseph Lieberman (D-Conn.), concluded a nearly year-long review of the department's cybersecurity procedures. The GAO assessed five security areas (risks, security plans, security testing and evaluations, corrective action plans, and continuity of operations plans) and found the DHS' four major divisions lacking in most of them, with no single division satisfactory in more than two. The report said DHS currently has only a limited ability to protect its data and computer systems. Lieberman says, "How can the department possibly protect the nation's critical cyberstructure if it cannot keep its own house in order?" The report said DHS has made "significant progress in developing and documenting a department-wide information security program" that could serve as its security framework, but the program has not been implemented. DHS CISO Robert West says the agency is making progress in improving its cybersecurity, noting the success of a pilot certification and accreditation program and next month's completion of a department-wide inventory project.
    Click Here to View Full Article

  • "The mSpace Classical Music Explorer: Improving Access to Classical Music for Real People"
    University of Southampton (ECS) (07/03/05); Schraefel, M.C.; Smith, Daniel Alexander; Russel, Alistair

    The mSpace Classical Music Explorer is designed for classical music enthusiasts who lack domain expertise and therefore derive little value from common Web search tools such as keyword search, and the mSpace framework can be extended to virtually any domain of interest. Research demonstrates that the mSpace Explorer can augment users' experience and ability to access domains through an effective multicolumn layout and associated information: Each column stands in for a domain dimension (era, composer, genre, etc.), while a pane below the columns displays information appropriate to a selected item. A critical element of the mSpace Explorer is the preview cue, which allows a user to quickly sample portions of the domain to determine which areas merit further investigation, and thereby facilitates information triage. Analysis shows that performance does not depend on how many cues are used, although users of multiple cues tend to explore selected areas more deeply, while single-cue users triage a broader domain area faster. Preview cues can be selected at random, matched to expert lists, or recommended by the community. The mSpace Explorer complements preview cues with a spatial rather than temporal multicolumn layout, allowing the user to maintain awareness of contextual information and making remembrance of what preceded the current selection less critical. The mSpace software framework is open source, and lets the interface accommodate any data source by enabling dynamic slicing, sorting, addition, subtraction, info views, and preview cues. Semantic Web technologies support the automated processes that allow new information to be discovered and associated so that the domain can be enhanced.
    Click Here to View Full Article
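
    The multicolumn slicing at the heart of the Explorer is straightforward to sketch: each column is a dimension, and a selection in one column narrows what the next column shows. The data and dimension order below are hypothetical.

      tracks = [
          {"era": "Baroque", "composer": "Bach", "piece": "Toccata in D minor"},
          {"era": "Baroque", "composer": "Vivaldi", "piece": "The Four Seasons"},
          {"era": "Romantic", "composer": "Brahms", "piece": "Symphony No. 4"},
      ]

      def column(items, dimension, **selected):
          # Values for one column, given selections made in earlier columns.
          rows = [t for t in items
                  if all(t[d] == v for d, v in selected.items())]
          return sorted({t[dimension] for t in rows})

      print(column(tracks, "era"))                        # ['Baroque', 'Romantic']
      print(column(tracks, "composer", era="Baroque"))    # ['Bach', 'Vivaldi']
      print(column(tracks, "piece", era="Baroque", composer="Bach"))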

  • "New Battle Brews Over UCITA, Software Licensing Terms"
    Computerworld (07/11/05) P. 1; Thibodeau, Patrick

    Although state-by-state adoption of the Uniform Computer Information Transactions Act (UCITA) was shot down by heavy resistance, critics contend that the act is still very influential, and the software users who led the opposition are developing their own model software-licensing law. "If there is a void and UCITA is the only thing to take the place of the void, this could end up being the model almost by default rather than choice," warns Riva Kinstlick with Prudential Financial. John McCabe with the National Conference of Commissioners on Uniform State Laws is skeptical that UCITA will hold much legal sway, arguing that courts will emphasize prior cases rather than unpassed legislation. UCITA proponents claimed the law would give online commerce a legal framework, while critics countered that UCITA's default rules primarily serve vendors and permit such underhanded practices as shipping products known to be defective. Despite the suspension of UCITA's state-by-state adoption push in August 2003, University of Arizona law professor Jean Braucher says the law can still be used as a model for licensing contracts. Americans for Fair Electronic Commerce Transactions (AFFECT) is working on a model software-licensing law based on principles that are more permissive than UCITA's; for example, the AFFECT principles allow reverse-engineering of software products. Phil Zwieg with the Society for Information Management says smaller businesses that lack the legal staff or influence to negotiate licensing contracts will especially benefit from the AFFECT initiative.
    Click Here to View Full Article

  • "Many Minds, One Goal: Curb Bad Traffic"
    Network World (07/11/05) Vol. 22, No. 27, P. 1; Greene, Tim

    More than 50 academic and commercial experts convened at the Steps to Reducing Unwanted Traffic on the Internet (SRUTI) workshop last week to discuss new approaches to fortifying the Net against malware, spam, and other forms of network-strangling traffic. MIT professor and SRUTI co-chair Dina Katabi says the 13 papers presented at the event provide a starting point for creating a faster Internet, although many proposals invited criticism. One suggestion for an email encryption technique that would also authenticate the sender and receiver was rejected when someone noted that the method would neutralize spam filters that search content and subject lines for key words. Jussara Almeida of Brazil's Universidade Federal de Minas Gerais outlined an anti-spam algorithm for analyzing email senders and receivers that, when used with a traditional spam filter, can reduce the amount of good email falsely tagged as spam by roughly 20 percent; the algorithm gauges the likelihood that any email sent from a specific group of senders to a specific group of recipients is spam. Researchers from the University of North Texas, Denton, claim a voice spam detection server they developed can recognize senders of spam over IP telephony (SPIT) after only three calls to users in a given group. A proposal from Mark Handley of University College London is designed to block the majority of distributed denial-of-service attacks near their sources through the use of devices near servers and at ISPs' edge routers that mark and observe traffic.
    Click Here to View Full Article
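
    The group-to-group idea attributed to the UFMG work can be caricatured as follows: track how often mail between a given sender group and recipient group has been spam, and blend that prior with a conventional content score so legitimate traffic patterns rescue borderline messages. The smoothing and blending rule here are invented for illustration; the published algorithm is more involved.

      from collections import defaultdict

      history = defaultdict(lambda: [0, 0])     # (sender_grp, rcpt_grp) -> [spam, total]

      def record(sender_grp, rcpt_grp, was_spam):
          entry = history[(sender_grp, rcpt_grp)]
          entry[0] += int(was_spam)
          entry[1] += 1

      def spam_score(sender_grp, rcpt_grp, content_score):
          spam, total = history[(sender_grp, rcpt_grp)]
          prior = (spam + 1) / (total + 2)      # Laplace-smoothed group spam rate
          return (prior + content_score) / 2    # naive blend of the two signals

      # A history of legitimate traffic between two groups pulls down the
      # score a keyword filter alone might assign, cutting false positives.
      for _ in range(50):
          record("campus", "lab", was_spam=False)
      print(spam_score("campus", "lab", content_score=0.8))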

  • "The Internet's Next Evolution Beckons"
    Electronic Design (06/30/05) Vol. 53, No. 14, P. 59; Frenzel, Louis E.

    Internet traffic is increasing exponentially thanks to expanding IP video, Internet audio, and Voice over IP (VoIP), as well as the Net's status as a leading news source. An even bigger explosion in Internet traffic and data will stem from machine-to-machine (M2M) communications, in which any Internet-connected device could potentially participate; M2M's ultimate goal is to network every hidden, embedded computer-based device so that monitoring and control can be automated transparently. Scalability is key if the Internet is to accommodate the massive load M2M will generate, and several enabling elements are ready or nearly ready. The adoption of Internet protocol version 6 (IPv6) should provide enough Internet addresses for M2M, and also upgrade routing and autoconfiguration. A central component for M2M-enabling the Net is 100 Gbps fiber, and existing dark fiber will ease this deployment, although dispersion compensation and repeaters will need to be added. Data rates will later climb to the terabit and petabit levels through wavelength-division multiplexing, which carries multiple "colors" of light on a fiber simultaneously. Another factor in the Internet's evolution is the rollout of faster wireless services such as WiMAX broadband, advanced Wi-Fi, and Ultra-Wideband, while passive optical networks will enable broadband triple-play services to penetrate small businesses and households. The Internet2 consortium nurtures research and development that is expected to further enhance the Internet as well.
    Click Here to View Full Article

  • "Quantum Computing: Teaching Qubits New Tricks"
    Science (07/08/05) Vol. 309, No. 5732, P. 238; Seife, Charles

    One of the long-perceived obstacles to building a practical quantum computer was the apparent inability to correct errors in quantum information without destroying the information itself. The very act of measuring a quantum object wipes out the original as its information is converted to another format; however, a group of physicists that included Ray Laflamme of the University of Waterloo discovered in the mid-1990s that error correction is possible without transgressing the laws of quantum theory if the information is stored not on a single quantum object but on several objects simultaneously. Information stored in the relationship between those objects is immune to error and does not violate the no-cloning rule of quantum mechanics, since there is no need for duplication or reading. Laflamme and colleagues have taken this concept to an even higher level in a recent paper that argues for storing information in the relationship among the relationships between quantum objects. Such an approach is efficient enough to facilitate the generation of error-correcting codes with even smaller groups of quantum objects, and proves that several seemingly dissimilar techniques of quantum error correction are really the same. "This paper shows that you can reduce the passive kind [of error correction methods] to the active kind," says Williams College physicist William Wootters. Laflamme admits a killer application for this technique remains elusive, although he says the discovery demonstrates that "quantum error correction is much richer than we had thought."
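
    The underlying principle, measuring only relationships rather than the stored value, can be illustrated with the classical three-bit repetition code below: two parity checks locate any single flipped bit without ever reading the encoded value directly. The quantum codes in Laflamme's paper, and the operator codes built on relationships among relationships, generalize far beyond this sketch.

      def encode(bit):
          return [bit, bit, bit]

      def syndrome(block):
          # Two parity checks; together they locate any single flipped bit.
          return (block[0] ^ block[1], block[1] ^ block[2])

      def correct(block):
          flip_at = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(block))
          if flip_at is not None:
              block[flip_at] ^= 1
          return block

      block = encode(1)
      block[2] ^= 1                 # a single bit-flip error strikes
      print(correct(block))         # [1, 1, 1]; the parities located the error
      # Note that the checks compared bits only to each other; the decoder
      # never read the logical value, the step quantum mechanics forbids.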

  • "What Will Top the IT Agenda in 2010?"
    eWeek (07/11/05) Vol. 22, No. 27, P. 39; Coffee, Peter

    In a recent roundtable discussion, IT experts at work in different industries outlined their visions of how the IT world will develop in the next five years. Tom Miller of FoxHollow Technologies believes service-oriented computing will be a key driver, so that infrastructures can scale to fluctuations in growth; he also said limitations of mobile devices, such as bandwidth and availability, pose a central challenge for IT, as does personalized information management. Duke Energy's Kevin Wilson believes security will be a prime mover in the future, possibly forcing companies to significantly control their information flow. He says in five years, "Maybe the world is a big spider web of virtual networks instead of a big, open highway." Kevin Baradet, CTO at the S.C. Johnson School of Management at Cornell University, said academia is modeling itself after the business world in creating the secure, non-Internet networks Wilson outlined, as well as voluntarily complying with the Sarbanes-Oxley Act; he also predicted a move to larger servers, with virtualization supplanting physical PCs. Robert Rosen, CIO at the National Institute of Arthritis and Musculoskeletal and Skin Diseases, predicted that mobile technology will play an important role as telecommuting becomes more pervasive, and that IT will transcend the arena of cost reduction to become more solution-driven. Rosen also identified ongoing innovation and grid computing as areas of focus. Sutter Health CIO Nelson Ramos predicted a trend toward standardization and formalization, noting that HIPAA compliance will still dictate many disclosure issues for the health care industry. In terms of human talent, the group agreed that a diversity of skills and willingness to learn will be critical, as well as a greater understanding of the business process.
    Click Here to View Full Article

  • "Assertive Debugging: Correcting Software as If We Meant It"
    Embedded Systems Programming (06/05) Vol. 18, No. 6, P. 28; Halpern, Mark

    Programmer/software designer Mark Halpern describes the Assertive Debugging System (ADS) as a scheme that will abridge the current debugging process and enable the systematic, documentable debugging of software objects, which he believes will soon become a legal requirement. ADS deals with bugs in program implementation that stem from substantive or logic errors; these are the most dangerous kinds of bugs because they are trivially easy to introduce, often go unnoticed, and cause no immediate failure. Halpern's approach is designed to make bugs manifest at the earliest possible moment so corrective action can be taken before continued program execution covers up their existence, and it does so by monitoring the behavior of many variables at run time in search of violations of assertions the programmer made when defining those variables. The assertions are expressed in a notation that is a natural offshoot of the programming language, and they can be clustered so the programmer can trigger or mute sets of related assertions with a single command. As a subject program is compiled, the activated assertions produce code within the object program that checks the applicable variables for any breaches of the behavioral constraints the programmer defined. Once the code detects a violation, the program's execution is stopped and the programmer-specified exception action is taken. Most programmers argue that ADS is unaffordable, but Halpern counters that the approach yields value in every execution, a claim that cannot be made for current debugging practices. In addition, the cost of ADS in machine cycles is more than offset by what it conserves in project schedule slippage, software-engineer time, and time-to-market.
    Click Here to View Full Article
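
    Halpern's mechanism, checking a programmer-supplied assertion on every assignment to a monitored variable and halting at the first violation, transplants readily to other languages. The Python sketch below uses a descriptor to illustrate the idea; the class and syntax are ours, not ADS's actual notation.

      class Monitored:
          # A descriptor enforcing a programmer-supplied assertion on writes.
          def __init__(self, predicate, description):
              self.predicate, self.description = predicate, description

          def __set_name__(self, owner, name):
              self.name = "_" + name

          def __get__(self, obj, objtype=None):
              return getattr(obj, self.name)

          def __set__(self, obj, value):
              # Fail at the moment of the bad write, not later.
              assert self.predicate(value), \
                  f"{self.name[1:]}: {self.description}, got {value!r}"
              setattr(obj, self.name, value)

      class Account:
          balance = Monitored(lambda v: v >= 0, "must stay non-negative")

          def __init__(self):
              self.balance = 100

      acct = Account()
      acct.balance -= 30                        # fine
      try:
          acct.balance -= 90                    # violation surfaces immediately
      except AssertionError as err:
          print("halted:", err)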

  • "The Call to eConnect"
    Governing: Outlook (06/05) P. 10; Carey, Mary Agnes; Reichard, John

    President Bush wants electronic medical records for most Americans within the next 10 years, but a considerable amount of financial investment and cooperation is necessary. To make the private sector more accepting of health-information technology, Bush administration officials will probably push for a public-private alliance whereby the government sets guidelines or objectives and helps the private sector achieve them through collaboration. The initial capital for the hardware, software, and brainpower to make the technology work must come from somewhere, and government officials will need to work with health care providers to ensure the security of medical records as well as the easy retrieval of information by health professionals. Establishing interoperability between health care computer systems throughout the nation is the goal of an effort between a federal task force and health care industry experts, but the financial responsibility for putting health care information systems into operation lies with doctors and hospitals. Their reluctance to pay for such systems themselves could be overcome through a "pay for performance" system proposed by the Medicare Payment Advisory Commission; the scheme would reward health care providers with higher payments if their quality of care receives high marks. Another potential impediment is older physicians' unwillingness to incorporate IT into their practices, while a further challenge to the government is encouraging industry adoption of standards to guarantee data-sharing between computers. Secretary of Health and Human Services Michael Leavitt reports that improving direct communication between health care providers could shave 20 percent off federal health care expenditures, and some estimates indicate that a switch from paper to electronic medical records could save $78 billion a year within a decade.

  • "How to Ride the Fifth Wave"
    Business 2.0 (07/05) Vol. 6, No. 6, P. 78; Copeland, Michael V.; Malik, Om

    Computing is poised for a fifth wave of evolution, impelled not by a single invention or its manner of corporate deployment, but by the convergence of inexpensive computing devices, ubiquitous bandwidth, and open tech standards that will together make computing universally accessible. The four previous computing waves, spaced roughly a decade apart, were marked by the acceptance of mainframes by the business world in the 1960s; the widespread use of the minicomputer in the 1970s; the advent of the PC in the 1980s; and the emergence and adoption of networking, the Internet, and distributed computing in the 1990s. The fifth wave distinguishes itself from the other waves in that it is primarily consumer-fueled, and that the technological infrastructure is already established and awaiting software applications. A major uptick in corporate IT spending will be concurrent with and driven by the fifth wave, which will provide an environment for cultivating new industries as well as giving entrenched ones extreme makeovers. The wave is expected to be a great leveler, enabling startups and entrepreneurs to compete with the old industry behemoths thanks to the vast number of opportunities, cheap technologies, and increasingly lower barriers for market penetration. A defining characteristic of the fifth wave is a company's ability to effortlessly scale services from a single customer to millions, which is possible via software "Amazon-ification," says venture capitalist Gordon Ritter. He believes all software will soon follow the Amazon model, offering customization to individual users and inferring user preferences.
    Click Here to View Full Article
    (Access to the full article is available only to paid subscribers and AOL members.)


 