Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published every week on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either Thunderstone or ACM. To send comments, please write to technews@hq.acm.org.
Volume 7, Issue 776: Friday, April 8, 2005

  • "U.S. Slips in Coding Contest"
    CNet (04/07/05); Frauenheim, Ed

    China's Shanghai Jiao Tong University, Moscow State University, and the Saint Petersburg Institute of Fine Mechanics and Optics took top honors at ACM's 2005 International Collegiate Programming Contest, held in Shanghai this week. The University of Illinois ranked 17th in the world finals, a new low for U.S. teams in the competition's 29-year history; experts say the result reflects the nation's declining global technology leadership amid the ascendancy of international competitors such as China, India, and Korea. It has been eight years since a U.S. school captured the top honors. The shift of some U.S. tech companies' research and development operations to Asia is one indication of world tech leadership's migration from America to Asian countries. Meanwhile, Intel's Craig Barrett and other tech leaders note that the dot-com bust and reports of offshore outsourcing are discouraging American students from enrolling in computer science courses, and that fewer foreign students are applying to U.S. graduate schools. Observers suggest that U.S. elementary and secondary schools step up their efforts to boost students' interest in technology; higher teacher salaries and education tax incentives that let parents pay for private-school tuition are among the educational reforms proposed. Raising the wages of U.S. tech professionals and enlarging the federal computing research budget have also been suggested as ways to cultivate U.S. tech leadership.
    Click Here to View Full Article

    For information on the results of ACM's 2005 International Collegiate Programming Contest, visit http://icpc.baylor.edu/icpc/Finals/default.htm.

  • "As Government Cap on Work Visas Rises, So Does Confusion"
    Washington Post (04/08/05) P. E1; Kalita, S. Mitra

    When U.S. businesses complained that the government's limit on H-1B visas for foreign workers had already been reached by the time fiscal year 2005 began, Congress passed a measure last November raising the 65,000 cap by an additional 20,000 visas for graduates who earned advanced degrees from U.S. institutions. However, last month's statement by U.S. Citizenship and Immigration Services that the H-1Bs could go to anyone with a bachelor's degree has bred uncertainty among businesses and immigration lawyers. The H-1B program's critics claim employers are using the visas as an excuse to bring in cheaper labor instead of looking for domestic talent, and are paying imported workers less than the prevailing wage mandated by the program. Employers refute this accusation, citing substantial increases in H-1B application filing fees as well as lawyers' fees and moving costs; Granite Services' Dierdre Spear Petee argues, "It is really and truly for our business meant to find qualified workers." Some entrepreneurs say the investment costs of H-1Bs, combined with uncertainty about the program, make offshore outsourcing more attractive. Meanwhile, businesses' H-1B decisions will have to wait, as the criteria for the remaining 20,000 visas are still under review by the Office of Management and Budget and Homeland Security. Citizenship and Immigration Services spokesperson Christopher S. Bentley says, "The hows, where and whats will all be summed up in that guidance...We want to make sure this is done properly."
    Click Here to View Full Article

  • "View From the High Ground: Yale's Joan Feigenbaum"
    Technology Research News (04/13/05); Smalley, Eric

    Yale University computer science professor and ACM Fellow Joan Feigenbaum lists the continuing decline of data storage costs and the increasing penetration of computers into everyday life as two of the most significant IT trends; the result is the increased creation, capture, and storage of sensitive data whose potential for abuse has ignited widespread concern about its privacy, accuracy, security, and appropriate use. Feigenbaum foresees the continuation of these trends, as well as the pervasive presence of various surveillance systems and sensor nets, and her recommendation is to couple sensitive data to policy metadata that is simple to understand and enforce. She cites the Electronic Privacy Information Center's Marc Rotenberg, who said that cyber-world development projects should include privacy-impact analyses, in much the same way that real-world development projects should include environmental-impact analyses. Feigenbaum argues that properly designed and deployed digital rights management (DRM) can be "one, relatively small part" of a content-distribution service, and notes that the media tends to oversimplify the DRM issue by characterizing it as a clash between technology and intellectual property. She says it is her hope that Congress will end the overriding influence of "analog-era content distributors'" interests and bring copyright law up-to-date once the effective use of the technologies by commercial and non-commercial distributors is clearly understood. Feigenbaum laments DARPA's cutbacks on basic research funding, which plays a key role in the development of many socially beneficial innovations. She believes that most of the computer science research community will be chiefly focused on ACM President Dave Patterson's Security, Privacy, Usability, and Reliability (SPUR) agenda for the next decade or two.
    Click Here to View Full Article

  • "Open-Source Referees Change the Rules"
    eWeek (04/07/05); Vaughan-Nichols, Steven J.

    In an effort to curb the proliferation of open-source licenses, the Open Source Initiative (OSI) board of directors instituted new open-source license approval criteria and a new system for classifying existing licenses at this week's Open Source Business Conference. The OSI declared that licenses must now be non-duplicative, clear, understandable, and reusable to win approval, and announced its migration to a three-tier classification system that labels licenses as preferred, ordinary (approved), or deprecated. The organization said the new policy will discourage the approval of the asymmetrical corporate licenses that began with Mozilla. "The central activity of the open-source community is to create, reuse and recombine source code," stated the OSI board, noting that the resulting mix of code written under different licenses sows confusion among users and distributors as to their rights and responsibilities. The group promised that community stakeholders will have a say in the grading of licenses via a public-comment process, though its modus operandi is a work in progress. The non-duplication requirement eliminates licenses that merely copy terms already found in previously approved licenses. The OSI said the point of the three-tiered classification system is "to define a small enough set of preferred licenses to make the interactions among them manageable." The OSI seeks to remove legal impediments to open-source projects with this new license approval and classification scheme.
    Click Here to View Full Article

  • "Game for Learning"
    Technology Review (04/07/05); Krotoski, Aleks

    The incorporation of computer gaming into the classroom is being encouraged by efforts and studies indicating that games can engage students more personally in the learning process and significantly improve their creative thinking and test scores. A 2001 U.K. Home Office report concluded that regular players of computer and video games have a higher probability of academic success, university enrollment, and gainful employment, while a 1998 report found that children using interactive entertainment improved their reading and comprehension skills faster than those taught by traditional tutorial processes. Meanwhile, a 2004 study from the University of London's Institute for Education (IoE) argued that computer games can help develop critical thought and social skills, and could be employed as a text to analyze narrative structure and character development. Commercial video games are especially valued by teachers because they often have an epic scope and focus on real-time resource administration. Chew Magna Carta School students in Tim Rylands' class learn about creative writing through a modified version of the role-playing game "Myst," while Microsoft's historical real-time strategy game "Age of Empires" is valued as a tool for educating learners about the social and mechanical processes that nurtured the Bronze and Iron Ages. IoE report co-author David Buckingham cautions that teachers are still not very games-savvy, and should acquaint themselves fully with the subject before incorporating it into their curricula. He also noted at a recent seminar that "The evidence that this kind of learning will motivate all students is questionable." The public's negative feelings toward ultraviolent commercial games could also complicate the matter.
    Click Here to View Full Article

  • "A Long and Winding Road for IT Women"
    Computerworld Canada (04/01/05); Ho, Vanessa

    With the percentage of women in technology fields dropping to mid-1960s levels in Canada, attendees at a Canadian Information Processing Society gathering worried a greater "geek" stigma would be attached to the field and that the industry would not be able to meet future recruiting requirements. Analyst Roberta Fox said school counselors were still discouraging girls from pursuing careers in IT, on top of the biased advice many girls received from friends and family. For that reason, Fox continues to attend school career days to encourage young women to explore careers in technology. The Software Human Resources Council reports that the percentage of women in IT has dropped from 25.4 percent in March 2000 to just 22.8 percent in November 2004. While the decline is unexplained, it could make the IT workplace even less appealing to new female entrants. Fox said there were general differences between men and women IT workers, especially women's tendency to focus on relationships at the cost of their own needs and careers. But relationship-building skills are increasingly in demand as the IT workforce is pushed closer to business functions, she noted. Many women in IT find roles as project managers, business analysts, and help desk staffers because of their relationship-oriented skills.
    Click Here to View Full Article

    For more information on ACM's Committee on Women in Computing, visit http://www.acm.org/women.

  • "DNS System in Need of Upgrade"
    VNUNet (04/05/05); Sanders, Tom

    The Internet's Domain Name System (DNS) is crucial to the functionality of the Internet and therefore must be improved in order to handle threats from hackers and the continued growth of worldwide Web domains, warns a group of leading computer scientists in a report released by the National Academies' National Research Council. Thirteen operators around the world run the root DNS, a network of servers containing the records of all Internet domains that are needed to direct users to Web sites. Last summer, the system faced an attack that left several search engines down for about two hours. In addition, the DNS has faced recent threats from "pharming," a practice in which hackers re-route Internet traffic to fraudulent sites. The report, called "Signposts in Cyberspace: The Domain Name System and Internet Navigation," notes that technologies such as DNS Security Extensions software and Anycast servers can help combat such threats, but they need to be deployed faster and more efficiently in order to do so. In addition, study chairman Roger Levien recommends several improvements in the governance and administration of the DNS. The study also advocates the autonomy and continued authority of ICANN, which has been facing increased pressure from commercial and political interests.
    Click Here to View Full Article
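    The protective idea behind DNS Security Extensions can be sketched in a few lines: a resolver accepts an answer only if its signature verifies against a key it trusts, so a "pharmed" record pointing a familiar name at an attacker's server fails the check. The toy Python below substitutes an HMAC for DNSSEC's public-key signatures; the zone key, names, and addresses are invented for illustration.

```python
import hashlib
import hmac

# Toy stand-in for a DNSSEC trust anchor. Real DNSSEC uses public-key
# signatures (DNSKEY/RRSIG records), not a shared secret like this.
ZONE_KEY = b"example-zone-signing-key"

def sign_record(name: str, address: str) -> str:
    """Sign a name->address binding, as a zone operator would."""
    msg = f"{name}={address}".encode()
    return hmac.new(ZONE_KEY, msg, hashlib.sha256).hexdigest()

def verify_record(name: str, address: str, signature: str) -> bool:
    """A validating resolver accepts the answer only if the signature checks out."""
    return hmac.compare_digest(sign_record(name, address), signature)

# The legitimate answer verifies; a pharmed answer pointing the same name
# at an attacker's address cannot produce a valid signature.
good_sig = sign_record("bank.example", "192.0.2.10")
assert verify_record("bank.example", "192.0.2.10", good_sig)
assert not verify_record("bank.example", "203.0.113.66", good_sig)
```

    The sketch shows why faster DNSSEC deployment matters: without a verifiable signature chain, a resolver has no way to tell a poisoned answer from a real one.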

  • "Testing Time for Operators in a Brave New World"
    Financial Times-IT Review (04/06/05) P. 1; Taylor, Paul

    The telecommunications industry is eyeing IP Multimedia Subsystem (IMS) technology as a way to unify disparate communications channels, ease deployment of new services, enable new capabilities, and reduce costs. The migration path to IMS from traditional circuit-switched networks, however, is not clear, as only a few carriers have begun deployments in earnest; telecommunications equipment vendors, meanwhile, have made serious moves to shore up their positions as IMS leaders by purchasing IMS specialist firms and teaming with services partners. Moving from the current infrastructure, which uses the same fundamental circuit-switched networks as when the telephone was invented, to IMS means deconstructing vertically integrated networks and rebuilding them around a standards-based, horizontal network infrastructure. Network engineers describe the new architecture as having three layers: an access layer, a session-control layer, and an application layer. IP-based soft switches, media gateways, and session initiation protocol (SIP) technology are also important elements that will allow, for example, presence capabilities across various mobile and fixed-line terminals and seamless conversations carried from mobile to fixed-line networks. Lucent CEO Patricia Russo says IMS will allow "blended lifestyle services" where customers can choose from a bevy of features, such as real-time video sharing, interactive gaming, and videoconferencing. The new technology infrastructure is also expected to dramatically cut costs for carriers, and BT expects a significant payoff from its current multi-billion-dollar 21st Century Network project to create a single IP-based infrastructure. Wireless carriers in China and Eastern Europe are also moving quickly toward digital networks, and Sprint in the United States made IMS part of its 3G network contract with Lucent.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)

  • "Designing a 'Bionic Eye'"
    myDNA.com (04/04/05); Levy, Dawn

    In the Feb. 22 issue of the Journal of Neural Engineering, Stanford University physicists and ophthalmologists disclosed the design of an artificial vision system that can stimulate a retina with sharp enough resolution to enable a visually impaired person to orient himself toward objects, identify faces, watch television, read large fonts, and live independently. The researchers see the device as particularly helpful for people left blind by retinal degeneration, but tests with human subjects are at least three years away; until then, the system is being tested in rodents. The system incorporates a small video camera that captures light from objects and transmits the image to a wallet-sized computer processor, which in turn sends the image to an infrared LED-LCD screen mounted on a pair of transparent, virtual-reality-style goggles; the goggles reflect the infrared image into the eye and onto an implanted light-sensitive retinal chip, stimulating its photodiode array. Software that links image processing to motion detection takes advantage of the eye's natural image-processing strengths and is a key part of the system's advance over existing technologies. Stanford researcher Daniel Palanker of the Department of Ophthalmology and the Hansen Experimental Physics Laboratory says the optimal scheme involves implanting the chip in the nuclear layer of the degenerated retina, while placing it on the side of the retina facing the eye's interior attempts to harness the retina's remaining processing power. The image is amplified and further processed in system hardware external to the eye, while a battery implanted in the iris powers the device via solar energy. Palanker notes that the Stanford prosthesis also tracks the rapid intermittent eye movements needed to facilitate natural perception of images. The design gives users a visual acuity of 20/80 by employing a maximum pixel density of 2,500 pixels per millimeter.
    Click Here to View Full Article

  • "Magic Pen Writes New Computer Tech Chapter"
    China Daily (04/07/05)

    Wang Jiang and colleagues at Microsoft Research Asia spent four years developing a pen interface that enables users to modify digital documents by scribbling text on printed versions of the documents and converting it to electronic text in the onscreen versions. Wang, who was an engineering psychology professor at Zhejiang University before going to Microsoft, championed the idea that an appropriate interface and software could allow a document to remain digital even when printed, and his team used a simple digital camera as the basis for the pen's sensing apparatus. Wang's team developed software that lays down an almost invisible pattern on standard copy paper as a document is printed, so that the computer can determine the pen's position relative to the document as well as the specific document being altered, since each page bears a unique code. The camera is activated by a pressure sensor when the pen makes contact with the paper, and takes snapshots of the user's handwriting and stores them on a memory chip. The pen then wirelessly sends the images via a Bluetooth link to a nearby PC or laptop equipped with the appropriate software. Advanced computer vision algorithms categorize sequences of marks as manipulable words, shapes, or diagrams, and character recognition software then interprets the handwritten text. The user's marks manifest themselves on the computer screen as handwriting embedded in the document, which can then be transformed into typed text and rendered graphics with software tools. Wang says the interface can allow multiple collaborators to make comments on separate document printouts, which the computer can combine within a single file.
    Click Here to View Full Article
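    The position-coding trick described above can be illustrated with a one-dimensional toy: print a long bit pattern, and let the pen's camera see only a small window of it; if windows rarely repeat, a snapshot alone reveals where the pen is. The Python below is an invented simplification (the real system uses a two-dimensional pattern that also encodes a per-page ID, and such patterns are typically constructed so every window is unique rather than found by search).

```python
import random

# "Print" a long pseudorandom bit sequence; the pen's camera sees only a
# WINDOW-bit snapshot. With a 32-bit window, two windows are overwhelmingly
# unlikely to coincide, so the snapshot pins down the pen's position.
WINDOW = 32
random.seed(7)
pattern = [random.randint(0, 1) for _ in range(4096)]

def locate(snapshot):
    """Return the first offset at which the snapshot occurs in the pattern."""
    n = len(snapshot)
    for i in range(len(pattern) - n + 1):
        if pattern[i:i + n] == snapshot:
            return i
    return None

pen_position = 1234                                   # where the pen touched
camera_sees = pattern[pen_position:pen_position + WINDOW]
pos = locate(camera_sees)
assert pos is not None and pattern[pos:pos + WINDOW] == camera_sees
```

    Extending the same idea to two dimensions, and reserving some bits of each window for a page identifier, gives the "unique code per page" behavior the article describes.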

  • "Lessons in Cybersafety"
    ITworldcanada.com (04/05/05); Parkins, Robert

    The current Internet structure makes security breaches inevitable since it assumes reasonable behavior, warned Harvard Law School Internet and society executive director Jonathan Zittrain. Because attackers use the same channels through which machines receive legitimate input, there is always the chance that incoming data could be used to control computers. This situation is eroding privacy, Zittrain told attendees of the sixth annual privacy and security conference hosted by British Columbia's Ministry of Management Services. One way to solve the problem would be the creation of separate virtual networks that run atop the current infrastructure but are controlled so as to ensure the identities of participants; these secure networks would probably be administered by software companies, with their development prodded by government agencies that use their purchasing clout to demand greater security. Government and industry are colluding to conduct surveillance on citizens, warned ACLU Technology and Liberty Project director Barry Steinhardt. Private data brokers and "policy laundering" practices by government effectively negate domestic review of controversial government activity; policy laundering refers to government use of international organizations to develop policies by proxy outside of normal domestic purview, such as how new passport standards are being developed by the International Civil Aviation Organization. Secured Services chief technical officer Michael Smith said many IT security problems could be traced to application-centric architectures that create redundant accounts and complicated authentication processes. Identity lifecycle management systems can help streamline IT security by centralizing the creation, maintenance, and audit of identities.
    Click Here to View Full Article

  • "Designing Science-Friendly Supercomputers"
    HPC Wire (04/01/05)

    The PC era saw the relative decline of supercomputers as vendors invested more money in lucrative business and personal computing efforts. When the Japan Earth Simulator debuted in the spring of 2002 with a speed five times that of the nearest competitor, it sparked new discussion about the direction of high-performance computing. While off-the-shelf components were useful in building some supercomputers, standard PC technologies are inherently disadvantaged because PC applications do not access memory the same way scientific applications do. The High End Computing Revitalization Task Force concluded in 2004 that applications of national importance required alternatives to off-the-shelf technology and that federal agencies would have to work together. Computer scientists and mathematicians at Berkeley Lab recently published research in the Journal of the Earth Simulator arguing that the most effective way to design supercomputers is to examine the underlying scientific algorithms and target the designs to them. Berkeley Lab scientists involved a wide range of researchers who use high-performance computing in order to get a balanced view of what is needed for scientific computing. The team also worked with IBM and partners from other institutions to design the Blue Planet supercomputer, which emphasizes flexibility. The design addresses memory contention issues in previous IBM microprocessors and includes a new Virtual Vector Architecture for harnessing multiple processors in a single node. The technology will be included in Lawrence Livermore National Laboratory's ASC Purple computer, expected to go online this summer with approximately 100-teraflop performance. Berkeley Lab staff say the best overall approach to advancing high-performance computing involves building balanced scientific computers that are not made from off-the-shelf components, but also are not too specialized.
    Click Here to View Full Article

  • "New Protocol Can Defuse Turf Wars Over Information Sharing Among Federal Agencies"
    Kansas City infoZine (04/03/05)

    Following the Sept. 11 terrorist attacks, the U.S. government established 24 e-government programs designed to effect collaboration and the exchange of information between intelligence agencies with the aim of bolstering homeland security, but Penn State information science and technology professor Dr. Peng Liu says those programs do not offer strong enough incentives for efficient information sharing. A Penn State project has devised a new protocol, deployed with XML Web Services, that "provides incentives to engage in the trust-building process that will allow more information to be shared and to be shared more quickly," says Liu, who is lead researcher on the project. The protocol supports the gradual disclosure of information, which lowers the risk that an agency's interests will be harmed, while also building trust between the entities sharing data. "Nobody has any motivation to delay because they are helping each other," Liu reasons. The researchers used the protocol to model the exchange of information between two organizations by writing software for each organization and creating data. The protocol's deployment via XML Web Services means that it can be directly incorporated into current e-government systems, while also sparing agencies the headache of dealing with non-interoperable hardware and software. The research was funded by the National Science Foundation and the U.S. Energy Department, and detailed in a paper published in the February issue of the Journal of the American Society for Information Science and Technology.
    Click Here to View Full Article
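    The gradual-disclosure idea can be sketched simply: each agency releases one increment at a time and continues only while the other side reciprocates, so neither risks more than a single increment if trust breaks down. The Python below is an invented illustration of that round structure, not the Penn State protocol itself.

```python
# Two agencies alternate small disclosures; a withheld item (None) ends the
# exchange immediately, capping each side's exposure at one increment.
def gradual_exchange(a_items, b_items):
    """Alternate one-item disclosures; stop as soon as either side withholds."""
    disclosed_a, disclosed_b = [], []
    for a, b in zip(a_items, b_items):
        if a is None or b is None:   # one side withheld: stop sharing
            break
        disclosed_a.append(a)        # A risks only this one increment...
        disclosed_b.append(b)        # ...and B immediately reciprocates
    return disclosed_a, disclosed_b

# Full cooperation: everything gets shared.
a, b = gradual_exchange(["a1", "a2", "a3"], ["b1", "b2", "b3"])
assert a == ["a1", "a2", "a3"] and b == ["b1", "b2", "b3"]

# B withholds its second item: A's exposure is limited to a single item.
a, b = gradual_exchange(["a1", "a2", "a3"], ["b1", None, "b3"])
assert a == ["a1"] and b == ["b1"]
```

    Because each completed round raises the payoff of the next one, neither party gains by delaying, which is the incentive property Liu describes.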

  • "Bigger Phishes Ready to Spawn"
    CNet (04/06/05); Hines, Matt

    Security researchers say the growth of phishing attacks has slowed dramatically, but they warn that online criminals are crafting more sophisticated attacks that employ pharming, instant messaging platforms, cross-site scripting, and DNS poisoning. Phishing attacks are also targeting smaller groups of people who hold valuable information, enabling attackers to use more effective social engineering than general spam messages allow. Salesforce.com customers, for example, were targeted with phishing messages offering free trials of new application features; Anti-Phishing Working Group Chairman Dave Jevans suspects the thieves used stolen account names and passwords to obtain corporate information that could be resold to marketers or used for industrial espionage. An attack via the Yahoo! Messenger platform in March leveraged contacts in people's address books, and shows that phishers could also be targeting teenagers, who might be more prone to divulge personal information. Another innovative social engineering attack mimicked antiphishing messages from eBay and other firms, warning users not to release personal information via email, said Mail-Filters' Dan Ashby; among the legitimate links included in those messages was a link to a fraudulent site. Phishers are also becoming more professional, changing their techniques in response to publicized security information: when warnings about cross-site scripting were published, some attackers began loading content through Web pages' inline frames so that it would reach people who had turned off JavaScript.
    Click Here to View Full Article
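    One common defense against the link spoofing described above is to flag a message link whose visible text names one domain while its underlying href points somewhere else, as in the fake eBay messages. The Python below is a toy version of that heuristic, not a complete phishing filter; the example domains are invented.

```python
from urllib.parse import urlparse

def suspicious_link(visible_text: str, href: str) -> bool:
    """Flag a link whose displayed domain differs from its real destination."""
    # Bare domains in link text lack a scheme, so add one before parsing.
    shown = urlparse(visible_text if "//" in visible_text
                     else "http://" + visible_text).hostname
    actual = urlparse(href).hostname
    return shown is not None and actual is not None and shown != actual

# Visible text says eBay, but the href goes to an attacker-controlled host.
assert suspicious_link("www.ebay.com", "http://203.0.113.9/login")
# Text and destination agree: nothing to flag.
assert not suspicious_link("www.ebay.com", "http://www.ebay.com/help")
```

    Real mail filters combine many such signals (sender reputation, URL blacklists, content analysis), since attackers who control both the text and the href can trivially make them match.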

  • "A Trail of DNA and Data"
    Washington Post (04/03/05) P. B1; Saffo, Paul

    Institute for the Future director Paul Saffo envisions a future scenario in which biometric identity systems are used by law enforcement to monitor citizens. However, Saffo dismisses the reliability and security of identity that advocates claim such systems would provide as "pure science fiction," and is particularly worried that the adoption of DNA as a personal identifier will lead to abuses far worse than current identity theft practices, since DNA cannot be canceled or swapped like a stolen credit card number. Saffo writes that the exploitation of DNA and the genetic data it contains is irresistible, and he foresees the emergence of "a genetic marketplace not unlike today's consumer information business...swarming with health insurers attempting to prune out risky individuals, drug companies seeking customers, and employers managing potential worker injury liability." Pharmaceutical industries will need people's DNA to tailor special drugs for individuals, while law enforcement will make the disclosure of DNA a civic duty and national security requirement. Saffo strongly doubts that Americans will rebuff such measures, given their rampant willingness to sacrifice privacy in exchange for material goods. "Today's biometric advances are the stuff of tomorrow's hackers and clever crooks, and anything that can be detected eventually will be counterfeited," he notes. Saffo concludes that the only way to ensure identity's security and reliability is to effect dramatic revisions to technology and policy on a system-wide and nationwide basis.
    Click Here to View Full Article

  • "E-Records Research in Jeopardy"
    Federal Computer Week (03/28/05) Vol. 19, No. 8, P. 10; Sternstein, Aliya

    The Office of Management and Budget has cut funding for the 70-year-old National Historical Publications and Records Commission (NHPRC), endangering future electronic records research, according to field experts. The NHPRC is a relatively small grants program of the National Archives and Records Administration, but has a tremendous impact on electronic records research, says National Coalition for History director Bruce Craig. Many of the NHPRC's electronic records research grants were paired with equal or greater private-sector contributions, he says. The group funded approximately 70 electronic records research projects between 1979 and 2002, usually providing less than $300,000 per grant. The groundwork for NARA's Electronic Records Archive, the latest digital preservation initiative, was laid by the NHPRC. The commission's budget cut is surprising, given that President Bush last year reauthorized $10 million in annual funding for four more years and last month appointed two new representatives to the group. Society of American Archivists President Randall Jimerson says the elimination of the commission could seriously jeopardize the electronic records research infrastructure, especially the training of new archivists in handling electronic records. Grants already awarded are safe from the budget cuts, but proposals not yet funded could be cut.
    Click Here to View Full Article

  • "Linux Making Its Mark in Messaging"
    Network World (04/04/05) Vol. 22, No. 13, P. 21; Fontana, John

    Linux is starting to gain ground as an application-layer option, particularly with clustering, IP, and virtualization upgrades in the most recent kernel; this trend is evidenced by the numerous Linux versions of popular email and collaboration servers now available as well as similar offerings from smaller vendors. John Giantelli, senior IT director of the American Society for the Prevention of Cruelty to Animals, notes that his organization's conversion to Linux has resulted in a 30 percent drop in messaging costs, primarily due to the reliability and the less expensive hardware his mail server is running on. Scalix founder Julie Farris considers email to be a leading candidate for a Linux killer app. Reliability is another reason cited by Linux converts for switching, though experts say Linux-based messaging platforms cannot yet support features much beyond reliable email routing, mailboxes, and calendaring/scheduling. Osterman Research President Michael Osterman says a Linux-based messaging platform is a sensible choice for those who are adopting Linux to simplify management tasks, as well as those who dislike licensing revisions around software maintenance on the Microsoft platform. An Osterman Research survey of 103 respondents found that almost 43 percent were likely or certain to consider switching their back-end servers if it did not entail switching desktop clients, while about 32 percent said they would consider Linux or another alternative platform to Windows if it meant they could wipe the slate clean. Approximately 22 percent said they would rather have something other than Windows be their next messaging platform.
    Click Here to View Full Article

  • "RoboGames: Battling Bots, But No Killer App"
    EE Times (04/04/05) No. 1365, P. 6; Ohr, Stephan

    The RoboGames competition in San Francisco showcased the development of different robotic elements--such as electronics, sensors, electromechanics, and precision machining--but also reminded participants of the limits of robot technology. Robotics Society of America President David Calkins was somewhat apologetic, conceding that many of the robots were unrefined. Calkins believes robots will enhance living by taking care of specific tasks, such as lawn mowing and vacuuming, but says, "Robotics won't take over the world." RoboGames contestants included sumo wrestlers, maze runners, biped androids, combat robots, soccer-playing Aibo robot dogs, and even autonomous fire-fighting robots that had to distinguish between a candle and a light bulb, both of which emit heat and light. Tektronix general manager Bob Bluhm, whose firm sponsored the CM Robotics team, anticipated robots that could dismantle bombs and perform quick surgeries on the battlefield. The RoboGames combat robots showed more destructive capabilities, with the top heavyweight prize going to a 340-pound machine that spun like a top and used a barbed weapon to slice opponents. Sony Aibo robot dogs demonstrated soccer-playing abilities, using CMOS image sensors to identify playing field areas and the orange soccer ball. Some goalie Aibos refused to get up after catching the ball between their front paws, forcing human referees to perform a "hard reset"--lifting the robots out, orienting them, and putting them back.
    Click Here to View Full Article

  • "High-Tech's New Day"
    Newsweek (04/11/05) Vol. 145, No. 15, P. 60; Stone, Brad

    New and veteran venture capitalists are on the move to turn the Internet into a cash cow using both innovative and traditional ideas in the wake of the dot-com bubble's collapse. Spurred by such factors as Google's meteoric success, the Internet's penetration into everyday life, and the spread of broadband and mobile phones, these entrepreneurs are revisiting many dot-com concepts that may have seemed ridiculous a few years ago, and approaching them with a new perspective. Accel Partners venture capitalist Joe Schoendorf says these ideas became viable as management teams gained experience, concepts were more rigorously thought out, and a supporting infrastructure was established. Entrepreneurs have been encouraged to re-invest by free open-source software, cheap computer servers, and affordable office rents that allow startups to be launched easily and inexpensively. Less costly promotion via blogging and other new media outlets is also proving to be advantageous. As the economics of launching a Web business change, a shift in motivation may also be taking place among entrepreneurs: rather than pursuing an IPO, many are hoping to be acquired by Internet juggernauts such as Google. For instance, rumor has it that Yahoo! purchased the Flickr photo-sharing site for $35 million. As part of Yahoo!, "we'll be taken care of and can concentrate on what we like doing best, which is building the product," says Flickr co-founder Stewart Butterfield.
    Click Here to View Full Article