
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 701: Friday, October 1, 2004

  • "Senate Bill Aims at Makers of File-Sharing Software"
    New York Times (09/30/04) P. C7; Zeller Jr., Tom

    The Senate decided on Sept. 30 not to vote on the Induce Act, a bill that would make makers of peer-to-peer file-sharing software liable for any copyright violations committed by their customers. Critics argue that the language of the Induce Act, which targets anyone who "intentionally aids, abets, induces or procures" copyright infringement, is so broad that it could strangle innovation and threaten many legitimate activities. Supporters, on the other hand, insist that such legislation is needed to protect the country's creative culture and economy. Sen. Bill Frist (R-Tenn.) has remarked that copyrighted content such as books, movies, software, and music injects more than $500 billion into the U.S. economy annually and supports 4.7 million employees. Induce Act advocates, represented by the recording industry and its congressional allies, and opponents such as trade groups and Internet boosters have been negotiating and refining the bill's language for months, yet critics maintain that few of their concerns have been addressed. Others strongly doubt that new laws can rein in peer-to-peer technology: Although a company can be prevented from reaping financial rewards with peer-to-peer products, their argument goes, there is nothing to stop file-swapping software authors from creating such programs and willingly sharing them online. BigChampagne CEO Eric Garland thinks the Induce Act and similar legislation are attempts by desperate content industries to preserve an outdated business model that allows them to assert control over product consumption, but he insists that their adoption of a more mature model will lead to innovations that will "make people in the creative chain a lot of money." Nevertheless, critics and supporters concur that file-sharing software distributors may soon be assigned some measure of accountability for digital piracy, regardless of the Induce Act's ultimate fate.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Program Seeks Auditable Electronic Voting"
    iSeries Network (09/29/04); Ross, Cheryl

    Electronic voting systems have been heavily criticized for their vulnerability to hacking and malfunctions, which carry the danger of election fraud and voter disenfranchisement. The Verified Voting Foundation, founded by Stanford University computer science professor David Dill, believes the only real solution to these problems is to incorporate a voter-verified paper trail so that voters can confirm their votes and officials can hold accurate recounts. Verified Voting nationwide coordinator Pamela Smith notes that approximately 30 percent of U.S. voters will use direct record electronic (DRE) devices in the upcoming November election, but only DRE machines in Nevada will be capable of printing out a paper ballot, since that state has made voter-verified paper trails mandatory. Verified Voting members would prefer optical-scan ballots that electronically record votes while supplying a paper record filled out by the voter, along with on-site scanners that verify each ballot is properly filled out and allow voters to correct any errors; such measures would permit an election to continue even in the face of hardware malfunctions, as well as provide an auditable vote record for recounts. Verified Voting's TechWatch program has enlisted over 1,500 techies to report any e-voting incidents that occur at polling places throughout the United States. "The net result of all that information capture will be to improve election processes as we get down the road, and hopefully even solve some of the problems in real time on election day," says Smith. Verified Voting has also lobbied for the passage of legislation such as Rep. Rush Holt's (D-N.J.) HR2239, which calls for paper trails in all e-voting machines but is unlikely to be approved in time for the November elections.
    Click Here to View Full Article

  • "New Services Are Making It Easier to Hide Who Is Behind Web Sites"
    Wall Street Journal (09/30/04) P. A1; Bialik, Carl

    The emergence of services offering anonymity to Web site operators has sparked a dramatic rise in the number of sites with nameless owners, which in turn is generating concerns about criminal activity, stimulating discussions about online privacy, and even making an impact on political races. For a fee, some domain-name registrars can shield Whois info--customers' names and contact data--as well as filter emails and screen phone calls. Privacy proponents contend that the publication of Web site owners' Whois info in public databases should not be a requirement: For one thing, some owners have valid reasons for wanting to cloak their identity, while the disclosure of such data runs the risk of exploitation by spammers and identity thieves. Officials such as Alabama Securities Commission director Joseph Borg are worried that online anonymity services will encourage criminal activity. Registrars such as Go Daddy Software downplay this suggestion, insisting that they share contact data with law enforcement agencies and will sometimes de-cloak Whois info in response to complaints or subpoenas, or if customers violate the registrars' terms and conditions. Anonymous sites have been linked to political controversy, one example being a pair of sites encouraging voters to support Tim Torrey, departing mayor of Eugene, Ore., as a write-in candidate against primary winner Kitty Piercy this November. The identities of the Web sites' owners were shielded, but the speed of their registration convinced Piercy advocates that they were operating as a political action committee (PAC) and therefore had to register with the state, although Torrey denied any affiliation. Rep. Howard Berman (D-Calif.) backed a bill that would have made registrars at least partially accountable for ensuring the quality of their Whois info, but the measure did not win congressional approval.
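
    The data these services shield travels over the Whois protocol (RFC 3912), which answers plain-text queries on TCP port 43. Below is a minimal Python sketch of such a lookup against the public .com registry server; the domain is a placeholder, and the registrant fields in the reply are exactly what privacy services swap for proxy contact details.

      import socket

      def whois(domain, server="whois.verisign-grs.com", port=43):
          # Whois is a bare TCP exchange: send the query plus CRLF,
          # then read the plain-text answer until the server closes.
          with socket.create_connection((server, port), timeout=10) as sock:
              sock.sendall((domain + "\r\n").encode("ascii"))
              chunks = []
              while data := sock.recv(4096):
                  chunks.append(data)
          return b"".join(chunks).decode("utf-8", errors="replace")

      print(whois("example.com"))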

  • "Towards a New, More Acceptable Face for Biometric Security"
    IST Results (09/30/04)

    The European BioSec effort aims to make biometric security systems ready for widespread use, tackling public perceptions, biometric system security, European Union regulation, best practices, and standardization issues. The EU's Information Society Technologies program formulated the BioVision roadmap, which lays out goals for the biometric field over the coming years, and the BioSec project is a corollary effort that addresses 20 of the 38 research projects required by the roadmap. One of the most important aspects of the BioSec project is the security of biometric systems themselves, including 3D imaging and "liveness" detection to determine whether a biometric identifier is being presented by a living, intact person when it is read; ID tokens could also put users in control of their own biometric data instead of relying on government or corporate databases. Besides these security assurances, BioSec researchers are looking at public education, such as the need for appeal mechanisms if something is wrong with the system. Amputees may be excluded from some systems, and biometric systems will have to be designed so they do not record extraneous data such as a person's sobriety or physical health. The European Biometrics Forum is an overarching effort to define the obligations of biometric system operators as well as users' rights. Finally, technical standards must be settled before vendors can safely invest in systems knowing they will be compatible across national boundaries and with existing technology. There is currently no biometric-specific EU regulation, and data protection is already regulated as a responsibility of Member States; BioSec coordinator Orestes Sanchez-Benavente says initial biometric security deployments will have to deliver obvious efficiency and security benefits to users.
    Click Here to View Full Article

  • "Offshoring Forces Tech-Job Seekers to Shift Strategy"
    Washington Post (09/30/04) P. E1; McCarthy, Ellen

    Offshore outsourcing of technology jobs is pressuring workers to adjust their career paths, according to Carol L. Covin, OSITA founder and author of "20 Minutes From Home," who addressed a Sept. 24 meeting of the Association for Women in Computing's Baltimore branch. She said positions that involve sensitive tasks (computer security and network architecture, for instance) or rely greatly on face-to-face interaction (such as project management) are likely to remain stateside, while Washington, D.C.-area workers with security clearances have a virtual guarantee against offshoring. Paul Villella, CEO of the Virginia-based staffing firm HireStrategy, explains that unemployed tech professionals may need to move outside of "pure, narrow programming" in order to qualify for more stable employment. "Those skills are still the key thing to get the door open...but the big differentiator is the ability to translate that skill into something higher, being able to communicate it," he notes. Lloyd Griffiths, dean of George Mason University's School of Information Technology and Engineering, attributes the plummeting enrollment in the school's computer science program not to outsourcing, but to some other, still unclear factor. Though the Labor Department expects software engineering and computer specialist positions to remain among the fastest-growing jobs over the next 10 years, Griffiths foresees a serious deficit of trained American technical workers if current trends continue. "The demand is really increasing and the supply side is the side that's just not there, and I don't know what we're going to do about it, frankly," he laments.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Software-Defined Radio Advances on Several Fronts"
    EE Times (09/30/04); Merritt, Rick

    Software-defined radio (SDR) technology is moving forward with government and private-sector investment and standardization efforts. November will be a watershed month in several respects: The Institute of Electrical and Electronics Engineers (IEEE) 802.22 working group is expected to launch work on a standard for fixed-access systems that employ cognitive radio methods to detect and tap unoccupied TV spectrum between 54 MHz and 862 MHz, while the SDR Forum is slated to start defining an industry road map for SDR at its annual conference in Arizona. The IEEE initiative is directed at an FCC proposal to test smart SDR systems by opening up 300 MHz of unused UHF/VHF spectrum. Intel Labs wireless research manager Jeff Schiffer says the project "is the first opportunity to use cognitive radio in a commercial environment and it could give the FCC the impetus to open up other areas of spectrum to unlicensed devices." Interim chair of the IEEE 802.22 working group Carl Stevenson explains that such unused spectrum is perfect for setting up regional networks to supply broadband service in areas where wireline service is impractical because of sparse population density. Meanwhile, Schiffer says the SDR Forum's industry road map initiative will be assisted by commercial research awards from the National Science Foundation and the Defense Advanced Research Projects Agency. The military thinks SDR will provide radios that are flexible enough to operate in any country, while cell phone makers see SDR as a vehicle for consolidating radios in handsets and fixing bugs via software downloads. A major concern is SDR's vulnerability to hacking, but the SDR Forum will publish online within 30 days a document that sets out guidelines for securely downloading radio software, as well as issue to the private sector a request for information about security problems and resources.
    Click Here to View Full Article
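
    The channel-sensing step at the heart of cognitive radio can be illustrated with a toy energy detector: measure the power in a candidate TV channel and declare it vacant only if the level stays near the noise floor. The Python/NumPy sketch below is an assumption-laden illustration; the threshold, scaling, and simulated samples are invented and are not taken from any 802.22 draft.

      import numpy as np

      def channel_is_vacant(iq_samples, noise_floor_dbm=-95.0, margin_db=6.0):
          # Mean squared magnitude of the I/Q samples, converted to dBm
          # (assumes samples are volts across 1 ohm -- a sketch, not a spec).
          power_dbm = 10 * np.log10(np.mean(np.abs(iq_samples) ** 2)) + 30
          return power_dbm < noise_floor_dbm + margin_db

      # Simulated quiet channel: weak complex Gaussian noise.
      rng = np.random.default_rng(0)
      noise = (rng.normal(size=4096) + 1j * rng.normal(size=4096)) * 1e-7
      print(channel_is_vacant(noise))   # True: safe to transmit here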

  • "Nicta Humanises the Computing Environment"
    Computerworld Australia (09/27/04); Gedda, Rodney

    Improving people's understanding and sharing of information through data visualization methods is the goal behind Humans Understanding Machines (HUM), a National ICT Australia (Nicta) program that employs the new Visual Information Access Room (VIAR) in Sydney. Program leader and professor Peter Eades explains that HUM ultimately envisions a future where computer interfaces are ubiquitously deployed, to the degree that "The light that comes from every surface is controlled by a computer." Such interfaces will use walls as screens. Nicta CEO Dr. Mel Slater notes that the applications many people in Australia depend on are already producing a "glut" of data, one that will multiply in the years ahead. Visualization technology should help lighten the load. "Visualization technologies have a way of taking a look at complex sets of multi-dimension data and putting them in visual representations that let people extract the information that they need quickly," Slater points out. Eades says technologies such as the VIAR have practical business applications as collaboration interfaces incorporated into the office architecture--in fact, he thinks offices will come to resemble the VIAR 10 to 20 years from now. Eades sees the VIAR functioning as a surveillance center for stock market traders, among other things.
    Click Here to View Full Article
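
    As a small, hypothetical illustration of the transformation Slater describes, the sketch below uses the parallel-coordinates plot from pandas to collapse multi-dimensional records into a single picture in which groupings stand out; the data set is invented.

      import pandas as pd
      import matplotlib.pyplot as plt
      from pandas.plotting import parallel_coordinates

      # Made-up machine-load records; "class" is the grouping column.
      df = pd.DataFrame({
          "cpu":    [0.9, 0.2, 0.8, 0.1],
          "memory": [0.7, 0.3, 0.9, 0.2],
          "disk":   [0.6, 0.1, 0.7, 0.3],
          "class":  ["busy", "idle", "busy", "idle"],
      })
      parallel_coordinates(df, "class")   # one polyline per record
      plt.show()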

  • "Grants Will Preserve Paperless Bits of History"
    New York Times (09/30/04) P. E3; Hafner, Katie

    As part of a congressionally authorized $100 million initiative to preserve information resources that are increasingly digital in nature, the Library of Congress will announce on Sept. 30 a $15 million grant divided among eight institutions to archive Web sites, digital maps, audio recordings, and other kinds of electronic material. The recipients will supplement the awards with a matching contribution of cash, software, hardware, or consulting services. University of Michigan history professor Myron Gutmann, whose school is one of the grant recipients, explains that much of the material to be archived is scattered among various Web sites, computers, and research institutions, while some is still in paper form. "Our goal is to assure that the material remains accessible, complete, uncorrupted, and usable over time," he says (one common integrity safeguard is sketched below). A wide range of social science data that includes opinion polls on aging, politics, women's rights, race relations, and employment will be archived by the University of Michigan, in collaboration with partners. Other grant recipients include the University of Illinois at Urbana-Champaign, which will focus on the preservation of sound recordings; the University of California, whose goal is to archive Web sites relating to the state's 2003 gubernatorial election; Atlanta's Emory University, which will work with several partners to store material about slavery, the Civil War, and the civil rights movement; and North Carolina State University in Raleigh, whose projects involve collating and archiving digital cartographic data from U.S. counties. Among the problems complicating the maintenance of electronic archives are the rapid obsolescence of computer hardware and software, the limited durability of magnetic storage media, and the difficulty of keeping links in Web-based documents accessible. Effective preservation techniques for digital material are still being sought.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)
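
    One standard technique behind Gutmann's goal of keeping material "uncorrupted" is fixity checking: record a cryptographic checksum when an item is ingested and re-verify it at every audit or migration. The Python sketch below is a generic illustration of the idea, not any recipient's actual workflow; the file path is a placeholder.

      import hashlib

      def fixity(path, algorithm="sha256", chunk_size=1 << 20):
          # Hash the file in 1 MB chunks so large archives fit in memory.
          digest = hashlib.new(algorithm)
          with open(path, "rb") as f:
              while chunk := f.read(chunk_size):
                  digest.update(chunk)
          return digest.hexdigest()

      # On ingest:  stored = fixity("archive/item0001.wav")
      # On audit:   assert fixity("archive/item0001.wav") == stored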

  • "Clever Cars Can Read Road Signs"
    New Scientist (09/30/04); Graham-Rowe, Duncan

    Road signs could one day be supplanted by Global Positioning System-based information systems or roadside radio beacons, but until then, technologies such as the driver assistance system (DAS) from the National Information and Communications Technology Australia (NICTA) lab could help prevent accidents caused by drivers missing such signs. DAS uses a camera affixed to the rear-view mirror to scan the road ahead and two cameras mounted on the sides of the dashboard's instrument panel to track the motorist's gaze. Camera data is processed by an in-vehicle PC with software programmed to spot road signs and determine where the driver is looking, while the car's speed is also noted through a connection to the speedometer. Gareth Loy, formerly with the NICTA team, explains that previous color-based sign detection approaches were limited by variable lighting, but DAS sidesteps this problem by looking for the symmetrical shapes--octagons, diamonds, circles, rectangles--distinctive of road signs (a rough shape-first detector is sketched below). Once a road sign is detected by the camera, the computer compares its image to a sign database and monitors the driver's gaze via FaceLab software. If the driver is not focused on the sign, and the car is not responding appropriately to it, the system alerts the driver. Although some warn that sign detection systems could become irritating to drivers, NICTA developer Nick Barnes says DAS would have adjustable sensitivity and overrides. Preliminary trials of the system's performance reportedly went "very well," even at high speeds, and the NICTA team is expected to present the results at the International Conference on Intelligent Robots and Systems in Japan this week.
    Click Here to View Full Article
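
    The shape-based detection Loy describes can be approximated in a few lines: instead of matching colors, look for the regular geometry of signs. The sketch below uses OpenCV's circular Hough transform as a crude stand-in for NICTA's symmetry detector; the image file and tuning parameters are illustrative assumptions.

      import cv2
      import numpy as np

      frame = cv2.imread("road_frame.png", cv2.IMREAD_GRAYSCALE)
      assert frame is not None, "substitute a real dashcam frame"
      frame = cv2.medianBlur(frame, 5)   # suppress noise before edge voting

      circles = cv2.HoughCircles(
          frame, cv2.HOUGH_GRADIENT, dp=1.2, minDist=40,
          param1=120, param2=40, minRadius=8, maxRadius=60)

      if circles is not None:
          for x, y, r in np.round(circles[0]).astype(int):
              # Next step in a DAS-like pipeline: crop this region and
              # match it against a database of known signs.
              print(f"candidate sign at ({x}, {y}), radius {r}")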

  • "IBM's 'Marvel' to Scour Net for Video, Audio"
    CNet (09/29/04); Kanellos, Michael

    The extension of search technology beyond text is illustrated by Marvel, a system from IBM designed to comb through thousands of hours of video and audio to retrieve specific clips. The ability of existing search engines to retrieve video clips or images is still text-based, and limited to a small portion of manually labeled files. Manual labeling is a laborious, time-consuming job that is becoming even more difficult with the rapid growth of information in need of classification. John R. Smith of IBM Research says the goal of the Marvel project is "to index content without using text or manual annotations." Marvel automatically categorizes and retrieves clips by assigning modifiers such as "outdoor," "indoor," "cityscape," or "engine noise" to the action depicted in the clip; the system employs a form of artificial intelligence called support vector machines, in which a computer is taught to tag pieces of data with the equivalent of a yes or no value (a toy version is sketched below). The amount of data condensed in even short video clips is enormous: The Marvel research team has recognized 166 distinct dimensions for color-based queries, and the system will scan both the audio and visual tracks to counterbalance the many other dimensions that must be ignored for Marvel to function efficiently. The project is a collaborative venture between IBM Research, libraries, and news organizations such as CNN; the first Marvel prototype, which was spotlighted in August at a Cambridge University conference, is capable of searching a database of over 200 hours of broadcast news video using 100 different descriptive labels, which IBM hopes to increase to 1,000 by April 2005. Earlier this year, Purdue University showcased a search engine that retrieves results from a 3D sketch, while other organizations are developing software that can search items within a limited range of subjects more efficiently.
    Click Here to View Full Article
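
    In miniature, the support-vector-machine step amounts to fitting one yes/no classifier per label. The toy scikit-learn sketch below tags invented descriptor vectors as "outdoor" or not; the three features stand in for the hundreds of color and audio dimensions Marvel extracts.

      from sklearn import svm

      # Each row is a made-up clip descriptor: [brightness, green_ratio, noise].
      features = [[0.9, 0.7, 0.2], [0.8, 0.6, 0.3],   # outdoor-looking clips
                  [0.3, 0.1, 0.6], [0.2, 0.2, 0.7]]   # indoor-looking clips
      labels = [1, 1, 0, 0]                            # 1 = "outdoor"

      classifier = svm.SVC(kernel="rbf")
      classifier.fit(features, labels)

      print(classifier.predict([[0.85, 0.65, 0.25]]))  # expect [1]: outdoor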

  • "Someday, Talking Computers Will Seem Cuddly as Teddy Bear"
    USA Today (09/29/04) P. 3B; Maney, Kevin

    Tech experts expect the public's attitude toward interactive voice response (IVR) technology, which is vilified almost universally, to undergo a dramatic reversal starting next year. They predict that IVR systems will significantly improve, eventually reaching a point where they are preferred over live agents. Among the organizations working to make this happen is TuVox, which touts technology that shuns the hierarchical, menu-driven IVR scheme in favor of a technique for deducing a caller's needs from words spoken conversationally (a crude version of the idea is sketched below). Software developed at the University of Southern California can detect frustration in the caller's voice, and could enable an IVR system to recognize that the customer is getting aggravated and transfer the call to a live agent. Both IBM and Microsoft have released speech software, with IBM making its product open source so that a massive community of developers can improve it. Also contributing to IVR's expected turnaround are the increasing affordability of high-powered computers and the continuing improvement of speech-recognition software. However, getting computers to generate natural-sounding speech on the spur of the moment is tricky. Forrester Research projects that the speech application sector will become a multibillion-dollar industry by 2008--and even further ahead, IVR technology might become so affordable that companies could start offering cell phones with built-in conversational software.
    Click Here to View Full Article
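
    In outline, the non-menu approach attributed to TuVox replaces "press 1" hierarchies with inference over free-form words, escalating when frustration is detected. The Python sketch below is a deliberately crude keyword version of both ideas; the intent vocabularies and frustration cues are invented.

      INTENTS = {
          "billing": {"bill", "charge", "payment", "invoice"},
          "support": {"broken", "error", "crash", "help"},
      }
      FRUSTRATION = {"agent", "human", "operator", "ridiculous"}

      def route(utterance):
          words = set(utterance.lower().split())
          if words & FRUSTRATION:
              return "live agent"        # escalate before the caller hangs up
          for intent, vocab in INTENTS.items():
              if words & vocab:
                  return intent          # first matching intent wins
          return "clarifying question"   # ask again instead of replaying a menu

      print(route("there is an error on my bill"))   # -> billing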

  • "Tim Berners-Lee: Weaving a Semantic Web"
    Digital Divide Network (09/30/04); Carvin, Andy

    World Wide Web inventor Tim Berners-Lee said his intention was to include embedded machine-readable data in the Web, but that contextual aspect was ultimately left out of the official specifications; the Semantic Web fills that gap, Berners-Lee said during a keynote at the MIT Technology Review Emerging Technologies conference. Even as it is now, the Web is fulfilling its role as a collaborative medium that inspires creativity, as blogs and wikis attest. One of the things Berners-Lee wanted to do with the original Web specifications was to type links according to the relationship between Web sites, adding a dimension of personal or professional relationship to the Web's structure (an RDF version of such typed links is sketched below). With the Semantic Web, every piece of Web site data would be linked to a URI so that information would not be static, but automatically updated and linked to other data on the Web. The Semantic Web will be tremendously tangled, but Berners-Lee said that connectedness will help a number of projects, including artificial intelligence programs, online translators, and other programs requiring descriptive data. Powerful social networks could arise on top of Semantic Web data. In the future, computers will automatically have relevant Web information ready for users according to their needs. Berners-Lee said royalty-free patents were a crucial aspect of successful Web development. Interestingly, he evangelized the early Web concept on alternative newsgroups instead of the groups used by fellow scientists, where people such as Netscape founder Marc Andreessen picked up on the idea.
    Click Here to View Full Article
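
    The "typed link" Berners-Lee wanted is precisely what RDF supplies: a link that carries a machine-readable relationship. The sketch below, using Python's rdflib and the FOAF vocabulary, types one link as "maker" and another as "knows"; the URIs are illustrative.

      from rdflib import Graph, Namespace, URIRef

      FOAF = Namespace("http://xmlns.com/foaf/0.1/")
      g = Graph()

      site  = URIRef("http://example.org/")
      alice = URIRef("http://example.org/people/alice")
      bob   = URIRef("http://example.org/people/bob")

      g.add((site, FOAF.maker, alice))   # the site-to-Alice link is typed "maker"
      g.add((alice, FOAF.knows, bob))    # a personal relationship, not a bare href

      print(g.serialize(format="turtle"))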

  • "Fifth Publication of the UN ICT Task Force Series"
    CircleID (09/28/04)

    The United Nations Information and Communication Technologies (ICT) Task Force recently published a collection of papers that were submitted for its Global Forum on Internet Governance meeting, held in New York in March of this year. The collected works, published under the title "Internet Governance: A Grand Collaboration," represent the best of about 30 papers submitted by various members of the international community, including stakeholders, Internet practitioners, and independent experts. The papers (located at http://www.unicttaskforce.org/perl/documents.pl?id=1392) were submitted at the urging of the Task Force Secretariat, with the idea that the exchange of ideas would provide guidance to the United Nations Secretary-General's Working Group on Internet Governance. The topics covered by the papers pertain to issues found on the Web site of the ICT Task Force, including public policy, development, technical, and regulatory issues. The collection is divided into six themed sections: Understanding the Challenge; The Evolution of the Internet Governance Debate; Frameworks and Definitions; Public Policy Issues; Technical Issues; and The Way Ahead. Each themed section is broken down into three categories--Background Papers, Comments, and Stakeholder Perspectives. Papers of a technical or esoteric nature have been edited to make them accessible to a wide international audience, while papers that deal with Internet governance issues in a more general sense are presented in full.
    Click Here to View Full Article

  • "Saluting the Data Encryption Legacy"
    CNet (09/27/04); Schneier, Bruce

    Counterpane Internet Security CTO Bruce Schneier acknowledges the debt that the field of cryptography owes to the publication of the Data Encryption Standard (DES), the first freely available encryption algorithm, about 30 years ago. DES, which the National Institute of Standards and Technology (NIST) proposed retiring as an encryption standard last month, was submitted to NIST--then called the National Bureau of Standards--by IBM, and was designated the government's standard encryption algorithm for "sensitive but unclassified" traffic in 1976. The National Security Agency (NSA) added a tweak to the algorithm that was not publicly disclosed, and reduced the key size by over 50 percent; these actions provoked an outcry, but also spurred research to either break DES or understand the tweak, which in turn gave birth to the modern academic discipline of cryptography. Schneier recounts that in 1997 NIST solicited a replacement standard for DES, one year before academic cryptographers built a machine capable of brute-forcing a DES key within a few days. Ten countries submitted 15 standard proposals in response to the NIST solicitation, with Belgium's Rijndael algorithm eventually selected as DES' successor, the Advanced Encryption Standard (AES); both algorithms are exercised in the short example below. Schneier doubts that AES will become as pervasive as DES, given how much the cryptography landscape has changed in the last three decades. Still, he acknowledges that "A NIST standard is an imprimatur of quality and security, and vendors recognize that." Schneier notes that the NSA is still ahead of state-of-the-art academic research thanks to a bigger stable of mathematicians who have been working on the problems longer, although he writes that academics are closing the distance.
    Click Here to View Full Article
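
    For a concrete feel for the retiring and incoming standards, the sketch below encrypts one block with each using the PyCryptodome library; the keys and messages are throwaway values, and ECB mode is used only to keep the example short (it is not safe for real data).

      from Crypto.Cipher import AES, DES

      des = DES.new(b"8bytekey", DES.MODE_ECB)          # 56 effective key bits
      aes = AES.new(b"sixteen byte key", AES.MODE_ECB)  # 128-bit Rijndael key

      print(des.encrypt(b"8bytemsg").hex())             # one 64-bit DES block
      print(aes.encrypt(b"16-byte message!").hex())     # one 128-bit AES block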

  • "Taxicabs and Railroads: A New Approach to Building Adaptive Information Systems"
    Computerworld (09/27/04); Sapir, Jonathan

    InfoPower Systems' Jonathan Sapir advocates "a completely fresh approach to [software development] methodology" if companies are to build adaptive information systems that fulfill the functionality, flexibility, and time-to-market needs of modern business. He compares the traditional software development model to the construction of railroad systems, which are characterized by rigidity in the form of fixed plans, fixed rails and stations, and predetermined schedules: Customers must adapt to the system rather than the other way around. Sapir notes that the opposite applies to the methodology of taxicab companies, where the organization must adapt in real time to the customer, whose plans are rarely predetermined; this is the methodology he proposes as the new software development paradigm. Sapir argues that the business world is on the cusp of transitioning from the railroad model to the taxicab model, and this transformation "will make users responsible for automating their own jobs in ways that make sense to them; they will be able to 'package' their expertise and make it available as a service over the Web; and they will be able to synchronize these services with other services to achieve larger, more complex business objectives." The necessity and viability of the new approach is being driven by a convergence of technology and worldwide political, social, and economic developments. The emergence of Web services has given the software industry an open, universal standard for creating and assembling basic functionality components (a minimal service of this kind is sketched below), while users are becoming increasingly IT-competent and responsible for effectively managing personal information services. Another trend Sapir points to is the growing obsolescence of existing modes of software assembly, fueled by the increasing rate of change.
    Click Here to View Full Article
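
    An era-appropriate sketch of Sapir's "package your expertise as a service" idea, using the XML-RPC support in Python's standard library; the function, port, and pricing rule are invented for illustration.

      from xmlrpc.server import SimpleXMLRPCServer

      def quote_shipping(weight_kg):
          # One user's packaged expertise: a shipping-cost rule of thumb.
          return 4.00 + 1.25 * weight_kg

      server = SimpleXMLRPCServer(("localhost", 8000))
      server.register_function(quote_shipping)
      server.serve_forever()

      # Another user's "taxicab" client can now compose this service:
      #   import xmlrpc.client
      #   proxy = xmlrpc.client.ServerProxy("http://localhost:8000")
      #   total = proxy.quote_shipping(3.5) + insurance_fee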

  • "Computer Browsers: Virtual Tourists Are Helping the Swiss to Plan Their Landscape"
    Economist (09/16/04) Vol. 372, No. 8393, P. 87

    The Swiss government is relying on computer models of the Alps trekked by virtual tourists to determine whether subsidizing farmers to graze their cows in the mountains will result in better views. Government officials want to know how much views will be improved by having cows eat young trees, and where the most value will be gained. Researchers at the Swiss Federal Institute of Technology in Zurich tout the use of virtual tourists, or autonomous agents, as a more efficient way to gauge the impressions of real visitors. By using the computer model with agents programmed to act like real tourists, Swiss officials do not have to pay real people to walk the mountain routes over and over again. Moreover, Dr. Kai Nagel and his colleagues are able to alter the electronic landscape, such as by felling trees to create an alpine meadow, growing an instant forest, or adding a cable car to a certain plateau, and then assess whether a large number of agents take advantage of the change. The next step is to test the accuracy of the best routes and views by having real hikers explore the simulation, and then use their feedback to program the virtual tourists to act more realistically (a toy version of the agent simulation is sketched below).
    Click Here to View Full Article
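
    In miniature, the approach works like the Python sketch below: software agents wander a gridded landscape, preferring cells with better views, and planners read off where visits concentrate after a simulated change. The movement rule, grid, and scores are invented stand-ins for the Zurich model.

      import random

      WIDTH, HEIGHT, STEPS, AGENTS = 20, 20, 200, 50
      view_score = [[random.random() for _ in range(WIDTH)] for _ in range(HEIGHT)]
      visits = [[0] * WIDTH for _ in range(HEIGHT)]

      def neighbors(x, y):
          cells = [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
          return [(cx, cy) for cx, cy in cells if 0 <= cx < WIDTH and 0 <= cy < HEIGHT]

      for _ in range(AGENTS):
          x, y = random.randrange(WIDTH), random.randrange(HEIGHT)
          for _ in range(STEPS):
              # Tourists mostly climb toward better views, with some noise.
              x, y = max(neighbors(x, y),
                         key=lambda c: view_score[c[1]][c[0]] + 0.3 * random.random())
              visits[y][x] += 1

      # A planner reruns this after editing view_score (e.g., raising it where
      # trees were felled) and compares where the crowds end up.
      hotspot = max(((cx, cy) for cy in range(HEIGHT) for cx in range(WIDTH)),
                    key=lambda c: visits[c[1]][c[0]])
      print("most-visited cell:", hotspot)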

  • "Optical Network Test Beds Blooming"
    Federal Computer Week (09/27/04) Vol. 18, No. 34, P. 52; Perera, David

    All-optical network test beds have grown over the past year thanks to expanding awareness of wide-ranging applications for high-performance computing, increasing availability of fiber-optic cable, new technology, and government funding. "The technology and the need and the ability to pay for it are coming together fortuitously, and that fortuitousness makes it a prime target for research," reports Grant Miller with the National Coordination Office for Information Technology Research and Development. The National Science Foundation (NSF) has apportioned grants to a pair of optical network test bed research projects: $6.7 million is going to a four-year, three-university effort to develop generalized multiprotocol label switching optical networks that deliver dynamic resource allocation, and $3.5 million has been awarded to a three-year initiative led by the University of Virginia to create a circuit-switched high-speed end-to-end transport architecture to support numerous e-science projects. Kevin Thompson, program director for the NSF's National Middleware Initiative, notes that burying fiber in the ground to support specific projects is impractical, while the simultaneous construction of the testing sites will foster close collaboration and prevent researchers from "reinventing the wheel 20 different times." Bill Wing with the Oak Ridge National Laboratory's Networking Research Group says scientists are scrambling to become proficient in end-user photon networking because of impending applications that will need to handle petabyte-scale files. Wing predicts that high-end computer users will be able to access all-optical networks in several years, while National Coordination Office for Information Technology Research and Development director David Nelson believes commercial use of optical networking could be widespread in approximately a decade.
    Click Here to View Full Article

  • "What Do Developers Want?"
    InfoWorld (09/27/04) Vol. 26, No. 39, P. 38; McAllister, Neil

    The results of this year's InfoWorld Programming Survey show that IT organizations are continuing to emphasize integration and consolidation rather than invest in new initiatives, though there are signs that this trend may reverse in the next few years. Though 33 percent of respondents say their organizations already have an enterprise application consolidation project underway, just 9 percent report that such projects are planned for the next 12 months, and 7 percent have them planned for the next 24 months; similarly, 26 percent say their companies are already replacing legacy mainframes and 24 percent are focused on augmenting their mainframe apps for modern architectures, yet only 5 percent plan to start such projects in the next year. Java is the top programming language for 64 percent of respondents, and 48 percent expect to boost their usage of Web scripting languages over the next 12 months. The Perl scripting language has more adherents than C, while the Python object-oriented scripting language experienced the most growth in popularity compared to last year's poll. Microsoft .Net is the preferred framework or application programming interface for 53 percent of respondents, and 51 percent project increased .Net usage over the next year. But though there was a 6 percent increase in C# users over last year, almost twice as many respondents are using Visual Basic, indicating that the installed Windows base is the chief source of growth in .Net adoption. Seventy-one percent expect to ramp up XML usage in the next 12 months, and 59 percent either already have Web services projects underway or plan to start using Web services technologies in the next year. The results of this year's survey mirror last year's tendency among developers to leverage existing competencies.
    Click Here to View Full Article

  • "Tech's Future"
    Business Week (09/27/04) No. 3901, P. 82; Hamm, Steve; Kripalani, Manjeet; Einhorn, Bruce

    The maturation of affluent tech markets is spurring technology companies to court customers in emerging markets in China, Brazil, India, Russia, and elsewhere, with the underlying effect being a fundamental transformation of the industry. The opportunity these markets represent is rooted not just in the vast numbers of poor, underserved people in rural areas, but growing populations of middle-class consumers. The domination of Third-World economies by well-entrenched Western companies and brands is not guaranteed, as local up-and-comers are establishing footholds, while increasing support for open-source software in developing countries is another potential obstacle. Eastern companies are expected to ascend in the new economic paradigm as wireless technology overtakes PC-centric products. These pressures are forcing the Western powers to formulate new business approaches and rethink product design, and this has led to a veritable innovation explosion from companies both old and young. C.K. Prahalad, business professor at the University of Michigan Business School, says, "What's required is a fundamental rethinking of how to design products and make money." Devices and software will not make a splash in the developing world unless they fall in price and become easier to use, durable in harsh environments, and more compact, while still being feature-rich. Paramount to the successful targeting of innovations in these markets is a deep understanding of peoples' needs, a goal many companies are aggressively pursuing through ethnographic studies and experimentation with new products and services in Third-World settings. Products that are modified for use in emerging markets can often prove useful and popular elsewhere. Some examples include the handheld Simputer device now being sold in India; a phone preloaded with the Koran that alerts users to prayer times and can point them toward Mecca that is aimed at the world's 1.4 billion Muslims; and e-Town, an Internet access solution for rural Chinese towns.
    Click Here to View Full Article


 