Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 6, Issue 610:  Wednesday, February 25, 2004

  • "Cyber-Terrorism Warning Sounded"
    Los Angeles Times (02/25/04) P. A12; Marino, Jon

    Experts and federal officials testifying yesterday before the Senate Judiciary subcommittee on terrorism, technology, and homeland security warned that terrorists could heighten the devastating effects of physical attacks by launching cyber-attacks against the systems that manage transportation, basic utilities, and emergency services in the United States. Former Marine Corps intelligence officer Dan Verton harbored no doubts that such an attack would occur sooner or later, telling the subcommittee that Al Qaeda is well organized and dedicated to undermining the U.S. economy by hacking encryption algorithms and penetrating major corporations' technological systems. Subcommittee Chairman Sen. Jon Kyl (R-Ariz.) noted, "We've seen reports that Al Qaeda has explored the possibility of damaging some of our key computer systems, seeking to cripple electric power grids, transportation systems, even financial institutions." He observed that the ranks of Al Qaeda sympathizers are swelling with young people with computing skills who could leverage those talents against the U.S. cyberinfrastructure. Kyl also found fault with the Homeland Security Department's anti-cyberterrorism initiative, citing a December 2003 report concluding that the speed of computer virus proliferation has more than doubled since 2001. Verton outlined a worst-case scenario in which a region the size of five U.S. states and three Canadian provinces could lose electric power for months, noting that government and private-sector officials are not fully knowledgeable of their connected infrastructure. FBI deputy assistant director Keith Lourdeau reported that federal agencies are collaborating with Japanese, African, Australian, Canadian, German, Russian, Romanian, and British authorities to address Internet-based security threats worldwide.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Controversial Government Data-Mining Research Lives On"
    Associated Press (02/23/04); Sniffen, Michael J.

    The Total Information Awareness (TIA) program may have been killed by congressional decree, but key elements of the program have survived at other intelligence agencies, according to congressional, federal, and research officials. TIA's goal was to employ data mining to sift through public and private databases to track terrorists, which stirred up fears that the program would be used to spy on millions of innocent Americans. Congressional officials have not disclosed which TIA programs were eliminated and which were retained, but insiders report that TIA's Evidence Extraction and Link Discovery projects, collectively encompassing 18 data-mining initiatives, are among the surviving components. The continuance of certain projects originally falling under the auspices of TIA has led Steve Aftergood of the Federation of American Scientists to conclude that Congress' decision to disband TIA was nothing more than "a shell game." Despite the death of TIA, Capitol Hill is still paying for the development of software designed to collect foreign intelligence on terrorists: a $64 million research program run by the Advanced Research and Development Activity (ARDA), which has employed some of the same researchers as TIA, was left untouched by Congress. The goal of ARDA's "Novel Intelligence from Massive Data" effort is to develop software that can cull information from petabytes of data, and ARDA says that it complies with privacy laws. Sen. Ron Wyden (D-Ore.) is pressuring the executive branch to disclose details of its data-mining projects to Congress, and recently urged a Pentagon advisory panel to propose laws on reviewing data.
    Click Here to View Full Article

  • "RSA Panel: Cryptography Can't Foil Human Weakness"
    eWeek (02/24/04); Hachman, Mark

    Four panelists invited to discuss cryptography with Counterpane Internet Security CTO Bruce Schneier at the RSA Conference devoted most of their attention to information security failures, and concluded that the most advanced cryptographic solutions can be undone by the user, who is the weakest link in the security chain. Cryptography Research President and panelist Paul Kocher reported that the only apparent consumer privacy enforcement solution, government regulation, is a terrifying prospect, noting that corporations and law enforcement agencies do not have a strong incentive to uphold consumers' privacy. The panelists agreed that government regulation implies the creation of new laws that usually muddle and impede information flow. The panelists posited that the challenge security experts currently face has more to do with the complexity of standards and government rules than the complexity of cryptosystems. MIT computer science professor Ron Rivest offered the controversy that erupted over the insecurity of Diebold Election Systems' electronic voting machine code as an example, pointing out that independent confirmation of results was not permitted until a grassroots movement stumped for the inclusion of paper ballots. Sun Microsystems chief security officer Whitfield Diffie said a fight is brewing over the definition and deployment of digital rights management, explaining that copyright owners are starting to dictate consumer usage of products. Adi Shamir of Israel's Weizmann Institute of Science said that stream ciphers used to encode real-time data streams are becoming obsolete, and a more powerful solution lies in employing microprocessors to encode data in blocks through block ciphers.
    Click Here to View Full Article

    To read about ACM's concerns regarding e-voting and security, visit http://www.acm.org/usacm
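    Shamir's contrast between stream and block ciphers can be sketched with a toy example. This is purely illustrative and NOT secure cryptography: the keystream generator and "round function" below are stand-ins invented for the sketch, and real systems should use vetted block ciphers such as AES through an established library.

```python
# Toy illustration only -- NOT secure cryptography.

def toy_stream_cipher(data: bytes, key: int) -> bytes:
    """Stream cipher sketch: a key-derived keystream is XORed byte by byte."""
    keystream = []
    state = key
    for _ in range(len(data)):
        # Stand-in keystream generator (a simple LCG, for illustration only).
        state = (1103515245 * state + 12345) % 2**31
        keystream.append(state & 0xFF)
    # XORing twice with the same keystream restores the plaintext.
    return bytes(b ^ k for b, k in zip(data, keystream))

def toy_block_cipher(data: bytes, key: int, block_size: int = 8) -> bytes:
    """Block cipher sketch: data is padded, then transformed one block at a time."""
    pad = block_size - len(data) % block_size
    data += bytes([pad]) * pad  # PKCS#7-style padding to a whole block
    out = bytearray()
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        # Stand-in "round function": XOR each byte with a key-derived value.
        out.extend(b ^ ((key >> (j % 4) * 8) & 0xFF) for j, b in enumerate(block))
    return bytes(out)

msg = b"attack at dawn"
ct1 = toy_stream_cipher(msg, key=0xCAFE)
assert toy_stream_cipher(ct1, key=0xCAFE) == msg  # stream cipher is its own inverse
ct2 = toy_block_cipher(msg, key=0xDEADBEEF)
assert len(ct2) % 8 == 0                          # ciphertext is whole blocks
```

    The structural difference is the one Shamir alluded to: the stream cipher's security rests entirely on its keystream, while the block cipher applies one keyed transformation to fixed-size blocks, a design better suited to modern microprocessors.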

  • "Tech Jobs: Research and Development"
    NewsFactor Network (02/22/04); Hill, Kimberly

    Demand for research and development personnel has never waned, despite the turbulence of the American job market. "R&D continues to be a strong area for jobs for cream-of-the-crop kinds of people," asserts Challenger, Gray & Christmas CEO John Challenger, who also reports that demand is increasing for processor chips, wireless hardware, storage, and memory. "More capital is being spent on new technologies, and more companies are going to try to get at that revenue by providing a product with an edge in an area where there's real strength," he notes. The U.S. Department of Labor's Occupational Outlook Handbook lists computer-software engineering as the fastest growing occupational field, and expects this momentum to continue through the end of the decade. Government data estimates that an applications software engineer earns a median annual salary of $67,670, while the highest-paid 10 percent with that job title make upwards of $106,680; systems software developers are squeezing out a bit more pay. The median annual salary for computer-hardware engineers is $67,300. Deloitte senior manager David Parent concludes that the need for the highest quality R&D talent extends across all U.S. industries, not just the computer hardware and software sectors. Opportunities to work with cutting-edge technologies and well-defined career paths can matter even more than salary to prospective R&D employees. This adds up to fierce competition among companies for R&D staff, explains Parent.
    Click Here to View Full Article

  • "Immigration & Innovation"
    InformationWeek (02/23/04); McGee, Marianne Kolbasuk; Chabrow, Eric

    The United States is no longer accepting H-1B applications, claiming that the 65,000 annual cap has already been reached. Experts worry that such actions will make the United States a less attractive option for foreign-born IT workers, forcing them to go elsewhere to develop new technologies, which could prove detrimental to the American economy. U.S. employers argue that they need foreign tech talent, especially in specialized areas, to address a shortage of domestic talent. Countering this argument are people like health-care company IT manager Adrian Williams, who contend that the domestic shortage will only get worse if companies elect to hire cheaper foreign labor, taking job opportunities away from recent U.S. college graduates. Jim Wilson, a systems engineer for San Mateo County, Calif., says educational shortcomings are the chief culprit behind the lack of qualified U.S. tech workers: Grade schools need to more heavily promote science and math, while community colleges must offer students the latest programming skills. Wilson adds that U.S. companies have "a moral obligation" to hire American employees first. There are also those who think the United States will maintain its status as the first choice for most foreign-born technologists and scientists, despite the H-1B limit. AnnaLee Saxenian, recently appointed dean of the University of California at Berkeley's School of Information Management and Systems, says U.S. capital markets are unmatched for their openness, while the country also boasts the most favorable entrepreneurial atmosphere in the world.
    Click Here to View Full Article

  • "Is Broadband Set to Make Power Lines Sing?"
    CNet (02/24/04); Hu, Jim

    The FCC has opened the door for electricity companies to offer broadband over power line (BPL) Internet access, and this move potentially adds electricity companies to the ranks of broadband providers, which currently include phone and cable companies. BPL would also help solve the last-mile gap hindering fuller broadband deployment across the country, since the electricity grid reaches more homes than cable or phone. EarthLink said it would begin a 500-home trial of BPL technology in North Carolina in conjunction with local utility Progress Energy; that deployment will be a real-world test of technology that has already proven viable in the laboratory, according to EarthLink officials. Packet-based broadband signals will be delivered to Wi-Fi stations, broadcast wirelessly, and then picked up by wireless broadband routers installed in customers' homes. EarthLink's Kevin Brand says BPL promises to be price-competitive with cable and DSL broadband services. Previous BPL pilots include well-financed attempts such as Nortel and United Utilities' Nor.Web program in the United Kingdom in the late 1990s, but that project failed because of cost and technical glitches such as lampposts that picked up and rebroadcast signals. The FCC decision to open BPL is only preliminary, and specifics about how utility companies use their wires to carry Internet data will be decided by local public utilities commissions; even if BPL proves successful in the field, it will still face serious regulatory battles before reaching the mass market. One of the technology's biggest unresolved issues is signal interference, particularly with radio signals. American Radio Relay League President Jim Haynie says, "Any time you put a signal on top of a metallic object such as a power line, it's going to radiate and I'm going to hear it. The industry has not addressed the reception problem."
    Click Here to View Full Article

  • "High-Tech R&D: Too Vital to Outsource?"
    E-Commerce Times (02/23/04); Diana, Alison

    A 2003 study by AMR Research found that 50 percent of research and development units in the automotive, aerospace, defense, and high-tech industries had no plans to outsource IT support, while only 4 percent of tech companies said they outsource R&D. AMR's Lance Travis reports that most companies do not want to outsource R&D functions because "they consider R&D the crown jewels of the company and don't want to lose them."
    He adds that to keep security risks down, most companies opt for a global delivery model in which R&D operations are split between onshore and offshore services. Stout Systems President and owner John Stout points out that offshoring software development often does not yield the promised cost savings: "The requirements-gathering and design work is almost exclusively done at home, because it is found that these functions do not outsource overseas effectively." Still, Bain & Co. VP Tom Banning says that his firm believes companies could transfer more than 40 percent of their R&D activities overseas. Furthermore, the increasing sophistication of technology and business processes could make offshore outsourcing even more palatable to U.S. companies. For the moment, the lion's share of R&D work in the United States appears to be stable. OOCenter.com managing editor Allen Stern takes note of other IT jobs that may be safe from offshoring, such as Web design and positions that require direct contact with clients.
    Click Here to View Full Article

  • "Real Genius"
    Federal Computer Week (02/23/04); Anderson, Tania

    Federally funded research and development projects spurred by terrorism concerns are expected to debut as new technologies this year. The Defense Advanced Research Projects Agency's (DARPA) $2.2 million Software for Distributed Robotics program has yielded Centibots to assist in search and rescue missions and detect concealed enemies. Centibots, which are being developed by SRI International, ActivMedia Robotics, the University of Washington, and Stanford University, "are each individually autonomous, and further, they can act together in a team so they can make trade-offs, exchange information and help each other," notes Charlie Ortiz of SRI's Artificial Intelligence Center. The Centibots are to be demonstrated to DARPA officials in situations that involve the machines mapping out environments for search and rescue operations and detecting hazardous chemicals and biological agents in buildings. DARPA's Compact Aids for Speech Translation project, formerly known as Babylon, has set as its goal the development of handheld translators for use by soldiers and medical teams stationed abroad, while domestic emergency services are also in talks to employ the technology. First-generation translators, also known as Phraselators, were PDAs that held several thousand English phrases in different languages; second-generation devices used phrase-based translation software that facilitated a two-way conversation between an English speaker and a speaker of Pashto; the third-generation translator currently under development will support two-way, spontaneous translation. Meanwhile, with the events of Sept. 11, 2001 sparking fears that cargo containers shipped to the United States could secretly hold dangerous materials, security officials have sponsored research into advanced chemical and biological detection devices; CACI International is working on a system for monitoring container content via ultra-wideband data signals.
    Click Here to View Full Article

  • "Grid Forum Backs Utility Computing Standards"
    IT Management (02/24/04); Shread, Paul

    The Global Grid Forum (GGF) is supporting the Distributed Management Task Force's (DMTF) initiative to draft utility computing standards. "The collaboration will deliver the usability the industry requires, and provide standards that capitalize on existing efforts to deliver the management capabilities that will be essential to creating the tools and frameworks necessary for utility computing," says Argonne National Laboratory senior fellow and GGF Chairman Charlie Catlett, who adds that GGF will hold the first conference of an "Enterprise Grid Requirements" research group at the upcoming GGF 10 in March. Standards activity related to Grid, Web services, and utility computing has increased in the last couple of months; such activities include initiatives to more closely align Grid and Web services and build a data center language standard. "The efforts are indeed complementary, and where we have found intersection of activities we have created high-bandwidth liaison activities," Catlett reports. He notes that GGF boasts a Common Management Model working group that was founded by DMTF participants, and one of its goals is a commingling of distributed systems management and Grid/utility computing. Catlett calls the WS-Resource Framework (WSRF) initiative to turn core elements of GGF's Open Grid Services Infrastructure (OGSI) standard into a series of Web services specs the clearest indicator of convergence. He assumes that OASIS will standardize the WSRF specs, with the GGF OGSI working group functioning as liaison. Catlett also expresses interest in the Data Center Markup Language effort, and says he is considering how he can include it in grid projects he is undertaking independent of GGF.
    Click Here to View Full Article

  • "Finding a Way to Fry Spam"
    CNet (02/24/04)

    Independent Internet and email consultant John Levine, who co-chairs the Internet Research Task Force's Anti-Spam Research Group and is a member of the Coalition Against Unsolicited Commercial Email, argues that curbing spam must be the responsibility of both users and ISPs, noting that anti-spam technologies must be complemented by legislation in order for there to be an effective spam solution. He explains that recipients rather than senders bear the costs of processing emails, and observes that the proliferation of spam has two negative consequences: It forces people to install increasingly intrusive spam filters that are more and more likely to block legitimate messages, and it reduces the appeal of email in general. Levine points out that anti-spam legislation such as the Can-Spam Act is ineffective because it does not support an effective enforcement model--under Can-Spam, only state attorneys general, not individual users, can take legal action against spammers, while the FTC also has the authority to sue but lacks the funding to do so. Levine says that although spamming is not technically unlawful, it is employed to carry out unlawful activities such as phishing, in which users are fooled into giving away sensitive personal information by answering emails that purport to be from trusted senders, when in fact they are not. He adds that most spam is delivered by worms and viruses such as MyDoom, which hijack vulnerable computers and turn them into spam launching platforms; he comments that ISPs should take a more proactive approach to this problem, first by monitoring and counting the amount of email being sent by their users, and then quarantining infected systems, alerting users to the problem, and making corrective recommendations. Levine also notes that users must be made more aware that spam filters can block important email as well as spam, especially if the spam comes from U.S. ISPs.
    Click Here to View Full Article
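    The ISP-side countermeasure Levine describes -- counting outbound mail per customer and quarantining the outliers -- can be sketched as follows. The threshold, log format, and function names are assumptions invented for illustration, not any ISP's actual practice.

```python
from collections import Counter

OUTBOUND_THRESHOLD = 500  # hypothetical messages-per-hour limit before flagging

def flag_suspected_zombies(send_log, threshold=OUTBOUND_THRESHOLD):
    """send_log: iterable of (customer_id, recipient) tuples for one hour.

    Returns customer IDs whose outbound volume suggests a hijacked machine.
    """
    counts = Counter(customer for customer, _ in send_log)
    return sorted(c for c, n in counts.items() if n > threshold)

# A normal user sends a handful of messages; an infected host sends thousands.
log = [("alice", f"friend{i}@example.org") for i in range(12)]
log += [("zombie-pc", f"victim{i}@example.net") for i in range(2000)]
print(flag_suspected_zombies(log))  # ['zombie-pc']
```

    In practice the flagged accounts would then be quarantined and their owners notified, as Levine suggests, rather than simply cut off.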

  • "It's About Connectivity Not the Internet!"
    CircleID (02/23/04); Frankston, Bob

    Popular thought about the Internet is so muddled that most people do not understand the opportunities lost when discussing Internet governance, media consolidation, and new carrier services, argues Bob Frankston. The word "Internet" no longer denotes the transport of bits, but includes the meaning of the bits as well, and this confusion masks the fact that broadcast, cable, cellular, and telephone companies are all basically using the same Internet protocols, or connectivity, while connecting to the customer in different ways. In order to hold onto their customers, media and carrier companies are trying to impede natural innovation from smaller players, notes Frankston. Recent reporting in The New York Times, for example, failed to recognize the common theme of three stories run simultaneously, and on the same page; the stories dealt with the FCC decision on VoIP, a Comcast acquisition of Disney, and the buyout of AT&T Wireless. Frankston contends that the reporter did not address the issue that VoIP is fundamentally a technology, not a service, and should not be under FCC purview except that it deals with the legacy phone network. Similarly, the Comcast-Disney combination is not about convergence, but about broadcast and media giants attempting to limit the opportunities of new experimental players entering the market. Reporting on the Internet itself similarly focuses on governance and technological issues that put responsibility in the hands of gatekeepers instead of end users; peer-to-peer communities have countered this centralization by building on top of the existing Internet and treating the Internet as a routing service instead of a layer. The author recommends that innovators build upon simple connectivity instead of on centrally controlled aspects such as IP addresses, and adds that when the press is adequately knowledgeable about this distinction, it will be able to accurately inform users about lost opportunities and needed remedies.
    Click Here to View Full Article

  • "It's Time to Talk Mobile Phone Security"
    InternetNews.com (02/19/04); Graff, Mark G.; vanWyk, Kenneth

    The mobile phone industry needs to address the issue of security, thanks to the migration of bugs and digital attacks into that arena. AL Digital's recent study of mobile phones lists Bluetooth-related vulnerabilities in some phones that could allow an attacker to read and write to a phone's stored information. Mobile phone manufacturers do not yet have the infrastructure needed to respond to vulnerabilities properly, though by adding Internet access to their devices they have taken on that responsibility. Operating system developers have already established mechanisms to fix bugs and to announce and distribute the fixes; mobile phone makers and manufacturers of embedded systems should do the same, write cyber security researcher Mark G. Graff and information security professional and author Kenneth vanWyk.

  • "Are You Ready for MDA?"
    Software Development (02/20/04); Ambler, Scott W.

    Model-Driven Architecture (MDA) is touted as applicable to a wide range of development tasks by vendor-backed organizations and modeling experts, but other software development thinkers contend that MDA has serious flaws. MDA relies on complex modeling tools to transform platform-independent models into platform-specific models and then finally into the end-result working system. All the models are built using Unified Modeling Language (UML) from the Object Management Group, which also created MDA. Despite the advertised benefits of MDA tools, organizations need to carefully examine their needs and resources before attempting to use the architecture: First, only a very small percentage of software developers are adept modelers, meaning that organizations either have to train existing developers or hire expertise, which is likely to be in high demand; second, abstract models and UML are not well understood by business stakeholders and difficult to learn, which prevents users from being actively involved in system specification, as opposed to the simpler tools and techniques advocated in user-centered design. MDA technology itself may prove a problem, as a preferred tool may not work on the platform at hand, and is really not appropriate for business application development. Additionally, MDA tools at this point almost always lose data when sharing models between them, thus leading to vendor lock-in. Finally, MDA does not have a test-driven approach similar to test-driven development used in programming-based development. These limiting factors mean MDA is actually useful for just a small number of companies that have adequate skills and job requirements; Agile Model-Driven Development, which does not rely on complex modeling tools but requires thoughtful planning, is a much better model-based approach.

  • "Apache Rewrites License"
    SD Times (02/15/04) No. 96, P. 1; Rubenstein, David

    Starting March 1, all Apache projects must employ a rewritten version of the Apache license that eases reuse across projects, increases the license's compatibility with the GNU General Public License (GPL), and bolsters safeguards against patent infringement claims. "We want to make sure open-source projects are as protected as possible," explains Apache Software Foundation co-founder Roy T. Fielding, who notes that the scope of use for Apache software remains the same. The Apache license has been made more generic so that Apache code reusers can use the same files without changing the licenses; Fielding explains that mentions of Apache software have been removed, while users can also insert their own liability and warranty protections. One license revision requires that contributors who consciously encroach on their own patents in a contribution must provide a license shielding Apache and third-party users from claims, and Fielding points out that termination would ensue on any patent licenses granted under the terms of the Apache license in cases where this type of patent infringement results in litigation. In the event that contributors unwittingly offer patented software, Apache would remove the code once it has been notified. Certain members of the open-source community wanted the Apache license to be more closely aligned with the GNU GPL, which mandates that any changes to the code must be released back into the community. Fielding says that Apache more highly values the community than the code, since the code changes every several years as the result of technological innovations. "We think contributions should be freely given and not coerced," he asserts.
    Click Here to View Full Article

  • "Software"
    Business Week (03/01/04) No. 3872, P. 84; Baker, Stephen; Kripalani, Manjeet; Hof, Robert D.

    The dreams of prospective U.S. software developers have been tempered in the last several years as more and more jobs have been outsourced to cheaper overseas labor, and Silicon Graphics CEO Robert R. Bishop concludes that American programmers "are competing with everyone else in the world who has a PC." The spread of programming and other tech jobs outside the United States is threatening America's leadership position in the global tech economy, but optimists say offshoring will actually benefit the economy by accelerating innovation and fostering growth in other industries. On the other hand, there are clear concerns that the falloff of U.S. tech students--which may increase as a result of fewer job opportunities--could lead to an unhealthy reliance on foreign talent, which could in turn backfire if foreigners decide to return to their native countries to start their own businesses. Achieving job stability in such a turbulent market requires more than just computing skills: People skills and project management skills are gaining importance, and universities may need to overhaul their computer-science programs to better equip students for jobs that demand these qualities. On the other side of the ocean in major outsourcing centers such as India, tech graduates face their own challenges, one of the biggest being setting up a homegrown venture industry so they can start their own companies. Leading Indian tech graduates are chiefly relying on working for U.S. companies in India so they can avail themselves of the latest research and more creative programming. However, the United States outclasses offshore outsourcers in certain areas, such as diversity. American universities and tech companies boast a multicultural, multilingual environment, which plays a vital role in global software projects.
    Click Here to View Full Article

  • "RSA Show to Highlight New Security Approaches"
    Network World (02/23/04) Vol. 21, No. 8, P. 9; Messmer, Ellen

    Security vendors and other large IT players plan to unveil new approaches to fixing vulnerabilities and user authentication at the RSA Conference 2004. The Organization for the Advancement of Structured Information Standards (OASIS) is nearing the release of its Application Vulnerability Description Language (AVDL) Version 1.0, which will allow compliant security products to share vulnerability and patch information via XML; several security tool vendors have AVDL-enabled products lined up for the RSA Conference, and NetContinuum and Spi Dynamics will demonstrate how their application-layer firewall and vulnerability-assessment tool can automatically share information and create temporary patches. Gartner analyst John Pescatore says such a solution would dramatically narrow the response time of organizations to new application vulnerabilities, while the U.S. Department of Energy already intends to use AVDL as the basis for its internal incident advisories. OASIS will also likely include digital signatures in future iterations of the AVDL standard, according to NetContinuum vice president Wes Wasson. Authentication issues are not as clear-cut, but are nonetheless generating a lot of activity at the RSA Conference with Microsoft and host RSA announcing an integration deal that worries other security vendors. VeriSign Security Services vice president Mark Griffiths says the Microsoft-RSA deal would be a lot different if it were an RSA plug-in for the desktop, for example. For its part, VeriSign plans to launch hosted authentication services worldwide by the end of the year, allowing companies to defer the costs of setting up their own authentication infrastructure. VeriSign is also pushing an Open Authentication standard that would create interoperability among vendors' different token products, an effort Burton Group analyst Trent Henry says may not go anywhere because of VeriSign's relative weakness in the token market.
    Click Here to View Full Article

  • "Want to Stop Spam? Multiple Techniques in Unison Is the Answer"
    Computer Technology Review (01/04) Vol. 24, No. 1, P. 36; Korsak, John

    A multi-pronged strategy is the only true option a business can take to curb spam, as there is no one method that can thwart all of the tricks spammers use. A particularly effective approach is to educate end users on spamming techniques and countermeasures, such as: Not purchasing products or services as a result of a spam message; not using a legitimate email address when posting to newsgroups, list servers, bulletin boards, or chat rooms; not replying to spam in any way; refusing to use one's business email address online unless the collecting organization is trustworthy; deactivating an email client's ability to preview messages or disabling outbound HTTP for the mail client whenever possible; and forwarding spam to the IT department. There are a variety of spam-filtering measures, such as connection filtering, which can recognize many tactics spammers employ to avoid being traced, and block spam from known spammers. Other filtering options include SMTP filtering, whereby receiving servers use filtering rules when they communicate with sending servers to halt spam before it can be delivered into the organization's mail system; content filtering, in which legitimate email and spam are distinguished by inspecting words, phrases, structure, and URLs within the message; Bayesian filters, which statistically determine the likelihood that an incoming email is spam by studying the words in the message; and HTML tag filters, which zero in on HTML characteristics common to spam. The likelihood of false positives can be lowered considerably by leveraging email server features such as the ability to skip authenticated users, white lists, and trusted IP addresses. The above techniques, when integrated, can turn many spammers' own methods against them, but companies must also remain vigilant of spammer strategies, which are constantly changing to thwart more sophisticated anti-spam measures.
    Click Here to View Full Article
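    The Bayesian filtering the article mentions can be sketched in a few lines: train word counts from labeled spam/ham messages, then combine per-word likelihoods with a naive Bayes log-odds sum. This is a minimal illustration, not the article's implementation; the tokenizer, smoothing constant `k`, and function names are illustrative assumptions.

    ```python
    import math
    import re

    def tokenize(text):
        """Lowercase a message and split it into word tokens."""
        return re.findall(r"[a-z']+", text.lower())

    def train(messages):
        """messages: list of (text, is_spam) pairs.
        Returns per-class document counts for each word, plus class totals."""
        counts = {"spam": {}, "ham": {}}
        totals = {"spam": 0, "ham": 0}
        for text, is_spam in messages:
            label = "spam" if is_spam else "ham"
            totals[label] += 1
            for word in set(tokenize(text)):  # count each word once per message
                counts[label][word] = counts[label].get(word, 0) + 1
        return counts, totals

    def spam_probability(text, counts, totals, k=1.0):
        """Naive Bayes in the log domain, with add-k smoothing so unseen
        words do not zero out the score."""
        log_odds = math.log((totals["spam"] + k) / (totals["ham"] + k))
        for word in set(tokenize(text)):
            p_spam = (counts["spam"].get(word, 0) + k) / (totals["spam"] + 2 * k)
            p_ham = (counts["ham"].get(word, 0) + k) / (totals["ham"] + 2 * k)
            log_odds += math.log(p_spam / p_ham)
        return 1.0 / (1.0 + math.exp(-log_odds))  # convert log-odds to probability
    ```

    A production filter would retrain continuously on user-flagged messages and combine this score with the other layers (connection, SMTP, and HTML tag filtering) the article describes.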

  • "10 Tech Trends to Bet On"
    Fortune (02/23/04) Vol. 149, No. 4, P. 74; Vogelstein, Fred; Boyle, Matthew; Lewis, Peter

    A number of tech trends are poised to do well over the next few years: Smart dust--tiny, wireless, low-cost sensors that can monitor temperature, vibrations, light, and even radiation and toxicity levels--is expected to make its mark in commercial, military, security, and ecological projects over the next 24 months; 150 million "motes" should ship by 2006 and be employed on factory floors to predict equipment failures, in forests to anticipate fires, on truck tires to maximize fuel efficiency and avoid accidents, and elsewhere. Convergence of PC and TV technology is finally a reality thanks to inexpensive flat screens and hard drives, simple home-network installation, and easy broadband access, while such applications as recording TV shows via computer and listening to music on PCs are proving very popular. Open-source software is penetrating large and small companies, and governments around the world; its spread to consumer devices such as desktop PCs over the next two years seems all but inevitable. China is on its way to becoming the chief standard-setter for the 21st century: The Chinese government's investments in education and R&D facilities are rising while U.S. R&D spending levels are falling, and China has started pushing its own standards for operating systems, office software, mobile phones, and other vital technologies. Widespread use of geographically unrestricted Wi-Fi access is a likely prediction for the next few years, as Wi-Fi hot spots proliferate and wireless carriers agree to allow each other's customers to link anywhere; the release of the WiMax standard this year could close the "last-mile" gap of bringing high-speed services into the home, according to Intel executives. On the horizon is subscription burnout, in which network service subscribers grow tired of the many bills they must pay, while online ad spending is enjoying a resurgence. Finally, VoIP is likely to explode, but it remains to be seen whether telcos will benefit: The upshot is that consumers will pay less for phone calls, but VoIP threatens to erode telcos' revenues because of the inevitable price war.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)

  • "The Robots Are Here"
    Technology Review (02/01/04) Vol. 107, No. 1, P. 30; Brooks, Rodney

    Rodney Brooks, director of MIT's Computer Science and Artificial Intelligence Laboratory, believes robot technology is currently at the same point that computer technology was in 1978, and contends that robots will become as ubiquitous as email and the Web within 15 years. He points out that robots have begun to migrate from labs to consumer households; robots are also being employed by the military for reconnaissance missions, while universities offer robotics graduate programs and are starting to offer undergraduate courses. Brooks also cites the growth of his company, iRobot, as evidence of robots' transition to marketable applications. The increasing usability of robots is directly related to continued advancement in robotic navigation, which is key to the successful performance of a wide array of machines ranging from automated lawnmowers to cleaning robots to military reconnaissance robots. But the market potential for robotic navigation could be outmatched by that for robot vision and dexterity, Brooks contends. Textile, home appliance, toy, and electrical goods industries, to name just a few, could be revolutionized by machines capable of recognizing and precisely manipulating objects; such machines would also save those industries a tremendous amount of money in labor costs. Robot vision and dexterity have a long way to go, but research into nanotechnology, microelectromechanical systems, motion tracking, and face recognition is yielding promising results. Brooks thinks that the emergence of these technologies will dramatically restructure labor markets and immigration patterns, and adds that the most important application could be elder care as the population of aging baby-boomers explodes in Japan, Europe, and North America.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)