Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published three times a week, on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 6, Issue 602:  Wednesday, February 4, 2004

  • "Europe Blames Weaker U.S. Law for Spam Surge"
    Wall Street Journal (02/03/04) P. B1; Mitchener, Brandon

    Brightmail estimates that more than half of all email in the European Union is spam, and Europeans claim U.S. anti-spam laws, which are far more lax than European regulations, are chiefly to blame. Eighty percent of EU spam is written in English, and that same percentage apparently originates from North America. EU law subscribes to an opt-in policy, in which email marketers cannot send unsolicited commercial email unless recipients specifically ask for it; U.S. law follows an opt-out policy, whereby spammers do not have to obtain prior permission from recipients to send them spam. Canada, Australia, and Switzerland have implemented an opt-in anti-spam policy similar to the EU model, while Japan, South Korea, and Mexico follow the opt-out strategy. Europe is demanding that the United States crack down harder on spamming, a vital issue in a week when the Organization for Economic Cooperation and Development is meeting in Brussels to call for more international cooperation on anti-spam enforcement. "The ball is very much in the [U.S.] Federal Trade Commission's court," notes European Coalition Against Unsolicited Commercial Email Chairman George Mills. The U.S. counters that adopting an opt-in policy would be detrimental to small businesses that rely on unsolicited email to market themselves and compete with bigger players, and force companies into the onerous task of proving they had permission to send email. Howard Beales III, head of the FTC's bureau of consumer protection, dismisses the notion that opting out worsens the spam problem, and urged conferees in Brussels to help deflate this "urban myth."

  • "Rock the Vote"
    GovExec.com (02/03/04); Harris, Shane

    Four computer experts recommended that the Defense Department halt its Secure Electronic Registration and Voting Experiment (SERVE) on the grounds that the system is susceptible to hacking and errors. But though they were invited by the Pentagon to appraise the SERVE system, Defense officials and SERVE contractors were apparently perturbed that the experts publicly aired their findings, and have downplayed the warning as a "minority report," since the experts comprised only 40 percent of a 10-member review panel. SERVE, a remote Internet voting system, was designed to enable servicemen and other American citizens stationed abroad to participate in the upcoming presidential election without resorting to mail-in absentee ballots. ACM President Barbara Simons notes that she and the other three experts decided after two meetings to release a report that concentrated exclusively on the security problems they saw, adding that the six remaining SERVE reviewers were too unfamiliar with the problems to contribute to the study. Simons and report co-author Avi Rubin claim the Pentagon asked the review panel to sign a nondisclosure agreement that forbade them from publicly disclosing their findings, but relented after the group refused. Pentagon officials were reportedly satisfied with SERVE's security despite the warning, and Accenture's Meg McLaughlin promises that the SERVE project will go forward as planned. She argues that the report findings are misleading because the panel assessed SERVE as if 6 million Americans based overseas were using it, when in fact the test will most likely involve no more than 100,000 volunteers. In addition, Accenture cited several "inaccuracies" in the SERVE report that it says were not corrected before the study was publicly disclosed, a claim Simons vehemently disputes.
    Click Here to View Full Article
    To read the complete report, visit http://servesecurityreport.org/.
    To learn more about ACM activities regarding e-voting, visit http://www.acm.org/usacm/Issues/EVoting.htm.

  • "PC Makers Set to Face Costs of Recycling"
    Financial Times (02/04/04) P. 9; Harvey, Fiona

    Computer manufacturers are bracing for two major European Union directives on the disposal of personal computers that will have a significant impact on the cost of doing business. "It's the sheer cost that will be most damaging to our members," explains Dudley Ollis, program manager for environmental services at Intellect, a trade group for the electronics industry in the United Kingdom. "This is retrospective legislation, in effect, and companies have not made financial provision for this in the past." The recycling and disposal mandate comes at a time when PC manufacturers are seeing low margins, says Forrester Research senior analyst Paul Jackson, who adds that companies are likely to struggle if they do not pass the cost on to consumers. Gartner says recycling could raise desktop PC prices by $50, and replacing hazardous substances with new materials in the production process could add another $10 to the cost. Governments have become increasingly concerned about the disposal of PCs, which are built with components containing substantial amounts of heavy metals and other pollutants, because they end up in landfills. Industry observers do not expect the looming rules to give PC sales a boost ahead of 2005. Ultimately, the new regulations could lead PC makers to build units that are environmentally friendly. Last summer Hewlett-Packard's laboratories were asked to find ways to build and package more environmentally friendly machines. HP chief technology officer Shane Robison says, "We were asked to look at the PC stem to stern, to find ways to use alternative materials, to find different manufacturing processes, to become more energy efficient." As a result, HP is now working to reduce its factories' emissions, to use greener packaging, and to replace some plastics in its printers with corn starch.
    Click Here to View Full Article

  • "High-Speed Internet's Hurdles Still Considerable"
    Cox News Service (01/31/04); Jones, Adam

    University students and researchers at 205 U.S. universities enjoy a next-generation Internet experience using the Internet2 network. Students consume as much bandwidth as is allowed them and will take their need for Internet speed into the working world, says Georgia Institute of Technology chief technology officer Ron Hutchins. Since the inception of data networks in the 1960s, scientists have pioneered new technologies and applications that slowly become mainstream, says University of Michigan School of Information dean Dan Atkins. The Internet, however, is facing a number of human challenges as it grows in popularity, including spam email, computer viruses, and malicious spyware; these human problems are more formidable than the technical challenges facing the Internet, says academic Jaron Lanier. "The Internet, instead of being just an open network, is becoming more and more a world of fences and locks," he says. Carnegie Mellon University computer science associate professor Hui Zhang is nonetheless working on the serious technical challenge of connecting millions of American homes to a faster Internet network: As lead researcher for the National Science Foundation's "100 Megabits to 100 Million Homes" program, Zhang studies wireless networking solutions for bypassing the copper wires currently connecting homes to the phone companies' switching stations. And even after the so-called "last mile" gap is closed, the Internet will still require greater reliability for high-speed business applications that cannot tolerate variances in performance. Though consumers are used to technical glitches and slowdowns, businesses stand to lose significant revenues if their high-speed Internet applications do not work, says University of Wisconsin professor emeritus Larry Landweber. Zhang says routers are another piece of infrastructure that will need to be revamped in the future.
    Click Here to View Full Article

  • "Q&A: Open-Source Guru Eric Raymond"
    InternetNews.com (01/30/04); Wolfe, Alexander

    Open Source Initiative President Eric Raymond, author of "The New Hacker's Dictionary" and the more recent "The Art of Unix Programming," notes that programmers in school are now being trained on both Windows and Linux. He adds that "it's all PCs today," meaning that the next crop of programmers will be primarily familiar with PCs rather than minicomputers or mainframes. Raymond says the transition to 64-bit computing will be more important than he first thought, because the need for very large address spaces is accelerating faster than he originally anticipated. The open-source guru points out that many people hold a misconception that Extensible Markup Language (XML) is a solution to all problems related to data structure and organization, but he believes that the roots of XML security will be set down shortly when a standardized technique for securing XML data blocks in a verifiable way is issued. Raymond says the value of XML-based Simple Object Access Protocol (SOAP) is inflated, and thinks that SOAP will become impractical because it suffers from a "second-systems syndrome," in which the flaws of the first-generation system trigger a tendency to overdo the second-generation system to the point where it falls apart under its own weight. He explains that he prefers the Python scripting language to Java, and has refused to transition to the latter because he sees few dissimilarities in capability between the two languages. Meanwhile, he considers next-generation scripting languages such as Perl and PHP to be very useful; Raymond comments that scripting languages allow programmers "to automate the low-level bookkeeping [like] resource and memory management, so that [they] can concentrate on higher level tasks." He sees the value of keeping programmers on-site so that customer contact is maintained and institutional knowledge is preserved, and he thinks offshore outsourcing is a passing fancy.
    Click Here to View Full Article

  • "Firms Develop Gesture-Operable Digital Home Electronic Devices"
    NE Asia Online (01/26/04); Tachimoto, Shiro

    Development of gesture-operable input devices for home electronics is accelerating in a variety of industries, including gaming and automotive. The Remote Controller for Wearable Home Electronics Appliance developed by Toshiba's Human Centric Laboratory allows users to activate and deactivate lighting and air conditioning equipment by pointing at the appliance or waving up and down while wearing an acceleration sensor and a Bluetooth unit. Hitachi, meanwhile, is working on several advanced devices: One touchless input device linked to a PC through a universal serial bus recognizes nine different gesture commands, though recognition varies according to location and sensor directivity; another Hitachi innovation, the NEXTRAX touch panel display, can be controlled by hand gestures and finger inputs, and employs infrared radiation to facilitate triangular surveying. Hitachi expects NEXTRAX to become particularly useful as a 3D presentation tool for real estate agents or museums. A soon-to-be-released game from Sony Computer Entertainment will also be outfitted with gesture controls so that users can make menu selections as well as manipulate their on-screen counterparts through body movements. Force feedback and other tactile interface technologies are being incorporated into products such as Sony's UCP-8060 video editor, and BMW's iDrive vehicular environment integration system. Hitachi's Tactile Driver allows users to feel display panel buttons either by physically warping the display panel or by simulating the buttons' concave or convex shape.
    Click Here to View Full Article

  • "Matrix Plan Fuels Privacy Fears"
    Associated Press (02/02/04); Bergstein, Brian

    The Multistate Anti-Terrorism Information Exchange (Matrix), a quick-access information repository that integrates state-based data with privately held data, is currently in use by six states and being considered for use in several others despite privacy worries and concerns that the system is too similar to the federal Terrorism Information Awareness data-mining program that Congress pulled funding from last year. Law enforcement officials argue that the system is an efficient crime-fighting tool and note that it only uses public state records data and does not attempt to make crime or terrorism predictions. Connecticut, Florida, New York, Ohio, Michigan, and Pennsylvania all participate in Matrix, while Utah ended its participation last week, as did Georgia, which had announced in October that it was dropping out but failed to do so, blaming a miscommunication. Matrix combines participating states' records with 20 billion database files held by Seisint, and privacy rights advocates are concerned about the scope of the information available. Georgia provided state prison and sex offender records for the database, but state attorney general Thurbert Baker later ruled that Georgia could not share driver's license records unless state law was changed. Matrix project leader Mark Zadra, chief investigator for the Florida state police, says over 10 other states have seen presentations on Matrix in the last few weeks, and five or six more states are expected to join the program. Although controversy could derail the program, some sort of data-mining tool is likely to be implemented by states in support of homeland security. Matrix is backed by $12 million in federal funds and states could pay as much as $1.8 million annually, according to the AP. Others question whether the data held by Seisint and the transfers of data into the database are secure; Louisiana backed out of Matrix due to security concerns.
    Click Here to View Full Article

  • "Unpopular Argument: Sending Tech Jobs Abroad Is Good"
    USA Today (02/04/04) P. 3B; Maney, Kevin

    Many U.S. technology executives believe the offshore outsourcing of programming and other IT jobs will bolster the economy and raise Americans' security and standard of living. Subscribers to this belief follow the theory of comparative advantage, which posits that countries that concentrate on the areas they excel at the most will enjoy increased productivity and higher living standards. For instance, though India's low-wage, high-volume tech workforce could perform software programming and innovative technology development at lower cost than the United States, India is "most best" at programming because it lacks the infrastructure or venture capital to support innovation; in contrast, in the United States programming is relatively costly while innovation is relatively cheap. It would therefore benefit both India and the United States to devote their energies to their "most best" areas and then trade, thus increasing programming and innovation while lowering their costs. By extension, the savings a computer company realizes by outsourcing programming would trickle down to consumers in the form of lower computer prices, and simultaneously raise the company's profitability so that the firm can recruit more staff and invest in new products. Oracle CEO Larry Ellison claims, "A free movement of labor allows us to become more efficient, produce better products at lower costs, grow more profitable, pay more taxes to the government which, in turn, looks after the people who have been displaced." Concurrently, a rise in living standards in other countries is good news to U.S. workers since so many products that other nations desire--cars, cell phones, etc.--are made in America, argues Opsware's Marc Andreessen. Such views are not very popular, given claims by executives, politicians, and others that outsourcing is eroding America's technological superiority; but the theory of comparative advantage has held up remarkably well since it was formulated almost 200 years ago.
    Click Here to View Full Article
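    The comparative-advantage argument above can be made concrete with a small worked example; the labor-hour figures below are hypothetical, chosen only to illustrate the logic, and do not come from the article:

```python
# Illustrative, hypothetical figures (not from the article): hours of labor
# each country needs to produce one unit of each kind of output.
hours = {
    "US":    {"programming": 40, "innovation": 20},
    "India": {"programming": 10, "innovation": 30},
}

def opportunity_cost(country, good, other):
    """Units of `other` a country gives up to produce one unit of `good`."""
    return hours[country][good] / hours[country][other]

us_cost = opportunity_cost("US", "programming", "innovation")        # 2.0
india_cost = opportunity_cost("India", "programming", "innovation")  # ~0.33

# India forgoes less innovation per unit of programming than the US does,
# so total output rises if India specializes in programming, the US in
# innovation, and the two trade, even though absolute wages differ.
assert india_cost < us_cost
```

    Note that the conclusion depends only on the ratio of costs within each country, not on which country is cheaper overall; that is the core of the nearly 200-year-old theory the article invokes.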

  • "Pentagon Kills LifeLog Project"
    Wired News (02/04/04); Shachtman, Noah

    The Defense Advanced Research Projects Agency's (DARPA) LifeLog project, an initiative to chronicle every aspect of a person's life in a single database, was quietly disbanded in January by the decree of the Pentagon, much to the relief of civil libertarians who argued it could be used as an invasive profiling tool. Supporters claimed that LifeLog would serve as a digital memory that could assist users who would otherwise struggle to accurately recount past experiences and activities. The reasoning behind the project's cancellation is unclear--DARPA representative Jan Walker would only reveal that LifeLog was killed because of "a change in priorities." AI researchers are disappointed with the loss: "We were very interested in the research focus of the program...how to help a person capture and organize his or her experience," asserts Howard Shrobe of MIT's Artificial Intelligence Laboratory. The Electronic Frontier Foundation's Lee Tien reports that DARPA is more hesitant to initiate or continue projects with the potential to stir up controversy after the negative publicity surrounding the now-defunct Terrorism Information Awareness data-mining effort. David Karger, a colleague of Shrobe's at MIT, is confident that DARPA will continue to explore the digitization and mining of personal experience in one way or another. Meanwhile, private companies are pursuing their own memory storage and retrieval initiatives, an example being the MyLifeBits program conceived by Microsoft's Gordon Bell.
    Click Here to View Full Article

  • "Why This One Is Scarier"
    San Francisco Chronicle (02/03/04) P. B1; Kirby, Carrie

    The Mydoom computer worm's success in shutting down the SCO Group's Web site through a denial-of-service attack waged by 25,000 to 50,000 infected "zombie" computers raises the bar for malware in terms of damage and sophistication. But some security experts believe Mydoom was created as a spamming tool rather than a political weapon wielded by fringe Linux advocates against SCO's attempts to halt the distribution of the Linux operating system. Such a possibility highlights the growing prevalence of financial gain as a motive for virus development and exploits. F-Secure systems engineer Tony Magallenez observes that viruses often follow a parallel evolutionary track to communications technology--for instance, the Melissa email virus made a big splash back in 1999 because email had just become a breakout communications medium. As email viruses became more advanced and threatening, email users grew more cautious, which in turn prompted virus authors to resort to new strategies to spread their malware, such as writing deceptive subject lines and messages. Bugs that spread automatically online, such as Code Red, Nimda, and Slammer, soon followed, and each new major worm proliferated faster than the one before it. Mydoom, the latest email worm, installs "back doors" in victims' computers, allowing hackers to commandeer those machines for their own ends. The original Mydoom permutation infected around 500,000 computers, according to Network Associates; a far smaller number of systems was tainted by the Mydoom.B variant, which is targeting Microsoft. Network Associates' Craig Schmugar reports that approximately 7 percent of Mydoom.B-infected computers will launch an attack on www.microsoft.com, which may hardly make a dent in its operation.
    Click Here to View Full Article

  • "Neural-Chaos Team Boosts Security"
    Technology Research News (02/04/04); Patch, Kimberly

    Researchers at Israel's Bar-Ilan University have integrated a neural network encryption scheme with chaotic signal synchronization to generate code that is very difficult to crack. The scheme involves two identical synchronized systems--one at the sender's location and one at the receiver's location--used to form encryption keys; in this way, a random source is shared between the two parties, and chaotic systems and neural networks make good random sources because they are complex and constantly in flux. Synchronized chaotic signals can be employed to facilitate private-key encryption, according to previous research. Earlier investigation also established that neural networks can be synchronized by having them affect each other, while the Bar-Ilan scientists successfully encrypted code via neural network synchronization. "Both networks are dynamic and [pass] some information about their state to each other, until they reach a dynamic symmetric state of their synaptic weights," explains Bar-Ilan's Rachel Mislovaty. The researchers mixed neural and chaotic synchronization by employing a pair of chaotic maps and a pair of software-based neural networks; the signal for the chaotic maps is built using the synchronized synaptic weights, which causes the maps' output to act as the neural networks' input. "As the synapses become more and more synchronized, the maps' signals become identical, causing the maps to synchronize as well," notes Mislovaty. The hybrid system exhibits faster synchronization than the neural-network signals alone, significantly reducing the likelihood that a snooper could store all the transmitted data that makes up a key, and allowing shared secret encryption keys to be produced using an unsecured communications line.
    Click Here to View Full Article
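    The mutual-learning mechanism described above can be sketched in code. The following is a minimal, illustrative model of neural-network key synchronization (a "tree parity machine") in the spirit of the Bar-Ilan work; the parameters K, N, L and the Hebbian update rule are common choices from the neural-cryptography literature, not necessarily the researchers' exact setup, and the chaotic-map layer the article describes is omitted for brevity:

```python
import numpy as np

K, N, L = 3, 10, 3          # hidden units, inputs per unit, weight bound
rng = np.random.default_rng(0)

def forward(w, x):
    """Hidden-unit signs (sigma) and overall output (tau) for input x."""
    sigma = np.sign((w * x).sum(axis=1))
    sigma[sigma == 0] = -1   # break ties deterministically
    return sigma, int(sigma.prod())

def hebbian(w, x, sigma, tau):
    """Move only the hidden units that agreed with the network output,
    keeping each synaptic weight bounded in [-L, L]."""
    for k in range(K):
        if sigma[k] == tau:
            w[k] = np.clip(w[k] + tau * x[k], -L, L)

# Two parties start from independent random weights.
wA = rng.integers(-L, L + 1, size=(K, N))
wB = rng.integers(-L, L + 1, size=(K, N))

steps = 0
while not np.array_equal(wA, wB) and steps < 100_000:
    x = rng.choice([-1, 1], size=(K, N))   # public random input, seen by both
    sA, tA = forward(wA, x)
    sB, tB = forward(wB, x)
    if tA == tB:                           # update only when outputs agree
        hebbian(wA, x, sA, tA)
        hebbian(wB, x, sB, tB)
    steps += 1

# Once wA equals wB, the shared synaptic weights can seed a secret key.
```

    The two networks exchange only their public inputs and single-bit outputs, yet their weights drift into an identical "dynamic symmetric state," exactly the effect Mislovaty describes; an eavesdropper who sees only the exchanged bits synchronizes far more slowly than the two cooperating parties.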

  • "DARPA-Funded Linux Security Hub Withers"
    SecurityFocus (01/30/04); Poulsen, Kevin

    The Sardonix project, a two-year-old Defense Advanced Research Projects Agency (DARPA)-backed research project designed to track Linux code for security audits, has been largely abandoned, says Sardonix founder and computer scientist Crispin Cowan, chief research scientist at WireX Communications. DARPA's funding ran out nine months ago, and although the Web site created to designate which Linux code had been audited and by whom still exists, it has seen little action. The project was designed to replace the existing loosely structured Linux security review process with volunteers whose work would be tracked by the Web site and available for others to review and amend. However, the only participants have been U.C. Berkeley graduate students assigned to do the work by computer science professor David Wagner. Cowan blames a culture that rewards people for finding holes but not for ensuring that software is secure in the first place. He says, "It seems the Sardonix lesson is people don't want to play this game, they want to play the Bugtraq game."
    Click Here to View Full Article

  • "The Internet2 Commons: Supporting Distributed Engineering Collaboration"
    Syllabus (01/04) Vol. 17, No. 6, P. 24; Finholt, Thomas A.; Hajjar, Jerome F.; Hofer, Erik C.

    The George E. Brown, Jr. Network for Earthquake Engineering Simulation (NEES) project started by the National Science Foundation three years ago is an initiative to leverage the national cyber infrastructure to conduct research aimed at making built environments more resistant to earthquakes. A virtual earthquake engineering laboratory will comprise 15 NEES equipment sites linked together in a national collaboratory known as the NEESgrid, which will utilize Internet2's Abilene network to supply backbone connectivity for the video, data, and control information to be relayed between sites. But Internet2 has already played a key role in the system's development and enablement: Participants from each of the NEES components that make up the collaboratory needed to confer regularly, but face-to-face meetings and phone conferences were too problematic. H.323 video conferencing was adopted by the NEES community, which used Internet2's Commons suite of collaborative services to provide the multipoint infrastructure. The first Commons-hosted NEES video conference, the "NEES Equipment Site Technical Forum" (ES-TF), was held in late January 2002; now the ES-TF meets about two to four times a month to discuss such topics as data models, telepresence systems, control protocols, experimental setups, and new testing techniques. The service offerings now extend beyond data and video conferencing to encompass Internet2 Commons-derived multipoint video conferencing, Placeware Web conferencing, video streaming and archiving, and asynchronous email discussion and data sharing. The Internet2 Commons has evolved into a tool the NEES community can leverage to try out collaborative technologies before committing to major investments.
    Click Here to View Full Article

  • "Twentysomething"
    Federal Computer Week (01/26/04) Vol. 18, No. 2, P. 20; Hasson, Judi

    IT jobs in the federal government are proving very attractive to younger-generation information professionals, particularly those facing unemployment or underemployment in the private sector because of the economy. Such interest is vital when measured against the imminent departure of senior IT management. The federal government offers much of what the twentysomething IT workforce is looking for--a healthy benefits package, job security, and rapid promotion--and many younger workers are opting for government work out of a desire to make a difference, especially after the Sept. 11, 2001 tragedy. Another lure the federal government has begun to offer young IT talent is pay banding, in which salaries and raises are based on talent and merit rather than longevity. There is still room for improvement: Information Technology Association of America President Harris Miller reports that the government needs to recruit more women, minorities, and disabled people, as well as help young professionals get security clearances. Furthermore, many IT workers cite a lack of holiday bonuses and maternity leave as problems, while some say that federal IT managers have a responsibility to train their successors before retirement so that their knowledge and familiarity with legacy systems are retained. Ira Hobbs, co-chairman of the CIO Council's Workforce and Human Capital for IT Committee, says the federal government is particularly interested in fresh expertise in the areas of enterprise architecture, security, and project management.
    Click Here to View Full Article

  • "Dual Curses: Viruses and Spam"
    Computerworld (02/02/04) Vol. 32, No. 5, P. 29; Ubois, Jeff; Betts, Mitch

    A Web-based survey of senior executives conducted by Computerworld and Ferris Research finds that viruses and spam are the biggest email-related headaches. IT managers are fearful of zero-day attacks because virus authors are exploiting software vulnerabilities faster. Meanwhile, spam is a source of frustration because it leads to lost productivity as well as embarrassment: A Nucleus Research study estimates that system administrators lose an average 4.5 hours of productivity a week to spam-related problems, while CIOs may feel pressured to solve spam problems because they are a source of irritation and humiliation in the workplace. Respondents to the Ferris/Computerworld survey also list regulatory compliance as a major email issue; in addition, a surprising result of the poll is the indication that concerns about dealing with denial-of-service attacks are growing among CIOs. Routine email downtime is apparently not a major concern for CIOs, though respondents did express fears about prolonged periods of disabled email service stemming from hacker attacks. Instant messaging from wireless devices, migrating between email packages, switching messaging servers to Linux, and using mainframes as email servers are among the issues generating the least amount of concern among survey respondents, while email budget issues such as total cost of ownership are not among the top 10--an unexpected conclusion given how cost-conscious CIOs are. Robert W. Reeg of MasterCard International reports that respondents generally frown upon switching email platforms partly because of the problems and costs inherent in such a migration, such as training and the loss of email archives. "I don't see any business case [that would justify migrating], unless someone's on a really antiquated, unsupported package," he argues.
    Click Here to View Full Article

  • "Computer Makers Tackle E-Waste"
    Industry Week (01/04) Vol. 253, No. 1, P. 60; Bartholomew, Doug

    The mounting problem of electronic waste and the introduction of e-waste legislation in more than half of U.S. states over the last 12 months are spurring IBM, Gateway, and other computer manufacturers to ramp up their recycling initiatives. International Data estimates that typical PCs last only three to five years before obsolescence or hard-drive failure ends their value to users, while another estimate indicates that some 315 million PCs will be discarded this year alone. Hewlett-Packard annually disassembles and recycles nearly 4 million pounds of junked PCs, monitors, and printers regardless of manufacturer, and consumers pay a takeback fee ranging from $13 to $30. Meanwhile, consumers can pay Dell Computer $7.50 to pick up and recycle up to 50 pounds of computers, and Dell's sustainable business director Pat Nathan says the company teamed up with its outbound shippers to return old hardware. IBM lets both consumers and corporations return old PCs, which are either broken down into their basic materials for recycling or donated to charitable causes. However, IBM notes that almost 4 percent of the discarded hardware it accepts is dumped into landfills. Gateway relies on local recycling efforts and offers consumers $50 in discounts if they recycle an obsolete computer before buying a new one. The state of California now mandates that consumers purchasing PCs pay $6 to $10 to support municipal recycling initiatives, but environmentalists and manufacturers criticize this plan for lacking an incentive for PC makers to design more environmentally friendly products.
    Click Here to View Full Article

  • "Where Have All the Programming Jobs Gone?"
    Application Development Trends (01/04) Vol. 11, No. 1, P. 12; Seeley, Rich

    IEEE-USA contends that the trend to offshore engineering is partly responsible for the job losses programmers and other IT professionals have sustained in recent years. Ron Hira, who chairs IEEE-USA's R&D Policy Committee, reported that American IT workers are suffering from historically high unemployment levels; at a congressional hearing, he cited Bureau of Labor Statistics estimates that programmers are experiencing an unemployment rate of approximately 7 percent, compared to a roughly 6 percent national unemployment rate. Hira further noted that the unemployment rates for engineering managers and IT managers are 8 percent and 5.5 percent, respectively, while all categories of managers have a general unemployment rate of 2.9 percent. "We have thousands of unemployed engineers [and] computer scientists who are having difficulty finding employment because many of the jobs are taken by cheaper, foreign labor," declared IEEE-USA President-Elect John Steadman, and Meta Group's calculation that an average 41 percent of new development activity is now outsourced adds weight to this argument. The IEEE warns that the loss of engineering jobs has two negative effects: It complicates life for unemployed workers and erodes the United States' future IT innovation potential. Hira testified at the hearing that H-1B and L-1 visa programs originally set up to address a labor shortage at American tech companies are now being used as an excuse for firms to replace domestic workers with lower-wage foreigners who eventually return to their native lands and add to overseas tech competition. Hira related several IEEE recommendations at the hearing, among them: Regular tracking of outsourced IT jobs by the federal government; corporate notification of offshoring plans so displaced workers and federal support agencies can prepare; an overhaul of U.S. workforce assistance programs to better support displaced high-tech employees; and reinforcement of H-1B and L-1 workforce protections and enforcement so that U.S. employment opportunities for domestic high-tech workers are not threatened.

  • "The Rise of Telecities"
    Futurist (02/04) Vol. 38, No. 1, P. 28; Pelton, Joseph N.

    Arthur C. Clarke Institute executive director Joseph N. Pelton argues that the trend toward megacities characterized by faster transportation systems, centralized infrastructure, and increased urbanization is wrong-headed and will only increase populations' vulnerability to terrorist attacks, natural catastrophes, and human error; a much better urban model for the future is the telecity, a global community whose life, direction, and operations are determined by telecommunications. Pelton believes the megacity paradigm will ultimately give way to the telecity, not just because of terrorism or disaster concerns, but because of the thrust toward energy conservation, less pollution, cleaner air, more service-based jobs, and dealing with the high cost of urban property. The author recommends that the United States take a cue from Japan, which has reduced the impact of terrorism and natural disasters by deploying backup data centers and distributed wireless and satellite facilities; urban planners who want to transition to a telecity model should design more intelligent airports, transportation, and energy systems. Pelton envisions a global environment where commuting is unnecessary, as satellite networks, wearable terminals, handhelds, and telepresence technology would permit organizations to move operations to suburban, rural, or even offshore locations. The telecity model will be supported by growing ranks of wireless teleworkers with the advent of broadband satellite links, while teleservices such as remote health care, education, and government services will become commonplace. The world economy is already sustained by transnational electronic funds transfer systems; securing the telecommunications elements of such systems, along with other kinds of vital infrastructure, will be one of the key challenges of the 21st century. Pelton writes that telenetworks could enable emerging countries to access the Internet, e-commerce, and other services that buttress sustained development. He also notes that satellite and broadband networks will facilitate a society that is primarily visually oriented.