ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to t[email protected].
Volume 5, Issue 584:  Wednesday, December 17, 2003

  • "Bush OKs Spam Bill--But Critics Not Convinced"
    CNet (12/16/03); McCullagh, Declan

    The addition of President George W. Bush's signature to the Can-Spam bill on Dec. 16 has not silenced critics, who doubt that the new law, which supersedes many individual state anti-spam measures, will effectively deter the tide of junk email currently flooding the in-boxes of businesses and consumers. Under the law, which takes effect on Jan. 1, spammers who falsify email headers or send sexually explicit email without clearly labeling it as such could face jail time; the FTC is authorized to enforce the law and has the option to establish an email equivalent of the do-not-call registry. Opponents charge that Can-Spam will do more harm than good because it neither grants individual recipients the right to sue spammers nor forces bulk emailers to obtain recipients' permission before sending them email; worse, certain kinds of spam would be legalized under the legislation. California state Sen. Debra Bowen (D) alleges that the legislation "doesn't can spam, it gives it the congressional seal of approval." Officials of the very federal organizations tasked with enforcing the new law are among its harshest critics: FTC Chairman Timothy Muris contended in August that Can-Spam demands high standards of proof before lawsuits against spammers can be launched, while the National Association of Attorneys General fired off a letter to Congress last month warning that the bill is riddled with so many loopholes and exceptions that effective enforcement is impossible. The Direct Marketing Association (DMA) endorses Can-Spam--DMA public affairs director Louis Mastria insists that "anything that's less than a federal [anti-spam] standard would be, No. 1, ineffective and, No. 2, put a [damper] on the larger marketplace." Research groups such as Gartner doubt that the spam problem will improve with a federal law in place; if anything, given the lack of enforcement, it will get worse.

  • "Scientific Research Backs Wisdom of Open Source"
    NewsFactor Network (12/15/03); Martin, Mike

    Researchers at the National Science Foundation (NSF) and the University of California, Irvine, are united in their opinion that open-source software development is far more advantageous than closed corporate development in terms of quality, speed, and cost. "Free and open-source software development is faster, better, and cheaper in building a community and at reinforcing and institutionalizing a culture for how to develop software," declares Walt Scacchi of UC Irvine's Institute for Software Research, who has presented a series of online reports detailing the development model's advantages in the hopes that businesses can understand the implications of making internal or external investments in open source. "We're not ready to assert that open-source development is the be-all and end-all for software engineering practice, but there's something going on in open-source development that is different from what we see in the textbooks," he adds. Scacchi notes that most open-source development projects fail, and the study of Linux and other successful open-source projects makes a clear contribution to open-source research. David Hart of the NSF reports that Scacchi, along with Santa Clara University's John Noll, the University of Illinois' Les Gasser, and UC Irvine colleague Richard Taylor, are taking the lessons learned from open-source practices and applying them to devise new design, knowledge-management, and process-management tools for development projects spanning multiple organizations. Gasser and Scacchi, for instance, are drawing insights about the relationship between bug disclosures and software quality by mining open-source project databases, which Scacchi describes as a cherished resource not found in corporate communities. Scacchi and colleagues are studying over 100 open-source projects covering categories such as network games, industry-backed projects such as IBM's Eclipse, and Web infrastructure projects.

  • "Execs Beg Nanotech Funding"
    San Francisco Chronicle (12/16/03) P. B1; Tansey, Bernadette

    The message of the Nanotechnology and Homeland Security Forum on Dec. 15 was that nanotech can significantly bolster the United States' critical infrastructure, but only if the federal government provides a solid foundation for the private sector to develop new defensive nanotech products. Peter Friedland of the NASA Ames Research Center, the forum's host, commented that a shortage of industrial markets could hinder nanotech development in certain areas, which is why a federal initiative is so important. One area of development that could probably not advance without federal encouragement is search and rescue, where nanotech research could lead to handhelds that provide up-to-date data on the location of victims and the integrity of collapsed structures. Aviron founder Dr. Leighton Read commented that another possible application is a self-replicating nanoparticle aerosol that could be used to vaccinate large populations against bioterror agents. MicroFluidic Systems President Allen Northrup noted that companies working on nanoscale sensors designed to detect the presence of anthrax and other biological weapons could prosper from strong civilian markets for such products, and reported that the FBI is devising a biosensor-based rapid human identification system. Meanwhile, Ames Research Center director Scott Hubbard said his facility is working on a carbon nanotube-based sensor primarily designed to detect minerals on Mars, but which could also be applied to homeland security as a device to identify bomb components. Participants in the forum praised the recent congressional approval of the 21st Century Nanotechnology Research and Development Act, which authorizes $3.7 billion over four years for federal nanotech research.

  • "Open Source Takes on Hardware Biz"
    Wired News (12/17/03); Asaravala, Amit

    The mission of the OpenCores.org consortium is to promote hardware design via the open-source development model, and consortium founder Damjan Lampret's latest attempt to spread the organization's message was the announcement of a system-on-a-chip microprocessor devised from open-source schematics at a Dec. 15 conference in Mountain View, Calif. "Hardware has always been very proprietary--even more so than software," Lampret said in an interview. "But our system-on-chip shows that open-source technology can compete successfully in the hardware space, especially in embedded applications." Proponents of open source want to spur device manufacturers to embed the blueprints for OpenCores technologies into their own designs, which will add up to significant savings on research and development, and could ultimately lead to lower-cost hardware as well as accelerate the development of more open-source software. However, proprietary hardware manufacturers see little reason to fret over the open-source hardware movement. Intel's Chuck Malloy says the processors his company is working on are far more sophisticated than those OpenCores and its ilk are developing, while existing hardware patents will probably impede movement toward a full-featured open-source CPU. Furthermore, loosely connected consortia such as OpenCores are unlikely to have the financial resources needed to mount a strong defense against costly patent infringement lawsuits.

  • "The Spies Who Come in Through the Keys"
    Financial Times (12/17/03) P. 15; Morrison, Scott

    Snoopware--software that can be installed surreptitiously on victims' computers to record their keystrokes, emails, passwords, chatroom postings, and Web site visits without their knowledge--is considered by security experts such as Earthlink VP Matt Cobb to be "the next big threat" to both corporate and individual privacy. There are various snoopware programs currently available, promoted mainly as tools for employers to monitor Internet use in the workplace or for parents to keep track of their children's computer activity. But the products' potential for abuse--potential that has only recently come to light through real-world instances of such abuse--is even greater. For instance, a New York hacker confessed this year that he installed keylogging software on public computers at 13 Manhattan Kinko's outlets so that he could record and purloin personal data from over 450 people, which he sold online and also used to divert money from his victims' bank accounts into new accounts he set up in their names. Most snoopware victims have been individual users, but corporations have recently become targets as well. Snoopware is often innocently downloaded off the Internet by unsuspecting victims, and is also buried in email attachments, viruses, and pop-up ads that trigger snoopware downloads when users click on the "close" button. Though Symantec and other computer security organizations have begun to offer snoopware detection programs, Network Associates' Ryan McGee believes snoopware authors will inevitably make their software harder to identify. People at the greatest risk of being victimized by snoopware are users of public computer terminals, while broadband users are also highly vulnerable.

  • "Head of US Delegation Talks Shop"
    IDG News Service (12/15/03); Blau, John

    IDG News Service spoke recently to David Gross, head of the American delegation at the three-day World Summit on the Information Society (WSIS) in Geneva. On several of the summit's main issues--Internet governance, financing for Internet growth in developing nations, and software and intellectual property--American representatives bargained hard, and the language of the official documents reveals a U.S. interest in either preserving the status quo on matters such as Internet governance and financing or ensuring that the interests of leading American-based suppliers are represented. Gross believes that while ICANN plays a crucial role in Internet governance, including top-level domain name allocation, it is not a group intended to address Internet regulation matters. He said the U.S. delegation feels that Internet governance requires a multi-stakeholder approach that would include governments and the private sector. He added that in regard to financing Internet expansion in developing countries, the United States has a digital freedom initiative that concentrates on nations that are investing in their people and that have the right regulatory and legal environment. Lastly, Gross expressed satisfaction with the intellectual property paragraph, which specifically recognizes that intellectual property rights and the diffusion of information are both vital parts of the information society.

  • "'D' Is for Disputes That Shape Our Lives"
    Toronto Star (12/15/03); Geist, Michael

    Canada in 2003 was a hotbed of headline-grabbing technology law developments, which Michael Geist of the University of Ottawa recounts to illustrate the wide variety of subjects they covered. The Canadian Parliament considered--and failed to pass--two separate anti-spam bills. One bill called for the establishment of a do-not-spam list, greater penalties for spammers, an international initiative to address the spam problem, and the founding of an Internet Consumer Protection Council to set spam reduction standards. Another bill called for inserting anti-spam provisions into the Canadian criminal code and imposing severe prison terms and fines on spammers that increase with each subsequent offense. Just last week, the Copyright Board of Canada ruled that the tariffs on blank media established to reimburse musicians and the music industry in exchange for consumers' rights to make personal copies of music should be broadened to include MP3 players, a move indicating that the Board considers peer-to-peer music downloading legally valid in certain situations. The SOCAN v. CAIP case, which the Supreme Court of Canada heard in early December, brought to the fore the issue of ISP accountability for the use of caching technology to accelerate network access to content, as well as whether ISPs should pay a tariff on Web-based music downloads. Canada also made news by hosting its first Internet Corporation for Assigned Names and Numbers conference in Montreal in June, where Internet governance was the central debate. Meanwhile, in August the Alberta Court of Appeal upheld the acquittal of a man charged with sending spam that marketed bomb-making information, on the grounds that the spammer lacked the knowledge required for the offense of counseling a crime that was never committed.

  • "Supercomputers Let Scientists Break Down Problems in Reverse for Better Quake Models"
    Pittsburgh Post-Gazette (12/15/03); Spice, Byron

    Carnegie Mellon University and Pittsburgh Supercomputing Center researchers earned the Gordon Bell Prize in November for their earthquake simulation experiments, which among other things yielded insights about subsurface geology based on the study of surface motions--in other words, determining the causes of quakes by working backwards from the effects. The data drawn from this inverse method can also be applied to computer models to make more accurate earthquake predictions, as well as weather forecasts and chemical terrorism analysis. Supercomputers were employed by the CMU researchers to solve the inverse problem of earthquakes because of the huge amount of calculations involved. CMU professor of civil and environmental engineering Jacobo Bielak reports that performing the inverse problem took longer than originally projected because "we just weren't aware of how hard the forward problem would be." The Pittsburgh Supercomputing Center's LeMieux supercomputer, a machine capable of trillions of calculations every second, has allowed researchers in CMU's Quake Project to model quake activity in the Los Angeles basin, albeit in a limited fashion. Through the evaluation of soil and rock conditions at 10-meter intervals, the team can simulate quakes in an area of the L.A. basin 100 kilometers square and 50 kilometers deep, but only their effects on buildings five stories or higher can be measured. Including smaller buildings and homes would require the simulation of seismic waves at shorter wavelengths, a job that calls for a machine 20 times faster than LeMieux and readings of ground conditions at 1-meter intervals, which CMU engineer Omar Ghattas calls an impossible task. The inverse problem offers an alternative solution to this dilemma.
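    The "working backwards from the effects" idea can be illustrated with a toy linear inverse problem: given a forward model that maps subsurface parameters to surface motions, recover the parameters from noisy observations by least squares. The matrix, parameter values, and noise level below are invented for illustration and have nothing to do with the actual CMU quake codes.

```python
import numpy as np

# Toy linear forward model: surface motions d arise from subsurface
# material parameters m via d = G @ m.  G and m_true are illustrative
# stand-ins, not the real earthquake physics.
rng = np.random.default_rng(0)
G = rng.standard_normal((50, 4))        # 50 surface observations, 4 unknowns
m_true = np.array([2.0, -1.0, 0.5, 3.0])
d_obs = G @ m_true + 0.01 * rng.standard_normal(50)  # noisy "measurements"

# Inverse step: recover subsurface parameters from observed surface
# motions (working backwards from effects) via least squares.
m_est, *_ = np.linalg.lstsq(G, d_obs, rcond=None)
print(np.round(m_est, 2))
```

In the real problem the forward model is a large-scale seismic wave simulation rather than a small matrix, which is why supercomputers are needed; the structure of the computation, however, is the same.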

  • "World's Biggest 'Virtual Supercomputer' Given the Go-Ahead"
    SpaceRef.com (12/16/03)

    On Dec. 16 the Particle Physics and Astronomy Research Council (PPARC) approved funding for the construction of GridPP2, a gigantic "virtual supercomputing" grid that will eventually be incorporated into a larger European grid to process the vast amount of particle physics data yielded by CERN's Large Hadron Collider (LHC) facility, which is slated to go online in 2007. "The GridPP2 Grid will address the future computing requirements of all the U.K. Particle Physics Experiments and should provide efficient sharing of resources between Particle Physics and other disciplines at the institutes," proclaimed GridPP Collaboration Board Chair Steve Lloyd. The LHC particle accelerator is expected to produce data at a rate equal to 20 million CDs yearly. PPARC Director of E-Science Dr. Neil Geddes estimated that the grid will have the processing power of 20,000 1GHz computers, making it the biggest funded grid of its kind in the world thus far. The GridPP project has been operating a "testbed" grid across 10 British sites, and the experiments have yielded the middleware necessary for a bigger grid. The GridPP testbed was embedded in the LHC Computing Grid in September, and GridPP is collaborating with other initiatives such as Enabling Grids for E-Science in Europe, which will build a European Grid framework for the European Research Area by integrating national, regional, and thematic grid projects. The results of GridPP's experiments form the foundation for a broader rollout of scientific computing grids across British universities participating in the U.K.'s e-Science program.
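    The "20 million CDs yearly" figure can be sanity-checked with simple arithmetic. Assuming a standard 700 MB CD (the capacity is our assumption, not the article's), the LHC output works out to roughly 14 petabytes per year, or a sustained rate of several hundred megabytes per second:

```python
# Back-of-the-envelope check on the LHC data-rate figure, assuming a
# standard 700 MB CD (the CD capacity is our assumption).
cds_per_year = 20_000_000
bytes_per_cd = 700 * 10**6
total_bytes = cds_per_year * bytes_per_cd            # 1.4e16 bytes
petabytes = total_bytes / 10**15                     # ~14 PB per year
mb_per_second = total_bytes / (365 * 24 * 3600) / 10**6
print(f"{petabytes:.0f} PB/year, ~{mb_per_second:.0f} MB/s sustained")
```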

  • "XML--Rodney, Are We There Yet?"
    CNet (12/14/03); Ruh, William

    The dot-com bust left XML with few ardent supporters and little respect; however, XML's fortunes have now turned as major software vendors have either already included support for the standard or plan to do so in their next upgrades. Though XML began as a simple improvement on HTML, it is now the cornerstone of many companies' business strategies and is even reshaping the way industries do business. In health care and insurance, the HL7 and ACORD standards, respectively, show how XML provides the basis for industry data exchange standards. The Chief Information Officers Council has designated XML as one of the technological foundations for improving federal government operations. XML's popularity comes as organizations begin to recognize data as the lifeblood of the extended enterprise and as a driver of productivity within the company. Executive management dashboards and other enterprise information applications are high on the list of executive priorities today; XML is well able to meet these demands, especially since it provides the semistructured type of data businesses need to extract from forms and documents. Like other widely adopted technologies such as Ethernet and Internet Protocol, XML is not overburdened by extraneous service requirements. From a development standpoint, XML is also easy to learn and work with. Still, some challenges lie ahead for XML, including implementation of existing digital signature and encryption standards for better access control and security; XML also involves significant bandwidth requirements, though these are no greater than those of comparable technologies, such as Adobe Systems' Portable Document Format.
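    The "semistructured" point is easy to see in a few lines of code: an XML consumer reads the fields it knows about and ignores the rest, so records can carry optional or extra fields without breaking exchange partners. The record below is a made-up insurance-style example, not taken from the HL7 or ACORD standards:

```python
import xml.etree.ElementTree as ET

# A hypothetical insurance-style record: semistructured, so fields can
# vary per record without breaking consumers that ignore unknown tags.
doc = """
<policy id="A-100">
  <holder>Jane Doe</holder>
  <coverage type="home">250000</coverage>
  <note>added after a claim -- an optional field</note>
</policy>
"""
root = ET.fromstring(doc)
holder = root.findtext("holder")        # read the fields we know about
amount = int(root.findtext("coverage"))
print(holder, amount)                   # the <note> element is simply ignored
```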

  • "Copyright Doesn't Cover This Site"
    Wired News (12/16/03); Delio, Michelle

    The Pool is a collaborative online environment created by the University of Maine's Still Water new media lab where creative work including images, video, music, texts, and programming code is available to all. "We are training revolutionaries--not by indoctrinating them with dogma but by exposing them to a process in which sharing culture rather than hoarding it is the norm," explains Still Water co-director Joline Blais. Pool users can enter the environment and look for content to incorporate into their own projects, find partners to collaborate with, and seek out peers for feedback. In addition, contributors can suggest projects for others to execute, or respond to entreaties to study, re-edit, remix, or debug existing content. The Pool is organized so that the path of conceptual development, from idea to execution, is easily traceable, and all contributors and people who accessed the works are known. University of Maine new media major John Bell notes that Pool contributors have many licensing options and reuse terms to choose from so that their own feelings of how open their work should be can be honored. Blais hopes the Pool concept works so well that within a decade it will either be everywhere or redundant; she wants the Pool to be a model for cooperation between individuals and communities that displaces corporate productivity and influence. "It's all about imagining a society where sharing is productive rather than destructive, where cooperation becomes more powerful than competition," she asserts.

  • "Military Eager to Use Technology for Gathering Data"
    Federal Times (12/15/03) Vol. 39, No. 46, P. 14; Matthews, William

    The Defense Department's Total Information Awareness project was scrubbed because of public outcry over its potential to allow the military to monitor and mine the personal data of the American citizenry in an attempt to root out terrorists. Nevertheless, Sue Payton, deputy undersecretary of Defense for advanced systems and concepts, told the Pentagon's Technology and Privacy Advisory Committee last month that military computer systems that employ data mining, knowledge management, decision support, and other analysis methods stand in readiness. She believes such systems could effectively detect terrorists and terrorist cells before they attack, but said that they are currently in limbo because of unresolved privacy issues. Payton feels the military could thwart terrorist strikes through an electronic "connect-the-dots" methodology such as the kind used by business intelligence, which mines phone call logs, financial transactions, residence data, etc., to find prospective clients, detect credit card fraud, and predict consumer demand, among other things. Undersecretary of Defense for intelligence Stephen Cambone said he wants the Pentagon to make "universal situational awareness" and worldwide "persistent surveillance" a reality within the next 10 years through the deployment of space-based radars, better intelligence communications to troops in the field, and faster analysis. Cambone told defense writers in November that he sees a time when the military will boast advanced intelligence capabilities "to sustain situational awareness of events throughout the globe." He added that part of the plan involves training and enlisting more intelligence personnel.

  • "IPv6 Fears Seen Unfounded"
    Network World (12/15/03) Vol. 20, No. 50, P. 1; Marsan, Carolyn Duffy

    Contrary to conventional wisdom from the Internet engineering community, the transition to Internet Protocol version 6 (IPv6) is proceeding more easily and less expensively than projected, as indicated by good notices about Defense Department and academic IPv6 implementations at last week's U.S. IPv6 Summit 2003. The IPv6 switchover has taken longer than proponents anticipated, but this has yielded an unexpected plus--now that users desire IPv6 deployments, they can take advantage of the protocol's incorporation into many networking products that they must purchase to facilitate regular infrastructure upgrades. One of the most surprising revelations about the ease of IPv6 deployment came from Moonv6, an IPv6-based network built by the North American IPv6 Task Force in collaboration with military and university communities; the network links over 80 servers, switches, and nodes distributed across eight states, and the Defense Department, which has set a 2008 deadline to fully migrate to IPv6, says the network's success is a good sign. NTT's Verio subsidiary learned through its development of the first U.S.-based commercial IPv6 service that IPv6 requires little additional expense beyond normal network upgrades. Meanwhile, the Abilene network, which connects 200 American universities, has enabled IPv6 on 50 percent of its network connections. Rick Summerhill of the Internet2 consortium expects most Abilene-supporting universities will switch to IPv6 in three years. Still, North American IPv6 Task Force Chairman Jim Bound notes that major business applications vendors have yet to support IPv6, though the disclosure of Oracle's IPv6 roadmap at the IPv6 Summit indicates movement in this direction. Industry analysts believe the corporate upgrade to IPv6 will be a gradual process, and expect IPv6 and IPv4 to exist side-by-side for some time.
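    The IPv4/IPv6 coexistence that analysts predict is visible even at the scripting level: modern standard libraries parse and classify both address families side by side, and an IPv4 address can be carried inside IPv6 as an "IPv4-mapped" address. A minimal sketch using Python's stdlib (the addresses are documentation examples):

```python
import ipaddress

# IPv4 and IPv6 side by side: a dual-stack host handles both address
# families during a gradual migration.
v4 = ipaddress.ip_address("192.0.2.1")
v6 = ipaddress.ip_address("2001:db8::1")
print(v4.version, v6.version)            # 4 6

# An IPv4 address can also travel inside IPv6 as an IPv4-mapped address.
mapped = ipaddress.ip_address("::ffff:192.0.2.1")
print(mapped.ipv4_mapped)                # 192.0.2.1
```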

  • "Wrapping Up Web Presence"
    InfoWorld (12/08/03) Vol. 25, No. 48, P. 48; Connolly, P.J.; Udell, Jon

    In a discussion about whether Web services will fuel a transformation in enterprise collaboration, InfoWorld Test Center lead analyst Jon Udell and senior analyst P.J. Connolly argue about how presence awareness can--or cannot--apply to this equation. Udell reasons that Web services will spur the development of next-generation collaboration, but the question remains as to how they will accomplish this; Udell foresees two major developments. "First, Web services will provide a general means of access to the messaging substrates," he explains. "Second, Web services will help us unify metadata [message headers, aka context] and content [message bodies, aka documents] under a common data-management discipline: XML." Connolly agrees that such an approach to content and metadata has merit, but contends that platform-neutral Web services--which Udell thinks presence awareness must become--are a long way off. The cost to customers as well as vendors' reluctance to open up their products to add such features are formidable barriers. Connolly also disputes Udell's assertion that messaging technologies are unable to embed context within documents and discussions, and that only a Web services infrastructure can handle such a task. He further points out that there is still no accord between Extensible Messaging and Presence Protocol (XMPP) advocates and supporters of SIP (Session Initiation Protocol)/SIMPLE (SIP for Instant Messaging and Presence Leveraging Extensions) such as Microsoft and IBM over the presentation of presence as a Web service.
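    For concreteness, XMPP expresses presence as small XML stanzas on the wire. The sketch below builds and re-parses a minimal presence-style fragment with the standard library; the element names follow XMPP's presence stanza, but the address is invented, and real XMPP sessions involve streams, namespaces, and authentication omitted here:

```python
import xml.etree.ElementTree as ET

# Minimal XMPP-style presence stanza (the address is invented).
stanza = ET.Element("presence", attrib={"from": "alice@example.org"})
ET.SubElement(stanza, "show").text = "away"
ET.SubElement(stanza, "status").text = "in a meeting"

wire = ET.tostring(stanza, encoding="unicode")
print(wire)

# A receiver parses the stanza back and reads the availability state.
parsed = ET.fromstring(wire)
print(parsed.get("from"), parsed.findtext("show"))
```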

  • "Web Services Put GIS on the Map"
    Computerworld (12/15/03) Vol. 31, No. 56, P. 30; Mitchell, Robert L.

    Web services technology is popularizing third-party geographic information system (GIS) information, even at companies that maintain their own internal GIS databases and systems. Commercial developer Edens & Avant uses GIS Web services to create quick overlay maps of prospective shopping center sites, integrating relevant Census Bureau, Environmental Protection Agency, local government, and commercial data. Edens & Avant systems manager David Beitz says the Web services model is for prospecting, while in-depth analysis relies on in-house data. As Web services support for GIS continues to grow, analysts expect more industries to use GIS to a greater extent. Previously, specialists acted as gatekeepers to GIS systems, but now decision-makers and other GIS users have direct access to GIS data. Developers are also taking advantage of Web services to integrate GIS into their applications, using offerings such as Microsoft's MapPoint Web Services. The Open GIS Consortium (OGC) is behind standards to support these Web services, including Web Map Service, Web Feature Service, and Geography Markup Language, which is based on XML. The payoff for Florida Farm Bureau Insurance is up-to-date information used to approve homeowner policy applications, says senior strategic planner Steve Wallace. Still, GIS over Web services is hampered by differing standards at the base level, and OGC specification program director Carl Reed says the OGC is in discussions with states and counties to iron out simple definitions such as road width.
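    OGC's Web Map Service works through parameterized HTTP GET requests, which is much of why GIS data has become consumable by ordinary applications. A sketch of a GetMap request URL follows; the endpoint and layer name are hypothetical, while the parameter names follow the WMS 1.1.1 convention:

```python
from urllib.parse import urlencode

# Sketch of an OGC Web Map Service GetMap request.  The server URL and
# layer name are invented; the parameters follow the WMS 1.1.1 spec.
params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "census_tracts",
    "SRS": "EPSG:4326",
    "BBOX": "-81.1,33.9,-80.9,34.1",   # lon/lat box around a candidate site
    "WIDTH": "400",
    "HEIGHT": "400",
    "FORMAT": "image/png",
}
url = "https://gis.example.com/wms?" + urlencode(params)
print(url)   # fetching this URL would return a rendered map image
```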

  • "Strong Signals: The Ultranet"
    CIO Insight (11/03) Vol. 1, No. 33, P. 31; Parkinson, John

    RFID technology is the first step toward an entirely new mesh networking architecture where capacity grows in line with the number of connected nodes, writes Cap Gemini Ernst & Young chief technologist John Parkinson. Unlike traditional networking models such as Internet Protocol and the Web, mesh networking involves physical distance as a factor, differentiating between local and long-distance traffic. By relying on local ad hoc connections instead of centralized infrastructure, the entire mesh network can expand without regard to established capacity, as in the case of a traditional network; mesh network connections are also much easier to locate and repair. Even though radio frequency identification (RFID) technology currently uses mostly dumb end points except for transmitters attached to railcars and shipping containers, future mesh networks will include intelligent devices similar to IBM's experimental Meta Pad ultracompact PC. Prototype devices that are about the size of a paperback book will likely become commercially available next year, but by around 2006, Cap Gemini Ernst & Young predicts such ultracompact devices will begin to create a new type of mesh networking architecture. These devices will have about the same processing power as is available for notebook computers today, but will be much smaller and include handwriting and voice recognition technology; a host of wireless connectivity including cellular, RFID, Bluetooth, and Wi-Fi; and at least two terabytes of storage--enough to store all of a person's critical transactional documents and some of their personal media. With enough of these devices about, the networking paradigm would shift from centralized storage and intelligence to a peer-to-peer node scheme. Dynamic computing grids and workgroups could be created between end-point devices instead of relying on servers. 
So far, these types of architectures have been tested in laboratories, but with RFID, they are beginning to evolve in the real world.
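    The claim that mesh capacity grows with the number of nodes can be sketched with a toy model: each node links only to peers within radio range, so adding a node in a populated area adds links (and thus local capacity) without touching any central infrastructure. The coordinates and range below are invented:

```python
import itertools
import math

# Toy mesh: nodes connect only to peers within a fixed radio range, so
# the link count (a stand-in for aggregate capacity) grows as nodes are
# added, with no central hub.
def mesh_links(nodes, radius=1.5):
    return [
        (a, b)
        for (a, pa), (b, pb) in itertools.combinations(nodes.items(), 2)
        if math.dist(pa, pb) <= radius
    ]

nodes = {"a": (0, 0), "b": (1, 0), "c": (2, 0), "d": (1, 1)}
links = mesh_links(nodes)
print(len(links))

# Adding a node in range of existing ones adds capacity locally.
nodes["e"] = (2, 1)
print(len(mesh_links(nodes)))
```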

  • "Nanotechnology-Enabled Sensors: Possibilities, Realities, and Applications"
    Sensors (11/03) Vol. 20, No. 11, P. 22; Smith, Sharon; Nagel, David J.

    Advances in nanotechnology and its convergence with information technology and biotechnology are driving improvements in sensor design--in efficiency, power requirements, sensitivity, and specificity, among other things. Technical and economic challenges must be met for the technology to become commercially viable: Nanosensors must address design issues such as heat dissipation, interface requirements, and interference and noise, while present difficulties in mass-producing nanomaterials have made companies reluctant to invest in nanotech. The integration of nanoscale technologies could spur the development of tiny, intelligent, low-power sensors with applications far beyond those promised by microsensors. Progress in nanoscale top-down and bottom-up manufacturing processes is another major driver of new sensor system developments, while computational design of nanosensors is also within reach thanks to the availability of powerful computers and algorithms for modeling nanoscale interactions. Research has yielded nanoscale sensors that demonstrate the technology's diverse applications: Such breakthroughs include a small gas ionization detector based on carbon nanotubes that could be employed for gas chromatography; titania nanotube hydrogen sensors bundled into a wireless sensor network that measures the amount of hydrogen in the atmosphere; devices designed to detect biomolecules; submicron mechanical electrometers; and a nano-enabled chemical detection system integrated into an unmanned aerial vehicle that could be used for defense and homeland security. Other sectors that stand to benefit from nanosensors and nano-enabled sensors include transportation, medicine, communications, safety, and buildings and facilities.

  • "Problems Persist With U.S. Visas"
    IEEE Spectrum (12/03); Kumagai, Jean

    Foreign students are thinking twice about enrolling in U.S. universities, while foreign researchers are reconsidering their participation in U.S.-based conferences because of the many security procedures and protocols they must endure in the wake of Sept. 11, 2001. The situation could significantly impact the U.S. technology industry, as about half of all computer science and engineering graduate students in the U.S. come from abroad. Before a visa can be issued or renewed, foreign nationals must submit to in-person interviews with American consular officers, while those studying security-sensitive topics are subjected to background checks. There is also a federal database to track all foreign students in the United States, while fingerprinting, photographing, and interviewing are common procedures for male students over 16 years of age whose countries of origin have large Muslim populations or are on the U.S. list of terrorist-friendly nations. The security measures have led to incidents of outright harassment, while red tape has forced students and others to postpone their studies, miss speaking engagements, or be denied entry back into the United States when they leave. The frustration of dealing with all of these protocols is spurring many foreigners to seek more welcoming shores: The number of applications for F-1 student visas submitted to the State Department fell about 24 percent between fiscal 2001 and fiscal 2003, while the Institute of International Education reports that the annual growth rate of foreign students enrolled at U.S. schools has declined from 5 percent five years ago to less than 1 percent. Countries with less stringent security policies, such as Australia, Great Britain, and Canada, stand to benefit from the U.S. loss. 
"Without foreign graduate students and postdocs, there will have to be a cutback in the research activity of American universities," warns former Texas Instruments executive Norman Neureiter, who calls for less complicated and intrusive security measures.
