ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM.

To send comments, please write to [email protected].

Volume 5, Issue 553: Friday, October 3, 2003

  • "A Suspect Computer Program"
    Los Angeles Times (10/02/03) P. A1; Piller, Charles; Alonso-Zaldivar, Ricardo

    Despite the application of the most advanced technology, experts warn that would-be terrorists would likely be able to foil the security system protecting air travel in the United States. Meanwhile, tens of thousands of innocent passengers would be flagged by such a system and subjected to searches, while the commercial and government records of millions of passengers would be combed for suspicious activity. Transportation Security Administration (TSA) officials want to build CAPPS II, the successor to the first Computer Assisted Passenger Pre-Screening system deployed in 1998. CAPPS II would make the TSA the most intrusive government agency after the IRS, giving it the power to investigate approximately 70 million American flyers each year. Of those, 96 percent would be assigned a low-risk status, while 4 percent, or roughly 74,000 people each day, would be subjected to more rigorous searches. According to TSA estimates, only one or two people each day would be interrogated and barred from flight. However, others say the system, in practice, could be much more intrusive. Bruce Schneier, CTO of Counterpane Internet Security, says, "Systems that involve wholesale surveillance of innocents tend not to work. It's not feasible to catch the bad guys without also catching too many good guys." CAPPS II will cull telephone records, find familial links, and assess other data that details a person's day-to-day activity; but Offer Einav, former security director for the Israeli airline El Al, says human vigilance will be needed as well. He says, "The U.S. is so much oriented toward a technology [solution] that the people are serving the technology...Human beings will always beat the technology." Einav also says eliminating religious, national, and ethnic information from the CAPPS II system, done in order to protect against discrimination, will hamper the ability to identify potential terrorists.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Tough Issues Face Information Society Summit"
    IDG News Service (10/01/03); Blau, John

    A number of highly contentious issues have upset preparations for the World Summit on the Information Society's (WSIS) December meeting, sponsored by the U.N.'s International Telecommunication Union (ITU). The main goal of the summit is to spread communication services and information uniformly throughout the world. ITU secretary-general Yoshio Utsumi says public information access "is no longer a technical matter, but a fundamental policy goal of every nation." Observers of the negotiations so far note that Europe and the United States are aligned against developing countries in many instances, such as Internet governance; American and European representatives would like to preserve the private-sector status quo, while delegates from China, Brazil, and other nations want to see more global management of Internet resources such as domain names, IP addresses, and root servers. The differences prevented the finalization of key draft documents--the lofty Declaration of Principles and accompanying Action Plan, which lays out 140 steps to achieve the principles. WSIS organizers hope the documents can be finished in one or two interim meetings before the December summit, which is expected to attract about 50 heads of state. Adjustments already made to the Action Plan include the addition of proprietary software as important to governments' healthy IT environments; the document now lists proprietary, open source, and free software as all necessary in a balanced environment. The WSIS itself is unbalanced, with wealthy countries able to send delegates to argue their positions, says Karen Banks, whose Association for Progressive Communications is a member of the Civil Society, which represents about 500 nongovernmental groups from different countries at the WSIS. Banks says poorer countries do not have the resources to send as many representatives.
    Click Here to View Full Article

  • "Machines Learn to Mimic Speech"
    Wired News (10/03/03); Delio, Michelle

    Attendees at this week's SpeechTek tradeshow said speech technology companies have adopted a more realistic outlook, acknowledging that voice technology has not yet reached the point where computers can actually understand human speech. "Now that the magic is gone, we don't believe in using speech technology unless it serves a viable purpose--making it easier for people to work with a computer system, making systems more secure or even making computers more fun," remarked speech application programmer Frank Vertram. Still, SpeechTek showcased some impressive products: One ATM product was designed to aid visually handicapped or technology-averse users by allowing them to hear descriptions of onscreen options through headphones. Nuance displayed a "say anything" natural language application that employs a database to interpret users' intent from "freestyle conversations." Cepstral unveiled two sets of computer voices, one geared for the American market and the other for the Canadian market--the American voices are imbued with a casual tone, while the Canadian voices speak with a French-Canadian accent. IBM highlighted WebSphere speech offerings upgraded with VoiceXML 2.0 support, which allows speech technology to be embedded within Web sites. SpeechTek's Speech Solutions challenge, which was set up to prove that programming speech applications does not necessarily have to be a frustrating experience, tasked seven teams with developing, by the end of the day, a workable application capable of identifying car trouble and scheduling a session at a repair shop; all seven teams met the challenge by 5:00 p.m. "Once we get past the mistaken idea that computers should be able to really understand us or that we can engage in meaningful conversations with machines, the new voice and speech technology is absolutely amazing," declared SpeechTek organizer James Larson.
    Click Here to View Full Article

  • "EU's Class of '04 Eyes Software Development"
    International Herald Tribune (10/01/03); Schenker, Jennifer L.

    Central and Eastern European countries are accelerating their efforts to vie for the American and Western European software development outsourcing market, which is currently dominated by India. Cyprus, Latvia, Hungary, the Czech Republic, Malta, Poland, Estonia, Slovenia, Slovakia, and Lithuania are making such moves in preparation for their admission into the European Union. These countries, along with Russia, are producing combined annual revenues of about $1 billion from software development outsourcing, while International Data's (IDC) Steven Frantzen estimates that the Central and Eastern European region could generate two or three times as much revenue if it can draw more Western European business. Working in these nations' favor is their close physical proximity to Western Europe, which matters because most technology companies lack the resources to manage operations much farther afield. Software development outsourcing efforts in the region include the Hungarian Software Alliance, composed of a dozen software development firms; Latvia's IS Cluster, which is collaborating with the Baltic Computer Academy, the University of Latvia, and Riga Technical University to lure offshore software development customers; and Romania's promotion of its software development competence through the www.outsourceromania.com Web site. Another development that may boost these countries' IT outsourcing prospects is the possible setup of an outsourcing taxation system to make Asian software development migration less attractive, according to Marianne Kolding of IDC.
    Click Here to View Full Article

  • "Researchers Create Super-Fast Quantum Computer Simulator"
    NewsFactor Network (10/02/03); Martin, Mike

    Japanese researchers have devised a tool designed to boost the speed at which classical computers can run quantum algorithms, which engineers hope will aid the design of quantum-computer hardware and software. The researchers used a "quantum index processor" to run an algorithm formulated by AT&T researcher Peter Shor in which the time it takes to factor a number increases only as a polynomial function of the number's size. With traditional factoring algorithms, the time required to factor a number rises exponentially with the number's size. Running quantum algorithms on classical computers is problematic: For instance, because Shor's algorithm only generates a "highly probable" correct result, it must be run repeatedly to boost the likelihood of a correct outcome. The number of simulated quantum bits needed to process these repetitions quickly becomes unwieldy. Running Shor's algorithm with the quantum index processor sidesteps the complexity needed to run the algorithm on a classical system employing complementary metal oxide semiconductor technology, according to Texas A&M electrical engineering professor Laszlo Kish. He estimates that the quantum index processor is 100 trillion trillion times speedier than A&M's workstations, and overtakes any other quantum simulator currently in existence. "The proposed system will be a powerful tool for the development of quantum algorithms," declare Minoru Fujishima, Kento Inai, Tetsuro Kitasho and Koichiro Hoh of the University of Tokyo.
    Click Here to View Full Article
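
    The polynomial-versus-exponential contrast above is easy to see on a classical machine. As a rough illustration only (this is not the researchers' simulator, and trial division is the crudest classical method), the Python sketch below factors by trial division, whose worst-case running time grows exponentially in the bit-length of the input--precisely the cost that Shor's algorithm avoids:

```python
def trial_division(n):
    """Return the smallest nontrivial factor of n (or n itself if prime).

    The loop can run about sqrt(n) times, i.e. ~2**(bits/2) steps, so the
    work grows exponentially in the bit-length of n. Shor's quantum
    algorithm instead factors in time polynomial in the bit-length.
    """
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1
    return n

# Each extra bit of n roughly multiplies the worst-case work by sqrt(2).
assert trial_division(15) == 3
assert trial_division(91) == 7        # 7 * 13
assert trial_division(8191) == 8191   # Mersenne prime 2**13 - 1: full scan
```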

  • "Plasma Screens and the Future of Display Technology"
    TechNewsWorld (10/01/03); Stresing, Diane

    Plasma display panels (PDPs) may face more competition in the next few years as new display technologies emerge. One such technology is Organic Light Emitting Diode (OLED) displays, which promise cheapness, light weight, and efficiency because backlighting is not required. Janice Mahon of Universal Display reports that OLEDs have made significant strides recently: OLED size is increasing, while a demonstration that OLED displays can be created on amorphous-silicon rather than polysilicon backplanes is an encouraging sign from an economic angle. Nano-Emissive Display (NED) technology, which researchers at Motorola's Microelectronics and Physical Sciences Laboratory have been developing for eight years, could make the leap to mass production with the creation of a proprietary catalyst that dramatically lowers the temperature at which the displays can be synthesized from nanoscale tubes. Though NED research team leader James Jaskie estimates it could take a year or more for the technology to hit the market, he says that its easier, more streamlined production process and use of cheap electronic drivers will make NED-TV a less expensive proposition than PDP. U.S. Display Consortium President Michael F. Ciesniski foresees three competing display technologies for the immediate future--PDPs, projection displays, and liquid crystal displays (LCDs). Gartner analyst Alan Brown anticipates that plasma will gradually become less expensive, and does not discount the possibility "that some [other] projection technologies, such as digital light processing, could replace plasma screens in some applications."
    Click Here to View Full Article

  • "Now Hear This, Quickly"
    New York Times (10/02/03) P. E1; Heingartner, Douglas

    New software and hardware are letting people vary the speed at which they listen and watch audio and video recordings; the new digital time compression technique cuts out tiny segments of repetitive audio, such as portions of a vowel enunciation, so that speech can be accelerated without changing the pitch--so fast-talkers do not sound like Alvin and the Chipmunks. Researchers say fast audio playback uses more of the brain's idle resources and can actually enhance comprehension since listeners must pay more careful attention. Among the most common targets of variable speed technology today are compressed radio advertisements, university lecture recordings, call center recordings, and other commercial stores of audio data. Dictaphone's Ed Rucinski notes that many companies' recorded audio stores are growing rapidly, reaching millions of hours, and that digital compression can help make those recordings more accessible. Panasonic has included digital acceleration technology in two models of DVD recorders and in one of the company's DVD players, letting users watch a regular-length movie in under an hour, for example; digital acceleration plug-ins are also available for the RealOne and Windows Media Player platforms. RealNetworks general manager Richard Brownrigg suggests the technology would be useful for long cell phone messages, especially when subscribers do not want to use too much airtime. Web designer Gregory Rosmaita, who is blind, sets the text-reader he uses to browse the Web as fast as 650 words per minute (average speech is between 140 and 180 words per minute), and uses a synthesized British accent since it enunciates more clearly and quickly. Listening to text reading at normal pace sounds unnatural, according to Rosmaita and other accelerated playback users, who say the technology is to normal speech what broadband is to dial-up.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
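
    The "cut tiny segments without changing the pitch" idea described above can be sketched in a few lines. The toy Python example below is an illustration only, not Panasonic's or Dictaphone's actual algorithm: it simply drops every tenth ~10 ms frame of audio, and because no resampling occurs, the pitch stays roughly the same while playback time shrinks. Commercial time-compression systems instead remove pitch-period-sized slices at waveform-aligned points and cross-fade (overlap-add methods such as SOLA), which avoids the audible clicks this naive version would produce:

```python
import numpy as np

def naive_time_compress(samples, frame=441, drop_every=10):
    # Split into ~10 ms frames (441 samples at 44.1 kHz) and discard every
    # tenth one. The remaining samples play back in ~90% of the time, and
    # since the samples themselves are untouched, the pitch is unchanged.
    frames = [samples[i:i + frame] for i in range(0, len(samples), frame)]
    kept = [f for i, f in enumerate(frames) if (i + 1) % drop_every != 0]
    return np.concatenate(kept)

# One second of a 440 Hz tone shrinks to ~0.9 s of audio at the same pitch.
sr = 44100
tone = np.sin(2 * np.pi * 440 * np.arange(sr) / sr)
out = naive_time_compress(tone)
assert len(out) == 39690  # 90 of 100 frames kept
```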

  • "Academic Consortium Plans a $100-Million Optical-Research Network"
    Chronicle of Higher Education (10/03/03) Vol. 50, No. 6, P. A30; Olsen, Florence

    Academic institutions and university consortia have joined forces to construct a $100 million national optical network that will serve as the underpinning of what National Science Foundation officials term a worldwide "cyberinfrastructure" to play a key role in the advancement of science and engineering. The infrastructure, which will be developed under the aegis of the nonprofit National LambdaRail consortium, will be used to conduct experimental research on optical networks as well as for medical, scientific, and engineering projects. The network's fiber-optic lines have already been deployed, but the installation of the optical gear will not be completed until next April. National LambdaRail CEO Thomas W. West reports that none of the participating universities expect to make money, and explains that their collective five-year investment would amount to something between $80 million and $100 million. The first phase of the cyberinfrastructure's development will involve the establishment of connection points in Seattle, Atlanta, Denver, Chicago, Pittsburgh, Washington, D.C., North Carolina, California, and Florida. The network will boast a 40 Gbps capacity, a fourfold increase over the Internet2 consortium's Abilene network; West says the infrastructure's ownership and control scheme will also differ from Abilene's. Universities that wish to carry out experiments requiring a very high degree of bandwidth will have exclusive access to a portion of the network's capacity. Initial members of National LambdaRail include Internet2, the Corporation for Education Network Initiatives in California, Florida LambdaRail, the Georgia Institute of Technology, the Pittsburgh Supercomputing Center, and the Committee on Institutional Cooperation.

  • "Software Jungle Challenges Auto Engineers"
    EE Times (10/01/03); Hammerschmidt, Christoph

    The Motor Vehicle Electronics conference this week clarified the growing challenge facing automotive systems designers, who have to manage increasing complexity without modularization and standardization. Auto electronics, now nearly synonymous with software, is expected to provide most of the innovation in cars in the near future, but its complexity is growing exponentially; by 2010, the number of lines of code in cars will increase 100-fold. Part of the reason reusable software components have not emerged, said ZF Friedrichshafen's Wolfgang Runge, is that suppliers are fearful of commoditization. Eventually, however, this roadblock will have to be removed. University of California, Berkeley, computer science professor Alberto Sangiovanni Vincentelli said the complexity of modern car systems is demonstrated in the 500 calibration parameters he found in one original equipment manufacturer's system software. "This is...a sign of illness," he said. One solution would be to uncouple hardware from its singular function and promote more systems integration. Currently, modularizing embedded system software is a risky proposition, said Vincentelli. IBM Global Automotive Software Business director Andreas Eppinger said the problem was not really software quality, but rather the interfaces and system engineering that prevent accurate diagnosis; Eppinger supported the use of deployment managers to help track a growing multitude of software used in control devices. IBM and BMW have already begun talks on standardized data models that could pave the way for software reuse and better process management.
    Click Here to View Full Article

  • "Controversial Pentagon Program Scuttled, But Its Work Will Live On"
    IEEE Spectrum (09/26/03); Cherry, Steven M.

    A congressional appropriations conference committee may have killed the Terrorism Information Awareness (TIA) program, a Pentagon effort to root out suspected terrorist activity by mining databases of Americans' transactional information, but some Information Awareness Office (IAO) programs will be shifted to other agencies and offices rather than be discontinued, a move that worries critics. Eight programs originally developed under the IAO, which was defunded along with TIA, will be resumed in other sections of the Defense Advanced Research Projects Agency. TIA-related research will be continued by the National Foreign Intelligence Program (NFIP), a joint venture of the FBI, the CIA, the National Security Agency, and other intelligence entities. The official description of NFIP states the program's mission is "thwarting the intelligence collection activities of foreign powers and their agents," but the program's role in preserving TIA projects is unclear, as is the precise involvement of NFIP's member agencies. The salvaged IAO programs include Bio-Event Advanced Leading Indicator Recognition Technology, Wargaming the Asymmetric Environment, Rapid Analytical Wargaming, and five initiatives to translate and study written and spoken natural language; the collective budget of these projects is $79.2 million. TIA-related research could also migrate to the new Department of Homeland Security Advanced Research Projects Agency, which was granted a first-year budget of $800 million. TIA may be officially defunct, but other federally funded data-mining projects of concern to privacy advocates are proceeding apace, such as the Transportation Security Administration's second-generation Computer Assisted Passenger Prescreening System program.
    Click Here to View Full Article

  • "E-Mail Is Broken"
    Salon.com (10/02/03); Mieszkowski, Katharine

    Four computer scientists--Carnegie Mellon University's Dave Farber, Brandenburg Consulting principal Dave Crocker (a former student of Farber's), Electronic Frontier Foundation chairman of the board Brad Templeton, and Nielsen Norman principal Jakob Nielsen--separately discussed the sorry state of email and what can be done to solve the spam problem. Templeton observed that "Computers amplify both the good and the bad we can do, and spam is yet another example." Farber declared that email's reliability has gone downhill because more and more people are installing poorly performing spam filters, and he warned that time is running out to staunch the growth of spam; Crocker noted that many people are frustrated by spam because its sheer volume makes it hard to find legitimate email. Nielsen said he thinks an anti-spam law is a good idea, but this would do little to deter spammers based overseas, while Templeton characterized most anti-spam legislation as "worse than useless." Farber commented that a Massachusetts law permitting people to sue spammers is unlikely to be effective, given the difficulty in tracking spammers down, but said that an enforcement scheme set up by the FTC or FCC would at least rein in spamming by big companies. Technical solutions suggested by the computer scientists include authentication standards, but Farber pointed out that no one appears to want to invest in deploying such a solution; Crocker favored incremental email revisions coupled with non-onerous methods of locating spammers, and an increase in accountability. Nielsen considered a radical and unpopular solution--to wipe the slate clean and phase out all existing email protocols. This would involve a global upgrade by all companies simultaneously, which Nielsen called an impossible task. Crocker concluded that people should stop wasting time looking for a magic bullet, for there is no single solution to the spam problem.
    Click Here to View Full Article
    (Access to full article available to paid subscribers only.)

  • "Don't Hold Your Breath for Online Voting"
    Investor's Business Daily (10/02/03) P. A5; Barlas, Pete

    Electronic voting promises convenience, almost instant results, and cost effectiveness, and could eliminate hanging chads and other problems that plague elections, as well as boost voter turnout. Accenture E-Democracy Services CEO Meg McLaughlin says American consumers' desire to vote online was clearly demonstrated in the 2000 primary, when 41 percent of 85,559 Arizona voters cast their ballots over the Internet. E-voting testing and deployment is even more prevalent in Europe, especially in England. However, McLaughlin estimates that it will take five or more years for online voting to be widely adopted in the United States while technical and ethical problems are worked out. For one thing, county and state governments have yet to agree on a standard voter authentication methodology; a national ID card has been proposed, but this has raised the ire of privacy proponents, McLaughlin notes. Another area of contention is ballot confirmation, while e-voting critics are concerned that electronic elections could be rigged by crafty hackers. Still, the potential advantages of e-voting are undeniable: Alan Winchcombe, deputy returning officer for the Swindon unitary authority in England, says, "[Online voting] makes it easier and more convenient to vote because [people] can do it whenever they want." E-voting is currently permitted in just 9 percent of U.S. polling places, although the percentage is expected to increase to 20 percent in 2004. In addition, counties in eight states--North Carolina, South Carolina, Arkansas, Washington, Utah, Hawaii, Florida, and Minnesota--will allow military personnel and civilians stationed overseas to vote online in next year's presidential election.

    To learn more about ACM's activities involving e-voting, visit http://www.acm.org/usacm/Issues/EVoting.htm.

  • "Virus Experts Debate Bug Names"
    IDG News Service (09/30/03); Roberts, Paul

    How to name computer viruses without generating confusion that could hinder efforts to stop them was a point of debate at the recent Virus Bulletin 2003 (VB2003) conference in Toronto. However, Richard Ford of the Florida Institute of Technology cautioned that this debate has gone on for nine years without resulting in a mutually accepted answer. Ford Motor global anti-virus project manager Shawn Campbell remarked that the names anti-virus vendors ascribe to malicious code and the names the media gives the same code--which laymen are more likely to remember because they are not so technical--can bewilder IT experts in major organizations and cause anti-virus experts to waste time and resources during a cyber-attack by attempting to resolve the inconsistencies between names. Many identified the Computer Antivirus Research Organization's (CARO) Virus Naming Convention as the root cause of the problem: Some VB2003 attendees testified that the convention generates different results when applied to new viruses by legitimate anti-virus experts. Symantec senior research fellow Sarah Gordon noted that characterizing a virus' unique traits is a subjective process, while "blended threats" with numerous attributes lead to long, complex names; confusion is generated by the lack of a central CARO name repository. Microsoft release anti-virus specialist Randy Abrams added that threat investigations can be held up because different anti-virus products often have different ways of identifying the same threat. Gordon and other attendees argued that a more effective solution would be a scientific naming scheme, similar to Mitre's Common Vulnerabilities and Exposures List or Wildlist Organization International's virus list. Abrams said this standard could be used to certify anti-virus engines.
    Click Here to View Full Article

  • "Research Team Set to Revamp Internet"
    Stanford Report (10/01/03); Levy, Dawn

    Stanford University is one of eight institutions participating in a National Science Foundation project to research an Internet overhaul that supports emerging technologies and scales up to increasing demand. The project's researchers believe their work could lead to the deployment of a fiber-optic network penetrating 100 million U.S. homes within a few years, while Stanford computer scientist Nick McKeown estimates that the network's 100 Mbps data transfer rate could constitute a roughly 2,000-fold boost over dial-up Internet access and an approximately 100-fold boost over DSL. High-tech industry leaders also think the network could support enough broadband access to inject $500 billion into the economy. High-definition video on demand could be delivered over this network, because the file size limitation on data transmission would be eliminated. McKeown's role is to design the backbone network and possibly 100 major regional switching nodes tasked with directing communications traffic across the United States. The effectiveness and efficiency of the next-generation Internet can only be realized if the network incorporates greater reliability and security, can accommodate as-yet unthought-of applications, boasts long-term economic sustainability, and is simpler to use and operate. McKeown says the current Internet's susceptibility to denial of service attacks and viruses is a design flaw. The next step in the research project is to define the new Internet using security, economics, and network research criteria.
    Click Here to View Full Article

  • "Study Points Out Costs of Computer Disposal"
    CNet (09/29/03); Sharma, Dinesh C.

    A new report from research company Gartner shows that companies can recoup 3 percent to 5 percent of the original price of old computer equipment by selling the parts, but companies also should expect to spend anywhere from $85 to $136 to dispose of an outdated PC. The cost of disposing of an old PC covers everything from disconnecting the unit from a network, erasing sensitive data from a hard drive, reloading operating systems, and testing equipment to paperwork, packing, shipping, and handling. International environmental groups and state governments have pressured computer manufacturers such as Dell and Gateway into establishing programs for recycling old PCs. Earlier in September, another Gartner study revealed that companies can avoid direct computer-related costs by holding onto PCs for four or more years, but they will face indirect costs, including lost productivity and downtime. "Many enterprises have paid a high price in costs, regulatory fines, bad publicity and even litigation, when their PCs turned up in landfills or third-world countries, or when confidential data was recovered from hard drives that had not been properly sanitized," Gartner research director Frances O'Brien said in a statement.
    Click Here to View Full Article

  • "Developers Blaze Their Own Trail"
    InfoWorld (09/29/03) Vol. 25, No. 38, P. 36; Knorr, Eric

    The 2003 InfoWorld Programming Survey of 804 programmers and their managers concludes that Web applications dominate the industry, even though Microsoft and others insist that developers should switch to fast desktop clients: 80 percent of respondents report that such apps are a key component of their server development, while 53 percent say they favor apps with a Web-style user interface. Moreover, a significant percentage of programmers, in defiance of assertions that Microsoft .Net and J2EE rule the programming roost, prefer to base their Web apps on simple languages such as Perl, JavaScript, and VBScript. Fifty-one percent of those polled note that their server development includes Web services, and 52 percent are using XML or object-oriented databases. Certain IT shops make it standard practice to use the same tools, languages, and frameworks for mission-critical apps as for departmental apps designed to increase productivity and satisfy short-term business demands, but the survey shows that others accept a certain loss of control by using tools and techniques more tightly aligned to the job at hand. Most respondents indicate that the biggest barriers to software reuse are the effort required to design reusable software or a lack of awareness of available software; 44 percent declare themselves satisfied with current reuse levels while 41 percent express dissatisfaction. Sixty-nine percent call "shared libraries" highly reusable, while 42 percent classify "components" and 21 percent list dynamic languages as highly reusable. "People build applications using scripting tools because the tools are so easy to get going and because they deliver functionality to business users so fast," notes an anonymous consultant. "The downside is that they do not require the process and discipline of the more robust applications, so maintenance tends to be very difficult if not impossible."
    Click Here to View Full Article

  • "Female IT Professionals Cope in a Male-Dominated Industry"
    Network World (09/29/03) Vol. 20, No. 39, P. 7; Messmer, Ellen

    Female professionals account for 25.3 percent of America's IT workforce of 3.6 million employees, according to the Information Technology Association of America (ITAA), and women have found competing in a male-dominated industry to be a formidable struggle. At the recent Auto-Tech conference, Nick Andreou of General Motors remarked that some cultures frown on placing women in positions of authority, and that even discussing the problem is taboo. This issue was mentioned at the Executive Women's Forum by Trend Micro co-founder Eva Chen, who noted that she had to stay low-key on business trips in Japan, where gender bias is rampant. Andreou pointed out that such discrimination is detrimental to business productivity and to the global business relationships companies now depend on. At the Executive Women's Forum, Oracle's Mary Ann Davidson said that women executives should check their own behavior and not make things tough on others because their own work experiences were hard, while CYA Technologies CEO Elaine Price explained that female workers need a thick skin, especially in the IT salesforce. Sanctum CEO Peggy Weigle advised women to relate their ideas statistically, a practice appreciated by men. Adaptability is another factor: Many women at the conference reported that they had to engage in male-oriented activities, such as golf and drinking at bars, in order to gain credibility among their male peers. A May ITAA report on gender and race suggested that women are lagging in IT partly because of parenting responsibilities, while Stanford computer science professor Eric Roberts observes that many women shun careers in computer science in favor of other positions that offer more human interaction. Meanwhile, a Sheila Greco Associates survey estimates that women make up only 13 percent of IT vice presidents and CIOs, and earn about 9 percent less than men.

    The activities of ACM's Committee on Women in Computing can be viewed at http://www.acm.org/women.

  • "Supercomputing Horizons"
    Computerworld (09/29/03) Vol. 31, No. 45, P. 33; Anthes, Gary H.

Vincent F. Scarafino, manager of numerically intensive computing at Ford Motor, warns that the United States' supercomputing effort is in danger of falling far behind Japan's, and that this could lead to a serious lag in U.S. science and engineering unless the federal government steps up supercomputing research and development funding. He says that his company's competitive edge is maintained through early access to leading-edge supercomputers, and such access will be lost if the United States cedes the supercomputing lead to another nation. Scarafino adds that though American scientists can access Japan's Earth Simulator--currently the world's fastest supercomputer--Japanese interests are the project's primary beneficiaries. He observes that the government has de-emphasized the design of high-performance supercomputer architectures in favor of commodity clusters, but argues that commodity clusters are not only costly, but unsuited to solving the most computationally intense problems. Scarafino says the federal government should finance high-end processor design and supporting chip elements, with the overall objective being super-fast processors whose memory and I/O systems are matched to their computational speeds. He projects that a computer with 1,000 times the power of current models would allow Ford to predict how car occupants could be injured in accidents, simulate full vehicle lifetimes to bolster durability analysis, model the performance of exotic materials in vehicles, and explore wider variations in design specifications so that competing design requirements can be balanced while the design cycle is shortened.

  • "Leaping, Then Looking"
    Baseline (09/03) No. 22, P. 17; Dignan, Larry

Few executives are taking time to consider IT outsourcing's potential ramifications--on worker morale, the American economy, national security, and other areas--before moving software development overseas in order to cut costs and remain competitive. "From a free-market point of view when you have talent around the world that's less expensive, you should be able to hire that talent," contends George Mason University finance professor Gerald Hanweck. Gartner predicts that one in 20 jobs at U.S.-based IT product and services companies will be outsourced by the end of next year, and Forrester reckons that 3.3 million services jobs will move overseas over the next 15 years; meanwhile, Economy.com conservatively estimates that $50 billion in wages, $60 billion in economic activity, and $13.4 billion in taxes will be lost with the expected outsourcing of around 800,000 back-office jobs through 2008, assuming a typical worker earns $60,000 annually. Free-market advocates claim that the U.S. economy will adjust to the changing job landscape: For one thing, a large segment of American programmers and tech execs will reach retirement age in the next 12 years, and there will be fewer domestic replacements to take up the slack. Ronil Hira of the Rochester Institute of Technology is concerned that the outsourcing movement is proceeding faster than the labor market can adapt, and is especially worried that cutting-edge research and development will migrate overseas. He urgently recommends that outsourcing be studied in detail before it is too late for the U.S. economy to catch up, emphasizing such points as determining the number of tech jobs that are outsourced, the quality of services, and compensation for workers driven into unemployment by outsourcing. Cadence Design Systems CEO Ray Bingham argues that the short-term disadvantages of outsourcing will be offset by gains in U.S. workers' productivity in response to the international competition, while at the same time insisting that "core" R&D--at least for his company--will remain domestic.
