
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 615:  Monday, March 8, 2004

  • "Short-Lived PCs Have Hidden Costs"
    Wired News (03/08/04); Leahy, Stephen

    A new book from Tokyo's United Nations University, "Computers and the Environment," argues that extending computers' usable lifetime would reduce their environmental impact, which is much larger than previously estimated. The book calculates that an average desktop PC devours about as much energy as a refrigerator over its lifespan, while manufacturing one consumes 1.8 tons of water, chemicals, and fossil fuels. Smaller and faster PCs are exacerbating the situation, because fabricating their more complex and sophisticated parts consumes more resources, according to co-editor Eric Williams, who calculates that a memory chip weighing just 2 grams eats up 1.3 kilograms of fossil fuels and materials. Usually only metals are recovered when a computer is recycled, while the components and plastics that cost so much energy to manufacture are destroyed; reselling or upgrading a computer can save five to 20 times more energy than recycling. The Silicon Valley Toxics Coalition estimates that the average lifespan of current computers is two years, and Sheila Davis of the SVTC reports that reuse is becoming increasingly difficult thanks to software licensing issues, the need for technical support, and incompatible components. Williams' recommendations for lowering the environmental costs of PCs include using a machine as long as possible, then donating or selling the unit and buying a previously owned machine; he also recommends keeping the PC off at night and checking that stand-by modes are operational. Williams thinks tax laws should be set up to give users an incentive to buy used machines. Davis notes that major businesses, institutions, and governments are planning to purchase more environmentally friendly computers, while the EPA is developing an "ecolabel" for machines made with less hazardous materials and processes.
    Click Here to View Full Article

  • "Science Tries to Woo Women"
    Financial Times (03/08/04) P. 6; Harvey, Fiona

    Addressing the underrepresentation of women in IT, engineering, and other scientific careers is essential to the modern economy, according to a March 8 report from the Institute of Physics, and Elizabeth Cannon at the University of Calgary contends that the lack of women pursuing science careers stems from a paucity of exposure to role models and to the creative aspects of science. She recommends instituting courses in which female students meet with mentors on a regular basis, and augmenting classes with hands-on opportunities. Sainsbury management fellow and engineering graduate Catherine Bright notes that schoolchildren should be taught that engineering work is especially appealing for people who feel a strong need to contribute, given the responsibility such jobs entail and the immediate impact they generate. Nick Kalisperas, senior program manager for Intellect's Women in IT Forum, comments that there is a profound shortage of women in the technology sector, pointing out that women comprise only 25 percent of IT graduates and a fifth of IT employees. "If you regard technology as the driver behind the knowledge economy, it's important that more women enter the IT workforce and stay there," he says. Employers must become more knowledgeable about women's needs, such as raising a family, and offer more flexible work patterns to accommodate those needs. There are also arguments that women in industrialized nations should be permitted to opt out of IT jobs, especially when such work is being offshored. Kalisperas, meanwhile, says that there will always be a need for highly competent IT personnel, such as project and program managers. "If we aspire to be a high-value knowledge economy and compete on a global scale, we need a workforce that is developing more skills and a wider diversity of skills," he maintains.
    Click Here to View Full Article

    To learn about ACM's Committee on Women in Computing, visit http://www.acm.org/women.

  • "What's Good About Computer Viruses"
    TechNewsWorld (03/05/04); Stresing, Diane

    The growing threat of computer viruses and other forms of malware has inspired security researchers to turn to the human immune system as a model for more bug-resistant computing. University of Virginia computer science professor David Evans, who has several projects funded by the National Science Foundation, notes that computer programs may never be able to completely imitate biological systems, but biological defenses such as diversity and decentralization could be studied to build systems with enhanced scalability and robustness. "I think the future of computing will depend on both directions of research--changing the way systems are built to make them intrinsically less vulnerable, and developing immune systems for mitigating attacks," he explains. Despite the increasing craftiness and hostility of virus writers and the damage potential of their malware, there is one positive side to the threat: Victims of such bugs usually become more eager to adopt "healthy" antivirus practices such as downloading patches and upgrading security regularly. For example, Symantec's count of unique online visitors tripled in the week the SoBig worm struck. Sana Security founder and chief scientist Steve Hofmeyr concedes that designing computers to mimic the immune system is not a simple task, partly because a great deal remains unknown about the immune system's mechanisms. But he values the pursuit of biologically inspired computing, not only because it helps nurture self-repairing computer systems, but also because it yields more user-friendly systems. His company offers a server-based software system designed to acquire knowledge from both user actions and external stimuli, much like the human immune system.
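
    Hofmeyr's earlier academic work modeled "self" as the set of short system-call sequences a program emits during normal operation, flagging unfamiliar sequences as potential attacks. Below is a minimal Python sketch of that idea; the window size, the scoring scheme, and the traces are illustrative assumptions, not details from the article:

        # Immune-system-style anomaly detection: learn the set of short
        # system-call sequences seen in normal runs ("self"), then score
        # new traces by how many of their windows were never seen.

        WINDOW = 4  # illustrative window size

        def ngrams(calls, n=WINDOW):
            return {tuple(calls[i:i + n]) for i in range(len(calls) - n + 1)}

        def train(normal_traces):
            """Build the 'self' set from traces of normal behavior."""
            self_set = set()
            for trace in normal_traces:
                self_set |= ngrams(trace)
            return self_set

        def anomaly_score(trace, self_set):
            """Fraction of windows in a new trace not seen in training."""
            seen = ngrams(trace)
            if not seen:
                return 0.0
            return len(seen - self_set) / len(seen)

        normal = [["open", "read", "write", "close", "open", "read"]]
        profile = train(normal)
        suspect = ["open", "exec", "socket", "write", "close", "open"]
        print(anomaly_score(suspect, profile))  # high score -> possible attack
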
    Click Here to View Full Article

  • "Competing Technologies Shake Up E-Mail"
    IDG News Service (03/08/04); Roberts, Paul

    Experts anticipate a shakeout of competing antispam technologies supported by Microsoft, Yahoo!, America Online, and others in the next few months. Microsoft announced Caller ID, a new email authentication architecture, at the recent RSA Conference: Caller ID requires email senders to publish the IP addresses of their outgoing email servers so that receiving email servers and clients can verify that a message's source IP address matches an authorized sending server; Caller ID is being tested by Sendmail, and CEO Dave Anderson said his company intends to develop an open-source plug-in Sendmail filter that interoperates with the Caller ID scheme. Caller ID bears a similarity to Pobox.com researcher Meng Wong's Sender Policy Framework (SPF), which AOL is testing among its 33 million users. Sendmail, believing different authentication schemes can coexist, is also testing DomainKeys, a Yahoo!-supported authentication technology that seeks to thwart email spoofing by using public/private-key cryptography to give each email address an individualized signature based on data in the message header. Anderson said both the source and the content of messages can be authenticated with DomainKeys, although senders must deploy a public key infrastructure. The Email Service Providers Coalition (ESPC) is organizing trials of the various email authentication schemes. Constant Contact CEO Gail Goodman and ESPC member Hans Peter Brondmo say the coalition also wants a "reputation system" built that embeds accountability in email. Brondmo foresees a "jostling" between rival authentication technologies in the coming months, which could have a positive impact by accelerating standardization of an authentication infrastructure and thereby letting organizations focus on refining the architecture and universal availability of secure email.
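
    Both Caller ID and SPF rest on the same basic check: the receiving side compares the IP address of the connecting mail server against a list the sender's domain has published in DNS. A simplified Python sketch of that logic follows; the published record below is invented for illustration, and a real implementation would fetch it from a DNS TXT record:

        # Sketch of the SPF/Caller ID idea: a domain publishes the addresses
        # of its authorized outgoing mail servers, and receivers reject mail
        # whose connecting IP is not on that list.

        import ipaddress

        # Hypothetical published record for example.com (real systems
        # would look this up in DNS).
        PUBLISHED = {"example.com": ["192.0.2.0/24", "198.51.100.25"]}

        def sender_authorized(domain, connecting_ip):
            ranges = PUBLISHED.get(domain)
            if ranges is None:
                return None  # domain publishes nothing; result is neutral
            ip = ipaddress.ip_address(connecting_ip)
            for entry in ranges:
                if ip in ipaddress.ip_network(entry, strict=False):
                    return True
            return False

        print(sender_authorized("example.com", "192.0.2.77"))   # True
        print(sender_authorized("example.com", "203.0.113.9"))  # False -> likely spoofed
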
    Click Here to View Full Article

  • "E-Voting Field Test"
    Washington Post (03/04/04); Webb, Cynthia L.

    The Super Tuesday vote gave both e-voting critics and supporters more circumstantial evidence to bolster their respective cases: Nationwide, the media reported on potential problems, complaints, and positive feedback from voters. Many of the failures in electronic voting systems, which were used on an unprecedented scale that day, were due to human error, such as delays in system initialization; San Bernardino County, Calif., registrar Scott Konopasek said the computer program that tallied the votes should have been started much earlier in order to get a timely vote count, though in the end all the votes were indeed counted. Diebold's David Bear insisted that none of the isolated errors prevented people from voting, though low batteries in some machines caused an unfamiliar screen to show up, confusing poll workers. The Associated Press, however, said that electronic delays of up to two hours in California's San Diego County forced some voters to travel to other polling locations where paper ballots were used. Johns Hopkins University computer science professor and e-voting researcher Avi Rubin observed voting stations in Maryland and noticed that the Diebold systems relied on a single machine to collect votes from other machines' PCMCIA cards and then transmit the data to headquarters via a phone line; inspection of the code showed there was no encryption on the connection, and the phone connection itself was dysfunctional at the location Rubin observed. He also noted that the use of just one machine, a "zero" machine, to collect and send voting data gave potential hackers a single target. Rep. Rush Holt (D-N.J.) said the inconsistencies revealed on Super Tuesday foreshadowed even greater problems in November unless a verifiable paper trail is produced: Failure to implement a trustworthy and verifiable system would result in a double failure of voter trust, he warned.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

    To read about ACM's activities and concerns regarding e-voting, visit http://www.acm.org/usacm.

  • "No Riders: Desert Crossing Is for the Robots Only"
    New York Times (03/08/04) P. C1; Markoff, John

    Impatient with military contractors' lack of progress in creating automated combat vehicles to be deployed in battlefield operations by 2015, the Pentagon has enlisted the help of computer scientists, artificial intelligence experts, and robot enthusiasts by inviting them to build robotic vehicles for an off-road race from Barstow, Calif., to Las Vegas on March 13, with a $1 million prize going to the first unmanned vehicle to reach the finish line. Once the contest begins, the robot vehicles must run the course without human assistance, relying only on their sensors and on-board computers to guide them safely. The course will remain secret until two hours before the race, when the Defense Advanced Research Projects Agency will hand each competitor a series of approximately 5,000 global positioning satellite waypoints, giving the teams only a short time to plot optimal routes. Some teams hope to anticipate potential routes and their terrain by purchasing ultrahigh-resolution digital imagery of the desert region where the race will take place. Among the designs that will compete are a robotic Hummer built by Carnegie Mellon University's Red Team and a self-guiding Ghostrider motorcycle designed by students at the University of California at Berkeley. However, Hans Moravec of CMU is doubtful that any vehicle will successfully complete the race, including the Red Team's. The race will be monitored by military helicopters, and a chase vehicle will shadow each robot with a kill switch at the ready should the vehicle enter a hazardous situation. Most experts praise the Pentagon's strategy of promoting and advancing robotics technology--and its own military goals--in an inexpensive way through the contest.
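
    To give a sense of the route-plotting task, here is a toy Python calculation of the length of a waypoint sequence using the haversine great-circle formula; the coordinates are invented stand-ins roughly along the Barstow-to-Las Vegas corridor, not actual race waypoints:

        # Toy illustration: total length of a GPS waypoint sequence,
        # using the haversine great-circle distance between fixes.

        from math import radians, sin, cos, asin, sqrt

        EARTH_RADIUS_KM = 6371.0

        def haversine_km(p, q):
            lat1, lon1, lat2, lon2 = map(radians, (*p, *q))
            a = sin((lat2 - lat1) / 2) ** 2 + \
                cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
            return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

        def course_length_km(waypoints):
            return sum(haversine_km(a, b) for a, b in zip(waypoints, waypoints[1:]))

        # Hypothetical fixes between Barstow and Las Vegas.
        route = [(34.90, -117.02), (35.10, -116.60), (35.47, -115.85), (36.00, -115.17)]
        print(f"{course_length_km(route):.1f} km")
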
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Dissolving the Limits of Linux: The Breakneck Evolution Continues"
    LinuxWorld (03/07/04); Pettit, Jason

    Linux is evolving faster than any previous operating system, leaping from an alternative platform for non-core components to an increasingly viable candidate for even the most challenging scientific applications. Attendees at this year's LinuxWorld Conference & Expo no doubt were surprised by the operating system's rapid rise to the top echelons of enterprise technology. Many vendors and users now see Linux as the POSIX-based "one Unix" that can unify operating environments across various enterprise applications and hardware. The last impediment to Linux adoption throughout the corporate world seems not to be technology, but rather company bureaucracy and inertia. A number of converging trends have helped push Linux so far so fast, including the development of 64-bit processor technology that lets Linux hobbyists contribute to an ever-stronger Linux server market--but though these relatively commoditized systems expand Linux's market share, they do little to push the operating system's technical limits. Those advances are taking place in the high-performance computing (HPC) realm, where new Linux systems are performing admirably in terms of memory availability and I/O bandwidth. NASA Ames Research Center recently created the largest Linux supercomputer ever with 512 processors, and that system was also the first to break 1 terabyte per second on the STREAM Triad memory-bandwidth benchmark. This type of HPC use shows community-driven Linux is advancing at a faster pace than even proprietary solutions. ISVs are also contributing to the growing number of 64-bit software applications for Linux, making it a better candidate for operating-environment unification.
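
    The STREAM Triad benchmark cited here measures sustained memory bandwidth with a deliberately simple kernel, a(i) = b(i) + q*c(i), counting the bytes moved per second. A minimal Python/NumPy sketch of the measurement follows; the array size is illustrative, and the official benchmark is written in C or Fortran:

        # Minimal sketch of the STREAM Triad kernel, a[i] = b[i] + q*c[i],
        # which measures sustained memory bandwidth in bytes per second.

        import time
        import numpy as np

        N = 20_000_000              # illustrative size (~160 MB per array)
        q = 3.0
        b = np.random.rand(N)
        c = np.random.rand(N)
        a = np.empty_like(b)

        t0 = time.perf_counter()
        a[:] = b + q * c            # the Triad operation
        elapsed = time.perf_counter() - t0

        # STREAM counts three 8-byte arrays moved per element (read b,
        # read c, write a); NumPy's temporaries add extra traffic, so
        # treat the figure as a rough lower bound.
        bytes_moved = 3 * N * 8
        print(f"STREAM Triad: {bytes_moved / elapsed / 1e9:.2f} GB/s")
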
    Click Here to View Full Article
    (Access to this site is available to paid subscribers only.)

  • "E-mail's Past Could Point to Future of Instant Messaging"
    IDG News Service (03/04/04); O'Connor, Fred

    The future of instant messaging could lie in the history of email, noted panelists at the Instant Messaging Planet Conference on March 3. Kieran McCorry of Hewlett-Packard's technology leadership group likened IM's situation to that of email 10 years ago, which was characterized by competing standards and a shortage of compatibility: "Today, companies may have four different IM programs on their desktop, because one client uses AOL, the office uses an enterprise program and their spouse uses ICQ," he explained. Parlano CTO Bob Serr added that IM, like email, has initially found favor among enterprise users because of security and other concerns. Two ways to address the IM interoperability challenge were proposed by the panelists: the installation of multiple IM clients, including free IM programs and their company's enterprise programs, on the desktop; or IM providers entering into an agreement to make their contact lists available to each other. Serr said the heterogeneous IM environment will prevail until an IM standard emerges, and some panelists believed that enterprise IM software will be standardized ahead of free public services. IMlogic's Jon Sakoda contended that this will happen because large enterprises have the financial resources to buy the software and more clout than consumers in pushing for standards. Panelists were of the opinion that IM probably has greater value as a tool for business rather than personal communication. Microsoft's Paul Haverstock said more easy-to-use IM programs should be developed and deployed, and that users should be allowed to build groups of IM contacts out of existing addresses. He also called for the creation of software that can smoothly switch from IM conversations to voice-over-IP calls, and for the insertion of additional user information in IM user names.
    Click Here to View Full Article

  • "New Research Field Focuses on Video Games"
    Michigan Daily (03/04/04); Joling, Anne

    Video games are increasingly becoming a subject of research at colleges and universities across the country. The list of schools offering classes and programs on video games includes MIT, Purdue University, Ohio State University, and Princeton University. At the University of Michigan, communications professor Dmitri Williams teaches a class that covers economics, media history, and other topics in relation to video games; and communications and psychology professor Brad Bushman is studying the impact of violence in video games, which is the most popular area for academic research. Meanwhile, electrical engineering and computer science professor John Laird is studying new ways to create video games. "The research my group does on computer games is to use computer games as an environment for testing out ideas on building artificial intelligence characters, as well as exploring new types of games," Laird explains. "By adding artificial intelligence characters, it might be possible to make computer games that are more of a synthesis of interaction and plot-driven stories." Video games are an enormous market: game makers grossed more than Hollywood last year.
    Click Here to View Full Article

  • "Scientists Debate Success of Los Alamos Supercomputer"
    Associated Press (03/02/04)

    Officials at Los Alamos National Laboratory believe the decision not to pursue the final phase of the Q machine was rooted in politics and congressional budgetary concerns rather than in the performance of the supercomputer. The Department of Energy has decided not to expand Q, which is the second-fastest supercomputer in the world. Although the federal government says the Q machine is prone to failure, lab officials say the supercomputer works well. The group of scientists that reviewed Q for the federal government reported that the supercomputer has design flaws in its microprocessors. "Consequently, the machine failed much more rapidly than one would expect," says Darrell Long, a computer science professor at the University of California at Santa Cruz who participated in the review. Moreover, the company that made the 8,192 microprocessors has been sold, and the microprocessors have been discontinued. Theoretically, the Q machine was powerful enough to perform 20 trillion operations per second, and the next phase would have given it the capability to handle 30 trillion operations per second. Q was supposed to comprise three linked supercomputers, each handling a third of the computing power.
    Click Here to View Full Article

  • "Where Computer Display Technology Is Headed"
    CIO Update (03/01/04); Robb, Drew

    The way we regard computing could change significantly with a quintet of emerging display technologies. Organic light emitting diodes (OLEDs) are not backlit and boast a microsecond response time that avoids image smearing, giving them image quality and contrast superior to liquid crystal displays (LCDs); "OLEDs offer wide viewing angles and very fast response times so they pick up changes in video images without producing jagged edges," notes DisplaySearch's Barry Young. However, the technology will not be incorporated into computers until the displays' lifetime is extended, and the tendency for static images to burn in over time also needs to be addressed. 3D displays are already available from vendors such as Sharp Electronics and NTT DoCoMo: Advantages include the ability to see 3D images without special glasses or headgear, while disadvantages include limited viewing angles; Ian Matthews of Sharp Systems of America says 3D display products such as Sharp's RD3D notebook are well suited to molecular modeling, computer-aided design, and medical imaging. Field emission displays combine the brightness of cathode ray tubes (CRTs) with the space efficiency of LCDs, allowing manufacturers to build displays with CRT-like image quality and the thinness of LCD screens. Digital Light Processing from Texas Instruments employs a chip with 1.3 million tiny mirrors that builds images by flipping each mirror into an on or off position when light shines on the chip; as many as 1,024 levels of brightness can be generated by varying the amount of time each mirror spends in the active and inactive modes. The fifth technology is bistable displays, which are well suited to displaying mostly static information--the displays do not need a continuous source of power, and they retain images even after power is removed.
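
    The arithmetic behind those 1,024 levels is simple binary-weighted pulse-width modulation: 10 bits of on/off timing per frame give 2^10 gray levels. A toy Python sketch follows; the 10-bit figure comes from the article, while the schedule and duty-cycle math are illustrative:

        # Sketch of DLP-style pulse-width modulation: each mirror is only
        # ever fully on or off, and perceived brightness is the fraction
        # of the frame time it spends on. 10 bits -> 2**10 = 1024 levels.

        BITS = 10
        LEVELS = 2 ** BITS  # 1024

        def mirror_schedule(level):
            """Binary-weighted on/off time slices for one frame (MSB first)."""
            assert 0 <= level < LEVELS
            return [(2 ** bit, bool(level & (1 << bit)))
                    for bit in reversed(range(BITS))]

        def duty_cycle(level):
            on_time = sum(w for w, on in mirror_schedule(level) if on)
            return on_time / (LEVELS - 1)

        print(duty_cycle(512))   # ~0.5: mirror on about half the frame
        print(duty_cycle(1023))  # 1.0: full brightness
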
    Click Here to View Full Article

  • "More Work Needed for Biometrics"
    Federal Computer Week (03/01/04); Frank, Diane

    Officials say that work is still needed on standards, interoperability, and testing before biometrics can be used as a mature government security solution. Commercial solutions vary in accuracy and reliability, and Martin Herman, chief of the National Institute of Standards and Technology's Information Access Division, says the institute needs to develop an independent effectiveness test for biometric solutions. The Homeland Security Department's U.S. Visitor and Immigrant Status Indicator Technology program uses a fingerprint scanner, but the quality of the original scan and of the scan at the time of use affects the solution's reliability. The White House's Office of Science and Technology Policy coordinates much of the research and standards work and brings together government experts, according to OSTP senior policy analyst Kevin Hurst. Subgroups are working on integrating different biometrics into one solution for greater security, improving the human interface, addressing privacy and legal issues, and developing an international testing and evaluation structure. Meanwhile, government agencies attempting to implement various biometric systems must decide which technologies to use. The Department of Veterans Affairs plans to start issuing more than 500,000 smart cards to users this summer. The cards will feature digital certificates, while only a subset of users will be given cards with higher-level biometric capabilities; the agency is considering both iris- and fingerprint-scanning technologies.
    Click Here to View Full Article

  • "States Join Spyware Battle"
    CNet (03/04/04); Borland, John

    State legislatures are beginning to regulate spyware and other advertising software--a trend that may please consumers but worries some Internet businesses. "Our concerns are [about] regulating a technology rather than regulating the use of a technology," says Internet Alliance executive director Emily Hackett. Spyware is generally considered software that tracks users' online actions or uses PC resources to display advertisements, and it can be hard to find and uninstall. Utah's new law, though not yet signed by the governor, would bar companies from installing software that reports users' online actions, sends personal data to other companies, or displays ads without permission. Cookies and ads that use HTML or JavaScript are not covered. An Iowa bill concentrates more on software that collects and transmits data without consent. Privacy groups concerned about adware and spyware are nonetheless uncertain about the state bills. Center for Democracy and Technology associate director Ari Schwartz says that "to us, it makes sense to have a general [federal] privacy bill." California is considering two related bills: one that seeks to prevent advertisers from using an individual's computer without permission, and another aimed at stopping the transmission of user information via spyware. Federal lawmakers have also introduced legislation to prevent software from commandeering a user's computer.
    Click Here to View Full Article

  • "Winning Ways to Stop Spam"
    Computerworld (03/01/04) Vol. 32, No. 9, P. 21; Hall, Mark

    Even as the amount of spam email rapidly increases, some IT managers have successfully reclaimed email for their companies using various methods. Wyndham International manually updated a content filter to keep out unwanted messages, but found its efforts overwhelmed in late 2002, says electronic messaging systems manager Lyndon Brown; the hotel chain switched to a standalone antispam appliance that Brown says also netted too many legitimate emails. Finally, MailFrontier Enterprise Gateway software was chosen to work alongside Wyndham's Lightweight Directory Access Protocol directory server. Besides regularly updating spam information from MailFrontier, the system learns individual user preferences and maintains "honey pot" addresses that fingerprint bogus email. Flooring maker Congoleum deployed iHateSpam filter software on its Microsoft Exchange email server, and network engineer Wayne Neville says that despite the additional software load, the server actually performs better than before because fewer resources are spent on spam. Pinkerton Computer Consultants uses an outsourced service from FrontBridge Technologies that applies 18,000 proprietary rules to incoming messages; the goal of the rules is to identify legitimate emails and allow those to pass through, rather than to tag spam as in most other antispam systems. Pinkerton infrastructure engineering manager Clif Sevachko says an additional benefit is that virus-laden emails are kept on FrontBridge's off-site servers, protecting his own organization from "zero-day" virus exploits. Ingersoll-Rand was also using an outside provider, but with rates tied to the number of messages, antispam costs were getting out of hand: E-messaging manager Doug MacLeod says his company instead bought two hardware appliances from CipherTrust that have saved $1.5 million in spam-fighting costs to date. The appliance-based solution requires some customization, such as setting up white lists, but the result is worth the effort, says MacLeod.
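
    As a rough illustration of the design difference described above--positively identifying legitimate mail first, rather than only tagging spam--here is a toy allowlist-plus-rules filter in Python; the domains, rules, and threshold are invented, not details of any vendor's product:

        # Toy sketch of an allowlist-first filter: pass known-good senders,
        # then apply scoring rules, quarantining anything over a threshold.

        ALLOWLIST = {"partner.example.com", "customer.example.org"}

        RULES = [
            (lambda m: "free money" in m["body"].lower(), 5),
            (lambda m: m["subject"].isupper(), 2),
            (lambda m: m["from_domain"] not in ALLOWLIST, 1),
        ]
        THRESHOLD = 4

        def classify(message):
            if message["from_domain"] in ALLOWLIST:
                return "deliver"  # positively identified as legitimate
            score = sum(points for rule, points in RULES if rule(message))
            return "quarantine" if score >= THRESHOLD else "deliver"

        msg = {"from_domain": "bulk.example.net",
               "subject": "FREE MONEY NOW",
               "body": "Free money for you!"}
        print(classify(msg))  # quarantine
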
    Click Here to View Full Article

  • "MEMS Reliability Key to Acceptance"
    Small Times (02/04) Vol. 4, No. 1, P. 23; Karoub, Jeff

    Increasing the acceptance of microelectromechanical systems (MEMS) technology is a tough challenge: Despite evidence that MEMS devices are reliable, their long gestation period for achieving dependable mass production has discouraged users, according to many attendees of last September's MEMS Technology Roadmap and Industry Congress. Cleo Cabuz, executive director of the MEMS Industry Group (MIG), notes that users want hard evidence, gathered under accelerated testing conditions, that MEMS devices will perform reliably over each unique application's lifetime. According to a survey incorporated into the "2004 MIG Industry Report: Focus on MEMS Reliability," 90 percent of polled device manufacturers say their clients want a demonstration of reliability, while between 70 percent and 100 percent of respondents require the devices to function reliably for at least a decade. Reliability demonstrations are complicated by the devices' complexity, the diverse materials they are constructed from, and the variety of manufacturing processes and applications. Furthermore, existing reliability tests and data may vary across products, or may be withheld for competitive reasons. Vendors contend that clearly and concisely communicating the considerable time and financial investment required to bring MEMS applications to market will help overcome false perceptions about MEMS reliability. MIG is developing a design-for-manufacturability program with the Defense Advanced Research Projects Agency, and collaborating with UC Berkeley's Sensor and Actuator Center on teaching manufacturable MEMS.

  • "Smart-Label Revolution"
    Discover (03/04) Vol. 25, No. 3, P. 24; Johnson, Steven

    Radio frequency identification (RFID) technology has the potential to dramatically increase consumers' level of control over business transactions. Affixed to objects, RFID tags consist of small, inexpensive chips and antennas that broadcast data about the objects to a receiver; the chips require no batteries and are powered by electrical fields or magnetic energy discharged by the receiver. RFID technology may make it possible for customers to pass through checkout lanes without stopping because all their selected items would be instantly charged to their credit cards, while the store's inventory department would be advised to restock the purchased goods at the same time. Even more interesting is the way filters could be used with RFID technology to empower customers and discourage companies from selling poor-quality products: For example, a shopper watching his cholesterol intake could shop for groceries with an RFID reader equipped with a cholesterol filter, while consumers could use filters to access online opinions of products on the spot to avoid bad purchases. "The fact that a given item is number 123 in the RFID database needs to be legally considered a public fact," notes Cory Doctorow with the Electronic Frontier Foundation. "If you limit the public's ability to make a connection between a product and an RFID tag, then all of these filtering technologies become much harder to pull off." RFID is also drawing fire from privacy advocates who fear the technology could be employed as an invasive corporate or government surveillance tool. However, RFID chip technology is following a cycle typical of many disruptive consumer technologies, whereby public debate is split between critics who portend terrible consequences and proselytizers who promise dreamy applications.
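
    To make the filtering idea concrete, here is a hypothetical Python sketch of the cholesterol filter described above: the reader resolves each tag against a public product database and flags offending items. All tag IDs and product data here are invented:

        # Hypothetical RFID shopping filter: a reader resolves each tag ID
        # against a public product database and flags items that fail the
        # shopper's criteria. All data is invented for illustration.

        PRODUCT_DB = {
            "tag-00123": {"name": "butter",     "cholesterol_mg": 31},
            "tag-00124": {"name": "oat cereal", "cholesterol_mg": 0},
            "tag-00125": {"name": "cheddar",    "cholesterol_mg": 28},
        }

        def cholesterol_filter(scanned_tags, limit_mg=10):
            """Warn about scanned items above the shopper's cholesterol limit."""
            for tag in scanned_tags:
                item = PRODUCT_DB.get(tag)
                if item is None:
                    continue  # tag not publicly resolvable -- the EFF's concern
                if item["cholesterol_mg"] > limit_mg:
                    yield item["name"], item["cholesterol_mg"]

        for name, mg in cholesterol_filter(["tag-00123", "tag-00124", "tag-00125"]):
            print(f"warning: {name} has {mg} mg cholesterol per serving")
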
    Click Here to View Full Article

  • "Rethinking Network Security"
    Business Communications Review (02/04) Vol. 34, No. 2, P. 16; Phifer, Lisa

    Overcoming the problems of network security and lowering the dangers presented by worms, Trojans, and other kinds of malware that so plagued Internet users last year will require a coordinated, multi-pronged approach that involves everyone. "Perimeter defense as the sole or primary means of protecting an organization is collapsing, especially as more and more organizations allow partners and customers to connect to them," notes Ian Poynter of Bit 9. Organizations stand a much better chance of recovering from network disruptions with a solid network security incident and disaster response strategy in place--one that is updated regularly, is known to all affected parties, and is practiced to confirm feasibility and completeness. Core Competence's David Piscitello argues that improving software processes and designing more secure code is a much more effective measure than merely blocking malware, and suggests developers could be spurred to focus on such improvements by threats of product boycotts. Another recommendation is to diversify software systems, which would avoid an operating-system monoculture. Everyone should practice stronger password protection, while larger companies would do well to supplant password authentication with digital signatures, two-factor authentication, and other stronger options. Halting problems at the point of entry is more efficient and cost-effective than picking up the pieces after an infection has occurred, and this can be done by centralizing endpoint security controls and raising the security intelligence of network servers. Many experts contend that ISPs should add safeguards--spam filters, virus scanners, etc.--to lower risks to their customers in the hopes of staving off threats to the Internet that may stem from network users, while designing networks and systems for high availability is another key stratagem for fortifying network security.

  • "Translation in the Age of Terror"
    Technology Review (03/04) Vol. 107, No. 2, P. 54; Erard, Michael

    The 9/11 tragedy has spurred intelligence agencies to overhaul their efforts to translate the massive amounts of information they collect in order to anticipate and thwart future terrorist attacks. The National Virtual Translation Center will serve as the nexus of a secure network of government linguists and private contractors employed by all intelligence agencies; rather than focusing on universal machine translation, which the U.S. government has been pursuing for about 60 years, the center will develop and deploy an array of technologies designed to enhance human translation and analysis by digitizing, parsing, and digesting raw intelligence data. Digitization-enabling technology, in the form of optical character recognition software, is limited because it works mostly with European languages and does not read distorted or handwritten documents well, so a federally funded project at BBN has created a less rigid version that is more effective with blurred text and can be adapted to more languages. Existing software that seeks out key words and phrases and labels suspicious documents for closer examination is also chiefly limited to European languages, but Basis Technology's Rosette Arabic Language Analyzer can facilitate more accurate searches by regularizing the spelling of Arabic words and eliminating perplexing grammatical additions. Another ingredient in the National Virtual Translation Center's plan is translation memory, which picks out a chunk of text and matches it against previously translated content. These various technologies will be integrated into a secure Web architecture that will permit the center to send projects to analysts in the field and support fast and accurate collaboration between geographically scattered linguists. However, skeptics such as former CIA officer Robert David Steele do not expect the center to work "because they lack the mindset to understand networks, translators without security clearances, and ad hoc contracting," while other specialists believe the long-term solution still lies in machine translation.
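
    Translation memory, at its core, is a fuzzy lookup of new source segments against previously translated ones, with close matches offered to the human translator for reuse. A minimal illustration using Python's standard difflib; the memory entries and the 0.8 similarity cutoff are invented for the example:

        # Minimal sketch of a translation memory: match a new source
        # segment against previously translated segments and reuse the
        # closest translation above a similarity threshold.

        import difflib

        MEMORY = {
            "the meeting is scheduled for monday": "la réunion est prévue lundi",
            "the shipment arrives on tuesday": "la livraison arrive mardi",
        }

        def tm_lookup(segment, threshold=0.8):
            match = difflib.get_close_matches(segment, list(MEMORY),
                                              n=1, cutoff=threshold)
            if match:
                score = difflib.SequenceMatcher(None, segment, match[0]).ratio()
                return MEMORY[match[0]], score
            return None, 0.0

        translation, score = tm_lookup("the meeting is scheduled for friday")
        print(translation, f"(fuzzy match, {score:.0%})")
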
    Click Here to View Full Article
    (Access to this article is free; however, first-time visitors must register.)


