Association for Computing Machinery
Timely Topics for IT Professionals

About ACM TechNews

ACM TechNews is published every week on Monday, Wednesday, and Friday.


ACM TechNews is intended as an objective news digest for busy IT professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to technews@hq.acm.org.
Volume 6, Issue 690: Friday, September 3, 2004

  • "Copyright Office Pitches Anti-P2P Bill"
    CNet (09/02/04); McCullagh, Declan

    A new draft of the Induce Act dated Sept. 2 purports to outlaw peer-to-peer networks used for file-swapping without jeopardizing products such as MP3 players and portable hard drives that might "induce" people to commit digital piracy. The Copyright Office insists that this new version of the bill is a "discussion draft" designed to encourage debate over the issue, and should not be construed as a recommendation or official position. The revised draft makes it harder to establish companies' liability for copyright violations by assigning liability to anyone who "intentionally induces" such piracy, with inducement defined as one or more "affirmative, overt acts that are reasonably expected to cause or persuade another person or persons" to commit copyright infringement. Consumer groups and Internet providers are concerned that the new proposal is as flawed as the original Induce Act, citing provisions that establish liability for companies that "actively interfere" with a copyright owner's attempts to identify pirates and require tech firms and Internet providers to take all "reasonably available corrective measures" to prevent copyright violation. Will Rodger of the Computer and Communications Industry Association disparages the proposal on the grounds that it would regulate computer hardware and software. Fair use advocacy group Public Knowledge, Verizon, the Consumer Electronics Association, and other organizations are backing a much narrower alternative to the Induce Act that covers only commercial file-swapping networks that could not survive without rampant piracy.
    Click Here to View Full Article

  • "Spammers Using Sender Authentication Too, Study Says"
    IDG News Service (08/31/04); Roberts, Paul

    The anti-spam effectiveness of the Sender Policy Framework (SPF) email sender authentication standard as well as a successor standard, Sender ID, has been called into question by the results of a CipherTrust survey, according to CipherTrust CTO Paul Judge. The poll indicates that out of roughly 2 million emails sent to CipherTrust clients between May and July, only about 5 percent originated from domains that published a legitimate sender authentication record using SPF or Sender ID, and Judge notes that spam slightly outnumbers valid email in that 5 percent. "The idea that SPF would point to legitimate email because spam would fail SPF checks is not true, because spammers have rolled out [SPF] records, too," remarks Judge. "In fact, three times more spam passes SPF checks than fails it, so passing or failing an SPF check is not a strong indicator that messages are spam." He attributes this failing to spammers getting the jump on legitimate email senders in adopting sender authentication technology. The CipherTrust survey finds that 2.8 percent of valid email passes SPF checks, compared to 3.8 percent of spam, while Judge reports that just 31 Fortune 1000 companies publish SPF or Sender ID records, and a mere 6 percent of CipherTrust's customers publish SPF records. SPF and Sender ID co-author Meng Weng Wong of Pobox.com comments that SPF's adoption by spammers is actually a positive sign, as the forced publication of SPF records by both spammers and non-spammers will theoretically simplify legitimate firms' development of email reputations for Internet domains that do and do not send spam. Wong says that SPF was never designed as a spam panacea, but rather as a tool to deter spammers from spoofing email addresses.
    Click Here to View Full Article
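    The mechanism at issue is simple to sketch. An SPF record is a text string a domain publishes listing the IP ranges permitted to send its mail; a receiving server checks the connecting IP against that list. The toy Python below illustrates the idea with a hardcoded record and invented addresses (real SPF is fetched via a DNS TXT lookup and has a much richer syntax, with mechanisms such as `include:`, `a`, and `mx`):

    ```python
    import ipaddress

    # Toy illustration of an SPF check: the domain's published record lists
    # the IP ranges allowed to send its mail; the record and addresses here
    # are made up for illustration.
    PUBLISHED_SPF = {
        "example.com": "v=spf1 ip4:192.0.2.0/24 ip4:198.51.100.17 -all",
    }

    def spf_check(sender_domain: str, connecting_ip: str) -> str:
        """Return 'pass', 'fail', or 'none' for a simplified SPF record."""
        record = PUBLISHED_SPF.get(sender_domain)
        if record is None:
            return "none"          # domain publishes no record at all
        ip = ipaddress.ip_address(connecting_ip)
        for term in record.split():
            if term.startswith("ip4:"):
                if ip in ipaddress.ip_network(term[4:], strict=False):
                    return "pass"
        return "fail"              # "-all": anything unlisted is rejected

    print(spf_check("example.com", "192.0.2.55"))    # pass
    print(spf_check("example.com", "203.0.113.9"))   # fail
    print(spf_check("spammer.biz", "203.0.113.9"))   # none
    ```

    Note that nothing stops a spammer from publishing a valid record for its own throwaway domain and "passing" the check, which is exactly the survey's finding: SPF authenticates the sending domain, it does not vouch for it.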

  • "Tech Initiatives Aim to Go Global"
    Wall Street Journal (09/02/04) P. B4; Clark, Don

    Bringing information technology to the world's underdeveloped areas is the goal of numerous collaborations between academic and industrial researchers involving the design of new, inexpensive communications and computing devices. Advanced Micro Devices' (AMD) 50x15 initiative aims to link half the global population to the Internet by 2015; Intel and several universities have joined forces to enhance networking technology to function in regions vulnerable to frequent blackouts; and the National Science Foundation-funded ICTB4 (Information and Communications Technology for Billions) project is an international effort with multiple objectives, such as expanding the communications range of Wi-Fi and improving rural villagers' access to economically beneficial resources with "proxies" that store data from frequently used Web pages for use when Internet connections are unavailable. PCtvt, conceived by Carnegie Mellon University professor Raj Reddy, involves designing a new category of appliance-like gadgets that will allow semiliterate people to view TV and DVDs, get email with video clips, and make voice calls, all through remote controls. The devices will incorporate cheap chips from AMD and Intel, and streamlined versions of either Microsoft Windows or the Linux operating system. Another Intel project revolves around a new PC that allows parents in China to strictly control their children's access, based on research showing that some parents are concerned that computers and the Internet are drawing their kids away from educational pursuits. Projects such as these, which seek to close the gap between the digital haves and have-nots, illustrate a general agreement that the efforts of governments and charitable organizations are inadequate. University researchers and corporate executives are attempting to forge profitable partnerships with home-grown businesses in developing countries that could help market new technology.

  • "Virtual Humans Proposed as Space Travelers"
    Space.com (09/01/04); David, Leonard

    Computer animation expert Peter Plantec says virtual humans are a needed accompaniment for any long-term space travel because they will filter information for real human astronauts. Any encounter humans have with alien visitors will likely be with their virtual surrogates, since sending "soft" life forms into space is much more difficult. Virtual human companions on space voyages would deal with highly complex information analysis and automatically monitor sophisticated on-board systems. Plantec's proposed virtual human astronauts are not focused on artificial intelligence technology because Plantec deems AI unnecessary and perhaps unhelpful in creating truly beneficial assistants and companions: "What we really need is to fake conscious behavior so that we humans can have the emotional relationship with machines," he explains, adding that such capabilities are beyond the scope of AI. A merging of "fake consciousness" and artificial intelligence will occur in the future, leading to self-programming systems that will be able to modify themselves to better emulate true human behavior, Plantec says. He has authored a best-seller on 3D animation as well as a "Virtual Humans" kit, including software, that guides people through the creation of their own virtual human. In the future, such creations will help the technologically disinclined interface with computers, and allow technophiles to become even more engaged with their gadgets. "I really believe that, eventually, these things will evolve into a quasi-life form and we will form symbiotic relationships with them," Plantec declares.
    Click Here to View Full Article

  • "Ultrawideband Takes on Wi-Fi"
    Technology Review (09/02/04); Brown, Eric S.

    Ultrawideband (UWB) is an attractive technology that offers unique benefits over other contenders for wireless home networking, but its ongoing standards battle could delay it long enough for a new Wi-Fi version to pull ahead. IEEE UWB representatives have been fighting over the 802.15.3a standard for an unusually long period of time, with Motorola and around 60 smaller firms offering technology that has already won federal regulatory approval and looks easier to integrate; the other group is made up of much larger companies and promises lower cost since it includes a number of home electronics makers. The IEEE 802.11n Wi-Fi standard dialogue, meanwhile, seems to be progressing smoothly, with both sides reasonably amiable and willing to compromise for faster time-to-market. The standards process for both UWB and 802.11n could prove decisive, since Wi-Fi is already established in the home networking space and is expanding its reach as a wireless Internet access technology. If the home networking market is consolidated under Wi-Fi, UWB would still hold promise for other applications such as PC cable replacement and location-aware networking on highways, speeding traffic flow. UWB's spread-spectrum approach allows it to sense where devices are located and makes it a more secure data channel than Wi-Fi; broad frequencies also lessen interference, though that theory has yet to be proven to the full satisfaction of regulators, who limited the recently approved UWB chipset to a 20-foot broadcast radius because of interference concerns. The new Wi-Fi also promises reasonably competitive data speeds equivalent to fast Ethernet wired technology, though still requiring more power than UWB. Eventually, Wi-Fi 802.11n could be improved to roughly the speed UWB is expected to reach in a few years.
    Click Here to View Full Article

  • "Gearing Up for Digital-Era Preservation"
    IST Results (09/02/04)

    The importance of digitally preserving Europe's cultural and scientific heritage will be highlighted at an October workshop in Bern, Switzerland, hosted by the IST program's Erpanet project. Underlying Erpanet is the acknowledgment that Europe must commit more time, financial resources, and effort to dealing with digital preservation, and that this effort must be ratcheted up because the definition of digital preservation has expanded to include those who manage electronic records, not just traditional archivists and record managers. The three-year project, carried out by Italian, Dutch, and Swiss partners, has studied the experience, policies, and strategies devised by others in the field of digital preservation; among Erpanet's accomplishments is the erpaDirectory, which details approximately 100 European programs, while Erpanet coordinator Peter McKinney also lauds the project's 60 case studies. "They are unique for the preservation community, looking not only at cultural heritage but also software and the way institutions such as banks plan to keep records in the future," he notes. Though more and more data is being rendered digitally, McKinney explains that "the problem is how we get people to use these standard formats--and whether they will continue to be standard in future." Software and hardware's ever-increasing speed of change is another challenging factor. The Erpanet coordinator reports that the project is cultivating a community of stakeholders, while Erpanet-distributed guidance documents on best practices and other advisories should maintain their usefulness for several years.
    Click Here to View Full Article

  • "Scientists Set Internet2 Speed Record"
    IT Management (09/02/04); Kuchinskas, Susan

    A new land-speed record for the Internet2 academic network was set by researchers at the California Institute of Technology and the European Organization for Nuclear Research (CERN) on Sept. 2 when 859 GB of data was successfully routed between Geneva and Pasadena over a distance of roughly 9,800 miles in less than 17 minutes. The experiment achieved a data transfer rate of 6.63 Gbps and surpassed the 100-petabit-meter-per-second benchmark. The technological components of the test included an Xframe 10 GbE server adapter from S2io, 7600 Series Routers from Cisco, Microsoft's 64-bit version of Windows Server 2003, Itanium servers, and Newisys 4300 servers using Opteron processors from Advanced Micro Devices. The breakthrough is part of an initiative to transfer vast data volumes by 2007, when CERN's Large Hadron Collider (LHC) becomes operational and starts churning out approximately 15 petabytes of data annually. The test is one stage in an ongoing program to lay the groundwork for next-generation, data-intensive grids by building high-speed global networks. The 6.63 Gbps data transfer rate could do more than help scientists worldwide access data generated by the LHC: It could allow full-length DVD movies to be transferred in a matter of seconds, while other potential fields of application include oil and gas exploration, bioinformatics, global climate simulation, seismology, and astronomy. Though the 10 Gbps transfer rate of Internet2's Abilene backbone is formidable, Microsoft Research engineer Jim Gray, who participated in Wednesday's land-speed test, cautions that hardware and software issues still need to be resolved. For instance, the speediest available interface for PCs, the PCIX64 Bus Isolation Extender, can only deal with 7.5 Gbps.
    Click Here to View Full Article
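    The quoted figures can be sanity-checked with a few lines of arithmetic. The sketch below assumes decimal units (1 GB = 10^9 bytes) and 1 mile = 1,609.344 m; small rounding in the press figures means the numbers agree only approximately (the computed transfer time comes out a shade over 17 minutes):

    ```python
    # Sanity-checking the Internet2 land-speed figures quoted above.
    data_bits = 859e9 * 8              # 859 GB expressed in bits
    rate_bps = 6.63e9                  # reported 6.63 Gbps
    distance_m = 9800 * 1609.344       # ~9,800 miles in meters

    transfer_minutes = data_bits / rate_bps / 60
    bit_meters_per_s = rate_bps * distance_m   # the "land speed" metric

    print(f"transfer time: {transfer_minutes:.1f} min")       # ~17.3 min
    print(f"{bit_meters_per_s / 1e15:.0f} petabit-meters/s")  # ~105, above the 100 mark
    ```

    The land-speed metric multiplies throughput by distance, which is why a 6.63 Gbps transfer over nearly 16,000 km clears the 100-petabit-meter-per-second bar.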

  • "Birth of the Bluetooth Bots"
    The Feature (09/02/04); Pescovitz, David

    Researchers are finding the low-power Bluetooth wireless communications technology to be an excellent enabler for small, inexpensive robots that could serve as proof-of-concept demonstrations for collective machine intelligence and "self-deploying" sensor networks. Scientists at the Swiss Federal Institute of Technology's Autonomous Systems Laboratory have developed an unmanned miniature blimp that uses Bluetooth to wirelessly interact with a desktop PC that processes data captured by the robot's sensors, thus "evolving" its navigation software without human assistance. Meanwhile, a 30-gram "aerobot" equipped with a Bluetooth module is being developed to fly indoors while maintaining a PC connection; the robot's control system is comprised of neural networks and a computer vision system modeled after an insect's eyes, while the PC uses natural selection to optimize the machine's navigational software to the point that it can fly without a PC link. Bluetooth's benefits for robotics designers include its low power requirements and high error recovery, but the technology cannot support telepresence because it can only handle about 1 Mbps of bandwidth, while its transmission range is typically restricted to 10 meters. However, Walter Potter of the University of Georgia's Artificial Intelligence Center recently demonstrated that Bluetooth could facilitate "hive robotics." The demonstration involved two robots outfitted with Bluetooth-enabled Compaq iPAQs that collectively searched for a lightbulb in a big room. In a hive robotics system, "the loss of one of the team members will only have a minor effect on the outcome of the task," notes Potter. "The remaining robots could adapt their behavior based on the realization that one robot has been lost." Such abilities are well-suited for certain military applications.
    Click Here to View Full Article

  • "Dynamic Lighting System Colors 3-D Environments"
    Penn State Live (08/23/04)

    The Expressive Lighting Engine (ELE) is a dynamic, automatic lighting system that can enhance the video game experience for users as well as accelerate game development. ELE is the brainchild of Penn State School of Information Sciences and Technology assistant professor Seif El-Nasr, who reported with graduate student Chinmay Rao at the SIGGRAPH poster session that the system can illuminate key elements in 3D scenes, based on tracking the eye movements of users playing a first-person shooter game. At the American Association of Artificial Intelligence's July conference, El-Nasr disclosed the results of an earlier user study that indicated that people playing the game Unreal Tournament had trouble responding fast enough to enemy attacks because they could not spot foes in time due to static lighting. "ELE draws on cinematic and theatric lighting design theory and enables game designers to fully use lighting's subtle but powerful effects," El-Nasr explains. The system can also benefit designers because it rapidly adapts the scene's lighting to reflect changes in action or tension. El-Nasr notes that ELE can be used to increase a game's difficulty level by adding more shadows to make characters less obvious, while the system comes with an override option as well. "Additionally, ELE supplies artists with a language to write rules for specific lighting changes or set up," reports the professor, who adds that the system can augment educational and military simulations.
    Click Here to View Full Article

  • "Domestic Bliss Through Mechanical Marvels?"
    USA Today (09/01/04) P. 1B; Maney, Kevin

    Robot technology has arrived at a watershed moment in which it has started to migrate from industrial and military settings into the home. Experts expect demand for domestic robots that function as caregivers, assistants, and companions to explode as baby boomers approach their autumn years and the social and health-care infrastructure struggles to accommodate them. Rodney Brooks, director of MIT's Artificial Intelligence lab, says, "As the demographics change, robots could help solve some problems. The question is, where is that transition?" Already, many robotics projects are underway: Companies such as iRobot are concentrating on delivering "homebots"--devices that can perform household chores--that will initially be single-purpose rather than multipurpose. "Making [the robots] do exactly what you want them to do for every possible home environment is a frustratingly long row to hoe," notes iRobot CEO Colin Angle. For the homebot market to really take off, the machines will need better visual capabilities, which are currently inferior to those of a two-year-old human. The usefulness of "carebots" will become apparent in the coming years as hospitals and nursing homes strain to help a burgeoning elderly population: Functions such machines could perform include monitoring a person's vital signs, alerting emergency services if there is a crisis, reminding owners of important events as well as day-to-day activities, and more complicated tasks facilitated by remote Internet control by a doctor or nurse. Making such visions reality involves giving robots tactile capability and vastly improving their manual dexterity. Some projects, particularly those in Japan, are focusing on robots that consumers can invest in emotionally for entertainment purposes or, in the case of the elderly, to combat loneliness. 
With so many different kinds of robots expected to inhabit future households, researchers face the daunting challenge of making them capable of communicating with each other and operating in teams.
    Click Here to View Full Article

  • "Code Name: Geekfun"
    Los Angeles Times (09/02/04) P. A1; Menn, Joseph

    It is typical of the tech industry that products which boast whimsical, cool-sounding code names in their development phase are marketed under less colorful, humdrum brands. Most other industries, by contrast, use mundane appellations for products in development only to roll them out commercially under catchier titles. Assigning cool code names to products helps stoke developers' excitement for the projects, but the practice also fulfills the pragmatic role of keeping projects secret. However, if these code names are leaked outside the company, they may risk the umbrage of copyright holders and even individuals defending their reputations. The threat of legal action can lead to additional complications: For example, Apple's uncertainty over its Power Macintosh 7100's chances of commercial rollout caused managers to code-name it Sagan, after astronomer Carl Sagan, who took offense and sued for libel when Apple renamed the product BHA for "Butt-Head Astronomer." Sagan lost his case. Further down the road, the same product was re-titled LAW ("Lawyers Are Wimps"), which prompted attorneys to institute guidelines dictating that code names be completely inoffensive from then on. Apple tried to make leaks traceable by assigning different terms for the same project, but this strategy often bewildered personnel, according to some veterans. The growing public exposure of product code names is spurring some companies to hire branding specialists to come up with provocative yet generic internal titles. Maintaining the secrecy of projects has become increasingly difficult with the emergence of the Internet, but some companies are exploiting this by using cool names to generate interest among business partners and early adopters.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Paper or Mouse-Click? What's on Computers Is Easier to Find, Study Shows"
    UW News (09/01/04); Goldsmith, Steven

    Participants in a survey from the University of Washington's Information School indicated that managing and retrieving paper-based information is more troublesome than computerized information. Over 50 percent of the 219 respondents reported that they lost track of a paper document at least once a week, which is more than two times the percentage of people who admitted to losing track of electronic documents. "[The survey results suggest] that many will be happy to see a more complete transition to electronic storage," noted associate research professor William Jones. However, only 10 percent of respondents reported extreme satisfaction with their ability to deal with electronic records, compared to 2 percent for paper, and computer users noted that managing information comfortably involved a continuous trial-and-error search process that is likely to become even more challenging as music, notes, images, and data are distributed among an increasingly diverse population of electronic devices and computer applications. Jones and the Information School's associate dean for research Harry Bruce are working on a software tool dubbed a "Universal Labeler" that allows people to organize information as they see fit by dragging and dropping data into logical folders. The Universal Labeler project is part of Keeping Found Things Found, a National Science Foundation-funded initiative to understand how to improve information management irrespective of where that information is stored or what format it is in. Doctoral student Ammy Jiranida Phuwanartnurak explained that Keeping Found Things Found participants feel they need to organize their information even if it can be easily located via search tools, and Jones and Bruce are planning a new poll to better determine why such organization is desirable.
    Click Here to View Full Article

  • "Cooking Up a Digital Future"
    BBC News (08/31/04); Hardy, Ian

    Concocting futuristic kitchen technologies is the goal of MIT's Counter Intelligence Research initiative, whose areas of focus include smart appliances and more durable equipment. Concepts being researched include devices such as a sensor-studded plastic container that is aware of its contents and can clock how long before those contents go bad; oven mitts that read temperature and dispense culinary advice; and camera- and computer-equipped fridges that monitor stored items and add items that need to be replaced to a shopping list. Another project is a mug that tells the owner when it is hot through the incorporation of liquid crystal displays, bimetal strips, thermistors, and thermochromic ink. "We're really starting to think about what we can sense, but more importantly how we can use the sensors to change the way people do things, and improve them," explains Ted Selker, director of Counter Intelligence Research. "My most exciting example of that is a spoon that literally teaches you how to cook, by watching and tasting, and noticing the temperature of the thing you're mixing." Research assistant Leonardo Bonanni notes that the "dishmaker" project is particularly fascinating: It involves a machine that inflates thin, recyclable plastic wafers into dishes and reduces kitchen clutter. Bonanni also calls attention to a kitchen sink made of pliant silicon rubber that is tolerant of high temperatures and resistant to breakage. The technologies being developed are often rough around the edges, with design refinements handled by appliance companies.
    Click Here to View Full Article

  • "Five Photons Linked"
    Technology Research News (09/01/04); Smalley, Eric

    A team of international researchers has entangled five photons in a quantum computing setup that would be able to check for errors and transmit information through a distributed network. Quantum computers would require more stringent error-checking schemes than are used in traditional electronic computers because of the delicate nature of their operation: Particles in quantum states are able to represent both 0 and 1 values, allowing a string of such particles to calculate all the possible answers in that string in just one set of operations; but the particles are easily thrown out of their superpositioned state by environmental disturbances. To counter possible errors, quantum computing theorists have come up with numerous error-checking schemes that involve the entanglement of multiple particles. The new five-particle setup demonstrated by Chinese, Austrian, and German researchers is one of the most efficient frameworks because it allows errors to be identified in just one step and does not require matching sets of linked particles. The researchers also used the entangled particles to perform open-destination teleportation, in which quantum information can be sent to other particles regardless of distance or physical barriers; this capability would be useful in sending information within a quantum computer, as well as transmitting information between nodes in a quantum network. Chinese researcher Jianwei Pan, a visiting fellow at the University of Heidelberg in Germany, estimates it will be at least a decade before the technology will have practical applications.
    Click Here to View Full Article

  • "Bringing Down Communication Barriers for the Hard of Hearing"
    IST Results (09/01/04)

    The IST program-funded Synface project coordinated by KTH in Sweden seeks to enhance telephone communications for people with hearing difficulties using software that produces a computer-generated face whose lips move in sync with the caller's speech. The software can be installed on a garden-variety PC and uses a standard phone line, and 84 percent of users who tested Synface in the United Kingdom reported that the system helped them recognize words, while 74 percent claimed that the system significantly increased the effectiveness of telephone dialogs and helped them maintain normal conversation. The software is equipped with a speech recognizer that translates the audio signal into lip movements on the synthetic face, while a 200-millisecond processing delay enables synchronization between the audio and visual communications. "The system has a delay allowing it to recognize what sound is coming next before the audio and visual communication is passed onto the user," notes KTH's Inger Karlsson. Because Synface identifies sounds rather than words, the time it takes to generate lip movements is dramatically cut; the software only has to be programmed with sounds particular to different dialects and requires very few modifications if it is being adapted between similar languages. The two most common assistive telephone communications techniques, text phones and video phones, are less advantageous than Synface in terms of practicality: Text phones can slow or inhibit normal phone conversation because they require typing, while video phones cannot function without units on both ends of the line and large bandwidth. Karlsson expects Synface's potential customer base to expand significantly in the coming years as Europe's aged population increases and people start having hearing difficulties earlier in life because of noise pollution.
    Click Here to View Full Article

  • "When E-Mail Points the Way Down the Rabbit Hole"
    New York Times (09/02/04) P. E8; Johnson, Kirk

    Spam is a runaway technology phenomenon that focuses on better understanding human interests, according to academics and spam experts. Spam and technologies to counter it develop quickly, but are not developing in the traditional economic sense where the aim is to gain market share; instead, spam technologies are more similar to military stealth technologies, except that to succeed the spam must better understand human behavior. That is why a spam message that offers anti-spam solutions seems eerily self-aware, or at least sensitive enough to know that a solicitation to stop messages such as itself appeals to the targeted reader. Anti-spam research focuses on knowing what is truly of interest to the email user and seeks to block all other messages, while spam purveyors become successful by tapping the messages that users really want, or perhaps did not know they wanted. Interestingly, no one really knows where spam development is headed: "It brings home the idea of technology living an independent existence--a parallel universe of computer programs living in a world of their own, having their own quarrels," says MIT Center on Technology and Self director Sherry Turkle. Unlike the self-conscious technology imagined in science fiction, a future intelligent spam will perhaps be consumed with base human issues such as penis enlargement, online gambling, and debt consolidation. Turkle warns that spam is likely to continue to provide more accurate mirrors of human interests, even to the point where spam filtering technologies may discern users' subconscious desire to read some spam messages. Using Web activity records and personal data, spam and anti-spam software will become more attuned to individual minds.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)
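    The "knowing what is truly of interest to the email user" approach the article describes is, in practice, statistical: filters of the naive-Bayes family learn word frequencies from mail the user wanted and mail they did not, then score new messages by which corpus they resemble. A minimal sketch of that idea follows; the tiny training corpora are invented for illustration:

    ```python
    import math
    from collections import Counter

    # Minimal naive-Bayes-style spam scorer: learn word frequencies from
    # wanted and unwanted mail, then score new messages by log-odds.
    spam_train = ["cheap pills online now", "consolidate your debt now",
                  "win at online gambling"]
    ham_train = ["meeting moved to friday", "draft of the paper attached",
                 "lunch on friday"]

    def word_probs(corpus):
        counts = Counter(w for msg in corpus for w in msg.lower().split())
        total = sum(counts.values())
        # Laplace smoothing so unseen words don't zero out the score
        return lambda w: (counts[w] + 1) / (total + 1000)

    p_spam, p_ham = word_probs(spam_train), word_probs(ham_train)

    def spam_score(message: str) -> float:
        """Log-odds that the message is spam; positive means 'looks like spam'."""
        return sum(math.log(p_spam(w)) - math.log(p_ham(w))
                   for w in message.lower().split())

    print(spam_score("cheap gambling pills"))        # positive: resembles spam
    print(spam_score("the paper draft for friday"))  # negative: resembles wanted mail
    ```

    The article's point is the arms race hidden in this loop: spammers probe for wording the filter's "wanted" model will accept, so both sides end up modeling the same thing, namely what the reader actually wants to read.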

  • "Fund Cyber Infrastructure"
    Federal Computer Week (08/30/04) Vol. 18, No. 30, P. 58; Perera, David

    An Aug. 12 memo from the Office of Management and Budget and the Office of Science and Technology Policy indicates that supercomputing and cyber infrastructure will be the most pressing concerns for the Networking and Information Technology Research and Development (NITRD) program in fiscal 2006. Budget requests by NITRD's 13 member agencies should comply with this policy by shunting funds from "lower priority efforts" to the aforementioned areas of concentration, according to the memo. Peter Freeman of the National Science Foundation describes cyber infrastructure as a three-layered initiative to combine supercomputers, sensors, software, and networks into a seamless mechanism for translating the computational power and storage capacity on the bottom level to cyber infrastructure-enabled services and data on the top level. The middle layer, where the hardware, communications, institutions, and algorithms necessary to glue the levels together reside, represents the biggest challenge. Upon this framework's completion, Freeman predicts that scientific and engineering data and its collection will be able to span across different fields. Co-chairwoman of NITRD's Middleware and Grid Infrastructure Coordination Team Mary Anne Scott believes that additional capital for cyber infrastructure will help guarantee "that when we get to the solution down the road--five years or so [from now]--we don't have a balkanized list of solutions." Freeman thinks the memo is indicative of a resurgence of government interest in hardware research.
    Click Here to View Full Article

  • "A Better Distorted View"
    Science News (08/28/04) Vol. 166, No. 9, P. 136; Peterson, Ivars

    Readable cartogram maps based on population data can be generated rapidly on a computer using mathematics employed to describe diffusion, according to University of Michigan physicists Mark E.J. Newman and Michael T. Gastner, who detail their approach in the May 18 edition of Proceedings of the National Academy of Sciences. A cartogram is a map with geographic areas that are proportional to their populations, but currently available computer techniques can cause distortions or hinder readability, while the processing time for just a single cartogram is lengthy. Newman and Gastner's algorithm can generate a cartogram with uniform population density and no overlap by applying the equation describing the diffusion of a gas to a map using a step-by-step process that starts with a mathematical description of population density. "Our method...allows us to speed up the calculations a great deal and complete them in just seconds [rather than hours or days]," notes Newman. The physicists have used their technique to produce cartograms based on population data from the 2000 U.S. census, the results of the 2000 presidential election, lung cancer rates among men in the state of New York, and the dissemination of wire service news stories by state. In each case, the researchers had to determine what kind of geographic area would represent a basic unit to define the population-density function, and this choice was reflected in the level of distortion the resulting cartogram exhibited. Newman and Gastner intend to update their method to be faster and capable of tackling more complex problems. Applying the technique globally is especially challenging because the curvature of the Earth must be taken into account.
    Click Here to View Full Article
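    The diffusion idea behind Newman and Gastner's approach can be illustrated with a toy sketch. This is an illustrative simplification, not the authors' published code; the grid size, time step, and the function name diffusion_cartogram are assumptions made for the example. Population density is diffused toward uniformity while map points are carried along the resulting flow, so regions that start out dense end up occupying proportionally more area:

```python
# Toy sketch of a diffusion-based cartogram (an illustrative simplification,
# not Newman and Gastner's published code): density diffuses toward
# uniformity, and map points ride the flow v = -grad(rho)/rho, so initially
# dense regions end up occupying proportionally more area.
import numpy as np

def diffusion_cartogram(density, points, steps=2000, dt=0.05):
    """Diffuse `density` toward its mean and advect `points` (an N x 2 array
    of [row, col] grid coordinates) along the resulting velocity field."""
    rho = density.astype(float).copy()
    pts = points.astype(float).copy()
    for _ in range(steps):
        # Velocity v = -grad(rho)/rho, sampled at each point's nearest cell.
        gy, gx = np.gradient(rho)
        iy = np.clip(pts[:, 0].round().astype(int), 0, rho.shape[0] - 1)
        ix = np.clip(pts[:, 1].round().astype(int), 0, rho.shape[1] - 1)
        pts[:, 0] -= dt * gy[iy, ix] / rho[iy, ix]
        pts[:, 1] -= dt * gx[iy, ix] / rho[iy, ix]
        # Explicit diffusion step with zero-flux (reflecting) boundaries,
        # which conserves total population.
        p = np.pad(rho, 1, mode="edge")
        lap = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
               - 4.0 * rho)
        rho += dt * lap
    return rho, pts
```

    Run on a 20-by-20 grid with a dense central block, two points straddling the block drift apart as the dense region expands; applied to every boundary point of every region, that displacement is what reshapes the map into a cartogram.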

  • "The Short Life, Public Execution, and (Secret) Resurrection of Total Information Awareness"
    CSO Magazine (08/04); Berinato, Scott

    The Defense Advanced Research Projects Agency's (DARPA) Total Information Awareness (TIA) program, touted as an initiative to detect terrorist activity by mining public and private transactional databases, raised fears of citizen surveillance that led to its termination by Congress, though elements of TIA have been transferred to other agencies where their development can continue under the public radar. Retired Adm. John Poindexter, who was placed in charge of TIA, argues that the program's cancellation stemmed from excessively emotional debate over its benefits, a tendency that he warns could lead to even more invasive yet less effective anti-terrorism technologies. Poindexter claims he foresaw TIA as a lightning rod for controversy, particularly among privacy advocates, which is why he advised DARPA to develop the technology and the policies to govern its use at the same time. He says the project offered a "reasoned, open public discussion of the privacy issues" that was ignored, and cites journalistic sloppiness as a reason for TIA's negative media image: Poindexter notes that some journalists erroneously reported that the program would involve a single, all-encompassing database on Americans' transactions, and that DARPA would implement the TIA technology. Still, he admits that abuse of the TIA system is a valid concern, one that could have been addressed by adding transparency to the development process, as recommended. Although Poindexter reports that many experimental TIA technologies were proving to be both effective and cognizant of privacy issues, those triumphs were mooted by last summer's disclosure of a terrorist futures exchange project that spurred an outpouring of public and political outrage that ultimately led to TIA's dissolution. Poindexter advises that "a full public affairs, legislative affairs and legal staff has got to be on hand" if DARPA chooses to continue addressing contentious issues. He does not want the TIA debacle to dampen DARPA and the federal government's willingness to conduct controversial research.
    Click Here to View Full Article

    See http://www.acm.org/usacm/privacy/ for ACM's position on privacy and http://www.acm.org/announcements/tia.html for USACM's statement recommending an independent review of the U.S. Government's Total Information Awareness Program.