Volume 5, Issue 498: Wednesday, May 21, 2003
- "Pentagon Details New Surveillance System"
Washington Post (05/21/03) P. A6; Cha, Ariana Eunjung
The Pentagon released a comprehensive report about the proposed Terrorist Information Awareness (TIA) program (previously known as Total Information Awareness) to legislators on Tuesday, but the details about the computer surveillance system--its projected budget, the technologies and programs involved, etc.--have failed to mollify critics who say TIA could erode citizens' personal privacy and civil liberties. The Defense Advanced Research Projects Agency (DARPA) stated that the name change came about to get rid of the impression that TIA would be used to profile Americans, when the program's goal is to identify and thwart foreign terrorists before they can launch an attack. The initiative, which has a three-year budget in excess of $50 million, would involve a massive core database of public and private data--airline ticket purchases, financial and medical records, video surveillance, biometric identification, and more--that could be mined to detect signs of potential terrorist activity. The report states, "By augmenting human performance using...computer tools, the TIA Program expects to diminish the amount of time humans must spend discovering information and allow humans more time to focus their powerful intellects on things humans do best--thinking and analysis." The report says technologies and programs that could be incorporated into TIA include FutureMAP, a system that evaluates sentiment on certain topics by studying public market fluctuations. Another speculative TIA component is a "Misinformation Detection" system designed to scan text for indications of fake or misleading data. However, Sens. Ron Wyden (D-Ore.) and Russell Feingold (D-Wis.) agreed that the report "fails to propose any specific new rules to address the [abuse] concerns raised by Congress."
For information on ACM's arguments regarding TIA, visit http://www.acm.org/usacm.
- "Has Copyright Law Met Its Match?"
Medill News Service (05/19/03); Wenzel, Elsa
Disabled consumers and their proponents complain that the Digital Millennium Copyright Act (DMCA) severely restricts their access to reading material because most available electronic books lock out text-to-speech software. Advocates such as the American Foundation for the Blind's Paul Schroeder, who is visually impaired himself, are calling for amendments to the DMCA that would exempt all e-books. E-publishers and other DMCA supporters counter that U.S. copyright law already sanctions the copying of literature to serve the blind, as long as it is done by nonprofit organizations, not individuals. In fact, Allan Robert Adler of the Association of American Publishers claims that if it were not for the DMCA, a great deal of literature would still be unavailable in electronic form. IP Justice attorney and executive director Robin Gross says that not all disabled consumers would benefit from e-book exemptions: People who make or provide tools that circumvent e-book copy controls would still be violating the DMCA, and could face hefty fines or even jail time. She says the only way to legalize text-to-speech conversion for e-books would be to legally challenge or revise the DMCA. Meanwhile, a number of bills currently going through Congress aim to give the disabled better access to electronic material; under the Institutional Materials Accessibility Bill, publishers would be responsible for making books easily translatable for the handicapped, while Rep. Rick Boucher's (D-Va.) Digital Media Consumers Rights Act would amend the DMCA so that consumers could copy DVDs and other copyrighted products for personal use. "Congress did not intend to undo fair use, but until the industry figures out how to support fair use, the DMCA should not apply," declares Janina Sajka of the American Foundation for the Blind.
For information on ACM's activities regarding DMCA, visit http://www.acm.org/usacm.
- "Email's Backdoor Open to Spammers"
New York Times (05/20/03) P. A1; Hansell, Saul
Routing junk email through unwitting third parties, usually home and office Internet users, is the No. 1 distribution method spammers use, and ISPs such as America Online estimate that over 200,000 computers around the world have been exploited in this fashion over the last two years. For the most part, the spammers are taking advantage of security vulnerabilities in existing software, but more and more are secretly embedding email forwarding software in unsuspecting users' machines. Spammers are increasingly commandeering unaware users' computers to mask their own Internet protocol addresses in order to evade spam blockers. The most common point of vulnerability spammers focus on is a backdoor in the software users deploy to share an Internet link between multiple computers. Although these proxy servers can be adjusted to only grant access to authorized users, many remain vulnerable, either because the owners are lazy or unaware of the dangers. Makers of proxy servers balk at suggestions from anti-spam activists that the products be distributed with the security features on, arguing that such a measure would complicate setups. ISPs are employing various strategies to combat proxy server exploitation: AOL's solution is to refuse to accept any incoming email directly routed from the systems of home users with high-speed service, while Time Warner's Road Runner high-speed service is finding customers with open proxy servers using the same kind of scanning tools spammers use. Most users whose systems are hijacked by spammers do not realize they have been compromised until they receive complaints from ISPs, who claim that the lack of any direct personal impact causes these exploited users to ignore the problem.
(Access to this site is free; however, first-time visitors must register.)
- "Leave Me Alone!"
Boston Globe (05/18/03) P. D1; Bray, Hiawatha
Brightmail estimates that spam will account for about 50 percent of all Internet mail sent this year, while Ferris Research reckons that dealing with junk email will cost American businesses $10 billion. Many businesses are hesitant to use antispam software because the tools block legitimate email about 17 percent of the time, according to research carried out by Assurance Systems. To prevent the end of email as a useful tool as a result of spam's overwhelming growth, the Internet Engineering Task Force brought in a team of experts to devise an antispam strategy; the team has proposed completely revamping the worldwide email system using a variety of solutions. CipherTrust CTO Paul Judge acknowledges that a complete halt to spam is unlikely, and the goal of his panel is to "bring [spam] back to the nuisance it used to be." The panel is considering attaching digital certificates to legitimate email to prevent blockage by spam filters; Habeas' solution is to send a certified emailer's messages with a copyrighted haiku poem in the header, whose presence guarantees that the email is not spam. Also in the header is a contract that bans the haiku's unauthorized use, which would make spammers who copy the poem in their messages guilty of copyright violation. Another digital certificate solution from VeriSign includes authentication via digital signatures, which are virtually impossible to forge. Judge says CipherTrust is developing Reverse Mail Exchange, which would enable servers to determine whether email originates from a legitimate address, while the hashcash system imposes a time penalty on spammers by forcing their sending computers to solve a computational puzzle for each message, sharply reducing the volume of junk email spammers can send and lowering the odds that they will profit.
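The hashcash scheme mentioned above can be sketched as a proof-of-work check: the sender must burn CPU time searching for a stamp whose hash begins with a required number of zero bits, while the recipient verifies it with a single hash. The sketch below is a simplified illustration in Python; the real hashcash stamp format also carries version, date, and salt fields, and production difficulty settings are far higher than the toy value shown here.

```python
import hashlib
from itertools import count

def _leading_zero_bits(digest: bytes) -> int:
    # Count leading zero bits in a hash digest.
    n = int.from_bytes(digest, "big")
    return len(digest) * 8 - n.bit_length()

def mint(resource: str, bits: int = 20) -> str:
    # Sender pays: brute-force a counter until SHA-1("resource:counter")
    # starts with `bits` zero bits -- roughly 2**bits hash attempts.
    for c in count():
        stamp = f"{resource}:{c}"
        if _leading_zero_bits(hashlib.sha1(stamp.encode()).digest()) >= bits:
            return stamp

def verify(stamp: str, resource: str, bits: int = 20) -> bool:
    # Recipient checks with a single hash -- verification is nearly free,
    # which is the asymmetry that penalizes bulk senders.
    return (stamp.rsplit(":", 1)[0] == resource
            and _leading_zero_bits(hashlib.sha1(stamp.encode()).digest()) >= bits)
```

A legitimate sender minting one stamp per message barely notices the cost; a spammer sending millions of messages must repeat the search for each recipient address, which is where the "time penalty" the article describes comes from.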
- "The Crisis of Computing's Dying Breed"
Financial Times (05/21/03) P. 11; Foremski, Tom
IT workers knowledgeable in mainframe operations are a dying breed, although the hardware they run has proven surprisingly resilient to extinction. IT pundits had predicted server systems would make the mainframe obsolete, but many companies are loath to abandon the security, reliability, and relatively low maintenance costs of their mainframes. Today, however, university graduates are far more likely to learn skills such as Web services and newer, more flexible programming languages than mainframe Cobol. Experts say organizations are facing pressure to find adequately trained staff to maintain their mainframes, but IBM computer hardware group head Bill Zeitler says his company's improvements to the mainframe add viability to the platform. The recently released and most powerful IBM mainframe yet, dubbed T-Rex, is capable of handling entire e-business operations. T-Rex can replace multiple mainframe machines and help consolidate the dwindling ranks of mainframe administrators. In addition, IBM is pushing the Linux operating system, which gives newer IT workers a foot in the door in terms of mainframe operation, according to Sageza Group research director Charles King. Still, professionals with mainframe experience are much older than the norm, a situation that's a growing concern for companies. Over half of IT workers with mainframe experience were over 50 in a Meta Group survey last year, while less than 10 percent of workers with Windows NT and Unix skills were that age. Gartner's Diane Morello warns that few companies have planned for the day when they are forced to change their technology platform due to a lack of skilled workers. Zeitler notes that an estimated 60 percent of corporate data is on mainframes, and expects that companies will move very slowly in their shift away from the platform.
- "A Spy Machine of Darpa's Dreams"
Wired News (05/20/03); Shachtman, Noah
The Pentagon's Defense Advanced Research Projects Agency (Darpa) is sponsoring a new project that aims to record every movement, transaction, action, and piece of media consumed in a person's life. The LifeLog project could be used as a computer training tool, battlefield computer assistant, or as a method to track epidemics, according to the agency. Opponents of the Total Information Awareness (TIA) project, however, say the new LifeLog program is even more threatening to individual privacy. "LifeLog has the potential to become something like 'TIA cubed,'" says Federation of American Scientists defense analyst Steven Aftergood. Unlike TIA, which would record just a person's transactions, LifeLog would capture transactions as well as every bit of TV, Internet, and print media they consume and all digital images taken. A GPS transmitter could track movement and audio sensors would record conversations. By making all this information available through a search engine interface, people could "retrieve a specific thread of past transactions, or recall an experience from a few seconds ago or from many years earlier," according to a Darpa briefing. Commercial and academic efforts are underway to do some of the same things, such as Microsoft's MyLifeBits project being developed by Gordon Bell. University of Toronto professor Steve Mann, who claims cyborg status, has been wearing sensors and video-recording equipment since the 1970s in an effort to develop "existential technology." The Darpa researchers will be their own subjects and the agency is soliciting proposals for an 18-month study with a possible two-year extension.
- "Congressional Caucus Targets Piracy"
CNet (05/19/03); McCullagh, Declan
Florida Reps. Robert Wexler (D) and Tom Feeney (R), along with Rep. Adam Smith (D-Wash.), are organizing the Congressional Caucus on Intellectual Property Promotion and Piracy Prevention, which is likely to sharpen Congress's focus on legislation aimed at curbing peer-to-peer (P2P) piracy. A recent House committee hearing attributed the proliferation of illegal pornography to P2P, and another hearing declared that copyright infringement is running rampant at universities. Wexler, who invited colleagues to join the caucus in a letter submitted last Friday, co-sponsored a 2002 bill that would allow copyright owners to incapacitate PCs used for illegal file-trading, and is a member of the House Judiciary subcommittee that authors copyright legislation. Smith co-wrote a letter to fellow Democrats last fall accusing Linux's GNU General Public License of being a threat to the nation's "innovation and security," while Feeney is a former speaker of the Florida House of Representatives. The caucus announcement has been lauded by both the Recording Industry Association of America (RIAA) and the Motion Picture Association of America (MPAA). RIAA Chairman Hilary Rosen declared in an email that "It's initiatives like this, along with those of other congressional leaders, which help showcase the economic and cultural contributions of the creative community while shedding light on piracy's harmful impact." The MPAA's Rich Taylor concurred, and stated that his organization plans to collaborate with the new caucus "to help create an environment where a legitimate digital marketplace can thrive."
- "New System Developed by Pentagon Identifies Walkers"
Associated Press (05/19/03); Sniffen, Michael J.
One possible element of the Defense Advanced Research Projects Agency's (DARPA) proposed Total Information Awareness (TIA) U.S. citizen surveillance database could be "gait signatures" extracted by a device developed by Georgia Institute of Technology researchers with Pentagon funding. The device employs a radar that can record unique qualities of a person's walk even from 600 feet away, according to project leader Gene Greneker, who also notes that the technique has certain advantages over video cameras, such as being able to detect gait signatures in darkness, in inclement weather, and when walkers wear obscuring apparel. The radar is keyed to small changes in frequency in the reflected signal off the walker's legs, arms, and torso. The system could be used, for example, to alert security officers that an unauthorized person is in a restricted area by comparing his or her gait signature with those in a database, or to identify a suspicious individual who is repeatedly seen outside a sensitive location, possibly for nefarious purposes. Greneker says his team is not concerned about the privacy implications of the device, insisting that the government must address this issue. DARPA is also funding other research at Georgia Tech that uses computers and video cameras to determine gait signatures.
- "Viruses Learn How to IM"
EarthWeb (05/16/03); Saunders, Christopher
Computer viruses are adapting IM bot behavior to spread, as demonstrated by the recent Fizzer worm, which can receive hacker commands if it can link to the AOL Instant Messenger (AIM) network and the Internet Relay Chat (IRC) network. "Fizzer...creates its own account, and that account attaches to a chatroom on the Internet," explains David Loomstein of Symantec Security Response. "So [hackers] know anyone on that chat room is infected with the virus--and they use that chatroom as a backdoor to do hacking on the infected machine." Fizzer also sets up a command console by running an HTTP server on Port 81, allowing hackers to inquire about critical system information and execute AIM and IRC bot commands as well as Denial of Service attacks. Fizzer, AIM-Canbot, and Aphex/Aplore differ from previous viruses in that they do not try to hijack the user's IM username, but rather create their own IM identity and attempt to link with their creators. The major antivirus companies have issued patches for these worms. Security holes in Microsoft Internet Explorer are also being exploited by viruses such as the Menger/Coolnow worm, which was able to commandeer a user's MSN Messenger IM client through this vulnerability. Viruses that include IM bot elements could benefit standalone IM vendors, as well as companies that sell applications designed to protect public IM systems. Loomstein says, "IM is definitely part of the equation now...It's a new frontier that's being exploited."
- "Bugged Out"
Salon.com (05/16/03); Rosenberg, Scott
Ellen Ullman, author of "Close to the Machine: Technophilia and Its Discontents," drew upon real-life experience for her new novel "The Bug," a parable about a computer programmer confronted with a bug that thwarts all attempts to lock it down. The basis for the novel was a resilient bug Ullman encountered as a programmer for Sybase, which she originally thought could be included in an essay designed to help people unfamiliar with coding understand the debugging process; later on, she found it much more appealing to turn the essay into "a historical, technical, Gothic mystery." Ullman decided to set the novel in a technical environment that no longer existed, in which programmers had to write everything themselves rather than employ pre-written layers of code, as is done today. The author explains that she wanted to include two opposing themes in the book: The idea that technology, no matter how complex, can be understood when deconstructed into its basic components, while the integration of those components is not necessarily understandable. Ullman says it becomes harder for programming to qualify as a science when engineers move away from hardware and software and toward applications, and supports this idea by noting that the advancement of computer capability has not been accompanied by a similar advance in software writing ability. Ullman admits that the novel's protagonist fits the stereotypical mold of programmers being more technically adroit than socially inclined, but she observes that programming now involves more social interaction thanks to the open-software movement. Ullman laments the technology recession, which has left a lot of talented programmers unemployed, resulting in a loss of "institutional memory." She also finds it disturbing that technology is moving into surveillance systems due to political pressures.
- "Nanopits for Nanostorage"
IEEE Spectrum (05/14/03); Hellemans, Alexander
Scientists are looking for alternative data storage media that offer greater density than magnetic systems, and separate European research efforts have yielded significant breakthroughs in nanoscale indentation. A team at the IBM Zurich Research Laboratory used Millipede--an array of heated atomic force microscopy (AFM) probes--to make minuscule notches in a thin plastic substrate capable of storing up to 1 terabit per square inch. IEEE Fellow Peter Vettiger declared, "The Millipede has the potential for atomic-level data storage--a bit in a few atoms." However, researchers at the Universities of Bologna and Edinburgh announced they could achieve storage densities of up to 100 gigabits per square inch using rotaxanes as a medium. Rotaxane molecules deposited as a thin film undergo a structural change when they are scanned by the tip of an atomic force microscope, forming indentations in precise, self-configured patterns; the gentle pressure of the microscope's tip over the rotaxane film creates a line of identical dots that are the same distance apart, according to project leader Fabio Biscarini of the University of Bologna. "It was a way to demonstrate the writing of information in the form of strings of beads," he explained. The dots' size and the distance between each dot can be varied by changing the film's thickness, and Biscarini says the multiple-dot writing approach could give the system an edge over Millipede, which follows a one-by-one dot writing model. The commercial possibilities of such a system will not be realized until a viable method for reading out the film is discovered, but UCLA's James Gimzewski says Biscarini's breakthrough validates the greater capabilities of molecular storage systems.
- "GPS Data Could Stop Wireless Network Attacks"
New Scientist (05/20/03); Knight, Will
Carnegie Mellon University's Yih-Chun Hu and Adrian Perrig, along with Rice University's David Johnson, furnished a report presented at the 12th World Wide Web conference detailing a new wireless network security threat and a possible defense strategy. A "wormhole attack" in which an intruder hijacks wireless data packets moving across one section of the network and re-implants them at another physical network node could be used to shut down an "ad-hoc" wireless network or thwart a wireless authentication system, even if it is encrypted. "The wormhole puts the attacker in a very powerful position relative to other nodes in the network," the researchers explain. "Possible ways for the attacker to then exploit the wormhole include discarding rather than forwarding all data packets, thereby creating a permanent Denial of Service...or selectively discarding or modifying certain data packets." The researchers' proposed solution is to tag each data packet with "packet leashes" that allow each network node to ascertain the packet's point of origin. Packet leashes could consist of GPS data or a timestamp derived from a synchronized network clock. Counterpane founder Bruce Schneier argues that deploying such safeguards would be very expensive, and suggests an alternative solution in which network nodes exchange bits very quickly and the actual distance between nodes is gauged using the speed of light.
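The leash idea can be illustrated with a short sketch: a receiver rejects any packet that, by timing or by GPS position, could not plausibly have come from within normal radio range, which is exactly what a wormhole's long detour violates. The field names and parameter values below are hypothetical simplifications; the researchers' actual protocol also authenticates the timestamps and positions it relies on.

```python
import math
from dataclasses import dataclass

C = 299_792_458.0  # speed of light in m/s

@dataclass
class Packet:
    payload: bytes
    t_sent: float                         # sender's timestamp, seconds
    sender_pos: tuple = (0.0, 0.0)        # sender's claimed GPS fix, meters

def accept_temporal(pkt: Packet, t_recv: float,
                    max_range_m: float, clock_skew_s: float = 0.0) -> bool:
    # Temporal leash: upper-bound how far the packet could have traveled
    # at light speed; a wormhole's relay shows up as extra delay.
    travel_upper_m = (t_recv - pkt.t_sent + clock_skew_s) * C
    return travel_upper_m <= max_range_m

def accept_geographic(pkt: Packet, recv_pos: tuple,
                      max_range_m: float, gps_err_m: float = 0.0) -> bool:
    # Geographic leash: sender and receiver must be within radio range,
    # allowing for GPS error at both ends.
    return math.dist(pkt.sender_pos, recv_pos) - 2 * gps_err_m <= max_range_m
```

For example, with a 500-meter radio range, a packet received one microsecond after sending (at most ~300 m of travel) passes the temporal check, while a packet that a wormhole relayed across 30 km arrives far too late and is dropped.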
- "State of the Art: Wasted Chip Power"
NewsFactor Network (05/16/03); Ryan, Vincent
Despite the marketing hype coming from AMD and Intel, 64-bit computing is unlikely to make an immediate, dramatic impact on corporate IT operations. Even if companies adopt 64-bit computing platforms, much of the possible processing power will be wasted because of a dearth of 64-bit applications. Technical applications and databases have been quicker to take advantage of 64-bit architecture, says Intel's Barbara Grimes, who also notes that about 300 64-bit applications and tools are already available, including Microsoft's SQL Server, Windows Server 2003, Oracle's 9i database, and some Linux versions. Improving database speeds will mean faster end-user applications regardless, but Aberdeen Group chief research officer Peter Kastner says day-to-day applications such as Siebel and SAP are not being ported to 64-bit versions. He says when operations move to "real-time" processing, there will be more tangible demand for 64-bit software. Gartner analyst Joseph Byrne says the faster processing speeds possible with 64-bit chips are not the only benefit: Integrated main memory interface and HyperTransport support are also important gains and enable the use of 2-, 4-, and 8-way servers. On the desktop, 64-bit computing will not be a major issue until the end of the decade, according to Grimes, because there is no need for petabyte-level memory addressability in PCs. AMD's Athlon 64, which also handles 32-bit processing and competes against the Pentium 4, may provide a beachhead for 64-bit applications on the desktop, says IDC researcher Shane Rau, who notes that 64-bit adoption will ramp up as the prices of 64-bit and 32-bit processors even out and buyers have no reason not to buy 64-bit systems.
- "Ultra Wideband: Gaining Momentum"
InternetNews.com (05/16/03); Mark, Roy
Supporters of ultra wideband (UWB) are predicting a surge in home networking products based on the technology thanks to the FCC's February 2002 ruling authorizing commercial, unlicensed UWB implementation. The agency supposedly approved the authorization because of the technology's potential impact on public safety, which includes radar imaging that penetrates the ground and walls to aid in rescue operations. However, industry analysts think UWB-based wireless electronic products could debut as early as Christmas 2003, in the form of camcorders that can transmit wireless streaming video to computers or a TV. Other expected innovations include flat-screen computer monitors that can wirelessly link to a CPU anywhere in the house, wireless TV-to-TV transmission of programs, and wirelessly connected TVs and VCRs. To allay concerns from law enforcement, the military, and other entities that UWB devices may interfere with essential public services and military operations, the FCC ruled that such devices can only run in the 3.1 GHz to 10.6 GHz frequency band, while operation can only take place indoors or via handhelds if peer-to-peer communication is involved. The FCC hosted an Ultra Wideband event this past February, where several UWB technologies were spotlighted. Technologies on hand included Time Domain's handheld RadarVision device, a radar imaging system that uses the PulsON chipset, which the FCC certified last September; and a UWB intercommunications system from Multispectral Solutions, which was created for the U.S. Navy to reduce cabling clutter and prevent accidents. The odds for UWB's success would be significantly improved if the industry can work out a universal wireless standard interface.
- "Digital Solutions to Government Challenges"
This year's National Conference on Digital Government Research (dg.o2003) on May 19-21 showcased an array of digital government projects. UrbanSim from University of Washington researchers simulates city growth so that policymakers in Honolulu, Seattle, and Salt Lake City can anticipate future trends. Artificial intelligence is employed in Stanford University's Regnet, which outlines legal advice people can follow to more easily wade through government laws and regulations, while the University of Arizona's Coplink data-mining engine also uses AI to make deductions from random clues. Pennsylvania State University's Quality Graphics Project is designed to extract patterns from American data to map out health, habits, family life, and living conditions in the United States. Researchers at Iowa State University and the University of California, Santa Barbara, have collaborated on Project Battula, a wearable database-navigation uplink that allows field data collectors to construct databases at the scene. More than 200 participants from the government and academic sectors convened at dg.o2003, which is hosted by the National Science Foundation (NSF) under the aegis of the Digital Government Research Center. The Digital Government initiative of the NSF focuses on innovations in government-citizen interchange, studies how information technology affects democratic processes, and bolsters government agency applications, among other things.
- "W3C Readies New Tech Patent Policy"
Computerworld (05/19/03) Vol. 37, No. 20, P. 12; Sliwa, Carol
World Wide Web Consortium (W3C) director Tim Berners-Lee recently announced that a decision on the organization's technology patent policy--one that addresses patent claims that could be a hindrance to interoperable Web standards development--is imminent, now that the consortium has had time to evaluate public comments. W3C patent policy working group chairman Daniel Weitzner says his group has been laboring for more than three years to fashion a policy in keeping with the "overwhelming goal" to develop royalty-free standards. However, the group added a loophole that allows members to contemplate other licensing terms in situations where royalty-free standards are precluded. Weitzner says this provision was tacked on to address concerns of members who support a reasonable and nondiscriminatory patent licensing model. "I think the W3C is being practical in allowing a specification to go forward and working out the [intellectual property] from some third party as they go," says IBM's Karla Norsworthy. The W3C's patent policy was at the heart of a recent inquiry among certain consortium members who questioned why IBM, Microsoft, and other vendors have gone through the Organization for the Advancement of Structured Information Standards (OASIS) to submit Web services standards proposals instead of the W3C. IBM and Microsoft said that patent issues were not the reason the vendors sent a proposed Business Process Execution Language (BPEL) standard to OASIS. Norsworthy insisted that IBM will only submit proposals that entail a royalty-free license, given the high desirability to broadly adopt Web services standards.
- "Open Source Gets Secure"
Washington Technology (05/12/03) Vol. 18, No. 3, P. 84; Jackson, Joab
The government sector is pushing for official security credentials for open-source products. A coalition involving the Open Source Software Institute, Hewlett-Packard, IBM, and other groups is working to certify an encryption technology commonly used in Web pages. The coalition is trying to obtain FIPS 140-2 certification for OpenSSL, which is used to provide security for credit card numbers and other sensitive data. A second initiative is a DARPA-funded Navy project. The Navy has developed software designed to examine Linux computers in a way that complies with military requirements, and informs pre-selected groups if any unusual actions take place. The two offerings affect agencies and service providers working on the Defense Department's Global Information Grid project and other network projects. Meanwhile, more and more government agencies are opting to use open-source solutions, including Linux deployments by the Federal Aviation Administration and the departments of Agriculture and Energy. Open Source Software Institute Chairman John Weathersby says that once open source products pass stringent security tests, they likely will be used in mission-critical applications. He says, "We're working with folks right now within branches of the military who really want to use some of this technology."
- "Where Are Your New Ideas Coming From?"
Darwin (05/03); Chesbrough, Henry
The closed innovation model, in which companies build central labs to research and develop technology and products that pay for continued R&D, is still valid for certain industries, but is no longer applicable for many more, writes Henry Chesbrough in his book, "Open Innovation: The New Imperative for Creating and Profiting from Technology." The obsolescence of the closed innovation paradigm has been hastened by a quartet of factors. The explosive growth of skilled, mobile workers created an auction market that allows startups or entrepreneurs to hire "the best and the brightest" away from larger companies, thus threatening the bigger firms' ability to sustain R&D investment--IBM, for example, sold off its hard-disk-drive business when its leadership in that area was eroded as engineers left to work for other companies or start their own. The availability of skilled workers grew even more with changes in U.S. immigration policy. Between 1980 and 2000, U.S. venture capital ballooned from roughly $700 million to over $80 billion; this, combined with lucrative stock option packages, attracted many key lab personnel in large companies to startups, which threatened to dilute the knowledge bank their former employers relied on. Together, the VC explosion and worker availability and mobility have given research employees an alternative to keeping ideas on the shelf while waiting for internal development to explore them: Commercializing those ideas independently. External suppliers are becoming more proficient at delivering products that are just as good if not better than what an internal corporate R&D lab can concoct, a trend that allows large firms to bring more products to market faster while simultaneously giving smaller rivals the opportunity to overtake larger companies.
To take advantage of the changes wrought by the above factors, companies must move to an open innovation model that can tap into a much more diffuse knowledge base and leverage external research inputs that could yield new products and services.
- "Surveillance Nation--Part Two"
Technology Review (05/03) Vol. 106, No. 4, P. 46; Farmer, Dan; Mann, Charles C.
As the United States ramps up wide-scale surveillance initiatives with little grumbling from citizens, technologies are being developed that carry both the promise of better security and the threat of privacy invasion. But personal privacy may not necessarily be sacrificed, given that the configuration of the vast databases needed to record and manage surveillance data may help set up accountability and usage safeguards. "Accountability as to who is accessing what, altering what data, not updating stuff that should have been corrected, etc., is absolutely vital" to uncovering data misuse, proclaims SRI computer scientist Peter G. Neumann. Techniques used to prevent databases from being choked with input, such as preprocessing and distributed computing, could be leveraged to ensure accountability. Meanwhile, Stanford University's Lawrence Lessig argues that usage restrictions must be implemented in order to curb data abuse; access controls such as biometric smart cards could allow only authorized people to access databases, which would record the people logged on and their activities. However, Lessig strongly doubts that such measures will be deployed, given the shift in the government's policy on privacy in the wake of terrorist incidents, as well as people's inclination to use new technologies without first considering the ramifications. Another argument for embedding accountability is the fact that database integrity is inherently suspect, and a combination of equal access for everyone and careful usage monitoring will deter database exploitation. Carl Botan of Purdue University observes that the point of pervasive surveillance could be nullified by unexpected "panoptic effects," such as a withdrawal of communications as a result of people's awareness of being monitored. Legislation favoring regulation of surveillance databases will not happen without sufficient demand from the public, which is why open discussion is urgently needed.
(Access to the full article is available to paid subscribers only.)
Peter G. Neumann is co-chair of ACM's Advisory Committee on Security and Privacy: http://www.acm.org/usacm/ACSP/homepage.htm