Volume 5, Issue 489: Wednesday, April 30, 2003
- "Scientists Protest EU Software Patents"
CNet (04/28/03); Broersma, Matthew
A cadre of 31 European scientists has signed a petition submitted to the European Parliament opposing a proposal that they fear could establish a U.S.-style patent system in the European Union that allows software ideas and algorithms to be patented. They argue that such a development would force small software developers to pay royalties to large firms that own thousands of software-related patents. The scientists want the parliament to embrace provisions that would "make impossible, clearly, for today and tomorrow, any patenting of the underlying ideas of software [or algorithms], of information processing methods, of representations of information and data, and of interaction between human beings and computers." The proposal would legalize the European Patent Office's (EPO) granting of such patents, which the petition calls an abusive practice that violates the spirit of the European Patent Convention. Although the convention expressly forbids the patenting of computer programs or business methods "as such," lawyers contend that the EPO's recent case law is blurry as to what constitutes a computer program, in contrast to something that uses a computer as one element, for instance. The European Parliamentary Committee on Legal Affairs and the Internal Market (JURI) supports the measure, according to European Digital Rights. However, Bristows lawyer Alex Batteson does not believe the EU will adopt a U.S.-style system, and argues that it is premature for protesters to pass judgment on the proposal, as it may be significantly revised or even rejected. The European scientists are adding their voice to those of 143,000 people opposed to software patenting who signed a separate but similar EuroLinux Alliance petition.
- "Sending of Spam With Fraud Is Now Felony in Virginia"
New York Times (04/30/03) P. A1; Hansell, Saul
Growing public anger toward unsolicited commercial email and the deceptive methods that spammers use is causing Congress and U.S. states to consider tough solutions, and one of the harshest anti-spam measures was passed into law by the state of Virginia yesterday. Under the new law, bulk emailers who use fraudulent practices to send over 10,000 emails to or from Virginia in the course of a day could spend up to five years in prison, and would be stripped of all profits and assets related to their spamming activities. The FTC reported yesterday that two-thirds of spam is sent with a misleading subject line or a false return address, methods that have been criminalized under the Virginia statute. Virginia Gov. Mark R. Warner is hopeful that the law will deter spammers, since 50 percent of all Internet traffic passes through Virginia-based ISPs such as AOL, which significantly extends the reach of prosecutors to bulk emailers in other states and jurisdictions. Although over 24 states have enacted anti-spam laws, their effectiveness has been undercut by low penalties and enforcement difficulties, and some industry members think federal anti-spam legislation is a better solution. "We need a single strong national policy to deal with spam so that no one can play the states off against each other," asserts Shane Ham of the Progressive Policy Institute. Proposals include a bill from Sens. Conrad Burns (R-Mont.) and Ron Wyden (D-Ore.) requiring purveyors of spam to clearly label their messages as such and outlawing falsification of the sender's name or subject; a measure from Sen. Charles E. Schumer (D-N.Y.) calling for a national registry of people who wish not to receive spam; and a bill from Rep. Zoe Lofgren (D-Calif.) that would, among other things, force spammers to pay a bounty to computer users who report their activities.
(Access to this site is free; however, first-time visitors must register.)
- "DARPA Funds TIA Privacy Study"
InternetNews.com (04/29/03); Mark, Roy
The Air Force Research Laboratory (AFRL) Information Directorate has awarded a $3.5 million contract to the Palo Alto Research Center (PARC) to study the individual privacy protections of the Total Information Awareness (TIA) program. The TIA is being developed under the aegis of the Defense Advanced Research Projects Agency (DARPA), which is also funding the study. DARPA is additionally supporting a database integration project that would allow TIA to map out the "information signature" of people in order to detect and track possible terrorist activity. PARC engineers will develop privacy filters, "aliasing" techniques, and automated data purging agents to ensure that the privacy of American citizens is adequately shielded. "We will develop techniques that restrict analysts looking for potential terrorists activities from necessarily knowing the identities of the individuals who might fit patterns attributed to that activity," says AFRL's Patrick K. McCabe. The Senate recently voted to suspend TIA funding if the intelligence community fails to provide Congress with a detailed report on how the system could impact privacy and civil liberties. The vote also dictates that no agency can implement TIA without congressional permission. Nevertheless, the president can approve continued TIA funding as well as the deployment of TIA for foreign military operations.
- "Are Internet Ballots a Vote-Fixer's Dream?"
International Herald Tribune (04/28/03); Dembart, Lee
A number of computer experts are worried about government elections being electronically tabulated, with some even collecting votes via the Internet. The United Kingdom is conducting 17 such electronic elections this week in different localities, with over 1.5 million citizens participating. Many other European countries and U.S. states have already conducted pilot elections online or using computers, and others have such tests planned. Technologists note that electronic elections, especially those that are Internet-based, are tremendously vulnerable because of the many ways people could manipulate the system or simply shut it down using a denial-of-service attack, for instance. Some experts note that a computer virus on people's computers could become active only on election day and change their votes at the user level, a type of tampering that would not be noticeable to authorities. Stanford University professor David Dill has assembled signatures from about 500 computer experts asking governments not to implement electronic elections until certain security practices are in place. Their requirements include an audit trail allowing citizens to verify their votes, a publicly monitored electronic ballot box to ensure count accuracy, and protections for voters' anonymity. Existing systems do not offer this level of security, and experts warn government officials not to be lulled into a false sense of security because perpetrators might be waiting for a large-scale election or may already be changing votes undetected. Ian Brown, director of the Foundation for Information Policy Research, said, "We are worried about the security of electronic voting systems, especially remote ones...we don't think that home PCs are a secure enough platform for something as truly vital to democracy as the voting system."
To read more about e-voting concerns, visit http://www.acm.org/usacm.
- "The War Against Spam"
Financial Times (04/29/03) P. 11; Morrison, Scott
Jupiter Research estimates that the number of unsolicited commercial emails users receive annually has skyrocketed from 140 billion in 2001 to 319 billion in 2003, while the average email recipient is expected to have to wade through more than 3,900 pieces of spam each year by 2007. Meanwhile, Ferris Research reckons that U.S. corporations lose nearly $9 billion a year because of spam. "People are seeing their online activities buried under this avalanche of trash," declared an aide to Sen. Ron Wyden (D-Ore.), who proposed an anti-spam bill this month. Both legitimate marketers and Internet companies have started to warm to the idea that government regulation may be a key ingredient of controlling spam. The FTC will open a three-day workshop tomorrow where representatives of ISPs and software firms, as well as regulators, anti-spam activists, and spammers will convene to outline a strategy to deal with spam. Three weapons are being leveraged against the growing pile of spam: Technology such as filtering software and anti-spam blacklists, which have had marginal success because spammers are constantly tweaking their methods to bypass such safeguards; litigation, an expensive and time-consuming option that does not always guarantee a victory for ISPs that file suit against spammers; and federal legislation, which is gaining acceptance. Activists are lobbying the government for a prohibition on commercial bulk email that is not specifically requested by recipients, a measure that major ISPs and organizations such as the Direct Marketing Association are opposed to. Earthlink VP David Baker cautions that bickering between commercial interests and anti-spam advocates will only hamper the passage of anti-spam legislation while the spam problem gets worse.
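The filtering software mentioned above typically works by scoring a message against word statistics learned from known spam and legitimate mail. A minimal naive-Bayes-style sketch follows; the training messages and constants are invented for illustration and do not reflect any real product's corpus or algorithm:

```python
import math
from collections import Counter

# Toy training corpora -- invented examples, not real mail data.
SPAM = ["win cash now", "free free offer now", "cash offer click now"]
HAM = ["meeting notes attached", "lunch tomorrow", "project status attached"]

def train(messages):
    """Count word frequencies across a list of messages."""
    counts = Counter()
    for msg in messages:
        counts.update(msg.lower().split())
    return counts

spam_counts, ham_counts = train(SPAM), train(HAM)
spam_total, ham_total = sum(spam_counts.values()), sum(ham_counts.values())
vocab = set(spam_counts) | set(ham_counts)

def spam_score(message):
    """Log-odds that a message is spam, with add-one smoothing."""
    score = 0.0
    for word in message.lower().split():
        p_spam = (spam_counts[word] + 1) / (spam_total + len(vocab))
        p_ham = (ham_counts[word] + 1) / (ham_total + len(vocab))
        score += math.log(p_spam / p_ham)
    return score

print(spam_score("free cash now"))     # positive score: spam-like
print(spam_score("meeting tomorrow"))  # negative score: ham-like
```

The article's point about spammers "tweaking their methods" corresponds to an arms race against exactly these word statistics: misspellings and random padding are cheap ways to dodge a static vocabulary.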
- "As Privacy vs. Security Debate Heats Up, NSF Primes Sensor Pump"
Small Times (04/25/03); Fitzgerald, Michael
The National Science Foundation (NSF) is set to fuel standards-setting and new products in the nascent wireless sensor sector. John Cozzens, program director for the NSF's Signal Processing Sensor Program, said at a recent Palo Alto Research Center (PARC) workshop that wireless sensors are one of the most important areas of technology research. More than one-third of the NSF's $110 million Information Technology research fund will go toward its Sensors and Sensor Networks initiative. PARC hosted the second Information Processing in Sensor Networks workshop for 180 participants, which Cozzens called the most significant gathering of researchers in the area of sensor networks. He said the drivers for sensor networks are security and surveillance, and that privacy dangers are a trade-off for increased security. Dust CEO Kris Pister said the basic technology needs a lot of work; Dust builds networks of sensors, or motes, but is struggling to increase their active lifespan. Pister said the larger the network becomes, the faster it runs out of power--a 300-mote network could send 10 alarms each day for two weeks, while a three-mote network could operate for four years under the same conditions. Pister predicted that nanoscale sensors will follow once the necessary components are produced, including communications filters, storage, processors, and power cells. PARC researcher Feng Zhao is working on microelectromechanical systems (MEMS) sensors and accompanying applications, and said the major barrier to the field currently is the lack of a "killer app"; once that is found, Zhao predicted the quick entry of commercial capital and lowered cost barriers due to mass production. He said NSF funding is key to the development of standards and that his laboratory is researching the possible involvement of TCP/IP.
Click Here to View Full Article
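Pister's scaling observation--that larger mote networks exhaust their batteries faster--can be illustrated with a back-of-the-envelope energy model. Every constant below is invented, and the square-root hop count is an assumed property of a multi-hop mesh, not Dust's actual figures; only the directional trend matters:

```python
import math

def network_lifetime_days(n_motes, alarms_per_day, battery_mj=5000.0,
                          idle_mj_per_day=1.0, relay_mj_per_hop=0.5):
    """Toy model: in a multi-hop mesh, an alarm is relayed across roughly
    sqrt(n) motes, so each mote's share of relay traffic -- and hence its
    daily energy drain -- grows with network size."""
    hops = math.sqrt(n_motes)
    daily_drain = idle_mj_per_day + alarms_per_day * relay_mj_per_hop * hops
    return battery_mj / daily_drain

print(network_lifetime_days(3, 10))    # small network: lifetime in years
print(network_lifetime_days(300, 10))  # large network: lifetime in weeks
```

Under this (hypothetical) model the three-mote network outlives the 300-mote network by roughly an order of magnitude at the same alarm rate, which is the qualitative tradeoff Pister describes.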
- "Licensed to War Drive in N.H."
Wired News (04/29/03); McWilliams, Brian
New Hampshire is considering legislation that could make it legal to exploit open wireless networks in the state, a first that the Electronic Frontier Foundation (EFF) calls "enlightened." "It seems like a fairly clean way of accommodating the geek-culture practice of having open wireless access points without doing anything bad for security," declared EFF attorney Lee Tien. House Bill 495, which would take effect next January if approved by the state's Senate Judiciary Committee and signed into law, would limit wireless network operators' ability to prosecute war-drivers and others who tap into their networks if they fail to secure them properly. Furthermore, alleged intruders can be acquitted under existing state law if they can prove that they accessed a wireless network believing it was supposed to be open. Security features on wireless systems are typically deactivated when they are shipped, while some wireless owners intentionally keep their access points open; they are usually members of the open network movement, which seeks to build a worldwide grid of Web-linked wireless access points. However, such access points could inadvertently connect to insecure wireless local area networks (WLANs) in close proximity, and the policy for handling such unintentional intrusions is vague. There are a number of safeguards network owners can deploy to prevent intrusions, such as MAC address filtering, passwords, and Wired Equivalent Privacy enablement, but ZNQ3 CEO Jeff Stutzman notes that most home and small-business users cannot afford such tools. N.H. Senate Judiciary Committee Chairman Andrew Peterson said the purpose of HB 495 is to protect anyone who accidentally encounters vulnerable wireless networks, and added that arguments from anyone who thinks the proposal could limit protections for victims of wireless hacking are welcome.
- "Speech Recognition Programs Still on ABCs"
Boston Globe (04/28/03) P. C2; Bray, Hiawatha
The adoption of speech recognition technology has proceeded at a slow pace due to computers' general inability to understand the many nuances of human speech. Speech software and embedded speech devices are doing well, but they specialize in niche applications such as telephone-based speech recognition. Radiologists and other medical personnel are also using speech recognition software to transcribe diagnoses because it fits well with their limited terminology. Speech recognition software such as IBM's ViaVoice can interpret words with a high degree of accuracy, but understanding the context of speech is beyond the capability of even the most powerful computers in use today. For instance, the computer cannot comprehend vocal inflections, and must rely on explicit punctuation commands. Noise is another stumbling block--humans can filter out extraneous sounds, but speech recognition software considers all noise to be speech. The noisy environment of the workplace is a major hurdle that speech technology must overcome if it is to be widely adopted by corporate users. Until context-aware speech programs emerge, the technology will be shut out of most corporate and home-based markets.
Click Here to View Full Article
- "A New Way to Catch a Hacker"
New York Times (04/28/03) P. C4; Thompson, Nicholas
The nonprofit Honeynet Project, the brainchild of computer security expert Lance Spitzner, has spent the last four years studying hackers and the intrusion methods they use by allowing them to break into honeypots--systems intentionally designed to be compromised. Spitzner's latest area of concentration is honeytokens, a 17-year-old security methodology in which seemingly important information that actually serves no useful purpose triggers an alert whenever it is viewed, captured, or downloaded. For instance, a hacker who steals files from a credit card company could be detected because the purloined information includes a bogus credit card number keyed to a "sniffer" program that raises an alarm when that false data is accessed. Honeytokens can help reduce incidents in which innocent parties are identified as hackers, since they are designed to only be accessed intentionally. Michael Vatis of Dartmouth College's Institute for Security Technology Studies notes that the Defense Department could employ honeytokens to catch people trying to access unauthorized data on weapons systems. The technology could also be used to trace internal security leaks: Institute for Security and Open Methodologies managing director Pete Herzog says he has inserted honeytoken typos into corporate memos to catch employees downloading prohibited content. However, honeytokens are not invulnerable--crafty hackers can circumvent them in a number of ways, including compressing and password-protecting stolen information. In addition, some experts are concerned that the use of honeytokens could constitute a violation of the federal Wiretap Act.
(Access to this site is free; however, first-time visitors must register.)
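The credit-card example above can be sketched in a few lines: plant values that no legitimate process ever touches, then watch traffic or logs for them. The token values and log format below are hypothetical, and real deployments would scan network captures rather than strings:

```python
# Hypothetical honeytokens planted in a database -- bogus values that no
# legitimate workflow ever reads, so any sighting of one is an alert.
HONEYTOKENS = {
    "4929-1234-5678-9012": "fake card number in customers table",
    "jdoe.decoy@example.com": "decoy address in contacts export",
}

def scan(line):
    """Return (token, label) alerts for any honeytoken seen in a line
    of log or captured traffic."""
    return [(token, label) for token, label in HONEYTOKENS.items()
            if token in line]

for token, label in scan("GET /export?cc=4929-1234-5678-9012 HTTP/1.0"):
    print(f"ALERT: honeytoken accessed ({label})")
```

This also shows why compression or encryption of stolen data defeats the technique, as the article notes: a substring match on the wire can no longer see the planted value.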
- "Geek Debate Gains National Prominence"
Financial Times Special Reports (04/30/03) P. 4; Waldmeir, Patti
Major American institutions have begun to work on balancing out the public and private ownership of ideas--a topic formerly limited to computer enthusiasts--in a way that spurs innovation without hindering follow-on innovation. This debate stems from the issue of digital piracy, which has recently become the subject of intense judicial wrangling. Controversy has erupted over a bevy of lower-court digital piracy cases and decisions, including a federal district court's ruling that ISPs must give the names of alleged pirates to music companies when asked, without requiring a court order; and a Los Angeles justice's determination that Sharman, the parent company of the Kazaa file-sharing service, can be sued under U.S. law despite being geographically based elsewhere. Legal experts believe that a true digital piracy solution will involve a combination of technological and business solutions concocted by content owners and the entertainment industry, as well as federal regulations. Meanwhile, the Supreme Court earlier this year upheld a 1998 copyright extension statute in the case of Eldred v. Ashcroft, a move that infuriated critics who think that the law violates the right to free speech and the Constitution's copyright clause, and only serves to stifle innovation. The Supreme Court also ruled in March that a plaintiff who files suit under the Federal Trademark Dilution Act must show clear proof of dilution rather than merely demonstrating its likelihood, which has engendered criticism from the International Trademark Association. Legal experts took issue with the ruling because it does not clearly define dilution, or which trademarks qualify for protection under the dilution act. Federal Reserve Chairman Alan Greenspan commented that at the heart of all these cases is the need to sync "the interests of those who innovate and those who would benefit from innovation."
- "Middleware Initiative Contributes Third Software Release"
The National Science Foundation Middleware Initiative (NMI) has made the third release of its software toolset available to the public. NMI seeks to create the middleware and other software components necessary for wider online scientific collaboration among government, academia, and other research entities. The two main components of the recent release are software from the Enterprise and Desktop Integration Technologies (EDIT) Consortium and from the Grid Research Integration Deployment and Support (GRIDS) Center. EDIT components address the difference in authentication policies and technologies between scientific collaborators, and the recent release includes Shibboleth, an architecture that allows tighter integration of local and distributed content through better IP authentication; the GRIDS Center software suite includes the leading software for connecting grid computing systems. While containing grid computing mainstays such as the Globus Toolkit, the package features new software such as credential authority MyProxy and a customization tool called GridConfig. Dan Reed, a principal project investigator for the Extensible Terascale Facility (ETF), says GRIDS Center software helped accelerate the deployment of the TeraGrid, a major component system of the ETF. Besides coordinating the packaging of these collaboration tools, the NMI also awards grants to other smaller projects. More than 70 proposals were received by the group for fiscal year 2003 consideration.
- "Suppliers Spar as Fast USB Nears"
EE Times (04/28/03); Cataldo, Anthony
A 480 Mbps "high-speed" version of the USB 2.0 connectivity standard is expected to come out before 2004, and most vendors are mating it to the "On The Go" (OTG) specification. USB On-The-Go products are already being offered by several suppliers, and a certification-testing methodology for OTG devices is expected to be completed this summer by the USB Implementers Forum. ARC officials believe USB On-The-Go products will hit the market by year's end, and Microsoft is embedding USB On-The-Go into its Media2Go platform. However, standardizing the high-speed USB version could prove difficult because not all vendors agree on the best way to connect USB controller chips to a transceiver with a minimum number of pins. Philips' solution is a bus "wrapper" known as the Low Pin Count interface, which would convert the data passing through the transceiver into packets in a rapid, serialized fashion; Cypress' Super High PHY is not only designed to minimize the pin count, but to be compatible with I/O protocols outside of USB, according to people close to the specification. The dissension between interfaces is fueled by the mainstreaming of process technology and the move to 0.13 micron chip features, which has led to a shift away from including host and transceiver on the same piece of silicon. The UTMI-plus on-chip interface, which will be finalized in a few weeks, should allow chip vendors to combine controller and transceiver for chips with features larger than 0.13 micron.
- "Borg Aimed for Achievement"
Investor's Business Daily (04/29/03) P. A3; Bonasia, J.
Computer scientist and pioneer Anita Borg, who passed away on April 7 at the age of 54, dedicated much of her professional life to encouraging women to pursue careers in the high-tech and science fields. "The industry owes Anita, the woman, the pioneer, the scientist, the entrepreneur, a debt of gratitude," declared Hewlett-Packard CEO Carly Fiorina. "She dared to dream herself. She inspired others to dare, and she supported so many to realize their dreams." In response to the low number of women in the computer industry, Borg founded Systers, an email list designed to provide women with an online community where they could network and mentor each other; the organization currently consists of 2,500 members throughout 38 countries. Borg also established educational programs to make women proficient in methods and technology in preparation for high-tech jobs. She set up the Institute for Women and Technology in 1997: The institute's Virtual Development Center, one of her brainchildren, partners undergraduates with everyday women and girls to collaboratively design technology that fulfills their needs. Among her technical achievements were the development of precedent-setting computer memory and caching technologies, and the conception of a fault-tolerant Unix-based operating system. Borg married her research to a practical viewpoint, and proposed technology such as a digital calendar that can be displayed on a refrigerator, and "smart plumbing" that detects leaks via sensors and networks.
To read more about Anita Borg, visit http://www.iwt.org/news/anitaborg/inmemory.htm.
- "Georgia Tech Researchers Use Lab Cultures to Control Robotic Device"
A research team at Georgia Institute of Technology's Laboratory for Neuroengineering aims to build computing systems whose performance mirrors that of the human brain. Their latest innovation is the Hybrot, a robotic device that is controlled by a network of cultured rodent neurons; lead researcher Steve Potter describes the robot as a "neurally-controlled animat." The Hybrot features a few thousand living rat neurons encapsulated in a droplet that is deposited in an incubator with 60 micro-electrodes. The electrodes record neural activity and transmit it to the robot body, which moves accordingly and sends electrical feedback to the neurons. Potter's team keeps track of morphological and connectivity changes in the culture via high-speed cameras and voltage-sensitive dyes in order to determine if the neural networks are growing and learning as time goes on. Potter not only hopes the Hybrot will teach scientists how to apply organic neural networks to artificial computing systems, but also help them better understand the fundamentals of learning, memory, and information processes in humans in order to assist people who are physically or mentally handicapped--for instance, neural interfaces such as those Potter's group is developing could allow people to control prosthetic limbs by thought. Other advances that this research could help nurture include artificially intelligent systems such as self-driving automobiles, and more sophisticated computing architectures. Potter's research is funded by a $1.2 million grant from the National Institutes of Health.
Click Here to View Full Article
- "Halting Nanotech Research 'Illogical', Says Pioneer"
New Scientist (04/29/03); Knight, Will
Foresight Institute Chairman Eric Drexler, who coined the term "nanotechnology," argues that a ban on nanotech research, as suggested by a team of researchers at the University of Toronto's Joint Center for Bioethics, makes little sense. Although he agrees that nano-particles carry a certain degree of safety risks, he calls the proposed moratorium "an attention grabbing mechanism" based on a politically driven misinterpretation rather than technical documentation. "These days nanotechnology is more a marketing term than a field and you can't do regulation based on marketing terms," Drexler explains. In his opinion, the two biggest potential threats are nanotech being controlled by unforthcoming or reckless people and oppressive over-regulation of the technology. Drexler, who defines nanotech as technology based on molecular machines that can reproduce themselves, believes the mechanical positioning of reactive molecules is the key breakthrough needed for such systems to emerge. Researchers who are trying to accomplish this include Nadrian Seeman's research team at New York University, which is assembling structures out of DNA. Drexler thinks that the first molecular machine applications to emerge could include a device that can read a genome very quickly and is about the same size as a small cluster of protein molecules. He says that carbon nanotubes are much simpler than the molecular machines he envisions, but notes that they could contribute to the machines' development.
- "Digital Cells"
Science News (04/26/03) Vol. 163, No. 17, P. 267; Klarreich, Erica
Researchers are working to implant computer programs into human cells so they can fulfill a wide array of functions, including pollutant cleanup, detection of cancer cells, and the manufacture of antibiotics or molecule-sized electronics. Cells have their own versions of on-off switches and feedback loops, while DNA strands store programs for making proteins; genetic circuits--novel strings of genes and control centers--are being constructed and given the ability to perform complex operations through the use of inverters. Although scientists do not expect cellular computers to supplant conventional machines, they hope to leverage properties and processes unsurpassed by silicon devices, such as high tolerance to hot and acidic surroundings, efficient synthesis of useful chemicals, self-replication, awareness of the slightest external variances, and environmental interaction. A team led by Princeton University's Ron Weiss recently completed a five-gene circuit in E. coli bacteria that activates a fluorescent protein when a specific concentration of a particular chemical is present, a breakthrough that could be employed to trace environmental toxins. Engineers use mathematical simulation--via the computer-aided BioSPICE design tool--to predict how their genetic circuits will act prior to their synthesis, and Weiss describes genetic circuit design as "a continual process of simulation, refinement, simulation, refinement, until it works." Rather than having engineers revise designs over and over, Weiss allows the DNA to mutate, thus enabling the optimum circuit design to emerge through natural selection. Technical barriers complicate the assembly of large circuits from cellular logic gates, and Drew Endy and Thomas Knight of MIT are trying to simplify the process by developing a library of BioBricks--standard fundamental components that genetic circuit engineers could connect. 
Knight says, "We want to move away from the situation where you build the system and pray that it will work, toward the situation where you build the system and, unless you've done something stupid, it will work."
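The inverters the article mentions are repressor genes: a high input protein level shuts off production of the output protein, and vice versa. A toy simulation using a standard Hill-repression model illustrates why chaining two of them restores the input's logic level; all rate constants below are made up for illustration and are not BioSPICE's actual equations:

```python
def hill_repress(x, k=0.5, n=2):
    """Hill repression: protein production rate falls as repressor x rises."""
    return 1.0 / (1.0 + (x / k) ** n)

def simulate(input_level, steps=2000, dt=0.01, decay=1.0):
    """Two chained genetic inverters: the input protein represses gene A,
    and A's protein represses gene B. Each protein level follows
    production (Hill repression) minus first-order decay, integrated
    with simple forward-Euler steps."""
    a = b = 0.0
    for _ in range(steps):
        da = hill_repress(input_level) - decay * a
        db = hill_repress(a) - decay * b
        a += da * dt
        b += db * dt
    return b

print(simulate(5.0))  # input high -> A low -> B high
print(simulate(0.0))  # input low  -> A high -> B low
```

Weiss's "simulation, refinement" loop amounts to adjusting parameters like `k` and `n` (binding strength and cooperativity) until the circuit's high and low output levels are cleanly separated, as they are here.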
- "Sliver of the Pie"
InformationWeek (04/28/03) No. 937; McGee, Marianne Kolbasuk
IT salaries appear to have more or less flattened, while job satisfaction has declined 10 percent over the past two years, according to InformationWeek's 2003 National IT Salary Survey; to succeed in the current market, IT workers must focus on constantly updating their skills and never losing sight of their employers' business strategies. Furthermore, professionals must be willing to accept certain trade-offs (pay cuts, for instance) in return for health-care benefits, more opportunities to expand their skills, and job stability--in fact, the number of people who cite stability as the reason they are looking for new jobs has risen in the past year from 18 percent to approximately one-third. Managers and staffers list challenge as the No. 1 ingredient in job satisfaction, followed by stability, flexibility, and base pay. The survey notes that over 80 percent of staffers and nearly 90 percent of managers find their jobs to be at least somewhat intellectually satisfying. The number of hours that business-technology professionals work remains the same, although there have been reductions in the hours managers and staffers spend on call. The highest-earning IT staffers, according to job title, are IT architects, systems architects, and project leaders; when classified by skills, the highest-paid workers are those proficient in enterprise application integration, enterprise resource planning, and data mining or data warehousing. There has been a fall-off in base salaries and total compensation for general security and Web security, while salaries for wireless-related professionals have also declined. DaimlerChrysler's Michele Desvignes Schilling reports that bonuses and other kinds of compensation for having special skills or certifications are more customary at technology-oriented firms than at non-technology companies.
Click Here to View Full Article
- "Leveraging a Global Advantage"
InfoWorld (04/21/03) Vol. 25, No. 16, P. 33; Udell, Jon
Dynamic, just-in-time software development is being driven by the growing ranks of freelance programmers, the spread of open-source skills, and the rise of offshore outsourcing. The gap between dispersed workers is being bridged by emerging frameworks and the application of collaborative platforms and open source. Assembla founder Andy Singleton says, "It's no accident that all significant open-source projects are global. That thought should be stuck in the mind of anyone who wants to produce world-class software." The development methodology can be directed either by the client or the outsourcer. Outsourcers should be flexible enough to accommodate both methodologies, as EPAM and Virtusa have done. Both clients and outsourcers enlist developers and project leads in a just-in-time software development team that also takes advantage of the global open-source community. Collaborative transparency is a key element of open source's modus operandi, and an essential component of dynamic development; at the same time, it prevents vendor lock-in. IT managers find that open-source software offers them more control, but requires a hefty commitment in terms of time and intellectual effort. Fortunately, offshore outsourcers possess a wealth of such resources.
Click Here to View Full Article
- "Who Loves Ya, Baby?"
Discover (04/03) Vol. 24, No. 4; Johnson, Steven
Social-network software that visualizes the interactions and relationships within groups of people promises to radically transform large organizations. Mapping social interactions has become easier thanks to the advent of email, chat rooms, Web personals, and bulletin boards. Software designer Valdis Krebs' InFlow--the end result of 15 years' development--provides organizational maps that resemble molecular configurations, with an employee representing each individual molecule; these maps are derived from employee surveys used to determine their collaborative relationships and work patterns. MIT grad student Danah Boyd and programmer Jeff Potter have developed Social Network Fragments, a software program that studies emails sent and received, and from them constructs a map of social networks. The program can outline not only the size of different social groups but the bonds between them. "If we're going to spend more of our social life online, how can we improve what that experience feels like?" asks Judith Donath of MIT Media Lab's Sociable Media Group. "You have this enormous archive of your social interactions, but you need tools for visualizing that history, for feeling like you're actually inhabiting it." Social-network software can also be a useful tool for political analysis.