Volume 5, Issue 518: Friday, July 11, 2003
- "Half-Dozen Anti-Spam Bills Presented to Congress"
SiliconValley.com (07/09/03); Phillips, Heather Fleming
Rep. Heather Wilson (R-N.M.) says the spam issue has reached "a tipping point" where Congress will either have to impose tough spam regulations or institute a prohibition on all forms of unsolicited commercial email, which is estimated to be costing businesses as much as $10 billion every year. Six anti-spam proposals are currently floating in Congress, and confidence is high among industry and consumer groups that a national anti-spam measure will be passed in 2003, despite two earlier failed attempts. California is among several U.S. states working on their own anti-spam laws, but a federal bill would preempt state-approved restrictions. Companies such as AOL, Yahoo!, and Microsoft are particularly laudatory toward a proposal from Reps. Billy Tauzin (R-La.) and Jim Sensenbrenner (R-Wis.) that would require spammers to label their emails as advertisements and include valid return addresses, while giving consumers the right to opt out of receiving commercial email. Consumer advocates, however, favor opt-in legislation that requires marketers to get users' permission before sending them spam. Another bill arousing interest in Congress, supported by Wilson along with Reps. Gene Green (D-Texas) and Anna Eshoo (D-Calif.), would enable private citizens to file civil suits against recidivist spammers, grant state attorneys general more authority to enforce anti-spam regulations, and force companies to remove people from the email marketing lists of all of their affiliates once they opt out. However, Internet companies such as AOL and Yahoo! do not like this last measure, and all agree that legislation alone cannot curb spam. Douglas Sabo of Network Associates says the only truly effective solution is "a combination of legislate, innovate and educate."
- "Cybersecurity Laws Expected"
IDG News Service (07/11/03); Gross, Grant
Rep. Adam Putnam (R-Fla.) told attendees at a recent forum on e-government and cybersecurity that Congress will pass legislation this year that outlines the standards businesses should follow to fortify their cyber-defenses, but added that it is too early to give specifics. Putnam, chairman of the House Government Reform Committee's Subcommittee on Technology, Information Policy, Intergovernmental Relations, and the Census, commented that Capitol Hill's cyberattack response strategy needs work, and acknowledged that many members of Congress as well as the public are unaware of how essential networked technology is to the critical infrastructure of the United States. "We want to put something out there that makes sense, that's balanced, that accomplishes the same goals, without it being this headlong rush to prove that we're doing something for our constituents because we were asleep at the switch when there was this digital Pearl Harbor," Putnam noted. He criticized his colleagues and the Bush administration for prioritizing physical threats over cyber-threats. Putnam also found fault with the cybersecurity efforts of government agencies, which suffer from problems stemming from staff and workplace conflicts rather than technology, and said his subcommittee will debate whether federal agencies outside the Defense Department should comply with certain software security regulations. The congressman's views were aired in response to an inquiry from Entrust Technologies' Daniel Burton, who was concerned with a "creeping aggregation of regulations" such as 1996's Health Insurance Portability and Accountability Act and 2002's Sarbanes-Oxley Act. Burton expressed confidence after Putnam's presentation that his subcommittee will shed further light on private-sector cybersecurity regulations.
- "Visionaries See Flexible Computers Using Less Power"
EE Times (07/10/03); Hammerschmidt, Christoph
Expert computer engineering groups convening in Germany predict Moore's Law will continue for approximately 10 more years, while semiconductors will continue to be silicon-based. Although quantum and neuro-computers may exist on the sidelines, nanotechnology will extend the life of silicon by allowing more advanced wiring techniques. The Association of German Engineers (Information Technology Organization in the VDE) and the Information Science Association (GI) also said wireless technology will become the backbone of computer-to-computer communication, especially as computers are embedded in more devices. In that regard, University of Hanover professor Christian Muller-Schloer said Bluetooth is likely to become more widespread in the future. More numerous and connected computers will also spur adaptive characteristics, such as self-configuring hardware. University of Karlsruhe professor Hartmut Schmeck commented that power supply issues for portable devices would be key, since "a wearable computer that needs a new battery every couple of hours has no prospect for market success." Smaller power demands will help, but Schmeck said alternative supplies were needed, including possibly harnessed kinetic energy. Fuel cell success, however, is uncertain in 10 years' time, according to Schmeck. Software will change along with hardware, becoming dynamic application elements and functions instead of monolithic operating systems and programs. Besides being more technologically advanced, computers at the end of the decade will adopt more human traits similar to those foreshadowed in "2001: A Space Odyssey."
- "Hacker Plot Hijacks PCs for Sex Sites"
New York Times (07/11/03) P. C1; Schwartz, John
Security experts recently indicated that a ring of hackers is hijacking home computers with high-speed Internet access and loading them with software that serves pornographic material and sign-up offers for explicit Web sites. The software downloaded onto the computers does not appear to have any adverse effects on the machines' functions, but the hackers can take control of a hijacked computer and use it to send spam messages to unsuspecting users in order to boost Internet traffic to the pornographic Web pages. The ability to cloak their identities allows the hackers to fool Internet service providers into believing the email messages they are sending are legitimate and not spam. The hackers are receiving funds from the explicit Web sites for signing up users. The Justice Department says the hackers are violating at least two provisions of the Computer Fraud and Abuse Act, and the agency urged computer users to install firewalls to prevent hackers from gaining access to their computers. Independent computer researcher Richard M. Smith, who first detected the hacker plot, has found over a thousand affected computers so far. He says, "We're dealing with somebody here who is very clever." The ring is not completely anonymous yet, since the ads for the pornography are downloaded from a single server owned by a Houston ISP, which denies any involvement. However, future versions of the ring could be made anonymous and used to steal credit card data or for other crimes, says security analyst Joe Stewart. Stewart says, "This system is especially worrisome because they have an end-to-end anonymous system for spamming and running scams."
(Access to this site is free; however, first-time visitors must register.)
- "Machines That Reproduce May Be Reality"
NewsFactor Network (07/10/03); Martin, Mike
A team of Canadian researchers from the University of Waterloo and the National Research Council of Canada is attempting to illustrate the importance of self-replicating machines to nanotechnology by developing a digital primordial soup where virtual DNA is spontaneously manufactured. Within this artificial medium, "codons" swim and self-configure into single, double, and even triple strands, and roboticist Peter Turney of the National Research Council argues in a paper that these computer-generated particles can be organized into data-encoding patterns. "We demonstrate that, if an arbitrary seed pattern is put in a soup of separate individual particles, the pattern will replicate by assembling the individual particles into copies of itself," he explains. The project, which is nicknamed JohnnyVon after European researcher John von Neumann, takes a cue from self-replicating cellular automata, but is more effective because the program operates in a "continuous space" that more closely resembles a real-world environment. Turney believes that nanoscale machines running the JohnnyVon program could help facilitate such long-sought goals as economical mass production, environmental reclamation, or any operation that depends on large numbers of robots. Moshe Sipper of Ben Gurion University adds that the JohnnyVon project could lead to a self-reproducing machine that will not duplicate itself into overwhelming numbers, which has been a source of concern. The JohnnyVon project and its implications were detailed in a recent edition of Artificial
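The seed-and-soup replication Turney describes can be caricatured in a few lines of code. This is a purely illustrative toy, not the JohnnyVon model itself: a "seed" string assembles copies of itself out of a pool of free "particles" (characters) for as long as the pool holds every needed part.

```python
from collections import Counter

def replicate(seed, particles):
    pool = Counter(particles)   # the "soup" of free particles
    need = Counter(seed)        # particles one copy of the seed consumes
    copies = []
    # keep assembling copies while the soup still holds every needed particle
    while all(pool[c] >= n for c, n in need.items()):
        pool -= need            # consume particles from the soup...
        copies.append(seed)     # ...to assemble another copy of the seed
    return copies, pool

copies, leftover = replicate("AB", "AABBBA")
print(copies)  # ['AB', 'AB', 'AB']
```

The real system is far richer: its particles move and bind in continuous space rather than being drawn from a bag, which is precisely the advance over classical cellular automata the article notes.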
- "U.S. Researchers Invent Wall Climbing Robot"
rediff.com (07/10/03); Parasuram, T.V.
A number of initiatives being overseen by the U.S. Defense Advanced Research Projects Agency (DARPA) focus on technologies that will enhance homeland security and military operations. American and Canadian engineers have collaborated to develop a prototype multi-legged robot called Rhex that can swim and cover difficult terrain, and will soon be refined to scale walls; the machine will eventually be outfitted with cameras and biochemical sensors so that it may serve as both a remote reconnaissance device and a biological agent detector. DARPA director Tony Tether testified in late March before the House Armed Services' Subcommittee on Terrorism, Unconventional Threats, and Capabilities, arguing that current computer systems are marked by limited usability and intelligence, and citing cognitive computing as the solution. Cognitive computer systems would exhibit more human-like behavior, higher intelligence, and greater interactivity, and be capable of learning as well as teaching. "The idea is not simply to replace people with machines, but to team people with robots to create a more capable, agile and cost-effective force that lowers U.S. casualties," Tether explained. Anti-terrorism projects underway by the Pentagon's Technical Support Working Group include mass transit surveillance, fingerprint DNA extraction, a device that can irradiate luggage to sterilize biological and chemical weapons, a cooling system that can be embedded within body armor, and a handheld explosives detector. So that soldiers can safely probe enemy facilities located underground or in other hard-to-reach places, American researchers are devising sensor technologies that pick up seismic, chemical, electro-optical, radio frequency, and acoustic signals.
- "Taking the Measure of IT's Pain"
CNet (07/08/03); Frauenheim, Ed
Economic Policy Institute expert Jared Bernstein says Labor Department statistics show U.S. mathematicians and computer scientists are suffering from record unemployment and real-wage decreases. Joblessness among those workers is at its highest point since 1982, at around 5 percent at the end of last year, while payrolls grew just 1.7 percent in the year leading up to the fourth quarter of 2002, less than inflation and the smallest increase since 1976. "The bottom line for me is that in many policy discussions we argued that your skills will insulate you from the ups and downs of the New Economy; this latest downturn has proved that to be a pretty dubious contention," Bernstein explains. Bernstein, a former deputy chief economist with the Labor Department, says the Bush tax cuts are a long-term approach, but that demand for IT will return soon and the government should extend unemployment benefits for IT workers until then. In regard to foreign labor, Bernstein suggests limiting the H-1B visa program, given the availability of domestic workers, but says the program should not be eliminated. Similarly, he says offshore outsourcing will slow, but not stop, growth in the U.S. IT sector. Bernstein says recent Meta Group studies showing an average 5 percent gain in IT worker wages are a possible indication that the current situation is improving. He says the current downturn was spurred by overinvestment encouraged by productivity growth during the last IT upturn. "The bubble burst, and demand sharply dropped for IT goods and highly skilled employees," Bernstein concludes.
- "Military Campaigns for New Net"
Investor's Business Daily (07/10/03); Howell, Donna
The U.S. Defense Department plans to dramatically improve Internet-coordinated battlefield tactics by becoming an early adopter of Internet Protocol version 6 (IPv6), a next-generation IP designed to relieve the shortage of addresses for Net-connected devices. The current Internet protocol, IPv4, is running out of address space, and switching to the new standard is expected to allow numerous military devices and vehicles--Comanche helicopters, for instance--to support always-on Net connections. "We want to [Internet-enable] all sorts of things--real property, from unattended aerial vehicles to satellite capabilities to individual soldiers on the battlefield," explains John Osterholz, director of architecture and interoperability at the Defense Department. IPv6 also promises to boost security and facilitate online multimedia transmission, while Osterholz notes that one of the standard's initial advantages will be augmented war games simulation. "IP version 6 is clearly something that would increase the flexibility of executing...war plans," he says. Key Defense Department IPv6 planner Marine Capt. Roswell Dixon says the military expects to have effected a transition to the new standard by fiscal year 2008, and technologists such as IPv6 Global Summit organizer Alex Lightman expect the military's adoption of the protocol to fuel industry acceptance as well. MCI notes that many equipment vendors have begun to make their products IPv6-compliant in anticipation of an October deadline set by the Defense Department. The department plans to recognize broad pilot IPv6 projects and evaluate vendor readiness over the next several months.
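The address shortage driving the transition is easy to quantify: IPv4 addresses are 32 bits, IPv6 addresses 128 bits. A quick sketch using Python's standard ipaddress module (the figures are simple arithmetic, not from the article):

```python
import ipaddress

# 32-bit vs. 128-bit address spaces
ipv4_space = 2 ** 32
ipv6_space = 2 ** 128
print(f"IPv4 addresses: {ipv4_space:,}")    # about 4.3 billion
print(f"IPv6 addresses: {ipv6_space:.2e}")  # about 3.4e+38

# The same module parses both address families
a4 = ipaddress.ip_address("192.0.2.1")
a6 = ipaddress.ip_address("2001:db8::1")
print(a4.version, a6.version)  # 4 6
```

The factor of 2^96 between the two spaces is what makes "an address for every soldier, vehicle, and sensor" plausible under IPv6 where it is not under IPv4.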
- "The New Card Shark"
New York Times (07/10/03) P. E1; Wayner, Peter
Online gambling and poker software is changing the face of the real-world game, allowing players who rarely set foot in casinos to learn effective card strategy. The 2003 World Series of Poker coup by Chris Moneymaker, an accountant who had never before played in a tournament, is a case in point. Moneymaker says he learned his craft playing in online gambling houses, away from the casino surroundings and with no feel for the traditional human readings, such as unconscious tics or gestures. Instead, Moneymaker attributes his win to playing so many hands and focusing on the actual game, such as the frequency and size of bets. Computer-trained players often use software programs which mimic human archetypes and provide a simulated environment in which to test strategies. University of Alberta Ph.D. student Darse Billings is refining a poker program online with the aim of defeating human players, not training them. Billings and his professors are drafting a game model for poker that allows computer programs to manage the game according to strict mathematical formulas. Some human experts who challenge the program online, which is open to all comers for free, say it is hard to beat, but will almost always have limited winnings because of its conservative style. Billings says he is trying to add capabilities that let the program decode human strategy and capitalize on weaknesses to boost wins. Analysts say understanding the game better lessens the satisfaction of playing, in the same way tic-tac-toe becomes routine when understood. However, the leveling of playing ability also injects an added element of chance, as demonstrated at this year's World Series of Poker, where last year's champion was eliminated on the first day.
(Access to this site is free; however, first-time visitors must register.)
- "A Linux Mystery: Configuring for Virtual Processors"
NewsFactor Network (07/09/03); Ryan, Vincent
Hyperthreading, or simultaneous multithreading, is a process designed to speed up application performance by coaxing the operating system to regard a single physical processor as two or more logical processors, thus allowing one processor to accommodate multiple threads or instruction segments. Linux kernels 2.4 and 2.5 boast full hyperthreading support, but the feature may not necessarily be appropriate for all applications, according to certain experts. Though a January test by IBM's Linux Technology Center demonstrated that the performance of multithreaded applications improved by 30 percent with Linux kernel 2.4.19 hyperthreading technology and up to 51 percent with Linux kernel 2.5.32, center director Dan Frye cautions that performance benefits differ for every application, and suggests that Linux 2.6 will offer performance gains across a wider spectrum of applications. Because the Linux kernel cannot tell the difference between a physical processor and a logical processor, certain applications may end up vying for resources and boost performance by a mere 15 percent, according to Red Hat's Brian Stevens. Red Hat engineers are trying to make Enterprise Linux 3 capable of distinguishing between physical and logical CPUs so the operating system can automatically assign applications to processors in the most sensible way. Stevens believes that single-threaded applications operating on Linux alone can experience performance improvements via hyperthreading, but real-world testing disputes this claim. Linuxhardware.org site manager Kris Kersey says that some single-threaded applications improve their performance slightly when hyperthreading is deactivated, but users would not notice. IT administrators will generally have to decide for themselves whether hyperthreading has a positive effect on their setups, at least until Red Hat releases an improved Linux kernel.
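The distinction the Red Hat engineers are after can be sketched with the Linux convention of reporting a "physical id" and "core id" for each logical processor in /proc/cpuinfo: logical processors sharing the same (physical id, core id) pair are hyperthreaded siblings on one core. The sample text below is a stand-in for a real dump (one core exposing two logical processors); the parsing is a minimal sketch, not how the kernel itself does it.

```python
# Sample /proc/cpuinfo fragment: one physical core, two logical processors.
SAMPLE_CPUINFO = """\
processor   : 0
physical id : 0
core id     : 0

processor   : 1
physical id : 0
core id     : 0
"""

def count_cpus(cpuinfo_text):
    """Return (logical processor count, distinct physical core count)."""
    logical = 0
    cores = set()
    phys = None
    for line in cpuinfo_text.splitlines():
        if ":" not in line:
            continue  # blank separator between processor stanzas
        key, _, value = line.partition(":")
        key, value = key.strip(), value.strip()
        if key == "processor":
            logical += 1
        elif key == "physical id":
            phys = value
        elif key == "core id":
            cores.add((phys, value))  # siblings collapse into one entry
    return logical, len(cores)

logical, physical = count_cpus(SAMPLE_CPUINFO)
print(logical, physical)  # 2 1 -> two logical processors share one core
```

When the two counts differ, the "extra" processors are hyperthreaded siblings contending for one core's execution resources, which is exactly why scheduling two heavy threads onto siblings can underperform spreading them across real cores.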
- "OASIS Takes the Wraps Off SPML"
InternetNews.com (07/09/03); Singer, Michael
Members of the Organization for the Advancement of Structured Information Standards (OASIS) used the Burton Group's recent Catalyst conference to unveil Service Provisioning Markup Language (SPML), an open, XML-based specification that allows otherwise disconnected companies to provision outside access to critical information and corporate systems while simultaneously keeping federated ID management secure. The spec, whose supporters include Sun Microsystems, Novell, BMC Software, BEA, PeopleSoft, and Waveset, is designed to be interoperable with open standards such as OASIS' Security Assertion Markup Language (SAML) and Web Services-Security, as well as the World Wide Web Consortium's Simple Object Access Protocol (SOAP). SPML Chairman and OASIS member Darran Rolls said that SPML could enable Human Resources departments to work with outside vendors and offer similar advantages to ISPs, airlines, and financial companies. Rolls carried out a demonstration at the conference in which a fictitious PeopleSoft employee was enrolled on a SOAP-based platform and summarized inside an SPML document; a multiplexer from Mycroft then built a subdocument and transmitted it to everyone in the conference room. "The security danger is gone," noted Rolls. "The message turned up, but the system operates so that you can have an administrator do a visual check before accepting the Web service or document." SPML was posted on June 1, and OASIS' at-large voting membership could pass the standard before September. A formal, public SPML test is expected to be conducted by OASIS members on July 9.
- "Researchers Keep an Eye on the Future of Security"
Computer Weekly (07/08/03); Cushing, Karl
Biometric security systems such as fingerprint and iris scanners are currently a niche technology, but a team of Kent University researchers expects to widen the acceptance of such interfaces through the development of stronger and more dependable tools. Iris scanners can be obtrusive and difficult for users to interact with, while the accuracy of fingerprint recognition systems can be reduced if the users' fingers are dirty or worn. Impostors can thwart such devices with a variety of techniques, including wearing sophisticated contact lenses to fool iris scanners and displaying photographs to biometric readers. "One of the key problems is that there is no single biometric device that is reliable and accurate enough for all applications, and not everyone recognizes that," explains Mike Fairhurst of Kent University's electronics department. His team has devised an "intelligent processing framework" featuring software that provides centralized biometrics management in order to select the biometrics mode best suited to the job at hand. The system can also meld multiple forms together to boost reliability and accuracy. The Intelligent Agent for Multimodal Biometric Identification and Control (Iambic) interface, which the Kent team is developing in collaboration with Neusciences, uses a sensor- and microphone-outfitted laptop as a client and a networked desktop PC as a server; Iambic requires users to register up to three biometrics--voice, fingerprint, and facial recognition--that must be confirmed for each ensuing visit. During the system's trial period, researchers discovered that users had problems enrolling with the voice recognition mode, while handicapped or injured users may also have difficulty. "A lot of work needs to be done to improve the interface and make it more intuitive," comments Fairhurst.
- "Software Finds Tunes You Want to Hear"
Rochester Democrat & Chronicle (07/07/03); Daneman, Matthew
University of Rochester researcher Mitsunori Ogihara is developing software that analyzes signals and patterns in songs and sorts them by genre and emotional content. Such a program could be used to sift through stored digital files or radio stations to find music that is in keeping with users' preferences. Ogihara's project incorporates wavelets, a mathematical operation that divides data into frequencies and then analyzes and contrasts those frequency functions. Wavelets have been applied to a wide spectrum of fields, including seismic geology and electrical engineering. Ogihara is now concentrating on refining the software and integrating the program with iTunes and other music software. Avid fans of music who store thousands of digital files on their PCs and laptops could find the software very helpful. "I don't know if you've ever gone through five [gigabytes] of music and put in ID tags, but it takes an awful long time," notes Albany, N.Y., resident Michael Szebeni. Ogihara filed a patent application for his software in May, but the U.S. Patent Office could take over a year to notify the UR researcher of the patent's approval, assuming the patent is approved at all.
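The frequency-splitting idea behind wavelets can be illustrated with a one-level Haar transform, the simplest wavelet: it separates a signal into coarse averages (the low-frequency content) and detail coefficients (the high-frequency content). This is a generic sketch of the mathematics, not Ogihara's software.

```python
def haar_step(signal):
    """One level of the Haar wavelet transform on an even-length signal.

    Returns (averages, details): pairwise means capture the low
    frequencies, pairwise half-differences capture the high frequencies.
    """
    averages = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    details = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return averages, details

avg, det = haar_step([4, 2, 6, 8])
print(avg, det)  # [3.0, 7.0] [1.0, -1.0]
```

Applying the step recursively to the averages yields a multi-resolution view of the signal; features computed from the coefficients at each level are the kind of raw material a genre or mood classifier can work from.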
- "Computer That Can Tell the Write Sex Just by Reading"
Scotsman (UK) (07/09/03); Doherty, James
A computer algorithm developed by Moshe Koppel at Israel's Bar Ilan University can reportedly determine the gender of an author with 80 percent accuracy. The program was fed 604 documents from the British National Corpus that were equally split between male and female authors. Following the program's sampling of the 604 texts, the remaining 3,520 documents in the British National Corpus were added so that the algorithm could ascertain writing elements unique to either men or women, and the program came up with 50 features that could be used to predict the writer's sex. Through the program, the researchers discovered that women are more likely than men to employ personal pronouns such as "I," "you," and "she," whereas men regularly use "a," "the," "these," and numbers and quantifiers. However, Koppel notes that a paper detailing the research team's findings was rejected by the National Academy of Sciences, which thought the experiment's conclusions were sexist. The team then tried to quell criticism by using the program to analyze scientific documents. Old Dominion University linguist Janet Bing also finds fault with gender-detection experiments, which she says cannot account for lesbian, gay, bisexual, and transgendered individuals. She says, "This whole rush to categorization usually works against women."
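The function-word counting at the heart of such classifiers is simple to sketch. The word lists and scoring below are illustrative only, drawn from the features the article reports, not Koppel's actual 50-feature model.

```python
# Toy feature extractor: count personal pronouns (reportedly favored by
# female authors) vs. determiners (reportedly favored by male authors).
PRONOUNS = {"i", "you", "she", "her", "your", "me"}
DETERMINERS = {"a", "the", "these", "those"}

def feature_counts(text):
    """Return (pronoun count, determiner count) for a text."""
    words = text.lower().split()
    pron = sum(w in PRONOUNS for w in words)
    det = sum(w in DETERMINERS for w in words)
    return pron, det

pron, det = feature_counts("I told you she took the book to these people")
print(pron, det)  # 3 2
```

A real classifier would combine dozens of such counts with learned weights over many documents; the point of the sketch is only that the signal comes from humble function words rather than topic or vocabulary.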
- "Sensors of the World, Unite!"
Technology Review (06/27/03); Huang, Gregory T.
Ember CTO Robert Poor expects self-organizing wireless sensor networks to provide users with numerous advantages, and his company, a spinoff of MIT, is one of the first to commercialize the technology. Poor remarks that "sensor networks" should not imply one-way communications, and notes that Ember's products support two-way interaction, which is an important consideration when adjusting sensors, monitoring the networking environment, and directing the operation of devices. He cites the experiences of Ember client Tyco Thermal Control, a manufacturer of heating tape used to cover pipes; temperature sensors need to be installed in the tape so that a controller can regulate the heat of the pipes, but deploying and wiring these sensors is expensive, while the reliance on field installers to position the units raises the margin for error. Tyco has found that installed costs can be lowered considerably using wireless sensor networks that maximize redundancy and ensure more reliable temperature readings. Poor envisions wireless sensor networks becoming as pervasive as bar codes in people's daily lives: The technology could be incorporated into lights, switches, air conditioning, and thermostats, as well as devices that are not currently part of existing infrastructure, such as microcontroller chips. However, Poor thinks the assumption that networked devices will become intelligent is deceptive. He says that predicting future hurdles for the technology to overcome is a difficult proposition. "The requirements of a system are highly dependent on the actual application, and it will be a delectably messy environment for a while," Poor concedes. "Coming up with scalable and adaptable architectures that work across multiple applications is going to be critical."
(Access to this site is free; however, first-time visitors must register.)
- "Pipelining the Web"
InfoWorld (07/07/03) Vol. 25, No. 26, P. 56; Windley, Phillip J.
Businesses are laying down Web-service active intermediaries or "pipelines" to link their computer systems to internal business units, business partners, and clients; the flexibility of a pipeline architecture enables an enterprise to more easily adapt to changing business objectives. As Web services proliferate among businesses, active intermediaries will become the next logical progression toward compact application integration. Active intermediaries can dependably store and transfer messages between Web services while filtering, transforming, logging, and scanning the message flow en route. Thanks to pipelines, there is no need to modify the Web services in order to add reliability, scalability, security, availability, and interoperability. Web services networks linked together by active intermediaries offer much more flexibility and intelligence than linear, preassembled architectures, but the tradeoff is that only expert managers can configure these tools. A number of factors must be explored before selecting an active-intermediary product: Whether the product is sold as a service (and can act as a broker to reconcile security policies or connection techniques) or as software, whether the product's main function is building intra-enterprise connections (particularly useful for legacy system integration) or external links, and whether the offering resides atop a message-oriented bus and can support an active-memory store-and-forward tool, or lacks this transport layer and is dependent on the transport options of the connecting Web services. Active-intermediary services on the horizon include more flexible and robust routing, simpler message translation, rules engines that control workflow, and job scheduling.
- "IT Professionals Cash in on Company Training"
InformationWeek (06/30/03) No. 946, P. 67; D'Antoni, Helen
The latest InformationWeek Research IT Salary Survey suggests that a large number of American IT employees plan to access training as they approach 2004, despite having to work extended schedules and take on extra assignments. The survey examines the responses of 7,748 IT workers and 6,820 IT managers. Almost 50 percent of respondents said they plan to expand their education or access company-supplied training. A third of people polled reported that they would seek company repayment for training or educational expenses, and a fifth said they plan to earn certification for which the company will pay. Very few of the IT employees are financially rewarded for seeking training or education--only 4 percent of IT staff and 2 percent of IT managers get cash bonuses. Most IT workers take courses in network and systems infrastructure, project management, network administration and engineering, application development, and server administration. IT workers also reported that business skills are being offered to them. For example, among the 2,476 IT staff who said they would receive corporate training, nearly 50 percent said their firms offer project management courses. Meanwhile, research firm IDC forecasts that the IT education and training market will grow worldwide at a compound annual rate of about 5 percent from 2002 to 2007, whereas the market in the United States is expected to grow at a rate of 10.5 percent in the same timeframe.
- "A Conversation With Jim Gray"
Queue (06/03) Vol. 1, No. 4, P. 8; Patterson, Dave
In an interview with the University of California, Berkeley's David Patterson, Microsoft Bay Area Research Center director and ACM Turing Award recipient Jim Gray discusses the future of disk storage and its implications. Gray notes that disk capacity is expanding by 100 percent per year, while disk access times are improving at an annual rate of around 10 percent. A 20 terabyte disk is on the horizon, and Gray believes overcoming capacity versus access difficulties will require programmers to redefine disks as sequential devices rather than random access devices. One solution is to use idle disk space as tape or as archive, and Gray suggests that feeding databases into disks and mailing the disks--or better yet, whole computers--to others is cheaper and easier for senders and receivers than dealing with tape, and it also avoids certain incompatibility issues. Gray predicts that disks will supplant tape and boast limitless access within the next 10 years, and this development will radically transform file system architecture. He forecasts a migration of the processors to the transducers, which will allow intelligence to be embedded in all displays, NICs, and disks. Gray says it is just a matter of time before processors, a network interface, and an operating system are embedded into disk drives, although he acknowledges that today only the bare minimum is put into them. Gray also says, "The two things that are going to be real shifts in storage are tertiary storage going online so there is no distinction; and intelligent storage, so that we raise the level above SCSI." Gray characterizes the MySQL and Postgres teams as being on the leading edge of open-source database development, and believes open-source database vendors could dislodge the incumbents.
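Gray's capacity-versus-access point can be put in rough numbers: if capacity doubles yearly while access throughput improves only about 10 percent, the time to read an entire disk grows by roughly 1024/1.1^10, or nearly 400-fold, over a decade. The starting figures below are illustrative, not from the interview.

```python
capacity_gb = 200.0      # illustrative starting capacity
random_mb_per_s = 1.0    # illustrative random-access throughput

scan_hours = []
for year in range(11):
    # hours to read the entire disk at random-access speed
    scan_hours.append(capacity_gb * 1024 / random_mb_per_s / 3600)
    capacity_gb *= 2             # capacity doubles each year
    random_mb_per_s *= 1.1       # access improves ~10 percent each year

print(f"year 0: {scan_hours[0]:.0f} h, year 10: {scan_hours[10]:.0f} h")
```

Whatever the starting figures, the widening gap is why Gray argues disks must increasingly be treated as sequential devices: sequential bandwidth tracks capacity far better than random access does.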
- "Model Maker"
CIO Insight (06/03) Vol. 1, No. 27, P. 32; Rothfeder, Jeffrey
Icosystem Chairman Eric Bonabeau is a strong advocate of agent-based modeling, a simulation technique that is used to pattern the behavior of complex systems such as computer networks by studying the function of their most fundamental elements. "With agent-based modeling, you describe a system from the bottom up, from the point of view of its constituent units, as opposed to a top-down description, where you look at properties at the aggregate level without worrying about the system's constituent elements," Bonabeau explains. He notes that the method is particularly useful in predicting counterintuitive phenomena. Bonabeau says that a supposedly accurate model is a rudimentary description of a real-world system that must refer to the problem it is designed to answer. "You're not going to build an agent-based model--or any kind of model, for that matter--without taking into account how you're going to calibrate it and validate it against what you're trying to model," he explains, adding that human judgment is required to determine how accurately the model reflects the system's reaction to unexpected variables. Bonabeau notes that solutions organized through agent-based modeling should be robust rather than optimal, because optimal solutions only work within narrowly defined parameters, and are thus ineffective in unforeseen situations. Another option is to use agent-based modeling to build evolvable systems that can adapt to future generations of products. Applications for agent-based modeling Icosystem is focusing on include new-product launch simulations, portfolio management for the pharmaceutical sector, and consumer behavior, which Bonabeau expects to emerge as a hot area for agent-based modeling in several years.