Volume 5, Issue 515: Wednesday, July 2, 2003
- "Multiple Attack Only Hope in Spam Battle"
New Scientist (07/01/03); Knight, Will
The problem of rapidly growing spam can only be defeated through a multi-pronged strategy, one that combines new technology, new legislation, and user awareness, according to speakers at Britain's first-ever "spam summit" on July 1. U.K. e-commerce minister Stephen Timms told attendees that legislation alone is not a panacea for spam, but technical experts noted that new laws are still an important ingredient of an effective anti-spam approach. Steve Linford of spam-tracking organization Spamhaus warned that proposed American legislation--specifically the Reduce Spam Act and the CAN-SPAM Act--would backfire and generate a huge surge in spamming. The bills would require users to opt out of receiving unsolicited commercial email; the European Union, by contrast, passed a law requiring bulk emailers to get recipients' permission before sending them spam, a policy known as opt-in. Linford argued that passage of an opt-out law would legitimize spam and give some 23 million small businesses the legal right to send it. Linford also stressed the importance of international cooperation, because many spammers are based outside the countries where recipients live, and more will probably relocate if threatened with prosecution. Technological measures such as spam filtering can slow spam's progress, but cannot halt it: No filter can identify spam with 100 percent accuracy, while spammers are fiercely dedicated to subverting every new anti-spam technology. Piper Marbury Rudnick & Wolfe's Jim Halpert warned that spammers are becoming more crafty, using computer-hijacking Trojans and viruses to distribute their wares.
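The filtering limitation the speakers describe is easy to see in miniature: statistical filters score messages from word probabilities, so legitimate mail that happens to use "spammy" words can be misclassified. A minimal naive-Bayes-style sketch (the word probabilities are invented for illustration and are not drawn from any real filter):

```python
import math

# Hypothetical per-word probabilities of appearing in spam vs. legitimate mail.
# A real filter would estimate these from a training corpus.
P_SPAM = {"free": 0.8, "offer": 0.7, "meeting": 0.1, "report": 0.15}
P_HAM = {"free": 0.2, "offer": 0.3, "meeting": 0.9, "report": 0.85}

def spam_score(words, prior_spam=0.5):
    """Return P(spam | words) under a naive word-independence assumption."""
    log_spam = math.log(prior_spam)
    log_ham = math.log(1 - prior_spam)
    for w in words:
        if w in P_SPAM:  # words the filter has never seen are ignored
            log_spam += math.log(P_SPAM[w])
            log_ham += math.log(P_HAM[w])
    odds = math.exp(log_spam - log_ham)
    return odds / (1 + odds)

print(spam_score(["free", "offer"]))      # high score: flagged as spam
print(spam_score(["meeting", "report"]))  # low score: passed through
```

With these invented numbers a "free offer" message scores well above 0.9, but a legitimate newsletter using the same words would too, which is exactly the accuracy ceiling the summit speakers point to.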
- "Pentagon Project Could Keep Close Eye on Cities"
Philadelphia Inquirer (07/02/03) P. A2; Sniffen, Michael J.
The Pentagon's Combat Zones That See (CTS) project aims to safeguard U.S. troops and enhance battle tactics through a combination of computers and surveillance cameras set up to monitor and study each vehicle moving throughout foreign urban areas. Scientists, law enforcement officers, and privacy advocates note that the technology could also be used to watch the movements of American citizens. "Government would have a reasonably good idea of where everyone is most of the time," warns GlobalSecurity.org defense analyst John Pike. At the heart of the system is a software program that can automatically identify vehicles based on their size, shape, color, and license tag, and can also recognize drivers and passengers by face; in addition, the software can transmit alerts if a monitored vehicle's tag is on a watch list, and sift through copious records to locate and compare vehicles seen near the sites of terrorist incidents. Jan Walker of the Defense Advanced Research Projects Agency (DARPA) insists that CTS technology was not designed with local law enforcement or homeland security in mind, and employing it for any other purpose would require sweeping alterations. Nevertheless, Steven Aftergood of the Federation of American Scientists remarks, "One can easily foresee pressure to adopt a similar approach to crime-ridden areas of American cities or to the Super Bowl or any site where crowds gather." New York deputy police commissioner James Fyfe adds that police will be eager to adopt such technologies. DARPA is planning to spend up to $12 million to deploy CTS technology over four years: Phase one will involve the installation of about 30 computer-connected cameras in a fixed site to enhance troop protection; phase two will set up at least 100 cameras to support military operations, and incorporate software that analyzes video footage to distinguish between routine and abnormal activity.
- "New Law Requires Customer Notification of Data Theft"
SiliconValley.com (07/01/03); Bazeley, Michael
A new California law is now in effect requiring any company, nonprofit organization, or government agency that even suspects its network has been hacked to immediately inform California-based customers that their personal data may have been compromised. Adding weight to the law, which could become a model for national policy, are several high-profile network intrusions and Justice Department estimates that identity thieves claim up to 700,000 victims a year. Security experts report that many firms are struggling to quickly fortify their security measures in order to avoid lawsuits or damage to their reputations, while many other companies are ignorant of the California law's existence. The law extends to any companies that do business in California, even if they are based elsewhere in the country, and only firms that encrypt their customers' information are exempt. Complying with the law will require businesses to not only improve their hacker security, but beef up their traffic-monitoring systems as well. "This might actually be good by forcing companies to take the action they should be taking," declares Hewlett-Packard chief security strategist Ira Winkler. Consumer and privacy rights proponents laud the legislation, arguing that it will enable consumers to respond faster to threats of identity theft, while those who do not know what to do in such situations will be able to learn via public awareness programs. However, some companies are concerned that publicizing electronic break-ins could tarnish their image and shake customer confidence. Eugene Spafford of Purdue University's Center for Education and Research in Information Assurance and Security says that some companies may collect less personal customer data in an effort to lower their liability in case their networks are breached.
Eugene Spafford is co-chair of ACM's U.S. Public Policy Committee; http://www.acm.org/usacm
- "Giving Sharers Ears Without Faces"
Wired News (07/01/03); Jardin, Xeni
Tools that claim to keep file-sharers anonymous are drawing more attention in the wake of recent lawsuits filed by the Recording Industry Association of America (RIAA) that target individual peer-to-peer network subscribers for allegedly pirating copyrighted material. "The RIAA lawsuits...will probably drive more technologically adept consumers to systems that profess to offer more security against legal assault," predicts Sharman Networks lobbyist Philip Corwin, who expects file traders will move to anonymous networks and "sneaker networks." Another option is for P2P developers to launch decentralized systems that provide users with accounts that keep them incognito, an example being the Manolito Peer-to-Peer (MP2P) network. One MP2P client, Blubster, handles content lookup and transfer negotiation via UDP, a "connectionless" Internet data transfer protocol. "It may be possible to gather IP addresses from the network, but not data about what content specific users are sharing," says Madrid-based Blubster developer Pablo Soto. Meanwhile, P2P developers are trying to amalgamate their lobbying and public relations initiatives by organizing coalitions in Europe and the United States. Freenet founder Ian Clarke and others harbor strong doubts about the industry's technical prowess in keeping tabs on unlawfully shared content. The catalyst for much of the renewed interest in anonymity-supporting P2P applications was a June 25 announcement in which the RIAA expressed its intent to sue hundreds of file swappers, as well as an earlier court ruling that forced Verizon to disclose the identities of customers accused of digital piracy.
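The "connectionless" property of UDP that Blubster relies on means each lookup is a standalone datagram with no handshake or session to log. A minimal sketch of that pattern (the FIND/HAVE message format here is invented for illustration and is not Blubster's actual protocol):

```python
import socket
import threading

def lookup_server(sock):
    """Answer a single content-lookup datagram; no connection state is kept."""
    data, addr = sock.recvfrom(1024)            # one self-contained datagram
    if data.startswith(b"FIND "):
        sock.sendto(b"HAVE " + data[5:], addr)  # reply straight to the sender

server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))                   # OS picks a free port
port = server.getsockname()[1]
threading.Thread(target=lookup_server, args=(server,), daemon=True).start()

client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.settimeout(2)
client.sendto(b"FIND song.mp3", ("127.0.0.1", port))  # no connect() needed
reply, _ = client.recvfrom(1024)
print(reply)  # b'HAVE song.mp3'
```

Because no connection is ever established, an observer sees only isolated datagrams, which is why Soto argues per-user sharing histories are hard to assemble from the network.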
- "Heat Becomes Computing's Hottest Topic"
Financial Times (07/02/03) P. 9; London, Simon
As semiconductors become more powerful, they also generate more heat and devour more electricity. Increasing heat output can wear chips down faster, while installing cooling systems to deal with the heat can make computing more expensive and complicated. Chipmakers have tried to defuse the problem by reducing processor size, but this method is no longer effective. There have been significant upgrades in speed and energy efficiency as a result of miniaturization, but the tradeoff is increased electron leakage due to reduced gate length; Intel analyst Rob Willoner reckons that within several years as much as 60 percent of the power consumed by semiconductors will go to waste unless action is taken. Several solutions that aim to tackle the heat and electron leakage problems are under development: Intel has created a prototype "tri-gate" or 3D transistor and is now attempting to demonstrate the feasibility of mass production, while other designers are considering lowering transistor voltage, though reducing it too much could inhibit the operation of circuits. Other solutions being looked into involve assigning different voltages to different areas of the processor depending on how often they are used, or using new materials technology to modify each transistor's voltage threshold. Also under development is software that puts inactive chip parts into a "sleep mode" or slows down the rate of instruction processing. However, all of these solutions are likely to entail rises in cost and complexity, as well as lowered chip performance.
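The appeal of voltage reduction comes from the physics of switching power, which grows with the square of supply voltage; leakage, the article's other concern, is a separate term not modeled here. A sketch with illustrative numbers:

```python
# Dynamic switching power of CMOS logic: P = a * C * V^2 * f, where
# a = activity factor, C = switched capacitance, V = supply voltage,
# f = clock frequency. All values below are illustrative only.

def dynamic_power(activity, capacitance, voltage, frequency):
    return activity * capacitance * voltage ** 2 * frequency

base = dynamic_power(0.1, 1e-9, 1.3, 3e9)     # hypothetical 1.3 V chip at 3 GHz
lowered = dynamic_power(0.1, 1e-9, 1.1, 3e9)  # same chip run at 1.1 V

print(lowered / base)  # ~0.72: a ~15% voltage cut saves ~28% switching power
```

The quadratic payoff is why designers accept the risk of running close to the minimum voltage at which circuits still operate reliably.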
- "High-Tech Leads Improvement in U.S. Job Market"
CyberAtlas (07/01/03); Gaudin, Sharon
High-tech job attrition declined 13 percent between May and June, according to Challenger, Gray & Christmas. High-tech job losses were 37 percent lower in June 2003 compared to June 2002, while layoffs fell by 53 percent between April 2003 and May 2003. The computer and telecommunications industries are estimated to have collectively laid off 54,278 professionals between January and May this year, a 67 percent reduction compared to job losses recorded in the same interval last year. Some 630,532 job cuts have been announced since the beginning of 2003, which is 14 percent fewer than those announced by mid-2002. However, Challenger, Gray & Christmas CEO John Challenger cautions that this decline "does not necessarily mean an immediate rebound for the millions of people who remain jobless." He suggests that a hiring slowdown may be accompanying the job loss slowdown as employers strive to retain the talent they already possess rather than hire new staff. Furthermore, 90 percent of June 2003 college graduates polled by MonsterTRAK expressed a willingness to work part-time or as interns over the summer, indicating strong doubts that full-time jobs will be widely available. Still, MonsterTRAK founder Jeff Taylor is confident that "As the market shifts, these recent graduates will prove themselves to be valuable assets to the business world and American economy."
- "Computer Vision Links How Brain Recognizes Faces, Moods"
Ohio State University researcher Aleix Martinez has developed a computer model that could form the basis for systems that identify authorized users facially via video camera. The computer model is in turn based on Martinez's model of how the human brain recognizes faces and emotional states: He determined that the brain combines its knowledge of motion and shape to identify faces and moods. Martinez took photos of a large group of volunteers expressing four different emotional states; each person was photographed multiple times under variable lighting conditions and with accessories (sunglasses and the like). Two groups of volunteers then viewed the photos--the first group was tasked with discerning whether the photos they were looking at depicted the same person with different emotional expressions, while the second group was tested to see if they could correctly identify moods. The results of these tests--how fast the volunteers were able to correctly ascertain faces and emotions--were compared to the computer model, and Martinez discovered that there were close correlations. Both the human subjects and the computer model recognized faces and facial expressions involving little muscle movement more quickly, while faces and expressions involving a lot of movement took longer to identify. Martinez and his associates plan to harness the test results to design a machine with a computer vision system that can recognize people without referring to a large image database. "Ideally, we want a computer that can recognize someone, even though there is only one picture of that person on file, and it was taken at a different angle, in different lighting, or they were wearing sunglasses," Martinez explains.
- "New Tech Feeds Spectrum Debate"
United Press International (06/30/03); Bourge, Christian
The Defense Advanced Research Projects Agency (DARPA) is developing a cognitive radio technology under its NeXt Generation (XG) Communications Program that could help resolve the debate concerning the best way to allocate wireless spectrum, which is fueled by the proliferation of wireless devices and the continued development of wireless technology. Critics charge that the FCC has limited dedicated spectrum licensing so that mainly the federal government and a few companies benefit while locking out emerging technologies such as Wi-Fi. The XG standard is supposed to function within a single band of spectrum as well as across a wide frequency range; a cognitive radio device that uses the protocol could theoretically adapt to gaps in frequency use across an extended spectrum without disrupting already-existing transmissions. "We have the technology that will achieve a lot of the objectives for enabling people [users and devices] to work cooperatively in the spectrum," boasted DARPA XG program manager Preston Marshall at a recent New America Foundation forum. Consumer Federation of America research director Mark Cooper says the XG program proves the feasibility of spectrum-sharing. The U.S. military is also interested, as the technology would theoretically enable U.S. forces to communicate by radio within any country without having to wade through each nation's individual spectrum regulations. Marshall noted at the forum that a trio of military contractors are developing XG-based radio devices, and expected prototype models to be ready within three years. XG is thought to be especially conducive to wireless broadband applications, and DARPA says commercial users should be encouraged to embrace the XG protocol by developing it as an unlicensed, open standard.
- "Debate Rages Over Need for IT Union"
Search400.com (06/27/03); Evans-Correia, Kate
American IT workers are divided over whether they should unionize, with some saying the industry has irrevocably shifted to cheaper foreign labor and others citing the social stigma attached to unionization. Search400.com found in an exclusive survey that 44 percent of IT workers did not think unionization would protect their jobs from either offshore outsourcing or foreign workers on H-1B or L-1 visas, though 39 percent thought the opposite. IT unionization advocate Dante Vignaroli said consolidating American IT workers' political voice would have been more effective two years ago, but that IT workers are unduly biased against such a move. Cimetrics Technologies senior network engineer Kenneth K. Atri says that global competition demands U.S. companies take advantage of lower labor costs overseas or else lose their market dominance. Hain Celestial Group ERP applications support analyst Raymond Bassett says unionizing now would have little effect and suggests that U.S. IT workers should instead adjust their mindset and accept salaries lower than the premium they are used to. Other survey participants said national IT unionization was unlikely, given the difficulty smaller efforts have had, including the Washington Alliance of Technology Workers affiliated with the Communications Workers of America, and the Programmers Guild. Former Programmers Guild Chairman John Miano said U.S. IT workers had little choice but to unionize, and expressed shock at the continuing ambivalence toward unionization. Part of the problem is that programmers tend to lump themselves with management, unlike blue-collar workers whose interests are traditionally at odds with company leaders, according to Excel Data Systems President Carl Sastram.
- "Engineer's Focus: Accessible Technology for All"
SiliconValley.com (07/02/03); Ha, K. Oanh
IBM software engineer T.V. Raman, who lost his eyesight to glaucoma in his adolescence, specializes in developing speech technology that is accessible to everyone, not just the disabled. His objective is to create standards for next-generation Web applications--specifically, generating Web content that can be visual, textual, or spoken, depending on the user's preference. Projects Raman is working on include XForms, a technology designed to ease Web data collection; such a technology could, for instance, allow Web forms that currently must be filled out by typing to be completed by voice or by a message transmitted from a personal digital assistant. Raman's interest in speech technology was nurtured at Cornell University, where he devised software that could read aloud complex mathematical equations displayed on screen after the program he originally relied on proved unreliable. The engineer's Emacspeak program, which turns a computer desktop into an audio interface, has been freely distributed online, and will be included in a forthcoming suite of server software from IBM. Raman and many others are especially excited about Extensible Markup Language (XML), which allows Web content to be displayed as text, audio, or graphics, and has the potential to facilitate greater accessibility for all. One of Raman's research collaborators is the World Wide Web Consortium (W3C), which has taken a vanguard position in the development of Web standards that support the handicapped. Benetech owner Dan Fruchterman explains, "The goal is universal design that's integrated and equal: Don't make disabled people use a different Web structure but make it so they can use it too."
- "USC Researchers Build Machine Translation System--and More--for Hindi in Less Than a Month"
USC Information Sciences Institute (07/01/03)
The University of Southern California's Information Sciences Institute (ISI), along with 11 other participating institutions, developed a system that can translate Hindi text into English in 29 days under the aegis of the Defense Advanced Research Projects Agency's (DARPA) "Surprise Language" project. The machine translation system can also direct queries in English to Hindi databases. ISI researchers concentrated on resource building, machine translation, summarization, and an efficient navigational interface for users; ISI researcher Franz Josef Och says the most probable translation for a given inquiry is determined through statistical models. The process involves feeding the system a series of parallel texts--material written in a foreign language and its English equivalents--so that it can find the most likely English translation for the given text via statistical analysis. ISI computational linguist Ulrich Germann notes that building parallel texts was especially challenging because nearly every Hindi Web site boasts a unique encoding. ISI researchers Anton Leuski and Chin-Yew Lin's contribution to the project was a multi-document search, summarization, and translation tool that lets users enter English search terms and produces results categorized by textual similarities. Graduate student Liang Zhou devised a technique for creating headlines for each collection of similar stories; Leuski's Lighthouse visualization system arranges results as clustered spheres. Four Hindi translation systems were developed independently by the participating sites, and their performances will be rated and compared in upcoming trials.
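The statistical approach Och describes amounts to choosing, among candidate English renderings e of a Hindi sentence f, the one maximizing P(e) x P(f|e), with both probabilities estimated from parallel text. A toy sketch with invented candidates and scores:

```python
# Toy noisy-channel translation: pick argmax_e P(e) * P(f | e).
# All candidate sentences and probabilities are invented for illustration;
# real systems estimate them from millions of parallel sentences.

candidates = {
    # English candidate: (language-model score P(e), translation score P(f|e))
    "the weather is nice today": (0.020, 0.30),
    "weather the nice is today": (0.0001, 0.35),  # good word match, bad English
    "it is raining today":       (0.015, 0.02),   # fluent but wrong content
}

def best_translation(cands):
    """Return the candidate maximizing the product of the two model scores."""
    return max(cands, key=lambda e: cands[e][0] * cands[e][1])

print(best_translation(candidates))  # 'the weather is nice today'
```

The product rewards candidates that are both fluent English (high language-model score) and faithful to the source (high translation score), which is why the word-salad and off-topic candidates lose.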
- "Computers Are Getting Better at Poker"
Australian IT (07/01/03); Gengler, Barbara
A research team at Canada's University of Alberta has labored for over a decade to produce a poker-playing computer program, the ultimate goal being one that can beat world-class poker players. Although this objective has yet to be met, the project's latest brainchild, PsiOpti, can compete with strong players. Poker strategy decisions rely on little, if any, hard data, which makes the game perfect for artificial intelligence research. PsiOpti, which combines algorithmic methods, statistical analysis, observation, and Monte Carlo simulation, is based on a mathematical formula derived from the work of Nobel Prize-winning game theorist John Nash. "Nash showed that there is an optimal point for an imperfect information domain, like poker," explains UA professor Jonathan Schaeffer, who notes that uncovering that optimal point is beyond current computational capabilities. Schaeffer's research team therefore devised a "pseudo-optimal" solution that Schaeffer describes as "close enough [to Nash's optimal] that the program plays well." Some of PsiOpti's Java-based source code has been issued to help other poker research projects; the UA team disclosed a Texas Hold'em communication protocol that enables new computer programs and people to compete over the Internet. Schaeffer's collaborator in the project is former student turned professional poker player Darse Billings.
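Nash's result guarantees an optimal mixed strategy even in imperfect-information games, and for games vastly smaller than poker that strategy can be computed exactly; it is poker's enormous size that forces "pseudo-optimal" approximations. A sketch solving a toy 2x2 zero-sum game (the payoff matrix is illustrative and unrelated to PsiOpti's actual model):

```python
# Nash-optimal mixed strategy for a 2x2 zero-sum game via the indifference
# condition: the row player's mix makes the opponent indifferent between
# columns. Assumes an interior equilibrium (no saddle point), as here.

# Row player's payoffs: rows = row player's actions, columns = opponent's.
A = [[2.0, -1.0],
     [-1.0, 1.0]]

def solve_2x2(A):
    """Return (p, value): play row 0 with probability p, row 1 with 1-p,
    chosen so p*A[0][0] + (1-p)*A[1][0] == p*A[0][1] + (1-p)*A[1][1]."""
    num = A[1][1] - A[1][0]
    den = (A[0][0] - A[1][0]) - (A[0][1] - A[1][1])
    p = num / den
    value = p * A[0][0] + (1 - p) * A[1][0]
    return p, value

p, value = solve_2x2(A)
print(p, value)  # 0.4, 0.2: randomize 40/60, guaranteeing 0.2 per round
```

Full-scale Texas Hold'em has far too many information states for any such direct calculation, which is the gap Schaeffer's pseudo-optimal abstraction is designed to bridge.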
- "'Hotspots,' Cold Feet"
Boston Globe (06/30/03) P. C1; Howe, Peter J.
Analysts warn that the current build-out of Wi-Fi "hotspots" nationwide is leading to a dot-com-like bubble. Many experts see Wi-Fi's promise of providing high-speed, wireless Internet access to millions of mobile users as a key driver to the next stage of the Web economy, but how that market evolves and when is widely debated. The business models surrounding public Wi-Fi access were the main topic at the recent Jupiter Media industry convention in Boston. Many observers noted that Wi-Fi is valuable to existing services firms such as landline ISPs, hotels, or restaurants, but that new subscription companies are basing their future profitability on an untested market. Verizon, for instance, has already deployed 200 hotspots on top of public pay phones in New York City as a value-added service for existing DSL subscribers. Pyramid Research analyst Daniel Torras says Wi-Fi connectivity is becoming another competitive requirement for services industries such as hotels and restaurants, where businesses that offer the service for free will gain customers from those not offering Wi-Fi. Boingo Wireless, the largest public Wi-Fi pure play, estimates there are more than 2 million locations worthy of Wi-Fi hotspots, but CEO David Hagan says the infrastructure is still not developed enough to spur mass-market adoption. Wayport's Dan Lowden says the key is finding places where many people need access to the Internet on a temporary basis, such as airports and hotels. Another potential booster for public Wi-Fi is the cell phone industry, which sees the technology as a way to alleviate traffic congestion for its wireless data services, which offer slower throughput but much broader coverage.
- "Just a Walk in the PARC--33 Years On"
Sydney Morning Herald (07/01/03); Cochrane, Nathan
The Palo Alto Research Center (PARC) is celebrating its 33rd year, a distinguished history marked by innovation. Ethernet, the personal computer, and the graphical user interface were all invented or refined at PARC, spawning commercial successes such as Microsoft's Windows and Apple's Macintosh. Although PARC's former corporate parent was able to capitalize broadly only on the laboratory's laser printing invention, researchers say PARC's newfound independence has honed its commercial focus. Electronic materials laboratory manager Ross Bringans expects his work in printable organic electronics to be used soon in cell phones and radio-frequency identification (RFID), for instance. Bringans says cell phones could be equipped with larger roll-up displays, and RFID tags could be printed cheaply for inventory management. Dr. Craig Mudge, a former PARC computer science laboratory manager turned independent consultant (the current CSL manager is Richard Bruce), says PARC is ahead of the curve in terms of commercially focused research, with government laboratories being among the last to realize the shift. He expects PARC's work in "sensor nets" to make a big impact in supply-chain management, outdoor fire fighting, and warfare. Meanwhile, so-called smart materials promise dramatic commercial benefits, such as airplane skins that change texture according to aerodynamic requirements.
- "Grokking the Infoviz"
Economist Technology Quarterly (06/19/03) Vol. 367, No. 8329, P. 25
Information visualization (infoviz) technology, which is gearing up to penetrate the mass market, is supposed to provide a next-generation user interface superior to the conventional desktop interface by allowing corporate, online, and PC-stored information to be graphically rendered in a way that eases data-searching. Inxight CTO Ramana Rao posits that infoviz's slow route to commercialization is attributable to the fact that the current PC interface is deeply rooted in the consumer sector, while interface development is coordinated by only a few technology providers. However, the proliferation of infoviz has started to pick up because firms are eager to cut costs and the infrastructure is maturing--more and more data is either kept in a "structured" format or can be re-formatted in such a way with special software. Many infoviz solutions depict data as surface maps: Antarctica Systems' Visual Net converts data into a geographical representation that marks directory categories as countries and Web sites as cities. Smart Money's MarketMap delivers a heat-map representation where share price, interest rates, and other financial vectors are color-coded to indicate their strength or weakness. Inxight and Plumb Design offer infoviz tools that display information in cluster configurations such as trees or webs; Inxight's Star Tree interface also allows users to shift their focus to specific cluster nodes. Even more elaborate is Groxis' Grokker interface, which represents data as circles inside of circles, and enables users to view information from multiple angles and share their visualizations with other people. Stumbling blocks to infoviz's mainstream adoption include users' reluctance to learn new technologies and the unstructured format typical of most data.
- "Future Watch: Taming Data Complexity"
Computerworld (06/30/03) Vol. 31, No. 32, P. 31; King, Julia
Carnegie Mellon spin-off Maya Design offers a way to make information in relational databases available to users as they need it. The concept uses data "containers" called u-forms to package information in a pure form, with optional metadata layered on top. U-forms exist in an information commons accessible through peer-to-peer connections. With Maya's PC-based browser application, users would be able to call up information stored in u-forms in a way that makes sense to them--for instance, an inventory manager could see a map with inventory levels while a distribution manager could see the same map but with logistics data. ZapThink analyst Jason Bloomberg says the idea differs from the Semantic Web in that it helps humans understand data in their particular context, whereas the Semantic Web is meant to facilitate computer-to-computer communications. The U.S. Transportation Command is one test user, and has employed Maya technology when storing data from different sources and distributing it to customers. Lt. Col. Cody Smith says data is presented in metric units for certain users and in standard measures for others. Another project using Maya technology is the Pittsburgh Green Map, which consolidates "green" asset information in western Pennsylvania: Users can locate the specific types of environmental or recreational assets they are looking for using the system. Maya Design CEO Peter Lucas envisions all public data housed in a large peer-to-peer information commons, which he calls the Civium Project; he estimates Maya technology is still in its infancy and will gain momentum over the next few years.
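As described, a u-form is essentially a globally unique identifier bound to an open-ended set of attribute/value pairs, with different users projecting out the attributes they need. A minimal sketch (the attribute names are invented for illustration; Maya's actual schema may differ):

```python
import uuid

def make_uform(**attributes):
    """A u-form: a globally unique ID plus arbitrary attribute/value pairs.
    Metadata is just more attributes layered on the same container."""
    return {"uuid": str(uuid.uuid4()), "attributes": dict(attributes)}

# One underlying record, viewed differently by different users.
warehouse = make_uform(
    name="Pittsburgh depot",
    location=(40.44, -79.99),
    inventory_level=1200,           # what the inventory manager cares about
    shipping_lane="I-76 corridor",  # what the distribution manager cares about
)

def view(uform, *fields):
    """Project a u-form onto the attributes a particular user needs."""
    return {f: uform["attributes"][f] for f in fields if f in uform["attributes"]}

print(view(warehouse, "name", "inventory_level"))  # inventory manager's view
print(view(warehouse, "name", "shipping_lane"))    # distribution manager's view
```

The stable unique ID is what lets copies of the same u-form circulate through a peer-to-peer commons while still being recognized as one record.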
- "6 Myths"
InformationWeek (06/30/03) No. 946, P. 30; Foley, John; Murphy, Chris; Zaino, Jennifer
InformationWeek's John Foley and Chris Murphy address six critical misconceptions about the current state of business IT. IT commoditization is not a sign of business technology's declining strategic value: Harris CTO Richard Plane says that technology alone cannot support business-technology environments; rather, it is a combination of "multidimensional" factors such as people skills, business processes, and intellectual capital that drives environments through such approaches as business-to-business collaboration, real-time data delivery, business-process optimization, and customer-driven decision making. The popular consensus that innovation has ground to a halt is attributed to reduced venture capital for startups and the belief that CIOs are buying chiefly from the major IT providers, but the authors argue that the situation is not that bleak. For one thing, startup funding for 2002, although less than in 2000, was higher than in 1998; furthermore, the spending downturn has actually helped spur innovation in markets such as business technology management. Foley and Murphy assert that CIOs' status and job security are not falling, as is assumed: Norbert Kubilus of Tatum CIO Partners reports that CIOs in large companies have slightly more influence and longevity than before, and those with the best prospects combine technical and business skills and can relate to CFOs and CEOs. The economic recession and the growth of IT outsourcing have depressed IT workers, leading them to think IT jobs are a dead end, but the authors counter that business-technology professionals still earn more than workers in other disciplines, while salaries are at more or less the same levels as they were before the recession; Vanguard Group CIO Tim Buckley says IT job opportunities abound, but the competition is fierce and IT staff cannot get ahead without the help of mentors or managers. 
The assumption that long-range IT projects are dead is refuted by the fact that long-term projects are being undertaken in all industries: Such projects include business-technology architecture updates, enterprise resource planning deployments, customer interface overhauls, and other initiatives closely tied to business strategy. Finally, Foley and Murphy dispute the conceit that IT budget cuts are responsible for fewer IT implementations, citing a June InformationWeek Research survey noting that, while 33 percent of business-technology executives expect a decrease in IT spending in 2003, the remaining 67 percent expect spending to hold steady or increase; more value can also be squeezed out of every dollar thanks to lower software and hardware costs.
- "The New Science of Networks"
Business Communications Review (06/03) Vol. 33, No. 6, P. 22; Herman, Jim
The fundamental component of many networks, including social networks, is the "small-world" network, in which any node is typically only a few steps (famously, six degrees of separation) away from any other node, writes consultant Jim Herman. Herman says this is because real-world networks are usually compact clusters with a small number of connections to other clusters. Furthermore, University of Notre Dame researcher Albert-Laszlo Barabasi's effort to map out the World Wide Web revealed that the Web follows a scale-free network distribution pattern characterized by both a small number of sites (nodes) with many site links and a large number of sites with only a few links. The small-world architecture supports short routes through large-scale networks, which is why computer viruses and other disruptions spread so rapidly over the Internet. This phenomenon also accounts for the accelerated distribution of pirated content and networks' vulnerability to cascading failures. The flip side is that such networks are very resilient against random failures. On the other hand, the existence of short paths is not a guarantee that such paths can be tapped, as evidenced by the difficulty users encounter with carrying out directed searches on a corporate network. Herman writes that this "New Science of Networks" offers "a new way of looking at networks, and borrows from cutting-edge approaches to chaos theory and advanced non-linear mathematics."
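The small-world effect Herman describes can be reproduced in a few lines: a tightly clustered ring network has long typical paths, and a handful of random long-range links collapses them (a Watts-Strogatz-style construction with illustrative parameters):

```python
import random
from collections import deque

def ring_lattice(n, k):
    """Ring of n nodes, each linked to its k nearest neighbours on each side:
    a highly clustered network with no long-range links."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(1, k + 1):
            adj[i].add((i + j) % n)
            adj[(i + j) % n].add(i)
    return adj

def avg_path_length(adj, samples=200, seed=1):
    """Average shortest-path length between randomly sampled node pairs (BFS)."""
    rng = random.Random(seed)
    nodes = list(adj)
    total = 0
    for _ in range(samples):
        src, dst = rng.choice(nodes), rng.choice(nodes)
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += dist[dst]
    return total / samples

n = 1000
net = ring_lattice(n, 3)
before = avg_path_length(net)        # long paths: roughly n / (4 * 3)

rng = random.Random(42)
for _ in range(20):                  # a handful of random long-range shortcuts
    a, b = rng.randrange(n), rng.randrange(n)
    net[a].add(b)
    net[b].add(a)
after = avg_path_length(net)

print(before, after)                 # typical distances collapse
```

Just 20 shortcuts among 1,000 nodes cuts typical distances several-fold, which is the same mechanism that lets viruses and pirated content race across the Internet's clustered topology.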
- "Interactions in Education: A Conversation With Brenda Laurel"
Syllabus (06/03) Vol. 16, No. 11, P. 22; Cavalier, Robert
Human Computer Interaction (HCI) expert Brenda Laurel, chair of the Art Center College of Design's Media Design Program, believes HCI can play an essential role in higher education, but such programs should follow a specific philosophy. Central to this philosophy is the perception that a computer should be personalized to the user's needs, not vice versa. Laurel provides an example of such a program in the Art Center's trans-disciplinary studios, where several academic departments can engage in collaborative projects where digital media is a key ingredient. Laurel notes that her school has made the completion of digital media courses a student requirement. Another example she provides is a motion capture laboratory at Ohio State that actors can use to enhance theatrical works by generating animated characters that move in real time. Laurel once wrote that "The coming times call for a complex form of ethical literacy--the ability to determine what is true, good, and valuable in a world that is radically different from our own"; this means that the Web can potentially reverse society's growing passivity, which Laurel blames on media-disseminated "programming content." Laurel's class, for instance, is trying to convert the Web into a popular forum for public discourse by developing methods for amassing personal stories and observations into an interactive medium that carries as much authority and cultural relevance as broadcasting media. Accomplishing this entails solving a problem that is both technological and political in nature, Laurel points out. She says the most manageable challenge for educators is to use technology to simulate the physical world and how people affect it.