ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of ACM.

To send comments, please write to [email protected].

Volume 6, Issue 592: Friday, January 9, 2004

  • "U.S. Could Lose Technology Dominance, Executives Say"
    Washington Post (01/08/04) P. E5; Krim, Jonathan

    The United States' leadership position in the knowledge economy is threatened by expanding tech efforts in India, Russia, China, and elsewhere; the U.S. technology infrastructure must be shored up with more federal funding for research and development, financial incentives for private-sector tech rollouts, and improvements in mathematics and science education, according to representatives of the Computer Systems Policy Project (CSPP) trade group. Intel CEO Craig R. Barrett and Hewlett-Packard CEO Carleton S. Fiorina declared at a Jan. 7 press conference that the competitive edge of the United States is being blunted by falling government R&D budgets and inadequate educational initiatives in science and technology in grades K-12. Barrett also warned against knee-jerk protectionist reactions to the offshore outsourcing of IT jobs to lower-wage countries, which he argued is a normal part of business cycles. Fiorina predicted that the outsourcing trend will change which tech jobs lead in the United States, and that skills such as the ability to coordinate multiple systems and networks will be highly valued. She also cautioned that short-term financial and employment issues are a distraction, saying, "The biggest barrier [to solutions] is our nation's attention span." The CSPP has introduced several proposals to help maintain U.S. tech dominance, which Barrett claimed would collectively cost less than the $30 billion federal investment in agricultural subsidies. One CSPP proposal is the Infrastructure Investment Act of 2004, which calls for special tax and regulatory rules to promote the construction of more broadband networks. Another is the Mathematics and Science Improvement Act of 2004, a proposal to make education and school testing in those subjects more rigorous.
    Click Here to View Full Article

  • "MCNC Develops New Protocol, Device for Optical, Grid Networking"
    LocalTechWire.com (01/07/04); Smith, Rick

    North Carolina State University professors and the nonprofit Microelectronics Center of North Carolina (MCNC) have created a new networking protocol and accompanying appliance enabling on-demand provisioning of bandwidth. The JITPAC project is useful for real-time provisioning required in computing grids and in next-generation research networks such as the recently announced National Lambda Rail. The Just-in-Time (JIT) protocol takes advantage of microelectromechanical switches to provision optical connections in just milliseconds, compared to the minutes required with generalized multi-protocol label switching, the current industry standard. JIT does not require specific content formats or single types of signals, and works with off-the-shelf commercial equipment. MCNC Advanced Network Research Division vice president Dan Stevenson says the JIT protocol allows applications to easily request, use, and release bandwidth without the need for resource-consuming dedicated circuits; in this sense, JIT lets networks provision resources similarly to how computing grids allocate processing power and storage, he says. The second part of the project is the protocol acceleration circuit (PAC) that augments conventional high-speed routers to allow data to flow uninterrupted throughout the network. Only the traffic control information must be converted from optical signals to electronic and back, while the bulk of the data does not have to be converted. PAC is a "different kind of router" that has generated interest from major networking equipment companies such as Cisco and Nortel, says Stevenson. JITPAC was funded in part by NASA and the Defense Department and is being used by Defense's Advanced Technology Demonstration Network.
    Click Here to View Full Article
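
    To make the on-demand model concrete, here is a minimal Python sketch of the request/use/release lifecycle Stevenson describes; it illustrates the provisioning idea only, not the JIT protocol itself, and the class and method names are hypothetical.

        # Illustrative sketch of on-demand bandwidth provisioning in the spirit of
        # the request/use/release model described above. Names and capacities are
        # hypothetical; this is not the JIT protocol itself.

        class BandwidthBroker:
            def __init__(self, total_gbps):
                self.total_gbps = total_gbps      # capacity of the optical link
                self.allocations = {}             # flow_id -> reserved Gbps

            def request(self, flow_id, gbps):
                """Reserve capacity for a flow; returns True if provisioned."""
                free = self.total_gbps - sum(self.allocations.values())
                if gbps <= free:
                    self.allocations[flow_id] = gbps
                    return True
                return False

            def release(self, flow_id):
                """Return the capacity as soon as the transfer finishes."""
                self.allocations.pop(flow_id, None)

        broker = BandwidthBroker(total_gbps=10)
        assert broker.request("grid-job-42", 4)   # ask only when bandwidth is needed
        broker.release("grid-job-42")             # ...and release it immediately afterward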

  • "NASA's IT Success"
    InformationWeek (01/07/04); Greenemeier, Larry

    The successful landing and operation of NASA's robotic exploration rover on Mars is a confidence booster for CIOs struggling with business technology, since the complexity involved in transmitting and receiving massive amounts of data between Earth and Mars has many parallels in the business world. Jennifer Trosper of the Jet Propulsion Laboratory (JPL) notes that the transmission of data from the Spirit rover to NASA's Deep Space Network Antennas on Earth consumes a great deal of power, so some of that data is first sent to the orbiting Odyssey and Global Surveyor spacecraft, which then relay the information to the NASA receivers. The rover and orbiters can communicate for approximately eight minutes at a time and transfer roughly 60 MB of data; Trosper reports that Spirit, Odyssey, and Surveyor send as much as 150 MB of data to Earth each day, and this data will increase when a second Martian rover touches down later in January. Four operational storage servers with a capacity of 4 TB have been installed at JPL by NASA and Sun Microsystems to deal with this information deluge. A lot of the orbiters and rovers' operations technology has been commercially available for some time. Wind River Systems' VxWorks real-time operating system, for example, helps Spirit's software and hardware elements operate cooperatively. Trosper and Wind River's Mike Deliman believe that technology introduced by JPL for the Mars mission will have a role in future business technology. Deliman also notes that JPL's AutoNav system, which allows the rover to navigate through its environment without human aid, could contribute to the development of a self-driving automobile.
    Click Here to View Full Article
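
    A back-of-the-envelope calculation using only the figures quoted above puts the relay throughput and storage headroom in perspective (treating 1 MB as 10^6 bytes for simplicity):

        # Rough arithmetic based on the figures quoted in the article
        # (1 MB = 1e6 bytes, 1 TB = 1e12 bytes, for simplicity).

        pass_minutes = 8      # one rover-to-orbiter communication window
        pass_mb = 60          # data moved per window
        daily_mb = 150        # Spirit plus both orbiters, per day
        storage_tb = 4        # capacity of the Sun storage servers at JPL

        rate_mbps = (pass_mb * 8) / (pass_minutes * 60)         # megabits per second
        print(f"Effective relay rate: {rate_mbps:.2f} Mbit/s")  # ~1 Mbit/s

        days_to_fill = (storage_tb * 1e6) / daily_mb
        print(f"Days to fill 4 TB at 150 MB/day: {days_to_fill:,.0f}")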

  • "At Tech Trade Show, Devices Don't Speak Same Language"
    Los Angeles Times (01/08/04) P. C1; Healey, Jon

    Device incompatibility was a running theme at this week's International Consumer Electronics Show, despite companies flaunting new products that promise a universal interface for computers, home entertainment systems, and the Internet. Yet businesses are forging ahead with the rollout of proprietary offerings despite their lack of interoperability. "What that means is all these content-protection technologies are going to collide in the home," warned Intel VP Donald M. Whiteside. "And every device in the home isn't going to be able to recognize all of those protection technologies." An NPD Group survey to be released Jan. 8 indicates that more and more users are interested in being able to access movies, games, and music on any convenient household device, but establishing network-based communications between computers, TVs, and stereos is a formidable challenge. The Digital Home Working Group, an association of technology and consumer-electronics companies, has been working on a unified strategy for networking home-based PCs and entertainment devices. The coalition has almost finalized its first round of standards, which cover audio, video, and image files that are not electronically shielded; but those standards will not apply to encrypted Hollywood movies and popular music delivered over the Internet. The advent of digital home networks will rely on the availability of compelling mainstream content.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "New UCI Center Promotes Diversity in Technology Fields"
    EurekAlert (01/08/04)

    The Ada Byron Research Center (ABRC) at UC Irvine's School of Information and Computer Science is dedicated to the study of diversity in technology fields, with an emphasis on boosting the recruitment and retention of women, Latinos, African Americans, and other minorities in IT through research, outreach, and educational programming. "Under Dean [Debra J.] Richardson's leadership, ABRC will formalize and leverage current diversity efforts and expand interdisciplinary research and curricular revisions to encourage a more diverse population studying, teaching and creating information technology applications," stated William Parker, UCI vice chancellor for research and graduate studies. ABRC will establish new classes and academic majors at Irvine and other UC campuses to determine the fundamental reasons for the scarcity of women and other underrepresented populations in IT and computing, and tackle the barriers to their full participation in these fields. The center also has a major role in the National Center for Women and Information Technology, a newly established national association of organizations committed to increasing women's presence in the IT sector. Women account for less than 18 percent of all IT jobs, even though almost 50 percent of the U.S. workforce is female; meanwhile, only 30 out of every 1,000 computer science graduates are African American, 12 are Hispanic, and one is Native American. Among the initiatives to be promoted at ABRC is the Outreach Road Show program, which is designed to expose junior high and high school students to science-related career opportunities through classroom demonstrations. Another ABRC program is Laptops for Literacy, which will study laptop computers' potential contribution to students' computer skills and academic achievement, particularly in culturally and linguistically diverse schools.
    Click Here to View Full Article

  • "Let There Be L.E.D.'s"
    New York Times (01/08/04) P. E1; Austen, Ian

    Light-emitting diodes (LEDs) are being touted as a more power-efficient substitute for incandescent light bulbs, though the technology is not without its disadvantages, cost being one of the biggest. Beyond efficiency, LEDs also last longer and give off less heat than incandescent bulbs. Light bulb manufacturers large and small are interested in LEDs as a core technology for new products, but power conservation is also an issue: lighting accounts for approximately 20 percent of all electricity used in the United States, and a transition from bulbs to LEDs could halve that percentage, relieving pressure on the electrical infrastructure and reducing the chances of major outages. Dr. Anil R. Duggal of General Electric reports that the incandescent bulb is the least efficient option--in fact, roughly 90 percent to 95 percent of the electricity channeled into most incandescent bulbs is turned into heat. Consumers, however, prefer such bulbs because they are cheap, though researchers are working on making solid-state lighting technology more affordable. Meeting this challenge requires overcoming several technical barriers, such as LEDs' inability to generate light of a hue appropriate for everyday household use, and their lack of diffuse brightness. LEDs could also face challenges from a sister technology, organic light-emitting diodes (OLEDs): Plastic-based OLEDs do not need to be fabricated in semiconductor factories, and Duggal notes that they could be stamped out in much the same way newspapers are printed. Moreover, they could be printed on flexible materials, which could lead to strips of lighting that could be rolled up like wallpaper.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
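
    The savings claim is easy to check against the article's own numbers: if lighting draws about 20 percent of U.S. electricity and LEDs cut that share roughly in half, about 10 percent of total consumption is freed up. A short worked example:

        # Worked example using the percentages quoted in the article.

        lighting_share = 0.20            # fraction of U.S. electricity used for lighting
        led_reduction = 0.50             # LEDs roughly halve lighting's draw
        incandescent_heat_loss = 0.925   # midpoint of the 90-95 percent figure above

        savings_share = lighting_share * led_reduction
        print(f"Share of total U.S. electricity freed up: {savings_share:.0%}")   # ~10%
        print(f"Incandescent energy lost as heat: {incandescent_heat_loss:.1%}")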

  • "Internet Cleans Its Own House"
    USA Today (01/08/04) P. 13A; Downes, Larry

    Larry Downes of the University of California, Berkeley's School of Information Management and Systems thinks it is far more effective to let the Internet community solve spam, privacy infringement, and other problems plaguing the Web than leaving it to government regulation. This is because the Internet community instinctively knows the best time to tackle such challenges--when they stop being mere irritants and become threats to productivity. For example, the FTC has limited its campaign against pop-up ads to the most deplorable examples, and has been repeatedly defeated in the courts. However, Google, Earthlink, and other companies are offering free "pop-up stopper" software that eliminates nearly all such ads, while Microsoft declared in November 2003 that a pop-up blocker would be embedded into the next version of Internet Explorer. Meanwhile, Downes writes that the federal anti-spam law that went into effect on Jan. 1 does not appear to have had a significant impact on purveyors of junk email: Criminalizing the use of bogus email addresses has little effect when the addresses cannot be traced in the first place. Another problem that could undermine the law is the likelihood that it will face First Amendment challenges. The Internet community is developing its own anti-spam solutions: NetsEdge Research anticipates that the spam glut will be mitigated within half a year thanks to the efforts of ISPs and email hosts who introduce address-verification technology, while reputation systems will give users the ability to choose which legitimate marketers they want to receive messages from.
    Click Here to View Full Article

  • "Next Digital Screen Could Fold Like Paper"
    Christian Science Monitor (01/08/04) P. 14; Valigra, Lori

    Flexible display technology is on the horizon, with "smart" papers and super-thin glass displays expected to debut this year; planned further down the line are wearable display technologies such as clothes that change color, automatic camouflage systems for soldiers, reusable paper, and lightweight, updateable electronic books. A collaborative effort between E Ink of Cambridge, Mass., Royal Philips Electronics, and Toppan Printing has yielded thin e-books to be sold by Sony and Matsushita in early 2004. The books utilize electronic ink, which Philips scientist Mark Johnson describes as a medium consisting of black-and-white capsules that configure into text and images in response to electrical current, with content downloaded through wireless networks; the e-ink is coated onto a thin polymer substrate, which is placed on a sheet of electrode-studded glass, while electronics positioned along the edge control the ink. The e-book to be sold by Matsushita employs cholesteric liquid-crystal displays, which produce images when an electric field triggers a switch between a color and black. Another application E Ink is working on is "Radio Paper," a flexible display capable of wireless updates that could be rolled out in three or four years. An even more tantalizing product is an electrically conductive ink that can produce sounds or light up. Meanwhile, roll-up digital displays could become a reality thanks to a semiconductor ink that prints transistors developed by researchers at Xerox's Palo Alto Research Center. Flexible displays are a major area of focus for the U.S. military: Army Research Lab physicist David Morton reports that future uniforms will be enhanced with such displays, and adds that the Army has a 10- to 15-year goal "to go to a flexible substrate [display backing material] that can be printed roll-to-roll like newspapers, then cut to size."
    Click Here to View Full Article

  • "Lord of the Nano-Rings May Hold Key to I.T."
    NewsFactor Network (01/06/04); Martin, Mike

    Purdue University researchers believe it is theoretically possible to develop nonvolatile memory based on self-assembling cobalt "nano-rings" capable of storing data at room temperature. Purdue chemist Alexander Wei describes the rings as "essentially tiny magnets with a north and south pole, just like the magnets you played with as a kid." The nano-ring formation is attributed to the magnetic dipoles of the nano-particles, which generate a "flux closure" magnetic state in which the magnetic force is cut off from the outside once the rings close. The Purdue researchers were able to monitor the stability of the flux-closure states at room temperature through electron holography. Wei notes that the nano-rings contain a magnetic field that travels either clockwise or counterclockwise, which gives them the ability to store binary data. He adds that errors during data processing are reduced because the rings keep the magnetic field isolated. Furthermore, the rings could lower the likelihood that information will be lost when the power is cut. "Systems like this could be what the data-storage industry is looking for," remarks Wei. The cobalt nano-rings could store data in a space less than 1/10,000th the width of a human hair.
    Click Here to View Full Article
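
    The storage idea reduces to mapping each ring's flux-closure direction to one bit; the toy sketch below is purely illustrative and models no physics.

        # Toy illustration: each ring's flux direction (clockwise or
        # counterclockwise) encodes one bit. No physics is modeled.

        CLOCKWISE, COUNTERCLOCKWISE = 1, 0

        def write_bits(bits):
            """Represent a bit string as a list of ring chiralities."""
            return [CLOCKWISE if b == "1" else COUNTERCLOCKWISE for b in bits]

        def read_bits(rings):
            return "".join("1" if r == CLOCKWISE else "0" for r in rings)

        rings = write_bits("1011")
        assert read_bits(rings) == "1011"   # the direction persists without power (nonvolatile)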

  • "A Virtual Cash Register Rings Up Tiny Transactions"
    New York Times (01/08/04) P. E5; Eisenberg, Anne

    Electronic micropayment systems may get a shot in the arm with the emergence of new technology. MIT researchers Ronald L. Rivest and Silvio Micali founded Peppercoin, a company offering software designed to cut online merchants' transaction costs by employing sophisticated encryption and mathematical models to avoid charging sellers a fee for every sale of a specific item; the system statistically chooses a representative example of the transactions for billing, using algorithms that were developed and tweaked over the past two decades. Smithsonian Folkways Recordings has entered into an agreement with Peppercoin to reduce transaction fees from the electronic sale of 33,000 folk recordings. Another new micropayment firm is BitPass, which offers a simplified system that lumps all gatekeeping and financial processing services into one file that can be easily downloaded to a Web server for use by a hosted site. "This way the little guy can engage in digital commerce," notes BitPass CEO Kurt Huang. DaVinci Institute executive director Dr. Thomas Frey believes electronic micropayment systems will make it possible for consumers to purchase a wide variety of low-cost items online, including cell phone ring tones, video game upgrades, and fashion items for electronic avatars. However, some people doubt that a business can realize success through micropayments alone. Visa International's Sandy Thaw argues that successful companies will also have to take into account extra costs for brand and service promotions, as well as efforts to make those services available to merchants and consumers.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
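
    The general idea of statistically sampling transactions for billing, though not Peppercoin's actual algorithm, can be sketched in a few lines: bill roughly one transaction in N at N times its value, so the merchant's expected payout matches the sum of the micro-sales while per-transaction card fees are incurred far less often.

        # Sketch of probabilistic transaction aggregation, the general idea behind
        # billing a statistical sample of micropayments rather than every sale.
        # This is an illustration, not Peppercoin's actual algorithm.
        import random

        def settle(transactions, n=100, rng=random.Random(0)):
            """Bill ~1 in n transactions at n times its value; expected payout is unchanged."""
            billed = []
            for amount in transactions:
                if rng.random() < 1.0 / n:
                    billed.append(amount * n)   # one card charge stands in for ~n sales
            return billed

        sales = [0.25] * 100_000                # 100,000 quarter-dollar downloads
        charges = settle(sales)
        print(f"{len(charges)} card charges instead of {len(sales)}")
        print(f"Expected revenue ${sum(sales):,.2f}, sampled payout ${sum(charges):,.2f}")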

  • "UI 'Sitting on a Breakthrough'"
    Champaign News-Gazette (Il) (01/03/04); Kline, Greg

    Light-speed computing and electronic communications are among the potential applications that University of Illinois professor Nick Holonyak envisions for a new light-emitting transistor. The transistor was developed by UI researchers working in laboratories headed by Holonyak and fellow electrical and computer engineering professor Milton Feng. The device, which was assembled by graduate student Walid Hafez in the UI Micro and Nanotechnology Lab, was touted as the world's fastest transistor when it was unveiled in 2003; it runs at a speed of 509 GHz. Holonyak believes the technology has more near-term potential than nanotechnology to advance computing power. Feng says the transistor boasts both an optical and an electronic output, whereas conventional transistors have only an electronic output. Thus, the possibility exists of substituting optical interconnects for electrical wiring between circuit board components so that photons can be directed around a chip to conduct operations far faster than is feasible with electronics alone. Feng adds that visual displays could benefit from the technology because it would obviate the need for separate light-emitting devices and transistors. Holonyak says the breakthrough technology could usher in a new era of computing in which time and distance are compressed and "you could have access to anything, anywhere in terms of information."
    Click Here to View Full Article

  • "Instrumenting the Enterprise"
    ZDNet (01/05/04); Farber, Dan

    Dan Farber writes that the instrumentation of IT--in which the behavior of IT infrastructure "instruments" is monitored and managed at multiple levels across the extended enterprise--will be a critical theme in 2004. The goal of instrumentation is to incorporate, or architect, rules of acceptable behavior within software and hardware, along with the ability to disseminate essential data about an instrument's "state." Farber comments that enterprises are already applying instrumentation to parts of data centers, security infrastructure, and other areas, and adds, "We are moving from isolated and primitive instruments in an increasingly complex distributed computing environment to more comprehensive instrumentation, analogous to how a patient in an intensive care unit is instrumented." Tools for instrumenting applications as a component of systems management are being offered by several companies: For instance, business activity monitoring (BAM) supplies grist for business process management through the output of instrumented applications. Farber points out that the integration of architecting and instrumenting IT across the enterprise is a nascent practice, while most instrumentation revolves around monitoring rather than following a course of action dictated by rules and inputs. The author writes that enterprises can boost the automation and flexibility of their IT operations through instrumentation, and deploying such instrumentation will become easier as standards emerge. At the cutting edge of IT instrumentation are efforts by major platform vendors to give utility computing a mainstream foothold, such as Sun Microsystems' JFluid project, IBM's autonomic computing initiative, and Microsoft's System Definition Model.
    Click Here to View Full Article
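
    A minimal sketch of what "instrumenting" a component might look like, assuming nothing beyond the two ideas Farber highlights: exposing essential state and checking it against rules of acceptable behavior. The names and thresholds are hypothetical.

        # Minimal sketch: an instrumented component exposes its state, and a monitor
        # checks that state against rules of acceptable behavior.
        # All names and thresholds are hypothetical.

        class InstrumentedQueue:
            def __init__(self, capacity):
                self.capacity = capacity
                self.items = []

            def put(self, item):
                self.items.append(item)

            def state(self):
                """Essential data about this instrument's current condition."""
                return {"depth": len(self.items),
                        "utilization": len(self.items) / self.capacity}

        def monitor(component, rules):
            """Return the names of any acceptable-behavior rules the state violates."""
            state = component.state()
            return [name for name, check in rules.items() if not check(state)]

        queue = InstrumentedQueue(capacity=100)
        for i in range(95):
            queue.put(i)
        print(monitor(queue, {"utilization_below_90pct": lambda s: s["utilization"] < 0.9}))
        # ['utilization_below_90pct'] -> a rule (or an operator) should now act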

  • "CITRIS Q&A: Interview With Bernd Hamann"
    CITRIS Newsletter (12/03) Vol. 2, No. 5

    Bernd Hamann, co-director of UC Davis' Center for Image Processing and Integrated Computing (CIPIC), has developed data visualization techniques that have helped researchers obtain new insights about external phenomena and human biology. He says the goal of CIPIC is to pioneer visualization technology that helps people quickly and easily understand massive volumes of data. His center processes information from large sensor networks that collect data on virtually anything, including the structural integrity of bridges following a seismic event and the state of a building environment after a fire. Hamann explains that people want to instantly generate images representing recorded data to facilitate simple analysis. He notes that one of the biggest technical challenges in creating new visualization technology is enabling scientists to remotely access and browse huge portions of data over high-speed networks. CIPIC is also involved in a collaborative project with the UC Davis Center for Neuroscience devoted to visualizing human and primate brain data derived from magnetic resonance imaging and other forms of analysis. One of the project's goals is to visually represent cerebral abnormalities such as schizophrenia via comparison to a "brain atlas," a reference model of a standard brain architecture.
    Click Here to View Full Article

  • "SmartAdvice: IT Generalist Background Gives an Edge"
    InformationWeek (01/05/04); Cohen, Beth; Guibord, Alan; Taglia, Peter

    The Advisory Council (TAC) Thought Leader Beth Cohen sympathizes with an IT manager who works at a small company and feels overwhelmed by work and underappreciated by management; she comments that this feeling is common among IT generalists, but adds that their wide range of skills gives them advantages over IT specialists when they move to a larger company. Citing her personal experience, Cohen notes that IT generalists understand how systems, applications, and networks are integrated into an enterprise IT infrastructure. She points out that although IT specialists are in high demand, their concentration in one particular skill or technology will not serve them well if and when that technology becomes outdated. Cohen recommends that IT generalists broaden their skills and keep abreast of tech trends by reading IT trade publications, attending free local vendor seminars and as many local IT conferences as possible, building project-management skills, and becoming more familiar with people management. "As long as you focus on maintaining your technical edge while building your project and people-management skills, you'll ultimately remain highly employable and challenged by your work," she concludes. TAC Expert Peter Taglia writes in response to an inquiry that middleware is a tech infrastructure component with a high reassessment priority, and points out that its benefits can only be realized with the deployment of a comprehensive strategic business plan and an architecture infrastructure; changeovers require careful planning and implementation in order to ensure the establishment of a standardized software framework or "design pattern." Taglia notes that middleware is more conducive to standardization--and consumes more resources--than operating systems and databases. His advice is to re-evaluate infrastructure software before pursuing other infrastructure projects such as Web services, grid computing, enterprise portal frameworks, or process automation.
    Click Here to View Full Article

  • "EMMA: W3C's Extended Multimodal Annotation Markup Language"
    Speech Technology (12/03) Vol. 8, No. 6, P. 6; Larson, James A.

    The W3C Multimodal Working Group recently released an important first draft of its Extended MultiModal Annotation markup language (EMMA), which will allow different multimodal computer input systems to work synchronously. An EMMA generator built into speech recognition, handwriting recognition, keyboard-and-mouse, and other user-input recognizers/interpreters would convert modality-specific information into a common EMMA representation. Users, applications, and other back-end systems would receive an integrated picture of the input; for example, a user could say "Zoom in here" while circling an area with a pen, and the computer would be able to zoom in on the specified area. So far, EMMA uses three types of descriptions for user-entered information: the data model, which contains command names and parameter values; the instance data, which details the parameter values; and the metadata, which annotates the instance data. Examples of EMMA metadata include confidence values assigned to interpreted spoken and handwritten input. Other EMMA-defined concepts include interpretation, derived-from information, process information, language used, timestamps, medium, mode, and function, among others. EMMA is still evolving, and the W3C Multimodal Working Group is soliciting practitioner feedback on matters such as the stated concepts and the integration of metadata; once the specification is complete, developers will be able to easily create multimodal platforms by choosing best-of-breed input components. The single, integrated representation of multimodal user input will also be useful for dialog managers, inference engines, and other sophisticated information-processing components.
    Click Here to View Full Article
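
    To make the three layers concrete, here is a rough sketch of how the spoken command "Zoom in here" might be annotated, combining instance data (the command and its parameters) with metadata (confidence, medium, mode, timestamp). The element and attribute names are illustrative placeholders, not the working draft's actual vocabulary.

        # Hypothetical EMMA-style annotation of "Zoom in here"; element and
        # attribute names are placeholders, not the draft's actual vocabulary.
        import xml.etree.ElementTree as ET

        interp = ET.Element("interpretation", {
            "confidence": "0.87",                    # metadata: recognizer confidence
            "medium": "acoustic",                    # metadata: input medium
            "mode": "speech",                        # metadata: input mode
            "timestamp": "2004-01-09T10:15:02",      # metadata: when the input occurred
        })
        command = ET.SubElement(interp, "command", {"name": "zoom"})          # instance data
        ET.SubElement(command, "parameter", {"name": "direction", "value": "in"})
        ET.SubElement(command, "parameter", {"name": "location", "value": "pen-gesture-1"})

        print(ET.tostring(interp, encoding="unicode"))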

  • "Raise You 50..."
    New Scientist (01/03/04) Vol. 180, No. 2428, P. 64; Wilson, Clare

    Poker is thought by many to be a game that computer programs cannot possibly win against seasoned opponents, given all the psychological, instinctual, and mathematical factors that come into play. Nevertheless, researchers at the University of Alberta in Edmonton are working on poker-playing programs that can best human champions, and such a breakthrough could lead to the design of more intelligent computers that can extrapolate the best course of action from imperfect data. The Edmonton team's first poker program, Sparbot, plays limit Texas hold'em and employs game theory as its chief operating principle; however, the sheer number of possible hands and betting sequences makes the calculations difficult, so Sparbot categorizes its hand into one of seven groups according to its strength in the poker ranking table. Members of the public were invited to try their luck against Sparbot online last year, and the computer generally beat most players, coming out an average of more than one dollar ahead per hand. Sparbot lead designer Darse Billings has no doubt that the program was helped by the "strong emotional reactions" players experienced when Sparbot was winning. However, the program's chief weakness was uncovered when a player took Sparbot's lack of fear into consideration and played less aggressively. The Edmonton team has started work on a program designed to overcome Sparbot's inability to predict people's playing styles based on past performance; the program, Adaptibot, records data on every hand its opponent plays, but its play suffers while it is busy building that opponent model. The team is now mulling a hybrid of Sparbot and Adaptibot that employs both game theory and opponent modeling.
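
    The hand-abstraction step mentioned above, grouping hands into a small number of strength buckets so the game-theoretic computation stays tractable, can be sketched as follows; this is a generic illustration, not Sparbot's actual categorization.

        # Generic sketch of hand abstraction: collapse a hand-strength estimate in
        # [0, 1] into one of seven buckets so the game tree stays manageable.
        # An illustration only, not Sparbot's actual grouping.

        NUM_BUCKETS = 7

        def bucket(hand_strength):
            """Map a strength estimate (0 = worst, 1 = best) to a bucket index 0-6."""
            if not 0.0 <= hand_strength <= 1.0:
                raise ValueError("hand strength must be in [0, 1]")
            return min(int(hand_strength * NUM_BUCKETS), NUM_BUCKETS - 1)

        # Strategic reasoning is then done per bucket rather than per exact hand.
        print(bucket(0.05), bucket(0.52), bucket(0.99))   # 0 3 6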

  • "IT Agenda 2004"
    Computerworld (01/05/04) Vol. 32, No. 1, P. 42; Brandel, Mary

    Premier 100 IT leaders say they will be taking a great interest in how wireless, business intelligence, Web services, and grid computing pan out in the coming year. Wireless technology is a sensible choice for organizations where mobility is a key component: Dominion Resources, for instance, plans to deploy remote meter-reading capability via radio frequency information technology and the direct transmission of cellular signals to databases by the end of next year, and is also investigating wireless LANs as a vehicle for boosting information-sharing efficiency in its offices. The University of Notre Dame, meanwhile, wants to increase students' and faculty's mobility by setting up 802.11 access points in campus buildings and Vivato switch/antenna technology in its green spaces. Notre Dame and TRW Automotive are among those hoping to enhance their competitiveness through the implementation of business intelligence. Notre Dame's goal is to have fast access to data that allows the school to expand its academic and research initiatives, while TRW wants to accelerate how quickly its sales and marketing executives can get hold of customer information culled from disparate back-end systems. Budget restrictions and other crises are prompting organizations such as the Florida Department of Children and Services to invest in Web services technology, which offers a less costly way to integrate data from diverse databases than the data warehouse approach. Grid computing has been mostly restricted to scientific research, but companies such as Minnesota Life have started to see the technology's value as a tool to help relieve some of the computational burden of their financial management systems. Most IT leaders agree that the best way to keep track of promising new technologies while balancing budgetary limitations and business pressures is to rely on trusted sources, such as advisory groups, steering committees, and domain experts.
    Click Here to View Full Article

  • "Software and the City"
    IEEE Spectrum (12/03); Guizzo, Erico

    The most advanced city modeling and simulation software currently in existence is UrbanSim, developed by researchers at the University of Washington in Seattle. The software's developers created UrbanSim to address some of the uncertainties and complexities that other urban modeling tools are ill-equipped to handle, such as the relationship between transportation and land use, the effects of public policies at the street level, and discrepancies between the choices households are likely to make and the choices they actually make. One of the things UrbanSim project director Paul Waddell wanted the tool to provide was a level of resolution that could clearly simulate the land development process as it takes place on individual land parcels, using grid cells of 150 by 150 meters that detail local populations, housing, businesses, and real estate prices; the UrbanSim model is also designed to present clear, explainable data with real-life representations of people, objects, and actions. The software simulates four central agents--households, businesses, developers, and governments--whose interactions are represented by land usage and development, and whose behavior can be extrapolated for several decades because the agents repeat their decision-making on an annual basis. One of the chief values of UrbanSim is its potential to resolve disputes and promote consensual collaboration among the different people and advocacy groups involved in city planning. The fine-grained detail furnished by UrbanSim can be problematic: The bigger the city to be simulated, the more calculations are needed to determine agents' choices. UrbanSim may also not be viable in many regions because the available data is insufficient to provide the necessary level of detail. The UrbanSim project, which resides at the University of Washington's Center for Urban Simulation and Policy Analysis, has received over $5 million in funding from the National Science Foundation.
    Click Here to View Full Article
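
    A stripped-down sketch of the simulation structure described above, four agent types making yearly choices over a grid of 150-by-150-meter cells, might look like the following; the decision rules are placeholders, not UrbanSim's actual models.

        # Stripped-down sketch of an UrbanSim-style annual loop: agents act once per
        # simulated year and the grid cells record the result. The decision rules
        # below are placeholders, not UrbanSim's actual models.
        import random

        rng = random.Random(42)
        grid = [{"households": rng.randint(0, 50),       # one dict per 150 m x 150 m cell
                 "jobs": rng.randint(0, 30),
                 "price": rng.uniform(50, 500)} for _ in range(10_000)]

        def simulate_year(grid):
            for cell in grid:
                if cell["price"] > 400 and cell["households"] > 0:
                    cell["households"] -= 1              # households avoid expensive cells
                if cell["households"] > 40:
                    cell["jobs"] += 1                    # businesses follow population
                if cell["households"] + cell["jobs"] > 60:
                    cell["price"] *= 1.02                # developers bid up busy cells
                cell["price"] = min(cell["price"], 600)  # government: a crude price cap

        for year in range(2004, 2034):                   # extrapolate three decades
            simulate_year(grid)
        print(sum(c["households"] for c in grid), "households after 30 simulated years")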

  • "It's in the Algorithms"
    Information Highways (12/01/03) Vol. 11, No. 1, P. 12; MacKinnon, Paula

    Research projects to develop more intuitive search algorithms could offer more personalized alternatives to the "group consensus" algorithms supporting Google's search engine. Kaltix's technology apparently aims to recalculate search results on a per-user basis by accelerating the computation of PageRank--a critical component of Google's search algorithm--by 30 percent, and Kaltix co-founder Taher Haveliwala proposes several personalized search algorithms in a Stanford University technical report. A Mathematics of Information Technology and Complex Systems (MITACS)-funded project in Canada is exploring the possibilities of a "focused crawler" that roams the Web following page-to-page links but only indexes pages and documents that correlate with particular subjects; the crawler could take note of how users browse online and imitate their browsing patterns. "What we're hoping to be able to do is to take the research that [MITACS is] doing and improve it in such a way that we won't have to wait for you to search," says IT Interactive Services CEO Barbara Manning. "We will be able to track your pattern of searches and return information that will be useful to you based on that pattern." Google could also face competition from several other emerging search service providers developing personalized search algorithms. IBM's WebFountain project will collate, archive, and study data from Web sites, news feeds, Weblogs, bulletin boards, and other unstructured and semi-structured sources, and will predict nascent trends and "buzz" through the use of natural language processing, statistics, probabilities, machine learning, pattern recognition, and artificial intelligence. Meanwhile, Netnose promotes itself as the first "people-powered" search engine; the voting public, not a computer algorithm, will influence search accuracy via a process in which people vote to define word-Web site matches.
    Click Here to View Full Article
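
    For readers unfamiliar with PageRank, the computation Kaltix set out to accelerate, a plain power-iteration version on a toy link graph looks like this; it is a textbook illustration, not Kaltix's accelerated method or Google's production algorithm.

        # Textbook power-iteration PageRank on a toy link graph, shown only to make
        # "the computation of PageRank" concrete. Not Kaltix's accelerated method.

        def pagerank(links, damping=0.85, iterations=50):
            """links: dict mapping each page to the list of pages it links to."""
            pages = list(links)
            n = len(pages)
            rank = {p: 1.0 / n for p in pages}
            for _ in range(iterations):
                new_rank = {p: (1.0 - damping) / n for p in pages}
                for page, outlinks in links.items():
                    if not outlinks:                     # dangling page: spread evenly
                        for p in pages:
                            new_rank[p] += damping * rank[page] / n
                    else:
                        for target in outlinks:
                            new_rank[target] += damping * rank[page] / len(outlinks)
                rank = new_rank
            return rank

        toy_web = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
        for page, score in sorted(pagerank(toy_web).items(), key=lambda kv: -kv[1]):
            print(page, round(score, 3))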


