
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 660:  Wednesday, June 23, 2004

  • "Embedding Their Hopes In RFID"
    Washington Post (06/23/04) P. E1; Krim, Jonathan

    Hailed as a revolutionary technology for tracking inventories and individual items, radio frequency identification (RFID) is now facing tough economic realities and privacy concerns. Some deployments over the past decade have been unquestionably beneficial, such as the E-ZPass system that speeds drivers through highway toll stations; other uses, such as a charter school in Buffalo affixing RFID tags to students to take attendance, have created unease. Retailing is seeing the fastest movement on RFID, with large buyers such as Wal-Mart, Target, and Albertson's setting deadlines for their top suppliers: Those deployments will focus on the pallet and case level, and so will avoid the controversy item-level tracking has caused. Benetton Group and Gillette have faced boycotts for testing RFID tracking of individual products, and some of the fear surrounding RFID may come from early boosterism on the part of RFID researchers and suppliers. Even the now disbanded MIT program dedicated to RFID research said its mission was to "create a single global technology that will enable computers to identify any object, anywhere, automatically." Similar sound bites are now coming from RFID opponents, such as Consumers Against Supermarket Privacy Invasion and Numbering founder and privacy activist Katherine Albrecht, who worries that supermarkets and retailers could sign agreements with carpet manufacturers, for example, to track how consumers use products in their homes. More than 40 public-interest groups have signed a joint agreement demanding that consumer-facing RFID applications adhere to strict guidelines, such as enabling consumers to disable item tags, but Gartner RFID analyst Jeff Woods notes that the per-chip price of RFID tags is still too high to allow the type of ubiquitous deployment many worry about. RFID readers are also currently limited to a 20-foot range, and signals are affected by moisture and nearby metals.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Internet Ups Power Grid IQ"
    Technology Research News (06/23/04); Bowen, Ted Smalley

    Most buildings are now wired into the Internet, which makes possible programmable buildings that dynamically adjust electric power consumption according to infrastructure, economic, and environmental conditions. A system developed by Lawrence Berkeley National Laboratory researchers could be a key enabling technology. The system was put through its paces in a two-week test that demonstrated the feasibility of using Internet-based price broadcasting to adjust power consumption according to prearranged price margins: The management systems of five buildings in California responded to price changes within two minutes, explains Mary Ann Piette with Berkeley Lab's Environmental Energy Technologies Division. She adds that such a system could reduce business and consumer power usage and avert power outages, noting that similar setups could adjust power consumption based on other variables, such as brownout prevention. Stephen Connors of MIT's Laboratory for Energy and the Environment reports that the systems could be modified to respond to fluctuating air quality, tap sustainable energy sources, and operate within emissions trading schemes, among other things. The building management software was customized for use with Internet-based demand-response systems under the aegis of California Energy Commission-funded initiatives. The Berkeley scientists devised an XML blueprint that built upon the work of Infotility, an energy market information software and Web services provider. Among the technical challenges the researchers faced were correctly configuring Web services at the five test sites, deploying measurement systems to collect data on the buildings' reductions in power demand, and characterizing systems with scant documentation.
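    One way to picture the price-response scheme described above is as a simple tiered controller: a building receives a broadcast electricity price and sheds a prearranged fraction of load as price thresholds are crossed. The Python sketch below is purely illustrative; the thresholds, shed fractions, and names are assumptions, not details of the Berkeley Lab system.

        from dataclasses import dataclass

        @dataclass
        class PriceTier:
            threshold: float      # $/kWh at which this tier activates
            shed_fraction: float  # fraction of curtailable load to drop

        # Prearranged price margins (illustrative values, not Berkeley Lab's).
        TIERS = [
            PriceTier(threshold=0.10, shed_fraction=0.00),  # normal operation
            PriceTier(threshold=0.30, shed_fraction=0.25),  # elevated price: trim HVAC
            PriceTier(threshold=0.75, shed_fraction=0.50),  # critical price: deep cuts
        ]

        def planned_shed(price: float) -> float:
            """Return the fraction of curtailable load to shed at a broadcast price."""
            shed = 0.0
            for tier in TIERS:
                if price >= tier.threshold:
                    shed = tier.shed_fraction
            return shed

        for price in (0.08, 0.35, 0.90):
            print(f"price ${price:.2f}/kWh -> shed {planned_shed(price):.0%} of load")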
    Click Here to View Full Article

  • "Software Industry Seeking New Ways to Fight Piracy"
    Investor's Business Daily (06/22/04) P. A4; Bonasia, J.

    The software industry has been attempting to counteract digital piracy through education and technological measures, but the results have been uneven. Business Software Alliance (BSA) VP Bob Kruger says program-sharing employees at small and midsize firms are chiefly responsible for the rampant spread of software piracy, which costs the industry $13 billion annually, by BSA estimates. The industry's anti-piracy tactics have evolved from unwieldy "dongles" to serial numbers that verify licensed users online when a new program is activated, but Autodesk government affairs director David Crane believes the optimum solution is a greater emphasis on education and anti-piracy enforcement. The nonprofit BSA raises public awareness of digital piracy through representation at industry events, offices, and schools, and via notices and advertisements; in addition, people can report on their current or former employers through a BSA Web site or a toll-free hot line. If companies are not complying with software license terms, BSA fires off a letter of warning to the CEO, and then may request a court order for a surprise software audit if the company remains noncompliant. "We want to bring these companies into the fold of responsible software users," says Kruger. Perpetrators of organized black-market digital piracy may also face the wrath of the Justice Department: Two years ago, John Sankus Jr., chief architect of the notorious DrinkOrDie software piracy ring, received a prison sentence of 46 months. Kruger says such incidents can serve as reminders to corporate tech managers of the importance of software license enforcement.
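    The online activation mentioned above can be illustrated with a toy server-side check: the installer submits its serial number, and the vendor verifies that the serial is licensed and still has activation seats available. Everything below, including the serial, seat counts, and in-memory store, is a hypothetical sketch, not any vendor's actual protocol.

        LICENSED_SEATS = {"K7Q2-9XEL-44AB": 5}  # serial number -> seats purchased
        activations: dict[str, int] = {}        # serial number -> seats activated

        def activate(serial: str) -> bool:
            """Approve activation if the serial is licensed and under its seat limit."""
            if serial not in LICENSED_SEATS:
                return False  # unknown serial: pirated or mistyped
            if activations.get(serial, 0) >= LICENSED_SEATS[serial]:
                return False  # seat limit hit: copy shared beyond its license
            activations[serial] = activations.get(serial, 0) + 1
            return True

        print(activate("K7Q2-9XEL-44AB"))  # True: first of five seats
        print(activate("BOGUS-SERIAL"))    # False: not a licensed serial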

  • "Spam-Sending PCs Could Be Kicked Offline"
    MSNBC (06/22/04); Sullivan, Bob

    The Anti-Spam Technical Alliance, which counts Yahoo!, AOL, Earthlink, and Microsoft among its members, released a set of recommendations on June 22 for halting the proliferation of junk email. One of the recommendations calls for ISPs to cut email service for any users whose computers have been turned into "zombie" spam-launching platforms, even if they are unaware that their systems have been hijacked. MessageLabs.com estimates that almost two-thirds of all spam is sent by zombie systems, while AOL believes that figure could be closer to 90 percent. MessageLabs' Brian Czarny doubts that ISPs would be able to suspend service for so many users, given the massive volume of customer service calls they would be inundated with; a more realistic expectation is for the firms to restrict outgoing emails to 100 or 500 per day, and then notify users that their machines must be purged before they can send any more messages. MessageLabs researchers have also determined that spammers are increasingly personalizing spam by monitoring recipients through spyware programs--in fact, a recent Earthlink survey estimates that one-third of all Net-linked computers are infected with spyware. More accurately identifying actual email senders is another priority of the Alliance, and among its proposals for reaching this goal is restricting the number of emails spam purveyors can send, if not shutting off their email altogether. "It's much the way a credit-card company would look for...suspicious spending on your credit card and either contact you or secure your account immediately," explains AOL director of anti-spam operations Carl Hutzler. Earthlink chief architect Robert Sanders argues that deactivating consumers' email benefits them, since their PCs are already contaminated by malware.
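    The outbound cap Czarny describes amounts to per-account, per-day rate limiting. Below is a minimal sketch, assuming an in-memory counter and a 500-message quota; a real ISP would enforce this inside its mail infrastructure, and all names here are invented.

        from collections import defaultdict
        from datetime import date

        DAILY_QUOTA = 500              # per-account cap (100 or 500 in the article)
        sent_today = defaultdict(int)  # (account, day) -> messages sent

        def try_send(account: str) -> bool:
            """Count one outgoing message; False means the daily cap was reached."""
            key = (account, date.today())
            if sent_today[key] >= DAILY_QUOTA:
                return False  # block and notify the user to disinfect the machine
            sent_today[key] += 1
            return True

        zombie = "customer-123"
        results = [try_send(zombie) for _ in range(DAILY_QUOTA + 1)]
        print(results.count(True), "sent;", results.count(False), "blocked")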
    Click Here to View Full Article

  • "Oxygen Burst"
    Boston Globe (06/21/04); Weisman, Robert

    MIT's Project Oxygen is making progress on pairing computer technology with human needs: Launched in June 2000 by Laboratory for Computer Science director Michael Dertouzos, who died one year later, Project Oxygen has stayed on course with its original goal, according to current project leader Victor Zue. MIT recently showcased new technological innovations to emerge from the Project Oxygen effort, including a reconfigurable chip called Raw that can quickly adapt to specialized tasks by downloading software instructions. An Oxygen Kiosk network, dubbed OK-Net, is meant to be a building-specific repository of information about work going on inside. Web-crawling software agents keep workers informed about the most up-to-date information, such as meeting times and project deadlines. A network of distributed sensors, called Crickets, is meant to track the location of autonomous robots without a fixed reference point. Project Oxygen also displayed the world's largest microphone array to date, which uses 1,020 microphones, the Raw chip, and a 3D tracking camera to isolate a single conversation or voice in a room; MIT officials plan to incorporate the microphone array into their new Stata Center building's auditorium. Also gathered at the MIT meeting were Project Oxygen corporate sponsors, a relatively small group of companies that has contributed a disproportionately large share of the project's funding. Insiders say Project Oxygen is a potential model for future corporate-sponsored research projects in that it focuses on building an environment and meeting needs rather than on product innovation alone: Nokia strategic planning director Juha Yla-Jaaski says many Project Oxygen goals, such as voice control and improved user interfaces, are also Nokia research goals. Hewlett-Packard research director Frederick Kitson says investing heavily in university projects is beneficial in that it is easier to develop standards inside a university than outside one.
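    The article does not say how the 1,020-microphone array isolates a single voice; the textbook technique for that task is delay-and-sum beamforming, sketched below with NumPy under invented geometry and signals. Treat it as an illustration of the principle, not MIT's implementation.

        import numpy as np

        SPEED_OF_SOUND = 343.0  # meters per second

        def delay_and_sum(signals, mic_positions, source, sample_rate):
            """Align each microphone channel on a target location and average.

            signals: (n_mics, n_samples) array of recordings
            mic_positions: (n_mics, 3) coordinates in meters
            source: (3,) target location in meters
            """
            distances = np.linalg.norm(mic_positions - source, axis=1)
            # Sound from the target reaches nearer mics first; shift each channel
            # so the target's wavefront lines up across all channels.
            delays = (distances - distances.min()) / SPEED_OF_SOUND
            shifts = np.round(delays * sample_rate).astype(int)
            n = signals.shape[1] - shifts.max()
            aligned = np.stack([sig[s:s + n] for sig, s in zip(signals, shifts)])
            # The target's speech adds coherently; off-target sound averages down.
            return aligned.mean(axis=0)

        # Tiny demo: four mics in a line, a 440 Hz tone from two meters away.
        rate = 16000
        t = np.arange(rate) / rate
        mics = np.array([[i * 0.5, 0.0, 0.0] for i in range(4)])
        src = np.array([0.0, 2.0, 0.0])
        d = np.linalg.norm(mics - src, axis=1)
        signals = np.stack([np.sin(2 * np.pi * 440 * (t - di / SPEED_OF_SOUND)) for di in d])
        print(delay_and_sum(signals, mics, src, rate).shape)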
    Click Here to View Full Article

  • "Apple Tapped for Defense Dept. Research Supercomputer"
    IDG News Service (06/22/04); McMillan, Robert

    Apple Computer yesterday announced an agreement with Colsa, a U.S. military contractor, to build a $5.8 million, 1,566-microprocessor supercomputer capable of 25 TFLOPS for aero-thermodynamic simulation. The Multiple Advanced Computers for Hypersonic Research (MACH 5) will be employed by the Army's Missile Research, Development, and Engineering Center, and would earn a place just behind Japan's 35.9 TFLOPS Earth Simulator if it were ranked on the Top500 list of the world's fastest computers today, according to Colsa. Meanwhile, Virginia Polytechnic Institute and State University's Apple-based supercomputer, which earned the No. 3 spot on last November's Top500 list, did not make the latest version of the list. The university failed to submit a benchmark because it is currently occupied with assembling a new machine based on Apple's dual-processor Xserve G5 rack-mounted server, explains Apple server and storage hardware director Alex Grossman. The Xserve G5 will be used in Colsa's MACH 5, which should run the Top500 Linpack benchmark in time for the November ranking, says Colsa executive VP Antony DiRienzo. Some supercomputer users doubt the high-performance computing capability of Apple systems, beyond running benchmarks. "The science impacts of these systems still haven't been demonstrated, and the fact that they disappeared from the most recent Top500 list tells me that the first system didn't work or it was put together solely for Linpacking, which isn't a useful measure of a supercomputer," contends Scott Studham of Pacific Northwest National Laboratory's Molecular Science Computing Facility. Jeff Nichols with Oak Ridge National Laboratory notes that the prolonged downtime of the Virginia Tech system casts doubt on Apple's ability to transition clients to new high-performance computers.
    Click Here to View Full Article

  • "I.B.M. Decides to Market A Blue Streak of a Computer"
    New York Times (06/21/04) P. C3; Lohr, Steve

    IBM's Blue Gene supercomputer was originally designed for protein-folding research, and now variants of the computer cluster are being readied for nuclear weapons simulation and commercial applications. The fastest machine in the world is the NEC Earth Simulator in Japan, a massive and expensive vector supercomputer that uses custom silicon processors; a Blue Gene/L being prepped for the Lawrence Livermore National Laboratory, in contrast, is a cluster configuration of standard microprocessors whose cost, size, and power consumption are a mere fraction of the Earth Simulator's, while its speed is expected to surpass the Japanese machine's by a factor of nine. The Blue Gene/L consumes even less power than other cluster designs because it uses microprocessors IBM provides for video game consoles. Other factors that must be considered in the design of supercomputers include peak performance, floor space, and the distribution of complex scientific applications across vast numbers of microprocessors. Livermore lab researcher Michel McCoy remarks, "With Blue Gene/L, IBM has addressed all of these issues." The Livermore Blue Gene/L will harness the power of 131,000 chips to enhance nuclear-blast models, and its expected cost is under $100 million, less than a third of the Earth Simulator's price tag. The commercial version of Blue Gene/L could find customers in the oil, investment banking, and pharmaceutical industries, to name a few possibilities. Two Blue Gene/L prototype systems currently hold the No. 4 and No. 8 spots in the biannual ranking of the world's 500 fastest computers.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Surfing In the Dark"
    SiliconValley.com (06/21/04); Ha, K. Oanh

    Despite a federal mandate that disabled users must be able to access government Web sites and those of government suppliers, inaccessible sites still outnumber accessible ones. The World Wide Web Consortium's (W3C) Web Accessibility Initiative director, Judy Brewer, explains that accessibility solutions that benefit the visually impaired can often help people with other handicaps. Still, online obstacles are particularly frustrating to blind users who rely on systems that convert online data into audio. For instance, screen reader software can be thrown off by a site's graphical content and interpret PDF files and Flash movies as unreadable, while page links may lack adequate description. Screen readers can also be interrupted by pop-up ads, and their usefulness for reading email exposes blind users to more annoying spam messages as well. Furthermore, secure sites often require users to enter a password displayed on the screen, but screen readers are incapable of recognizing these confirmation passwords. Brewer says that many of these problems can be solved through the widespread adoption of common accessibility standards, such as those the W3C promotes. She adds that businesses must come to realize that supporting universally accessible products makes sound financial sense.
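    A concrete instance of the screen-reader problem is an image with no alternative text, which leaves the reader nothing to announce. The sketch below, a hypothetical checker built only on Python's standard library rather than any W3C tool, flags such images in a page's markup.

        from html.parser import HTMLParser

        class AltTextChecker(HTMLParser):
            """Collect the src of every <img> tag lacking alt text."""

            def __init__(self):
                super().__init__()
                self.missing = []

            def handle_starttag(self, tag, attrs):
                if tag == "img":
                    attr_map = dict(attrs)
                    if not attr_map.get("alt"):  # alt absent or empty
                        self.missing.append(attr_map.get("src", "<no src>"))

        checker = AltTextChecker()
        checker.feed('<img src="logo.gif"><img src="chart.png" alt="Q2 sales chart">')
        print(checker.missing)  # ['logo.gif'] -- only the unlabeled image is flagged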
    Click Here to View Full Article

  • "Open Source as Weapon"
    InternetNews.com (06/18/04); Kuchinskas, Susan

    Major software makers are jumping on the open source bandwagon, but not because sharing produces better code or more value for customers; companies are using open source tactics to reduce the value of their competitors' proprietary offerings while increasing the market for their own add-on services or software products. IBM has led in open source development for some time, and in January 2004 even reorganized its Eclipse developer framework effort as an independent nonprofit. Other notable open source moves among major software companies include Sun Microsystems' willingness to open Solaris code and Novell's acquisition of major Linux distributions. Even Microsoft is getting in the game with its Shared Source Licensing programs and the recent donation of the Windows Template Library to SourceForge. Business software book author Martin Fink notes that while commercial software normally loses value over time, the release of open source alternatives dramatically speeds that devaluation. IBM was getting squeezed by the open-source Apache Web server and Microsoft's server offerings in the late 1990s; IBM abandoned its own server software for Apache and began selling WebSphere middleware to run atop the Apache platform. RealNetworks has also used open source licensing as a way to maneuver against the Microsoft behemoth: RealNetworks' Helix DNA Project is the only way Real can keep from being out-engineered by Microsoft, which has vastly more resources at its disposal, says Helix technology general manager Kevin Foreman. The Helix program permits OEMs and ISVs, in addition to independent developers, to create versions of the Real mobile phone platform for hundreds of different handsets. The company would have no other way to reach such a diversified market, Foreman says.
    Click Here to View Full Article

  • "Printable Silicon for Ultrahigh Performance Flexible Electronic Systems"
    ScienceDaily (06/18/04)

    Using funding from the Defense Advanced Research Projects Agency and the U.S. Energy Department, scientists at the University of Illinois at Urbana-Champaign have demonstrated a technique for fabricating mechanically flexible thin-film silicon transistors that yield ultrahigh performance. U of I materials science and engineering professor John Rogers notes that the size of silicon wafers limits conventional silicon devices, and rather than increasing the size--and the price--of the wafer, the researchers elected to divide the wafer and distribute it as desired on large, cheap substrates such as flexible polymers. The demonstration involved the use of traditional lithographic patterning and etching methods to assemble single-crystal, nanoscale silicon objects, which were then transferred to substrates to form thin-film transistors. Ralph Nuzzo of U of I's Frederick Seitz Materials Research Laboratory explains that two approaches were used to effect the transfer: One approach involved the objects being scattered in a solvent and then cast via solution-based printing, while the other used high-resolution rubber stamps. Keeping the processing of the silicon and the fabrication of other transistor elements separate allows the devices to be combined with a wide spectrum of materials. The technique could be used to build new consumer electronics products such as cheap, wall-sized displays and low-cost radio frequency identification tags. This could support the incorporation of electronic intelligence into ordinary objects, which Nuzzo says could revolutionize communication and information exchange.
    Click Here to View Full Article

  • "Memo to Steve Jobs: Give iPod Color!"
    Technology Review (06/04); Hellweg, Eric

    Eric Hellweg writes that Apple Computer's iPod music player, despite its stylish design and easy-to-use controls, suffers from a major drawback that threatens the product's growth potential: an unappealing display with a primitive, monochromatic font. If Apple wishes to sustain its leadership in the digital music arena, it should augment the iPod with a color display as well as offer non-music applications. Hellweg notes that there is certainly demand for the latter, citing a recent Wall Street Journal report about the increasing ranks of programmers who have authored maps, games, and other non-music applications for the iPod. Several developments clearly indicate that it is in Apple's best interests to expand the iPod's usability, most notably Toshiba's June 2 announcement that it boosted the storage capacity of the 1.8-inch hard drive the iPod uses from 40 GB to 60 GB, and the market entry of new digital music rivals such as Sony's Connect online music store and an upcoming digital music service from Microsoft. Inside Digital Media analyst Phil Lee says the storage increase represents "more music [storage] than anyone will ever need," which makes non-music applications not only supportable but desirable. Meanwhile, increasingly sophisticated mobile phones with greater storage space could also endanger the iPod's dominance. Hellweg writes that the introduction of a color display and a more spacious hard drive need not imply a move away from music applications: such enhancements could allow iTunes customers to retrieve an image file of the album cover art, or view videos or video segments of the songs they download, for example.
    Click Here to View Full Article

  • "When Will Wireless Hit WiMax Speeds?"
    IDG News Service (06/17/04); Lawson, Stephen

    Companies offering services that use DSL, cable modem, or leased lines have had little to fear from wireless broadband services, whose wide-scale implementation has been hindered by costs, complexity, and a lack of interoperability due to proprietary systems. But vendors and analysts think that WiMax-based technologies could harmonize the broadband industry and trigger price reductions critical to the proliferation of high-speed Internet connectivity. The first WiMax rollout will employ the IEEE 802.16d standard and support links to fixed locations at speeds ranging from 300 Kbps to 2 Mbps across a maximum range of 30 miles, while a mobile version of WiMax based on the 802.16e specification could be ready in about 12 months. The WiMax Forum expects to start certification of interoperable WiMax products by the end of the year, while Intel anticipates a surge in WiMax's popularity comparable to the accelerated ramp-up of Wi-Fi. At the same time, WiMax's growth will partly hinge on resolving issues related to its use of radio spectrum inside and outside the United States, as well as on how it fares against competing technologies. WiMax standards permit any frequency band between 2 GHz and 11 GHz, and Francois Draper with the WiMax Forum reports that the organization is streamlining that range by profiling specific spectrum bands. The group is preparing three bands, in the vicinity of 5.8 GHz, 3.5 GHz, and 2.5 GHz; the Forum is ready to announce a working group to push for the global unification of the bands' management as well as to promote the allotment of spectrum in lower bands. Vendors and industry analysts agree that licensed frequencies are essential to enabling the delivery of business-class WiMax services.
    Click Here to View Full Article

  • "A Confederacy of Smarts"
    Scientific American (06/04) Vol. 290, No. 6, P. 40; Stix, Gary

    Microsoft has assembled some of the best minds in computing within its research division, yet the relatively new unit has yet to prove itself by changing today's computing paradigm for the better. Despite its intellectual clout, Microsoft is seemingly reluctant to fundamentally change computing in order to eliminate viruses and other security risks, for example; "For some reason, they haven't been tackling some of the most fundamental problems, and I'm confused by that," says former Xerox Palo Alto Research Center (PARC) director John Seely Brown. Microsoft's research effort took off at a time when other major IT laboratories were moving away from pure research: The Xerox PARC community had famously failed to capitalize on major computer innovations, partly because of the distance and cultural difference between the company's researchers and product development teams. The first of Microsoft's 700 researchers were located in the company's Redmond, Wash., headquarters, though satellite laboratories have since been set up in Beijing, San Francisco, Silicon Valley, and Cambridge, England. Former Carnegie Mellon University professor Richard F. Rashid now manages the research organization and says there are no budgets and little bureaucracy. He notes that the research group has dispelled the idea that Microsoft mainly builds on the innovations of others, since Microsoft is now leading in a number of computer science fields, including computer vision, graphics, and machine translation. CHI Research ranked Microsoft higher than any other company in "science linkage," meaning its awarded patents cite published scientific papers more often than those of any other company. IBM Research communications director Bill O'Leary says Microsoft has yet to build close enough relationships between its researchers and developers, while Hewlett-Packard Laboratories director Dick Lampman says, "I see individual islands of excellence but nothing that's moved the needle for Microsoft."
    Click Here to View Full Article

  • "Location! Location! Location!"
    Better Software (06/04) Vol. 6, No. 5, P. 32; Kolawa, Adam

    Automated Error Prevention (AEP) promises to improve software quality, make teams more productive, lower development costs, and shorten time to market and deployment, while transferring the responsibility for finding construction errors from quality assurance (QA) to developers. AEP offers an architecture for elevating the software industry and the software development process to the heights of maturity reached by other manufacturing industries. Five processes supported by AEP work together to prevent software bugs and improve development: detecting errors, isolating their causes, pinpointing where each error was introduced into the process, deploying practices to thwart error recurrence, and monitoring for improvements. When AEP fails to achieve the desired results, the cause can usually be attributed to inadequate management, architect, or QA support; teams being overwhelmed by hastily instituted AEP practices; or a shortage of buy-in to the importance of error prevention. The architect and QA team members must be willing to widen the scope of their AEP-defined roles: In addition to designing the system, the architect must guarantee that the system is properly constructed and operates correctly, and be highly skilled in writing code, determining where code problems can crop up, finding algorithmic solutions for problems, and abstracting a higher-level comprehension of code operations from technical details. QA team members, upon finding an error, must look for any related errors and abstract a general error from them, as well as attempt to understand each error's root cause by having a deep knowledge of the system's overall architecture and of the architectural component related to the error. By understanding the statistics that underlie measuring, stabilizing, and managing processes, and by recognizing whether a process is affected by a special cause or by normal variation, QA members can improve the development process and track those improvements.
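    The closing point about process statistics can be made concrete with a Shewhart-style control chart: limits are computed from a baseline period, and later measurements outside mean plus or minus three sigma are treated as special-cause events worth investigating. The weekly defect counts below are invented for illustration, not drawn from the article.

        from statistics import mean, stdev

        baseline = [12, 9, 14, 11, 10, 13, 12, 11]  # weekly defect counts, stable period
        center = mean(baseline)
        sigma = stdev(baseline)
        upper = center + 3 * sigma
        lower = max(center - 3 * sigma, 0)

        for week, count in enumerate([12, 15, 31], start=9):
            if lower <= count <= upper:
                verdict = "common-cause variation"
            else:
                verdict = "special cause: find the root cause"
            print(f"week {week}: {count} defects -> {verdict}")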

  • "Smart Sensors to Network the World"
    Scientific American (06/04) Vol. 290, No. 6, P. 84; Culler, David E.; Mulder, Hans

    Integrating simple computers with sensors, batteries, and radio transceivers has given birth to minuscule "motes" that can self-organize into perceptive networks whose applications range from wildlife and ecosystem monitoring to factory maintenance to emergency management, provided that such devices are designed for low power consumption, affordability, portability, unobtrusiveness, and longevity. Each individual mote can collate and analyze sensor readings independently, but can also wirelessly connect to its neighbors and collectively perform tasks outside the computational abilities of conventional computer systems. Motes are cheap enough to distribute in great numbers, which enables the networks they form to gather detailed data about the environment as well as perform reliably even if some motes malfunction or fail. A key design criterion for mote technology is power conservation: Motes are required to spend about 99 percent of the time in a standby mode in which only a few millionths of a watt are consumed; power-saving techniques include deactivating unneeded resources in response to a predetermined signal, storing and aggregating sensor readings, compressing data, and summarizing sensor logs. Motes' ability to self-network is critical in deployments where their configuration changes constantly. Each mote requires an operating system and an application program that can be divided among nodes to effect mote-to-mote communications, and the modular, power-efficient TinyOS operating system fulfills this need. As more and more sensor nodes are added to a network, programming and debugging the network becomes a challenge; solutions include TinyDB software and "viral" upgrades. Computer scientists are working on methods to gauge the health of a perceptive network by disrupting its operations and measuring its responses.
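    The 99 percent standby figure is easy to quantify: a mote's average draw is the duty-cycle-weighted mix of its active and sleep power. The numbers below are illustrative assumptions, not the specification of any particular mote.

        SLEEP_W = 5e-6     # a few microwatts in standby
        ACTIVE_W = 60e-3   # tens of milliwatts with CPU and radio on
        DUTY = 0.01        # awake 1 percent of the time

        avg_w = DUTY * ACTIVE_W + (1 - DUTY) * SLEEP_W

        # Two AA cells at roughly 2500 mAh and 1.5 V each, converted to joules.
        battery_j = 2 * 1.5 * 2500e-3 * 3600

        years = battery_j / avg_w / (86400 * 365)
        print(f"average draw: {avg_w * 1e6:.0f} microwatts")
        print(f"estimated lifetime: {years:.1f} years")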

    David Culler was co-guest editor of a special section on wireless sensor networks in the June 2004 issue of Communications of the ACM. That issue is freely available for the next month.
    Click Here to Access Issue

  • "Proprietary to Open: Middleware Evolves"
    CIO (06/15/04) Vol. 17, No. 17, P. 89; Paul, Lauren Gibbons

    The importance of middleware to CIOs is growing significantly as the software makes the transition from proprietary schemes to open, flexible standards with the help of XML and Web services. More and more CIOs need to integrate increasingly dissimilar systems, and this has led to a fracturing of middleware into sub-classes such as business process management, service-oriented architecture (SOA), and enterprise service bus. AMR Research's Eric Austvold describes first-generation middleware as an "IT-centric" technology that was too costly, inflexible, and excessively difficult to use and learn. But middleware became more useful and affordable with the emergence of the Internet and a parade of data transmission and access standards (HTTP, SOAP, and XML), culminating in Web services. Rather than gluing applications together with tenuous connections, middleware now aims to rapidly and simply integrate components into "composite applications" on an as-needed basis. Middleware is being used not only to boost the efficiency of end users, but also to streamline maintenance chores and cut costs. Advanced middleware offerings such as MetaStage from Ascential and Pantero's Shared Data Services tool can perform data mapping across systems so that different entities throughout the enterprise can use a unified representation of data. CompuCredit CIO Guido Sacchi opted for an SOA based on Software AG middleware to establish compatibility between disparate systems and databases; all users view the same data thanks to an XML meta-data repository. "Connecting our systems via standards shrinks our time-to-market," notes Sacchi, while other business benefits include improved customer service, better data for agents, and more precise customization of product offerings.
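    The data-mapping idea is simple to sketch: two systems name the same customer fields differently, and a middleware layer maps both into one shared XML representation so every consumer sees an identical structure. The field names and mapping below are hypothetical, not the MetaStage or Pantero schemas.

        import xml.etree.ElementTree as ET

        CRM_TO_SHARED = {"cust_nm": "name", "cust_ph": "phone"}
        BILLING_TO_SHARED = {"customer_name": "name", "telephone": "phone"}

        def to_shared_xml(record: dict, mapping: dict) -> ET.Element:
            """Map one system's record into the shared <customer> representation."""
            customer = ET.Element("customer")
            for src_field, shared_field in mapping.items():
                ET.SubElement(customer, shared_field).text = record.get(src_field, "")
            return customer

        crm_rec = {"cust_nm": "Acme Corp", "cust_ph": "555-0100"}
        bill_rec = {"customer_name": "Acme Corp", "telephone": "555-0100"}

        for rec, mapping in ((crm_rec, CRM_TO_SHARED), (bill_rec, BILLING_TO_SHARED)):
            print(ET.tostring(to_shared_xml(rec, mapping), encoding="unicode"))
        # Both systems' records emit the same <customer><name>... element.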
    Click Here to View Full Article

  • "Fuzzy Logic and Neural Nets: Still Viable After All These Years?"
    EDN Magazine (06/10/04) Vol. 49, No. 12, P. 69; Prophet, Graham

    The high profile of fuzzy logic and neural networks has waned because the systems failed to demonstrate a clear enough performance advantage over established approaches, and also because they do not integrate well with traditional logic-based thinking. Yet both fuzzy and neural techniques are being employed by companies to design and construct innovative systems that address complicated, obstinate, and nonlinear control problems. Fuzzy logic captures verbally expressed data and allows computations to be carried out with the information, while neural nets are designed to simulate people's intuitive comprehension of complex processes without modeling them algorithmically. Although many of the companies that emphasized fuzzy and neural methods died when the techniques fell out of favor, others have survived by converting into software design and consultancy outfits. These organizations' chief application of neural and fuzzy approaches involves software simulation running on conventional computers for the purposes of financial services, financial modeling, and data mining. Fuzzy logic is particularly amenable to general-purpose control systems with limited resources; neural nets, on the other hand, are not. Both techniques--neural nets especially--can be enhanced for adaptability, enabling control of fluctuating processes to be continually optimized.
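    A minimal sketch of the fuzzy approach, with invented temperature ranges and fan speeds: verbal categories like "cool," "warm," and "hot" become overlapping triangular membership functions, and the controller's output is a membership-weighted average of the speeds each rule suggests.

        def tri(x: float, lo: float, mid: float, hi: float) -> float:
            """Triangular membership: 0 at lo and hi, 1 at mid."""
            if x <= lo or x >= hi:
                return 0.0
            return (x - lo) / (mid - lo) if x < mid else (hi - x) / (hi - mid)

        def fan_speed(temp_c: float) -> float:
            # Degree to which the temperature is "cool", "warm", or "hot",
            # paired with each rule's suggested fan speed in percent.
            rules = [
                (tri(temp_c, 5, 15, 25), 10),    # cool -> fan around 10%
                (tri(temp_c, 20, 27, 34), 50),   # warm -> fan around 50%
                (tri(temp_c, 30, 40, 50), 100),  # hot  -> fan around 100%
            ]
            total = sum(weight for weight, _ in rules)
            if total == 0:
                return 0.0
            # Weighted average of the rule outputs (centroid-style defuzzification).
            return sum(weight * speed for weight, speed in rules) / total

        print(fan_speed(24.0))  # mostly "warm", slightly "cool": roughly 44% speed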
    Click Here to View Full Article

  • "ICANN's Crisis of Legitimacy"
    eWeek (06/14/04) Vol. 21, No. 24, P. 45; Zuck, Jonathan

    As ICANN prepares for a legal battle over its shutdown of VeriSign's SiteFinder navigation service, the Internet overseer is facing criticism for being unclear and inconsistent in its regulatory policies. VeriSign has filed a breach of contract and antitrust suit against ICANN, which decided to close SiteFinder last fall after determining that it was a danger to Internet security. Last October, ICANN asked its security committee, under the leadership of Stephen Crocker, to conduct a series of hearings investigating the SiteFinder service, but Crocker has so far failed to deliver a report on the matter. ICANN board member Vinton Cerf has indicated that a lack of clerical help may be delaying the release of the report. Critics of ICANN, ranging from national governments to private Internet companies, are questioning the quasi-private body's legitimacy and accusing it of failing to make objective and consistent decisions regarding domain names and other Internet-related issues under its jurisdiction. ICANN will have to resolve these issues soon, or it will risk losing power to another governing body, such as the United Nations.


 