
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 692: Friday, September 10, 2004

  • "Intel Calls for Internet Overhaul"
    CNet (09/09/04); Shankland, Stephen; Frauenheim, Ed

    In a Sept. 9 speech at the Intel Developer Forum, Intel CTO Pat Gelsinger cited the experimental PlanetLab network as an example of the direction the Internet needs to go if it is to be successfully upgraded to resolve issues of adaptability, reliability, and capacity. "We think the work [PlanetLab is] doing today is laying the foundation for the Internet of tomorrow," he stated. Intel and Hewlett-Packard are striving to commercialize PlanetLab to add a new layer of adaptability and intelligence services to the Internet; such services include event processing to monitor Internet activity, network mapping to organize computer links, content distribution to optimize data repositories, and Web casting to boost broadcast efficiency. Gelsinger said the Public Broadcasting Service will also employ PlanetLab to broadcast high-definition TV programs. The Intel CTO demonstrated several PlanetLab projects that could significantly benefit the Internet: The University of California at Berkeley-based Public Health for the Internet project has devised a system for monitoring and tracing the source of network attacks, while Carnegie Mellon University's End System Multicast technology aims to enhance Web casting by deploying media proxy servers to relieve the primary computer of the burden of streaming out content. Pioneering Internet developer Vint Cerf, who shared the stage with Gelsinger, said the current Internet needs much more capacity to handle the connection of tens of billions of additional devices, while the bandwidth demands of users and applications will outweigh the capabilities of the current infrastructure. Furthermore, Cerf noted that the Internet's quality of service will become less predictable as variations in network service response times increase along with the size of the Net.
    Click Here to View Full Article

  • "In Computers We Trust, Even Fallible Ones"
    IST Results (09/09/04)

    Building intrusion-tolerant networked computer systems that are reliable and secure is the goal of the IST project MAFTIA, a contender for the European Union's 1-million-euro 2004 Descartes Prize to be awarded in December. MAFTIA has dedicated three years to outlining a conceptual intrusion tolerance model and framework, designing intrusion-tolerant mechanisms and protocols, developing fundamental elements for large-scale dependable applications, and formally verifying selected components of the MAFTIA middleware using rigorous mathematical methods. "Our model decides what is meant by intrusion, how to react to it and how to bring it within a fault-tolerant system," notes project coordinator Robert Stroud of the University of Newcastle's School of Computing Science. "It also bridges the gap between security and 'dependability,' a term encompassing areas such as safety-critical systems and intrusion detection." MAFTIA facilitated collaboration between scientific communities rarely known for working together, and was the first project to follow a comprehensive strategy for building computer systems that can stand up to intentional assault. In addition, several partners are seeking collaborative industry alliances to further develop their research prototypes, while others plan to participate in follow-on European initiatives that deal with cryptography and privacy. Several dependability and survivability workshops hosted by the United States and the European Union have utilized MAFTIA research, and the Internet Domain Name Service has deployed MAFTIA technology as well.
    Click Here to View Full Article

  • "House Panel Gets Tough on Spyware, P2P Piracy"
    InternetNews.com (09/08/04); Mark, Roy

    The House Judiciary Committee has toughened its stance on peer-to-peer digital piracy and spyware with the Sept. 8 passage of the Piracy Deterrence and Education Act and the Internet Spyware Prevention Act. The former bill goes after the digital dissemination of copyrighted content "with reckless disregard for the risk of further infringement," and proposes a maximum prison sentence of three years for violators who electronically distribute 1,000 or more copyrighted materials over a 180-day period. Furthermore, the bill sets aside $15 million for the establishment of an Internet use education program coordinated by the Department of Justice (DOJ). The Spyware Prevention Act criminalizes the deliberate access of a computer without authorization as well as the intentional circumvention of authorized access, and calls for a maximum jail term of five years if the goal of such an intrusion is to support another federal crime. The legislation also calls for a prison sentence of up to two years for violators who intentionally injure or defraud a person or damage a computer by installing spyware without permission, and allocates $10 million to the DOJ to fight spyware and phishing scams. The act's approval follows the passage of an earlier spyware bill by the House Energy and Commerce Committee that requires consumer notification of spyware's presence prior to downloading software, injunctions against unfair or deceitful practices such as computer hijacking and keystroke logging, and the provision of an opt-in screen before the transmission or enablement of any data collection software by anyone who is not the owner or authorized user of a computer. Judiciary spyware bill co-sponsor Rep. Lamar Smith (R-Texas) says that his bill, unlike the Energy and Commerce version, targets bad behavior rather than technology. "At the same time, the legislation leaves the door open for innovative technology developments to continue to combat spyware programs," said Rep. Bob Goodlatte (R-Va.).
    Click Here to View Full Article

  • "System Alert: Web Meltdown"
    Independent (London) (09/08/04); Grossman, Wendy

    The Internet has in effect already "melted down," according to networking expert Lauren Weinstein and other technology experts who met recently in Los Angeles to discuss the dangers to the Internet: users cannot avoid spam and viruses, poor-quality software, and vaguely defined restrictions on how they can use their ISP accounts. Weinstein, University of Pennsylvania professor Dave Farber, and computing expert Peter Neumann convened the gathering of about 50 technology experts, and the atmosphere was pessimistic. Whereas 10 years ago technologists confidently tackled whatever fixes or workarounds were necessary to keep the Internet running, attendees at the recent gathering seemed far less sure of their ability to do so. Part of the problem is the increasing amount of regulation: ISPs restrict whether users can share their connections or use them for Web servers, entertainment industries have successfully squelched file-sharing networks such as eDonkey, ICANN remains a law unto itself, and governments around the world are eyeing telecommunications-style regulation for VoIP. Former ICANN board member and programmer Karl Auerbach says the Internet is rapidly becoming a fundamental utility, even as it is still developing and facing numerous challenges. Government, business, and regular users depend on the Internet for daily activity and core operations. Meanwhile, evidence shows that anti-virus firms are falling behind in the race to provide security solutions, and denial-of-service attacks regularly knock out or slow major sites. Internet governance law expert Michael Froomkin, however, says concern about the state of the Internet is not in itself cause for alarm; rather, it portends a radical change to fix the situation.
    Click Here to View Full Article

  • "Simple Search Lightens Net Load"
    Technology Research News (09/15/04); Patch, Kimberly

    Using funds from the National Science Foundation and the Defense Advanced Research Projects Agency, University of California at Los Angeles (UCLA) researchers have created a local-rule search mechanism that will keep searches fast even as the network grows dramatically. The search algorithm uses local rules that are found in nature, such as in insect communities, and is meant for scale-free networks such as the Internet, which are randomly formed but are characterized by few well-connected nodes and many nodes with only a few connections. The UCLA algorithm actually makes use of this arrangement by sending queries on short walks of about 100 nodes, enough to ensure they will encounter at least one well-connected node; from there, the algorithm uses bond percolation threshold probabilities to guarantee the query reaches the core of a subnetwork of well-connected nodes. The UCLA search algorithm is similar to one developed in 2001 by Stanford University and Hewlett-Packard researchers, but operates in parallel instead of linearly, says UCLA electrical engineering professor Vwani Roychowdhury. The research team has shown that their algorithm could reduce traffic in peer-to-peer file-sharing networks such as Gnutella by one or two orders of magnitude; in addition, the search algorithm can work for both structured peer-to-peer networks, where each piece of content has a unique ID, and in unstructured systems commonly found among file-sharers. The UCLA researchers are currently working on a software library that will help developers incorporate the search algorithm in their applications, and they expect the technology to be used in practical applications in about one or two years.
    Click Here to View Full Article
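
    As a rough illustration of the local-rule idea, the sketch below walks a query randomly through a scale-free overlay until it reaches a well-connected node and then probes that hub's neighbors. The graph construction, walk length, and hub-degree threshold are assumptions for illustration only; the actual UCLA algorithm additionally uses bond-percolation probabilities to reach the core of well-connected nodes.

      # Toy sketch of a local-rule search on a scale-free overlay: the query walks
      # randomly until it lands on a well-connected node, then probes that hub's
      # neighborhood. Parameters are illustrative, not the UCLA algorithm's.
      import random
      from collections import defaultdict

      def build_overlay(edges):
          graph = defaultdict(set)
          for a, b in edges:
              graph[a].add(b)
              graph[b].add(a)
          return graph

      def random_walk_search(graph, items, start, wanted, walk_len=100, hub_degree=10):
          node = start
          for _ in range(walk_len):
              if wanted in items.get(node, ()):
                  return node                            # found it on the walk itself
              if len(graph[node]) >= hub_degree:         # reached a hub: probe its neighbors
                  for neighbor in graph[node]:
                      if wanted in items.get(neighbor, ()):
                          return neighbor
              if not graph[node]:
                  return None
              node = random.choice(list(graph[node]))    # take another random step
          return None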

  • "At Your Service (or Wits' End)"
    New York Times (09/09/04) P. E1; Hafner, Katie

    Companies are saving tens of millions of dollars in labor costs by using automated agents to handle routine inquiries via speech recognition. Unfortunately, speech recognition technology can prove very frustrating when the caller's requests exceed the limited parameters the agents operate within. Reporter Katie Hafner writes that she experienced this firsthand when she tried out United Airlines' speech-driven fare shopper to inquire about a three-city flight: The agent smoothly talked her through the planned itinerary, until she indicated the conclusion of the session with the phrase "That's it," per the agent's instructions; however, the agent misinterpreted her utterance to be "Athens," backtracked, and got confused when Hafner tried to explain its error. Ron Cole of the University of Colorado's Center for Spoken Language Research says that fairly complex requests can thwart automated systems. "The systems can only work under these constrained tasks where you know what the words will be," he notes. Accents remain a major stumbling block for many systems, though researchers are attempting to overcome this obstacle by training systems on numerous word variations. Still, voice recognition interfaces have come a long way thanks to technological advances that are improving their ability to filter out background noise and nonsensical caller vocalizations, and ask opening questions that invite free-form responses, among other things. Yet Cole doubts that speech recognition technology's incremental progress will continue to the point where machines can carry on intelligent conversations, when what is needed is a research breakthrough.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Are Hackers Using Your PC to Spew Spam and Steal?"
    USA Today (09/08/04) P. 1B; Acohido, Byron; Swartz, Jon

    Since last year, infectious programs have been turning hacked PCs into zombie computers, making them send spam emails and take part in other illegal activities. Experts say the number of infected machines has reached the millions at a time when computers are more powerful and dangerous than ever. Intelguardians co-founder Ed Skoudis says there has been a sharp rise in the number of machines attacked this year, and he's "worried things will get much worse." Most hijacked computers are in homes, on college campuses, or at small businesses, and the motive for hacking has changed from challenge to profit. Experts say code writers put together networks of zombie PCs and then sell access to identity thieves, spammers, and blackmailers. Most consumers whose computers are taken over are not immediately aware of the problem. Dave Dittrich, senior security engineer at the University of Washington's Center for Information Assurance and Cybersecurity, says, "We have a large population that is easily tricked." Regulators must deal with jurisdictional problems in trying to catch suspects since many are not located in the United States, and critics say that existing laws are too weak. The situation will not change quickly, experts believe, since effecting drastic security improvements means tech suppliers would have to cooperate on universal security standards. While vendors are unlikely to move fast on their own, experts say consumer outrage could speed things up. Meanwhile, cyber security experts say law enforcement has only recently begun to focus on the problem, but it is hindered by weak laws and the sheer scale of the problem. Keith Lourdeau, deputy assistant director of the FBI's Cyber Division, says, "Hackers can do almost anything with a compromised PC, and there isn't much we can do about it."
    Click Here to View Full Article

  • "Rise of the Robot"
    Melbourne Age (09/09/04); Arthur, Charles; Manktelow, Nicole; Barrett, Peter

    Future Horizons projects that 55.5 million robots will be shipped by 2010 in a market worth over $75 billion; driving this expected trend is the falling price of software that enables the machines to adapt to diverse conditions. This will significantly boost robots' reliability, which Wow Wee toys consultant Mark Tilden notes is a critical factor to their acceptance by consumers. "The electronics industry is on the cusp of a robotics wave, a period in which applications are aimed at labor-saving and extending human skills," states Future Horizons, which estimates that 39 million of the expected 55.5 million devices will be domestic robots and 10.5 million "domestic intelligent service" robots. Japanese and American robot development efforts are concentrating on assistive household machines that can help elderly and handicapped people by noting their routines and taking corrective action in response to interruptions or diversions in those routines. However, an even larger predicted trend than assistive living robots is the "immobot" wave, in which software is embedded into appliances and other objects that can handle domestic chores; one example of this is a fridge that keeps track of its inventory and can automatically order items that need restocking from the supermarket. Researchers are divided on whether the best behavior and learning architecture for robots is a "top-down" approach in which behavior is preprogrammed or a "self-organizing system" approach whereby robots self-determine their limitations and how best to function within those constraints. Neither approach, however, will result in completely predictable behavior, which is why software that can learn is so vital. In addition, the decline of hardware and assembly costs is lagging behind that of software and materials, which could also hinder the proliferation of robots.
    Click Here to View Full Article

  • "Coffee Needs Topping Up? Computer Specks Can Tell"
    Straits Times (09/07/04); Soh, Natalie

    Five universities in Scotland are collaborating to move Speckled Computing out of the conceptual stage and into the real world. Speckled Computing envisions the distribution of thousands of minuscule computers that can network into supercomputers and link ordinary objects to the Internet. Each unit will be a cube one millimeter in size equipped with a microprocessor and detection and communications capability; University of Edinburgh lecturer D.K. Arvind compared the combined power of Speckled Computing to the collective intelligence of an ant colony. The network of computer specks can connect with the Internet or another computer in order to effect action in accordance with changing conditions: For example, specks on a coffee mug could notify the cafe's computer that the beverage is getting cold or is almost finished, and the computer in turn would send someone to top it up. Other potential applications include the distribution of specks on car seats to measure the sitter's weight so that airbag deployment can be tailored to enhance driver and passenger safety. The Speckled Computing consortium is investigating solar power as an energy source for the miniature computers, while a similar initiative, the University of California at Berkeley's "Smart Dust" project, is focusing on vibrational energy. Motorola, Agilent, Sun Microsystems, and others have contributed about $6.7 million to the Speckled Computing consortium, according to Arvind, who met with dons in Singapore to discuss the project.
    Click Here to View Full Article
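
    A minimal sketch of the coffee-mug scenario, assuming a hypothetical Speck class and alert callback rather than any real Speckled Computing interface: a speck watches its temperature readings and reports an event once the drink cools below a threshold.

      # Minimal sketch of a "speck" on a coffee mug reporting to the cafe's computer.
      # The Speck class, the threshold, and the send_alert callback are hypothetical
      # illustrations, not any real Speckled Computing interface.
      class Speck:
          def __init__(self, speck_id, cold_threshold_c=50.0, send_alert=print):
              self.speck_id = speck_id
              self.cold_threshold_c = cold_threshold_c
              self.send_alert = send_alert     # stands in for a radio hop to the cafe computer

          def on_temperature_reading(self, celsius):
              if celsius < self.cold_threshold_c:
                  self.send_alert({"speck": self.speck_id,
                                   "event": "coffee_going_cold",
                                   "temperature_c": celsius})

      mug_speck = Speck("mug-42")
      mug_speck.on_temperature_reading(43.5)   # cafe computer would dispatch someone to top it up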

  • "Princeton Research Project Bypasses Internet Shortcomings"
    Wisconsin Technology Network (09/08/04); Stitt, Jason

    Princeton University's computer science department chair Larry Peterson told attendees at a Sept. 7 seminar at the University of Wisconsin-Madison that the commercial Internet is too unfriendly to host experiments with new network architectures and protocols that could be crucial to patching the Net's vulnerabilities and shortcomings. His solution is to let researchers use PlanetLab, a distributed, Internet-independent network of 439 computers that spans all continents with the exception of Africa and Antarctica. "Because we no longer have access to [the Internet protocol], the core of the Internet, we'll just bypass it and do an end-run around the problem," Peterson explained, noting that PlanetLab loses about an order of magnitude of performance, compared with the Net. Applications running on PlanetLab aim to map out the proliferation and behavior of malware, compare a network's model behavior to its actual behavior, and measure how information packets are routed throughout the Internet, among other things. Accessing PlanetLab requires institutions to contribute computers to the network. The machines run a specially-tailored version of Linux and are coordinated by centralized administration, although Peterson said future iterations would cede more control to local administrators. The decision to operate PlanetLab outside the commercial Internet was predicated on the fact that network protocols are so deeply entrenched into commercial router equipment and other applications that experimentation on the Internet itself could only proceed if companies were willing to change their products. Peterson acknowledged at the seminar that malware running on PlanetLab constitutes a potentially catastrophic risk, but the network has safeguards that can, for example, limit specific users' bandwidth and Internet access, and theoretically trace an attack's precise point of origin.
    Click Here to View Full Article

  • "Spreading Knowledge, the Wiki Way"
    Washington Post (09/09/04) P. E1; Walker, Leslie

    Wikipedia, a free online encyclopedia assembled and edited by the site's visitors, has grown remarkably in size and popularity: The archive supports more than 340,000 English-language articles and disseminates news via a publicly authored current-events page, but its increasing acceptance has unsettled commercial encyclopedia providers, who fear declining revenues from defecting patrons. They also think Wikipedia articles may not offer the same accuracy and quality as their articles, while its openness makes it prey for vandals. Wikipedia's founders, however, believe the encyclopedia's communal nature is its biggest advantage, as it would allow chicanery to be policed and muzzled by volunteer contributors. "The more people who come to the Web site and cause problems, the more people we have who are dealing with them," contends Wikipedia founder and chief executive Jimmy Wales. Wikipedia was modeled after the community-based development of the Linux operating system, while its key enabling technology is programmer Ward Cunningham's "wiki" software, which allows groups to collaboratively build and edit Web pages via a special formatting style. The goal of Wikimedia, the nonprofit foundation developing the encyclopedia, is to give everyone around the world free access to all human knowledge, Wales explains; central to this is the planned production of print and CD-ROM versions of the encyclopedia for distribution in remote regions. Skepticism about the reliability of Wikipedia's content has been fueled by contributors' constant battle against vandalism and competing viewpoints, although Wales notes that the encyclopedia is considering implementing more rigorous editorial processes. He further opines that commercial encyclopedias could be driven out of business unless they lower costs and embrace more open principles.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
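
    The wiki idea can be illustrated with a toy formatter that turns two common wiki conventions into HTML; this is only a sketch of the concept, not the software Wikipedia actually runs.

      # Toy wiki-style formatter: '''bold''' text and [[Page]] links become HTML.
      # An illustration of the wiki idea, not Wikipedia's actual software.
      import re

      def wiki_to_html(text):
          text = re.sub(r"'''(.+?)'''", r"<b>\1</b>", text)                   # bold
          text = re.sub(r"\[\[(.+?)\]\]", r'<a href="/wiki/\1">\1</a>', text)  # internal link
          return text

      print(wiki_to_html("'''Wikis''' let anyone edit pages such as [[Encyclopedia]]."))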

  • "Industry Group Voicing Cybersecurity Concerns in Washington"
    Investor's Business Daily (09/09/04) P. A6; Howell, Donna

    Paul Kurtz, executive director of the Cyber Security Industry Alliance (CSIA), says the organization was established to give cybersecurity industry leaders "a common voice in Washington on cybersecurity policy issues." The seven-month-old CSIA aims to address such issues as cybersecurity awareness--which Kurtz says is showing signs of progress, although more improvement is needed--and the implications of regulatory measures such as Sarbanes-Oxley and the Health Insurance Portability and Accountability Act for IT security. Section 404 of Sarbanes-Oxley, which requires CEOs to affirm their financial statements, is hazy in how it relates to cybersecurity, and Kurtz notes that his organization is attempting to find and cite case studies as examples of strategies companies can employ to comply with the regulation. He explains that to comply with Section 404, firms need to track the transactions that feed into their financial statements, along with who approved and signed off on them. He says, "While the CSIA doesn't have legal authority to put down guidelines, what we can do is put together a picture of what's happening in the space, how companies are responding, and help other companies determine what to do." Kurtz says he reports to the senior executives of the CSIA's founding firms, who are eager to collaborate with other cybersecurity-focused organizations such as the Business Software Alliance and the Information Technology Association of America. He also notes that the CSIA will be pushing for increased understanding of cybersecurity issues through close collaboration with people on Capitol Hill.

  • "Critics Warn of Post-Election Problems If No Paper Trail Exists"
    Federal Computer Week (09/06/04) Vol. 18, No. 31, P. 61; Hardy, Michael

    The U.S. presidential election in November could be endlessly disputed if the vote results are close, since nearly 30 percent of voters will be using touch-screen machines, most of which are not equipped to produce a verifiable paper record. The August recall election of President Hugo Chavez in Venezuela was also held using touch-screen voting machines, and international election observers were able to verify his 59 percent win by checking paper receipts. The upcoming November election in the United States is likely to be much closer, making recount capabilities all the more important. Smartmatic, the U.S. company that supplied the voting machines in Venezuela, says its systems offer several ways to check vote accuracy in addition to paper receipts. Most direct recording electronic (DRE) machines include cross-check methods and backups, but VerifiedVoting.org leader and Stanford University professor David Dill says such techniques are beneficial, but do not highlight irregularities that can be spotted by checking with a printed paper record; furthermore, Georgetown University research associate Mark Gray says paper records also provide more confidence to the voting public, and that digital-only records make it easier to level accusations of fraud or misconduct. Although paper records are supported by state legislation in California and a proposed House of Representatives amendment to the Help America Vote Act of 2002, touch-screen DRE machines will be used in some of the critical states deciding the November election. In those cases, Dill recommends exit-polling and use of electronic backup systems to check vote accuracy. Although touch-screen DRE machines are probably a good solution because of the complexity of the ballot, states would do well to add paper-recording capabilities in order to ensure accurate recounts, says Johns Hopkins University's Aviel Rubin.
    Click Here to View Full Article

    For information on ACM's e-voting activities, visit http://www.acm.org/usacm.

  • "Robots Invade the Table Football Pitch"
    New Scientist (09/04/04) Vol. 182, No. 2463, P. 18; Graham-Rowe, Duncan

    Bernhard Nebel of Germany's University of Freiburg has led a team of roboticists in the development of a robotic foosball table that bested 85 percent of a random sample of players. The rods on one side of the table are linked to high-torque motors and an electronic control system that tracks ball movements via a camera mounted under a transparent base; the camera snaps photos 50 times a second and sends the data to a built-in computer. Intelligent software preprogrammed with ball dynamics determines what happens when one of the artificial soccer players hits the ball, and can ascertain whether the ball's path is blocked in certain directions by other figures; the software's strategy is to position the ball as close to the competitor's goal as possible. The software performs best, averaging one goal every 36 seconds, when it simply wallops the ball, a tactic typical of novice players; in a less "reactive" mode its scoring rate drops to one goal every 50 seconds. Nebel's team is concentrating on making the table capable of stopping and passing the ball, and the researcher expects the machine will be able to beat the world foosball champion in about five years.
    Click Here to View Full Article
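
    A simplified sketch of the kind of control loop described above: the camera reports ball positions 50 times a second, the software predicts where the ball will cross each rod, and a rod is moved to the predicted crossing point. The field coordinates, rod positions, and frame format below are assumptions, not details of the Freiburg system.

      # Simplified blocking logic: from the ball's position and velocity, predict
      # which rod the ball will reach first and where to place the figures.
      # Units and rod positions are illustrative assumptions.
      FRAME_RATE_HZ = 50
      ROD_X_POSITIONS = [10.0, 30.0, 50.0, 70.0]   # rods along the table, in cm

      def frames_until_crossing(pos, vel, rod_x):
          """Frames until the ball reaches a rod's x position, or None if it never will."""
          dx = rod_x - pos[0]
          if vel[0] == 0 or dx / vel[0] < 0:
              return None
          return dx / vel[0]

      def choose_block(pos, vel):
          """Pick the first rod the ball will cross and the y position to block at."""
          candidates = []
          for rod_x in ROD_X_POSITIONS:
              t = frames_until_crossing(pos, vel, rod_x)
              if t is not None:
                  candidates.append((t, rod_x, pos[1] + vel[1] * t))
          if not candidates:
              return None
          t, rod_x, block_y = min(candidates)
          return rod_x, block_y        # move the figures on rod_x to height block_y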

  • "IT to Help Avoid Astronomical Armageddon"
    Computerworld (09/06/04) Vol. 32, No. 36, P. 30; Thibodeau, Patrick

    A number of major projects are underway to map the sky in an effort to spot and analyze potentially dangerous near-Earth objects long before they become serious threats so that they can hopefully be deflected in time. New search and processing approaches will need to be developed in order to effectively extract helpful information from the massive data sets such projects will generate. The Arizona-based Large Synoptic Survey Telescope (LSST) is expected to accumulate about 6 GB of data every second when it is up and running in seven years' time, leading University of Arizona physics professor Philip Pinto to speculate that "The LSST database will probably be the largest known nonproprietary database in the world." Handling the data will be the responsibility of new algorithms devised by computer scientist Jeff Kantor; among the challenges these algorithms need to meet is being able to ascertain from multiple observations whether a specific object in the sky is moving from frame to frame. LSST scientists are hoping that storage densities and processing capabilities will expand enough to deal with the telescope's data sets. Meanwhile, the Panoramic Survey Telescope and Rapid Response System (Pan-STARRS) being developed by the University of Hawaii's Institute of Astronomy will piece together images captured by a quartet of telescopes and use parallel processing to produce data. Pan-STARRS, which is slated to go online in 2008, calls for data parallelization in which each processor works on individual data while carrying out the same instruction simultaneously. This is in contrast to the CPU parallelization typical of high-performance technical computing.
    Click Here to View Full Article
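
    The data-parallel pattern described for Pan-STARRS, in which every processor carries out the same instruction on different data, can be sketched as follows; the frame contents and the detection step are placeholders, and the use of Python's multiprocessing pool is purely illustrative.

      # Sketch of the data-parallel pattern: each worker runs the same operation
      # (a made-up detect_moving_objects) on a different image frame.
      from multiprocessing import Pool

      def detect_moving_objects(frame):
          # Placeholder: flag pixels that differ noticeably from a reference frame.
          reference = frame["reference"]
          return [pixel for pixel, value in frame["pixels"].items()
                  if abs(value - reference.get(pixel, 0)) > 5]

      if __name__ == "__main__":
          frames = [{"reference": {(0, 0): 10}, "pixels": {(0, 0): 30, (1, 1): 11}}
                    for _ in range(8)]
          with Pool(processes=4) as pool:
              results = pool.map(detect_moving_objects, frames)   # same code, different data
          print(results)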

  • "Toward a Federated Future"
    InfoWorld (09/06/04) Vol. 26, No. 36, P. 44; McAllister, Neil

    The federated network model, such as the one employed by banks to facilitate seamless ATM transactions, is being considered by IT vendors and their clients as an enabling framework for next-generation integrated network services for enterprise employees, business partners, and consumers. But its success depends on the resolution of the digital identity issue, which encompasses such riddles as where user identity should actually be located, how open standards can justify the numerous authorizations necessary to exchange user data across countless systems and organizations, and how the role technology plays measures up to the role of trust. The Burton Group defines federated identity as a combination of single sign-on, identity mapping and account linking, directory services, authorization, and management, and the concept's mainstream acceptance hinges on open standards; the Liberty Alliance and Microsoft and IBM are developing OASIS-approved federated identity standards that are designed with compatibility in mind. Burton Group's Dan Blum describes effective federated identity systems as those that model the formation of business relationships in everyday life using dynamic federation, in which activities can be undertaken off the cuff, even in the absence of a prior trust relationship; central to such a schema is the identity framework, whereby party A, which trusts party B, can likewise trust party C, with which party A has no previous trust relationship, on the strength of party B's trust of party C. This architecture relies heavily on accountability and a shared set of rules, while other, unanticipated issues can come up in scenarios that involve federating between firms across different industries. Sun Microsystems' Andrew Shikiar suggests that federation deployment must be approached cautiously, beginning with an evaluation of a company's architecture and what it requires to become federated, with emphasis on the areas most likely to benefit from federation. A sturdy internal identity infrastructure must be set down first and foremost.
    Click Here to View Full Article
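
    The party A/B/C trust chain can be modeled with a small sketch; the Party class and its methods below are illustrative and do not correspond to any federation standard's API.

      # Toy model of the trust chain: party A accepts an assertion about party C
      # because A trusts B and B vouches for C. Names and methods are hypothetical.
      class Party:
          def __init__(self, name):
              self.name = name
              self.trusted = set()        # parties whose assertions we accept directly

          def trust(self, other):
              self.trusted.add(other)

          def accepts(self, subject, max_hops=2):
              """Accept 'subject' if reachable through a short chain of trusted parties."""
              frontier, seen = {self}, {self}
              for _ in range(max_hops):
                  frontier = {t for p in frontier for t in p.trusted if t not in seen}
                  if subject in frontier:
                      return True
                  seen |= frontier
              return False

      a, b, c = Party("A"), Party("B"), Party("C")
      a.trust(b)
      b.trust(c)
      print(a.accepts(c))   # True: A trusts C on the strength of B's trust of C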

  • "The Dark Side of Small"
    Chronicle of Higher Education (09/10/04) Vol. 51, No. 3, P. A12; Monastersky, Richard

    Efforts to study the potential medical and environmental risks of nanomaterials are lagging behind those to develop and commercialize the technology, and advocates are worried that the nanotech movement could be undone by a shortage of reliable information and warnings of doomsday scenarios that are more science fiction than science. "The perception that nanotechnology will cause environmental devastation or human disease could itself turn the dream of a trillion-dollar industry into a nightmare of public backlash," declared Rice University chemistry professor Vicki Colvin at a congressional hearing last year. Complicating the collection of solid data about nanomaterials' medical and environmental effects is the fact that nanoscale substances behave differently than they do at macro-scale, so standard measures for assessing toxicology may not apply. This was demonstrated in a University of Rochester study in which 12-nm to 250-nm titanium dioxide particles were found to cause more inflammation in the lung tissue of rodents than an equal weight of larger particles. Other experiments have demonstrated that carbon nanotubes cause lung damage in mice and produce free radicals in human skin cultures, while fullerene nanoparticles induce oxidative damage in the brains of largemouth bass and also kill beneficial microorganisms in their tanks. Critics complain that the federal government is not committing enough money or resources to give potential nanotech hazards the research focus they deserve. National nanotech-coordinating office director E. Clayton Teague estimates that just 1 percent of the total federal nanotech budget is devoted to health and environmental risk assessment. Though the Bush administration responded to growing criticism last month by urging agencies to support research into possible nanotech threats, no additional funding has been allocated.
    Click Here to View Full Article

  • "Dark Matter Revisited"
    Internet Computing (08/04) Vol. 8, No. 4, P. 81; Vinoski, Steve

    Distributed computing and integration toolkits tend to lose their simplicity and grow bigger and more complicated to the point where they become just as complex as the approach they were designed to supplant, and this trend is attributable to a number of factors, not the least of which is toolkits' increased popularity fueling consumer demand for more capabilities and functionality. IONA Technologies' Steve Vinoski observes that the dynamic languages comprising the "middleware dark matter" that many distributed integration applications are based on have sustained and quickly expanded their presence over the last several years, as the massive numbers of open-source projects centered around these languages attest. The author believes the improvement and increasing desirability of dynamic languages are being driven by modern computers' ability to rapidly and efficiently execute dynamic language-based applications, the movement to enhance Java virtual machines and bytecode interpretation through research and experimentation, and the languages' superior ease of use and adaptability. Vinoski devotes special attention to Twisted, a framework for authoring network applications that is based on Python: Features and capabilities of note include enterprise capabilities such as user authentication and object persistence support; Web, DNS, Internet Relay Chat, and mail servers; pluggable and replaceable event loops; a distributed-object broker; and protocol and transport abstractions. Advantages of the Perl-based SlimServer network music server Vinoski points out include its accessibility over multiple protocols, its free availability as open-source software, and its ability to cheaply integrate home computers and home audio systems. Finally, the Python Enterprise Application Kit is chiefly used to build enterprise applications out of components, and its developers think this approach will support the assembly of systems that are simpler, faster, and easier to set up, manage, and maintain than J2EE-based offerings.
    Click Here to View Full Article
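
    Twisted's approach of building servers from protocol and factory classes driven by a reactor event loop can be seen in the canonical echo server below, shown here as a rough sketch (version details may vary, and the port number is arbitrary).

      # Minimal Twisted echo server: the Protocol handles one connection's events,
      # the ServerFactory creates a Protocol per connection, and the reactor runs
      # the event loop.
      from twisted.internet import protocol, reactor

      class Echo(protocol.Protocol):
          def dataReceived(self, data):
              self.transport.write(data)    # send back whatever the client sent

      factory = protocol.ServerFactory()
      factory.protocol = Echo

      reactor.listenTCP(8000, factory)
      reactor.run()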


 

 