
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 609:  Monday, February 23, 2004

  • "Hey, Gang, Let's Make Our Own Supercomputer"
    New York Times (02/23/04) P. A1; Markoff, John

    On April 3, University of San Francisco students will gather in a gym to lash together about 1,000 computers into a shared high-speed network capable of running a benchmark program, a set of equations that can be partitioned and computed on numerous processors concurrently. The project will represent the first time a supercomputer has been assembled by a "flash mob" of volunteers recruited over the Internet, who are asked to bring only machines with at least a 1.3 GHz Pentium or AMD processor and 256 MB of RAM; donated high-speed networking switches will be used to plug the machines together. "This is what happens when crazy ideas catch fire and people say, 'Wait, there is nothing to stop this,'" boasts Patrick Miller, a computer scientist at Lawrence Livermore National Laboratory's Center for Applied Scientific Computing. University of San Francisco graduate student John Witchel believes computing via flash mob will allow community groups and high school students to tap into computing resources that only large corporations or government labs currently have access to. "We're trying to democratize supercomputing," he explains. The group that organized the flash mob supercomputer effort calculates that the machine will need to reach a speed of approximately 550 gigaflops to be ranked on the next Top 500 list, while University of Tennessee computer scientist Jack Dongarra cautions that the machine could run into speed difficulties stemming from electrical power issues. Heat could also become an issue given the tremendous number of people who will gather in the gym, but Miller is confident that the gym's high ceiling will prevent any problems in this area.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
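
    As a rough illustration of the arithmetic behind the 550-gigaflop target, the sketch below estimates the aggregate peak of roughly 1,000 volunteer machines and the fraction of that peak the gym-built cluster would need to sustain. The figure of two floating-point operations per clock cycle is an assumption for illustration only, not a number from the article, and real sustained performance depends heavily on the network.

      # Back-of-envelope estimate for the flash-mob supercomputer (illustrative only).
      machines = 1000           # volunteer PCs expected in the gym
      clock_hz = 1.3e9          # minimum requirement: 1.3 GHz Pentium or AMD processor
      flops_per_cycle = 2       # assumed; varies widely by CPU and workload

      peak_gflops = machines * clock_hz * flops_per_cycle / 1e9
      target_gflops = 550       # speed the organizers estimate is needed for the Top 500 list

      efficiency_needed = target_gflops / peak_gflops
      print(f"Theoretical peak: {peak_gflops:,.0f} gigaflops")
      print(f"Fraction of peak needed to reach {target_gflops} gigaflops: {efficiency_needed:.1%}")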

  • "Computer-Security Efforts Intensify"
    Wall Street Journal (02/23/04) P. B4; Clark, Don; Wingfield, Nick; Hanrahan, Tim

    An annual conference hosted by RSA Security will be held this week, with email fraud, spam, and new ways to hinder such practices through the authentication of company and user IDs being major topics of discussion. Bolstering information security has increased in importance because new legislation may make corporations liable for lost or compromised data. One proposed solution is Sender Permitted From (SPF), in which senders' servers post their IP addresses so that email recipients can verify that incoming messages are from legitimate sources. Time Warner's America Online unit has tested SPF, which is also being embedded in MailFrontier software and other products. Meanwhile, PassMark Security will announce a Web site authentication solution on Feb. 23 whereby users are assigned a random image on their first visit to a site that employs the PassMark system; when they revisit the site, they are shown the same image before entering their user names and passwords, and its absence warns them that something is wrong. Sun Microsystems wants to widen the scope of smart cards or security tokens, which reportedly offer better protection for Web sites than passwords and more accurate identification of emailers. VeriSign will today announce new technical guidelines to reduce the cost of smart cards and other robust ID measures. The Open Authentication Reference Architecture (OATH) is a joint project between Sun, IBM, Gemplus International, BEA Systems, and others that aims to help companies develop simple, interoperable online ID products.
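
    A minimal sketch of the verification idea behind SPF described above: the receiving mail server looks up a DNS TXT record published under the sender's domain and checks whether the connecting server's IP address is listed there. The sketch uses the third-party dnspython package and handles only bare "ip4:" entries; real SPF records support many more mechanisms (include, a, mx, address ranges), and the domain and address in the usage comment are hypothetical.

      # Simplified SPF-style check (illustration only, not a full SPF implementation).
      import dns.resolver   # third-party package: dnspython

      def simple_spf_check(sender_domain: str, connecting_ip: str) -> bool:
          """Return True if connecting_ip appears in the domain's SPF TXT record."""
          answers = dns.resolver.resolve(sender_domain, "TXT")
          for record in answers:
              text = b"".join(record.strings).decode()
              if text.startswith("v=spf1"):
                  # Only bare ip4: entries are handled here.
                  allowed = [tok[4:] for tok in text.split() if tok.startswith("ip4:")]
                  return connecting_ip in allowed
          return False   # no SPF record published, so the sender cannot be verified

      # Example usage (hypothetical domain and address):
      # simple_spf_check("example.com", "192.0.2.10")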

  • "E-Voting Activists: Vote Absentee"
    Wired News (02/20/04); Zetter, Kim

    Activist organizations in Maryland and California are citing the insecurity of electronic voting systems and their lack of voter-verifiable audit trails as reasons why voters should use paper absentee ballots in their March primaries. Computer scientists have uncovered evidence to suggest that many e-voting systems are vulnerable to untraceable tampering that could lead to election fraud or lost votes, while the absence of paper ballots lowers the likelihood of accurate recounts if final tallies are called into question. Linda Schade of Maryland's Campaign for Verifiable Voting reports that machines made by Diebold Election Systems received poor grades on three separate audits, and activists demanded that Maryland election officials allow voters to use paper ballots if they so desire. Meanwhile, California activists failed on Feb. 17 to win a temporary restraining order that would have required counties to add anti-hacking measures to e-voting systems. Secretary of State Kevin Shelley is requiring all machines in California to include a printed audit trail by July 2006, but in the meantime California Voter Foundation President Kim Alexander recommends that voters use paper ballots. "I consider this to be the greatest threat to our democracy of anything we have ever faced in this country," comments Allen Michaan, owner of the Grand Lake Theatre in Oakland, Calif. "Absent [a voter verifiable paper trail] we're totally at the mercy of whoever controls this equipment." Alexander notes that in areas where paperless machines are used, voters will have no choice but to trust that vendors and counties will comply with laws and procedures, but laments that her confidence in the voting systems has been "profoundly shaken" by evidence that at least 17 California counties have not complied.
    Click Here to View Full Article

    To learn more about ACM's activities regarding e-voting, visit http://www.acm.org/usacm.

  • "Software Standards Seen Aiding Auto Complexity"
    EE Times (02/20/04); Hammerschmidt, Christoph

    Experts posit that making software and standardization more essential to automotive development will help reverse the decline in quality and reliability that accompanies the growing complexity of electronics. Addressing the eighth annual Euroforum, Thomas Scharnhorst of Volkswagen explained that adding electronics has become a critical element of the economical manufacture of automobiles, but the growing prevalence of in-vehicle electronics is leading to more breakdowns and tarnishing automakers' reputations. According to current estimates, one out of every two cars that will not start suffers from problems related to electrical and electronic systems. Scharnhorst's strategy for lowering complexity is to reduce the number of controllers in autos, while Gunter Reichardt of BMW favors an approach in which complex, highly networked systems feature exclusive innovation deployments. Reichardt says that reducing the number of controllers, as Scharnhorst is planning, will only transfer complexity to the controller and to the software level. Automotive Systems and Engineering Technologies President Rainer Kallenbach advocates standardization, and has put forward a list of requirements: modular software, universal interfaces, time-controlled bus systems, and better diagnostic features for all parts, systems, and components. Reichardt also sees value in software standardization, and expresses a need for carmaker-independent standard software that includes an operating system, network management, and diagnostic capability. "An open system architecture such as the one used by AUTOSAR is a logical and consistent step towards mastering complexity," he concludes.
    Click Here to View Full Article

  • "The Shapes of Things to Come"
    TechNewsWorld (02/21/04); Halperin, David

    The advent of smaller, faster processors, new materials, and ubiquitous wireless communications will usher in PC redesigns, which MIT Media Lab's Neil Gershenfeld says is all about "breaking down the barrier between the bits and atoms." The promise of miniaturization has led to the vision of the ultra PC (uPC), a pocket-sized computer with the processing power of a desktop that OQO plans to roll out later this year. Meanwhile, MIT Media Lab researchers are focusing on a wide array of disciplines--software agents, speech interfaces, machine understanding, object-oriented video, and affective computing, to name just a few--in ultimate pursuit of "a desktop printer capable of printing a fully functional computer onto a piece of paper or plastic." On the horizon are impressive man-machine interface innovations: Canesta has invented a product that can be mounted on a personal digital assistant or phone to project the image of a keyboard on any flat surface, while an infrared "radar" reads finger movements and converts them into keystrokes. Philips, meanwhile, has developed a prototype electronic paper that can be used to download any print content and refreshes itself in approximately one second; the manufacturer expects to mass-produce the product in several years. Wearable computer interfaces under development or already available include eyeglass-mounted displays that superimpose computer data over the wearer's view of the real world, and the University of Toronto's EyeTap, a video camera that shares the wearer's perspective and enhances it with additional data. Kevin Warwick of the University of Reading has entered the realm of cyborg technology: He can remotely operate an artificial arm and a wheelchair, among other things, via electronic implants.
    Click Here to View Full Article

  • "W3C Risks Patent Tussle in Standard Push"
    CNet (02/19/04); Festa, Paul

    The World Wide Web Consortium's (W3C) VoiceXML 2.0 standard is going forward despite a patent advisory group's failure to ensure against patent claims. The W3C adopted a controversial policy about eight months ago meant to keep essential parts of recommendations patent-free, but the patent advisory group completed its work without gaining assurances from Rutgers University that it would not bring unexpected claims against VoiceXML 2.0 users. Rutgers developed a system for accessing data via audio means in the mid 1990s, and patent co-author Tomasz Imielinski--who also chaired the VoiceXML working group in 1999--says the technology is fundamental to the VoiceXML 2.0 effort. VoiceXML 2.0 would allow phone users to receive Internet updates through text-to-speech standardization, and also allow users to search Web servers for tagged information. Although some have likened the possible Rutgers threat to SCO's attack on Linux users, experts say a more similar analogy would be of Microsoft and University of California spinoff Eolas; that court battle surprised the IT world with a $521 million judgment against Microsoft for use of critical Web browser plug-in technology. The W3C and other groups are fighting to overturn the original Eolas patent. Rutgers University has not clearly stated its intentions concerning the patent except to say that it intends to offer it under reasonable and non-discriminatory (RAND) licensing terms. Technology experts have chided the W3C patent advisors for making a paltry effort to secure the proper patent clearances, as they had with Avaya Communications and Royal Philips Electronics. "If the W3C claims to have a patent policy, they have the duty and the responsibility to push that policy," said Sun Microsystems standards director Carl Cargill.
    Click Here to View Full Article

  • "Spam: A Reality Check"
    PC Magazine (02/18/04); Ulanoff, Lance

    The CAN-SPAM Act has not stymied the rising tide of spam email, but it has influenced changes in the content and targeting of spam messages: Spammers are using provisions in the CAN-SPAM law to make their email look legitimate, including unsubscribe links and postal mailing addresses, for example. SurfControl's Susan Larson says one in 20 spam messages her company captures for enterprise clients has some new information added as camouflage, and notes that new spam messages appear to disseminate nonpromotional content, such as trivia, but have normal spam text appended. Spammers meet CAN-SPAM's requirement of snail-mail return addresses by inserting invisible white text inside the addresses, making them appear legitimate to users but keeping anti-spam software from capturing traceable addresses. The unsubscribe links are just a bad idea, according to MessageLabs CEO Mark Sunner, who says anyone even opening spam email puts themselves at risk of virus infection, not to mention those who click on inserted links. Sen. Conrad Burns (R-Mont.), one of the co-authors of the CAN-SPAM legislation, defends the bill but admits the unsubscribe links were one area of compromise; he looks forward to spam volume decreasing in coming months as the Federal Trade Commission and FCC work out enforcement rules that will likely give protections to legitimate email marketing firms and companies that distribute information to clients via email. Burns also looks forward to international gatherings such as the upcoming International Telecom Union meeting for the creation of international enforcement mechanisms, which would no doubt pressure the majority of spam senders who keep their servers outside the United States. Burns also says he has been in contact with colleagues in the United Kingdom and Australia about the international spam problem.
    Click Here to View Full Article
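
    As a hedged illustration of the white-text trick described above, the sketch below strips font or span tags whose color is set to white before a filter scans the message body for a traceable postal address. Production filters rely on far more robust HTML parsing; the regular expression and the sample message are simplified assumptions.

      # Illustrative filter step: remove "invisible" white text before scanning a message.
      import re

      WHITE_TEXT = re.compile(
          r'<(font|span)[^>]*(?:color\s*=\s*["\']?#?(?:fff|ffffff|white)|'
          r'color\s*:\s*#?(?:fff|ffffff|white))[^>]*>.*?</\1>',
          re.IGNORECASE | re.DOTALL,
      )

      def strip_hidden_text(html_body: str) -> str:
          """Drop white-on-white spans so obfuscated addresses become visible to filters."""
          return WHITE_TEXT.sub("", html_body)

      sample = 'Write to 123 Main St<font color="#ffffff">xq9</font>reet, Anytown'
      print(strip_hidden_text(sample))   # -> Write to 123 Main Street, Anytown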

  • "Phone Fibbing Is the Most Common Method for Untruths"
    Newswise (02/18/04)

    Communications researchers at Cornell University say communications technology has an impact on lying. Jeff Hancock, Cornell assistant professor of communication, and graduate students Jennifer Thom-Santelli and Thompson Ritchie plan to present their study, "Deception and Design: The Impact of Communication on Lying Behavior," at the Computer-Human Interaction (CHI) conference in Vienna, Austria, in late April. In the study of 30 students over a period of seven days, the researchers monitored 1,198 social communications and found that 37 percent of telephone conversations involved lies, 27 percent of face-to-face conversations included lies, about 21 percent of instant messages had lies, and 14 percent of email had lies. "Media where interactions are in real time boost the opportunity for deception," says Hancock, because most lies are unplanned and arise spontaneously in the course of a conversation. "This type of opportunity is less likely to arise when composing an email." Hancock adds that people are less likely to lie when a conversation can be recorded and easily reviewed, as is the case with email; even instant messaging conversations can be saved.
    Click Here to View Full Article

    To learn more about the agenda for the CHI04 conference, visit http://sigchi.org/chi2004/.

  • "Serious Linux Security Holes Uncovered and Patched"
    eWeek (02/19/04); Vaughan-Nichols, Steven J.

    iSEC Security Research, a Polish nonprofit organization, discovered a number of security vulnerabilities in the Linux kernel and released an advisory on Feb. 18. Linux kernel developers verified the problems and fixed them with updates. One flaw would have allowed a hacker to gain full super-user privileges, while the other would have allowed whole systems to be hijacked or disabled; however, exploiting either would have required a local user with sophisticated knowledge and Unix shell access, notes Debian Linux security expert Martin Schulze. Linux distributors including Novell/SuSE Linux, Red Hat, and the Debian Project have released patches. Although unrelated, both flaws were located in the kernel's virtual memory subsystem; one was found in the mremap(2) system call in the memory management code of the Linux 2.4 and 2.6 kernels.
    Click Here to View Full Article

  • "Roadblocks Could Slow RFID"
    CNet (02/19/04); Hines, Matt

    Criticism of radio frequency identification (RFID) technology has often centered on potential security and privacy issues, but technology vendors and industry observers believe that companies wishing to adopt RFID will run into difficulty if their software infrastructures are ill-equipped to manage the huge amount of data produced by RFID-enabled systems. "In order to do RFID right, to see a true return, the first thing [a company] needs to do is finish a data synchronization initiative, and do it right," urges AMR Research analyst Kara Romanow. She adds that such organizations will have to iron out existing problems in their data policies. IBM RFID architect Rainer Kerth notes that RFID systems must communicate with a universal data repository if the technology is to truly benefit inventory management and other operations where RFID promises to boost efficiency. "Keeping data consistent and accessible within the enterprise is the more immediate problem that needs to get solved," he insists. Romanow sees businesses adopting RFID following one of two strategies: deploying just enough RFID to retain major customers, or pursuing long-term goals; only the latter group will enjoy true returns on its investment. Global Exchange Services chief technologist John Radko points out that the pursuit of data synchronization by manufacturers, retailers, and suppliers began long before RFID became popular, citing initiatives such as the Uniform Code Council's UCCNet group, which is pursuing a universal data synch standard. He adds that organizations that fail to settle data synch issues before deploying RFID technology in warehouses will not achieve any return on investment.
    Click Here to View Full Article
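
    To make the data-synchronization point above concrete, here is a minimal sketch of the kind of check a supplier and retailer might run before turning on RFID: comparing item records on both sides and flagging missing items or disagreeing attributes. The item codes, attributes, and catalog layout are invented for illustration and are not UCCNet's formats.

      # Illustrative data-synchronization check between two trading partners.
      supplier_catalog = {
          "0001234500012": {"description": "Cola 12-pack", "case_weight_kg": 4.8},
          "0001234500029": {"description": "Cola 24-pack", "case_weight_kg": 9.6},
      }
      retailer_catalog = {
          "0001234500012": {"description": "Cola 12-pack", "case_weight_kg": 4.5},  # stale weight
          "0001234500036": {"description": "Diet Cola 12-pack", "case_weight_kg": 4.8},
      }

      def find_mismatches(ours, theirs):
          """Report items missing from either catalog and attributes that disagree."""
          issues = []
          for code in sorted(ours.keys() | theirs.keys()):
              if code not in ours or code not in theirs:
                  issues.append((code, "missing from one catalog"))
                  continue
              for attr in ours[code]:
                  if ours[code].get(attr) != theirs[code].get(attr):
                      issues.append((code, f"attribute '{attr}' differs"))
          return issues

      for code, problem in find_mismatches(supplier_catalog, retailer_catalog):
          print(code, "->", problem)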

  • "Before Wi-Fi Can Go Mainstream"
    Business Week (02/18/04); Salkever, Alex

    A number of hurdles must be cleared before high-speed Wi-Fi communications can truly gain mass appeal among consumers and corporations, such as installation and security issues, differing authentication standards at Wi-Fi hotspots, and a lack of all-inclusive roaming agreements. Seamless roaming is currently impossible because the proliferation of wireless ISPs has split the market, while the difficulty of tracking and servicing customers from provider to provider has led to numerous problems with billing and logons, as well as troubles with quality control. Security and ease of management are the biggest barriers to Wi-Fi adoption faced by corporate customers. The new 802.1X Wi-Fi security standard, which promises augmented encryption and authentication, could help overcome the first obstacle, while the second obstacle is a tougher challenge, since CIOs do not want to add further stress to overworked IT personnel, given the additional security responsibilities inherent in Wi-Fi. Wi-Fi adoption in consumer households is being held up by the lack of a single device to manage multiple Wi-Fi-enabled appliances such as DVD players, stereos, and televisions. There is also general agreement that these devices must be considerably improved and reduced in price. One thing is clear: The dramatic growth Wi-Fi has already experienced provides impetus for removing such impediments. Synergy Research estimates that sales of Wi-Fi networking gear shot up 40 percent to $2.5 billion last year, while Farpoint Group founder Craig Mathias predicts that virtually all laptops will be Wi-Fi-enabled by the end of 2004.
    Click Here to View Full Article

  • "Unplugged: Charles Simonyi Creates Software Intentionally"
    Tech Update (02/15/04); Farber, Dan

    Intentional Software founder Charles Simonyi attributes most software problems to a gap between design intent and the actual coding, a gap that cross-training subject matter experts and programmers will not close. His solution is to move programming further upstream while preserving the design intent. In such a scenario, subject matter experts would present their ideas to programmers via PowerPoint, for instance, and the programmers would write a generator program in C# that reads the presentation and writes the program. Intentional Software will provide the computer-aided design (CAD)-like program from which the generator reads and processes its input. Simonyi explains that writing generators will excuse programmers from the burden of redoing the same transformations each time the problem statement is modified or changed by the stakeholders, thus allowing the result of changes to be deployed in seconds instead of weeks and at vastly less expense--and free of deployment bugs, as well. The generators can be built and adjusted by coders in the same way that automated equipment can be adjusted by engineers and mechanics. Simonyi adds that though intentional programming does not reduce domain bugs, it facilitates rapid turnaround for fixes. The Intentional Software founder says, "With intentional software, there will be no limitation on the nature of the domain notation, and the implementation will be expressed in terms of a generator, which can be simply re-run if the design or implementation, or both, change." Simonyi notes that his company should roll out design tools and a generator interface next year.
    Click Here to View Full Article
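
    The generator idea described above can be made concrete with a toy sketch: a declarative "design" (a dictionary standing in for the PowerPoint or CAD-like notation the subject matter experts would supply) is fed to a generator that writes the corresponding program text, so a change to the design only requires re-running the generator rather than hand-editing code. The notation and output below are invented for illustration and are not Intentional Software's actual tools.

      # Toy generator: turns a declarative design into program text (illustrative only).
      design = {
          "record": "Customer",
          "fields": [("name", "str"), ("account_id", "int"), ("balance", "float")],
      }

      def generate(spec: dict) -> str:
          """Emit a class definition from the domain notation."""
          lines = [f"class {spec['record']}:"]
          args = ", ".join(f"{name}: {typ}" for name, typ in spec["fields"])
          lines.append(f"    def __init__(self, {args}):")
          for name, _ in spec["fields"]:
              lines.append(f"        self.{name} = {name}")
          return "\n".join(lines)

      print(generate(design))   # re-run whenever the design changes; no hand-patched code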

  • "Chips to Ease Microsoft's Big Security Nightmare"
    New Scientist (02/22/04); Ananthaswamy, Anil

    Chip manufacturers are working on new microprocessors to block the kind of flaws that recently forced Microsoft to issue a critical security alert. Various Microsoft programs were found to be vulnerable to "buffer overflow" attacks, which can be used to illicitly obtain information from computers. The problem is difficult to detect, but Intel and Advanced Micro Devices are working on processor chips to contain the threat--chips that can tell the difference between memory that contains data and memory that contains program instructions. "Buffer overflows are the largest class of software vulnerabilities that lead to security flaws," says Immunix's Crispin Cowan, who believes that hackers will eventually find ways around the new chip protections. The Slammer and Blaster worms that ravaged networks worldwide last year took advantage of buffer overflow flaws. AMD's new Athlon-64 and Opteron chips separate memory into data-only and instruction-only sections; any code that attempts to execute in the data-only section will trigger the operating system to close the program. Intel says it will build a similar system into its next generation of Pentium chips. Still, Cowan says hackers likely will find new ways to prevail, such as malware that forces an existing program to open a data port to hackers by jumping to a subsection of its own code at the wrong time.
    Click Here to View Full Article

  • "Tin Ears and the Social Fabric"
    InfoWorld (02/16/04) Vol. 26, No. 7, P. 36; Udell, Jon

    Within a span of five years, technology that improves the effectiveness of people working together in information environments has grown to include Weblogs, instant messaging, Wikis, and comment threads within blogs, while Web services have been used to create software systems that are loosely joined together. Although the nature of collaboration has not changed, fluid improvisation among team members will be needed if social software is to facilitate business productivity, suggests former Xerox Palo Alto Research Center director John Seely Brown in a recent New York Times story. "In soccer there are some set plays, but the best teams also display a wealth of effective improvisation based on the players' deep knowledge of one another," Brown explains. "It's the same in the best corporations or startups." Although it is likely that networked software systems would be able to support such improvisation, there is some concern whether the existing software development culture could produce those kinds of systems. The tin ears of the latest relationship amplifiers (LinkedIn and Orkut) are not a surprise, considering programmers do not have a reputation for being highly social people. Moreover, programming draws scant input from women. Social skills and protocols are a big part of social software, and concerns remain about its development if representatives from half the population are not involved.
    Click Here to View Full Article

  • "Data Avalanche"
    InformationWeek (02/16/04) No. 976, P. 30; Whiting, Rick; Kontzer, Tony

    Managing the massive amount of data generated by radio-frequency identification (RFID) chips will be a major challenge, and failure to do so will result in an overload of information. Makers of consumer packaged goods believe RFID systems embedded in the supply chain will give them unparalleled foresight of product demand, but this will not happen unless they decide what RFID technologies to employ; the integration of RFID tags with sensors could significantly boost the value and volume of collected data, but this could lead to scalability problems for operational applications. Most companies implementing RFID pilots or systems have yet to work out a robust data management strategy because the technology's feasibility for wide-scale supply-chain use was only recently determined, while data management software is in a very early developmental stage. Business-technology managers will need to establish policies for how much data should be collected from RFID systems. Meta Group analyst Gene Alvarez recommends that companies study how they will condense and amass information produced by RFID chips and readers and move it to middleware and enterprise applications, and his firm also suggests that organizations set up teams of personnel to understand RFID's advantages and disadvantages, and align the technology to business processes. Specific applications will determine what kinds of data RFID chips will collect and the frequency of data collection. Brian Higgins of BearingPoint comments that RFID systems will boost demand for data-synchronization and transformation tools, and raise the importance of data-quality-management software. New data access issues might also crop up as a result of RFID deployments.
    Click Here to View Full Article
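
    To give a feel for the condensing step Alvarez describes above, the sketch below collapses a stream of repeated raw tag reads, of the kind dock-door readers emit many times per second, into one event per tag and reader with first-seen and last-seen timestamps. The tag IDs, reader names, and event fields are assumptions for illustration, not any vendor's middleware format.

      # Illustrative condensing step: collapse repeated raw RFID reads into per-tag events.
      from collections import defaultdict

      raw_reads = [
          # (tag_id, reader_id, timestamp) -- readers report the same tag many times a second
          ("EPC-0001", "dock-door-3", 1077552000.1),
          ("EPC-0001", "dock-door-3", 1077552000.4),
          ("EPC-0002", "dock-door-3", 1077552001.0),
          ("EPC-0001", "dock-door-3", 1077552001.2),
      ]

      def condense(reads):
          """Return one event per (tag, reader): first and last time the tag was seen."""
          spans = defaultdict(lambda: [float("inf"), float("-inf")])
          for tag, reader, ts in reads:
              span = spans[(tag, reader)]
              span[0] = min(span[0], ts)
              span[1] = max(span[1], ts)
          return {key: {"first_seen": lo, "last_seen": hi} for key, (lo, hi) in spans.items()}

      for key, event in condense(raw_reads).items():
          print(key, event)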

  • "Users Tap Network-Monitoring Technology"
    Network World (02/16/04) Vol. 21, No. 7, P. 17; Hochmuth, Phil

    The value of the sFlow draft standard, approved by the IETF three years ago, is growing among high-speed network users, who are employing the technology for security and network performance tracking. Foundry Networks and Hewlett-Packard, which developed the sFlow technology along with InMon, report that sFlow furnishes a real-time perspective of network traffic performance, problems, and trends via random sampling of LAN and WAN data packet flows. The usual method of network monitoring involves placing a network probe device onto a network segment, typically by plugging the probe into a mirrored port on a LAN switch, which means that the device can only collect data from the mirrored port; sFlow is instead implemented via network management information bases (MIBs) that run on the network's actual switches and routers, which supporters say provides a much broader view of network performance. The switches and routers take random samples of data packets as they move through ports and forward them as sFlow datagrams to an sFlow collection server, where an algorithm builds a model of network traffic. David Bratt at Tampa's Moffitt Cancer Center reports that with sFlow, "If you have someone doing something wrong on the network, you can track them down right to where their PC is plugged in." The draft standard's application as a security measure is demonstrated in its ability to detect unauthorized network devices acting as network address translation (NAT) boxes, for example; these devices could be used as back doors, but sFlow data analyzers can zero in on such devices through comparison of subnet data among switches and NAT boxes. sFlow's importance is likely to increase as network traffic speeds accelerate to gigabit and 10G rates in certain enterprises, according to users and experts.
    Click Here to View Full Article
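
    The sampling idea behind sFlow described above can be sketched simply: a switch examines roughly one packet in N, and the collector scales the sampled counts back up by N to estimate total traffic per conversation. The simulation below uses made-up flows and addresses; it is not the sFlow datagram format or InMon's actual algorithm.

      # Packet-sampling estimate in the spirit of sFlow (illustration, not the real protocol).
      import random
      from collections import Counter

      random.seed(7)
      SAMPLING_RATE = 100   # examine roughly 1 packet in 100

      # Simulated traffic: one (source, destination) tuple per packet.
      traffic = (
          [("10.0.0.5", "10.0.1.9")] * 60_000 +   # heavy talker
          [("10.0.0.7", "10.0.1.9")] * 9_000 +
          [("10.0.0.8", "10.0.2.2")] * 1_000
      )
      random.shuffle(traffic)

      samples = Counter(pkt for pkt in traffic if random.randrange(SAMPLING_RATE) == 0)

      print("Estimated packets per conversation (sampled count x sampling rate):")
      for flow, count in samples.most_common():
          print(flow, "~", count * SAMPLING_RATE)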

  • "Unlocking Our Future"
    CSO Magazine (02/04); Garfinkel, Simson

    Sandstorm CTO and technology writer Simson Garfinkel maintains that computer security has Grand Challenges equivalent to putting a man on the moon or forecasting weather via supercomputing--in fact, he was one of dozens of leading security researchers invited by the Computing Research Association and the Association for Computing Machinery to find and present such challenges at a November workshop. The end result was a quartet of challenges that deserve "sustained commitments." The first information security Grand Challenge lies in eliminating epidemic-style worm, virus, and spam attacks within a decade, and Garfinkel writes that most conference attendees favored the development of a completely new approach to solving the problem, rather than the installation of antivirus software and the continuous updating of systems. The second Grand Challenge is the development of tools and principles for building large-scale systems for critical and trustworthy applications that also make lucrative targets, such as medical records systems. The third Grand Challenge is finding a reliable way to measure risk in information systems, which could allow people to determine how much an organization could save by deploying a specific piece of software, for instance. Practitioners usually establish "best practices" designed to reduce the chances of computers being breached, but such measures provide no metric for making purchasing decisions, nor do they tell organizations how secure their systems are at the moment. The last Grand Challenge is to give end users easily understandable security controls as well as privacy they can control for the pervasive, dynamic computing environments of the future; meeting such a challenge could involve a fundamental shift in the way people look upon and work with information systems. Garfinkel concludes, "Ultimately...we need to start thinking more strategically about computer security, or else we are going to lose this war."
    Click Here to View Full Article
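
    To make the third challenge concrete, here is the kind of back-of-envelope arithmetic practitioners fall back on today: annualized loss expectancy, the cost of a single incident multiplied by its expected yearly frequency, compared before and after a purchase. Every figure below is an assumption; the point of the Grand Challenge is that the inputs to such formulas are currently little more than guesses.

      # Annualized loss expectancy (ALE) comparison; all figures are assumed for illustration.
      def annualized_loss(single_loss_expectancy, annual_rate_of_occurrence):
          return single_loss_expectancy * annual_rate_of_occurrence

      sle = 250_000        # assumed cost of one breach of the system
      aro_before = 0.30    # assumed breaches per year without the new software
      aro_after = 0.05     # assumed breaches per year with it deployed
      software_cost = 40_000

      savings = (annualized_loss(sle, aro_before)
                 - annualized_loss(sle, aro_after)
                 - software_cost)
      print(f"Expected annual savings from deployment: ${savings:,.0f}")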

  • "Molecular Nanotech: Benefits and Risks"
    Futurist (02/04) Vol. 38, No. 1, P. 42; Treder, Mike

    The emergence of molecular nanotechnology (MNT) may be closer than common wisdom dictates: Though a 1999 media report indicates that the research community expects decades to pass between the creation of a nanofactory assembler and the fabrication of consumer goods, a summer 2003 study from the Center for Responsible Nanotechnology (CRN) concludes that the interim between these two events could be less than eight weeks. The combination of MNT with simple-to-use computer-aided design programs will shrink the product design cycle from months to hours, and allow innovation and development to flourish at unparalleled levels. MNT will offer a fast and cheap source of products that could help eliminate disease epidemics, alleviate social unrest caused by a lack of material goods, allow solar power to be tapped as an energy source, and enable developing nations to modernize their industrial infrastructure. On the other hand, the technology has potential to cause dramatic political and socioeconomic disruption by lowering the value of many material and human resources, which could lead to widescale unemployment, corporate erosion, or new forms of monopolization. Even more sinister implications of molecular manufacturing include the creation of bigger machines that could accelerate environmental damage; the development of deadlier weapons that are harder to detect, which could spur a dangerous arms race with far more potential for conflict than nuclear brinkmanship; and ubiquitous, ultrasmall surveillance devices that could drive privacy infringement to new heights. The University of Toronto Joint Center for Bioethics released a report last February stressing the urgent need to examine the potential ethical, economic, legal, environmental, and societal impact of nanotech, as well as open public discussion of the technology's pluses and minuses. Also critical is researching ways to effectively administrate the use of nanotech worldwide. "All areas of society stand to be affected by molecular manufacturing, and unless comprehensive international plans are developed, the multiplicity of cures could be worse than the disease," argues CRN research director Chris Phoenix.


