
ACM TechNews sponsored by AutoChoice Advisor: Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 709: Friday, October 22, 2004

  • "A Global Assault on Anonymity"
    CNet (10/20/04); Borland, John; Lemos, Robert

    The growing sophistication of data-mining and data-sharing technology has added a sense of urgency to the debate on government surveillance of U.S. citizens in the name of homeland security, with civil-liberties proponents arguing that privacy protections should be built into such programs at the outset. Data-mining initiatives designed to identify suspected terrorists or terrorist activity, such as the Total Information Awareness (TIA) project and the Computer Assisted Passenger Prescreening System, were heavily criticized by civil libertarians who feared their potential for privacy abuse, as well as by professional societies such as the ACM, which cited technical problems. A less public data search tool used by federal agencies to identify foreign terrorists or American citizens linked to foreign terrorism is the Verity K2 Enterprise System, which was mentioned in a recent Government Accountability Office report. The tool can focus on a wide array of sources--internal intelligence databases, Web sites, data flowing across agency-monitored communications networks, etc.--and index the results from these sources, alerting investigators when data pertinent to an inquiry is found. Meanwhile, the Multistate Anti-Terrorism Information Exchange (MATRIX) system has drawn considerable fire and been rejected by 11 of the original 16 participating U.S. states for reasons of cost or privacy. Law enforcement officials argue that MATRIX merely accelerates the time it takes to study arrest records, addresses, driver's license details, and other publicly available information that would be routinely checked in the course of investigations. Nevertheless, the ACLU has filed a privacy infringement lawsuit against the MATRIX program in Michigan. Some researchers think that the debate between opponents and supporters of data-mining and data-sharing programs has ignored the need to find a middle ground between privacy and security.
    Click Here to View Full Article

    For more information on ACM's activities involving the TIA project, visit http://www.acm.org/usacm.
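
    The index-and-alert pattern the article attributes to tools like Verity K2--ingest documents from many sources, then notify investigators whose standing queries match--can be sketched as follows. This is a purely illustrative toy, not Verity's actual product API; the class and names are invented for the example.

```python
# Illustrative sketch (not Verity's API): index documents from several
# sources and alert investigators whose standing queries match new data.
from collections import defaultdict

class StandingQueryIndex:
    def __init__(self):
        self.index = defaultdict(set)   # term -> set of (source, doc id)
        self.queries = {}               # investigator -> set of query terms

    def register_query(self, investigator, terms):
        """Record an investigator's standing query."""
        self.queries[investigator] = set(t.lower() for t in terms)

    def add_document(self, doc_id, source, text):
        """Index a document; return investigators whose queries it matches."""
        terms = set(text.lower().split())
        for t in terms:
            self.index[t].add((source, doc_id))
        return [who for who, wanted in self.queries.items() if wanted & terms]

idx = StandingQueryIndex()
idx.register_query("agent_a", ["charter", "flight"])
alerts = idx.add_document("doc1", "web", "suspicious charter booking")
```

    A real system would add ranking, access control, and source-specific connectors; the core flow of index-then-match is the same.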

  • "E-Voting Sceptics Use Web to Monitor Election"
    IDG News Service (10/19/04); Heichler, Elizabeth

    A geographically dispersed, mostly volunteer team of technologists is using the Web and open-source toolkits to ready the Election Incident Reporting System (EIRS), a tool to help the Election Protection Coalition identify and respond to e-voting problems, for the Nov. 2 election. Erik Nilsson, chair of the Computer Professionals for Social Responsibility (CPSR) working group on voting technology, says EIRS will help election security proponents rapidly address voting problems as they occur on Nov. 2. Among the open-source products involved in the development of EIRS are PHP Surveyor, a toolkit for constructing online surveying instruments and managing the results; AdvoKit, which can coordinate tasks, campaigns, and volunteers; and the MapServer clickable map generator. EIRS developers have also received support from the development teams responsible for the open-source toolkits. The EIRS development team, which hails from both the CPSR and the Verified Voting Foundation, comprises roughly five core developers and about as many testers, three people focused on the user interface, and a four-person group concentrating on architecture, security, and physical hardware. Since EIRS was launched about four months ago, between 30 and 35 people have worked on the system. EIRS still needs to provide more detail about election incidents: For example, a user who clicks on a map to view incidents in a specific area can currently see only the number of incidents, although Verified Voting executive director Will Doherty promises that this information will be augmented.
    Click Here to View Full Article

    For information on ACM's e-voting activities, visit http://www.acm.org/usacm.

  • "Pen Stroke Cuts PDA Web Clutter"
    Technology Research News (10/27/04); Patch, Kimberly

    Microsoft and Tsinghua University researchers have devised a Web browser for small-screen devices such as personal digital assistants and smart phones that allows users to easily collapse and expand selected portions of a Web page. The interface's unique benefit is that it lets users manipulate Web content with single-stroke pen gestures: By drawing the pen diagonally across a portion of the screen, the user selects a rectangle of content to be collapsed or expanded, either as a whole area or as an individual cell, depending on whether the pen is drawn upward, downward, from left to right, or vice versa. Thin placeholders let users easily reopen collapsed content, and links remain accessible even when content areas are collapsed. The browser remembers how a user configured a Web page and restores that layout on the next visit to the site, a feature that in some cases saves time because collapsed pages load faster. One of the main challenges in creating the browser was figuring out how to quickly analyze Web page structures and capture pen strokes within the browser. Microsoft researcher Patrick Baudisch says the browser has also been adapted to smart phones, many of which lack touch-screen pen interfaces; instead of pens, users push keypad buttons that are mapped to browser controls. The researchers say such a browser is significant because mobile Web devices are proliferating far faster than PCs. The Microsoft-funded research will be presented at the Association for Computing Machinery Symposium on User Interface Software and Technology, which takes place October 24-27 in Santa Fe, New Mexico (http://www.acm.org/uist/).
    Click Here to View Full Article
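
    The gesture scheme described above lends itself to a small sketch. The exact direction-to-action mapping below (downward stroke collapses, upward expands; left-to-right selects the whole area, right-to-left an individual cell) is an assumption for illustration--the article says only that stroke direction selects among these actions.

```python
# Sketch of mapping one diagonal pen stroke to a browser action.
# The specific direction mapping is an illustrative assumption.
def classify_stroke(start, end):
    """start, end: (x, y) screen points; y grows downward."""
    x0, y0 = start
    x1, y1 = end
    action = "collapse" if y1 > y0 else "expand"   # downward vs upward
    scope = "area" if x1 > x0 else "cell"          # left-to-right vs reverse
    # The stroke's diagonal defines the selected rectangle.
    rect = (min(x0, x1), min(y0, y1), max(x0, x1), max(y0, y1))
    return action, scope, rect

action, scope, rect = classify_stroke((10, 10), (90, 120))
```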

  • "For Missing Web Pages, a Department of Lost and Found"
    New York Times (10/21/04) P. E5; Eisenberg, Anne

    Under the direction of IBM lab researcher Andrew Flegg, a team of four student interns has developed a working prototype for software that can check links between Web pages and search for the correct pages if the connections are broken or incorrect. The Peridot program uses a page's source code, context, and other data to build "fingerprints" that encapsulate the page's links and unique features, which are stored for later comparison; the software regularly checks the fingerprints to see if there are any discrepancies in a Web site's links, and then ascertains how important such discrepancies are. Flegg says he conceived of the software as a tool to not only address broken links, but also links whose content had become inaccurate or inapplicable. Oxford University student and IBM intern Ben Delo notes that Peridot required algorithms that could measure the degree of change in a link, determine how significant such a change was, and find the optimum techniques for retrieving the missing links to address the challenge of sifting through a huge number of pages in search of replacement connections. Another challenge cited by Delo is dealing with links that have the right URL but the wrong content. "You need a system to help manage this risk" so that a company's reputation will not be tarnished when visitors at its Web site are misdirected, he says. The software is designed so that users can select the pages to be updated automatically as well as the substitutions that require their notification. Although Peridot is not commercially available, it may eventually be employed to monitor the accuracy of internal corporate Web sites, easing the burden of Web administrators; the software could also be offered as a service to clients by ISPs, and could benefit Web site users as well.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)
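
    A minimal sketch of the fingerprint-and-compare idea behind Peridot (whose actual algorithms are not public): record a page's link set and a content digest, then score how far a later snapshot has drifted so that important discrepancies can be flagged.

```python
# Toy page fingerprints: not Peridot's real representation, just the
# store-then-compare pattern the article describes.
import hashlib

def fingerprint(title, links, body):
    """Capture a page's links and a digest of its content."""
    return {
        "title": title,
        "links": set(links),
        "digest": hashlib.sha256(body.encode("utf-8")).hexdigest(),
    }

def drift(old, new):
    """0.0 = unchanged; higher scores flag pages likely needing repair."""
    score = 0.0
    if old["digest"] != new["digest"]:
        score += 0.5                               # content changed
    union = old["links"] | new["links"]
    if union:
        changed = old["links"] ^ new["links"]      # symmetric difference
        score += 0.5 * len(changed) / len(union)   # link churn
    return score

fp1 = fingerprint("Docs", ["/a", "/b"], "hello world")
fp2 = fingerprint("Docs", ["/a", "/c"], "hello world")
score = drift(fp1, fp2)
```

    The thresholds and weights here are arbitrary; the point is that fingerprints let a checker ask "how much did this page change?" rather than only "does the URL still resolve?".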

  • "Alliance Seeks to Get a Stereo to Listen to a PC"
    Wall Street Journal (10/21/04) P. B3; Dvorak, Phred

    The growing popularity of digital content among consumers has sparked intense competition among electronics and PC makers to enable a network of home devices that can seamlessly interact and share content. Meeting this challenge is the goal of the Digital Living Network Alliance (DLNA), a consortium that counts most of the major consumer-electronics and computer suppliers as members. The DLNA's mission is to establish technical standards for devices that can transfer data in the home. Complying with the alliance's guidelines is neither especially difficult nor expensive, because the group's proposed standards draw on existing, proven technologies rather than a completely new networking technology. The consortium intends to establish a procedure for certifying DLNA-compliant devices by next year, so that users can share digital content between DLNA-approved products regardless of manufacturer or media format. The alliance has no plans to address copyright and content encryption until 2005. The DLNA also hosts plugfests, sessions where engineers gather to test interoperability between different devices. Meanwhile, companies such as Intel and Microsoft are pursuing their own home-networking standards initiatives.

  • "U.N.: Domestic Robots to Surge to 4.1 Million by 2007"
    Associated Press (10/20/04)

    The United Nations' annual World Robotics Survey, issued on Oct. 20 by the U.N. Economic Commission for Europe (UNECE) and the International Federation of Robotics, expects the number of domestic robots in use to skyrocket to around 4.1 million by the end of 2007, thanks to falling prices. The report estimates that 607,000 domestic robots--the majority of them automated vacuum cleaners--were in use at the end of last year. In the first six months of 2004, business orders for robots exceeded those recorded for the same period a year earlier by 18 percent, mostly in North America and Asia. The UNECE thinks demand for industrial robots could soon be outpaced by demand for domestic robots. The majority of the 250,000 robots in use in the European Union at the end of 2003 were in France, Germany, and Italy, while demand for robots from North American businesses increased 28 percent to reach about 112,000; price drops have also encouraged greater robotics investment in richer developing nations such as Brazil, Mexico, and China. The study reckons that approximately 21,000 "service robots" programmed for chores such as milking cows, assisting surgeons, and handling toxic waste are currently in use, and that number is expected to surge to 75,000 by 2007. The report also anticipates the emergence of robotic caretakers, surgeons, firefighters, and hazardous-area inspectors by 2010. iRobot CEO Colin Angle admits that skepticism remains the biggest impediment to the proliferation of domestic robots, but adds that greater affordability and demonstrated practical applications have begun to build momentum in the household automation sector.
    Click Here to View Full Article

  • "For an Inventor: IM Opens a Window to a World of Games"
    New York Times (10/21/04) P. E4; Weingarten, Marc

    Gaming prodigy Jules Urbach has created a platform for instant-message-based video games and other applications that he plans to offer free to hobbyist developers and others. Urbach says the Otoy game engine is the key to leveraging instant messaging for a multitude of purposes, including huge multiplayer games that are free. "What I've always been most interested in is the idea of a virtual community, and AOL had the first chat room and IM," he says of his admiration for America Online's sometimes derided approach to the Internet. Urbach is a co-founder of video game firm Groove Alliance, which makes low-memory, online 3D games for clients such as Nickelodeon, Disney, Shockwave, and Electronic Arts; he is currently designing a Star Trek-like game for the Otoy platform that will be run in a window linked to the users' instant-messaging application, so that numerous players can be involved in the game simultaneously and use a separate window to chat with each other. Urbach says his Otoy games are highly componentized and could provide fertile ground for advertisers who could, for example, paste clickable billboards on virtual spaceships: "Each piece in a game can be a separate, encrypted stream," Urbach notes. Otoy will be made available as a free download next year, and Urbach hopes individual developers will use it to create applications that pull up Web browsers, MP3 files, Excel spreadsheets, or whatever other applications they can cook up. Urbach developed Hell Cab, one of the first CD-ROM games that became a best seller in 1992, and created the first 3D video game using Macromedia Director software.
    Click Here to View Full Article
    (Articles published within 7 days can be accessed free of charge on this site. After 7 days, a pay-per-article option is available. First-time visitors will need to register.)

  • "Tech Defense Roundtable"
    CNet (10/18/04); Barksdale, Jim; Schneier, Bruce; Berman, Jerry

    Technology plays a key role in guarding the United States against terrorism, but business, civil rights, and technology experts have important advice about how best to use technology to protect the country. The Markle Foundation Task Force on National Security in the Information Age brought together technologists, national security experts, and civil liberties advocates to draft a framework for information sharing. Unlike centralized database approaches, the Markle Task Force recommends a decentralized, trusted network capability that allows different government agencies to tap into a shared pool of information based on relevancy and authorization. This would allow government representatives at different jurisdictional levels and geographic locations to act quickly and collaboratively on emerging threats, says task force co-chair Jim Barksdale. Markle Task Force recommendations have been supported by the 9-11 Commission and incorporated into Senate legislation, and efforts should be made to address privacy issues upfront and through congressional channels, notes task force member and Center for Democracy and Technology President Jerry Berman. Counterpane Internet Security chief technology officer Bruce Schneier says it is virtually impossible to secure every possible terrorist target individually, arguing that the government should instead focus on intelligence gathering and analysis to help prevent terrorist strikes, as well as on communications technologies that allow emergency-response personnel to lessen the impact of possible attacks. Progress & Freedom Foundation senior adjunct fellow Solveig Singleton explains that securing the United States against terrorists is an enormous task requiring a careful balance between civil liberties and security, and achieving that balance means greater accountability, with judicial and legislative checks against executive authority as well as clearly delineated federal responsibilities.
    Click Here to View Full Article

  • "How Do You Think the Brain Works?"
    Fortune (10/18/04) Vol. 150, No. 8, P. 215; Stipp, David

    Jeff Hawkins, who counts the groundbreaking Palm Pilot and Treo among his credits, has never lost sight of his goal to understand how human intelligence works, and he provides a radical theoretical construct of such a mechanism in his book, "On Intelligence." This theory could become the foundation for revolutionary technologies such as computers that understand speech against noisy backgrounds, highly accurate surveillance and warning systems, automated collision-avoidance systems, and other applications involving machines imbued with human-like intelligence. Most of these systems will probably take the form of disembodied intelligences that use sensors to measure phenomena outside of human perception. Hawkins' hypothesis, the result of almost 20 years of research, posits that intelligence is derived from the brain's memory-based ability to construct an internal model of the world in order to make predictions that aid survival. A key insight Hawkins applied to his theory concerned the neocortex, which functions as the seat of conscious perception, thought, language, and purposeful movement in the brain. Johns Hopkins University researcher Vernon Mountcastle noted that the neocortex's six-layer neuronal architecture never changes even though different neocortical regions perform different operations, leading to speculation that their distinct tasks stem from how the neocortical areas are connected to other brain regions instead of from fundamental differences in how they function. Hawkins further speculated that the neocortex's chief function could be continually making predictions and expectations about sensory inputs by tapping memory, a notion that explains the brain's ability to effortlessly single out novel input, such as unfamiliar objects in a familiar setting. Hawkins boils creativity down to making predictions using analogy, arguing that "Finding a solution to a problem is literally finding a stored pattern in your cortex that is analogous to the problem you are working on."
    Click Here to View Full Article

  • "What the Future Holds"
    InformationWeek (10/18/04) No. 1010, P. 8; Ricadela, Aaron

    Jim Gray of Microsoft, Hewlett-Packard Labs director Dick Lampman, Palo Alto Research Center (PARC) President Mark Bernstein, Intel CTO Pat Gelsinger, IBM research director Paul Horn, and Sun Microsystems CTO Greg Papadopoulos discuss needed or expected changes in the computer industry over the next 10 years. All six computer scientists agree that speed is of the essence: Microprocessor speed, information retrieval speed, and product integration speed have to accelerate, while the United States must narrow the lead other countries have taken in churning out science and technology talent. Horn forecasts "a sea change in the way processors are designed" because computer performance's rate of expansion is overtaking chips' rate of increasing clock speed, which is winding down. Lampman expects silicon electronics to be supplanted by nanoelectronics in the long term, while Bernstein anticipates a major impact from organic electronics; Horn, however, does not see any clear replacement for silicon yet. The scientists are pursuing different routes to reach the goal of advanced home information interfaces for entertainment--Gelsinger says Intel is focusing on putting PC, digital video recorder, and game machine functionality on a single silicon chip, while HP is licensing technology to a Swiss chip manufacturer for a processor geared toward digital TVs and DVD players. The researchers generally concur that the PC still has a lot of life left in it, but the shortcomings of its hierarchical filing system are becoming more and more pronounced as the number of electronic documents increases. To address this problem, Papadopoulos is interested in the notion of a "personal network" of information, while Bernstein says PARC is looking into touchscreen displays and Lampman cites HP's efforts to market videoconferencing for the enterprise. Gray and many others believe the decline in science and engineering program enrollments at U.S. colleges will threaten the nation's competitiveness.
    Click Here to View Full Article

  • "Internationalizing Top-Level Domain Names: Another Look"
    CircleID (10/18/04); Klensin, John

    The internationalization of Internet language implies not only the availability of domain names that are entirely in languages other than English and other Latin-based scripts, but also standardization and internationalization of email addresses and resource locators. While the former is an easier goal to attain, the latter is also necessary to allow users access to the Internet entirely in their native languages and scripts. In this paper, Dr. John Klensin, formerly a principal research scientist at MIT, argues that in order to achieve internationalization of the Internet, officials must ask: "What should the user see (or enter) and what is the best way to accomplish that?" The goal cannot be effectively achieved at the protocol level or by changing the DNS system, he says. Generally, changes should not be sought in the realm of low-level network functions or DNS operators, but rather at the level of individual users' ability to use their own language when navigating the Internet. Klensin suggests translating or mapping top-level domain (TLD) names at the local rather than global level, an approach that allows browsers, mail user agents, and applications software to meet the specific needs of local users. This alternative would make it more feasible for user interface software to translate non-ASCII TLD names to standard form and vice versa. As a result, Chinese users could refer to a variety of TLDs using Chinese script, while French users could refer to the same domains using their own language.
    Click Here to View Full Article
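
    Klensin's local-translation approach can be illustrated with a toy sketch: user-interface software keeps a per-locale table that rewrites a native-script top-level label into its standard ASCII form before the name ever reaches the DNS, so no change to the global DNS is needed. The sample mappings below are hypothetical, not real registry assignments.

```python
# Hypothetical per-locale TLD translation table, applied in the user
# interface (browser, mail client) rather than in the DNS itself.
LOCAL_TLD_MAP = {
    "\u516c\u53f8": "com",  # Chinese 公司 ("company") -> com (illustrative)
    "\u7f51\u7edc": "net",  # Chinese 网络 ("network") -> net (illustrative)
}

def to_standard_form(domain):
    """Rewrite only the top-level label; all other labels pass through."""
    labels = domain.split(".")
    labels[-1] = LOCAL_TLD_MAP.get(labels[-1], labels[-1])
    return ".".join(labels)

std = to_standard_form("example.\u516c\u53f8")
```

    Because the table lives in local software, Chinese and French users can each see the same TLDs in their own script while the DNS continues to resolve a single standard form.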

  • "Machine Dreams"
    CIO (10/15/04) Vol. 18, No. 2, P. 76; Jahnke, Art

    Inventor and author Ray Kurzweil responds to concerns that technology will continue to boost productivity without spurring job growth with the observation that the employed percentage of the potential workforce as well as wages in constant dollars have increased dramatically in the last century as a result of automation-driven prosperity. He admits that these sorts of trends entail short-term job displacement, but says that people's increasing abilities to function in cyberspace is facilitating "a reallocation of mental work" that is a good thing for the global economy. Kurzweil also points out that the emergence of international cooperation in projects that require skills and education is a positive sign, as the resulting products will be universally beneficial. He expects information and fabrication methods to advance over the next 20 or so years to the point where nearly any physical object can be created at almost no cost. Kurzweil predicts that most IT departments will be free of "clutter" such as desktops by 2010 thanks to the proliferation of mobile, virtually invisible technology, broadband connectivity, and online access. These transformed departments will primarily concern themselves with security against malware. The inventor anticipates a merging of computers and the human body within the next few decades, starting with the noninvasive implantation of nonbiological intelligence and followed by the augmentation of cognitive capability via countless numbers of blood cell-sized devices that interact directly with neurons. An anti-aging advocate, Kurzweil is convinced that the means already exist to retard the aging process so that even baby boomers can maintain their health and vitality long enough for biotechnology to mature to the point where human bodies and brains can be rebuilt.
    Click Here to View Full Article

  • "Phones Pick Up Language"
    Technology Review (10/01/04) Vol. 107, No. 8, P. 22; Vatz, Mara E.

    Speech recognition is the interface of the future for mobile devices because it allows users to quickly input information and control functions without fiddling with small keyboards or scroll menus. Some mobile phones already use speech recognition for simple tasks such as dialing a preset number, and technology firms are now developing more advanced speech recognition systems that not only recognize a larger vocabulary but are also speaker-independent, meaning they do not need to be trained to understand a particular user's voice. Unlike traditional algorithms that identify whole words by their sound-wave patterns, speech recognition technology from VoiceSignal uses the smallest units of recognizable speech, called phonemes, to identify stored words; this allows a far greater vocabulary, since the device does not have to store an entire audio file for each word, only the phonemes and their arrangements into particular words. An expanded vocabulary allows VoiceSignal to offer innovative new speech recognition applications on handheld devices, such as software that provides access to specific functions just by speaking a phrase. Telling the phone to send a text message to a particular stored number would accomplish what would normally take up to 10 clicks on a button interface. The company plans to release text-message dictation software in the coming months, and is eyeing the Asian market, where users have to deal with thousands of characters instead of an alphabet. Mitsubishi Electric Research Laboratory is researching speech recognition for use with consumer electronics, especially for choosing functions. Mitsubishi researcher Peter Wolf says speech recognition is well suited to making selections, since users only have to call up the name of a song, for example.
    Click Here to View Full Article
    (Access to this article is available to paid subscribers only.)
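
    The phoneme-based approach can be sketched as a lexicon keyed by phoneme sequences rather than by stored audio. The ARPAbet-style symbols and the tiny vocabulary below are illustrative assumptions, not VoiceSignal's actual data; the point is the storage saving of keeping symbol sequences instead of an audio template per word.

```python
# Toy phoneme lexicon: each word is stored as a compact phoneme sequence
# (ARPAbet-style symbols, chosen for illustration), not an audio file.
LEXICON = {
    "call":    ("K", "AO", "L"),
    "message": ("M", "EH", "S", "IH", "JH"),
    "home":    ("HH", "OW", "M"),
}

def recognize(phonemes):
    """Return the word whose stored phoneme sequence matches, if any."""
    seq = tuple(phonemes)
    for word, stored in LEXICON.items():
        if stored == seq:
            return word
    return None

word = recognize(["K", "AO", "L"])
```

    A real recognizer scores probabilistic phoneme hypotheses against the lexicon rather than demanding an exact match, but the lexicon structure is the same idea.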

  • "Fixing the Vote"
    Scientific American (10/04) Vol. 291, No. 4, P. 90; Selker, Ted

    The 2000 U.S. presidential election was characterized by massive voter disenfranchisement owing to registration database errors, equipment foul-ups stemming from poorly designed ballots, and aggravation at polling places due to long lines and other discouraging holdups. Election officials are upgrading to direct record electronic (DRE) voting machines as a way to avoid similar problems, but these products must simplify the election process, reduce errors, and remove the possibility of fraud if they are to be truly effective. Complicating the situation are internal and external security issues with e-voting machines, a lack of industry-wide standards, disparate testing and usage policies for voting equipment across local jurisdictions, implementation problems, and user interface difficulties. It is recommended that a state or county considering the purchase of DREs recruit expert testers to rigorously evaluate the user interface and audit the systems for security holes, glitches, and malware; furthermore, election officials and poll workers should become familiar with the machines and their operations while following practices that uphold the security of votes, such as election-day testing with dummy precincts. Also, steps must be taken to prevent wide-scale voter disenfranchisement by streamlining and improving the voter registration process and polling-place practices overall. DREs should be tested for "Easter eggs"--pieces of code that are not apparent to a program reader--on election day, while strategies that support reliable computing in financial transactions and other scenarios can be applied to voting systems. Other e-voting security strategies favored by researchers and critics include encryption methods and open-source schemes, self-checking computer agents that conduct internal audits to confirm each step in the voting process, and the incorporation of a voter-verified paper ballot. However, MIT's Ted Selker claims this last measure is complicated for both voters and election officials and is also exploitable by fraudsters, and he recommends vote verification via recorded audio feedback as a better option.
    Click Here to View Full Article

  • "Nanotechnology: The Revolution Has Begun"
    Military & Aerospace Electronics (09/04) Vol. 15, No. 9, P. 20; McHale, John

    Scientists at the NASA Ames Research Center foresee useful nanotechnology applications for space exploration in the next 10 to 15 years. In a white paper, NASA Ames Center for Nanotechnology director Dr. Mya Meyappan and colleagues predict the emergence of intelligent, reconfigurable, and autonomous spacecraft, leading to networks of deployable, miniature planetary probes and micro-sized vehicles that can perform an array of measurements. Frost & Sullivan's 2003 Nanotechnology Marketplace Analysis report expects carbon nanotubes (CNTs) to be embedded in microprocessors within 10 to 15 years, resulting in faster and less power-consumptive computer chips as well as less expensive chip-making thanks to CNTs' ability to self-assemble. CNTs are particularly exciting to researchers with their potential to revolutionize spaceflight by contributing to digital nanoelectronics, integrated aerospike engines, lithium batteries and fuel cells, smart materials such as electronically operated flight surfaces, H2 storage, and composite aeroshells. Meyappan attributes the long expected wait for practical nanotech applications to the need for tech breakthroughs to be "shaked and baked in a lab for 10 to 12 years before people have the ability to make large quantities with large scalable manufacturing." Frost & Sullivan analysts list electronics and computing, chemicals and materials, defense and security, medical and health care, fabrication and instrumentation, and consumer goods as the major emerging nanotech markets. The Frost & Sullivan report cites National Science Foundation estimates projecting that the market for nanotech-enabled goods and services could be worth $1 trillion by 2015, with defense and aerospace accounting for $70 billion.
    Click Here to View Full Article

  • "The Best-Kept Secret?"
    Software Development (10/04) Vol. 12, No. 10, P. 51; Ambler, Scott W.

    Vendors, academics, and industry organizations continuously criticize the modeling element of software development as a dysfunctional practice hindered by complexity and complicated techniques, when in fact it is a highly efficient process that uses simple tools and methods. Model-storming sessions are typically off-the-cuff, two- to three-person affairs in which team members investigate a problem using a shared modeling tool--a whiteboard, for instance--until understanding is reached, after which they resume their projects. "Agile Database Techniques" author Scott W. Ambler contends that it is high time to start discussing the model-storming process in detail in order that the practice may be refined. He recommends that companies support model storming by providing sufficient available whiteboard space while ensuring that the whiteboards are visible and readable to people from their workstations. Ambler notes that a protocol for whiteboard work must also be developed, so that decisions such as the appropriate time to erase something can be determined. The author offers a number of protocol suggestions at http://www.agilemodeling.com/essays/whiteboards.htm. Ambler also emphasizes the role of team culture in a model-storming session's success. Team members must not feel uncomfortable at the prospect of assisting or requesting help from others.
    Click Here to View Full Article

  • "The Incredible Shrinking Man"
    Wired (10/04) Vol. 12, No. 10, P. 178; Regis, Ed

    Nanotechnology godfather K. Eric Drexler's star has fallen dramatically in the last few years, culminating in December 2003 when Rice University chemist Richard Smalley publicly discredited Drexler's celebrated concept of molecular manufacturing and accused Drexler of fear-mongering with his theory that self-replicating machines could run amok and destroy the world. Another defeat came shortly afterward with President Bush's ratification of the 21st Century Nanotechnology Research and Development Act, which eliminated rather than prioritized funding for molecular manufacturing. The version of the bill the House originally passed contained a stipulation to conduct research into molecular manufacturing provided a successful feasibility study was carried out first, but this provision was jettisoned in a later draft when support for Drexler's concept waned. The White House's Office of Science and Technology Policy got nervous that the public would respond to nanotech with hostility because of doomsday scenarios suggested by Sun Microsystems' Bill Joy and sci-fi author Michael Crichton, among others. Additional pressure came from the NanoBusiness Alliance, which was more interested in the development of nanomaterial-enhanced commercial products than molecular assembly. Drexler's bitterness at this turn of events stems from his long-term personal commitment to molecular manufacturing and revolutionary applications such as nanoscale disease treatment, pollutant cleanup, and rapid fabrication of virtually any substance out of ordinary components--applications that Drexler feels have been abandoned in favor of commercial interests. He even goes so far as to speculate that this rejection could ultimately destroy the United States as a world power, arguing that "In a competitive world, suppression of research in molecular nanotechnology is the equivalent of unilateral disarmament."
    Click Here to View Full Article

  • "The Internet of Things"
    Scientific American (10/04) Vol. 291, No. 4, P. 76; Gershenfeld, Neil; Krikorian, Raffi; Cohen, Danny

    Enabling everyday objects to seamlessly connect to a data network will involve resolving differences between competing device interconnection standards by applying the same internetworking principles that allowed heterogeneous networks and computers to be integrated into the homogeneous Internet. The Internet-0 (I0) project was launched with the goal of interdevice internetworking in mind, write researchers Neil Gershenfeld, director of MIT's Center for Bits and Atoms; Raffi Krikorian, an MIT graduate student and leading developer of I0; and Danny Cohen, a Sun Microsystems Distinguished Engineer and one of the fathers of the Internet. Meeting this formidable challenge requires adherence to seven central tenets. The first principle is for each I0 device to use Internet Protocol (IP), while the second is to deploy the communications protocols in unison rather than separately. The third maxim posits that two I0 devices can operate without the presence of a third device, because the data and routines each device needs are stored in the device itself rather than within a central server. The fourth principle requires that each I0 device manage its own identity, while the fifth dictates that the bits I0 transmits should be "bigger" than the network itself, each bit lasting longer than a signal takes to cross the network, so that timing mismatches between dissimilar interfaces cease to matter. The use of big bits dovetails with the sixth principle, in which data encoding remains identical regardless of the physical media used to transmit the data. The employment of open standards constitutes the final maxim. The resulting "Internet of things" could support such innovations as simple light and switch configuration by homeowners; medicine cabinets that automatically remind owners when to take their medication; and automatic synchronization of clocks over the network to the global time standard.
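    The sixth principle--one encoding for every physical medium--can be illustrated with a minimal sketch. The code below is not the actual I0 specification; it is a hypothetical Manchester-style encoder in which each bit becomes a self-clocking symbol pair, so the same symbol stream could in principle drive a wire, an LED, or a speaker unchanged.

    ```python
    def manchester_encode(data: bytes) -> list[int]:
        """Encode bytes as Manchester-style symbol pairs:
        bit 0 -> (1, 0), bit 1 -> (0, 1).
        Every bit carries its own clock transition, so no
        medium-specific timing agreement is needed."""
        symbols = []
        for byte in data:
            for i in range(7, -1, -1):       # most significant bit first
                bit = (byte >> i) & 1
                symbols += [0, 1] if bit else [1, 0]
        return symbols

    def manchester_decode(symbols: list[int]) -> bytes:
        """Invert manchester_encode: read symbol pairs back into bytes."""
        bits = [1 if (symbols[i], symbols[i + 1]) == (0, 1) else 0
                for i in range(0, len(symbols), 2)]
        out = bytearray()
        for i in range(0, len(bits), 8):
            b = 0
            for bit in bits[i:i + 8]:
                b = (b << 1) | bit
            out.append(b)
        return bytes(out)
    ```

    Because the symbol stream is the same regardless of medium, an IP packet encoded this way needs no translation when it crosses from one physical interface to another, which is the point of the sixth tenet.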

  • "The Myth of Mind Control"
    Discover (10/04) Vol. 25, No. 10, P. 40; Horgan, John

    A holy grail for many neuroscientists and researchers working on brain-machine interfaces is to translate the neural code of the human brain into clearly delineated mechanisms that underlie perception, memory, decisions, and emotions--mechanisms that could be tapped to enhance, modify, or perhaps direct a person's cognition and behavior. This brings to mind the sinister ramifications of mind control, as well as more positive augmentations such as improved mind/body awareness and better treatment and prevention of neurological disorders or damage. However, the challenge is formidable, as the brain is astronomically more complex than any computer currently in existence. Duke University researcher Miguel Nicolelis is confident that a certain amount of neural decoding is possible, at least enough to result in practical neural prostheses for disabled people; but he doubts that science will be able to decipher the language of meaningful thought and memory, which is unique for each individual. Although specific neurons can perform specific functions, recent research has shown that neurons can be re-tasked for other operations and are continuously undergoing behavioral shifts throughout a person's life, while numerous theories contradict the assumption that a neuron's firing rate is its only means of transmitting data. Analysis of the motor cortex demonstrates that the brain can erect new coding architectures in response to unusual circumstances. Such revelations are causing cracks to appear in long-cherished beliefs in the existence of a simple neural coding scheme, thus complicating assumptions about the potential of neural decoding. But neuroscientists are hopeful that the brain's adaptability will ensure the ultimate success of neural prosthetics based on approximations rather than precise models of brain cells.
    Click Here to View Full Article

