HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.

ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 5, Issue 528:  Monday, August 4, 2003

  • "Wonders Aplenty at ACM's SIGGRAPH"
    Wired News (08/01/03); Stroud, Michael

    The Emerging Technologies exhibition at this year's SIGGRAPH showcases 21 state-of-the-art innovations, including a walk-through fog screen onto which images can be projected, various haptics technologies, and novel musical devices. These cutting-edge technologies have uses that run the gamut from pure entertainment to practical applications. Sony Computer Science Laboratory's Continuator is designed to recognize musical patterns played on a keyboard and improvise compositions based on its accumulated knowledge. French researcher Jean-Julien Aucouturier says Continuator appeals to children and jazz pianists alike--the former like its entertainment value, while the latter see the device as an aid to improving their own compositions. Another musically oriented exhibit is a Japanese-developed "music table" that allows users to construct musical phrases by shuffling decks of cards. More practical emerging technologies highlighted at ACM's SIGGRAPH conference include Arc Science Simulations' OmniGlobe, an acrylic orb that displays "spherical data" on its surface and is controlled by a trackball; and an inexpensive, single-chip visualization tool from Canesta that produces real-time 3D images of nearby objects, enabling, for example, a working keyboard to be projected onto any flat surface. Exhibits centered on haptics include an artificial "skin layer" designed to enhance the wearer's tactile sensations through electrodes. Also on hand at SIGGRAPH is the City University of Hong Kong's Body Brush, in which computers and cameras work in tandem to translate people's movements into projected 3D images.
    Click Here to View Full Article

  • "The Age of Automation"
    CNet (07/31/03); Kanellos, Michael

    Automation is a running theme at the annual ACM SIGGRAPH conference, as researchers refocus their energies on developing machines that will take over routine tasks to relieve human beings of everyday burdens, a trend that Michael Kanellos terms "extroverted computing." A consortium of universities in England, Germany, Switzerland, Finland, and Sweden is working on sensor technology for the Smart-Its Project, which seeks to aid people in daily tasks. One Smart-Its application would be intelligent build-it-yourself bookcases programmed to warn owners when they are diverging from the assembly instructions. Also on hand at SIGGRAPH was a demonstration of a prototype decorating application developed by researchers at New York University: The technology incorporates luminescent diodes, motors, and position-detectors to allow users to rearrange furniture in a room by pressing a button. University of Tsukuba scientists displayed a "food simulator" in which participants insert oral devices in their mouths. The devices reproduce the biting force needed to crunch through various foods, and are designed as an aid for people who have difficulty chewing. Meanwhile, the Defense Advanced Research Projects Agency (DARPA) is holding the DARPA Grand Challenge, a contest promising a $1 million prize to whoever invents fully automated, off-road robotic cars that can successfully make a 10-hour trip from Los Angeles to Las Vegas. Kanellos notes that Google, Microsoft, and other companies have rolled out or are developing extroverted computing technology for the virtual world, an example being Google's search engine, which is designed to be more responsive to naturally phrased queries.

  • "NCCUSL Pulls Support for Controversial UCITA Law"
    Computerworld (08/01/03); Thibodeau, Patrick

    The National Conference of Commissioners on Uniform State Laws (NCCUSL) has withdrawn its advocacy of the Uniform Computer Information Transactions Act (UCITA), attributing the reversal to rampant political resistance to the measure, which would establish default rules in software contracts. In a letter to the organization's commissioners, NCCUSL President King Burnett stated, "Clearly we are experiencing directed intense and incessant politics and strong opposition, without suggestion of concrete alternatives, from some consumer groups, insurance companies and libraries, and the allies they have accumulated." The opposition argues that UCITA would give vendors too much control over negotiating software licensing terms and conditions and would shield them from liability. UCITA was adopted by Virginia and Maryland in 2000, but no other state has embraced the bill since, which prompted the NCCUSL's recent decision. However, UCITA is far from dead: It remains in force in Virginia and Maryland, and Burnett noted that the measure appears to be exerting judicial influence, as evidenced by recent court rulings that cited the law. American Library Association legislative counsel and UCITA adversary Miriam M. Nesbit vowed that the opposition will remain active as long as the measure exists. Vermont, West Virginia, Iowa, and North Carolina have adopted anti-UCITA "bomb-shelter" legislation, and Nesbit said that UCITA opponents will continue the push for similar laws to be enacted elsewhere. Certain UCITA provisions could be incorporated into new or pending uniform laws, and NCCUSL's Carlyle Ring described the legislation as in "repose" rather than completely defunct.
    Click Here to View Full Article

  • "More Calls to Vet Voting Machines"
    Wired News (08/04/03); Witt, Louise

    Computerized voting technology is coming under increased scrutiny from decision-makers following a Johns Hopkins University report. The study uncovered serious flaws in the source code of Diebold touch-screen voting machines, more than 40,000 of which are currently in use in 37 states. At their annual meeting in late July, the National Association of Secretaries of State called for electronic voting machine standards to be created by the National Institute of Standards and Technology. Among the security lapses found in the Johns Hopkins study were the inclusion of a password in the source code, a lack of encryption, and other loopholes that would allow smart-card holders to cast more than one electronic ballot or let election officials tamper with tallies undetected. The researchers also warned that the machines were vulnerable to remote hacking. Other groups had already been calling for closer scrutiny of electronic voting methods, including the Ad Hoc Touch Screen Task Force assembled by the California Secretary of State. Stanford University professor and task force member David Dill says a paper audit trail should be used to ensure the validity of election results when computerized systems are used. Congress' Help America Vote Act, which has a 2006 compliance deadline, has given states the money and motivation to quickly buy new electronic voting machinery, while the IEEE has already formed a committee to create electronic voting standards. NIST spokesperson Kay Albowicz says the organization has not decided whether it will draft voting machine standards, while new National Association of Secretaries of State President Mary Kiffmeyer warned against a "rush to judgment" against existing touch-screen voting technology. She says, "Standards are being revised as new equipment comes along," adding that the process should be accelerated because states must soon decide which equipment to buy.
    For more information on e-voting issues, visit http://www.acm.org/usacm/Issues/EVoting.htm
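    The first flaw cited above--a secret embedded directly in the source code--can be sketched in a few lines. This is a hypothetical illustration of the hardcoded-secret anti-pattern the Johns Hopkins researchers describe, not actual voting-machine code; the key and function names are invented:

```python
# Hypothetical sketch of the hardcoded-secret flaw: the same key ships,
# identically, on every machine, compiled into the source code itself.
HARDCODED_KEY = "s3cret-key"  # visible to anyone with the source or binary

def authenticate_card(card_key: str) -> bool:
    # every voter smart card is checked against the same embedded secret
    return card_key == HARDCODED_KEY

# An attacker who reads the source knows HARDCODED_KEY, so a cloned or
# forged card passes the check just like a legitimate one.
print(authenticate_card("s3cret-key"))  # prints True
```

    Because the secret is identical everywhere and recoverable from the code, there is no per-machine or per-election credential to revoke once it leaks--the weakness the report says invites multiple-ballot and tally-tampering attacks.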

  • "End of the Road For SMTP?"
    CNet (08/01/03); Festa, Paul

    Internet experts are clamoring to replace the email backbone protocol, Simple Mail Transfer Protocol (SMTP), or alter some of its aspects in order to stem the flow of spam. Spam volume is now estimated at as much as 50 percent of that of legitimate email. Academic Suzanne Sluizer, who co-authored the 1981 Mail Transfer Protocol that immediately preceded SMTP, says the trustworthy protocol is simply too trusting: It was created for research purposes and was to be deployed among the relatively few institutions connected to Arpanet, not in today's commercial Internet jungle. Drawing on that experience, Sluizer suggests that any patch fixes to SMTP will eventually prove more difficult than starting again from scratch. Others, however, doubt there will ever be enough critical mass to replace SMTP entirely, given its worldwide user base, and are instead pushing enhancements such as authentication for SMTP. Internet Mail Consortium director Paul Hoffman says he wrote one possible fix, called SMTP over SSL/TLS, though he admits that establishing relationships between individuals and mail servers would be difficult. Currently, lists that identify spam-sending mail servers are compiled by volunteers and could be a bulky component of an authentication solution. The Internet Engineering Task Force has commissioned a research group to look into fixes or possible replacements for SMTP, though many of the group's members appear to favor an additional protocol that would build atop SMTP. Microsoft is putting its considerable email-world heft behind an entirely different solution: changing the domain name system so that message header addresses can be reliably compared with the actual originating address.
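    SMTP's trusting nature is visible in the envelope exchange itself. The sketch below lists the commands a client sends in a basic SMTP session; nothing in the protocol verifies the address the client asserts in MAIL FROM, which is the opening spammers exploit. Addresses here are placeholders:

```python
# A sketch of the basic SMTP envelope exchange. The server accepts whatever
# "MAIL FROM" address the client asserts -- the core protocol has no step
# that authenticates the sender's identity.
def smtp_session(claimed_sender, recipient, body):
    # the command sequence a client would send, in order
    return [
        "HELO mail.example.com",
        f"MAIL FROM:<{claimed_sender}>",   # taken on faith by the server
        f"RCPT TO:<{recipient}>",
        "DATA",
        body,
        ".",
        "QUIT",
    ]

# A spammer can claim any sender address at all:
cmds = smtp_session("anyone@forged.example", "victim@example.org", "Buy now!")
print(cmds[1])  # MAIL FROM:<anyone@forged.example>
```

    The authentication proposals mentioned above all aim at this gap: tying the claimed sender to something verifiable, whether a TLS certificate or, in Microsoft's approach, a DNS record for the originating domain.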

  • "Supercomputing's New Idea Is Old One"
    New York Times (08/04/03) P. C1; Markoff, John

    The custom-designed vector supercomputer pioneered by the late Seymour Cray fell out of fashion in favor of machines built from low-cost, mass-produced microprocessors. But Burton J. Smith, founder of Tera, which acquired Cray Research three years ago, is reviving Cray's breakthrough model, which has drawn the attention of government, industrial, and academic researchers racing to build the world's fastest supercomputer. The rise of massively parallel processing (MPP) machines is attributed to a change in government policy: The end of the Cold War reduced the urgency of maintaining the United States' supercomputing race with Japan and the now-defunct Soviet Union, while the MPP concept itself popularized the view that off-the-shelf technology could make supercomputing sustainable. In recent months, supercomputing experts have testified before Congress that the federal government should increase supercomputing spending, warning that massively parallel computers are inadequate for programs such as the Accelerated Strategic Computing Initiative, which is supposed to keep tabs on the U.S. nuclear stockpile and simulate nuclear detonations. Perhaps the most telling example of MPP machines' shortcomings was the 2002 launch of a Japanese supercomputer based on the Cray design philosophy, a machine that remains the reigning supercomputer speed champion. Cray's stock, both literal and figurative, has climbed thanks to recent events, including its selection by the Defense Advanced Research Projects Agency to develop prototypes of next-generation supercomputers that can reach petaflop speeds. One Cray design, dubbed Cascade, could attain such speeds by 2010.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

  • "Senator: ICANN Crucial to 'Net Security"
    TechNews.com (07/31/03); MacMillan, Robert

    At the Senate Commerce Subcommittee on Communications hearing concerning ICANN, the subcommittee's chairman Sen. Conrad Burns (R-Mont.) expressed the view that ICANN should increase its responsibility to the public. Burns indicated concern about the security of DNS, given attacks on the 13 root servers that occurred last October. "The bottom line is that ICANN must be a part of the security solution and not part of the problem," Burns noted. ICANN head Paul Twomey reported that tests on servers and other measures have been underway since ICANN created a security committee in the wake of the 2001 terrorist attacks against the United States. ICANN, however, can only manage systems to a certain degree, given the ownership of many aspects of DNS by private entities. On the issue of ICANN's revamping to better manage its responsibilities, Twomey noted that the effort "has essentially been completed," but acknowledged that ICANN is constantly developing.
    Click Here to View Full Article

  • "Inventor Designs Sign Language Glove"
    Associated Press (08/03/03); Hartman, Carl

    George Washington University researcher Jose Hernandez-Rebollar's AcceleGlove is a system in which a glove outfitted with sensors, paired with a wearable computer, converts American Sign Language (ASL) gestures into spoken words or text as an aid to the deaf and hard of hearing. Hernandez-Rebollar notes that his invention is more advanced than others because it can also translate some of ASL's more complicated arm and body movements into words and simple phrases. The glove can recognize the signs for all 26 letters of the alphabet, allowing any word to be spelled out, although the process is slow. So far, the device can translate fewer than 200 words that can be signed with one hand, plus a limited number of simple sentences. Institute for Disabilities, Research, and Training director Corinne K. Vinopol thinks the AcceleGlove could be especially useful for deaf parents with hearing children as well as hearing parents of deaf children; of particular interest to Vinopol is Hernandez-Rebollar's work to enhance the AcceleGlove so that it can translate ASL into Spanish as well as English. The AcceleGlove's inventor believes a one-handed version could hit the market as early as 2004, with a more sophisticated two-handed model debuting the following year. Hernandez-Rebollar adds that the glove could be integrated with existing wireless gear and used as a vibration- or text-based communications device that lets squad commanders relay orders to concealed soldiers.

  • "Six Degrees of Exploitation?"
    Wall Street Journal (08/04/03) P. B1; Bulkeley, William M.; Wong, Wailin

    Visible Path, Spoke Software, and ZeroDegrees are readying software that can analyze the email correspondence, electronic calendars, address books, and instant-message buddy lists of a company's employees to trace their relationships with outside contacts that could potentially generate new business. Visible Path President Anthony Brydon says the proliferation of cell phones and 10-digit dialing has helped make relationship-mining programs more valuable by spurring users to computerize contact information rather than memorize it. Visible Path's offering can, for instance, allow a company salesperson to find a co-worker with a potentially valuable outside contact; the software identifies the contact but keeps the co-worker's identity anonymous. The salesperson can then email the co-worker a request for an introduction, which the co-worker can either accept or refuse. Both Spoke and Visible Path's software boast such denial options, which makes them less intrusive than earlier sales-software programs that tried to force workers into giving up their contact data, notes New York University telecommunications professor Clay Shirky. Another key selling point of relationship-mining software is its ability to gauge the strength of the relationship between workers and their contacts--Visible Path, for example, features a Relationship Mining Engine that assigns a stronger relationship if the worker possesses the contact's cell-phone number rather than just an office number, or if the contact responds regularly to the worker's emails. More sophisticated relationship-mining programs are expected to be able to find contacts separated by more than one degree. The privacy implications of such software are likely to provoke controversy, although Stan Wasserman of the University of Illinois joins Shirky in praising Visible Path for its privacy safeguards.
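    The strength-scoring idea described above can be sketched as a simple weighted heuristic. This is a hypothetical illustration in the spirit of Visible Path's Relationship Mining Engine; the actual engine's inputs and weights are not public, and the numbers below are invented:

```python
# Hypothetical relationship-strength heuristic: a cell-phone number and
# regular email replies each raise the inferred closeness of a contact.
def relationship_strength(has_cell_number, has_office_number, reply_rate):
    """reply_rate: fraction of the worker's emails the contact answers."""
    score = 0.0
    if has_office_number:
        score += 1.0           # baseline: any direct number at all
    if has_cell_number:
        score += 2.0           # a cell number implies a closer relationship
    score += 3.0 * reply_rate  # regular replies imply an active relationship
    return score

# A contact whose cell number you hold and who answers most of your email
# outranks one known only through an office number:
assert relationship_strength(True, True, 0.8) > relationship_strength(False, True, 0.0)
```

    Ranking co-workers' contacts by such a score is what lets the software surface the strongest path to a prospect while still keeping the intermediary anonymous until he or she agrees to make the introduction.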

  • "Virtual Reality Conquers Sense of Taste"
    New Scientist (07/30/03); Ananthaswamy, Anil

    Researchers at the University of Tsukuba in Japan have developed a virtual reality device that simulates the taste of food and the way food feels in the mouth. Researchers have already been able to simulate vision, hearing, touch, and smell, but not the sense of taste, which combines chemical cues with the feel and sound of food in the mouth. The device makes use of a thin-film force sensor placed in the mouth to measure and record the force needed to bite through a piece of food, biological sensors made of lipid and polymer membranes to record the chemical makeup of the food's taste, and a microphone that records the vibrations of the jawbone while chewing. Cloth and rubber cover the mechanical part of the simulator, which resists the bite in a manner similar to real food. The device also uses a thin tube that shoots a mixture of flavorings onto the tongue to stimulate basic taste sensations, and a tiny speaker plays back the sound of chewing in the ear. Cheese, crackers, confectionery, and Japanese snacks have been among the foods simulated with the device, but the team still must find a way to use a vaporizer to deliver smells to the nose. Hiroo Iwata and colleagues at the university presented their research at the recent SIGGRAPH 03 computer graphics and interactivity conference in San Diego.

  • "Behind the Blockbusters--Special Effects Tool Locks Characters Onto Film"
    National Science Foundation (07/31/03)

    Computer graphics must be matched precisely with real objects in movie footage, or else the camera's shifting frame of reference for those objects will ruin the effect. Motion-tracking software called Fastrack, from the National Science Foundation's Integrated Media Systems Center (IMSC), helps movie studios match computer-generated and real-life elements much more efficiently than previous methods. Effects artists choose a few features from the film, such as a doorpost or street sign, to serve as reference points for the computer-generated objects; the software's algorithms then match the computer graphics to each frame, processing one frame every few seconds. Effects artists still have to smooth over the combinations, but they say the technology has dramatically cut the time needed to create integrated segments. Special effects studio Rhythm & Hues has used the IMSC technology in the films "X-Men 2," "Daredevil," and the soon-to-be-released "Dr. Seuss' 'The Cat in the Hat.'" The IMSC, located at the University of Southern California, focuses on developing new multimedia technologies for security, communications, education, and entertainment; other innovations include 3D surveillance technology for use in airports and 10.2-channel movie theater sound systems. NSF's Mary Harper says, "The research at IMSC brings engineering, art, mathematics, psychology, and computer science together. The breakthroughs coming out of IMSC affect everyday life."

  • "Thought Leaders Gather to Direct the Reshaping of the Technology Workforce"
    PRNewswire (07/28/03)

    The Anita Borg Celebration and Technology Summit will take place Sept. 9, 2003, at Stanford University. The event will serve as an opportunity for top executives, visionaries, and technologists to celebrate the life and vision of Borg, who died of brain cancer in April. Borg was the founder of the Institute for Women and Technology, a non-profit organization that seeks to encourage more women to get involved in technology development and support women already in the industry. The Institute for Women and Technology was instrumental in bringing together top technology companies, organizations, and educational institutions for that purpose. Sun Microsystems CTO Greg Papadopoulos says he learned from Borg that the way technology is created needs to be addressed if there is going to be an impact on its social relevance. "Mostly, the technology we have been creating is created by nerdy white guys, so we get nerdy, sometimes not-so-useful technology," he explains. Packet Design Chairman Judy Estrin, San Diego Supercomputer Center director Dr. Fran Berman, National Academy of Engineering President Dr. William Wulf, Kara Swisher of the Wall Street Journal, Rebecca Roberts of Public Radio International, IBM Fellow Emeritus Dr. Fran Allen, Forum for Women Entrepreneurs founder Denise Brosseau, and ACM President Dr. Maria Klawe of Princeton University will be keynote speakers and panelists during the celebration. ACM-W, CRAW, Sally Ride's Imaginary Lines, and MentorNet will be among the organizations offering informational and educational exhibits at the summit.

  • "Software Vulnerabilities Fade But Never Disappear"
    InternetWeek (08/01/03); Keizer, Gregg

    Qualys CTO Gerhard Eschelbeck presented his "Laws of Vulnerabilities" at the Black Hat security conference in Las Vegas, detailing various conclusions about the behavior and longevity of software flaws drawn from a study of 1.24 million security holes analyzed over an 18-month period. Eschelbeck indicated that serious software flaws--SQL Slammer, Code Red, and the Microsoft Windows DCOM Remote Procedure Call (RPC) bug being examples--have an average half-life of one month. "Typically, within the first 30 days, only about 50 percent of the vulnerable systems are patched," observed Eschelbeck, who had originally expected a shorter half-life. The longevity of software bugs may also be attributable to companies still deploying servers running older, unpatched versions of software. Eschelbeck challenged the security industry to cut the vulnerability half-life roughly in half by this time next year. The existence of a half-life implies that some vulnerabilities are never completely eradicated, and Eschelbeck said this phenomenon directly relates to the persistence of bugs such as Code Red, which is returning in a more muted form. He pointed out that vulnerabilities considered less threatening than critical flaws have an even longer half-life. Eschelbeck also concluded that the most pervasive and serious security flaws are refreshed by hackers on a yearly basis, and that exploits for eight out of 10 vulnerabilities become available within 60 days of a bug's disclosure. Since July 30, Qualys' Web site has posted a real-time-updated list of the top 10 security vulnerabilities; the current edition features the Windows DCOM RPC flaw and four other Microsoft-related bugs.
    Click Here to View Full Article
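    The "half-life" figure above implies simple exponential decay in the unpatched population: if half the vulnerable systems are patched every 30 days, the fraction still unpatched after t days is 0.5 raised to t/30. A quick check of the arithmetic:

```python
# Exponential decay implied by a 30-day patching half-life: the fraction of
# systems still unpatched after `days` days.
def unpatched_fraction(days, half_life_days=30):
    return 0.5 ** (days / half_life_days)

print(round(unpatched_fraction(30), 2))   # 0.5   -- matches the 30-day figure
print(round(unpatched_fraction(90), 3))   # 0.125 -- 1/8 still unpatched at day 90
```

    The long tail of that curve is exactly why a half-life means some systems are never patched at all, which is what lets old worms like Code Red keep resurfacing.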

  • "VR Accommodates Reality"
    Technology Research News (08/06/03); Smalley, Eric

    Flight simulators and other virtual reality systems incorporate concrete elements to give artificial environments a ring of authenticity, and University of North Carolina and Disney researchers have developed a system that mixes real and virtual objects in an artificial reality. In such a hybrid virtual environment, people can, for instance, interact with real window drapes while viewing a virtual representation of their hands parting simulated drapes to reveal a simulated view out a simulated window. Benjamin Lok, currently at the University of Florida, says the system's core component is a technique for ascertaining when actual and artificial objects collide and supplying a realistic response. The shapes and positions of real objects in the virtual space are determined by a quartet of cameras and object-recognition software; the data gathered by the cameras is used to flesh out 3D shells that conform to the objects' shapes. When real and virtual objects collide, only the virtual objects deform or move so as to prevent overlap between objects and shells. The researchers used the environment to simulate a space shuttle payload assembly task that NASA engineers interacted with; Lok says the tests showed the system is an improvement over fully virtual environments for assessing hardware designs and planning assembly tasks. He adds that hybrid environments not only have a greater sense of authenticity, but their ease of installation allows them to be employed earlier in the design process. Lok says the researchers are currently working to improve the virtual representation of real objects, and believes that hybrid systems will not be ready for general applications for two decades. The researchers presented their work at ACM's Symposium on Interactive 3D Graphics in April.
    Click Here to View Full Article
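    The collision-response rule described above--only the virtual object ever moves--can be sketched with spheres standing in for the camera-derived shells. This is a minimal illustration of the principle, not the actual system, which handles arbitrary shell geometry:

```python
# Sketch of the hybrid-environment collision rule: when a virtual object
# overlaps the 3D shell of a real object, only the virtual object is
# displaced; the shell (i.e., the real object) never moves.
def resolve_collision(virtual_pos, virtual_radius, shell_center, shell_radius):
    # vector from the shell's center to the virtual object
    delta = [v - s for v, s in zip(virtual_pos, shell_center)]
    dist = sum(d * d for d in delta) ** 0.5
    overlap = (virtual_radius + shell_radius) - dist
    if dist == 0.0 or overlap <= 0.0:
        return virtual_pos  # no contact: nothing to do
    # push the virtual object out along the contact normal
    return [v + (d / dist) * overlap for v, d in zip(virtual_pos, delta)]

# a virtual sphere (radius 0.5) overlapping a unit-radius shell at the origin
print(resolve_collision([1.0, 0.0, 0.0], 0.5, [0.0, 0.0, 0.0], 1.0))
# -> [1.5, 0.0, 0.0]: pushed out until the surfaces just touch
```

    Keeping the real object fixed is what preserves the illusion: the user's hand and the drapes behave as physics dictates, and only the simulation yields.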

  • "Print a Hologram? Almost, Xerox Says"
    CNet (07/30/03); Borland, John

    A new Xerox technology that takes advantage of a common glitch in the laser printing process could be used to certify hard copies of printed documents, much like holographic stickers are used to authenticate driver's licenses and credit cards. Xerox researchers stumbled upon the technique, dubbed Glossmark, while trying to find a way to reduce differential gloss, a phenomenon in which melted toner forms glossy areas on the printout. The scientists discovered that this gloss can be tweaked and configured into consistent patterns that are invisible when the document is viewed head-on, becoming apparent when the document is held at a right angle to the light. "This does speak to something that is going to need to be addressed to ensure hard-copy security," notes International Data analyst Dan Corsetti. "There really is no reliable or affordable way of securing the content on hard-copy documents, apart from putting it in a vault and locking it up." However, Xerox has doubts about whether Glossmark, which will be unveiled on July 31, is viable as a security technology. Xerox lab manager Rob Rolleston comments that Glossmark could instead be utilized as an artistic tool, or be incorporated into greeting cards.

  • "Databases Get an XML Infusion"
    InfoWorld (07/28/03) Vol. 25, No. 29, P. 46; Udell, Jon

    Now that XML can represent both document data and database data, databases must be able to store and manage XML documents, a need vendors are addressing with hybrid SQL/XML relational database management systems (RDBMSes). OpenLink CEO Kingsley Idehen remarks that "human context is never stored in the RDBMS today," yet without it the context of the data and how it was received cannot be understood. The majority of enterprise data resides in file-system documents rather than relational databases, but moving business documents from existing formats to XML allows relationships to be threaded between document data and database data. There are several strategies for storing XML in the RDBMS: Documents can be stored as database columns, which is the easiest and most flexible method; they can be stored as objects; or they can be "shredded" into relational tables. The relational-table approach takes best advantage of the database's query engine and atomic update capabilities, but it is easier to map from SQL to XML than vice versa. All XML-oriented query approaches are based on XPath syntax, which is a component of Oracle, DB2, SQL Server, and OpenLink's Virtuoso, as well as of the World Wide Web Consortium's anticipated XQuery specification. Vendors are eager for XQuery 1.0 to be finalized, but their initial deployments of the standard will be less powerful than their current SQL/XML deployments. Within a few years, it should be possible for authorized parties to view core data and contextual metadata at all times, and to access both types of data through pan-enterprise queries.
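    The easiest of the storage strategies above--whole XML documents in a database column, queried with XPath-style paths--can be sketched in a few lines. SQLite and Python's ElementTree stand in for an enterprise RDBMS's native SQL/XML support; the table and element names are invented for illustration:

```python
# Sketch of the "documents as database columns" strategy: the XML document
# is stored whole in a text column, SQL retrieves it, and XPath-style
# navigation queries inside it.
import sqlite3
import xml.etree.ElementTree as ET

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, body TEXT)")
conn.execute("INSERT INTO docs (body) VALUES (?)",
             ("<order><customer>Acme</customer><total>250</total></order>",))

# SQL fetches the document; an XPath-subset query reads inside it
(body,) = conn.execute("SELECT body FROM docs WHERE id = 1").fetchone()
root = ET.fromstring(body)
print(root.findtext("customer"))  # Acme
```

    The flexibility comes at a price the article notes: because the database sees only an opaque column, queries inside the document bypass the relational query engine, which is exactly what the shredding-into-tables strategy recovers.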

  • "Computing: Quantum Bits and Silicon Chips"
    Nature (07/31/03) Vol. 424, No. 6948, P. 484; Hogan, Jenny

    Bridging the gap between quantum and desktop computers is the goal of University College London materials scientist Marshall Stoneham, who has received 3.7 million pounds to develop a quantum device that calculates efficiently, functions at higher temperatures than competing machines, and can be assembled with existing equipment. Current quantum computers store and manipulate quantum bits (qubits) either by exploiting an ion's energy state or by using the spins of atomic nuclei to represent 0 or 1. However, a quantum computer based on ion manipulation is extremely large, while a nuclear-spin-based device requires magnets cooled by liquid helium and cannot scale past a certain number of qubits because noise from neighboring molecules masks the result signal. A design for a silicon-based quantum computer was proposed in 1998 by Australian researcher Bruce Kane, who theorized that phosphorus-doped silicon could yield a device that stores qubits in the nuclear spins of the embedded atoms; the spins would be flipped by radio signals, while the interaction between neighboring atoms would be mediated by an electrode, enabling linked qubits to perform operations. Such a design could supposedly contain thousands of qubits manipulated by existing electronics. Kane's proposal inspired several attempts to build a silicon-based quantum computer, including an Australian effort that has yielded both top-down and bottom-up construction strategies and has modified the original concept to manipulate electron spins rather than nuclear spins. Stoneham's proposal would randomly distribute the embedded atoms within the silicon and employ laser light to manipulate electron spin; qubits would be connected by "control atoms" that, like the qubit electrons, can be excited by specific laser-light frequencies. Stoneham thinks his quantum computer should operate at temperatures above 4 K, and anticipates a working three-qubit system by 2004 and a processor featuring tens of qubits by the end of the decade.

  • "The End of Handicaps"
    eSchool News (07/03) Vol. 6, No. 7, P. 40; Kurzweil, Ray

    In an address to the CSUN 18th Annual Conference on "Technology and Persons with Disabilities," futurist and National Medal of Technology recipient Ray Kurzweil presented his vision of the sweeping technological changes he expects to take place over the next few decades--in fact, he argued that some of these changes have already begun. Kurzweil estimates that the rate of progress doubles every decade--by that reckoning, 21st century progress will be roughly 1,000 times greater than 20th century progress. Kurzweil envisions ubiquitous computers with always-on Internet connections, systems that allow people to fully immerse themselves in virtual environments, and artificial intelligence embedded into Web sites by 2010. The futurist also projects that 3D molecular computing will be a reality by the time Moore's Law reaches its limits, while nanotechnology will emerge by the 2020s. Kurzweil predicts that the human brain will have been fully reverse-engineered by 2020, which will result in computers with enough power to equal human intelligence. He forecasts the emergence of systems that provide subtitles for deaf people around the world, as well as listening systems also geared toward hearing-impaired users, while blind people should be able to take advantage of pocket-sized reading devices in a few years. Kurzweil believes that people with spinal cord injuries will be able to resume fully functional lives by 2020, either through the development of exoskeletal robotic systems or a technique to mend severed nerve pathways, possibly by wirelessly transmitting nerve impulses to muscles. All of these developments are expected to reach maturity and culminate in enhanced human intelligence by 2029.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)
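    Kurzweil's "1,000 times greater" estimate is just compounded doubling: a century holds ten decade-long doubling periods, and two to the tenth power is roughly a thousand.

```python
# Kurzweil's arithmetic: progress doubles each decade, so a century
# compounds ten doublings.
doublings_per_century = 10
multiplier = 2 ** doublings_per_century
print(multiplier)  # 1024, i.e. roughly 1,000 times one century's progress
```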
