
ACM TechNews is sponsored by AutoChoice Advisor. Looking for a NEW vehicle? Discover which ones are right for you from over 250 different makes and models. Your unbiased list of vehicles is based on your preferences and years of consumer input.
ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either AutoChoice Advisor or ACM. To send comments, please write to [email protected].
Volume 6, Issue 703:  Wednesday, October 6, 2004

  • "Software Disasters Are Often People Problems"
    Associated Press (10/04/04); Fordahl, Matthew

    Computer system failures--some of them with dire consequences--are usually rooted in human error rather than technology, especially as the systems grow more sophisticated. For example, data overload due to infrequent maintenance caused the shutdown of an air traffic control communications system in Southern California on Sept. 14. Analyst Joshua Greenbaum blames 90 percent of system crashes on poor training, implementation, or project execution, and many experts expect the situation to be exacerbated as more systems become dependent on other computers and more operations are automated by software. The success or failure of a project often hinges on how well an organization delineates its business processes and the redesign route it follows, and communicates these guidelines to the technical team. Other factors that can lead to failures include a shortage of strong leadership, and miscommunication with project developers stemming from inadequate resource allocation, little participation of stakeholders in planning sessions, and indifferent executives. ITKO CEO John Michelsen says developers should not be handed the responsibility of validating business requirements because their unfamiliarity with such documentation or cultural differences make them ill-equipped for the job. A 2002 National Institute of Standards and Technology study estimates that the U.S. economy is losing approximately $59.5 billion a year to software bugs, while about $22.2 billion could be saved through better testing. In addition to a lack of proper training, some employees may be working against projects because they perceive them as threats to their job security.
    Click Here to View Full Article

  • "Senate Wants Database Dragnet"
    Wired News (10/06/04); Singel, Ryan

    The Senate is expected to hold a final vote on a bill Wednesday night to create a network of interconnected government and commercial databases containing a huge volume of records on American citizens that could be instantly queried by federal counter-terrorist investigators. The draft of the bill, sponsored by Sens. Susan Collins (R-Maine) and Joseph Lieberman (D-Conn.), was based on recommendations of the 9/11 Commission, while the network it proposes uses the Markle Foundation Task Force's December 2003 report as a guide. The task force urged the deployment of anonymized technology, graduated echelons of permission-based access, and automated auditing software as anti-abuse measures. An appendix to the report recommended that the system should "identify known associates of the terrorist suspect, within 30 seconds, using shared addressees, records of phone calls to and from the suspect's phone, emails to and from the suspect's accounts, financial transactions, travel history and reservations, and common memberships in organizations." Center for Democracy & Technology director James Dempsey says the commercial records contained in the databases, being public records, will not offer such a broad scope of information. Rather than seek patterns in data warehouses in an attempt to spot terrorist activities, the system would look for any known information relating to a name provided by the investigator. Critics also say the proposal is too ambitious and relies too much on commercial data, raising the specter of civil liberty infringement. Meanwhile, Lee Tien of the Electronic Frontier Foundation says Congress is being "institutionally lazy" for failing to hold hearings on the proposal, and argues that data sharing would become a threat without widespread privacy and due process safeguards.
    Click Here to View Full Article

  • "Bye-Bye Blueprint: 3D Modeling Catches On"
    CNet (10/04/04); Becker, David

    Building information modeling (BIM), in which 3D computer models replace traditional 2D documents such as blueprints, is gaining ground in the field of architecture and building design. One of BIM's strongest points is its ability to incorporate real-world data to enhance the accuracy, quality, and speed of architectural projects, as well as the management of the finished buildings; but the technique's widescale adoption for more conventional projects hinges on the construction industry reorganizing itself around the advantages of 3D modeling and other technology. Architects, builders, and industry analysts agree that the speed of BIM adoption will depend on how thoroughly the existing processes that define how information is exchanged between parties are rethought. Traditional procedures entail redrafting of blueprints to satisfy clients' proposed changes as well as any revisions deemed necessary by the contractor, with the end result being a set of drawings handed to the building owner that may bear little resemblance to the finished building: "Architects haven't been trained to think collaboratively; they're only interested in handing off designs," says architect James Timberlake. BIM can provide a more accurate representation of the structure to the building owner by centering the building process on a digital document that is reworked throughout the design and construction stages. The building process is typically split up among relatively small firms for legal and other reasons, but BIM erodes this fragmentation; designers are thus required to take on responsibilities normally handed to contractors, and a framework to support this paradigm shift is needed, says consultant Kristine Fallon. She is convinced that the conservative building industry will eventually come around once BIM's benefits are perceived by enough people. For now, a key obstacle to broad industry acceptance of BIM is the lack of a single standard for data exchange.
    Click Here to View Full Article

  • "Uneven Equation"
    Daily Bruin (10/04/04); Fernando, Menaka

    Engineering schools have fewer female students than other fields of study for a variety of reasons, but the ranks of female engineers have slowly grown over the last few decades. UCLA's engineering enrollment has held at approximately 20 percent female over the last five years, though drops in new first-year and transfer students are likely to bring that figure down by a few points. The low number of women engineers is the result of problems in K-12 education, says education researcher Jane Margolis, who founded the computer science training program for the Los Angeles Unified School District. She notes that only 17 percent of the students taking the Advanced Placement Computer Science exam are girls, roughly the same percentage as female engineering students in colleges. UCLA materials engineering student Sophia Wong says a lot of the problem lies in people's perception of engineering: Wong did not realize what engineers did until her high school chemistry teacher encouraged her to pursue the subject and she had an opportunity to do research with a Stanford professor. "The thing to tell [prospective women engineers] would be that you can do anything with it--whether it is materials or electronics or the environment," she suggests. Interestingly, lesser-known engineering fields such as agricultural and environmental engineering draw far more women than do the mainstay mechanical and electrical engineering disciplines. University of Michigan professor Jacquelynne Eccles, who has studied uneven gender ratios in engineering, found that male students considered mathematics more useful than female students did.
    Click Here to View Full Article

  • "IT Industry Lags Behind Talent"
    Moscow Times (10/06/04); Levitov, Maria

    Despite Russia's enviable tradition of producing more workers with exceptional science and engineering skills than many nations, its domestic IT market is still a fledgling compared with those of countries such as India. Forrester Research estimates that Russia has as many as 40 percent more scientists per capita than Germany, England, and France; Auriga reckons that the 68,126 Russian graduates earning master's degrees in computer science and software engineering this year constitute an almost 7 percent gain over last year, while the potential IT labor pool may experience even higher growth because there are more graduates with advanced degrees in other, related engineering disciplines. But the expansion of Russia's IT sector is impeded by insufficient government support, meager investment, and low consolidation. Seventy percent of the country's IT market is currently committed to hardware sales, while Russia's outsourcing effort has barely begun. Yevgeny Butman with the Information & Computer Technologies Industry Association expects a gradual increase over the next few years in the percentage of revenue coming from government contracts, which currently stands at 30 percent. Meanwhile, $86 million in funding is expected to go toward the development of information and communications technology through the government-sponsored Electronic Russia program. Higher investment levels among IT companies are also anticipated, although Sergei Matsotsky with Information Business Systems cautions that "in the nearest future, the IT market will feel a sharp deficiency in investment necessary for growth." If the country fails to become adept at attracting investment, he says, the Russian IT market's development could grind to a halt, hamstringing its competitiveness both globally and nationally.
    Click Here to View Full Article

  • "Hacking 101: It's For Your Own Good"
    Charlotte Observer (10/05/04); Choe, Stan

    UNC Charlotte (UNCC) professors such as Bill Chu believe the best way to cultivate network security professionals is to "expose our students to dark side techniques so they gain insight on how bad guys can penetrate systems and how to effectively protect them." Chu teaches Vulnerability Assessment and System Assurance, an ethical hacking course whose homework includes assignments such as breaking into a computer network or spreading malware. Students enrolled in the course are required to sign a legal agreement in which they promise not to employ the techniques or information they learn for malevolent purposes. Russell Shackelford, who heads ACM's education board, notes that teaching students responsible, ethical behavior has been a difficult task for computer science and IT programs, and the usual strategy has been to teach a separate course on ethics that often bores students. More and more "white hat" hackers are being hired by businesses to attempt to crack corporate network security so that vulnerabilities can be spotted and remedied before malicious hackers can exploit them. At a recent UNCC lecture, a visiting professional white hat hacker told students that courses such as Chu's merely provide the tools for learning hacking skills, which cannot be cultivated without a student's own drive. "It goes to fundamental human curiosity," he remarked. Ethical hacking students often find work on companies' IT staffs.
    Click Here to View Full Article
    (Access to this site is free; however, first-time visitors must register.)

    For more on ACM's Education Board, visit http://www.acm.org/education/.

  • "Ubiquitous Network Society 'Around the Corner'"
    IDG News Service (10/05/04); Kallender, Paul

    Matsushita Electric Industrial President Kunio Nakamura opened CEATEC 2004 on Oct. 5 with the declaration that the ubiquitous network society is "just around the corner," and he predicted that much of the networked world will be Web-accessible to anyone, anywhere, at any time by the end of the decade. Nakamura said in his keynote speech that Japan is at the forefront of this transformation, since the country already possesses the three central building blocks of the ubiquitous network society: Network infrastructure, terminal equipment technology, and services. Over 50 percent of Japan's Internet subscribers use broadband, and Japanese schools provide one PC for every 10 students. By 2010 the Web site population will surpass 10 billion, the average connection speed to the home should exceed current asymmetric digital subscriber line rates by a factor of 10, and connection speeds ought to be up to 50 times faster than current Wideband Code Division Multiple Access network technology. Nakamura said the nationwide rollout of digital broadcasting in 2006 will broaden mobile phone functionality, enabling broadcasting of MPEG-2 digital content to phones and boosting their interactive and entertainment capabilities. Other predictions include the advent of a universal remote control for home appliances, audiovisual equipment, and home services; remote control and remote observation of household systems via a single mobile phone-based client; and integrated circuit tags for wirelessly networked objects. Nakamura cautioned that industry must fulfill its responsibility to deliver consumer products that are easy to use, secure, convenient, and practical. Manufacturers must also work to bridge the gap between technology haves and have-nots, implement solid privacy and intellectual property protection strategies, and avoid overtaxing the environment.
    Click Here to View Full Article

  • "Live-in Lab"
    Boston Globe (10/04/04) P. C1; Weisman, Robert

    MIT and independent research firm TIAX are running PlaceLab, a first-of-its-kind experiment to monitor how people could use technology in their homes: The collaboration has equipped a 950-square-foot condominium in Cambridge, Mass., with hundreds of hidden sensors linked by miles of cable. In addition to light, temperature, humidity, and water-flow sensors, cameras and microphones will gather data on how live-in volunteers use the home, and on how technology could be deployed to improve their experience. The project differs from other showcase homes in that it focuses mainly on daily-life applications. Light-emitting diodes flash when the outdoor temperature allows people to open windows, signals warn elderly residents near the bottom of the stairs, where falls commonly occur, and temperature microzoning technology will also be tested. Different groups of volunteers will alternately stay at the condominium, including singles, couples, and families of different age groups. Forrester Research says this type of lifestyle research is needed to improve health care, especially remote health care technology that will allow physicians to monitor health signs from afar using glucose meters, electronic scales, and other equipment. Technology embedded in the home could also help people maintain healthy lifestyles by reminding them to take medication or keep up with preventative care. TIAX founder Kenan Sahin says his company decided to invest $700,000 in the MIT House_n project because it hopes to reap some marketable technology applications: "We feel strongly that the American home is at an inflection point," he says, referring to the growing number of Baby Boomers, the spread of computers, and the need for integrated technologies.
    Click Here to View Full Article
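
    The LED example above--flashing a cue when outdoor conditions make opening the windows worthwhile--is the kind of rule such a sensor network can drive. Below is a minimal Python sketch of that sort of rule; the thresholds, sensor readings, and notify() hook are assumptions for illustration, not details of PlaceLab's actual software.

```python
# Minimal sketch of a sensor-driven comfort rule, in the spirit of the
# PlaceLab LED example above. Thresholds and the notify() hook are
# hypothetical illustrations, not details from the project.

COMFORT_LOW_C = 18.0   # assumed lower bound for "open the windows" weather
COMFORT_HIGH_C = 24.0  # assumed upper bound

def should_signal_open_windows(outdoor_temp_c: float,
                               indoor_temp_c: float) -> bool:
    """True when outdoor air is comfortable and meaningfully different
    from the indoor temperature, so opening a window would help."""
    comfortable = COMFORT_LOW_C <= outdoor_temp_c <= COMFORT_HIGH_C
    helps_indoors = abs(indoor_temp_c - outdoor_temp_c) > 1.0
    return comfortable and helps_indoors

def notify(message: str) -> None:
    # Stand-in for flashing an LED or pushing a signal to a display.
    print(message)

# Made-up sensor readings for one evaluation of the rule:
if should_signal_open_windows(outdoor_temp_c=21.5, indoor_temp_c=26.0):
    notify("Outdoor conditions are pleasant: consider opening the windows.")
```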

  • "A Cyberspace Odyssey"
    Globe and Mail (CAN) (09/30/04); Halal, William E.

    Momentum toward an "Intelligent Internet" is accelerating with the convergence of two trends: Commercial exploitation of the Internet and increasingly sophisticated human-computer interaction. The TechCast project believes a revolutionary communications interface is on the horizon, one that will support more convenient and conversational interplay between people and computers. The intelligent interface promises to help harness underused IT, if industry leaders, customers, and the public can overcome the discouragement of the dot-com bust and economic recession and commit themselves to innovation. Enterprises, entrepreneurs, government agencies, and academic researchers are busy developing technologies that are laying the groundwork for the interface: IBM's Super Human Speech Recognition System, Microsoft's program to cut speech recognition error rates, and other projects could help make reliable speech recognition commonplace by the end of the decade, while the CEO of Native Minds predicts that the Net will also be widely populated by virtual robots or avatars by 2010. Other significant projects that could contribute in the long run include a UCLA Web site that recreates ancient Rome in 3D; the emergence of low-power, inexpensive flat wall monitors; the use of artificial intelligence to direct the movements of virtual characters in computer games; IBM's autonomic computing program to make networks and servers self-configuring and self-healing; a Defense Advanced Research Projects Agency initiative to develop a super-intelligent adaptive computer system; and an Energy Department project to create a computer capable of deducing intention, recalling previous experiences, analyzing problems, and making decisions. If current trends are sustained, a simple version of HAL, the talking computer from "2001: A Space Odyssey," is expected by 2010.
    Click Here to View Full Article

  • "Cyber Center Targets Internet Plagues"
    NewsFactor Network (10/05/04); Martin, Mike

    Much as the Centers for Disease Control studies how to prevent and contain human illnesses, the National Science Foundation (NSF) is funding a new Center for Internet Epidemiology and Defenses (CIED) to study computer viruses and worms. The Internet's openness and efficiency may have led to its phenomenal success, but those qualities also pose its biggest challenge, says CIED project director and University of California computer science professor Stefan Savage. "Infection is spread via contact, and the Internet allows a host infected in one place to rapidly contact any other system on the planet," he explains. Outbreaks occur so fast that only fully automated defenses will be able to control them, which is why CIED is focusing on classes of computer infections, not just single versions of computer code. University of California at Berkeley International Computer Science Institute senior researcher Vern Paxson says creating defenses against a known infection is easy, but understanding entire classes of pathogens requires deep insight into how those infections behave and how their behavior differs from normal network activity. CIED will use technology such as "network telescopes" and "network honeyfarms" to monitor and measure ongoing Internet infections in real time in order to gather evidence. Eventually, the researchers expect to produce algorithms that can automatically create virus and worm signatures to inoculate systems. CIED is part of the NSF's $30 million Cyber Trust program, which aims not only to deal with current problems but also to create a more secure and resilient infrastructure for the future, notes NSF Cyber Trust program director Carl Landwehr.
    Click Here to View Full Article
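
    The summary above notes that the researchers hope to derive worm signatures automatically from traffic captured by network telescopes and honeyfarms. The Python sketch below illustrates one generic way such a signature could be extracted--a byte substring shared by most captured payloads; it shows the general idea only, not CIED's actual algorithm, and the payloads and parameters are made up.

```python
# Illustrative sketch: find a byte substring common to most captured
# payloads and use it as a crude content signature. This is a generic
# technique shown for illustration, not CIED's method.

from collections import Counter

def candidate_signatures(payloads, k=8):
    """Count, for every k-byte substring, how many payloads contain it."""
    counts = Counter()
    for p in payloads:
        seen = set()
        for i in range(len(p) - k + 1):
            seen.add(p[i:i + k])
        counts.update(seen)  # count each substring at most once per payload
    return counts

def best_signature(payloads, k=8, min_fraction=0.9):
    """Return a k-byte substring present in at least min_fraction of payloads."""
    counts = candidate_signatures(payloads, k)
    threshold = min_fraction * len(payloads)
    for sig, n in counts.most_common():
        if n >= threshold:
            return sig
    return None

# Toy usage: three "captured" flows sharing an invariant exploit string.
flows = [b"GET /a HTTP/1.0 \x90\x90EXPLOITCODE\x90",
         b"POST /b \x90EXPLOITCODE\x90\x90\x90",
         b"GET /c?x=1 EXPLOITCODE junk bytes"]
print(best_signature(flows))
```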

  • "Pitt, CMU Get Grant to Study Learning"
    Pittsburgh Post-Gazette (10/05/04); Chute, Eleanor

    Researchers at the University of Pittsburgh and Carnegie Mellon University will use a $25 million federal grant over five years to study how students learn. After testing a number of theories in real classrooms, the researchers will use part of the award from the National Science Foundation on technology for tracking which students ask for help and when, which students do not ask for help, and which students ask for too much assistance. "The neat thing is because we're tracking students before intervention and after, we can get a much richer picture of who benefits and how they benefit," says Kurt VanLehn, Pitt computer science professor and senior scientist at Pitt's Learning Research and Development Center. The researchers will also use the grant money to develop learning approaches, technologies to enhance learning, and a scientific base for what works. The project comes at a time when schools are required to use techniques that have been proven to boost academic performance, according to the federal No Child Left Behind Act. "The way the brain works and the way we learn is one of the most important and fundamental mysteries of science," says Kenneth Koedinger, associate professor of human-computer interaction and psychology at Carnegie Mellon.
    Click Here to View Full Article

  • "Navigating PCs With Pictures, Not Words"
    CNet (10/04/04); Kanellos, Michael

    Operating on the principle that people can recall pictures better than words, the experimental VisualID software automatically tags word processing files or spreadsheets with random graphical icons. A joint project of the University of Southern California (USC) and MIT, VisualID is designed to make navigation easier and faster by delivering an enhancement to text file names rather than a replacement. There is generally no innate link between the file content and the icon, although VisualID can assign similar, "mutated" icons to files with similar names, as well as a meta-icon to thematically related icons. Work with VisualID shows that people can remember icons even when they are meaningless, as the brain does not use such "scenery" for data visualization, but rather for visual search and memory, notes USC researcher and principal VisualID author J.P. Lewis. The software generates intentionally complex icons to prevent confusion and overlapping imagery, yet attempts to keep the icons from becoming too complex. In one study, users of VisualID took an average of 25.2 seconds to carry out four file searches, compared to 30.5 seconds for those using generic icons; another test showed that users remembered VisualID tags 37 percent of the time one day later, while users recalled generic tags 24 percent of the time. Furthermore, subjects could still accurately identify previously seen icons 80 percent of the time six weeks after the study. Lewis observes that "In one study, people were able to form much better than chance recognition memory for a set of 100 of these icons after seeing them once for a few seconds each," and he estimates that a person can learn and retain several hundred to several thousand icons in total.
    Click Here to View Full Article
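
    As a rough illustration of the idea described above--a deterministic icon derived from a file name, with similar names producing slightly "mutated" variants--here is a hedged Python sketch; it is not the published VisualID algorithm, and the pixel-grid representation and mutation rule are assumptions made for the example.

```python
# Rough sketch of filename-seeded icon generation in the spirit of
# VisualID: the same name always yields the same icon, and names that
# share a long prefix yield visually related ("mutated") icons.
# This is an illustration, not the published VisualID algorithm.

import hashlib
import random

def icon_for(filename, size=16, suffix_len=4):
    """Return a size x size grid of 0/1 pixels derived from the name."""
    stem, suffix = filename[:-suffix_len], filename[-suffix_len:]
    # The stem seeds the base pattern; the suffix perturbs only a few pixels,
    # so "report_v1.doc" and "report_v2.doc" look similar but not identical.
    base_seed = int(hashlib.sha256(stem.encode()).hexdigest(), 16)
    rng = random.Random(base_seed)
    grid = [[rng.randint(0, 1) for _ in range(size)] for _ in range(size)]

    suffix_seed = int(hashlib.sha256(suffix.encode()).hexdigest(), 16)
    rng2 = random.Random(suffix_seed)
    for _ in range(size):  # flip a handful of pixels as the "mutation"
        r, c = rng2.randrange(size), rng2.randrange(size)
        grid[r][c] ^= 1
    return grid

def render(grid):
    return "\n".join("".join("#" if px else "." for px in row) for row in grid)

print(render(icon_for("budget_2004_v1.xls")))
```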

  • "Super-Connected Users Could Aid IM Worms"
    IDG News Service (10/04/04); Roberts, Paul

    A study of instant messaging (IM) worms by former Hewlett-Packard researcher Matthew Williamson finds that such worms, which are propagated by "highly connected" users, move too swiftly to be effectively impeded by traditional antivirus measures; one possible way to slow or halt an IM worm epidemic is to choke off communications from such users. The "scale-free" network model that epidemiologists use to describe systems that are highly susceptible to virus infections even though not all members are connected to one another applies to IM networks as well. IM networks contain highly connected nodes, or users linked to many correspondents, and worms that infect highly connected users' machines therefore spread to their correspondents' computers--a process that Williamson witnessed firsthand in his study of 700 users at HP. He explains that "immunizing" IM users with antivirus software is impractical in such a scenario because the majority of IM users have only a small number of contacts and make a meager contribution to virus propagation; immunizing only highly connected users would be more effective, but the speed of infection throughout an IM network can make this difficult. Another strategy is "virus throttling," in which network administrators attempt to identify "worm-like" behavior on networks as it transpires and limit the rate of machine-to-machine communication, a tactic that does not affect most IM users, according to Williamson. Virus throttling technology restricts the number of IM messages compromised IM users can send outside their "working set" of regular correspondents, and Williamson estimates such messages average two per day. These messages are put into a queue and delayed slightly prior to delivery; if the delay queue signals a high volume of traffic to irregular correspondents, IM messages can be blocked or postponed longer.
    Click Here to View Full Article
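
    The throttling mechanism described above can be made concrete with a short sketch: messages to a small "working set" of regular correspondents go out immediately, messages to anyone else are queued and released slowly, and a rapidly growing queue is treated as worm-like behavior. The Python below is an illustration with an assumed working-set size, drain rate, and alarm threshold, not Williamson's implementation.

```python
# Minimal sketch of IM "virus throttling" as described above: messages to
# regular correspondents pass immediately, messages to new contacts go into
# a delay queue that is drained at a fixed rate, and a rapidly growing queue
# indicates worm-like behavior. Sizes and thresholds are assumed.

from collections import deque

class ImThrottle:
    def __init__(self, working_set_size=5, alarm_queue_length=10):
        self.working_set = deque(maxlen=working_set_size)  # regular correspondents
        self.alarm_queue_length = alarm_queue_length
        self.delay_queue = deque()

    def send(self, recipient, message):
        if recipient in self.working_set:
            self._deliver(recipient, message)          # regular contact: no delay
        else:
            self.delay_queue.append((recipient, message))
            if len(self.delay_queue) >= self.alarm_queue_length:
                print("ALERT: worm-like burst of messages to new contacts")

    def tick(self):
        """Call once per interval (e.g. once a second): release one delayed message."""
        if self.delay_queue:
            recipient, message = self.delay_queue.popleft()
            self.working_set.append(recipient)          # new contact becomes regular
            self._deliver(recipient, message)

    def _deliver(self, recipient, message):
        print(f"delivered to {recipient}: {message}")

throttle = ImThrottle()
for i in range(12):                                     # a worm blasting new contacts
    throttle.send(f"user{i}", "click this link!")
throttle.tick()                                         # only one message gets out per tick
```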

  • "The Search for Computer Security"
    Harvard University Gazette (09/30/04); Powell, Alvin

    Greg Morrisett, a professor at Harvard University's Division of Engineering and Applied Sciences (DEAS), believes the burden of trusting an incoming program to be free of bugs or malware should be transferred from the computer user to the program itself. "What we're aiming for is a day when you don't have to 'trust' a code, where you can state your guidelines [for acceptable code] and the builder would have to give you a [mathematical] proof that you can check," he explains. Morrisett, a programming language pioneer who has developed tools that identify exploitable flaws in computer programs, is authoring software tools designed to help programmers write less buggy code. He estimates that one bug exists for every 100 to 1,000 lines of code, and the growing complexity of computer programs makes manual checking for bugs impractical without computerized assistance. Morrisett's tools scan code for consistency in a process that the DEAS professor likens to checking that speed calculation formulas use the same units. Morrisett acknowledges that the programs he designs for tracking down and eliminating software bugs can just as easily be used for exploitation by hackers. He predicts that "The next round of questions [pertaining to computer security] will be ethical, legal, and social," and he hopes to use his position at Harvard to help address these questions. He says, "We have to understand that technology gets you to a certain place, and the remaining questions are harder."
    Click Here to View Full Article
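
    Morrisett's analogy--checking that a speed calculation uses consistent units--can be made concrete with a toy example. The Python sketch below rejects a formula that mixes incompatible units; it illustrates only the analogy, not the static-analysis tools described above.

```python
# Toy "units consistency" check in the spirit of the analogy above. A real
# checker of the kind described works on source code, not on numbers.

class Quantity:
    """A value tagged with unit exponents over (meters, seconds)."""
    def __init__(self, value, meters, seconds):
        self.value, self.unit = value, (meters, seconds)

    def __truediv__(self, other):
        m = self.unit[0] - other.unit[0]
        s = self.unit[1] - other.unit[1]
        return Quantity(self.value / other.value, m, s)

    def __add__(self, other):
        if self.unit != other.unit:                      # the consistency check
            raise TypeError(f"unit mismatch: {self.unit} vs {other.unit}")
        return Quantity(self.value + other.value, *self.unit)

distance = Quantity(100.0, meters=1, seconds=0)          # 100 m
duration = Quantity(9.58, meters=0, seconds=1)           # 9.58 s
speed = distance / duration                              # unit (1, -1): m/s, consistent

try:
    speed + distance                                      # adding m/s to m: rejected
except TypeError as err:
    print("checker rejected:", err)
```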

  • "Facing Up to the Future"
    New Scientist (10/02/04) Vol. 184, No. 2467, P. S2; Kemp, Sandra

    Though the technology for digitally simulating the human face is advancing dramatically, the artifice is still relatively evident in even the most sophisticated programs. Modeling the face's muscle movements accurately is the key challenge, one that requires software based on an anatomical model representing the intricate interaction among bones, cartilage, muscles, nerves, blood vessels, subcutaneous fat, connective tissue, skin, and hair. Rudy Poot, who served as color and lighting supervisor on the movie "The Matrix," also notes that "There are so many layers of light being absorbed by our skin and bounced around and it's very hard to mimic that in a program." Faithfully mirroring and sustaining the spontaneity of facial expressions and giving eye gaze and movement a lifelike quality are additional challenges. Researchers at the University of Illinois' Beckman Institute for Advanced Science and Technology are building a database of 3D scans of human facial expressions in an attempt to capture subtle nuances, while Beckman fellow Jesse Spencer-Smith is developing software to allow a computer to identify users by their faces when they sit in front of the screen. Acceleration of the digital facial modeling process is being driven by an exchange of knowledge among researchers in the various fields and industries--film, archaeology, surgery, and others--where facial reconstruction and animation are gaining in value. Examples of cutting-edge facial modeling include digital models such as Kaya, an aspiring "virtual star" created by a Brazilian animator and special effects artist; yet even these impressive simulations suffer from facial plasticity, lifeless eyes, and artificial movement. There are indications, however, that virtual faces may be more attractive to people than actual faces, a trend that may be rooted in how little accumulated knowledge we have about consciousness or how it animates faces.

  • "Records Management Takes a Few Lessons From Supercomputing"
    Government Computer News (09/27/04) Vol. 23, No. 29, P. 38; Jackson, Joab

    Similarities between the way supercomputing programs manage massive volumes of scientific data and the way government agencies manage records have led to a partnership between the National Archives and Records Administration (NARA) and the National Center for Supercomputing Applications (NCSA) focusing on the relevance of scientific data management tools to records management. The supercomputing center received $285,000 from NARA to conduct six studies in the first phase of the project, exploring such areas as distributed records management, suitable long-term archival and data storage formats, techniques to ramp up data intake and output, and categorization automation methods. The volume of material to be archived is so massive that it risks inundating the archivists, but project director Michael Folk thinks NCSA staff could apply their experience in managing huge data volumes to help mitigate this problem. Folk and Bruce Barkstrom with NASA's Atmospheric Sciences Data Center carried out a study on applying data formats employed by the scientific community to long-term records management, and are currently investigating NCSA's Hierarchical Data Format and similar standards. Their work led to the conclusion that formats must be defined in "a sufficiently rigorous way" to ensure that, if the original commercial software becomes unavailable, replacement software can be written to read the files. The researchers suggested that "Perhaps the most useful way to improve data formats for long-term, persistent access is to place their structure within a rigorous mathematical structure." Phase two of the project will study visualization and analysis, data management, and performance measurements.
    Click Here to View Full Article
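
    As a small, hedged illustration of the kind of self-describing storage NCSA's Hierarchical Data Format provides--written here through the later h5py Python binding--the sketch below stores a dataset together with the metadata a future reader would need. The group names, attributes, and sample data are invented for the example and are not part of any NARA/NCSA specification.

```python
# Hedged sketch: storing a record with self-describing metadata in HDF5
# (NCSA's Hierarchical Data Format, mentioned above) via the h5py binding.
# The layout and attribute names are illustrative assumptions only.

import numpy as np
import h5py

readings = np.array([21.4, 21.9, 22.3])  # some archival payload

with h5py.File("archive_record.h5", "w") as f:
    grp = f.create_group("record_0001")
    dset = grp.create_dataset("temperature_c", data=readings)
    # Embed the context a future reader needs to interpret the bytes.
    dset.attrs["units"] = "degrees Celsius"
    dset.attrs["collected"] = "2004-09-27"
    grp.attrs["source_agency"] = "example agency (hypothetical)"

# Because the format is self-describing, a reader years later can recover
# both the data and its meaning without the original writing software.
with h5py.File("archive_record.h5", "r") as f:
    dset = f["record_0001/temperature_c"]
    print(dset[...], dset.attrs["units"])
```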

  • "Just Keep Rolling a Lawn"
    GPS World (09/04) Vol. 15, No. 9, P. 16; Miller, Mikel; Raquet, John; Morton, Jade

    Ohio University, Miami University, and Illinois Institute of Technology (IIT) student teams were ranked first, second, and third, respectively, in the Institute of Navigation's (ION) First Annual Autonomous Lawnmower Competition in early June. The goal set forth by ION to contest entrants was to build an intelligent, self-navigating mower that could cut rectangular areas of approximately 150 square meters, with the winning entrant being the machine that cut the most grass in the shortest time. Ohio University's mower took that honor by cutting 13 square meters in less than 60 seconds. The chassis of the battery-powered mower was custom-designed with a 3D computer drafting program, and the machine's navigational capability was facilitated using Global Positioning System (GPS) technology; GPS measurements from a reference station are mixed with code- and carrier-phase measurements from a GPS receiver on the mower itself, and custom software written for the machine's 585 processor tracks the relative position of the mower's GPS antenna in relation to the field coordinates at an update rate of 1 second. Miami University's Red Blade system combines position and orientation sensors, servo actuation systems, a control system, a hydrostatic base lawnmower, and a safety system. The sensor element integrates the inputs of custom differential GPS (DGPS) receivers, Hall-effect sensors and magnets, and a digital compass, which are fed into a control system consisting of an onboard laptop computer and a guidance algorithm; this in turn directs the servo and actuation system, which drives the base mower. IIT's prototype mower is an automated ground vehicle (AGV) that provides steering and motion, DGPS sensor-based navigation, and power for an attached electric mower module. The AGV conducts trajectory-tracking operations, and the controller receives carrier-phase differential GPS measurements and channels optimal directional commands to the motors.
    Click Here to View Full Article
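
    The guidance loops above are only described at a high level; as a generic, hedged illustration of GPS waypoint tracking--a position in local field coordinates, a bearing to the next corner of the mowing area, and a simple proportional steering command--here is a Python sketch with assumed gains, sign conventions, and sensor interfaces, not any team's actual software.

```python
# Generic sketch of a GPS waypoint-following loop, roughly the kind of
# guidance the mower entries above perform. The gain, the nominal 1 Hz
# update rate, and the drive interface are assumptions for illustration.

import math

def steering_command(pos_east, pos_north, heading_rad,
                     wp_east, wp_north, k_p=1.5):
    """Proportional steering toward a waypoint, given position in field
    coordinates (meters) and current heading (radians, 0 = north)."""
    bearing = math.atan2(wp_east - pos_east, wp_north - pos_north)
    error = math.atan2(math.sin(bearing - heading_rad),
                       math.cos(bearing - heading_rad))  # wrap to [-pi, pi]
    return k_p * error  # positive = steer right (assumed convention)

def reached(pos_east, pos_north, wp_east, wp_north, tol_m=0.3):
    return math.hypot(wp_east - pos_east, wp_north - pos_north) < tol_m

# One iteration of the (nominally once-per-second) loop with made-up readings:
pos_e, pos_n, heading = 2.0, 5.0, 0.0          # from carrier-phase DGPS + compass
wp_e, wp_n = 2.0, 20.0                          # next corner of the mowing strip
if not reached(pos_e, pos_n, wp_e, wp_n):
    print("steer (rad):", steering_command(pos_e, pos_n, heading, wp_e, wp_n))
```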

  • "Our Microtech Future"
    Futurist (10/04) Vol. 38, No. 5, P. 51; Holmes, William

    Breakthrough technologies that could benefit fields ranging from health care to neurology to industry to resource conservation are expected from the convergence of microtechnology and biology. Microtech devices would mimic the functions of human cells and in some cases interact with them due to their scale; such devices or "bioparts" would be assembled into "biostructures" with functions similar to biological tissues. One example is clothes that protect the wearer from extreme physical conditions and adjust to maximize comfort by monitoring vital signs as well as the shape, motion, and tension of the body. With microtech, it is possible that people could incrementally become more self-sufficient by effecting a steady transition toward more localized, small-scale production; prevent and treat diseases through internal monitoring; improve their minds and perhaps open up new vistas of perception via a personal observational network within the brain; and minimize energy consumption by building dwellings, clothes, and other products out of embedded microparts. The self-sufficient, microtech-enabled household would be maintained by advanced robots controlled by home workers with tactile gloves and later bioclothes, and such a breakthrough would allow consumers to build rather than buy the products they need out of biostructures, thus contributing to the erosion of poverty and improving the economy. Consumers' strain on resources could be mitigated by the reduction of travel and commuting through communication with teleforms through bioclothes, while microenvironments constructed using miniature teleforms could extend the entertainment and work potential of people's immediate surroundings. Many forms of physical and mental illness could one day be monitored and treated using a cell-care network comprised of modified cells, manufactured bioparts, or both. Working in tandem with bioclothes technology, the network could report physiological conditions to its owner as well as care for tissues.


 