
       HP is the premier source for computing services, products and solutions. Responding to customers' requirements for quality and reliability at aggressive prices, HP offers performance-packed products and comprehensive services.


ACM TechNews is intended as an objective news digest for busy IT Professionals. Views expressed are not necessarily those of either HP or ACM.

To send comments, please write to [email protected].

Volume 4, Issue 415: Friday, October 25, 2002

  • "Net Attack Could be First of Many, Experts Warn"
    IDG News Service (10/23/02); Roberts, Paul

    Although the Oct. 21 cyberattack on all 13 of the Internet domain name system (DNS) root servers fizzled, several experts warn that more sophisticated and successful attacks could follow, and are urging the federal government to take action to shield the cyber-infrastructure. The attacks appeared to originate from American and European Internet hosting service providers, according to reports from Matrix NetSystems on Tuesday. Computers already left vulnerable as a result of Slapper, Nimda, and other worms can be used as "zombie" machines in distributed denial-of-service (DDOS) attacks such as the one launched on Monday, explains Arbor Networks' chief strategist Ted Julian. "Monday's attack was an example of people not targeting enterprises, but going against the Internet itself by attacking the architecture and protocols on which the Internet was built," he notes. Guardent CTO Gerry Brady, who expects more DDOS attacks in the near future, adds that automated attack tools and other software programs allow people to easily commandeer host machines, and says that past incidents have often uncovered disgruntled teenagers as the perpetrators. Future attacks may not necessarily follow the "packet flood" strategy used in Monday's assault, but instead might send a high volume of normal-looking traffic to overwhelm servers, or target the Internet's routing infrastructure. Brady and Julian agree that the federal government needs to get more involved in the maintenance of major Internet infrastructure elements--for example, it could offer funding or tax breaks to private businesses and public organizations that safeguard their systems. Brady adds that federal investment would shore up backbone providers, and strengthen the infrastructure.
    http://www.pcworld.com/news/article/0,aid,106266,00.asp

  • "Tomorrow's Tech: The Domino Effect"
    CNet (10/24/02); Junnarkar, Sandeep

    Taking a cue from falling dominoes, scientists at IBM's Almaden Research Center have built digital logic elements that are 260,000 times smaller than those used in today's most sophisticated semiconductor chips. They created the circuits by patterning carbon monoxide molecules on a copper substrate; tipping a single molecule triggers a molecular cascade, and each cascade is interpreted as a single bit of data, with toppled or cascaded molecules representing a "1" and non-cascaded molecules representing a "0." Circuit features are formed by the intersections of two cascades. However, the researchers report that setting up the cascade patterns takes several hours and requires a very clean environment, and the experiment can only be carried out with an ultra-high-vacuum scanning tunneling microscope that operates at extremely low temperatures. Molecules would have to be able to cascade at room temperature if the process is to be used in desktop computers, notes Wolf-Dieter Schneider of Switzerland's University of Lausanne. Furthermore, calculations using the molecular cascade effect are a one-time operation, because there is no reset mechanism. The IBM scientists are confident that they can add repeatability to the process, and intend to concentrate on cascade-based computation that takes advantage of electron spin and other basic interactions. The push to research the possibilities of nanoscale-level calculation is critical, since exponential improvements in the speed and integration of silicon electronics are expected to reach a threshold in a matter of decades.
    http://news.com.com/2100-1001-963207.html

  • "GAO: Visa Fees Boost IT Industry"
    Federal Computer Week Online (10/23/02); O'Hara, Colleen

    Employers who wish to hire foreign workers for IT jobs must pay a $1,000 application fee for H-1B visas, and the government is channeling the money collected from these fees into training programs for American workers, according to a recent report from the General Accounting Office (GAO). Fifty-five percent of the funding goes to the Labor Department, which trains workers to qualify for hard-to-fill positions, while 22 percent goes to the National Science Foundation's scholarship program for low-income students. The occupations the scholarship recipients are being trained for are usually filled by H-1B visa holders, and the GAO notes that the Immigration and Naturalization Service reported that 58 percent of these visa holders were approved to work in computer-related positions last year. Consequently, 38 percent of students receiving scholarships are choosing computer science as their major, while 37 percent are opting for engineering. The GAO also says that the first three rounds of grants awarded in 2000 focused on IT training programs; 35 out of the 43 grantees selected in that period offered IT training, and 19 focused on IT exclusively. Improvements that the office recommends include new reporting guidelines that provide more detailed information about training program participants and the level of training offered, and better processes to simplify information-sharing. It also suggests that the NSF find an easier way for schools that are awarded scholarship grants to share their best strategies with one another.
    http://www.fcw.com/fcw/articles/2002/1021/web-gao-10-23-02.asp

  • "Letter: Free Software Hurts U.S."
    Wired News (10/25/02); McMillan, Robert

    Reps. Adam Smith (D-Wash.), Ron Kind (D-Wis.), and Jim Davis (D-Fla.) urged 74 Democrats in Congress to support a letter that Reps. Tom Davis (R-Va.) and Jim Turner (D-Texas) sent to White House cybersecurity advisor Richard Clarke, suggesting that his national cybersecurity policy reject "licenses that would prevent or discourage commercial adoption of promising cybersecurity technologies developed through federal R&D." The three supporters argue that this letter demonstrates that Linux's GNU General Public License (GPL) threatens the "innovation and security" of the United States, and they urge Clarke not to include the GPL in his cybersecurity plan. However, Davis and Turner deny that their letter addresses open source or the GPL. Smith has drawn criticism for the huge contributions he has received from Microsoft, and although his spokesperson Katherine Lister says that Microsoft does not influence his policies, Free Software Foundation director Bradley Kuhn counters that Smith's rhetoric virtually mirrors Microsoft's anti-GPL stance. Clarke's draft plan only mentions Linux in its recommendation that users of the operating system update it regularly. Open-source boosters say that this incident is only a preliminary bout in the debate over open-source and GPL software's role in the federal government. Computer security expert Gene Spafford notes that getting companies to use technology developed by the federal government will involve Congress looking beyond the free software license. "Why don't we also reject any software patents and copyrights that could discourage the adoption and use of software developed under federal funds?" he inquires.
    http://www.wired.com/news/linux/0,1411,55989,00.html

  • "Copyright Fights Slowing Broadband Growth"
    InternetNews.com (10/24/02); Mark, Roy

    Technology and telecommunications companies need to help resolve copyright disputes over digital content in order to make broadband attractive to consumers, according to Bruce P. Mehlman, advisor to the President and assistant secretary of technology policy at the Department of Commerce. Mehlman spoke at a Heritage Foundation conference in Washington, D.C., on Wednesday and said that easy-to-use, fast, and wide-ranging libraries of legitimate digital movies and music are one of the keys to spurring consumer broadband adoption. The FCC has said that the onus for further broadband take-up now lies primarily with content providers, since over 70 percent of the U.S. population now has access to broadband connections, but less than 15 percent subscribe to the service. Mehlman said that technology providers need to work with the music and movie industries to develop technological protections for copyrighted content, and that content owners need to acknowledge that they will never be able to eliminate digital piracy completely. Mehlman also agreed with Gary Shapiro, CEO of the Consumer Electronics Association, that new media technologies have always ended up benefiting the content industry in the long run. Shapiro's group recently filed a friend-of-the-court brief in the MGM Studios lawsuit against the file-sharing network Grokster, arguing that copyright protections need to be balanced against the fair-use principles established in the 1984 Sony Betamax case.
    http://dc.internet.com/news/article.php/1487631

  • "Tech Helps Blind 'See' Computer Images"
    United Press International (10/24/02); Burnell, R. Scott

    The National Institute of Standards and Technology (NIST) has developed a tactile display designed to enable the visually impaired to feel digital images. The prototype device, which will be tested by the National Federation of the Blind (NFB), features over 3,000 pins suspended over a plotting printer that dates back to the early 1990s; an extendable pointer similar to a ballpoint pen "prints" an image by raising selected pins into an outline. About 10 pins or dots exist for every inch, a scale that NFB technology director Curtis Chong says is adequate for the blind (a simplified sketch of such an image-to-pin mapping appears after this item). John Roberts of NIST says the display could prove especially useful for visually handicapped engineers and mathematically oriented professionals who rely on information presented in graphs and other visual representations. Chong thinks that it also holds enormous potential for education, saying, "If we could get [blind] kids to feel stuff way more often, every single day in school, maybe the graphical gap won't be so bad." Roberts expects the device to be available at an initial price of $2,000, whereas tactile display technology already in use can cost about $40,000. He also reports that NIST will concentrate on reducing the size of the device without losing detail. Meanwhile, the agency has also developed a reader that can convert electronic text into Braille, which deputy secretary of commerce Sam Bodman declares is ready for licensing.
    http://www.upi.com/view.cfm?StoryID=20021024-034855-5479r
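
    The mapping described above, from a digital image to a grid of roughly 10 pins per inch, can be illustrated with a short sketch. The code below is a rough illustration only; the function name, resolutions, and threshold are assumptions made for the example, not details of the NIST device. It downsamples a grayscale image to a coarse pin grid and decides which pins to raise:

        # Minimal sketch (not NIST's software): map a grayscale image onto a
        # coarse grid of raised/lowered pins at roughly 10 pins per inch.
        def image_to_pins(image, image_dpi=100, pin_dpi=10, threshold=128):
            """Downsample a 2D grayscale image (rows of 0-255 values) to the
            pin grid; return a 2D list of booleans, True = raise the pin."""
            step = max(1, image_dpi // pin_dpi)               # image pixels per pin
            pins = []
            for r in range(0, len(image), step):
                row = []
                for c in range(0, len(image[0]), step):
                    # Average the block of pixels that falls under this pin.
                    block = [image[rr][cc]
                             for rr in range(r, min(r + step, len(image)))
                             for cc in range(c, min(c + step, len(image[0])))]
                    row.append(sum(block) / len(block) < threshold)  # dark -> raise
                pins.append(row)
            return pins

        if __name__ == "__main__":
            # A dark square on a white background becomes a small block of raised pins.
            img = [[0 if 20 <= x < 60 and 20 <= y < 60 else 255 for x in range(100)]
                   for y in range(100)]
            for pin_row in image_to_pins(img):
                print("".join("#" if up else "." for up in pin_row))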

  • "Encryption Method Getting the Picture"
    CNet (10/23/02); Junnarkar, Sandeep

    Xerox and University of Rochester researchers have devised a method, known as reversible data hiding, that can be used to encrypt digital images and later retrieve them without causing data loss or distortion. Both partners will share patent rights to the technique. Reversible data hiding could be used to verify photos and military and medical images, as well as encode data within the images themselves. Digital watermarking technology can permanently affect an image's quality, whereas the new method uses data-embedding algorithms to alter only the lowest levels of pixel values. The researchers explain that distortions caused by the embedded data are removed as authorized viewers extract the authentication message buried in the image (a toy sketch of one reversible-embedding approach appears after this item). They add that the method can be implemented in hardware as well as software: A digital camera, for example, could be programmed with the algorithms so that it can capture forensic images for courtroom use. Any incidence of doctoring afterward could easily be uncovered. This development could be especially useful for ensuring the authenticity of Web-based tickets, receipts, and contracts, and could hasten e-commerce as a result.
    http://news.com.com/2100-1001-963054.html
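
    The summary above says that authorized viewers can both extract the hidden authentication data and recover the exact original image. The Xerox/Rochester algorithm itself is not described in the item, so the sketch below instead uses a simple, well-known histogram-shifting scheme purely to illustrate what "reversible" embedding means; the function names, the flat pixel list, and the capacity assumptions are ours:

        # Illustrative only: not the Xerox/University of Rochester method.
        # Bits are hidden by nudging pixels at the histogram's peak value;
        # extraction recovers the bits and restores the original pixels exactly.
        def embed(pixels, bits):
            """Hide `bits` (list of 0/1) in `pixels` (flat list of 0-255 values).
            Returns (marked_pixels, peak, zero), which the extractor needs."""
            hist = [pixels.count(v) for v in range(256)]
            peak = max(range(256), key=hist.__getitem__)      # most frequent value
            zero = next((v for v in range(peak + 1, 256) if hist[v] == 0), None)
            if zero is None:
                raise ValueError("sketch needs an empty histogram bin above the peak")
            if hist[peak] < len(bits):
                raise ValueError("message longer than embedding capacity")
            out, it = [], iter(bits)
            for p in pixels:
                if peak < p < zero:
                    p += 1                                    # shift to free the bin peak+1
                elif p == peak and next(it, None) == 1:
                    p = peak + 1                              # a '1' bumps the pixel by one
                out.append(p)
            return out, peak, zero

        def extract(marked, peak, zero, n_bits):
            """Recover the n_bits message and the exact original pixel values."""
            bits, restored = [], []
            for p in marked:
                if len(bits) < n_bits and p in (peak, peak + 1):
                    bits.append(p - peak)                     # peak -> 0, peak+1 -> 1
                    p = peak                                  # undo the bump
                elif peak + 1 < p <= zero:
                    p -= 1                                    # undo the shift
                restored.append(p)
            return bits, restored

        pixels = [3, 3, 3, 7, 3, 12, 3, 200]                  # toy "image"
        marked, peak, zero = embed(pixels, [1, 0, 1])
        message, restored = extract(marked, peak, zero, 3)
        assert message == [1, 0, 1] and restored == pixels    # lossless round trip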

  • "Quantum Scheme Lightens Load"
    Technology Research News (10/23/02); Smalley, Eric

    Johns Hopkins University researchers have devised a scheme for building a linear optical quantum computer with far less equipment than previously thought necessary. The scheme bases the computer on the manipulation of single photons rather than moving atoms or electrons. Such a machine could be built using gear such as mirrors, beam splitters, and phase shifters, explains James Franson of the Johns Hopkins University Applied Physics Laboratory. However, sending the correct result of a quantum logic operation between devices without directly observing the photonic states that represent the results (and thereby destroying the information) is difficult, and requires the simultaneous addition of ancilla photons to the operation. The researchers' breakthrough lowers the number of ancilla photons needed by entangling them with input photons so that the probability of error is reduced; as a result, the computer requires less equipment. "An optical approach to quantum computing would have a number of potential advantages, including the ability to connect different devices using optical fibers in analogy with the wires of a conventional computer," declares Franson. However, Los Alamos National Laboratory mathematician Emanuel Knill notes that the Johns Hopkins scheme relies on physical rather than logical qubits. Franson believes that quantum repeaters that can extend quantum communications over great distances could be ready in five years, while full-blown quantum computers would take 15 to 20 years to develop under optimum conditions.

  • "Thinking of Radio as Smart Enough to Live Without Rules"
    New York Times (10/24/02) P. E5; Rojas, Peter

    Although the FCC has recently allowed more technologies to make use of unlicensed swaths of bandwidth, such as Ultra Wideband technology and the spectrum near 2.4 GHz for Wi-Fi connectivity, others envision the creation of a wireless network similar to the Internet based on unlicensed spectrum. Some researchers say that improved radios could one day make possible an entirely unfettered radio infrastructure, in which no part of the radio spectrum would need to be dedicated to a single use. This idea, called cognitive radio, uses increasingly powerful processing chips to decode digitally encoded data transmitted over radio waves. Global positioning system technology would direct signals and help make devices aware of the surrounding radio network infrastructure. The current radio paradigm depends on central broadcasting towers and a number of receiver devices. Unlike these devices, cognitive radio receivers would be able to distinguish between incoming transmissions in order to avoid signal interference, and each radio device--whether a cell phone, boombox, or car stereo--would relay signals, making the network more robust as the number of nodes increases. Wireless networking expert and early Internet protocol engineer David P. Reed says cognitive radio technology faces a number of barriers, including economic and political ones, since a number of telecommunications and media firms are heavily invested in the current system of licenses and equipment. However, the idea may prove necessary in order to ease growing demand on the existing radio system. Edmond J. Thomas, chief of the FCC's Office of Engineering and Technology, doubts that radio spectrum will ever be totally unlicensed. Still, he says, "Cognitive radio gives us the opportunity to utilize the spectrum in a way that was totally impractical before."
    http://www.nytimes.com/2002/10/24/technology/circuits/24next.html
    (Access to this site is free; however, first-time visitors must register.)

  • "X Marks the Spot"
    Boston Globe (10/21/02) P. C1; Denison, D.C.

    Widespread adoption of wireless technologies is a foregone conclusion, according to those in the emerging location-based Internet industry. A number of companies are working on Internet technologies that draw on location-based information to provide businesses and consumers with more relevant data and services. GeoVue of Boston, for example, has already signed a number of big-name retailers for its "location intelligence" service, which allows companies to view maps replete with demographic data gleaned from more than 100 databases. Experts say the trend of the future will lie not only in delivering the Internet to wherever users are via wireless, but also in enriching their online experiences with location-specific data. Boston-based Newbury Networks is working on technology that would enable businesses and other WLAN operators to deliver different content and accessibility options to users based on where they are in the wireless network. Conference attendees would be able to receive special content and network access when in a meeting room, for example. The company already provides its service to a Cambridge, Mass., hotel that uses it to deliver information about its artwork collection to customers as they stroll through the building. Newbury Networks' technology is also being used at Dartmouth College, where engineering professors stream localized content and restrict network access according to where wireless-enabled students are in the laboratory. Technology expert Howard Rheingold comments that the development of the location-based technology sector will partly depend on whether independent developers will have access to enabling tools and software.
    http://digitalmass.boston.com/news/tech_innovation/news/1021_x.html

  • "Purdue Researchers Build Made-to-Order Nanotubes"
    EE Times Online (10/23/02); Johnson, R. Colin

    Using what professor Hicham Fenniri describes as "a novel dial-in approach," scientists at Purdue University have developed application-specific "rosette nanotubes" that feature unique physical, chemical, and electrical traits. Rosette nanotubes possess tunable inner and outer diameters, while conventional carbon nanotubes do not; hollow channels on the external diameter of the rosette nanotube are customized to contain specific molecules that can be used in particular applications. Nanotube construction starts with a "seed" molecule that organizes into rings in water, and these rings form tubes that can be synthesized to any length. The tube configuration is the result of a hydrophobic inner layer and a hydrophilic outer layer, and two electronic seed molecules have been demonstrated--one grows conventional electrical wires, while the other grows photonic nanotubes. Application-specific molecules can be added to the nanotube's outside layer in a test tube, and self-organization into a customized configuration is modulated by the adjustment of environmental factors such as pressure and temperature. Using this process, Fenniri's group has been able to induce the growth of "chiroptical" nanotubes, which grow in both a right- and left-hand spiral formation. "We are optimistic that we can make many specialized nanotubes that are useful for a new generation of computer memory systems, high-definition displays, biosensors and drug delivery systems," Fenniri declares. Other applications he foresees for nanotubes include optical information storage and the manufacture of polymers.
    http://www.eetonline.com/at/news/OEG20021023S0058

  • "Q&A: Internet Pioneer Stephen Crocker on This Week's DDOS Attack"
    Computerworld Online (10/24/02); Thibodeau, Patrick

    Internet pioneer Stephen Crocker, who chairs an ICANN security committee, says that this week's distributed denial-of-service (DDOS) attack on the Internet's 13 Domain Name System (DNS) root servers has both positive and negative aspects, and discusses ways the system can be improved. The mode of attack involves overwhelming the servers with traffic, and the strength of a DDOS attack depends on its sophistication. On the positive side, Crocker notes that not all of the servers were taken out of commission and the impact on the Internet community was "negligible," proving that DNS defenses are effective and server staffs are well-trained. However, he cautions that this success does not mean the system's strength should be taken for granted, since larger, more sophisticated attacks are inevitable. Crocker cites three areas for improvement: implementing better DNS core protocols and services, such as the DNSSEC security protocol and more up-to-date Berkeley Internet Name Domain (BIND) server software; keeping hosts in line by having ISPs enforce discipline and authentication; and making computers less open and vulnerable to being commandeered by hackers as platforms for DDOS attacks, a vulnerability stemming from the wide use of off-the-shelf units. Calling that openness "our biggest problem globally" and a "public nuisance," he says "computers should not be wide open." Crocker also says the chief lesson learned from the incident is an old one. "We shouldn't say everything is fine and expect the system to survive indefinitely," he insists. The attacks will be the subject of a discussion that Crocker will lead at ICANN's annual meeting in Shanghai next week.

  • "Brave New World"
    Electronic News Online (10/21/02); Chappell, Jeff

    Belgium-based IMEC's M4 is a 10-year initiative that aims to marry several disciplines to facilitate the convergence of semiconductor, software, and micro-electromechanical systems (MEMS) technology into wireless body-area networks (WBANs) that can directly read a person's body chemistry through implanted sensors. Although such technology will involve nanotechnology and biology, it will also incorporate elements of standard silicon semiconductor process technology in certain circumstances. Polymer-based ion-sensitive field-effect transistors (ISFETs) are already being used to take pH readings in water and blood, and one day such devices will be modified to be implanted within people, where they can measure antibody-antigen responses and relay the data to external machines. IMEC's Andrew Campitelli says his group is trying to make such a device viable, and the consortium's cross-disciplinary strategy is essential to providing users with a reasonably priced polymer-based bioFET. Also being investigated by IMEC are implanted biological sensors that use nanoscale magnetic devices and magnetic random access memory. IC fabrication process technology and tools also play an important role in the MEMS discipline that IMEC intends to use--target products the consortium plans to develop include silicon-based impedance sensors for immunology and blood gas sensors. Nano-sized thermal generators, electrostatic generators, and photovoltaic cells are being researched by IMEC as power sources, while making these applications interoperable requires low-power gigahertz RF technology. Meanwhile, Living Microsystems has built a machine that will allow scientists to place tens of thousands of living human cells on a semiconductor and examine them to gain insights into disease diagnosis.

  • "TeraGrid Receives $35 Million From National Science Foundation"
    Grid Computing Planet (10/18/02); Shread, Paul

    The National Science Foundation (NSF) has approved an additional $35 million grant to the TeraGrid project, extending the infrastructure to five sites and joining it with the TCS-1 supercomputing project at the Pittsburgh Supercomputing Center (PSC). The Carnegie Mellon University and University of Pittsburgh project had previously been awarded $45 million from the NSF to jointly build the TCS-1 system at the PSC. In all, the expanded TeraGrid system will harness more than 20 teraflops of distributed processing power and have almost one petabyte of storage, equal to 1 million GB. Additionally, Qwest Communications will be brought in to help build new links between the sites, creating the fastest research network in the process. Research institutions linked to other dedicated high-speed networks, such as the Abilene Internet2 network, can also tap the TeraGrid. Some of the money will go toward making the TeraGrid project interoperable with future infrastructure contributed by other institutions that may want to join. PSC scientific directors Mike Levine and Ralph Roskies say the recent deal is significant because it links the TeraGrid and TCS-1, two heterogeneous supercomputing systems. They note that this will be the first real test of large-scale grid integration, which will have implications for a much wider grid infrastructure in the future. The five sites now participating in the TeraGrid are the San Diego Supercomputer Center at the University of California, San Diego; the University of Illinois' National Center for Supercomputing Applications; Argonne National Laboratory; the California Institute of Technology's Center for Advanced Computing Research; and the PSC.
    http://www.gridcomputingplanet.com/news/article/0,,3281_1484771,00.html

  • "Toward a More Flexible Future"
    InformationWeek (10/21/02) No. 911; Greenemeier, Larry; Travis, Paul

    Lower corporate spending on IT, coupled with a drive to squeeze existing servers for efficiency, is pushing server vendors to change their sales strategy to emphasize cost savings through more flexible products that offer easy management. Blade servers are likely to net a portion of the IT budget because they are cheaper, more reliable, smaller, and easier to manage than standalone units, but more options are opening up as mainframes, RISC machines, and Intel-based servers get less costly, more manageable, and more powerful. Server vendors are pinning their hopes on blade servers and brick servers, which will enable IT personnel to tailor server boards with interchangeable memory, disk space, processors, and other components. Bricks, for example, will allow a telecommunications service provider to add or remove components without interrupting customer service. Communications service providers and hosting companies, which rely on many arrays of servers, are expected to have a particular affinity for blades. The wide deployment of blades will necessitate management software that can distribute upgrades, secure against hackers, and build server clusters. Mainframes are also becoming more capable thanks to the advent of Web services, logical partitioning, and performance redistribution. "Regardless of modular computing or the clustering of low-end Intel-based servers, there is still a place for mainframes, and that's not going to change anytime soon," asserts Galileo International CTO Robert Wiseman.
    http://www.informationweek.com/story/IWK20021018S0005

  • "Sensors Gone Wild"
    Forbes (10/28/02) Vol. 170, No. 9, P. 306; Fulford, Benjamin

    Uses for intelligent sensors and ways to improve them are the goal of several research projects, including one at southern California's James San Jacinto Mountains Reserve, where dozens of devices have been distributed to track animal movements and plant growth, and monitor temperature, wind speed, air pressure, and carbon use. The Defense Advanced Research Projects Agency (DARPA) has invested $160 million of its own money, along with $500 million in matching funds from other agencies, to develop wireless sensor network technology that would track enemy troop and armament movements in a war zone, as well as detect biological weapons and electromagnetic noise. One of the major technical challenges is developing more power-efficient batteries, and MIT researchers are working on ultra-small, self-powering chips; some new sensors are frugal enough to run on 100 microwatts. A better way to manage the billions of inputs a sensor network receives must also be devised, and UCLA computer science professor Deborah Estrin is working on the problem through the James Reserve project. Her approach processes data packets at every point along the network route and routes them not by a fixed address but by attributes assigned on the fly, such as a time stamp, a location, or the type of media being transmitted (a toy sketch of this attribute-based matching appears after this item). Sensors are relatively cheap to produce, and Omron's Tatsuro Ichihara estimates that within five to 10 years, "people will no longer need to carry ID, keys or money, because sensors will recognize their face." Sensors will be embedded in bridges and other structures to read integrity, and hospitals will use them to track patients. Clark Nguyen, an electrical engineering professor at the University of Michigan on leave to work on sensors for DARPA, says that "in a way [sensor networks] could be bigger than the Internet...the whole analog world will interface with the Net."
    http://www.forbes.com/global/2002/1028/076.html
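
    As a purely hypothetical illustration of routing sensor data by attributes (a time stamp, a location, a type of reading) rather than by a node address, the sketch below matches readings against a subscriber's "interest" expressed as attribute predicates. None of the field names or values come from the James Reserve software:

        # Hypothetical attribute-based (data-centric) matching sketch.
        def matches(reading, interest):
            """A reading is delivered if every attribute named in the interest
            is present in the reading and satisfies the interest's predicate."""
            return all(k in reading and pred(reading[k]) for k, pred in interest.items())

        interest = {                        # "recent animal-motion events near the creek"
            "type":      lambda t: t == "motion",
            "location":  lambda loc: loc == "creek",
            "timestamp": lambda ts: ts >= 1035500000,
        }

        readings = [
            {"type": "motion", "location": "creek",  "timestamp": 1035500100},
            {"type": "temp",   "location": "ridge",  "timestamp": 1035500200, "value": 17.5},
            {"type": "motion", "location": "meadow", "timestamp": 1035500300},
        ]

        forwarded = [r for r in readings if matches(r, interest)]
        print(forwarded)                    # only the creek motion event matches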

  • "Hot Research"
    HP World (10/02) Vol. 5, No. 10, P. 1; Elgan, Mike

    Keeping data centers cool so that system failures can be averted is a major focus of IT research efforts, and Hewlett-Packard is working on a variety of solutions using a holistic approach, according to HP Labs researcher Chandrakant Patel. Air conditioning is a very expensive solution to the problem of overheated data centers, while other solutions are not fast, reliable, or scalable enough. One possibility HP is researching involves converting an inkjet printing cartridge into a semiconductor cooling device that sprays dielectric liquid coolant onto chips when the temperature rises. HP has also undertaken various projects designed to address how to more cost-effectively cool hardware in existing data centers. Patel notes that the usual strategy for detecting data center "hot spots" is to have a technician roam around, sensing the air; HP's solution, the UDC RoboRunner, is a laptop on wheels with a sensor-equipped antenna that monitors temperature and can negotiate obstacles. Meanwhile, Patel and colleagues are building a smart data center that uses multiple cooling tools, including fans, floor and ceiling tiles, and blowers. Such a center would also have the ability to shift number-crunching workloads away from hot areas and toward cooler systems (a toy sketch of temperature-aware job placement appears after this item). Another solution that could find favor is the use of data centers in cooler climates such as Siberia or Alaska, says Patel.
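
    As a toy sketch of steering work toward cooler systems, the code below places each incoming job on whichever rack currently reports the lowest temperature; the rack names, temperatures, and the fixed per-job heating increment are assumptions made for the example, not part of HP's published design:

        # Hypothetical temperature-aware job placement sketch.
        import heapq

        def place_jobs(rack_temps, jobs, degrees_per_job=2.0):
            """rack_temps: dict of rack name -> current temperature (deg C).
            Returns a dict of job -> rack, always picking the coolest rack and
            assuming each job raises its host's temperature estimate a little."""
            heap = [(temp, rack) for rack, temp in rack_temps.items()]
            heapq.heapify(heap)
            placement = {}
            for job in jobs:
                temp, rack = heapq.heappop(heap)          # coolest rack right now
                placement[job] = rack
                heapq.heappush(heap, (temp + degrees_per_job, rack))
            return placement

        print(place_jobs({"rack-A": 31.0, "rack-B": 24.5, "rack-C": 27.0},
                         ["job1", "job2", "job3", "job4"]))
        # {'job1': 'rack-B', 'job2': 'rack-B', 'job3': 'rack-C', 'job4': 'rack-B'}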

  • "Life By the Numbers"
    Popular Mechanics (10/02) Vol. 179, No. 10; Wilson, Jim

    The convergence of biology and computer science into bioinformatics follows the principle that all natural systems adhere to a mathematical model, and life itself can be described mathematically through the understanding of DNA, the molecule that directs protein production through the arrangement of four basic chemical compounds. Bioinformatics was officially inaugurated as a legitimate field with the commencement of the U.S. Human Genome Project, an initiative to decode the human genome and use the research, which is freely available on the Internet, as a jumping-off point for scientists. Michael Lush of the HUGO Gene Nomenclature Committee says the availability of the data levels the playing field for researchers. Some of the most productive research stemming from the Genome Project involves the study of single nucleotide polymorphisms (SNPs, or "snips") in DNA that can change the behavior of cells. Scientists scan for snips using computers and DNA microarrays, in which sample DNA is stained with a fluorescent marker; gene activity manifests itself as a red or green glow in the presence of a laser (a toy sketch of a sequence-level SNP scan appears after this item). Speaking at the O'Reilly Bioinformatics Technology Conference earlier this year, Ewan Birney of the European Bioinformatics Institute predicted that bioinformatics will revolutionize cancer research, eliminating experimental design in favor of actual testing. Meanwhile, the Department of Energy thinks that all life sciences will benefit from bioinformatics research, which will lead to such advances as custom-designed bacteria and plants that aid in environmental cleanup.
    http://popularmechanics.com/science/medicine/2002/10/life_by_numbers/
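
    As a toy illustration of what a SNP scan looks for (real detection works from microarray or sequencing data and is far more involved), the sketch below compares two already-aligned DNA strings and reports single-base differences:

        # Toy SNP scan: report positions where two aligned sequences differ.
        def snp_scan(reference, sample):
            """Return (position, reference_base, sample_base) for each mismatch
            between two aligned, equal-length DNA strings."""
            if len(reference) != len(sample):
                raise ValueError("sequences must be aligned to the same length")
            return [(i, r, s)
                    for i, (r, s) in enumerate(zip(reference, sample)) if r != s]

        print(snp_scan("ATGGCTTACCGA", "ATGGCATACCGA"))
        # [(5, 'T', 'A')]: a single-base substitution at position 5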

 
                                                                             