Women's Career Choices Influenced More by Culture Than
Biology
Penn State Live (05/16/06)
Ignoring the diversity of the workforce, U.S. IT companies stubbornly
cling to blanket policies for all workers, particularly women, according to
Eileen Trauth, professor of information sciences and technology at
Pennsylvania State University. "Policy makers, educators, managers need to
recognize that you can't generalize to all women," said Trauth. "There is
far too much variation in the paths that women take for anyone to assume
that women's career motivations are the same, their methods of balancing
work and family are the same, or their responses to motherhood are the
same." Trauth interviewed 167 female IT professionals in the United
States, Ireland, Australia, and New Zealand, and presented her findings at
the recent 2006 ACM conference on computer-personnel research. Trauth
found that a host of factors, including gender stereotypes, societal
messages, and family dynamics, can affect a woman's career choice. She also
found that women respond differently to stereotypes of motherhood, career,
and gender, reinforcing her conviction that blanket policies ignore the
complexities of female workers. "What would be inappropriate is to look at
a young woman and presume that she will get married, or that she will have
children, or that she will leave the workforce if she does have children,"
Trauth said. "Organizations shouldn't have HR policies based on gender
stereotypes because people are motivated by different things--salary, job
security, flexible work schedules." Trauth, who co-authored the paper with
PSU doctoral students Jeria Quesenberry and Haiyan Huang, suggests that
stereotyping could be the reason why women's participation in IT fell from
41 percent in 1996 to 32.4 percent in 2004. For information on ACM's
Committee on Women in Computing, visit
http://women.acm.org.
Sun Promises to Open-Source Java
CNet (05/16/06) Evers, Joris
After years of pressure from open-source advocates, Sun Microsystems
announced that it will open-source the entire Java platform once it resolves
implementation and compatibility issues. "The question is not whether we
will open-source Java, the question is how," said CEO Jonathan Schwartz at
this week's JavaOne conference. Sun executives would not specify a time
frame for completion of the process. Sun expects the release of Java under
an Open Source Initiative license to broaden its base of users, just as
opening Solaris caused a dramatic surge in the operating system's user
base. "It just grows the tent," said Schwartz, adding that additional
users will boost Sun's revenue through increased demand for services and
support. IBM, a major Java user and one of Sun's main competitors,
supports the move, and even offered to help Sun make the transition. "Java
has grown in popularity, but the rate and pace of innovation has been
limited by the degree of openness Sun was willing to embrace," said IBM's
Rod Smith.
SCAMPI Trawls the Internet
IST Results (05/17/06)
To improve the flow of traffic on today's increasingly complex networks,
researchers working under the European Union's SCAMPI project developed a
software and hardware combination to create novel, open-source applications
to monitor networks and clear up bottlenecks. In addition to gauging how
much capacity is being used, monitoring tools can keep tabs on the types of
traffic passing over a network. "It's a niche application," said Kevin
Meynell, SCAMPI project manager. "But it's an important one, and I think
it will become more important as we seek to create more efficient use of
the bandwidth we have." SCAMPI, or Scaleable Monitoring Platform for the
Internet, was launched to address the increasing difficulty of monitoring
traffic as network speeds continue to accelerate. Project developers
created a family of programmable PCI-based adapters, known as COMBO, that
serve as the hardware layer, capturing network traffic and monitoring
connections at speeds of up to 10 Gbps. The team also developed three
interface cards that physically connect with the Internet via both copper
and optical connections. The open hardware is distinctive both because
earlier models were proprietary and because it provides high-speed
monitoring at a relatively low cost. The researchers also developed a
middleware package, the Monitoring Application Programming Interface
(MAPI), to process network flows. MAPI presents applications with a
conventional API, enabling it to serve as a universal adapter between
monitoring applications and the underlying hardware. Numerous
complementary applications offer enhanced capabilities, such as scanning
for unusual network traffic to protect against denial-of-service
attacks.
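The article does not spell out MAPI's actual function calls, so the following is only a conceptual sketch in Python of the flow abstraction it describes; every class, method, and device name here is a hypothetical stand-in, not the real C library's interface.

```python
# Conceptual sketch only: a toy version of the flow abstraction that
# MAPI-style middleware provides. All names are hypothetical; the real
# MAPI is a C library whose exact interface the article does not give.

class MonitorFlow:
    """A 'network flow' in the monitoring sense: a packet stream defined
    by a filter, plus functions the platform applies to it."""

    def __init__(self, device, pkt_filter):
        self.device = device        # e.g. a COMBO adapter or a plain NIC
        self.filter = pkt_filter    # which packets belong to this flow
        self.functions = []         # processing pushed down to the platform

    def apply_function(self, name, **params):
        # Ask the platform to compute something per flow (packet counts,
        # bandwidth, an anomaly score) instead of shipping every raw
        # packet up to the application.
        self.functions.append((name, params))
        return len(self.functions) - 1   # handle for reading results later


# Hypothetical usage: count Web traffic on a monitored interface.
flow = MonitorFlow(device="combo0", pkt_filter="tcp port 80")
counter_id = flow.apply_function("PKT_COUNTER")
```

The design point is that an application declares what it wants measured and the platform decides where to compute it, in software or on programmable hardware such as the COMBO cards.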
Panning E-Waste for Gold
New York Times (05/17/06) P. 8; Moran, Susan
Each month, Hewlett-Packard's two major recycling centers process 1.5
million pounds of computer monitors, keyboards, game consoles, and other
discarded electronic items, many of which contain precious metals such as
gold, silver, and palladium. "We want these valuable resources put back
into the economy in some way, shape, or form," said Renee St. Denis, who
directs Hewlett-Packard's recycling operations in the Western hemisphere.
Electronics manufacturers are generally becoming more environmentally
conscious, weaning themselves away from toxic materials and taking greater
responsibility for recycling. The shift comes with an eye toward the
bottom line, however, as manufacturers seek to avoid potential lawsuits and
regulatory fines. The European Union has passed several environmental
provisions, including heavy restrictions on the use of lead, mercury,
cadmium, and three other hazardous materials frequently used in electronic
equipment. Several major companies have begun curtailing their use of lead
in monitors, looking instead to liquid-crystal displays, and Sony spent six
years developing a tin-silver-copper alloy solder with the same properties
as lead-based solder. Some Japanese companies have begun replacing petroleum-based
plastics in the casings for cell phones with biopolymers formed from plants
such as corn. Biopolymers present numerous engineering problems, however,
including their relative brittleness, intolerance to heat, and the fact
that some of the compounds that researchers add to strengthen them cancel
out their environmental benefits. Americans traded in or discarded more
than 63 million computers last year, but watchdog groups warn that around
80 percent of the electronics equipment that is collected for recycling
winds up in landfills in developing nations, and the United States has been
slow to enact regulatory policies governing electronics disposal.
Growing Geek Gap Gnaws U.S. Firms
Investor's Business Daily (05/15/06) P. A7; Deagon, Brian
The lack of interest among U.S. students in math and science could lead to
a critical shortage of skilled technical workers, experts warn. Companies
looking to replace the talent that has collectively been responsible for
the bulk of U.S. innovation for the past three decades will begin finding
fewer qualified graduates to fill the skilled positions. The NSF estimates
that the demand for science and engineering graduates will increase 26
percent by 2012, amounting to 1.25 million new jobs. U.S. schools are
falling far short of that demand, though the government and private sector
are working to address the shortage. Cisco, Intel, and other tech
companies spend millions on contests and scholarships to kindle interest in
the sciences, and Cisco recently held a panel addressing U.S.
competitiveness that included President Bush and California Gov. Arnold
Schwarzenegger, as well as 250 technology leaders. The United States faces
increased demand for jobs in its rapidly growing tech-manufacturing sector
as it competes in a flattened global marketplace against countries such as
China, which are spending more and more on science and technology. U.S.
companies had been supplementing their worker needs with foreign students
who came to U.S. institutions under the H-1B program, though in the wake of
the dot-com collapse and the Sept. 11 attacks, the program has been
curtailed sharply. The NSF reports that Asia produces engineering degrees
at four times the rate of the United States, a gap that the lack of math
and science proficiency among U.S. K-12 students is only likely to widen. "By the
fourth grade, if students don't catch the math and science train, they get
derailed and the problems get compounded as they go up," said Jeetan Singh,
CEO of HighPointsLearning.com. Politicians of both parties agree that
there is a problem, and numerous bills have been introduced to address it,
as well as President Bush's American Competitiveness Initiative, which
pledges to increase basic research funding and strengthen education.
UI's Beckman Institute Blazes Trail for Interdisciplinary
Research
Associated Press (05/13/06) Paul, Jim
The Beckman Institute for Advanced Science and Technology has become a
leading research center over the years, and is now viewed as the nation's
model for interdisciplinary research. The facility on the campus of the
University of Illinois at Urbana-Champaign brings together scientists and
researchers from a wide range of academic disciplines to focus on broad
areas involving biological intelligence, human-computer intelligent
interaction, and molecular and electronic nanostructures. Current projects
include the use of imaging technology to determine how the human brain
processes sound, and the development of hearing aids that can pick out a
voice in a noisy room. "Where else could I go where I would have the
opportunity to just walk across the hall and talk with world-class
researchers in areas that I need to know about in order to do my research
in the best way?" asks Jennifer Cole, a linguist who works with electrical
engineers, biologists, and psychologists to understand how humans process
language. "How much more convenient could it get?" About 90 percent of
Beckman's annual budget of millions of dollars comes from the federal
government, corporations, and other private sources. The 110 full-time and
part-time faculty at Beckman are joined by several hundred researchers,
graduate assistants, and postdoctoral fellows.
High Court Rejects Patent Rulings Against eBay
Washington Post (05/16/06) P. D5; Lane, Charles; Noguchi, Yuki
The U.S. Supreme Court ruled in favor of eBay in a patent infringement
lawsuit brought against the online auction service by MercExchange, by
throwing out an appeals court decision that would have shuttered eBay's
"Buy It Now" feature and ordering a federal district judge to restart the
case. Justice Clarence Thomas wrote in the court's opinion that the lower
courts should apply "traditional equitable principles" in revisiting the
case, arguing that the degree of harm to the patent owner, the sufficiency
of monetary damages, the "balance of hardships" between the patent owner
and infringer, and the public interest are factors that apply in all kinds
of cases, not just patent cases. The ruling fell short of many people's
expectations that patent law would be radically reconfigured, since the
court declined to decide whether the Patent Act essentially requires courts
to issue an injunction banning an infringer from using a patented invention.
The court's ruling could be partially explained by a desire not to flout
Congress' own efforts to address the beleaguered patent system. Though the
court's opinion was unanimous, some justices were more sympathetic to eBay,
while others leaned toward the plaintiff. Justice Anthony Kennedy
concurred with Justices David Souter, John Paul Stevens, and Stephen Breyer
that the lower courts should concentrate on the problem of "patent trolls,"
or parties that collect patents and make money by enforcing them rather
than practicing them. More in MercExchange's corner were Chief Justice
John Roberts and Justices Ruth Bader Ginsburg and Antonin Scalia, who wrote
a separate concurring opinion citing the historical precedent for granting
injunctions to patent holders.
Divining Tech's Future Is This Pundit's Idea of Fun
Seattle Times (05/16/06) Dudley, Brier
Technology leaders, investors, and journalists recently converged on San
Diego for Future in Review, or FiRe, an annual technology conference hosted
by entrepreneur and consultant Mark Anderson where attendees discuss the
pressing issues and trends in the technology industry, such as robotics,
broadband mobile devices, and nanotechnology. This year, speakers at FiRe
included Microsoft co-founder Paul Allen, Wipro Chairman Azim Premji, and
Google's Larry Brilliant. Participants acknowledged that the industry is
at a tipping point, where PC-focused companies such as Microsoft, Dell,
Cisco, and Intel have been losing market share while the increased demand
for mobile devices has returned some of the luster to telecommunications
companies. In identifying the major trends that will affect the future,
Telstra CEO Sol Trujillo identified the emergence of China and India and
the concurrent decline of Europe, the global rise in life expectancy that
will lead to new demands for simplified functions and interfaces, and the
growing dependence on online relationships. Genome research and
experimentation could bring about another industrial revolution as
biologists use supercomputers to model and commercialize the evolutionary
properties of various species, according to Larry Smarr, director of the
California Institute of Telecommunications and Information Technology.
Premji noted the strides that India has made in protecting its intellectual
property, while China has shown little interest in enforcing its
anti-piracy laws. Symantec CEO John Thompson warned that Macs can no
longer be seen as immune from security threats, now that more hackers are
targeting individual users, irrespective of the system they use, rather
than launching the traditional broad attacks that mainly affected PC users.
Congress May Make ISPs Snoop on You
CNet (05/16/06) McCullagh, Declan
House Judiciary Committee Chairman Rep. F. James Sensenbrenner (R-Wis.) is
set to introduce legislation as early as this week that would require ISPs
to record information about U.S. Internet users' online activities so that
police can more easily "conduct criminal investigations." The
legislation--called the Internet Stopping Adults Facilitating the
Exploitation of Today's Youth (SAFETY) Act--is likely to be controversial
because it would significantly change U.S. laws regarding the protection of
Americans' Web surfing habits. ISPs currently discard any log file that is
no longer required for business reasons such as network monitoring, fraud
prevention, or billing disputes, although they do make exceptions when
contacted by police conducting an investigation. Critics such as
Electronic Privacy Information Center executive director Marc Rotenberg
say Sensenbrenner's legislation is too vague. Instead of specifically
describing exactly what information ISPs would be required to keep about
their users, the legislation gives the attorney general broad discretion in
drafting regulations. At a minimum, the proposal says, user names, physical
addresses, IP addresses, and subscribers' phone numbers must be retained.
That broad wording could allow the attorney general to order ISPs to
keep records of email correspondents, Web pages visited, and even the
contents of communications. Despite the controversy, the legislation has
garnered the support of state law enforcement agencies, which say strict
data retention laws will help them investigate crimes that took place some
time ago.
The Great Singularity Debate
ZDNet (05/13/06) Farber, Dan
At the recent Singularity Summit at Stanford University, 12 panelists
convened to discuss the concept of singularity--the notion that machine
intelligence will one day eclipse human intelligence, and that the process
will accelerate exponentially as smarter minds continue creating still
smarter minds. Futurist Ray Kurzweil, father of the singularity theory,
discussed the highlights of his recent book, "The Singularity Is Near."
Kurzweil believes that the law of accelerating returns will propel
intelligence into the nonbiological realm and increase its capacity by
orders of magnitude. "In this new world, there will be no clear
distinction between human and machine, real reality and virtual reality,"
he writes. "Human aging and illness will be reversed; pollution will be
stopped; world hunger and poverty will be solved. Nanotechnology will make
it possible to create virtually any physical product using inexpensive
information processes and will ultimately turn even death into a soluble
problem." Kurzweil looks to the history of technological advancement to
defend the theory of accelerating returns. In Kurzweil's estimation, the
exponential advancement of technology will lead to the singularity within a
few decades. He predicts that cell phones will be capable of real-time
language translation within a few years. Kurzweil also believes that the
brain's design, as encoded in the genome, can be captured in around 20 MB,
and that the brain's computing power should be available in a $1,000 PC by 2020.
Douglas Hofstadter, professor of cognitive science and computer science at
Indiana University, followed Kurzweil with his critique of the
singularity theory. From a human perspective, uploading the whole of one's
self into cyberspace carries troubling implications, such as the question
of what constitutes the essence of a person who is essentially software.
Re-engineering the Research Process
NCSA (National Center for Supercomputing Applications) (05/02/06) Myers,
Jim
NCSA is focusing on researching and developing the advanced management and
connection capabilities that scientists and engineers will demand of
cyberenvironments in the future, writes NCSA associate director Jim Myers.
The integrated, end-to-end software systems promise to usher in a new day
of seamless cyberinfrastructure accessibility and usability, helping to
ensure that the loss of human knowledge does not become an unintended
consequence of modern technology. Aside from developing more sophisticated
service abstractions, NCSA is at work on new capabilities for automating or
semi-automating processes. Those include visual knowledge discovery, which
involves the use of data analysis to categorize, cluster, and extract
features from large data sets, along with interactive visualization. New
tools for managing semantic information about data and resources will
enable functions such as provenance tracking, annotation, and collaborative
data curation. Meanwhile, technologies such as Web and grid services,
translating or integrating middleware, global unique identifiers and
metadata, workflow and provenance, and semantic descriptions of resources
and data are all being used to align cyberinfrastructure with
cyberenvironments. Although cyberenvironments are viewed as a gateway for
accessing all sorts of computing capabilities from sensor arrays to
visualization tools, NCSA sees their potential in re-engineering science
and engineering research processes.
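As a rough illustration of the provenance-tracking idea mentioned above, a derived result can carry a record of what it was computed from, by which process, and when, so that results remain interpretable long after they are produced. The field names below are invented for the example and are not NCSA's actual schema.

```python
# A minimal sketch of provenance tracking: every derived dataset records
# its inputs and the process that produced it. Field names are
# illustrative assumptions, not NCSA's real metadata format.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ProvenanceRecord:
    dataset_id: str              # a globally unique identifier
    derived_from: list           # identifiers of the input datasets
    process: str                 # the workflow step that produced it
    annotations: dict = field(default_factory=dict)
    created: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

# Example: a visualization product traced back to raw sensor data.
raw = ProvenanceRecord("urn:data:sensor-array-42", [], "ingest")
plot = ProvenanceRecord("urn:data:storm-viz-7", [raw.dataset_id],
                        "visual-knowledge-discovery",
                        annotations={"curator": "example user"})
print(plot.derived_from)   # ['urn:data:sensor-array-42']
```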
Fragmented Domain Names Could Destabilize Internet--ICC
Warns
ag-IP-news (05/13/06)
A seven-page report released by the International Chamber of Commerce
(ICC) warns against decentralization of the domain name system and urges
quicker integration of internationalized domain names (IDNs). "Unless this process is carefully and
centrally implemented, domain names may lead to fragmentation and threaten
the stability, integrity, and security of the Internet," said Talal
Abu-Ghazaleh, chair of ICC's Commission on E-business, IT and Telecoms,
publisher of the paper. The warning comes as China, Russia, and Brazil
consider their own domain naming systems to offset the perceived policy
monopoly held by the United States through ICANN. The report recommends
that Unicode continue to be used and that oversight of the naming system
remain under one body, ICANN. "The introduction of IDNs is an important
step towards true global diffusion of the Internet," says ICC Internet and
IT Services Task Force Chair Allen Miller. "Multiple authorities would
pose serious problems for the protection of intellectual property rights,
would marginalize Internet users in the developing world, and create
islands of users blocked from full global access." The report also
suggests the creation of a classification system for domain names to help
prevent conflicts over trademark rights.
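The technical core of IDNs is a single, standard mapping from Unicode names onto the ASCII labels the existing DNS already understands, which is why the report stresses one centrally coordinated system. Python's standard-library "idna" codec demonstrates the round trip; the domain below is a made-up example, not a registered name.

```python
# IDNs map Unicode names onto ordinary ASCII DNS labels ("punycode"),
# so one global root can serve every script.

unicode_name = "bücher.example"            # illustrative, unregistered name

ascii_form = unicode_name.encode("idna")   # the form actually stored in DNS
print(ascii_form)                          # b'xn--bcher-kva.example'

print(ascii_form.decode("idna"))           # round-trips to 'bücher.example'
```

If rival roots applied different mappings, the same Unicode name could resolve to different ASCII labels on different networks, which is the fragmentation the ICC warns about.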
Java Inches Closer to Open Source
CNet (05/16/06) LaMonica, Martin
Sun Microsystems will release the source code for another round of Java
applications, including its portal and integration software, at the JavaOne
conference on Tuesday. Sun is also expected to discuss the Java
Distribution License, which makes the desktop Java Runtime Environment more
compatible with Linux. The announcements come as Sun is transitioning to
an open-source business model and working to accommodate more development
languages. Java now receives most of its innovative thrust from
open-source development projects and scripting languages. Sun's new
strategy is to generate more revenue from offering support for open-source
software products and to form deeper connections with developers, who often
guide corporate purchasing decisions. Sun's open-source approach has
rejuvenated interest in the Solaris operating system and the Java
Enterprise System. However, Sun is facing increased competition from
alternative languages to Java, especially for writing Web applications.
Sun's response is to make Java more compatible with these dynamic languages
to ensure that Java remains relevant in a development atmosphere looking
increasingly to faster, simpler languages.
Scan This Book!
New York Times Magazine (05/14/06) P. 42; Kelly, Kevin
The promise of a universal library--a virtual repository of all human
works that anyone in the world can access--resides in its ability to make
all books searchable by weaving them together through word/sentence links
and public annotations, turning reading into a communal activity. With all
works thus interconnected, it is believed that even the most obscure works
will find an audience, our comprehension of history will be deepened, a new
sense of authority will be nurtured, and Web search will support a novel
infrastructure for wholly unique services and functions. A universal
library will provide users with the means to organize virtual bookshelves
thanks to ubiquitous snippets, articles, and pages of books that can be
shuffled and transferred; such collections could become a source of
distinction and maybe income for users. The sole reason for the universal
library's painfully slow growth is the indefinite extension of copyright,
which has led to a situation in which about 75 percent of the world's 32
million archived books are orphan works, about 15 percent are in the public
domain, and about 10 percent are actively in print. The bulk of U.S.
libraries' book digitization effort is concentrated on the 15 percent or so
of all books in the public domain, while the roughly 10 percent of books
still in print should be digitized soon; the remaining three-quarters of
the world's books are not open to scanning because their copyright status
is unknown, and current copyright law prevents any of these orphans from
reverting to the public domain until 2019. Google crafted a plan to scan
orphan works but only show limited snippets of those works, allowing
copyright holders to ask for the snippets' removal if proof of ownership
was established, but the plan hit a snag when publishers and authors
alleged that such use infringed on copyright. The business model based on
the mass production of cheap copies, which has benefited creators as well
as audiences by removing the need for patronage, is on the verge of
obsolescence because digital technologies are making free copies
ubiquitous, and raising the value of how these copies are connected,
annotated, reorganized, translated, bookmarked, and incorporated into the
universal library. It is expected that copyright law will ultimately adapt
to digitization and furnish a business model where creators only receive
copyright if they make their works searchable.
More Visas, Less Work
CIO (05/15/06) Vol. 19, No. 15, P. 24; Gross, Grant
Information technology workers continue to claim that tech companies are
using H-1B visas to obtain cheaper foreign labor, but many companies are
concerned that the industry will face a labor shortage in the near
future. Microsoft and other tech companies that are pressing
Congress to raise the limit on the H-1B visa program scored a victory in
March when a U.S. Senate committee voted to raise the annual cap from
65,000 to 115,000, allow for additional increases once the new limit is
reached, and eliminate any caps for advanced-degree holders. Congress has
started debating the Comprehensive Immigration Reform Act of 2006, which
includes the H-1B visa provisions. "U.S. businesses should have access to
the best and brightest workers in the world," Rep. Bob Goodlatte (R-Va.)
said during a hearing of the House Subcommittee on Immigration, Border
Security and Claims. A new report from the Society for Information
Management indicates that technology executives are uneasy about the
prospect of filling entry-level programmer and systems analyst positions.
During the hearing, John Miano, a computer programmer for 18 years, said
companies pay foreign programmers $13,000 less than their U.S. counterparts. "We should not have a visa
program that allows an employer to lay off U.S. workers in favor of cheaper
foreign labor," added Rep. Steve King (R-Iowa).
Microsoft Researcher Honored
EE Times (05/08/06) No. 1422, P. 54; Mokhoff, Nicolas
Microsoft's George Robertson, the human-computer interaction pioneer who
coined the term "information visualization," was recently inducted into the
CHI Academy for his contributions to the field. Microsoft's research into
the human-computer interface began with Robertson's hiring in 1996, and the
concept of information visualization has attracted the attention of
military, business, and intelligence leaders. Information visualization
aims to present data in innovative formats using color graphics or
animation that provide an interactive experience. Robertson also sits on
the National Visualization and Analytics Center Panel at the Pacific
Northwest National Laboratory, which conducts analysis of information
gathered on terrorist activities. "The problem is huge," said Robertson.
"In one database alone there are 120 billion documents, and the pace is
that 1 million documents are changed every hour when searching for clues.
That requires an enormous effort to show graphically." At his induction
ceremony, Robertson presented two papers describing Microsoft's latest
work, including a comparison of different graphical software interfaces,
which found that the kind of task information an interface conveys varies
with its level of abstraction. The study measured interactions at the
levels of scaling, which displays the layout of a window; change detection,
which measures whether any changes occurred; semantic-content extraction,
which displays a small amount of content in the most relevant window; and a
mixture of change detection and semantic-content extraction. The second
paper outlined a new technique for using a mobile phone to search through
large data sets using iterative data filtering to lessen the reliance on
keyword text entry.
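The paper's actual technique is not detailed here, but the iterative-filtering idea can be sketched simply: instead of typing keywords on a phone keypad, the user repeatedly picks one attribute value and the candidate set shrinks until it is small enough to browse. The data and facets below are invented for illustration.

```python
# Toy sketch of iterative data filtering as a substitute for keyword
# entry on a phone. Data and facet names are invented for the example.

restaurants = [
    {"name": "Trattoria",  "cuisine": "Italian", "price": "$$"},
    {"name": "Noodle Bar", "cuisine": "Asian",   "price": "$"},
    {"name": "Bistro 9",   "cuisine": "French",  "price": "$$$"},
    {"name": "Pasta Co.",  "cuisine": "Italian", "price": "$"},
]

def refine(items, facet, value):
    """One filtering step: keep only items matching the chosen facet value."""
    return [it for it in items if it[facet] == value]

step1 = refine(restaurants, "cuisine", "Italian")   # 2 candidates remain
step2 = refine(step1, "price", "$")                 # 1 candidate remains
print([it["name"] for it in step2])                 # ['Pasta Co.']
```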
John Koza Has Built an Invention Machine
Popular Science (05/06) Vol. 268, No. 5, P. 66; Keats, Jonathon
Stanford University adjunct professor John Koza's "invention machine" is a
network of 1,000 PCs that creates innovative new designs without human
guidance through genetic programming based on Darwinian evolution. The
process induces bits of computer code to sire successively superior
offspring until the final generation is so supremely well-adapted for its
assigned function that it outclasses any product of human conception. In
creating the invention machine, Koza focused on addressing the shortcomings
of genetic algorithms and artificial intelligence and then combining the
two: The algorithms were unable to develop creative solutions to problems,
while AI's performance was not living up to its promise. The machine
randomly generates designs, measures their fitness rating, and then
combines or mates some designs, redistributing their properties into their
offspring. Offspring that exhibit an improved fitness rating are retained,
while offspring with a low fitness rating are weeded out. The invention
machine can evolve a new design in anywhere from a day to a month. A testament to
the machine's success was a U.S. patent awarded to an invention it created
to boost factory efficiency, without the examiner realizing that the
inventor was non-human. Koza's next goal is to use genetic programming to
invent something that is commercially successful. Regardless, Koza
believes that genetic programming will become pervasive in the next 10
years, providing an efficient method for solving difficult engineering
problems.
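For readers unfamiliar with the generate-score-select-mate loop described above, here is a minimal sketch. It evolves a toy bit string toward a fixed target, whereas Koza's system evolves program trees representing circuits and controllers, so this is a drastic simplification of his method, not a reproduction of it.

```python
# Minimal genetic-algorithm sketch of the loop the article describes:
# generate random candidates, score fitness, keep the fitter ones, and
# breed the next generation. The bit-string "design" is a toy stand-in.
import random

TARGET = [1] * 20                                  # stand-in design goal

def fitness(design):                               # higher is better
    return sum(1 for a, b in zip(design, TARGET) if a == b)

def mate(p1, p2):
    cut = random.randrange(1, len(p1))             # one-point crossover
    child = p1[:cut] + p2[cut:]
    if random.random() < 0.1:                      # occasional mutation
        i = random.randrange(len(child))
        child[i] = 1 - child[i]
    return child

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(50)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]                    # low fitness weeded out
    children = [mate(*random.sample(survivors, 2)) for _ in range(40)]
    population = survivors + children

best = max(population, key=fitness)
print("best fitness:", fitness(best), "of", len(TARGET))
```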
Bringing DNA Computers to Life
Scientific American (05/06) Vol. 294, No. 5, P. 44; Shapiro, Ehud;
Benenson, Yaakov
Computers composed of biological molecules do not seem so far-fetched
considering that natural molecular machines process information in much
the same manner as a Turing machine. Both the Turing
machine and natural automata have demonstrated the ability to store data in
strings of symbols, process these strings in a stepwise pattern, and modify
or add symbols in keeping with fixed rules. DNA and enzymes assembled into
a Turing-like automaton can carry out computations, receive input from
other molecules, and produce a discernible output, such as a signal or a
therapeutic drug. This organic device illustrates the feasibility of the
concept and could find useful application as a medical tool. In the living
cell, the ribosome reads data encoded in gene transcripts, or messenger
RNAs (mRNAs), and converts it into amino acid sequences to form proteins;
mRNA's symbolic alphabet consists of nucleotide trios or codons that each
correlate to a specific amino acid. The ribosome processes the mRNA strand
on a codon-by-codon basis, and the proper amino acid is delivered by
transfer RNA (tRNA) molecules, which verify the codon match and then
release the amino acid to join the expanding chain. Scientists Ehud
Shapiro and Yaakov Benenson created an autonomous, programmable molecular
computer that could run itself on its input molecule and theoretically
process any input molecule with a fixed number of hardware and software
molecules without ever running down. They have also outlined a biomolecular
Turing machine that harnesses molecules' ability to identify symbols and
assemble molecular subunits. Once they proved that DNA/enzyme
automata can perform abstract yes-or-no computations, the scientists
developed a device that can determine whether disease indicators are
present and release a drug molecule if the diagnosis is positive.
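A rough software analogue of that diagnostic behavior is a two-state automaton that scans a list of molecular indicators and ends in the "release drug" state only if every marker it checks is present. The marker names below are invented for illustration, and the real device computes with DNA strands and enzymes rather than code.

```python
# Software analogue of the yes/no diagnostic automaton: start in the
# "yes" (disease present) state and flip to "no" if any expected marker
# is missing. Marker names are invented for the example.

DISEASE_MARKERS = ["mRNA_A_high", "mRNA_B_high", "mRNA_C_low"]

def diagnose(observed):
    state = "yes"                       # assume disease present at the start
    for marker in DISEASE_MARKERS:      # stepwise, symbol-by-symbol check
        if marker not in observed:
            state = "no"                # one missing marker flips the state
    return "release drug" if state == "yes" else "stay inert"

print(diagnose({"mRNA_A_high", "mRNA_B_high", "mRNA_C_low"}))  # release drug
print(diagnose({"mRNA_A_high"}))                               # stay inert
```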