Aggregated Information Threatens Privacy
Norristown Times Herald (PA) (03/13/06) Phucas, Keith
Technological advances have enabled the federal government to gather and
search Americans' personal information on an ever-greater scale, says the
American Civil Liberties Union, which also notes that the government often
purchases from data-aggregating companies personal information that it
cannot legally obtain on its own. The ACLU warns that LexisNexis, ChoicePoint, and
others have built a multi-billion-dollar industry out of the increased
demand for personal information and lax privacy laws, while the information
collected and shared often contains glaring inaccuracies, such as erroneous
criminal records. The federal government was running 131 data mining
initiatives in 2004, with plans to launch another 68, according to the
Government Accountability Office. The Verity K2 Enterprise, a Defense
Intelligence Agency initiative to search for terrorists overseas with
connections to U.S. citizens, is particularly controversial. A Freedom of
Information Act request filed by the Electronic Privacy Information Center
to obtain disclosure of the initiative's activities was denied, and the
center has followed up with a lawsuit. DIA spokesman Don Black reported no
knowledge of the program, but said the agency routinely runs comparative
analyses on terabytes of data, likening its activities to Google. Black
also said the agency always looks for something specific in its searches.
The ACLU has also warned of the numerous government contracts held by data
aggregators (ChoicePoint has 35) that enable the government to skirt the
requirements of the Privacy Act of 1974 prohibiting the government from
compiling data about its citizens who are not the subject of
investigations. Data collection is subject to a host of complex and often
vague laws, which could be streamlined so that other companies would have
to follow the rules of Internet service providers, who are not allowed to
store customer information permanently, according to Villanova law
professor Michael Carroll.
Software Helps Develop Hunches
Wired News (03/13/06) Norton, Quinn
Pattern recognition, an innate ability for humans, has been difficult to
incorporate into computers and has posed the central limitation to
human-computer interaction. Rather than force computers to perform tasks
beyond their ability, Icosystem founder Eric Bonabeau and his team have
developed software to hone human intuition, with the software performing
the computing functions and the user supplying the human intelligence. The
hunch engine progresses from its starting point through a series of
mutations selected by the user at each stage from a bank of possibilities
until it produces a version that is satisfactory to its user. One
application of the hunch engine is a photo editor that lets a user select
from nine versions of a scanned image, with the mutation step repeating
indefinitely until the software produces an acceptable final product. Icosystem intends
to include interior-decorating and name-selection applications in the hunch
engine, and postal workers in France have already used the technology to
determine ideal carrier routes. Coalesix, a spinoff of Icosystem, is
testing the use of the hunch engine to develop new drugs based on
variations of the interactions of molecules. The mutation aspect of the
hunch engine can help guide innovation in directions that humans would
normally overlook.
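An interactive evolutionary loop of this kind is easy to sketch. In the toy version below, the parameter vector, mutation scale, and the simulated stand-in for the human chooser are all illustrative assumptions, not Icosystem's implementation:

```python
import random

random.seed(42)  # reproducible run for this sketch

def mutate(params, scale=0.2):
    """Produce one random variant of a parameter vector."""
    return [p + random.uniform(-scale, scale) for p in params]

def hunch_engine(start, choose, n_variants=9, rounds=30):
    """Interactive evolutionary loop: offer n_variants mutations of
    the current candidate at each stage and let `choose` (the human,
    simulated below) pick the one to carry forward."""
    current = start
    for _ in range(rounds):
        bank = [mutate(current) for _ in range(n_variants)]
        current = choose(bank)
    return current

# Stand-in for the human user, whose unstated goal is the point (1, 1):
TARGET = [1.0, 1.0]

def pick_closest(bank):
    return min(bank, key=lambda v: sum((a - b) ** 2 for a, b in zip(v, TARGET)))

result = hunch_engine([0.0, 0.0], pick_closest)
```

With a real user in the loop, `pick_closest` would be replaced by a person clicking on whichever variant they like best at each stage.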
Immigration Bill Would Add Visas for Tech Workers
San Francisco Chronicle (03/10/06) P. A1; Lochhead, Carolyn
While the debate over the Senate's immigration bill has centered on its
guest-worker language pertaining to unskilled laborers, the legislation
also includes provisions to increase the number of H-1B visas for skilled
workers to 115,000 and start foreign tech workers with advanced degrees on
the fast track to permanent residence. The annual cap for H-1B visas could
increase by 20 percent a year under the skilled-worker provisions, which
were included after heavy lobbying from the tech industry. Following the
dot-com crash and the Sept. 11 attacks, Congress reduced the annual H-1B
visa cap to 65,000, which the country reached in August, effectively
halting the influx of foreign tech workers, save for a January exemption
authorizing an additional 20,000 visas for workers with advanced degrees.
While the revelation that some of the Sept. 11 hijackers had gained entry
to the country with student visas prompted the visa cuts, concern has
shifted in Congress due to widely publicized reports that the United States
is losing its competitive edge in technology. The immigration bill,
introduced by Sen. Arlen Specter (R-Pa.), also calls for the creation of an
F-4 visa category granting permanent residence to students pursuing
advanced degrees in technology, engineering, mathematics, or science,
provided that they find a job and contribute $1,000 to help fund
scholarships and U.S. worker training. The bill would also relax labor
certification requirements for qualified degree holders and open access to
green cards and permanent residency for professors, researchers, and those
deemed to possess "extraordinary ability." While the new immigration laws
enjoy bipartisan support in the Senate and the endorsement of the Bush
administration, House Republicans are opposed to any immigration increases,
and voted in December to seal off the United States' southern border with a
700-mile fence.
'New Internet' Moves One Step Forward
Network World (03/10/06) Pappalardo, Denise
The next-generation Internet was the focus of a town hall meeting last
week. Proposed a year ago by the National Science Foundation, the Global
Environment for Networking Innovations (GENI) would serve as an experimental
platform that would provide researchers with greater flexibility in
carrying out their work, according to Larry Peterson, chair of the GENI
Planning Group and professor and chair of the computer science department
at Princeton University. The dynamic nature of computing will demand an
improved Internet in the years to come, in addition to evolving security,
accessibility, and manageability needs, says Peterson. He says industry
will not address such issues and has no incentive to do so, adding that the
current Internet determines the level of research that the academic
community can pursue. According to the original design for GENI, plans
call for a national and later an international fiber optic network with
programmable routers, clusters at the edge sites, wireless subnets and
peering to the Internet at MAE East and MAE West, which allow access to
content on the current public Internet. GENI would help to "change the
nature of networked and distributed systems design," says Peterson. The
platform could be built in five to seven years, and could be available for
research a year after construction begins.
Carnegie Mellon to use 'Sims' to Enliven Software
Associated Press (03/13/06) Lovering, David
The next version of Carnegie Mellon University's Alice, an educational
program that introduces computer programming concepts to first-time users,
will use characters and animation from the popular "The Sims" video game to
transform the teaching aid's crude three-dimensional images.
Computer science professor Randy Pausch, director of the Alice Project,
says, "For the intended demographic we're trying to teach, 'The Sims' are
more valuable than the Disney library." Alice, which is used at over 60
colleges and universities and about 100 high schools, was designed to boost
interest in computer programming by making it easily understandable. The
language uses images of people and animals that are controlled by words to
help students create programs. The number of computer science majors has
fallen 50 percent over the past four years, while the proportion of women
considering computer science degrees is the lowest since the early 1970s,
according to a University of California, Los Angeles study. Pausch says
Electronic Arts, which publishes 'The Sims,' "wants more women in computer
science, they want more minorities in computer science...any
underrepresented group." The next version of Alice is expected to take 18
to 24 months to develop.
Research on the Road to Intelligent Cars
IST Results (03/09/06)
The IST-funded PReVENT program, a component of the European Commission's
2010 Intelligent Car Initiative, is attempting to develop new safety
applications in cars that will sense danger in an effort to meet the EU's
goal of cutting the number of highway fatalities in half by 2010. While
high costs and lack of demand stymied earlier attempts to include
intelligent safety systems in cars, the technology exists today to deploy
them inexpensively and on a wide scale. A dashboard display in a BMW
creates digital maps with lasers and sensors to extend the driver's horizon
by 300 meters to 500 meters, allowing him to anticipate what is coming
around the next curve. MAPS&ADAS project partners will submit the
technology for certification, and automakers hope the new interface
could lower the cost of implementation. The project is also working to
make digital maps into more than just navigation devices by incorporating
information on speed limits, slopes, curves, and traffic signs to improve
their use as a safety application. Project leaders expect to see the
results of the initiative appear widely in cars within five years. Another
PReVENT subproject, INTERSAFE, is focusing on improving traffic safety at
intersections, which pose the greatest challenge to a car's onboard
sensors. The INTERSAFE project is developing new vehicle localization
algorithms, sensors to warn of approaching drivers, and new techniques for
communication between the road and the vehicle. The system is particularly
designed to help drivers when they miss stop signs and red lights, make
left turns, and cross traffic. APALACI, another sub-project, is developing
applications to prevent and mitigate crashes, such as tightening seat belts
immediately before a crash and preparing a car's brakes to avoid an
accident. "The challenge for intelligent safety systems is to avoid false
alarms, so that users quickly come to trust them," said DaimlerChrysler's
Matthias Schulze.
PetaCache: Accelerating Data-Intensive
Applications
HPC Wire (03/10/06) Vol. 15, No. 10, Tuttle, Kelen
A team of researchers at the Stanford Linear Accelerator Center has
launched the PetaCache project to reassess the approach that developers
take to data access and storage. Disks are handling more of the storage
load as clustered computer systems process larger amounts of data, and
increased bandwidth has sped the transmission of data between terminals in
the system, though latency has improved little. PetaCache is principally
designed to improve latency, according to the project's Randal Melen.
PetaCache relies on various forms of memory, rather than disks, to access
the first byte of data. With the prices of DRAM and flash memory having
fallen dramatically, the quantities of memory required to retrieve massive
amounts of data from particle accelerators and other data-intensive
applications are now affordable. As flash memory finds increasing use in
digital cameras, iPods, and cell phones, the price is only likely to drop
further. The PetaCache prototype consists of two racks of 64 servers, each
containing 16 GB of DRAM, totaling 1 TB of memory. The Structured Cluster
Architecture for Low Latency Access program shuttles data from the servers
to batch systems that run physics analysis software to determine the lowest
latencies possible. The software balances the load and organizes itself,
distributing data over numerous servers with optimal efficiency, creating
the appearance of a single, aggregated chunk of memory. Through the use of
more inexpensive flash memory, the next prototype should contain a few tens
of terabytes of memory, which approaches the scale that will be required to
process the data generated by the Large Hadron Collider.
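The software's trick of making many servers' DRAM look like "a single, aggregated chunk of memory" can be caricatured by hashing each key to a home server. The server count, hash choice, and API below are assumptions for illustration, not the SLAC code:

```python
import hashlib

class MemoryCluster:
    """Toy sketch of pooling many RAM-backed servers into one flat
    key/value store; server count and layout are illustrative, not
    the SLAC design."""

    def __init__(self, n_servers=64):
        self.servers = [{} for _ in range(n_servers)]

    def _home(self, key):
        # Hash the key to pick a server, spreading the load evenly.
        digest = hashlib.sha256(key.encode()).digest()
        return int.from_bytes(digest[:8], "big") % len(self.servers)

    def put(self, key, value):
        self.servers[self._home(key)][key] = value

    def get(self, key):
        return self.servers[self._home(key)].get(key)

cluster = MemoryCluster()
for i in range(1000):
    cluster.put("event-%d" % i, i * i)
```

Because any client can compute a key's home server directly, a lookup costs one network round trip to RAM rather than a disk seek, which is where the latency win comes from.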
Microsoft Researcher Honored as Computer-Graphics
Pioneer
Seattle Times (03/02/06) Doughton, Sandi
Microsoft's John Platt was honored at the Scientific and Technical Oscars
on Feb. 18 for his groundbreaking work in computer graphics, particularly
his contributions to the computer simulation of flexible materials such as
gels, rubber, and cloth. Though subsequent developers have further honed
and refined his technique, Platt's approach still underpins effects such as
the billowing sails of an armada of computer-generated ships in the movie
"Troy." Platt was a member of California Institute of Technology computer
science professor Alan Barr's circle of "brilliant nerds" that formed in
the 1980s, when computer graphics was essentially primitive geometric
shapes. Three Pixar employees who had also been a part of Barr's group won
awards for work that built on Platt's research. Flexible materials,
particularly clothing, have long been difficult to simulate accurately, and
early equations could only muster enough detail and realism to create
materials that resembled rubber or chain mail. Platt and Demetri
Terzopoulos, now Chancellor's professor of computer science at the
University of California, Los Angeles, applied rudimentary physics
equations to describe the behavior of elastic materials. "You take these
laws of motions for cloth and you simulate gravity, you simulate wind, you
simulate contact between the cloth and the body," says Terzopoulos. After
that, an animated character's clothes will automatically follow his
movements. It took almost 10 years before processing power evolved to a
point where the technique could actually be applied to movies, though
clothing simulations are still not easy.
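The physics Terzopoulos describes can be sketched as a mass-spring system: here, a minimal strip of cloth hanging under gravity with its top point pinned. The constants and integration scheme are illustrative choices, not the original equations:

```python
def simulate_cloth_strip(n=5, steps=2000, dt=0.005, k=200.0, g=-9.8, damping=0.9):
    """Hang a vertical chain of unit point masses joined by springs of
    rest length 1.0, top mass pinned; semi-implicit Euler integration."""
    rest = 1.0
    ys = [-i * rest for i in range(n)]  # vertical positions, top at 0
    vs = [0.0] * n
    for _ in range(steps):
        forces = [g] * n                    # gravity on every mass
        for i in range(n - 1):              # spring between masses i and i+1
            stretch = (ys[i] - ys[i + 1]) - rest
            f = k * stretch                 # Hooke's law
            forces[i] -= f                  # stretched spring pulls i down
            forces[i + 1] += f              # and pulls i+1 up
        forces[0], vs[0] = 0.0, 0.0         # top mass stays pinned
        for i in range(1, n):
            vs[i] = damping * (vs[i] + forces[i] * dt)
            ys[i] += vs[i] * dt
    return ys

ys = simulate_cloth_strip()
```

A full cloth simulator extends this to a 2D grid and adds shear and bend springs plus collision handling, which is why the technique waited on processing power.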
Open Source Software Capability Key to 'Technological
Self-Determination'
LinuxElectrons (03/08/06)
Experts from the United Nations University herald open-source software as
a tremendous opportunity for developing nations to obtain economic
independence, but also warn of the critical need to cultivate local
expertise to support open-source development. Between 50 percent and 75
percent of Internet activity involves open-source software, and UNU experts
believe that China, East Asia, India, and South America will be the largest
markets for open-source solutions, though there remains a shortage of
developers. "Should this situation persist, developing nations will simply
remain consumers of open-source products rather than participants in the
larger open-source market," says Mike Reed, director of the UNU
International Institute for Software Technology (UNU-IIST). Open-source
technology provides a framework that enables enterprises to develop
high-value products more quickly, a key ingredient in the transition from
passive consumer to active participant. Within the Linux environment,
analysts predict that revenue from packaged software will grow the fastest,
increasing 44 percent annually over the next four years. Open-source
development will benefit the governments of developing nations as well as
industry, providing low-cost implementation of local solutions and content,
and greater standardization and transparency. Open source has also enabled
governments to develop innovative solutions in a host of areas, including
customs reform, providing online access to land titles, and electoral
reform. UNU-IIST is attempting to increase the number of open-source
developers in East Asia through the Global Desktop Project, aimed
specifically at improving the open-source desktop. The project includes a
research and engineering component, an institute of higher learning
program, the incorporation of open-source programming into school
curricula, and a community outreach program.
Scientists Band Together for TRUST-worthy Research
SearchSecurity.com (03/07/06) McKay, Niall
The Team for Research in Ubiquitous Secure Technology (TRUST) initiative
is performing a key role in the nation's effort to safeguard its digital
infrastructure from cyber criminals. Funded by $19 million from the
National Science Foundation, and led by the University of California,
Berkeley, TRUST brings together computer security leaders from universities
across the country to build better systems and develop better policies for
government and business. In fact, policy changes can determine the
effectiveness of technology, particularly with regard to the use of
publicly available information such as Social Security numbers to partly
authenticate an individual, according to Fred Schneider, chief scientist of
TRUST. Schneider, who is also a professor of computer science at Cornell
University, adds that storing large amounts of information on individuals,
often without their consent or knowledge, is another issue that needs to be
addressed as a matter of policy. Among other projects, TRUST participants
are focusing on language-based security to develop "security grammar" for
computer programming languages, as a way to warn systems and users before
they run software executables and worm downloads. Participants from
Stanford University have developed software for the U.S. Secret Service
called PwdHash, which is designed to prevent a cyber attacker from
intercepting messages in a public key exchange and substituting his own for
the requested one. Experts from Carnegie Mellon, San Jose State, and
Vanderbilt universities and several small liberal arts colleges are
involved in TRUST, which is also receiving assistance from companies such
as IBM, Cisco Systems, and Microsoft.
GA Tech Develops Ultra-Efficient Embedded Architectures
Based on Probabilistic Technology
LinuxElectrons (03/09/06)
Researchers at the Georgia Institute of Technology have used probabilistic
technology to develop embedded architectures that save an enormous amount
of energy, and presented their work at the Design, Automation, and Test in
Europe (DATE) Conference in Munich, Germany, on March 9. Dr. Krishna
Palem, a joint professor in the Georgia Tech College of Computing and the
School of Electrical and Computer Engineering, says the embedded
architecture based on Probabilistic CMOS (PCMOS) delivers architectural and
application gains by a factor of up to 560 in simulations. "Probabilistic
architectures extend PCMOS to computing substrates beyond devices," says
Palem, also founding director of the Center for Research in Embedded
Systems & Technology. "By mixing chip measurements and simulations, gains
have been shown using this technology for such applications as
Hyper-encryption as applied to computer security, and through cognitive
applications such as speech recognition and pattern recognition as well as
image decompression." Conventional CMOS semiconductor technology is moving
toward the nanoscale, and the issues of noise and energy efficiency need to
be addressed. The PCMOS device uses noise as a resource to achieve greater
energy savings.
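Trading occasional bit errors for energy, as PCMOS does in hardware, can be caricatured in software with an adder whose low-order bits are noisier than its high-order bits. All probabilities below are invented for illustration; this is not the PCMOS circuit design:

```python
import random

random.seed(1)  # reproducible run

def noisy_add(a, b, bit_err, bits=8):
    """Add modulo 2**bits, then flip each result bit with its own
    error probability; a higher error rate on a bit stands in for
    spending less switching energy on it."""
    true = (a + b) % (1 << bits)
    out = 0
    for i in range(bits):
        bit = (true >> i) & 1
        if random.random() < bit_err[i]:
            bit ^= 1                    # noise flips this bit
        out |= bit << i
    return out

# Concentrate the error budget on the low-order (cheap) bits:
err_profile = [0.2, 0.1, 0.05, 0.01, 0.0, 0.0, 0.0, 0.0]
trials = [abs(noisy_add(100, 23, err_profile) - 123) for _ in range(2000)]
mean_err = sum(trials) / len(trials)    # small on average
```

Applications such as image decompression or speech recognition tolerate exactly this kind of small, low-significance error, which is why they suit probabilistic substrates.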
Gadget Girls and Boys With Their Toys: How to Attract and
Keep More Women in Engineering
University of Edinburgh (03/09/06)
A recent study sponsored by the Economic and Social Research Council and
conducted by the University of Edinburgh's Dr. Wendy Faulkner sought to
investigate whether engineering workplaces are friendlier to and more
supportive of men than women, based on interviews with and/or observation
of 66 male and female engineering professionals covering a broad spectrum
of disciplines and industrial sectors. The results of the study
demonstrate that, though governments and industry have made significant
progress in tapping women for engineering jobs, further improvements in
both the recruitment and retention of female engineers are necessary. The
engineering industry's lack of appeal to women is partly due to a
persistent stereotype: The engineer as a technologically adept but
socially unskilled male, which is not a reflection of reality, according to
Faulkner's study. To combat this image problem, it is recommended that
efforts to recruit more women focus on "normalizing" engineering as a
career choice for women in order to eliminate the assumption by people both
inside and outside the field that "the engineer" will be a man; recruitment
campaigns that do not appeal to gender stereotypes, but do appeal to men
and women's shared interest in math, science, and technology; and the
promotion of engineering as a technical as well as social field. To retain
more female as well as male engineers, good practice at both the university
and the workplace must be sustained and promoted. Seemingly trivial
aspects of workplace culture can discourage women's sense of belonging:
Such factors include coarse and offensive humor, male-oriented routines for
greeting and addressing each other, gender exclusive language and social
circles, women's tendency to physically stand out in mostly male workplaces
so that their engineering skills are de-emphasized, and sexual harassment
and/or flirting. Sustained and sensitive diversity training backed by
senior management can help both male and female engineers fight
gender-discriminating workplace cultures.
Government Sides Against EBay in Patent Dispute
Washington Post (03/11/06) P. D1; Noguchi, Yuki
EBay derives about one third of its business from goods sold using its "Buy
It Now" system, which could be shut down if the Supreme Court agrees with a
brief filed by the Office of the Solicitor General against eBay on March
10. The brief says eBay willfully infringed on patents owned by
MercExchange, and though eBay was found liable for such infringement, an
injunction was denied by a district court; the decision was overturned by
an appellate court, and the solicitor general urged the Supreme Court to
permit the injunction. EBay's argument that the appellate court applied a
"nearly automatic" injunction rather than permitting the district court to
use discretion is disputed by the solicitor general's brief. The brief
goes on to say that the district court was incorrect in its acceptance of
eBay's claim that MercExchange would not be irreparably damaged by eBay's
continued operation of the "Buy It Now" system because MercExchange was
willing to license its patent. Washington patent attorney Stephen Maebius
said the effects of a Supreme Court decision that patent violation
automatically necessitates a shutdown will be profound. "It really does
tinker with the balance of power when it comes to injunctions," he said.
EBay argued that an injunction should not "automatically" be awarded to a
litigant: MercExchange, the contention goes, was no longer actively
practicing its patent, and should therefore not try to threaten a business
with an accusation of patent infringement. MercExchange founder Thomas
Woolston countered that his company does practice its patent by licensing
it to other companies, and claimed that "If the Supreme Court says 'yes' to
the injunction, they will be saying 'yes' to competition."
IETF Taking on 911 Problem Within VoIP
Network World (03/13/06) P. 32; Marsan, Carolyn Duffy
The IETF is working on a technical solution that could solve the problems
of how to best route emergency communications such as 911 calls over the
Internet and how to ensure that police and firefighters can locate and
respond to VoIP 911 calls made from office buildings. The solution, which
is called Emergency Context Resolution with Internet Technologies (ECRIT),
will allow an IP phone to obtain its location information--such as a street
address or office number--when it is used to dial 911. The IP phone will
then query a database using a new mapping protocol that will take its
location information and find the appropriate emergency call center. After
the emergency call center is found, the IP phone will place a call to that
emergency call center, which will be given the location of the caller.
ECRIT will require enterprises to make a number of changes, including
upgrades to their IP phones and IP PBXs. Companies will also need to
create a database with the location of every IP address on the network.
Jon Peterson, a NeuStar fellow and a member of the IETF leadership who is
advising the ECRIT working group, says enterprises may end up using DHCP to
acquire the physical location of IP addresses. "It's very simple to
provide DHCP mapping to push location information down to the phone,"
Peterson says. ECRIT systems will not likely be deployed for several
years, IETF leaders say. "This is not something that is going to get
changed overnight," Peterson says.
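The mapping step, location in, call-center out, can be sketched as a most-specific-prefix lookup over civic-address fields. The database entries and SIP URIs below are invented, and the working group's actual mapping protocol is considerably richer:

```python
def find_psap(civic_location, mapping_db):
    """Resolve a tuple of civic-address fields to the URI of the
    serving emergency call center by most-specific prefix match."""
    best, best_len = None, -1
    for prefix, psap in mapping_db:
        n = len(prefix)
        if civic_location[:n] == prefix and n > best_len:
            best, best_len = psap, n
    return best

# Invented entries; a real deployment would cover every jurisdiction.
mapping_db = [
    (("US",), "sip:psap@default.example"),
    (("US", "NJ"), "sip:psap@nj.example"),
    (("US", "NJ", "Hoboken"), "sip:psap@hoboken.example"),
]
psap = find_psap(("US", "NJ", "Hoboken", "123 Main St"), mapping_db)
```

The enterprise-side database of IP address to physical location that the article describes would supply the `civic_location` tuple for each phone.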
There's Nothing Small About Smalltalk
eWeek (03/06/06) Vol. 23, No. 10, P. D5; Coffee, Peter
While the Smalltalk programming language was ahead of its time when it
appeared 10 years before Java and required high processor speeds and
considerable memory budgets, the language continues to endure in Java's
shadow. IBM is abandoning Smalltalk in April, though Cincom Systems has
continued to refine the language's VisualWorks development tools after ANSI
standardized Smalltalk as a language in May 1998. Commercial and
non-commercial versions of VisualWorks 7.4 became available early this
year, offering developers a laundry list of utilities and components.
VisualWorks runs capably on both Windows XP and Apple OS X, providing an
exceptionally open run-time environment in its browser. Smalltalk has
incorporated namespaces and other features to make the programming
environment more friendly to developers. "Our virtual machine isn't as
widespread as the Java virtual machine, so we don't feel as much of a need
to lock it down," said Cincom's James Robertson, noting the ability of
VisualWorks to operate in a Windows, Macintosh, or Unix environment.
Robertson also says that Smalltalk is still distinguished by its totally
object-oriented nature that sometimes poses a challenge for Java and C++ to
simulate.
The Wi-Fi Revolution
Red Herring (03/13/06) Vol. 3, No. 9, P. 32
Communications services are transforming into software applications that
have no special association with location, platform, or device, and phone
companies' control of communications pipelines is dwindling as a result.
FON, a "bottom-up" global mesh of Wi-Fi hot spots, stands to play a big
role in the provision of total wireless coverage by enabling its members to
access other members' Wi-Fi networks for free in return for allowing other
people to use their Internet connections. Through Wi-Fi mesh networking,
Wi-Fi providers can broaden their reach and offer broadband service in a
wide area, thus rivaling cellular technology in performance and price.
Reaching the widest swath of consumers requires wireless broadband access,
and FON could help address the problem of implementing enough access points
for seamless access by conscripting home-based wireless routers. FON
has signed up more than 20,000 users since its launch four months ago, and
expects to have 1 million hot spots by the end of the decade. FON believes
unified global Wi-Fi could proliferate faster by enabling its community
members to download its software, instead of by setting up Wi-Fi boxes at
individual hot spots. The Cloud CEO George Polk wagers that FON will have
to deal with the challenge of managing a network with equipment from
multiple vendors, while Ozone's Rafi Haladjian says that "FON is not going
to be an overnight sensation, but it will be part of a really, really
powerful trend that is happening, which is taking existing infrastructure
and pulling it together into new types of networks."
A Research Library Based on the Historical Collections of
the Internet Archive
D-Lib Magazine (02/06) Vol. 12, No. 2, Arms, William Y.; Aya, Selcuk;
Dmitriev, Pavel
A Web library is being constructed at Cornell University for a time when a
significant amount of information analysis and synthesis is computerized,
and the effort seeks to organize the Internet Archive's approximately 10
billion Web pages and supply intuitive access tools for relatively
inexperienced researchers. Social scientists and computer scientists drove
the Cornell initiative, which is funded by a National Science Foundation
grant, and the needs of these two research groups were determined through
interviews. These sessions informed the division of the library project's
software tools into two categories: Tools for building indexes and
extracting data subsets that focus on the collection as a whole, and
analysis tools that work on comparably small subsets. The Web library's
draft usage principles dictate that the content of the Web pages and the
identifying metadata must remain unchanged; the act of posting information
on the Web includes an implied license to use it for archiving, academic
research, and other limited purposes; the library's creation raises no
privacy issues per se, but research involving data mining that identifies
individuals must take privacy into account; and all users must be
authenticated and limited to non-commercial, academic research. The
architecture of the researcher usage methodology is based on the
anticipation of basic access services and subset extraction services, which
assume that the user employs a computer program to interact with the
collections. Key components of the Web library include a petabyte data
store with 2 GB multi-processor computers running Microsoft Windows Server
2003, and a 100 Mbps connection between the Internet Archive and Internet2.
The library's data storage element consists of a relational database that
stores metadata about every Web page, a Page Store that supplies a single
copy of every unique page, and a robotic tape archive.
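The Page Store's "single copy of every unique page" suggests content-addressed deduplication, which can be sketched as follows; the hash choice and schema are assumptions, not Cornell's design:

```python
import hashlib

class PageStore:
    """Deduplicating page store: metadata rows for every crawl
    snapshot point at a single stored copy keyed by content hash."""

    def __init__(self):
        self.pages = {}     # content hash -> page bytes (unique copies)
        self.metadata = []  # (url, crawl_date, content hash) per snapshot

    def add(self, url, crawl_date, content):
        h = hashlib.sha256(content).hexdigest()
        self.pages.setdefault(h, content)  # store each unique page once
        self.metadata.append((url, crawl_date, h))
        return h

store = PageStore()
h1 = store.add("http://example.org/", "2004-01-01", b"<html>hi</html>")
h2 = store.add("http://example.org/", "2004-06-01", b"<html>hi</html>")
```

Since most pages change rarely between crawls, keeping metadata per snapshot but bytes per unique page is what makes a 10-billion-page archive tractable.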
Middleware Challenges and Approaches for Wireless Sensor
Networks
IEEE Distributed Systems Online (03/06) Vol. 7, No. 3, Hadim, Salem;
Mohamed, Nader
The design and deployment challenges of wireless sensor network (WSN)
technologies could be met in a unique way through the use of a middleware
layer, write Salem Hadim and Nader Mohamed at the Stevens Institute of
Technology. Middleware efforts should provide mechanisms for limited power
and resource management; scalability, mobility, and dynamic network
topology; dynamic network configuration; heterogeneity; real-world
integration; in-network aggregation of data from diverse sources;
maintenance and adjustment of quality of service (QoS); and the
incorporation of security into the initial software design phases,
facilitating requirements such as confidentiality, authentication,
integrity, freshness, and availability. Issues raised by programming
sensor networks usually fall into the classes of programming support and
programming abstractions. The former class consists of the virtual
machine-based, modular programming-based, database-based, application-driven, and
message-oriented middleware subclasses. The latter class boasts the two
main subclasses of global behavior and local nodal behavior. Hadim and
Mohamed define a framework for assessing middleware according to
heterogeneity, scalability, power awareness, mobility, ease of use, and
openness. Ease of use is defined by the middleware's abstraction level,
while openness refers to how easy it is to extend and tweak the system as
functional needs shift. Fully meeting WSN challenges with middleware remains
an open research question, with a focus on such goals as fulfilling QoS needs
while supplying a high-level abstraction that tackles sensor node
heterogeneity, and addressing various sensor network application challenges
while developing a simple and expressive programming interface.
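Of the mechanisms listed above, in-network aggregation is the simplest to illustrate: each node in a routing tree forwards one (sum, count) pair instead of relaying every raw sample. The node IDs and temperature readings below are invented:

```python
def aggregate(node, readings, children):
    """Combine a node's own reading with its children's partial
    aggregates, forwarding one (sum, count) pair instead of every
    raw sample."""
    total, count = readings[node], 1
    for child in children.get(node, []):
        s, c = aggregate(child, readings, children)
        total += s
        count += c
    return total, count

# Invented routing tree: sink 0 with children 1 and 2; node 1 relays 3 and 4.
children = {0: [1, 2], 1: [3, 4]}
readings = {0: 20.0, 1: 22.0, 2: 21.0, 3: 19.0, 4: 23.0}
total, count = aggregate(0, readings, children)
mean_temp = total / count
```

Since radio transmission dominates a sensor node's energy budget, collapsing many samples into one message per hop is a direct win for the limited-power requirement.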