Association for Computing Machinery Elects Officers for
2006-2008 Term
AScribe Newswire (05/31/06)
ACM has announced the election of Stuart Feldman for a two-year term as
president beginning July 1. Feldman has promised to broaden ACM's global
influence, particularly in countries with rapidly growing technology
industries. Feldman, the winner of the 2003 ACM Software System Award,
also wants the association to exert greater influence on policy makers
under his stewardship. Feldman currently serves as ACM's vice president,
and has helped launch numerous programs offering technical and career
assistance to computing professionals, as well as co-founding ACM Queue,
ACM's publication targeting industry leaders. Wendy Hall was elected to a
two-year term as vice president, and Alan Chesnais was elected to the
position of secretary-treasurer. Hall, the chair of the Women's Forum of
the British Computer Society, and a former society president, is also
likely to bring a greater international focus to the association. Among
Hall's research interests are advanced knowledge technologies, digital
libraries, and the Semantic Web. Hall chaired WWW2006, a conference
co-sponsored by ACM, and she is an active member of the Special Interest
Group on Hypertext, Hypermedia, and Web (SIGWEB) and the Special Interest
Group on Multimedia (SIGMM). Chesnais, who served as ACM SIGGRAPH
president from July 2002 to June 2005, is also committed to improving ACM's
international profile. Elected to four-year terms as Members-at-Large were
Bruce Maggs, a computer science professor at Carnegie
Mellon University, Google's Kevin Scott, and Jeannette Wing, head of the
computer science department at Carnegie Mellon.
Half of Tech Pros Planning to Look for New Jobs, Survey
Says
InformationWeek (05/31/06) McGee, Marianne Kolbasuk
The employment outlook for IT has improved, prompting more industry
professionals to consider looking for new tech jobs. According to a new
survey from staffing firm Spherion, 48 percent of IT professionals plan to
look for a new IT job within the next year, an increase of 9 percent from
the fourth quarter of 2005. The Harris Interactive survey also reveals
that only 36 percent of the overall workforce in the country plans to look
for new jobs. "Tech workers tend to make job changes more frequently than
other workers," says Spherion's Brendan Courtney. "They tend to be more
mercenary, looking for more money, better work-life balance, working with
new technologies." Tech workers were largely confident about their job
security, with 70 percent saying they did not believe their jobs would be
eliminated within the next 12 months. During the 12 months ended March
31, tech industry employment in the United States reached a record 3.472
million workers, compared with the previous high of 3.455 million workers
in the third quarter of 2001 at the height of the dot-com hiring boom. The
survey is in line with an InformationWeek Research report conducted this
spring that found 41 percent of IT staff were "somewhat" or "actively"
looking for a new job.
Internet Firms Told to Keep Records on Customers
Longer
Washington Post (06/02/06) P. D5; Sherman, Mark
Internet companies have been instructed by leading law enforcement
officials to hold onto customer records for a longer period of time in
order to help in investigations of terrorism and child pornography, and a
meeting between industry representatives and Justice Department officials
to discuss the issue is scheduled for today. Privacy concerns were raised
by ISP executives a week earlier at a conference with FBI director Robert
Mueller III and Attorney General Alberto Gonzales, where the issue of
longer record retention was first brought up, according to Assistant
Attorney General Rachel Brand on Thursday. Gonzales has said that some
child pornography investigations have been hampered because Internet firms
do not keep records long enough. Brand said Gonzales has not yet decided
how to move forward and that the Justice Department would take privacy
into consideration. She insisted that whatever proposal is presented would not
mandate the preservation of customers' communications content. The
information would be held by the companies, and could be acquired by the
government through legal channels. No sweeping requirements exist for
preservation of data, though federal authorities can request the
maintenance of records for as long as half a year if there is suspicion of
criminal activity. Google made an official statement that "Any proposals
related to data require careful review and must balance the legitimate
interests of individual users, law enforcement agencies and Internet
companies."
Digital Dialogue: And Now, the Answers
International Herald Tribune (05/31/06) Shannon, Victoria
Internet pioneer Tim Berners-Lee answered a series of online questions
concerning the neutrality, accessibility, and future of the Internet during
the International World Wide Web Conference last week. Berners-Lee
addressed one query by stating that the Net is not free in terms of
"zero-cost," but is free in terms of communication. He added that this
openness is under threat in the United States by telecom companies seeking
to establish a fast lane for video and other kinds of bandwidth-heavy
content via high-speed Internet connections so that such content can be
prioritized for a fee. Berners-Lee does not think the death of the
Internet's neutrality--and thus its diversity--is unavoidable, and is
confident that "the public will push for the real Internet." Several
questions focused on the development of the Semantic Web, and Berners-Lee
replied that Semantic Web technology is starting to permeate the
mainstream, although its progress varies by field: He noted that the life
sciences field is a particularly fervent center of development, while both
large and small companies are rolling out products based on the Semantic
Web. A related question prompted Berners-Lee to postulate that the
proliferation of the Semantic Web is being hindered by the "network
effect," in which its chief value--being able to link data to all kinds of
other data--is largely unknown by the community at large because of its
sparse presence. One person asked what the World Wide Web Consortium (W3C)
can do to ensure that international organizations and governments comply
with W3C guidelines for providing Web accessibility for the disabled, and
Berners-Lee replied that the consortium can educate and coordinate
discussions of the issue, but cannot enforce the guidelines. He responded
to a question about the Net's status in the next one to two decades by
offering an idealized vision of a system characterized by universality,
collaborative spaces, access to all expected data on the Semantic Web, and
a balance between privacy expectations and the transparency needed to
enforce them.
Q&A With IBM's Blue Gene/L Chief Architect
HPC Wire (06/02/06) Vol. 15, No. 22
In a recent interview, IBM Fellow Alan Gara, the chief designer of the
Blue Gene supercomputer, shared his thoughts about the Blue Gene
architecture and the general nature of supercomputing. Blue Gene addressed
the issue of power efficiency, which Gara believes is the fundamental
constraint holding back the improvement of supercomputing performance.
Blue Gene's combination of a smaller size, an inexpensive design, and low
power consumption has opened a whole new set of applications for
supercomputing. Gara describes the move toward massive parallelism in
computing architecture as a paradigm shift, leading to systems that process
computations of unprecedented complexity while only consuming a fraction of
the energy required by the fastest systems that exist today. One of the
most significant challenges Gara faced in developing Blue Gene was
designing software that could be scaled to more than 100,000 processors.
Gara says that his team followed the principles of simplicity, performance,
and familiarity as they developed the Blue Gene software. The developers
imposed a simplifying rule that ensures that a Blue Gene partition can only
run one parallel job at a time. IBM is still committed to its goal of
creating a petaflop computer, Gara says, noting that Blue Gene was the
first step in that direction. Gara predicts a flurry of advances in
supercomputing over the next 10 years, particularly as silicon ceases to
offer improvements in energy consumption.
Microsoft in India
Technology Review (06/01/06) Roush, Wade
Microsoft recently followed IBM, Hewlett-Packard, and other major computer
companies in opening a research facility in India, though the country could
be especially important for Microsoft as it tries to claim a greater share
of India's rapidly growing desktop market for its Windows or Office
software. In a recent interview, Microsoft Research India's Kentaro Toyama
discussed the new center's operations and objectives. Toyama describes
India as a unique research environment, home to a booming technology
economy while much of its population still lives in poverty. India is a
natural environment for Microsoft's research about the role of computing in
poor communities, Toyama says. The center is currently focusing its
research on six areas: photography, digital geographies, multilingual
systems such as speech recognition and natural language processing,
communications hardware, software engineering, and emerging markets. With
22 officially recognized languages, India is also an ideal setting for
research into multilingual computer applications. In the area of software
engineering, Microsoft Research India is examining the issues that arise
when trying to create software while collaborating with teams in other
parts of the world. When Microsoft researchers found that many rural
schools do not have enough PCs for each child to use, they set about
developing technology to enable multiple users to interact with the
existing PC, such as an application that accommodates as many mice as
there are USB ports on the computer. Engaging multiple children in the
operation of a PC helps to do away with the common scenario of a group of
children clustered around a computer watching a dominant child, usually a
boy, monopolize the mouse and keyboard.
Building New Cultural Knowledge Services With
BRICKS
IST Results (06/02/06)
The IST-funded BRICKS project is developing the infrastructure for
cultural institutions to preserve and share their digital content.
Libraries, archives, museums, and other institutions will be able to use
the open-source software at no cost. A recent conference highlighted the
program's growing popularity. "Around 30 organizations attended the
conference and all of them wanted to join the BRICKS community," said
Silvia Boi, communications director of the program. "Many of them believe
we can help solve some of their problems by increasing their visibility and
helping them to do more with the cultural resources they have." The
visibility of digital content can be limited because digital resources are
often stored in centralized libraries, and many institutions have not made
their content available on the open Web due to copyright issues. To
address that issue, the BRICKS software provides an interoperable,
vendor-neutral system for dealing with intellectual property issues.
BRICKS will be especially beneficial for smaller institutions that often
lack the resources to maintain their own systems. Each institution will be
a node on an infinitely scalable network modeled after peer-to-peer
architecture. Institutions that have their own systems for storing and
distributing content can still deploy the BRICKS system because of the
modular nature of the software. The project has developed four vertical
applications based on the decentralized network structure, including an
application that compiles and distributes information about different
archaeological sites that enables them to be recreated online in multimedia
formats. A Living Memory function allows users to annotate content with
their memories or feelings, creating "new cultural knowledge" while
providing access to existing content. Another tool allows for the
retrieval and comparison of different historical texts. The other
application is designed to help smaller institutions manage their
collections more effectively.
Experts to Discuss Future of Robotics
Indiana Daily Student (06/01/06) Cunningham, Matt
Biologically inspired robots were the focus of much of the initial
discussion at the International Conference on Development and Learning
(ICDL), which got underway Wednesday at Indiana University. John Lipinski,
a researcher at the University of Iowa, says combining linguistics and
non-linguistic spatial learning has the potential to be very helpful in
autonomous robots. In particular, Lipinski says his research could impact
the vision systems of robots, and may be used to teach direction-oriented
language to a robot so that it can orient itself in its environment.
Co-sponsored by the IEEE Computational Intelligence Society, ICDL has drawn
representatives from organizations such as the Los Alamos National
Laboratory, the MIT Media Lab, and Microsoft Research to IU through June 3
to discuss the direction of artificial intelligence. IU psychology
chairwoman Linda Smith notes that AI has drawn upon research from a number
of disciplines, from robotics and neuroscience to computer science and
engineering, adding that the areas have become more closely related. At
IU, 69 graduate students in psychology are pursuing joint PhDs in cognitive
science. "Although some AI systems are really good at some stuff like
[older artificial intelligence technology] can play chess, to really move
to the next generation of robots the idea is to use what we know," says
Smith, who is involved in the Biologically Inspired Cognitive Architectures
initiative of the Defense Advanced Research Projects Agency (DARPA).
"Understanding the role of development will play a role in the next
generation of robots."
Military Getting High-Tech Help From SRI Lab
San Francisco Chronicle (05/29/06) P. E1; Abate, Tom
Researchers at SRI International have developed a two-way language
translation computer for the Defense Department that could help U.S.
soldiers communicate with native Iraqis. The department has already
shipped 32 of the IraqComm systems to U.S. military personnel to test in
the field. IraqComm has a vocabulary of 40,000 English words and 50,000
Iraqi Arabic words, and is not designed for complex or wide-ranging
conversations. SRI demonstrated the system at its headquarters where
computer scientist Harry Bratt played the role of U.S. soldier questioning
Saad Alabbodi, an Iraqi immigrant posing as a civilian in his native
country. IraqComm does not provide perfect translations, but its audio
playback is close enough to convey the basic meaning of a conversation, at
least in situations without a lot of background noise and where both
parties speak in short sentences. The Defense Department and other
agencies have ramped up funding for machine translation programs since the
Sept. 11 attacks, with particular emphasis on languages such as Arabic,
Pashto, and Dari. The system is housed in a durable laptop complete with a
microphone and a host of SRI software. The project comes after another
Defense Department-funded initiative called the Phraselator, a rugged
handheld designed to recognize 800 to 1,000 English phrases. While
IraqComm represents a big step forward, human interpreters are still vastly
superior given their ability to determine from context which of two or more
possible meanings of a word applies, such as whether the word "trunk"
refers to an elephant or a car. SRI's Doug Bercow believes that it could
be five to 10 years before two-way machine translation can be put to
practical use, even in confined settings under ideal conditions.
A Gem of a Language for Java and .Net
DevX News (05/26/06) Patrizio, Andy
Since the release of Ruby on Rails last year, the development community
has given the Ruby programming language a fresh look. Originally developed
in 1993 by Japanese programmer Yukihiro Matsumoto, Ruby is a dynamic
language that is five times more powerful than the statically typed C++,
according to the Burton Group's
Richard Monson-Haefel. In essence, what Ruby can accomplish in one line of
code would require five lines of code in C++. Two new projects are
attempting to broaden the base of Ruby by bringing the language to the .Net
and Java environments. "You get the productivity of a dynamic language and
the extensibility of these ecosystems," Monson-Haefel said. "Dynamic
languages don't have a lot of libraries. They often have trouble taking
off because they don't have the library support you see with Java and
.Net." Sun has unofficially endorsed the JRuby project to bring Ruby to
Java, but Microsoft has shown no signs of support for the Ruby-on-.Net
project IronRuby. IronRuby is the work of Wilco Bauwer, a college student
in the Netherlands and a former Microsoft intern. His project rewrites
Ruby in Microsoft's C#, compiling the code to MSIL. "I figured that if
people love Ruby so much, why not let people write Ruby for .Net as well?"
Bauwer said in an email interview. At the recent JavaOne conference, Sun
expressed its interest in making the Java platform multilingual, according
to Charles Nutter, one of the Java consultants who launched the JRuby
project. Nutter and his partner Thomas Enebo presented their research at
JavaOne with Sun's approval. The maturity of Java enables JRuby to connect
to any database through the standard JDBC interface, while Ruby on Rails
typically needs a separate driver for each database it connects to.
The Future of the IT Organisation
ComputerWeekly.com (05/30/06) Bradbury, Danny
ICANN's control over the Domain Name System and the U.S. government's sway
over Internet governance have many calling for international oversight.
ICANN has recently come under fire for its deal with VeriSign, which,
despite opposition from 40 percent of ICANN's members, gave VeriSign
control of the .com domain essentially in perpetuity, along with the
ability to raise registration charges, in exchange for an end to
litigation over ICANN's forced closure of VeriSign's Sitefinder service.
Critics are questioning why two or more companies can't be charged with
overseeing .com. "The
problem is that you cannot go faster than the speed of light," says ICANN
CEO Paul Twomey. "If you set up two registries to operate .com, and you
registered a domain in one and I tried to register one in the other, it is
feasible that you could end up with a domain registered to two people." So
what about a two-phase commit database? "Who pays for it?" asks Twomey.
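The two-phase commit database Twomey dismisses would work roughly as follows: both registries must first vote to lock a name, and only if every vote is "yes" does either commit it. The sketch below is purely illustrative (the class and domain names are invented), and real registries do not operate this way; it shows why the coordination round trips, not the speed of light per se, are the cost Twomey is pointing at.

```python
# Minimal two-phase commit sketch: two competing ".com" registries
# would have to agree before either could commit a new domain name.
# Illustrative only; not how DNS registries actually work.

class Registry:
    def __init__(self, name):
        self.name = name
        self.committed = set()   # domains fully registered
        self.prepared = set()    # domains tentatively locked

    def prepare(self, domain):
        """Phase 1: vote yes only if the name is free and unlocked."""
        if domain in self.committed or domain in self.prepared:
            return False
        self.prepared.add(domain)
        return True

    def commit(self, domain):
        """Phase 2: make the tentative registration permanent."""
        self.prepared.discard(domain)
        self.committed.add(domain)

    def abort(self, domain):
        self.prepared.discard(domain)

def register(domain, registries):
    """Coordinator: every registry must vote yes, or nobody commits."""
    voted_yes = []
    for r in registries:
        if r.prepare(domain):
            voted_yes.append(r)
        else:
            for v in voted_yes:   # any "no" vote aborts the whole round
                v.abort(domain)
            return False
    for r in registries:
        r.commit(domain)
    return True

a, b = Registry("registry-a"), Registry("registry-b")
print(register("example.com", [a, b]))  # True: both agreed
print(register("example.com", [a, b]))  # False: already taken everywhere
```

Every registration now costs a full round of network traffic to every registry before anyone can answer, which is Twomey's "who pays for it?" objection in miniature.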
Meanwhile, some are taking the initiative to create their own domain naming
system. New.net used a browser plug-in to translate non-standard TLDs such
as .books into a domain-like .books.new.net, much to ICANN's disapproval.
China has also used a plug-in to convert what looks like .com in Chinese
into .com.cn. Another issue gaining attention in the world of the Web is
"Net neutrality," though this transcends borders. Network builders want
content providers to subsidize the cost of the cable and fiber systems they
are laying down instead of having to pass on all of the costs to consumers.
"Customers should not be the only ones to pay for this new world," says
Deutsche Telekom CEO Kai-Uwe Ricke. "Web companies that use this
infrastructure for their business should also make a contribution...If
customers are not willing to pay and Google and [others] are not willing to
pay, there will not be any high-speed data highways."
Robot Hand Controlled by Thought Alone
New Scientist Tech (05/26/06) Knight, Will
Researchers in Japan have used a functional magnetic resonance imaging
(fMRI) technique to control a robotic hand through the power of thought.
Yukiyasu Kamitani and colleagues at the ATR Computational Neuroscience
Laboratories in Kyoto were assisted by researchers at the Honda Research
Institute in Saitama in developing the fMRI scanning technology. In a
demonstration, the researchers had a subject lie inside an MRI scanner,
then make "rock, paper, scissors" shapes with a hand. The MRI scanner
recorded brain activity as the subject made the movements with her hand,
and delivered the data to a connected computer. A brief training period
ensued before the computer made the connection between brain activity and
the corresponding shape, and then commanded the robotic hand to mimic the
rock, paper, and scissors hand movements. Real-time fMRI analysis of brain
activity is considered a breakthrough in research into prosthetics and the
operation of computers using the power of thought. Although Klaus-Robert
Mueller, a researcher at the Fraunhofer Institute in Germany, has some
concerns about the cost and complexity of the system, he says it produces
higher-resolution readings. "We will need several breakthroughs in related
technologies, including those for brain scanning hardware, before this type
of non-invasive systems will be used in daily life," says Kamitani.
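The core of such a system is a classifier trained to map a pattern of brain activity to a hand shape. A minimal sketch of that idea, assuming nothing about the ATR team's actual method, is a nearest-centroid classifier; the feature vectors below are invented, whereas real fMRI decoding works on thousands of voxel activations with far more sophisticated models.

```python
# Toy nearest-centroid classifier: learn one "average activity"
# vector per hand shape, then label a new scan by whichever
# centroid it lands closest to. All data here is made up.

import math

def centroid(vectors):
    return [sum(xs) / len(xs) for xs in zip(*vectors)]

def train(labeled):
    """labeled: dict mapping shape name -> list of activity vectors."""
    return {shape: centroid(vs) for shape, vs in labeled.items()}

def classify(model, vector):
    """Return the shape whose centroid is nearest to the new scan."""
    return min(model, key=lambda s: math.dist(vector, model[s]))

training = {
    "rock":     [[0.9, 0.1, 0.2], [1.0, 0.2, 0.1]],
    "paper":    [[0.1, 0.9, 0.3], [0.2, 1.0, 0.2]],
    "scissors": [[0.2, 0.3, 0.9], [0.1, 0.2, 1.0]],
}
model = train(training)
print(classify(model, [0.95, 0.15, 0.1]))  # rock
```

The brief training period the article mentions corresponds to the `train` step: the system needs labeled examples of brain activity before it can command the robot hand.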
Mars Robots to Get Smart Upgrade
BBC News (05/28/06) Amos, Jonathan
NASA plans to provide its Mars rovers with new software in the next month
that will enable the robots to sift through images of clouds and dust
devils, and decide which pictures to send back to Earth. For the space
agency's researchers, searching through the images for the most significant
data is a task that requires an inordinate amount of time. "The idea now
is to collect as much data as the instrument can, analyze them onboard for
features of specific interest, and then down-link only the data that have
the highest priority," says Rebecca Castano, who works at NASA's Jet
Propulsion Laboratory (JPL). Later in the year, the Mars Odyssey orbiter,
which has been mapping the planet for the past five years, will receive new
autonomous flight software. The success of the implementation of
Autonomous Sciencecraft Experiment software on NASA's Earth Observing-1
satellite has prompted the agency to seriously consider autonomous
operation to be the future of its robotic craft. Though it would typically
take several weeks to discover that a remote volcano was active, in 2004
the autonomous software reprogrammed the EO-1 camera to take more pictures
of Mount Erebus as soon as it detected heat from the lava lake at the
mountain's summit. "This has helped us reduce the operations cost of this mission from
$3.6 million to $1.6 million a year--over half that reduction was directly
attributed to the onboard automation that we're talking about," says Steve
Chien, principal investigator for autonomous sciencecraft at JPL.
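Castano's "down-link only the data that have the highest priority" idea can be sketched as a simple selection problem: score each captured image for features of interest, then transmit the highest-scoring images that fit in the communication budget. The function and field names below are invented for illustration; the real onboard software scores images by detecting clouds, dust devils, and other phenomena.

```python
# Hedged sketch of onboard data prioritization: rank captured images
# by an interest score and down-link only those that fit in the
# transmission budget. Scores and sizes here are invented.

def downlink_queue(images, budget_bytes):
    """Pick the highest-priority images whose total size fits the budget."""
    ranked = sorted(images, key=lambda im: im["priority"], reverse=True)
    chosen, used = [], 0
    for im in ranked:
        if used + im["size"] <= budget_bytes:
            chosen.append(im["name"])
            used += im["size"]
    return chosen

captured = [
    {"name": "plain_sky.img",   "priority": 0.1, "size": 400},
    {"name": "dust_devil.img",  "priority": 0.9, "size": 500},
    {"name": "cloud_front.img", "priority": 0.6, "size": 450},
]
print(downlink_queue(captured, budget_bytes=1000))
# ['dust_devil.img', 'cloud_front.img']
```

The uninteresting sky image never leaves the rover, which is exactly the time-saving the JPL researchers are after.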
Data Mining: The New Weapon in the War on
Terrorism?
Federal Computer Week (05/29/06) Vol. 20, No. 17, P. 38; Sternstein, Aliya
The data-mining technology needed to support a massive government
initiative to ferret out terrorists through analysis of phone records will
be costly and computationally intensive, and could compromise the privacy
of ordinary U.S. citizens. While it is uncertain if the government is
actually using data-mining techniques to sift through the tens of millions
of records it has collected from Verizon, BellSouth, and AT&T, it would
need supercomputers comparable to IBM's Blue Gene to derive meaningful
information from a dataset so large, says Nathan Hoskin of Planning
Systems. Hoskin estimates that such a system would cost between $20
million and $50 million. To effectively mine the data, the system would
use clustering algorithms to focus on relationships between similar data,
link analysis to find connections between disparate data, and rule mining
to find patterns within the data. Privacy advocates warn that giving the
government unfettered access to citizens' phone records, even in the name
of fighting terrorism, could lead to a host of civil rights violations
without ever producing a lead. Critics have compared the possible
data-mining initiative to the aborted Total Information Awareness program
envisioned by the Defense Department to preemptively combat terrorist
attacks by analyzing patterns within a huge repository of electronic data.
Data-mining experts say that even if the phone companies are not turning
over customers' personal identifying information such as names and street
addresses, the government could easily retrieve that information from other
databases and services. While data mining does not go as far as
wiretapping, privacy advocates warn that the threat is very real.
"Listening to the content of calls is more intrusive, but nobody should
underestimate the privacy invasion that's involved in tracing who's talking
to whom," said the ACLU's Jay Stanley, adding that mining records of phone
calls for terrorists is inefficient and tantamount to labeling the entire
U.S. population as suspects.
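Of the three techniques named above, link analysis is the easiest to make concrete: treat each call record as an edge in a graph and ask which numbers are reachable from a number of interest within a few hops. The sketch below uses toy data and invented numbers; analysis at the scale the article describes would require distributed supercomputing, not an in-memory dictionary.

```python
# Hedged sketch of link analysis over call records: build an
# undirected who-called-whom graph and find every number within
# two hops of a number of interest. Toy data only.

from collections import defaultdict

def build_graph(call_records):
    graph = defaultdict(set)
    for caller, callee in call_records:
        graph[caller].add(callee)
        graph[callee].add(caller)   # treat each call as an undirected link
    return graph

def within_hops(graph, start, max_hops):
    """Breadth-first search out to max_hops from a number of interest."""
    frontier, seen = {start}, {start}
    for _ in range(max_hops):
        frontier = {n for f in frontier for n in graph[f]} - seen
        seen |= frontier
    return seen - {start}

calls = [("555-0001", "555-0002"), ("555-0002", "555-0003"),
         ("555-0004", "555-0005")]
g = build_graph(calls)
print(sorted(within_hops(g, "555-0001", 2)))
# ['555-0002', '555-0003']
```

The privacy objection quoted above is visible even in this toy: the graph records who talked to whom for every subscriber, not just for suspects.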
U.S. Holds Own vs. China, India Engineer Grads
EE Times (05/29/06) No. 1425, P. 1; Riley, Sheila
The Duke University researchers who debunked the often repeated fear that
the United States is falling behind China and India in the number of
engineering graduates it produces each year testified before the House
Committee on Education and the Workforce last month. "It's contrary to
what everyone else is saying," said Vivek Wadhwa, an adjunct professor at
Duke. The study found that China and India are not producing nearly as
many graduates as is widely believed, and that U.S. engineers are most
likely better trained, because China and India both include graduates of
two- and three-year schools in their totals, and China has a looser
definition of what constitutes an engineer. When a visiting Chinese
scholar told the Duke researchers that the numbers they used in their
original December report were inaccurate, they began contacting China's
engineering schools directly and found that the reporting varied from
school to school, rather than from province to province, as they had
earlier assumed. Ultimately, the researchers were unable to make a
meaningful comparison between the United States and China, though the 2005
data showed that 77 Chinese universities demonstrated "significant
increases" in the number of engineering graduates over the previous year. "The
ambiguity of numbers is a problem with Chinese statistics and business in
general, because of the really top-down nature of the Chinese government,"
said George Haley, a professor of industrial marketing at the University of
New Haven. Haley said the government issues a quota of engineers which the
universities find a way to meet, and often exceed, though that does not
translate into qualified engineers by U.S. standards. "My conclusion is
that China is graduating more engineers than the U.S. in raw numbers, and
that those numbers are very high," said Wadhwa. "However, their focus is
on quantity, not quality."
Code Warriors Battle On
Washington Technology (05/29/06) Vol. 21, No. 10, P. 20; Beizer, Doug
In an effort to update the methods of encryption used by the intelligence
community, the NSA and the Defense Department have implemented an ongoing
program called the Cryptographic Modernization Initiative. "In the
encryption world, probably on a timeframe of every seven to 10 years,
there's a need for new encryption algorithms," says SafeNet Chairman
Anthony Caputo. "Because every year, the enemy or hackers' tools are
getting better, periodically you have to increase the strength of the
encryption algorithms. That's what the Cryptographic Modernization
[Initiative] does." A major change in the encryption world came when the
National Institute of Standards and Technology adopted the Advanced
Encryption Standard (AES) in 2001, according to Alan Sherman, associate
professor of computer science at the University of Maryland. The old Data
Encryption Standard used 56-bit keys, while AES uses a fixed 128-bit block
size with key sizes of 128, 192, or 256 bits. SafeNet has received
approval from the NSA to develop a classified version of its 10-Gigabit
SafeEnterprise Sonet Encryptor for use in federal intelligence agencies and
defense and civilian groups. SafeNet's system consists of small,
special-purpose computers that encrypt and decrypt traffic at the endpoints
of communication nodes. Cryptography experts claim that while
software-based encryption is sufficient for most IP traffic, only hardware
encryption protects both the algorithm and the encryption key. "Our
devices in the field today have encryption algorithms much stronger than
commercial encryption algorithms, but you still need to periodically
strengthen algorithms to make sure the communications links continue to
have good security," Caputo said. In addition to intelligence, government
agencies use encryption to protect information such as health and tax
records, and Sherman notes the potential applications in securing e-voting
systems.
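The key sizes quoted above translate directly into brute-force work: every extra key bit doubles the search space, which is why moving from 56-bit to 128-bit keys was such a leap. A quick worked comparison:

```python
# Worked arithmetic on the key spaces mentioned above: the old
# 56-bit standard versus the 128-, 192-, and 256-bit AES keys.
# Each additional bit doubles the brute-force search space.

for bits in (56, 128, 192, 256):
    print(f"{bits:3d}-bit key space: {2 ** bits:.3e} keys")

# How much larger AES-128's space is than the old 56-bit space:
print(f"AES-128 is 2**{128 - 56} = {2 ** 72:.3e} times larger")
```

Even so, as Caputo notes, algorithms must still be strengthened periodically, because attacks improve in ways that raw key-counting does not capture.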
Predict the Future--Or Try, Anyway
InformationWeek (05/29/06) No. 1091, P. 38; Whiting, Rick
Business intelligence is migrating to the predictive analytics model, in
which historical data is run through mathematical algorithms to outline
trends and patterns to make educated guesses about future outcomes.
Predictive analytics is being embedded by vendors into common business
decision-making processes, and IDC is projecting an 8 percent annual
increase in predictive analytics software sales revenues to $3 billion by
2008. Domains where predictive analytics tools are finding use include
marketing, health care, inventory and supply chain management, law
enforcement and crime deterrence, and accident prevention. Companies'
growing interest in these tools is based on the ever-increasing amount of
data they collect via more powerful and affordable computers. Tibco
Software CEO Vivek Ranadive foresees the broad adoption of predictive
analytics software to improve customer retention, boost the efficiency of
the supply chain, and keep inventory up to date. And, although University
of Rhode Island computer science and statistics professor Lutz Hamel
cautions that predictive analytics could never be used to predict the
outcome of the stock market due to the overwhelming number of variables,
the technology is already being profitably employed to predict short-term
trading trends. Perhaps the ultimate application of predictive analytics
will be to mine unstructured Internet content. However, vendors must be
careful not to overhype their products' predictive abilities, since their
accuracy varies with the complexity of the scenario being evaluated, the
variables that come into play, and the volume and quality of the supporting
data. There are also ethical concerns about predictive analytics, such as
whether the technology will facilitate discrimination based on mass
profiling; addressing such concerns will depend on responsible deployment
of the technology.
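The "historical data run through mathematical algorithms" at the heart of predictive analytics can be reduced to its simplest case: fit a trend line to past observations and extrapolate one step ahead. The least-squares sketch below uses invented sales figures; commercial products layer far more sophisticated models, and many more variables, on the same basic idea.

```python
# Minimal predictive-analytics sketch: least-squares trend fit on
# evenly spaced historical observations, extrapolated one period
# ahead. The sales history is invented.

def fit_line(ys):
    """Least-squares slope and intercept for evenly spaced observations."""
    n = len(ys)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

def predict_next(ys):
    slope, intercept = fit_line(ys)
    return slope * len(ys) + intercept

quarterly_sales = [100, 104, 109, 113, 118]   # invented history
print(round(predict_next(quarterly_sales), 1))  # 122.3
```

Hamel's caution applies even here: the fit is only as good as the assumption that the future resembles the past, which holds for short-term trends far better than for stock markets.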
TinyOS: Operating System Design for Wireless Sensor
Networks
Sensors (05/06) Vol. 23, No. 5, P. 14; Culler, David E.
TinyOS, an operating system specifically designed for wireless sensor
networks (WSNs), supports structured event-driven execution and
component-based software that packs a great deal of concurrency into a
small footprint, improves robustness, and maximizes energy efficiency while
enabling the deployment of sophisticated algorithms and protocols. Structured
event-driven execution supports the evolution of hardware through the
replacement of components, while the modular architecture affords
flexibility, stability, and programming simplicity. TinyOS is usually
configured as a quintet of subsystems--sensors/actuators, communications,
storage, timers, and processor/power management--that serve as a platform
for higher-level services. Networking tiers aimed at disparate
applications with different methods for discovery, routing, power
reduction, reliability, and congestion control have been developed by the
TinyOS community, and they supply embedded applications with higher-level
communication services via a TinyOS programming interface. Reliable
dissemination, aggregate data collection, and directed routing constitute
the most broadly used WSN services. The TinyOS community has an
international reach, and its members include academic and industrial
researchers and developers. The maturation of WSN technology carries with
it the ability to observe points and phenomena that previously could not be
observed because of access, mobility, or distance limitations.
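The structured event-driven execution described above can be sketched as a scheduler that runs short tasks to completion and sleeps whenever its queue is empty, which is where the energy savings come from. TinyOS itself is written in nesC with a very different component model; the Python analogy below, with invented class and task names, only illustrates the scheduling idea.

```python
# Hedged sketch of event-driven, run-to-completion scheduling:
# components post short tasks; the scheduler runs one at a time
# and "sleeps" (lets the processor power down) when idle.

from collections import deque

class Scheduler:
    def __init__(self):
        self.tasks = deque()
        self.slept = 0

    def post(self, task):
        self.tasks.append(task)         # components enqueue work

    def run(self, ticks):
        for _ in range(ticks):
            if self.tasks:
                self.tasks.popleft()()  # run one task to completion
            else:
                self.slept += 1         # idle tick: processor can sleep

sched = Scheduler()
readings = []

def sample():                  # a "sensor" component's task
    readings.append(42)
    if len(readings) < 3:
        sched.post(sample)     # re-post itself, like a periodic timer

sched.post(sample)
sched.run(ticks=5)
print(readings, sched.slept)   # [42, 42, 42] 2
```

Because tasks run to completion, there is no preemption machinery to carry in the footprint, mirroring the simplicity principle the article attributes to the design.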
A Broadband Utopia
IEEE Spectrum (05/06) Vol. 43, No. 5, P. 48; Cherry, Steven
In a perfect world, broadband subscribers would receive their phone, TV,
and Internet services from their own pick of providers, and the Utah
Telecommunication Open Infrastructure Agency (Utopia) is realizing this
vision by delivering such services to customers in 14 Utah cities at data
rates that are at least 10 times that of the local incumbent phone
company's DSL service. The Utopia network can deliver services that are
cheaper and richer, as well as faster. Utopia's lower costs will be a
definite plus for businesses, and the ability for multiple service
providers to peacefully coexist on the network will supply them with a vast
reserve of bandwidth, giving consumers the luxury of choice. Utopia, which
is a nonprofit government agency formed as an offshoot of the
municipalities that created it, owns the physical network, while the
for-profit DynamicCity of Utah is contracted to operate and maintain the
network; being a municipal project allows Utopia to be funded by bonds,
whose long maturation period and lower interest compared to commercial
capital financing offer considerable advantages. Utopia repays its
bonds by taking a fee whenever a service provider signs up a homeowner for
one of its services, and Utopia believes it can sustain itself if one-third
of the qualifying homes sign up for at least one service. The legislation
that cleared the way for Utopia was Utah's Municipal Cable Television and
Public Telecommunications Services Act of 2001, which included an exemption
that permitted the construction of municipal networks, provided the
municipality does not sell retail services to homeowners. The Utopia
network is designed so that individual homeowners own the bandwidth
traveling the last few hundred meters to their households, and the network
is dramatically streamlined by its reliance on the Ethernet standard to
relay Internet Protocol data packets to the individual subscriber from the
central office. DynamicCity CTO Jeff Fishburn reports that Utopia's cost
per subscriber will hold steady even as homeowner rates approach 1 Gbps,
and the network would have to undergo few modifications for each subscriber
to have up to 1 Gbps now.