Wiretap Rules Are Same for Web Calls
Washington Post (06/10/06) P. D1; Hart, Kim
Internet-based phone services are legally obligated to allow wiretapping
by law enforcement officials, the U.S. Court of Appeals for the D.C.
Circuit ruled 2-1 on Friday, upholding an FCC ruling that Web-based phone
service providers must follow the same rules as traditional phone
companies. However, the court also ruled that private networks such as
those at universities and peer-to-peer systems such as instant messaging
networks are exempt because they are beyond the law's reach. Making
broadband service wiretap-compatible could make such services more
expensive, and analysts say further regulation of Web-based phone service
is possible, as the FCC may also decide that Internet phone companies must
pay into the universal telephone service fund. Judge David B. Sentelle,
writing for the majority, said the FCC "offered a reasonable
interpretation" of the law, while Judge Harry T. Edwards in dissent wrote
that the law "does not give the FCC unlimited authority to regulate every
telecommunications service that might conceivably be used to assist law
enforcement." The court's decision may still be appealed. University of
Colorado professor Philip J. Weiser says the ruling will force network
providers to reengineer their networks, but those costs probably won't be
passed down to users. He says, "Any provider of broadband networks now
needs to make accounts wire-tappable. That's not the way they're
engineered and it's certainly not the cheapest way."
Security Onus Is on Developers
eWeek (06/12/06) Coffee, Peter
At last month's JavaOne Conference, a panel of experts from industry and
academia convened to discuss the role of application developers in ensuring
software security. Cigital CTO Gary McGraw noted the major difference
between Java and C from a security standpoint: Java cleaned up many of C's
shortcomings, and the type-safe Java environment is less prone to bugs,
leaving developers more cycles to consider security from an architectural
standpoint. Regardless of the quality of the individual
programmer, mistakes are inevitable, and the most important security
considerations revolve around detecting and eliminating the bugs after they
occur, said Bill Pugh, computer science professor at the University of
Maryland. While overall security has improved, software developers are
failing to keep pace with the hackers, and some still incorrectly maintain
that security is primarily an operating system or a networking problem,
according to David Wagner, professor of computer science at the University
of California, Berkeley. When Sun Microsystems co-founder Bill Joy first
saw the Java-predecessor Oak, he recognized it as an opportunity to develop
an environment with a formal semantics where programs are meaningful.
Java, Joy notes, is only one layer in an evolving stack of ever-higher
abstractions required for thorough testing of the high-level properties of
a software application.
Brainstorming Ways to Push Open Source
IST Results (06/09/06)
The IST-funded FLOSSPOLS project, which set out to assess the current
state of the open-source movement, found that interoperability among
different software applications is still lacking. Building on the FLOSS
project, which established the world's largest clearinghouse on open-source
usage and development, FLOSSPOLS aimed to preserve the European Union's
lead in the open-source field. "Our study revealed that preference is
often given in business tenders to certain vendors with mostly proprietary
software at national and international levels," said project coordinator
Rishab Ghosh. "Whether explicit or implicit, this preference is illegal
under EU rules. Hardware preference is already outlawed, yet the use of
specific software can often limit competition even more." Ghosh says that
even in the absence of major policy support, the rate of open-source
adoption in Europe is encouraging, and a program within the European
Commission has arrived at a definition for open standards, though it has
yet to receive formal approval from the commission. Ghosh is encouraged by
the Open Source Observatory, an EC-supported project that serves as a
repository for information on open-source deployments by public
organizations throughout Europe. In its analysis of gender, the project
found that women account for just 2 percent of participants in open-source
development and production, while they make up 20 percent of general
software developers. The project concluded that women face active
discrimination, and that European governments need to do more to encourage
female participation in the open-source community. The project notes that
some companies are more likely to hire developers with open-source skills
than applicants with strong university credentials, suggesting that schools
should do a better job of partnering with the development community.
Idea for Electronic Message Tax Prompts Swift Outcry in
Europe
New York Times (06/12/06) P. C9; Crampton, Thomas
European Parliament member Alain Lamassoure's informal proposal to tax
emails and text messages throughout Europe provoked howls of outrage from
Internet users as well as phone companies and ISPs. "Taxation of emails or
Internet flies in the face of principles the EU has been trying to
support," declared secretary general of the European Internet Service
Providers Association Richard Nash. "This is one of the more bizarre
initiatives, and it is unlikely to increase the popularity of the European
Union if it succeeds." Jupiter Research analyst Thomas Husson estimates
that West Europeans spent $19 billion sending 157 billion phone text
messages last year, while IDC reckons that upwards of 60 billion emails
will be sent daily this year worldwide, compared to 31 billion in 2002.
European text messages cost about 0.10 euro to 0.15 euro each when charged
individually, and Lamassoure said this offers plenty of latitude to reduce
consumer prices and impose a levy of 0.01 euro a message. Lamassoure said
measuring emails for taxation would be more difficult at first, but
explained that while he appreciated Internet users' concern over his
proposal, "it is absurd to say that my ideas will kill the Internet." Some
technology experts and politicians say the issue spotlights disparities
between light and heavy Internet users, who usually pay the same monthly
fees. "The current system of payments for the Internet made sense when it
all started, but the incentives are getting more and more misaligned,"
said technology consultant Esther Dyson, who prefers a system in which
senders would pay on a graded scale to ensure their messages reach their
destination; such a scheme, she believes, would reduce spam by shifting
costs onto the senders of unwanted email.
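As a rough, back-of-the-envelope illustration of the proposal's scale,
using only the estimates cited above and assuming every text message were
taxed:

```python
# Rough yield of the proposed levy, using the figures cited above;
# purely illustrative, and assumes every text message is taxed.
texts_per_year = 157e9     # West European text messages sent last year
levy = 0.01                # Lamassoure's proposed levy, in euros
print(f"{texts_per_year * levy / 1e9:.2f} billion euros per year")
# -> 1.57 billion euros per year
```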
Revamping the Web Browser
Technology Review (06/12/06) Roush, Wade
Even as new online content was proliferating at a staggering pace, the
technology for searching the Web via a browser experienced very few changes
from 1997 to 2004, the same years that Microsoft's Internet Explorer
dominated the browser market. A host of startups has appeared in recent
years offering new software features that challenge much of the
conventional wisdom guiding browser design. Companies are beginning to
develop new browsers more suited for social-networking activities, such as
blogging, RSS feeds, and photo sharing, in an attempt to keep pace with the
growth of Web content. "The Web today is very different from the Web of
the '90s, which was very much a one-to-many experience," said Peter Andrews
of Flock, one of many companies creating entirely new browsers. "Now you
have a growing community of producers building a many-to-many Web--and
browsers should integrate the functionality to support that." One new
application, Browster, offers a free supplement to Firefox and Internet
Explorer that displays a small icon when the user hovers over a hyperlink,
and a preview window then shows the page where the hyperlink leads.
Unlike other preview tools, the Browster window shows the destination page
in full, but disappears when a user clicks outside of it, ultimately
reducing dependency on a browser's "Back" button. When a user moves the
mouse over a list of Google or Yahoo! search results, full-page previews
pop up because the software pre-fetches a page for every result. The hover
feature has been so popular that users have been deploying it on sites
across the Web, even though it is fastest on Google and Yahoo! results
pages, according to CEO Scott Milener. Flock and others are developing
browsers that could be even more versatile than Mozilla's Firefox, with
features such as a built-in feed reader and an integrated search tool for
both the Web and the desktop.
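A minimal sketch of the pre-fetching idea behind such hover previews,
fetching every result page in the background so a preview can render
instantly, might look as follows; the URLs, cache structure, and thread
pool are illustrative assumptions, not Browster's actual implementation:

```python
# Hypothetical sketch: prefetch all destination pages for a result list
# so hover previews can be served from a local cache without delay.
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

def fetch(url: str) -> bytes:
    with urlopen(url, timeout=5) as resp:  # fetch one destination page
        return resp.read()

def prefetch(result_urls: list[str]) -> dict[str, bytes]:
    # Fetch result pages concurrently; hovering then hits this cache.
    with ThreadPoolExecutor(max_workers=8) as pool:
        return dict(zip(result_urls, pool.map(fetch, result_urls)))

cache = prefetch(["https://example.com/", "https://example.org/"])
print({url: len(page) for url, page in cache.items()})
```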
IBM Technology Helps People Learn to Read on the
Web
Journal News (NY) (06/11/06) Alterio, Julie Moran
Researchers at IBM have developed a Web-based technology that teaches
people how to read and gives encouragement when they pronounce a word
correctly. The product of 10 years of research in speech recognition and
human-computer interaction, Reading Companion uses voice-recognition
technology to "listen" as a user reads into a microphone while a panda on
the screen either responds with praise for a correct reading, or by
encouraging the reader to try again. "The way kids learn to read ideally
is sitting on a mother's or aunt's or grandfather's lap. You get instant
feedback when you pronounce a word wrong. That's what we were trying to
reproduce in a software solution," said IBM's Jennifer Lai. Most reading
programs focus mainly on comprehension, and pay little if any attention to
pronunciation. The software lets children practice their reading
independently, without having to worry about making mistakes in front of
the whole class. In addition to enhancing voice-recognition technologies
for general use, the Reading Companion could greatly improve literacy, as
nearly 90 million adults in the United States would function better in
society with a higher reading level, according to the National Center for
Family Literacy. IBM has pledged $2 million to deploy Reading Companion in
62 U.S. and Canadian schools and adult education centers, with plans to add
more international sites later this year. Reading Companion, the first
Web-based speech-recognition program, required developers to devise a
method for rapidly sending packets of audio data over the Internet while
ensuring that the software retains enough data to meaningfully analyze the
reader's fluency. Reading Companion could also have a significant
impact on adults whose native language is not English, as it challenges
them to read e-books on useful life skills such as writing a resume or
visiting a doctor.
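A toy sketch of the feedback loop described above might compare a speech
recognizer's output against the expected text word by word; the recognizer
itself is stubbed out here, and IBM's actual scoring method is not public:

```python
# Illustrative only: respond to each word the way the on-screen panda
# does--praise for a match, encouragement to retry for a mismatch.
def feedback(expected: str, heard: str) -> list[str]:
    heard_words = heard.lower().split()
    responses = []
    for i, word in enumerate(expected.lower().split()):
        if i < len(heard_words) and heard_words[i] == word:
            responses.append(f"'{word}': great reading!")
        else:
            responses.append(f"'{word}': let's try that one again.")
    return responses

for line in feedback("the cat sat on the mat", "the cat sat in the mat"):
    print(line)
```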
Google Researchers Propose TV Eavesdropping
InformationWeek (06/07/06) Claburn, Thomas
Google is in the early R&D stages of developing a scheme that would enable
a laptop PC to capture TV sound and immediately deliver personalized
Internet content to the computer. Two researchers from the company
presented a research paper on the use of ambient-audio identification
technology in such a manner last week at the interactive television
conference EURO ITV in Athens, Greece. "We showed how to sample the
ambient sound emitted from a TV and automatically determine what is being
watched from a small signature of the sound--all with complete privacy and
minuscule effort," Michele Covell and Shumeet Baluja wrote on the Google
Research Blog. "The system could keep up with the users while they channel
surf, presenting them with a real-time forum about a live political debate
one minute and an ad-hoc chat room for a sporting event in the next."
Google has not announced any specific product plans for a scheme that could
become a promising advertising tool for marketers who want a better
understanding of the mass media audience. The company maintains that it
takes privacy seriously, and that the system would not be intrusive to the
point of intercepting any conversations in the background. Google could
ultimately draw more people away from TV and to the Internet if the
technology proves to be a success, says analyst Cynthia Brumfield.
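A minimal sketch of the general idea, reducing a short ambient-sound
snippet to a compact, irreversible signature that can be matched against
indexed broadcast audio, appears below; the band-energy hash is an invented
stand-in, not the descriptor Covell and Baluja actually use:

```python
# Toy ambient-audio signature: hash coarse spectral energy comparisons
# into a small integer. The hash is irreversible, so no speech content
# is recoverable from it--only a match against known broadcast audio.
import numpy as np

def signature(samples: np.ndarray, bands: int = 32) -> int:
    spectrum = np.abs(np.fft.rfft(samples))
    energies = [chunk.sum() for chunk in np.array_split(spectrum, bands)]
    # One bit per adjacent band pair: which of the two is louder?
    bits = [int(energies[i] > energies[i + 1]) for i in range(bands - 1)]
    return int("".join(map(str, bits)), 2)

clip = np.random.default_rng(0).standard_normal(8000)  # stand-in mic snippet
print(hex(signature(clip)))
```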
Computer 'Beings' Evolve as Society
Discovery Channel (06/08/06) Staedter, Tracy
Five research institutions in Europe have teamed up to create a computer
simulation of an artificial world that reproduces individual, evolutionary,
and social learning. The NEW TIES (New and Emergent World Models Through
Individual, Evolutionary, and Social Learning) project will make use of
millions of computer-generated entities that live and pass on information
necessary for survival before they die. The team of computer scientists,
sociologists, and linguists involved in the project is using a computer to
randomly generate each agent; the agents will vary in gender, life
expectancy, fertility, size, and metabolism, enabling each one to respond
differently to the same set of circumstances. The researchers are focusing
on presenting the agents with challenges in order to examine how they adapt
and develop their own world models. They believe the project will have an
impact on machine learning, such as exploratory or search and rescue robots
that need to work together to accomplish a task. The team also believes
policy makers will be able to use the simulation computer project to see
how a new law would impact society. The researchers say tracking the
behavior of the numerous agents will be a challenge because they cannot
analyze each one on an individual basis. "You have to have new facilities
in data mining to understand what is going on in your population," says
senior researcher Michele Sebag, an artificial intelligence expert at the
University of Paris-Sud.
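An illustrative sketch of the randomized agents described above follows;
the trait names mirror the article, but the value ranges, types, and
structure are invented for the example:

```python
# Hypothetical NEW TIES-style agent with randomized traits; every value
# range here is an assumption made purely for illustration.
import random
from dataclasses import dataclass

@dataclass
class Agent:
    gender: str
    life_expectancy: int   # simulation ticks
    fertility: float       # chance of reproducing per encounter
    size: float
    metabolism: float

def random_agent(rng: random.Random) -> Agent:
    return Agent(
        gender=rng.choice(["female", "male"]),
        life_expectancy=rng.randint(50, 150),
        fertility=rng.random(),
        size=rng.uniform(0.5, 2.0),
        metabolism=rng.uniform(0.5, 2.0),
    )

population = [random_agent(random.Random(seed)) for seed in range(1000)]
print(population[0])
```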
Intel's Long-Range Research Focuses on Energy Efficiency
and Performance
Network World (06/08/06) Leung, Linda
At its fourth annual Research at Intel day, Intel opened its doors to the
public and showcased its latest research projects being developed under its
campaign to improve computers' energy efficiency and performance. In his
keynote address, CTO Justin Rattner said Intel hopes to achieve a 10-fold
improvement in the energy efficiency and performance of its processors over
the next three to four years. Intel is also pursuing a number of
enterprise projects focusing on virtualization, security, and data center
performance. One project uses traffic-adaptive filtering technology that
creates shortcuts--commonly used paths between the client and server.
Intel demonstrated the technology by launching a denial-of-service attack
against a router; the firewall kept a running video-streaming application
from being disrupted while also increasing throughput. Another project is
bringing Trusted Platform Modules (TPM) to the virtual computing
environment: Intel places software-based Virtual TPMs in front of virtual
machine clients to verify their status for the authentication server,
which then decides whether to grant each virtual machine access to the
server it is requesting. Intel partnered
with researchers at Arizona State University on a project focusing on
dynamic thermal management of a data center. The job scheduler considers
the temperature of individual servers or server blades when deciding where
in the data center to place a job, resulting in more holistic workload
management. Intel has also developed a technique for system nodes
to communicate with each other about potential low-level threats without
reporting false positives.
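A minimal sketch of temperature-aware placement in the spirit of the
scheduler described above, greedily sending the next job to the coolest
eligible node, follows; the threshold, node names, and greedy policy are
illustrative assumptions, not the Intel/Arizona State design:

```python
# Hypothetical temperature-aware job placement: skip nodes over a thermal
# limit and send the job to the coolest remaining one.
def place_job(temperatures: dict[str, float], limit: float = 75.0) -> str:
    eligible = {node: t for node, t in temperatures.items() if t < limit}
    if not eligible:
        raise RuntimeError("all nodes above thermal limit; defer the job")
    return min(eligible, key=eligible.get)  # coolest eligible node wins

print(place_job({"blade-1": 68.2, "blade-2": 55.9, "blade-3": 80.1}))
# -> blade-2
```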
Sun Labs' New Boss
CNet (06/08/06) Cooper, Charles
Bob Sproull, a member of the legendary team of researchers at Xerox's Palo
Alto Research Center (PARC) in the early 1970s, discussed his ambitions for
his new position as the head of Sun Microsystems Laboratories in a recent
interview. While at PARC, Sproull was part of the core group of developers
writing the Alto operating system. Sproull says that researchers at Sun
should support the activity of product developers by keeping apprised of
the latest developments in the technical community. Sproull describes the
current state of innovation as incremental, noting that the gradual
progression of research occasionally crosses thresholds, such as the effect
that the digital camera has had on the way people take pictures. Sproull
says that Sun Labs will continue its efforts to balance basic research with
the demands of the market, and that the primary focus of the research and
development division is to add value to Sun. To attract the top talent,
Sproull says that Sun has to set up international research centers and bring
foreign students to the United States. There is no concrete answer for how
technology companies should structure their staffing, but Sproull believes
that a healthy balance of outsourced and domestic workers is the best
approach. To ensure that the United States produces enough Ph.D.s to
sustain the growth of the IT sector, Sproull says that primary and secondary
schools must ensure that they are providing students with adequate training
in both technical and non-technical fields.
Researchers at Carnegie Mellon Study Cheaper Ways to Run
Data Centers
Chronicle of Higher Education (06/16/06) Vol. 52, No. 41, P. A33;
Kiernan, Vincent
Carnegie Mellon University has built a $1.2 million computer center to
study why it costs so much to run a data center. The Data Center
Observatory can hold 40 racks of computers and consume more energy than 750
average-size homes. Gregory R. Ganger, a professor of electrical and
computer engineering, says there is little information on where money goes
in running a data center, which can cost four to seven times as much
annually as building the facility. Electricity and maintenance are part of the
cost, but no one knows how much is spent on the various problems
computer-support members fix or how they divide up their time to address
arising issues. The data center includes equipment that will track the
performance of its systems, while computer-support staff will keep detailed
logs of their jobs, and researchers will analyze the logs and readings from
instruments. Ganger, head of the project, says the results should
ultimately help to lower the operational costs of data centers. The data
center features an efficient cooling strategy that has the computers blow
hot air into a "hot aisle," where it is cooled before it is allowed to mix
with the rest of the air in the room.
Momentum for Global Internet Regulation Mounting
E-Commerce Times (06/08/06) Koprowski, Gene J.
The possibility of a global regulatory framework for the Internet will
likely be a focus of the next World Summit on the Information Society
meeting scheduled for October 30 to November 2 in Athens. The groundwork
was already laid at the last WSIS meeting in Tunis with an agreement to
establish an Internet Governance Forum under the umbrella of the U.N. aimed
at encouraging international participation in Web governance. The author
of this article questions how an organization that has proven so inept at
its founding mission--keeping peace in the world--could do a good job of
governing the Internet, especially since the current model seems to be
working fine, with the private sector stepping in whenever a problem such
as spam rears its head. In the end the debate is not about a better
Internet but about wresting its perceived control away from U.S. hands.
"In many respects, the debate is about
who makes the rules, and how the process works," says Thomas Smedinghoff, a
partner at the Chicago law firm of Wildman Harrold. "But it's also a
debate between those who favor centralized regulation of Internet
activities and those who favor a market-driven environment free from
intergovernmental oversight and control."
Software Could Add Meaning to 'Wiki' Links
New Scientist (06/07/06) Sparkes, Matthew
Researchers at the University of Karlsruhe in Germany have made
alterations to the software that powers Wikipedia that would enable editors
to enhance the meaning of the links between pages. With the team's
modified MediaWiki system, authors could add meaningful tags, or
annotations, to
articles and the hypertext links that connect them. Relevant pages would
display the annotations buried in the tags, explaining the relationship
between two topics. Annotations could facilitate more intelligent searches
of wiki sites, the researchers claim, and they believe that specialized
communities that maintain their own wikis will likely be the first
adopters. "I think early adoption will be led by communities interested in
data such as animal species information," said the project's Markus
Krotzsch. "Semantic information is most useful in situations where data
can be clearly defined." Adding meaning to online content is the essence
of the vision for the Semantic Web promoted by Web architect Tim
Berners-Lee and others. The researchers are hopeful that Wikipedia will
incorporate their software, though they admit that it might have a hard
time supporting such a popular site--Wikipedia receives around 4,000 page
requests per second.
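As a sketch of what such annotated links might look like, semantic-wiki
extensions typically embed the relationship inside the link markup itself;
the exact syntax and the small parser below are illustrative assumptions,
not necessarily the Karlsruhe team's format:

```python
# Extract hypothetical typed links of the form [[relation::Target]] so a
# search tool could index the relationship, not just the link.
import re

LINK = re.compile(r"\[\[([^:\]]+)::([^\]]+)\]\]")

def typed_links(wikitext: str) -> list[tuple[str, str]]:
    return LINK.findall(wikitext)

text = ("London is the [[capital of::United Kingdom]] and lies on "
        "the [[located on::River Thames]].")
print(typed_links(text))
# -> [('capital of', 'United Kingdom'), ('located on', 'River Thames')]
```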
Deploying a Sensor Network in an Extreme
Environment
University of Southampton (ECS) (06/11/06) Martinez, K.; Padhy, P.;
Elsaify, A.
The GlacsWeb project employs long-lasting wireless sensor nodes implanted
beneath the surface of a glacier; the nodes take a fully customized
approach so that the researchers have direct control over power
management, software, and hardware. The passive sensor probes are encased
in plastic and lowered under the ice, feeding data into a low-power base
station on top of the deployment site; the base station currently runs
embedded Linux and spends most of the time in standby mode. The station
uses 500 mW 466 MHz radio modems to transmit the data to a PC at a local
cafe, and from there the data is routed to a U.K. server. The system's
performance since deployment reflects the researchers' design decisions and
how well they reduce expected risks. The data collected by the system has
not only yielded insights on sub-glacial processes, but also on system
behavior, such as the communications systems' tendency to be affected by
cold and rainy conditions. This year's GlacsWeb deployment will involve a
multiple-hop, self-configuring ad-hoc network that would ideally feature
fully autonomous probes requiring no manual intervention and offering
greater energy efficiency and enhanced data collection. Researchers' direct
control of sensor nodes has allowed a multi-agent-based sensor network
control protocol to be designed for the project. The researchers say the
earlier GlacsWeb deployments and their performance offered clues into
refining the system "to be more fault tolerant and 'smarter,'" giving them
reason to "believe that the deployments have proved to be essential to a
better understanding of how to make real sensor networks."
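A schematic sketch of the duty-cycling described above--a base station
that spends most of its time in standby and wakes briefly to collect and
forward probe readings--might look like this; the interval, data fields,
and function names are invented for illustration:

```python
# Hypothetical base-station loop: wake, poll the under-ice probes,
# uplink the readings, and return to low-power standby.
import time

def read_probes() -> list[dict]:
    # Placeholder for polling the sensor probes over radio.
    return [{"probe": 1, "pressure_kpa": 101.3, "temp_c": -0.4}]

def forward(readings: list[dict]) -> None:
    # Placeholder for the radio-modem hop toward the U.K. server.
    print("uplinking", readings)

def base_station_loop(wake_interval_s: float = 3.0, cycles: int = 3) -> None:
    for _ in range(cycles):
        forward(read_probes())       # brief active period
        time.sleep(wake_interval_s)  # stand-in for standby mode

base_station_loop()
```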
The Internet's Future
Washington Post (06/12/06) P. A20
As the Senate opens hearings on whether to write a Net neutrality
provision into law, it will hear arguments from a broad coalition warning
that a non-neutral Internet could dramatically slow connection speeds for
amateur users and small enterprises while extracting fees from larger
companies for swift delivery--a tiered Internet in which those able to pay
the most receive the best quality of service. This argument is baseless,
however, writes a Washington Post
editorial, as it discounts the fundamental economic realities of the
Internet service market. Net neutrality advocates warn that without
codifying a flat structure for delivery of Internet content, the Internet
would begin to resemble cable television, delivering only corporate
content. More than three-fifths of the country is served by at least four
broadband providers, creating a competitive environment where users have
legitimate alternatives in a self-organizing market, unlike the cable
industry. If one Internet service provider began charging additional fees
for rapid delivery, another provider in the market would step in and offer
a cheaper alternative. A more compelling argument for Net neutrality
claims that higher entry barriers to the Internet could stifle innovation,
as upstart companies would have a harder time competing with entrenched
players and developing new applications, raising the question of whether
Internet telephony or instant messaging would have taken off had bandwidth
been a rarer commodity. However, with the U.S. Internet infrastructure
falling behind that of East Asia and Europe, a non-neutral Internet would
enable AT&T, Verizon, and others to offer faster connections in more parts
of the country, enabling the spread of streaming video and other services,
the Post argues.
Trust Me, I'm a Robot
Economist Technology Quarterly (06/06) Vol. 379, No. 8481, P. 18
Important guidelines about the safety and ethical uses of robot technology
must be developed as robots migrate from the industrial sector to the
consumer arena, according to a new robo-ethics group that recently gathered
in Italy to discuss the issue. Chairman of the Swedish Royal Institute of
Technology's European Robotics Network Henrik Christensen expects the
legality of robotic sex dolls resembling children and the admission into
households of robots that are strong or heavy enough to crush people to be
among the many issues that will gain relevance in the next several years.
As robots become more complex, autonomous, and learning-capable, the
question of whether their designers should be liable for accidents or
malfunctions will become more difficult to answer, notes University of
Southern Denmark professor John Hallam. University of Sussex artificial
intelligence expert Blay Whitby says efforts to address these concerns are
so far insufficient, but there is growing interest among researchers to
improve robot safety. The regulation of robot behavior will become more
complicated as self-learning mechanisms are incorporated into robotic
systems, explains Institute of Intelligent Systems for Automation
roboticist Gianmarco Veruggio; unpredictable failures will further cloud
the issue. Whitby says Isaac Asimov's vaunted Three Laws of Robotics will
not work because they require the presence of a human-like intelligence to
operate, which is beyond the capabilities of robots today. iRobot's Colin
Angle doubts that learning-capable, general-purpose robots will grow
pervasive, and instead expects relatively dumb machines designed for
specific chores to become the norm.
The Case for the Two Semantic Webs
KMWorld (06/06) Vol. 15, No. 6, P. 18; Weinberger, David
Though the Web is full of meaningful data and contextualized links that
often describe the contents of the destination page, the calls for the
Semantic Web stem from the frustration at the inability of the syntax of
the Web (HTML) to capture that meaning, writes David Weinberger. The new
syntax, Resource Description Framework (RDF), describes relationships
between two terms, collectively forming an ontology. The Semantic Web
standard OWL is used to express ontologies. Beyond RDF, Semantic Web
proponents agree on very little, however. There are multiple ontologies
for law terms that compete with each other, and each suffers from trying to
create comprehensive, objective descriptions for an overwhelmingly large
body of inherently subjective material. An alternative to this top-down
approach calls for creating as few new ontologies as possible, relying
instead on existing ontologies that could come from other domains. Rather
than creating a new definition for a relationship, users should point, via
a URI, to an existing ontology--such as the widely used Dublin Core--that
already defines it. That way, applications will see that the relationship
has a common definition on all sites that support the Dublin Core. This
approach calls for building the Semantic Web incrementally, and
while it lacks an overarching development plan, it is more agile than the
top-down plans and thus more likely to succeed. Opinions vary widely on
the transformative potential of the Semantic Web, but Weinberger argues
that most of the ways that users currently add meaning to the Web, such as
reputation systems, XML playlists, and buddy lists, will continue very much
as they are today, and that "the Semantic Web will help where it helps."
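A small sketch of this reuse-first approach, written with the open-source
rdflib library (a tooling choice made for this example; the article names
none), shows a site pointing at Dublin Core's existing "creator" and
"title" relations by URI instead of coining its own:

```python
# Reuse Dublin Core's vocabulary rather than defining new relations; any
# RDF-aware application recognizes these URIs across sites.
from rdflib import Graph, Literal, Namespace, URIRef

DC = Namespace("http://purl.org/dc/elements/1.1/")  # Dublin Core elements

g = Graph()
page = URIRef("http://example.org/articles/two-semantic-webs")  # hypothetical page
g.add((page, DC.creator, Literal("David Weinberger")))
g.add((page, DC.title, Literal("The Case for the Two Semantic Webs")))

print(g.serialize(format="turtle"))
```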
Gone Swimmin'
IEEE Spectrum (06/06) Theberge, Michelle; Dudek, Gregory
The latest product of a multi-university effort by Canadian researchers to
develop an unmanned, durable vehicle that can autonomously probe and gather
data in aquatic environments is Aqua, an amphibious, six-flippered robot
that can travel both underwater and on land. The battery-powered robot's
compact design makes it easier to deploy than earlier underwater vehicles
(UVs), while its half-dozen flippers give it a wider range of motion,
maneuverability, and stabilization. Aqua's vision is provided by three
video cameras--two fore and one aft--and its video input is sent to its
operator through a fiber-optic tether. The machine can be controlled
remotely by joystick or can autonomously respond to visual cues. The
robot's walking ability, facilitated by the addition of rubber
appendages, was inherited from RHex, a hexapod robot developed by a joint
American-Canadian research program sponsored by the U.S. Defense Advanced
Research Projects Agency. An Aqua field test in Barbados underlined
several technical challenges: Overheating posed no difficulty underwater,
but on land it became a problem, mitigated by shading the UV. A bigger
problem was the tendency of Aqua's vision and control modules to crash
whenever the vision module experienced a surge in its energy demand; this
issue was addressed by powering the vision and control modules
independently. It is the researchers' hope
that the monitoring abilities of machines such as Aqua will help
conservationists preserve the world's endangered coral reefs.