This Is Only a Drill: In California, Testing Technology
in a Disaster Response
New York Times (08/28/06) P. C1; Markoff, John
In an effort to modernize emergency-response tools, groups from the
Pentagon, nongovernmental organizations, and dozens of technology companies
participated in a five-day simulation to test their latest digital
disaster-response tools. Dubbed Strong Angel III, the training effort
brought together more than 800 military officers, first responders, and
experts in wireless networking from technology companies such as Google and
Microsoft. "My view is that the value of Strong Angel is 70 percent in the
social networks that will be created," said Eric Rasmussen, a Navy surgeon
who organized the conference. "What we do is try to bring people with
disparate backgrounds together and ensure that they are forced to enter
into a conversation." The participants began by constructing a makeshift
command center in a vacant building near the San Diego airport. An effort
to create a state-of-the-art ad hoc wireless network that could route
satellite map coordinates, video images, and other data initially failed to
get off the ground, as the network jammed under an overload of
bandwidth-intensive applications. There were some notable successes,
however, such as the work of several companies to enable sharing of a set
of digital satellite maps based on a Microsoft technology called Simple
Sharing Extensions. The
technology, which was built on industry standards such as RSS, was used to
overlay event data relayed by emergency workers from across the San Diego
area onto the maps. Bringing together rivals like Google and Microsoft to
collaborate on projects such as the satellite-image mapping application was
at the core of Rasmussen's vision for the event, he said.
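The article does not spell out the feed format, but the core idea of Simple
Sharing Extensions is that each feed item carries synchronization metadata
so that independently edited copies of a feed can be merged. Below is a
minimal Python sketch of that merge idea, using invented field names and
event data rather than the actual SSE elements:

```python
# Hypothetical sketch: merging two feeds of map-event items the way
# Simple Sharing Extensions merges RSS items -- per-item IDs plus
# update counters decide which copy of an item wins. The field names
# are illustrative, not the actual SSE element names.

def merge_feeds(local, remote):
    """Merge two {item_id: item} dicts; the higher update count wins."""
    merged = dict(local)
    for item_id, item in remote.items():
        mine = merged.get(item_id)
        if mine is None or item["updates"] > mine["updates"]:
            merged[item_id] = item
    return merged

command_post = {
    "evt-17": {"updates": 2, "lat": 32.73, "lon": -117.19,
               "note": "water station set up near runway"},
}
field_team = {
    "evt-17": {"updates": 3, "lat": 32.73, "lon": -117.19,
               "note": "water station relocated to hangar 3"},
    "evt-21": {"updates": 1, "lat": 32.71, "lon": -117.16,
               "note": "road closure reported"},
}

shared = merge_feeds(command_post, field_team)
for eid, evt in sorted(shared.items()):
    print(eid, evt["note"])  # a map layer would plot evt["lat"], evt["lon"]
```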
Anita Borg Institute Names Scholarship Winners
Electronic News (08/25/06)
The Anita Borg Institute for Women and Technology (ABI) has recognized the
efforts of three women in advancing the role of women in technology and
awarded them Change Agent scholarships that cover all expenses to attend
the upcoming Grace Hopper Celebration of Women in Computing Conference.
The winners are Ijeoma Ihenachor of the Nigerian Airspace Management
Agency; Claudia Medeiros, a computer science professor at the Universidade
Estadual de Campinas in Sao Paulo, Brazil; and Suriya Thevar, a professor
and head of the Department of Library and Information Science at Annamalai
University in India. Intel has announced that it will become a full
sponsor of ABI and provide funding for the Anita Borg Leadership Award,
which will be announced at the conference. "Companies big and small must
aggressively draw on the valuable talents and life experiences of women to
compete effectively in the global high-tech industry," said Intel CTO
Justin Rattner. "Our investment in ABI is but one of the steps Intel is
taking to ensure diversity in our workforce that ultimately results in
greater creativity and innovation." Ihenachor has been a leader of
Nigeria's "Take a Daughter to Work" program, and serves as an executive
member of the Nigerian Society of Engineers. In her work, Medeiros has
focused on designing and developing scientific databases, and has played a
leading role in more than 30 multinational research and development
projects. Thevar is India's ambassador to ACM and serves as director of
Annamalai's Women's Training Center in Information and Computer
Technology.
Offshoring Data to Get Recrunched
EE Times (08/25/06) Leopold, George
The National Academy of Public Administration is reexamining the
offshoring data compiled by the Commerce Department in an effort to
determine the impact of sending an increasing number of U.S. tech jobs to
China, India, and other lower-cost countries. The panel aims to create
a framework for sharing and analyzing the data collected by government
agencies. One result of the panel's work could be to formulate a
memorandum explaining the implications of the labor statistics on
outsourcing and offshoring that could inform a government response to the
dramatic increase in the number of jobs being sent overseas. Contrary to
the position of the industry, the Commerce study found that in an effort to
save on labor costs, design-engineering jobs are in fact being sent to
countries such as India. The report, which was suppressed by the White
House for two years, also found that government and industry needed to
provide more data to flesh out the analysis of the effects of outsourcing.
Many companies are spending more aggressively on setting up facilities
overseas than at home, said project director Kenneth Ryder, citing AMD,
which last week announced that it would open a second research campus in
Shanghai, part of a region that accounts for more than two-thirds of the
world's notebook PC production.
3D Design Platform Connects Varying-Standard
Applications
IST Results (08/28/06)
While the technology behind 3D graphics design has taken off in recent
years, progress in the field is still plagued by interoperability issues
among the software tools that designers use to create content. To address
the problem, IST is funding the Uni-Verse project, an open, distributed
Internet-based platform to create 3D graphics, bridging the gap between
open-source and proprietary tools. The system dispenses with the need to
convert incompatible files, which streamlines the design process and
reduces mistakes. The platform is built on the low-latency Verse protocol,
which enables multiple applications to work in concert by transmitting data
over a network. To harness the compatibility of Verse, the developers
rewrote the code of open-source tools such as Blender, and developed
plug-ins for proprietary tools such as 3D Studio Max. Developers in
different locations can use the Internet-based platform to work
simultaneously on the same project. Many designers cannot see the final
product of their work as they go, leaving them feeling that they are
working blind; Uni-Verse automatically converts 3D texturing work and adds
it to the model, so designers do not have to wait for a rendering pass to
see what changes their work needs. In addition to expediting the workflow
at design studios, the
Uni-Verse platform could also have significant benefits for architectural
firms. "There is a lot of interest in this platform," said Uni-Verse
coordinator Gert Svensson. "We were at the recent SIGGRAPH tradeshow in
Boston and the general consensus among people working in the 3D design
industry is that Uni-Verse solves a technological problem that is extremely
important in the sector."
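The article does not detail how Verse keeps tools synchronized, but the
behavior it describes amounts to a shared scene that pushes each change to
every subscribed application. Here is a schematic Python sketch of that
pattern; it is illustrative only, not the actual Verse API:

```python
# Schematic sketch of the publish/subscribe pattern the article
# attributes to the Verse protocol: every tool edits a shared scene,
# and each change is pushed to all subscribers immediately instead of
# being exchanged through file conversion.

class SharedScene:
    def __init__(self):
        self.nodes = {}        # node name -> attribute dict
        self.subscribers = []  # callbacks, e.g. a renderer or a modeler

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def set_attribute(self, node, key, value):
        self.nodes.setdefault(node, {})[key] = value
        for notify in self.subscribers:  # low-latency push to every tool
            notify(node, key, value)

scene = SharedScene()
# A renderer reacts the moment a modeling tool commits a change, so
# designers see textured results without waiting for an export step.
scene.subscribe(lambda n, k, v: print(f"renderer: {n}.{k} -> {v}"))
scene.set_attribute("chair", "texture", "oak.png")
scene.set_attribute("chair", "height", 0.95)
```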
The Future of Programming: Less Is More
eWeek (08/28/06) Taft, Daryl K.
The rise of open source has shifted the face of programming toward a more
dynamic framework that will be less encumbered by configuration concerns
and proprietary infrastructures, some developers say. The grassroots
community-development model produces simplified code that could bring the
age of vendor-created enterprise code to an end. XML will be the core
data type for languages and databases by 2010, said Borland Software's
David Intersimone, who added that future languages will support
verification of correctness and testing through syntax extensions.
Artifact management,
testing, audits, and refactoring will all be automatic in the development
environment of the future, Intersimone says. By 2010, he predicts, there
will be Web services and libraries of reusable, distributed objects
available for developers. The platform of the future should enable
developers to extend an existing language, but also simplify the process of
creating a new language with an intelligent editor for it, according to
JetBrains CEO Sergey Dmitriev. "To run programs written in such a DSL
[domain specific language], the platform should support writing generators
to any existing runtime platform--Java or .Net or whatever," he said.
"Using such specialized DSLs allows writing programs on a much higher
level, so these programs will be much more maintainable and expressive."
Dmitriev terms this approach language-oriented programming, and expects
future languages to offer more expressive knowledge representations.
Programming needs to become creative again, Dmitriev says, which will
require better tools. An understanding of new programming skills,
particularly service-oriented architecture for business, will be
increasingly important, says JackBe CTO John Crupi. "In the future,
Web-based applications will subscribe to business events and be mostly
based on this interaction model. This new Web event model requires
programmers to program at the business event level and less at the user
event level."
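Dmitriev's point about generators is easiest to see in miniature: a DSL
stays small and declarative because a generator translates it into a
conventional runtime language. The following toy Python sketch uses an
invented pricing DSL; none of it comes from JetBrains' tooling:

```python
# Toy illustration of "language-oriented programming": a tiny
# domain-specific language plus a generator that targets an existing
# runtime (here, plain Python source). The syntax is entirely invented.

DSL_PROGRAM = """
discount gold    15
discount silver  10
discount default  0
"""

def generate(dsl_text):
    """Generate runnable Python from the pricing DSL."""
    lines = ["def discount(tier):"]
    for raw in dsl_text.strip().splitlines():
        _, tier, pct = raw.split()
        if tier == "default":
            lines.append(f"    return {pct}")
        else:
            lines.append(f"    if tier == '{tier}': return {pct}")
    return "\n".join(lines)

code = generate(DSL_PROGRAM)
namespace = {}
exec(code, namespace)                 # "writing generators to any
print(namespace["discount"]("gold"))  # existing runtime platform"
```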
Patent Fight Rattles Academic Computing
Associated Press (08/27/06) Pope, Justin
Washington, D.C.-based Blackboard has been awarded a patent that
establishes its claims to some of the essential features of the software
that powers online education. Blackboard's patent does not refer to any
device or to specific software code, but rather to the basic framework of
so-called "Learning Management Systems." Critics of Blackboard's patent
say it lays claim to the very idea of e-learning. They add that if the
patent is allowed to stand, it could hamper the cooperation between
academia and the private sector that has characterized e-learning for
years and that explains why online classrooms are so much better than they
used to be. Michael
Feldstein, assistant director of the State University of New York's online
learning network, says Blackboard's patent is "antithetical to the way that
academia makes progress." Critics such as Feldstein have taken to the
blogosphere to make their case against Blackboard's patent. Over the last
several weeks, a voluminous Wikipedia entry has emerged tracking a history
of virtual classrooms as far back as 1945 in an effort to prove that the
idea was not Blackboard's. For its part, Blackboard--which recently became
the leading company in the field by purchasing rival WebCT--says critics
misunderstand what the patent claims. The company says the patent is
necessary to protect its $100 million investment in
e-learning technology. "It just wouldn't be a level playing field if
someone could come onto the scene tomorrow, copy everything that Blackboard
and WebCT have done and call it their own," said Blackboard general counsel
Matthew Small.
African Languages Grow as a Wikipedia Presence
New York Times (08/26/06) P. A15; Cohen, Noam
An African-language version of Wikipedia, the open-source Internet
encyclopedia, was a topic of discussion at the second annual Wikimania
conference this month. There are currently about 38 Wikipedias written in
African tongues, though most of them contain few or no articles; the first
African-language Wikipedia to contain 1,000 articles is the one written in
Swahili. Ironically, many of the Wikipedias' primary contributors are
people who did not grow up speaking the language. "[The main contributors
to the Swahili Wikipedia] are all white, and to me it is very
interesting--it shows that the world is not flat, that the world is still
round," noted Wikipedia contributor Ndesanjo Macha. "We have allies,
people who are willing to help us, but we need to be in charge of our own
identity. When it comes to producing information, we don't want to be
dependent." Yale University researcher Martin Benjamin said Africanist
professors have previously been hesitant to contribute to African-language
Wikipedias, because they feel the process takes too long, find the
technology intimidating, or hold a snobbish attitude toward amateur
contributors. Dutchman Kasper Souren is attempting to generate
interest in a Wikipedia written in the Bambara dialect by offering to pay
contributors a dollar for every article, a strategy that critics say
undermines the online encyclopedia's open-source principles. Wikipedia
founder Jimmy Wales said at the conference that the Wikipedia foundation
would probably get a grant before the end of the year to support an
African-language Wikipedia facilitator who would coordinate contributions
among bloggers, academics, community leaders, and graduate students and
"jumpstart" the encyclopedia's expansion.
Tiny Ion Pump Sets New Standard in Cooling Hot Computer
Microchips
UW News (08/23/06)
Researchers at the University of Washington have developed a reliable and
efficient cooling device that could fit on a computer chip. The tiny
device creates an air jet at the chip's surface through an electrical
charge, an advance that could be invaluable as heating becomes an
increasing problem with smaller and denser chips. "With this pump, we are
able to integrate the entire cooling system right onto a chip," said
Alexander Mamishev, associate professor of electrical engineering at
Washington. "That allows for cooling in applications and spaces where it
just wasn't realistic to do before." The idea is not new, but the
researchers' prototype is the first working device created using the
method. Using an electrical field, the device can propel air at speeds
previously attainable only with conventional blowers. In testing, the pump
significantly cooled a heated surface using just 0.6 watts of power. The
prototype features an emitter with a tip radius of around 1 micron, which
creates air ions that are propelled within an electric field to the surface
of a collector. While traveling to the collector, the ions create a stream
of air that blows over the chip's surface, whisking away the heat. The
pump could be an improvement over cooling systems that circulate liquids
over the chip's surface. While the technology is promising, the pump is
still very complex, and it remains uncertain which materials will be best
to build such high-performance and durable systems.
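The article gives few operating numbers beyond the 0.6-watt figure, but
the ionic-wind principle it describes can be sized roughly with textbook
values. A back-of-envelope Python sketch, with every input assumed for
illustration rather than taken from the article:

```python
# Back-of-envelope sketch of the ionic-wind principle: ions created at
# a sharp emitter tip drift toward a collector under an electric field,
# dragging neutral air along with them. All values below are assumed.

ion_mobility = 2.0e-4  # m^2/(V*s), typical for small ions in air (assumed)
field        = 5.0e5   # V/m, assumed field between emitter and collector

drift_velocity = ion_mobility * field  # ion drift speed, v = mu * E
print(f"ion drift speed ~ {drift_velocity:.0f} m/s")

# The bulk air jet is far slower than the ion drift speed, since only a
# small share of the ions' momentum transfers to neutral molecules, but
# even a few m/s of airflow at the chip surface aids convection.
```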
Super Computer: Tech Guru Tackled Social Ills, Too
Wall Street Journal (08/26/06) P. A7; Clark, Don; Miller, Stephen
William C. Norris, the founder of a company that made faster computing
machines than IBM in the 1960s, died Aug. 21 at age 95. Norris grew up in
Nebraska during the Depression and attended the University of Nebraska,
before he had his first experience with calculating machines while helping
the U.S. Navy decode enemy communications during World War II. He joined
several other veterans in setting up Engineering Research Associates in St.
Paul, Minn., in 1946, but left in 1957 to form Control Data, which later
prevailed in an antitrust suit against IBM, settling on favorable terms in
the early 1970s. Control Data would grow to employ 60,000 by 1984, but
competition from Japanese companies and IBM in the large commercial
computer category and the emergence of the personal computer would force
Norris to sell and close some businesses. By that time, Control Data had
branched out into computer services, and Norris' pioneering of computer
services in education and other areas enabled the company to survive. "He
was way ahead of his time in understanding that computer hardware was going
to become commoditized," says Robert Price, who replaced Norris as CEO when
he retired in 1986. Norris did not care much for Wall Street, which
criticized him for focusing too much on social initiatives, such as
building plants in inner cities, setting up training centers for teachers
and engineers, and launching ventures targeted to farmers, convicts,
and entrepreneurs. Norris believed his greatest accomplishment was Plato,
an early online community where learning, message exchanging, and
game-playing took place.
Reaching Agreement Over Ontology Alignments
University of Southampton (ECS) (08/24/06) Laera, Loredana; Tamma,
Valentina; Euzenat, Jerome
Ontologies are critical for inter-agent communication, and
interoperability depends on the ability to reconcile disparate existing
ontologies, which may vary in format and overlap in domain; this
reconciliation typically relies on correspondences, or mappings, between
agent ontologies. The authors offer a framework enabling
agents to agree on the terminology they use for communication by permitting
them to express their preferred choices over candidate correspondences. A
value-based argumentation framework is employed for the computation of each
agent's preferred ontology alignments. The basis of argumentation is an
exchange of arguments, for or against a correspondence, that interact with
each other through an "attack" relation. Each argument instantiates an
argumentation schema, drawing on domain knowledge taken from extensional
and intensional ontology definitions. With the generation of a
full set of arguments and counter-arguments, the agents consider which of
them should be accepted. The authors define two types of alignment, agreed
and agreeable: the agreed alignment is the set of mappings based on those
arguments contained in every preferred
extension of every agent, while the agreeable alignment is the extension of
the agreed alignment with those mappings supported by arguments which are
in some preferred extension of every agent. "The dialogue between the
agents can...consist simply of the exchange of individual argumentation
frameworks, from which they can individually compute acceptable mappings,"
write the authors.
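The two definitions reduce to simple set operations over each agent's
preferred extensions. A minimal Python sketch, with made-up mappings and
extensions:

```python
# Sketch of the paper's two alignment notions using Python sets. Each
# agent has one or more "preferred extensions" -- sets of mappings it
# can accept. The mappings and extensions below are made up.

preferred = {  # agent -> list of preferred extensions
    "agent_a": [{"m1", "m2"}, {"m1", "m3"}],
    "agent_b": [{"m1", "m2", "m3"}],
}

def agreed(preferred):
    """Mappings in EVERY preferred extension of EVERY agent."""
    exts = [e for agent_exts in preferred.values() for e in agent_exts]
    return set.intersection(*exts)

def agreeable(preferred):
    """Agreed mappings plus those in SOME extension of every agent."""
    per_agent = [set.union(*exts) for exts in preferred.values()]
    return agreed(preferred) | set.intersection(*per_agent)

print(sorted(agreed(preferred)))     # ['m1']: in all extensions
print(sorted(agreeable(preferred)))  # ['m1', 'm2', 'm3']: each appears
                                     # in some extension of both agents
```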
Internet Search Engines Go on Trial
New Scientist (08/19/06) Vol. 191, No. 2565, P. 24; Reilly, Michael
Lawsuits targeting search engines allege that the engines discriminate
against vendors by unjustly manipulating their rankings, raising the
larger question of whether search engine algorithms truly generate biased
results. New York
University's Helen Nissenbaum, who claims there is an inherent bias in
search engine results, contends that knowledge of engines' operations is
in the public interest, given the technology's widespread presence.
"These lawsuits are important because they can start the discussion of
search engine transparency," she reasons. Studies by researchers have
generally agreed with Nissenbaum's theories about biased results, noting
that search engines and the hyperlinked Internet framework adhere to a
popularity-based scheme in which newer and smaller Web sites are pushed to
the margins. Users tend to click mainly on popular, high-ranking pages,
which boosts their popularity and traffic, while less popular sites
maintain a low ranking. A recent Indiana University study that focused on
three Internet models and their impact on traffic to popular and less
popular sites concluded that search engines mitigate rather than increase
the inherent bias of the Internet. Study co-author Filippo Menczer says
most submitted queries are specific as well as variegated, and bias
"doesn't go away entirely, but you can see it's more spread out." Santa
Clara University cyber-law researcher Eric Goldman argues that exposing the
operational mechanisms of search engines could actually increase bias
through the machinations of organizations or individuals.
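The popularity feedback loop described by the researchers is easy to
simulate: if each click is drawn in proportion to current popularity,
early leaders tend to pull away from newer sites. A toy Python model with
invented parameters:

```python
# Toy simulation of the "rich get richer" dynamic: users click in
# proportion to current popularity, so the initial leader's advantage
# compounds while small sites stagnate. Parameters are invented.
import random

random.seed(1)
popularity = [10, 5, 1, 1]  # site 0 starts out most popular

for _ in range(10_000):     # each iteration is one user click
    r = random.uniform(0, sum(popularity))
    cum = 0.0
    for site, p in enumerate(popularity):
        cum += p
        if r <= cum:
            popularity[site] += 1  # a click makes the site more visible
            break

print(popularity)  # the early leader keeps a dominant share of clicks
```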
Lawyer Sees Need in Hollywood for New Digital Licensing
System
Investor's Business Daily (08/24/06) P. A4; Deagon, Brian
Carole Handler of Foley & Lardner says in an interview that Hollywood is
finally acknowledging that digital piracy is an intractable problem
without a digital content licensing system. She cites Stanford professor
Larry Lessig's vision of a creative commons that "would reserve some rights
but not all rights to content, freeing some rights that copyright owners
traditionally exploited." Handler describes the creative commons as a
compromise measure stating that "there are definitely protectable and
copyrightable interests. But there are other rights of the copyright owner
that really are changing by virtue of new technology." Determining the
scope of fair use is a key copyright issue that must be addressed in the
context of new media, according to Handler. She notes that the movie
industry has begun to realize that the digital downloading of film
represents an opportunity rather than a disadvantage. "What [the studios]
are doing, intelligently I think, is saying, 'This is technology that
people want to use to see their movies,'" she says. Handler observes that
new technology such as digital video recorders has shifted control of
content from broadcasters to consumers, which calls for new business
models.
Mainframes Learn New Tricks
eWeek (08/21/06) Vol. 23, No. 33, P. 13; Taft, Daryl K.
As the number of developers with mainframe skills declines, IBM and some
of its affiliates are partnering with universities to revive interest, and
IBM is developing new mainframe tools and programs to support modern
architectures. In recent years, IBM has seen its mainframe business pick
up while the number of developers who write programs for mainframes has
actually declined. With many programmers skilled in IBM's z/OS mainframe
operating system nearing retirement, the shortage in that area will be
particularly acute. Many newcomers enter the field with little or no
mainframe experience because many colleges have dropped their
mainframe-related courses. IBM's Academic Initiative for System z now
includes more than 250 colleges and universities throughout the world,
bringing mainframe instruction to more than 10,000 students. IBM partnered
with the Share conference to launch zNextGen, a community for professionals
who are new to the mainframe environment. IBM got involved with the
program both to foster a community of mainframe professionals who could
interact and learn from each other, and to formulate an agenda that IBM and
Share could help them work through. Since the mainframe sector is so
heavily skewed toward older workers, entrants to the field have a wealth of
established best practices and mentors to use as resources. Professors
attending the Share conference suggested that the name "mainframe" might be
dropped in favor of a new term that does not carry the musty connotations
of decades-old technology, such as "large-scale computing" or
"large-systems computing." IBM is also developing a new set of mainframe
tools to cover run times, testing, debugging, service-oriented
architectures, and other functions. Also, COBOL programmers need to
cultivate skills in more up-to-date architectures such as Java, XML, and
SOA, said IBM's Michael Connor.
The Educated Browser
EContent (09/06) Vol. 29, No. 7, P. 12; Bernstein, Jared
A team of researchers at George Mason University is developing techniques
to simplify the process of compiling bibliographies for academic research
projects. Led by assistant professor Dan Cohen, the team was originally
working with the Scribe initiative, a free program under GMU's Center for
History and New Media (CHNM) that supplies electronic note cards to users
so that they can manage citations by entering the metadata for each source.
Confident that there must be another way, Cohen and his group secured
funding from the Institute of Museum and Library Services to develop a
better alternative. Less than a year later, they produced Scholar, an
open-source tool that runs in the Firefox browser, unlike existing tools
such as
EndNote that run as a separate application. "The Web browser is where
students, teachers, and professors are doing an ever-greater amount of
their research," Cohen said. "And with digitized collections such as
Google's massive library project coming online in the next few years, the
amount of time spent working in the browser will become even more
significant." The program will enable researchers to grab a citation with
a single click and store it in their browser. They can then take notes on
the item, link it to others, and organize the annotations and metadata
together, which CHNM says should improve the functionality of museum and
library collections. The information retrieved by the SmartFox tool is
completely searchable, and it is stored on the client's computer, rather
than the institution's server. "After putting the bibliographic
information into the browser, the program becomes aware of what they're
researching," Cohen said, adding that Scholar can detect citations on a Web
page, take snapshots of a page, and add descriptions to digital images.
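The article describes the workflow (one-click capture, client-side
storage, notes and links attached to each source) without implementation
details. A rough Python sketch of such a client-side citation store
follows; the function and field names are hypothetical, not Scholar's:

```python
# Rough sketch of the workflow described for Scholar: grab citation
# metadata, store it locally on the client, and attach notes. The
# names and fields here are hypothetical.
import json, pathlib

LIBRARY = pathlib.Path("citations.json")  # local store, not a server

def load():
    return json.loads(LIBRARY.read_text()) if LIBRARY.exists() else {}

def grab_citation(url, title, author, year):
    """One-click capture: record a source plus room for notes/links."""
    library = load()
    library[url] = {"title": title, "author": author, "year": year,
                    "notes": [], "links": []}
    LIBRARY.write_text(json.dumps(library, indent=2))

grab_citation("https://example.org/paper", "A Sample Source",
              "Doe, J.", 2006)
library = load()
library["https://example.org/paper"]["notes"].append("key primary source")
LIBRARY.write_text(json.dumps(library, indent=2))
print(sorted(library))  # plain local data, so it is fully searchable
```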
Put the User in the Driver's Seat
Embedded Systems Design (08/06) Vol. 19, No. 8, P. 37; Murphy, Niall
The automotive industry has embraced the philosophy that a successful user
interface makes the user feel in control, writes user interface designer
and author Niall Murphy, but the electronics industry has yet to adopt
this view, as demonstrated by the many products consumers return in
perfect working order because they cannot master them. Some interfaces
try to second-guess the user, executing actions automatically by
constantly adjusting their scale to incoming data, but Murphy recommends
making the scale-changing mechanism easier to use rather than automating
it. Automation must not be excessive; otherwise the user's sense of
control erodes and the user becomes less accepting of the device. An
overabundance of
configuration options can also be detrimental, and Murphy notes that "Most
users want to get on with using the product, not spend time tweaking the
interface that the programmers should have gotten right the first time."
The gulf of evaluation is the mental distance a user travels between the
data presented on the user interface and useful information necessary for
formulating a course of action; reducing this gulf involves lowering the
number of things the user must recall, or the volume of information the
user must
obtain from another source. The gulf of execution is the distance between
the start of the decided course of action and the desired end result, and
this distance increases as the complexity of the mapping from action to
result grows. The gulf of execution shrinks if the user interaction is
brought closer to the user's initial decision. One way to reduce the gulf
of execution is to restrict the number of buttons or mechanisms on an
interface that must be manipulated to perform an action, Murphy writes.
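Murphy's recommendation can be made concrete: rather than silently
rescaling to incoming data, give the user one easy control. A hypothetical
Python sketch of a display with a single scale-cycling button, with
invented ranges:

```python
# Sketch of Murphy's advice: instead of auto-scaling a display to
# incoming data (which erodes the user's sense of control), make the
# manual scale control trivially easy -- one button cycling through a
# few sensible ranges. The ranges are invented.

RANGES = [(0, 10), (0, 100), (0, 1000)]

class Display:
    def __init__(self):
        self.range_index = 0

    def on_scale_button(self):
        """One button press cycles the scale: one action, one result."""
        self.range_index = (self.range_index + 1) % len(RANGES)
        return RANGES[self.range_index]

    def render(self, value):
        lo, hi = RANGES[self.range_index]
        return f"{min(max(value, lo), hi)} (scale {lo}-{hi})"

d = Display()
print(d.render(250))        # clipped at 10: the user sees a rescale is needed
print(d.on_scale_button())  # the user, not the device, changes the scale
print(d.on_scale_button())
print(d.render(250))        # 250 is now visible on the 0-1000 range
```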
When TCP Breaks: Delay- and Disruption-Tolerant
Networking
Internet Computing (08/06) Vol. 10, No. 4, P. 72; Farrell, Stephen;
Cahill, Vinny; Geraghty, Dermot
Research groups are devising delay- and disruption-tolerant networking
protocols for scenarios in which long delays and link disruptions leave
standard Internet protocols unable to function. The Internet
Research Task Force's (IRTF) Delay-Tolerant Networking Research Group
(DTNRG) is working on a pair of protocols--the Bundle Protocol and the
Licklider Transmission Protocol (LTP)--and this work overlaps with
projects from the Defense Advanced Research Projects Agency (DARPA) on
disruption-tolerant networking (DTN) and the interplanetary networking
(IPN) group. The Bundle Protocol, an overlay network store-and-forward
protocol, can piggyback on the current Internet protocol suite and packages
a unit of application data as well as any necessary control information;
the bundle is then forwarded by nodes along a path composed of several
intermediate machines that can each store it for substantial periods. The
LTP is a point-to-point protocol that is both delay- and
disruption-tolerant, employing a communications daemon to handle all
disruptive events. Though LTP can be used in other contexts, it is chiefly
designed to serve as a potential convergence layer to support the Bundle
Protocol. Other potentially relevant DTN protocols include the
carrier-pigeon IP protocol. The largest missing ingredient in a viable DTN
scheme is reliable routing, but hopes are high that commercial DTN
applications are on the horizon.
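The store-and-forward behavior described above can be sketched as nodes
that hold bundles until a link to the next hop comes up. A schematic
Python model follows; it is illustrative only, not the DTNRG wire
protocol:

```python
# Schematic sketch of Bundle Protocol-style store-and-forward: each
# node holds a bundle until a link to the next hop is available,
# however long the disruption lasts. Illustrative only.
from collections import deque

class Node:
    def __init__(self, name):
        self.name = name
        self.stored = deque()  # bundles persist here across disruptions

    def receive(self, bundle):
        self.stored.append(bundle)

    def try_forward(self, next_hop, link_up):
        """Forward stored bundles only while the link is up."""
        while self.stored and link_up:
            next_hop.receive(self.stored.popleft())

ground, relay, rover = Node("ground"), Node("relay"), Node("rover")
ground.receive({"payload": "command sequence", "dest": "rover"})

ground.try_forward(relay, link_up=True)  # first hop is available now
relay.try_forward(rover, link_up=False)  # second hop is down: store it
print(len(relay.stored))                 # the bundle waits at the relay
relay.try_forward(rover, link_up=True)   # forward once the link is up
print(rover.stored[0]["payload"])
```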
Robots 'R' Us
Popular Science (09/06) Vol. 269, No. 3, P. 54; Kurzweil, Ray
Futurist, inventor, and author Ray Kurzweil writes that robot technology
is advancing toward a point where the boundary between the mechanical and
biological worlds is erased. "Once this point comes--once the accelerating
pace of technological change allows us to build machines that not only
equal but surpass human intelligence--we'll see cyborgs...androids...and
other combinations beyond what we can even imagine," he projects. Despite
the utilitarian nature of modern robots, the inspiring principle and most
popular image of robotic design is a machine that is human-like in both
appearance and function. Kurzweil says understanding the human brain's
mechanisms is key to building robots with human-level intelligence, and we
are moving closer and closer to such a goal through the year-on-year
doubling of the performance/price ratio, capacity, and bandwidth of both
electronic and biological information technology. He acknowledges the
complexity of the brain while maintaining that its design principles are
comprehensible and manageable. "By applying the law of accelerating
returns to the problem of analyzing the brain's complexity, we can
reasonably forecast that there will be exhaustive models and simulations of
all several hundred regions of the human brain within about 20 years,"
Kurzweil explains. With such advances, a machine with human-level
intelligence could be realized by 2029. Kurzweil concludes that detailed
androids and robotically assisted life extension and brain enhancement will
also be possible by that time.