U.S. Loosens Its Control Over Web Address Manager
New York Times (09/30/06) P. B4; Shannon, Victoria
As part of the three-year renewal of its partnership with ICANN, the
nonprofit group that manages Internet domains, the U.S. Commerce Department
has agreed to loosen its control over ICANN and potentially release it from
government oversight altogether. The agreement grants ICANN greater
autonomy and provides for a midterm review in 18 months that could lead to
the group's full release from government oversight. Though the Internet grew in
large part out of U.S. government and university research, its growing
worldwide importance has led other countries--particularly in Asia and the
Middle East--to voice objections to the effective veto power the U.S.
government has over ICANN. The latest "joint project agreement" between
ICANN and the U.S. government is intended to be the last of six that, since
1998, have given ICANN the authority to maintain the Internet technically.
Commerce Department official John M.R. Kneuer said the goal of the latest
agreement is to enable ICANN to become a private-sector organization,
saying, "Private-sector management of the Internet is demonstrably
effective." Paul Kane, chairman of the country-code domain registry group
Centr, agreed that the marketplace, rather than a government or group of
governments, is the best way to serve the Internet's worldwide interests:
"The Internet today is run by private networks interconnecting computers
around the world," he said, adding that "it is not in the private sector's
interests to have an inefficient Internet."
NSF Awards Texas Advanced Computing Center $59 Million
for High-Performance Computing
University of Texas at Austin (09/28/06)
The University of Texas at Austin's Texas Advanced Computing Center (TACC)
and its partners at Cornell University and Arizona State University have
won a five-year, $59 million grant from the National Science Foundation
(NSF) to purchase, operate, and maintain a high-performance computing (HPC)
system with the aim of providing scientists and engineers throughout the
United States with unprecedented computing power. TACC has teamed up with
Sun Microsystems to implement a supercomputer that, in its final
configuration, will perform at more than 420 teraflops and boast more than
100 terabytes of memory and 1.7 petabytes of disk storage. "This Sun system
will enable scientific codes to achieve greater performance on vastly
larger problems, with higher resolution and accuracy, than ever before,"
comments TACC director Jay Boisseau. "It will be one of the most important
scientific instruments in the world." HPC systems are allowing scientists
to probe critical problems in almost all scientific fields, and HPC
resources have become vital to knowledge discovery in geosciences, life
sciences, engineering, and social sciences, with the results that they
generate directly affecting society and the quality of life. With HPC,
researchers can perform experiments that would be impossible under other
circumstances, such as investigating the evolution of the universe.
Grants for Advanced Computing Awarded
UC Davis News and Information (09/29/06) Fell, Andy
The U.S. Energy Department announced on Sept. 7 several grants for
advanced computing projects led by UC Davis researchers. Department of
Computer Science and Institute for Data Analysis and Visualization
professor Kwan-Liu Ma earned a $1.6 million annual grant to set up an
Institute for Ultrascale Visualization, which will develop tools to
accommodate and analyze massive amounts of supercomputer-generated data,
and educate researchers about these tools via conferences, outreach
programs, and summer schools. A $1.2 million annual grant will go to
Department of Chemistry professor Giulia Galli, whose project, "Quantum
Simulations of Materials and Nanostructures," will focus on methods to
model atomic behavior using basic quantum mechanical laws, en route to
simulating materials and chemical reactions for the purpose of gaining a
better understanding of materials' behavior under varying conditions as
well as developing new types of materials. Four other grants for projects
funded via the Energy Department's Scientific Discovery through Advanced
Computing program will involve the participation of UC Davis scientists.
Among the projects is a University of Cincinnati-led initiative called
"Modeling Materials at the Petascale," whose participants include UC Davis
computer science professor and chair Zhaojun Bai and physics professors
Sergey Savrasov and Richard Scalettar. Bai said researchers are already
considering how they will use petascale computers once such machines become
available. The
National Science Foundation sent out a call for petascale computing
development proposals in 2005. A petascale computer would be roughly 10
times as fast as BlueGene/L, the current world champion.
SC06 Conference Advance Registration Deadline Is Sunday,
Oct. 15
Business Wire (10/02/06)
Advance registration for the SC2006 conference, sponsored by ACM and IEEE
and taking place November 11-17, 2006, in Tampa, Florida, closes Sunday,
October 15, after which fees for the Technical Program registration and
Tutorials increase. The annual conference will feature a number of
sessions, lectures, and technical papers focused on issues related to
advanced computing, networking, data storage, and analysis. The slogan for
this year's conference is "Powerful Beyond Imagination." Panel
discussions will feature keynote speaker Ray Kurzweil and other industry
experts discussing high-performance computing issues. Information about
registering for the event via the Internet is available at
http://sc06.supercomp.org/registration/.
Penn State Joins International Effort to Secure Wireless,
Sensor Networks
Penn State Live (09/28/06)
The U.S. Army Research Laboratory and the United Kingdom's Ministry of
Defence have awarded as much as $135.8 million to the International
Technology Alliance in Network and Information Sciences, a 24-member
consortium. The money is designated for research efforts
focused on high-tech secure wireless and sensor networks. IBM heads the
consortium. Other participants include Klein Associates, Columbia
University, Carnegie Mellon University, the University of Maryland,
Rensselaer Polytechnic Institute, and Penn State's Networking and Security
Research Center. The consortium's research efforts will cover secure
systems, sensor information processing, and other areas. Penn State
computer science and engineering professor Thomas La Porta, director of
Penn State's center, says that research is focused on developing algorithms
and protocols for timely data transmission. He adds that the algorithms
need to work effectively when addressing the various requirements of
multiple missions. He says, "The goal of this work is to create algorithms
and protocols that ensure the required information is being delivered to
the most important applications and people in time for it to be of use.
The algorithms must consider requirements from multiple missions, each with
different information needs, importance and timeframes, and dynamically
configuring the network to gather, process and deliver the data to maximize
the utility of the network. To meet these goals, the area team will define
methods for quantifying and representing the 'quality' of information, the
requirements and importance of each mission, and algorithms for configuring
a sensor network."
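La Porta's description amounts to a utility-maximization problem: rank
competing reports by mission importance, information quality, and
timeliness, and deliver what fits the network's capacity. The following
Python sketch illustrates the general idea with a toy greedy scheduler;
the names, scales, and scheduling rule are illustrative assumptions, not
the consortium's actual algorithms.

```python
# Minimal sketch of mission-aware packet prioritization, assuming a toy
# "quality of information" (QoI) model; all names and values are
# illustrative only, not from the ITA project itself.
from dataclasses import dataclass

@dataclass
class Report:
    mission: str       # which mission the data serves
    importance: float  # mission importance (assumed 0..1 scale)
    qoi: float         # quality of the information in this packet (0..1)
    deadline: float    # seconds until the data is useless
    send_time: float   # seconds of link time needed to transmit

def schedule(reports, link_seconds):
    """Greedily pick packets maximizing utility per second of link time,
    dropping anything that cannot arrive before its deadline."""
    clock, plan = 0.0, []
    # utility density: importance-weighted QoI per unit of transmission cost
    for r in sorted(reports, key=lambda r: r.importance * r.qoi / r.send_time,
                    reverse=True):
        if clock + r.send_time <= min(r.deadline, link_seconds):
            plan.append(r)
            clock += r.send_time
    return plan

reports = [
    Report("recon",     0.9, 0.8, deadline=5.0,  send_time=2.0),
    Report("logistics", 0.4, 0.9, deadline=30.0, send_time=1.0),
    Report("recon",     0.9, 0.3, deadline=2.0,  send_time=1.5),
]
for r in schedule(reports, link_seconds=4.0):
    print(r.mission, r.qoi)
```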
ODU Researchers to Develop Software for
Supercomputers
Virginian-Pilot (10/01/06) Bowers, Matthew
A recent five-year, $7 million U.S. Department of Energy grant will allow
Alex Pothen, a computer science professor at Old Dominion University, and
Assefaw Gebremedhin, a research scientist at ODU, to establish a research
institute called the Combinatorial Scientific Computing and Petascale
Simulations Institute (CSCAPES). Pothen, as the institute's principal
investigator, along with Gebremedhin and fellow researcher Florin Dobrian,
will develop software in conjunction with national laboratories in New
Mexico and Illinois and with Ohio State and Colorado State universities.
The software will allow scientists to take advantage of the growing power
of the new generation of computers, some of which can handle nearly a third
of a quadrillion calculations per second. Existing programs only make it
possible to use a small percentage of a computer's maximum possible
performance. The scientists hope to solve a variety of problems, such as
linking machines, breaking bottlenecks in obtaining data, and figuring out
which operations can take place simultaneously to save time and which must
follow one another. The scientists hope that in doing so they will make it
possible for "domain scientists"--chemists, physicists, and others--to
achieve breakthroughs in complex areas that require an enormous amount of
computing power, such as environmental decontamination and global
warming.
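One classic combinatorial formulation of "which operations can take place
simultaneously" is graph coloring: operations that conflict over data are
joined by an edge, and a coloring groups non-conflicting operations into
rounds that can run in parallel. The greedy heuristic and toy graph below
are a generic illustration of that idea, not CSCAPES software.

```python
# Greedy coloring of a conflict graph: operations with the same color
# (round number) do not conflict and may run concurrently. Illustrative
# only; the graph and heuristic are invented examples, not CSCAPES code.

def greedy_coloring(conflicts):
    """conflicts: dict mapping each operation to the set of operations it
    cannot run alongside. Returns {operation: round_number}."""
    color = {}
    for op in sorted(conflicts, key=lambda v: len(conflicts[v]), reverse=True):
        taken = {color[nbr] for nbr in conflicts[op] if nbr in color}
        c = 0
        while c in taken:  # smallest round not used by a conflicting op
            c += 1
        color[op] = c
    return color

# Toy conflicts among five operations:
conflicts = {
    "A": {"B", "C"}, "B": {"A", "C"}, "C": {"A", "B", "D"},
    "D": {"C"}, "E": set(),
}
print(greedy_coloring(conflicts))  # same-numbered operations can run together
```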
Google Researcher Speaks on Company's Latest
Innovations
Daily Californian (09/26/06)
Google's latest advances in machine learning and information extraction
were the topic of discussion during a recent talk at UC Berkeley. Peter
Norvig, director of machine learning, search quality, and research at
Google, talked about Statistical Machine Translation, a computer
translation program that Google is developing with hopes of improving the
accuracy of translation and giving it more human-like qualities. Norvig,
an alumnus of UC Berkeley, said the translation program has the potential
to give users the ability to take greater advantage of large amounts of
data. Google also wants users to have the ability to type in a few
different words and receive a list of several related words, and Norvig
said the company believes these "sets" will help improve the accuracy of
searches. Another focus of Google is user trend graphs, which can be used
to follow the volume of different searches throughout the year. Google
views user trend graphs as helping to provide a better understanding of
users, said Norvig. "This is only an idea of new things we are working on,
and the ways in which technology can be used," said Norvig. "There is so
much data and there are so many things you can do with it."
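The "sets" idea Norvig describes, expanding a few seed words into a list
of related terms, can be illustrated with a simple co-occurrence
heuristic. Google's actual method is not described in the talk summary;
the tiny corpus and scoring below are purely illustrative assumptions.

```python
# Toy set expansion by co-occurrence: rank words by how often they share a
# context (here, a line) with the seed words. Not Google's method.
from collections import Counter

corpus = [
    "red green blue yellow",
    "red blue purple",
    "dog cat hamster",
    "green blue cyan",
]

def expand(seeds, corpus, k=3):
    scores = Counter()
    for line in corpus:
        words = set(line.split())
        hits = len(words & set(seeds))
        if hits:
            for w in words - set(seeds):
                scores[w] += hits  # reward contexts containing more seeds
    return [w for w, _ in scores.most_common(k)]

print(expand({"red", "green"}, corpus))  # e.g. ['blue', 'yellow', 'purple']
```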
New Models to Improve the Reliability of Virtual
Organizations
University of Southampton (ECS) (09/29/06) Lewis, Joyce
Researchers at the University of Southampton are working on models that
will help improve the reliability and trustworthiness of virtual
organizations. Such organizations consist of members who are
geographically separated--frequently linked by computer networking--but are
able to give the outward appearance of being single unified organizations
with an actual physical location. The increasing prevalence of virtual
organizations with computerized agents acting on companies' behalf is
making it more important to ensure that the computerized agents behave
responsibly, said Prof. Michael Luck. Luck and his team have been working
with Cardiff University, the University of Aberdeen, and British Telecom on
a project called Grid-enabled Constraint-Oriented Negotiation in an Open
Information Services Environment, or CONOISE-G. "The trustworthiness and
reputation of agents are significant issues, especially in the context of
virtual organizations in which the agents must rely on each other to ensure
coherent and effective behavior," says Luck, adding that there has been
little work in this field thus far. The researchers are working to
implement a prototype system that examines trust and reputation,
standardizes communication, and polices behavior within the virtual
organization.
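Trust and reputation mechanisms of the kind Luck describes are often
modeled with simple evidence-counting updates. The sketch below uses the
well-known beta reputation approach, counting good and bad interaction
outcomes; it illustrates the general idea only and is not the CONOISE-G
project's model.

```python
# Minimal agent-reputation sketch using the classic beta reputation idea:
# estimate trustworthiness as (good + 1) / (good + bad + 2). A standard
# textbook scheme, not the CONOISE-G project's actual model.
class Reputation:
    def __init__(self):
        self.good = 0
        self.bad = 0

    def record(self, outcome_ok: bool):
        if outcome_ok:
            self.good += 1
        else:
            self.bad += 1

    def score(self) -> float:
        # Expected value of a Beta(good + 1, bad + 1) distribution
        return (self.good + 1) / (self.good + self.bad + 2)

supplier = Reputation()
for outcome in [True, True, False, True]:
    supplier.record(outcome)
print(f"trust in supplier: {supplier.score():.2f}")  # 0.67 after 3 good, 1 bad
```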
Engaging SE Asian Research Talent in European ICT
Research
IST Results (10/02/06)
Until recently, participation of countries such as Malaysia, Singapore,
Taiwan, and Thailand in the IST program has been low, but this problem is
being addressed through initiatives such as the completed GAPFILL project,
which spurred research groups in southeast Asian nations to start and
contribute to more IST efforts through national promotional events, help
services, and information Web sites. "European companies in IST consortia
can certainly benefit from having southeast Asian participants, which bring
not only technological skills but also know local political conditions and
markets very well," notes Roger Torrenti, director of Sigma Consultants,
the GAPFILL project coordinator. Malaysia's MIMOS Berhad, Thailand's National
Science and Technology Development Agency, Singapore's Institute for
Infocomm Research, and Taiwan's National Science Council are among the
national research agencies that made up the GAPFILL project consortium.
GAPFILL coordinated major cooperation events in each nation, featuring
presentations, exhibition spaces for European delegations to confer with
local organizations, technical visits, and informal social events; events
in Singapore, Thailand, Taiwan, and Malaysia were supported by dedicated
Web sites. "The success of these events far exceeded our expectations,
with substantial national media coverage and strong political support from
the science and technology ministries," reports Torrenti. GAPFILL
established help desk services with trained personnel in the four countries
and Europe to aid groups with the development and submission of joint
projects to the IST program. Torrenti estimates that 200 cooperative
research efforts have been identified and bolstered through GAPFILL.
Torrenti says GAPFILL has surpassed expectations both in generating
interest and potential for cooperation in Europe and southeast Asia and in
developing bilateral cooperation.
What a Lot of Bots
The Engineer (09/18/06)
The Swarmanoid project is an effort sponsored by the European Union to
build a distributed robotic system that can operate effectively in
human-made indoor environments. "The name Swarmanoid comes from the idea that this type of
robot is intended to take a different approach to the construction of
robots, rather than creating humanoid robots," commented project leader Dr.
Marco Dorigo of Belgium's Universite Libre de Bruxelles. "Although they
will have a shape that is not reminiscent of human beings, these will be
able to act effectively in human-made environments." The project involves
60 dynamically linked small autonomous robots that come in three varieties:
eye-bots, hand-bots, and foot-bots. The three types will operate in
conjunction to create a heterogeneous robotic system that is functional in
three dimensions. The eye-bots will be tasked with observation and
environmental analysis while clinging to the ceiling and transmitting data
to the hand-bots, which will climb walls and other vertical surfaces, and
the foot-bots, which will negotiate uneven terrain and transport materials,
including other robots. The robot swarm will be designed to organize into
specific shapes to tackle certain problems. The Swarmanoid initiative will
not just require the construction of robots, but also the development of
distributed algorithms to ascertain the swarmanoid's actions and a
communications architecture to enable system control.
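Distributed control of the kind the project requires is often built from
simple local rules. The sketch below shows one textbook example,
aggregation toward the centroid of sensed neighbors; it is a generic
swarm-robotics illustration, not Swarmanoid's actual algorithms.

```python
# Generic swarm-aggregation sketch: each robot moves a small step toward
# the centroid of neighbors within a fixed sensing radius. A toy
# illustration of distributed swarm control, not Swarmanoid's algorithms.
import math

def step(positions, radius=2.0, gain=0.1):
    new = []
    for i, (x, y) in enumerate(positions):
        nbrs = [(px, py) for j, (px, py) in enumerate(positions)
                if j != i and math.hypot(px - x, py - y) <= radius]
        if nbrs:
            cx = sum(p[0] for p in nbrs) / len(nbrs)
            cy = sum(p[1] for p in nbrs) / len(nbrs)
            # move a fraction of the way toward the local centroid
            x, y = x + gain * (cx - x), y + gain * (cy - y)
        new.append((x, y))
    return new

bots = [(0.0, 0.0), (1.0, 0.5), (1.5, -0.5), (9.0, 9.0)]  # last bot isolated
for _ in range(50):
    bots = step(bots)
print([(round(x, 2), round(y, 2)) for x, y in bots])  # first three converge
```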
New Models to Improve Reliability of Virtual
Organizations
Innovations Report (09/28/06) Murphy, Helene
The Grid-enabled Constraint-Oriented Negotiation in an Open Information
Services Environment (CONOISE-G) project is expected this month to complete
its research into the reliability and trustworthiness of people who are
part of virtual organizations. Michael Luck, a professor in the School of
Electronics and Computer Science at the University of Southampton, says
there will be a need to know that computerized agents are behaving
responsibly as virtual organizations and agents grow. Southampton is
participating in CONOISE-G, along with Cardiff University, the University
of Aberdeen, and British Telecom. The researchers have developed models of
how virtual organizations form and operate, and they are implementing the
system, standardizing communication between agents, and policing behavior
within a virtual organization. "Only limited work has been carried out in this area so far,
with the majority of developers adopting the stance of complete trust,"
says Luck. "This, however, avoids the complex issues which are crucial for
the reliability and dependability of these systems and which our research
aims to address directly."
Entanglement Unties a Tough Quantum Computing
Problem
USC Viterbi School of Engineering (09/28/06) Mankin, Eric
Including entangled photons within the message stream can clear the way
for error correction coding, an important breakthrough in quantum
computing, a trio of USC Viterbi School of Engineering theorists report in
Science. "This method allows the use of highly efficient turbo
codes, operating close to the theoretical limits of efficiency, something
never before possible," proclaims lead author on the study and electrical
engineering professor Todd Brun. Error codes are a critical requirement
for quantum computing systems, which process quantum data carried on single
light particles (photons), but Brun notes that not all measurements can be
executed concurrently in quantum mechanics. "When most classical error
correction codes are translated into quantum codes, it is no longer
possible to measure all of their syndromes; measuring some of the error
syndromes disrupts the measurement of others," he explains. The technique
outlined by Brun and colleagues is to blend entangled and normal photons
together, and use the entanglement property that allows two measurements
that would be incompatible on a single quantum bit (qubit) to sometimes be
performed by measuring both halves of an entangled pair. The USC
researchers are attempting to determine the best combination of entangled
and normal photons for optimal error coding performance.
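The quantum-mechanical fact behind the approach, that two measurements
incompatible on a single qubit can become compatible when performed
jointly across an entangled pair, can be checked numerically. The snippet
below verifies with NumPy that the Pauli X and Z observables fail to
commute on one qubit while X⊗X and Z⊗Z commute on two; it illustrates the
underlying math only, not the USC coding scheme itself.

```python
# Check the commutation fact behind entanglement-assisted error correction:
# X and Z do not commute on a single qubit, but the joint two-qubit
# observables X⊗X and Z⊗Z do commute, so both can be measured at once
# across the halves of an entangled pair.
import numpy as np

X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

def commutator(a, b):
    return a @ b - b @ a

print("single qubit [X, Z] == 0 ?",
      np.allclose(commutator(X, Z), 0))                          # False
print("two qubits [X⊗X, Z⊗Z] == 0 ?",
      np.allclose(commutator(np.kron(X, X), np.kron(Z, Z)), 0))  # True
```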
'Tower of Babel' Technology Nears
BBC News (09/27/06)
The development of software defined radio (SDR) "Tower of Babel"
technology that will allow a single wireless product to understand any kind
of radio wave signal through the use of software found on the device was
discussed at the International Conference on Telecommunications and
Computers at the University of Portsmouth in the U.K. Currently, most
devices rely on hardware to convert analog signals into digital format. But
European space firm EADS-Astrium has developed software-based technology
for military purposes capable of picking up radio signals passing through
the airwaves. The next step would be "cognitive" radio technology that
has SDR capabilities and can also detect and utilize unused bandwidth. The
University of Portsmouth's Dr. David Ndzi says, "SDR is what one could call
a Tower of Babel-type technology, in that wireless devices that previously
understood only one or a few languages, or standards, will suddenly be able
to talk to each other freely regardless of frequency or conflicting
protocols." Computing power and the ability to quickly convert analog
radio waves into digital code are the two limitations holding SDR back
right now, says EADS-Astrium's Francis Kinsella, but he says, "we have
advances in both those areas that could really mean an explosion in the
next five to 10 years for SDR."
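The shift Ndzi and Kinsella describe, moving signal processing from
dedicated hardware into software, can be illustrated with a toy example.
The NumPy sketch below generates an AM signal and recovers the audio
entirely in code; the sample rate, frequencies, and envelope-detector
design are illustrative choices, not anything from EADS-Astrium's system.

```python
# Toy illustration of the "software" in software defined radio: synthesize
# an AM signal, then demodulate it in software (rectify + smooth) instead
# of with dedicated hardware. All parameters are arbitrary examples.
import numpy as np

fs = 100_000                     # sample rate (Hz), assumed example value
t = np.arange(0, 0.02, 1 / fs)  # 20 ms of signal
message = np.sin(2 * np.pi * 440 * t)      # 440 Hz audio tone
carrier = np.cos(2 * np.pi * 10_000 * t)   # 10 kHz carrier
am = (1 + 0.5 * message) * carrier         # AM modulation, depth 0.5

# Software envelope detector: full-wave rectify, then moving-average lowpass
rectified = np.abs(am)
width = fs // 5000                          # ~0.2 ms smoothing window
envelope = np.convolve(rectified, np.ones(width) / width, mode="same")
recovered = envelope - envelope.mean()      # strip the DC offset

# The recovered waveform should track the original 440 Hz message tone
corr = np.corrcoef(recovered, message)[0, 1]
print(f"correlation with original message: {corr:.2f}")
```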
Garcia Looks to Raise Cybersecurity's Profile
Government Computer News (09/25/06) Vol. 25, No. 29, Wait, Patience
After a two-year vacancy, Greg Garcia has been appointed the new assistant
secretary for cybersecurity and telecommunications at the Department of
Homeland Security (DHS). Garcia will be the first person ever to hold the
position and seeks to increase the level of awareness of IT security.
Garcia is also the vice president for information security programs at the
Information Technology Association of America (ITAA) and has been with the
group since 2003. "I think they picked the right guy," says Joe Tasker at
ITAA. "This is his forte, translating real, hard-core technology into
policy." Cyber Security Industry Alliance executive director Paul Kurtz
also believes that Garcia was a good choice. He says, "Greg is a solid
pick for the position. He knows information security issues and has good
connections in the private sector. He is also earnest and focused. This
combination, with consistent senior support within DHS, will enable DHS to
move forward on critical information security issues." Former DHS National
Cyber Security Division director Amit Yoran and other former cybersecurity
officials, including Richard Clarke and Howard Schmidt, have emphasized the
need to raise the profile of cybersecurity in the administration.
Energy Controlled Reporting for Industrial Monitoring
Wireless Sensor Networks
University of Southampton (ECS) (09/25/06) Merrett, Geoff V.; Harris, Nick
R.; Al-Hashimi, Bashir M.
A group of University of Southampton School of Electronics and Computer
Science researchers propose a method to add longevity to wireless sensor
networks (WSNs) by combining energy management and information control.
The Information manageD Energy Aware aLgorithm for Sensor networks with
Rule Managed Reporting (IDEALS/RMR) technique is set up so that each sensor
node locally decides the degree of its individual network engagement by
balancing available energy resources with each packet's data content. The
content is determined via a series of rules that characterize potential
events in the sensed environment and specify when reporting should occur
as well as the importance of each packet. The rules include
threshold, differential, feature, periodic, and routine rules, and these
rules are also assigned a message priority tied to event importance. The
researchers simulated IDEALS/RMR's use in an industrial WSN monitoring a
water pumping station, and found that the network's lifetime and
connectivity can be significantly increased. Furthermore, sustained
operation can be achieved when the technique is paired with energy
harvesting.
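The balancing act the researchers describe can be sketched as a simple
policy: rules assign each reading a message priority, and the node's
remaining energy sets the minimum priority it is willing to spend energy
transmitting. The rule set, priority scale, and thresholds below are
invented for illustration and do not come from the IDEALS/RMR paper.

```python
# Illustrative energy-aware reporting policy in the spirit of IDEALS/RMR:
# rules assign each reading a priority, and the node only transmits
# packets whose priority clears a bar that rises as energy drains.
# All rules, scales, and thresholds here are invented for illustration.

def classify(reading, previous, threshold=80.0, delta=5.0):
    """Toy rule set: a threshold rule and a differential rule assign high
    priorities; everything else is a low-priority routine report."""
    if reading >= threshold:
        return 3                      # threshold rule: alarm-level event
    if previous is not None and abs(reading - previous) >= delta:
        return 2                      # differential rule: rapid change
    return 1                          # routine rule: periodic housekeeping

def min_priority(energy_fraction):
    """The emptier the battery, the more important a packet must be."""
    if energy_fraction > 0.5:
        return 1                      # plenty of energy: report everything
    if energy_fraction > 0.2:
        return 2                      # conserve: drop routine traffic
    return 3                          # critical: alarms only

readings = [20.0, 21.0, 30.0, 85.0, 86.0]
energy, previous = 0.30, None
for r in readings:
    p = classify(r, previous)
    if p >= min_priority(energy):
        print(f"send reading {r} (priority {p})")
        energy -= 0.02                # assumed per-transmission energy cost
    previous = r
```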
Books Without Boundaries: A Brief Tour of the System-Wide
Print Book Collection
Ubiquity (09/25/06) Vol. 7, No. 37, Lavoie, Brian F.; Schonfeld, Roger C.
As libraries continue to adapt to the increasingly networked digital age,
print collections will see substantial transformations, with the bulk of
the change coming at the system-wide level. Isolated library units will
become aggregated into the combined collections of multiple libraries
variously defined at the state, regional, or nationwide levels. As
large-scale digitization projects such as the Open Content Alliance and
Google Print take shape, aggregation could even come to encompass all
libraries everywhere. Naturally, this will require librarians to adopt a
system-wide perspective when making retention and preservation decisions to
avoid unnecessary duplication. Though the forces driving this system-wide
perspective have been underway for some time, the data required to support
informed system-wide policy decisions have been largely unavailable. The
closest existing approximation of the ideal, totally system-wide collection
is OCLC's WorldCat database, which contains some 60 million bibliographic
records encompassing the collections of more than 20,000 institutions
around the world. When assessing figures such as WorldCat's total, it is
important to remember that a single work can have multiple manifestations,
or physical embodiments of the work. In the case of WorldCat's system-wide
collection, there are 32 million print book manifestations of 26 million
works. Particularly in the age of digitization, collection overlap is a
key concern for the system-wide library. A sampling of the WorldCat
database revealed the difficulty of determining the uniqueness of
individual works, owing to the limited data available. More than half the
resources in the system-wide collection were published in English, while no
sub-continental languages such as Urdu or Hindi made it into the top 25. In
all likelihood, there are millions of books, all of them out of copyright,
that are missing from the system-wide collection, though arriving at exact
figures for the so-called "book gap" is problematic as well. In the end,
to arrive at a satisfactory understanding of the preservation requirements
of a system-wide collection, librarians will need better data to learn more
about rare and unique titles.
Connecting the Dots
American Scientist (10/06) Vol. 94, No. 4, P. 400; Hayes, Brian
Brian Hayes thinks a multidisciplinary focus on mathematics and social
networking could perhaps give intelligence analysts the means to make
reasonable assumptions about terrorist conspiracies based on surveillance
data. Studies of the terrorist groups connected to major incidents such as
9/11 and the 2004 Madrid bombing uncovered patterns very similar to the
prevailing social network structure suggested by Stanford University's Mark
Granovetter, in which clusters or cliques tightly bound internally by
strong ties between close members are loosely linked to each other by weak
ties between casual acquaintances. Terrorist cells were found to consist
of several dense clusters of strongly linked nodes that communicate with
each other only through comparatively loose and inconsistent couplings.
The National Security Agency (NSA), meanwhile, appears to be applying graph
theory to the analysis of a telephone call database in order to trace links
among terrorist plotters, using only the phone numbers at the two ends of
each call and the date and time of a call's beginning and end. News
reports indicate that the NSA could be hoping to use the call graph to
unmask plots without any previous guidance simply by sifting the archive
for "patterns that might point to suspects." Such patterns, in conformance
with social-network theory, would be dense subgraphs that exist in
comparative isolation from their environment. Hayes tentatively concludes
that the computational power necessary for analyzing call graphs is readily
available, but that the main challenge is to enable
algorithms to "somehow distinguish a few dozen people intent on mayhem from
other groups of the same size and structure who are planning a family
reunion, canvassing the neighborhood for a lost cat, running for city
council or war-dialing to win free concert tickets from a radio
station."
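The "dense subgraph in comparative isolation" Hayes describes can be made
concrete with a standard heuristic from graph theory. The sketch below
applies the well-known greedy peeling method (a 2-approximation for the
densest-subgraph problem) to a toy call graph; it is a generic textbook
illustration, not a reconstruction of NSA analysis.

```python
# Find an unusually dense subgraph by greedy "peeling": repeatedly remove
# the lowest-degree node, remembering the densest subgraph seen. The call
# graph is a toy example; nothing here reflects actual NSA techniques.
def densest_subgraph(edges):
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    nodes = set(adj)
    m = len(edges)
    best, best_density = set(nodes), m / len(nodes)
    while len(nodes) > 1:
        u = min(nodes, key=lambda n: len(adj[n]))   # lowest-degree node
        m -= len(adj[u])                            # drop its edges
        for v in adj[u]:
            adj[v].discard(u)
        nodes.discard(u)
        del adj[u]
        density = m / len(nodes)                    # edges per remaining node
        if density > best_density:
            best, best_density = set(nodes), density
    return best, best_density

# A tight 4-clique (a..d) loosely attached to a sparse fringe (e..g):
calls = [("a","b"), ("a","c"), ("a","d"), ("b","c"), ("b","d"), ("c","d"),
         ("d","e"), ("e","f"), ("f","g")]
print(densest_subgraph(calls))  # finds the clique {a, b, c, d}, density 1.5
```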