Swiss Multicore Project Wins Microsoft Grant
EE Times Europe (07/30/08) Holland, Colin
A project at the Swiss Federal Institute of Technology (ETH Zurich) is one
of seven academic research projects that will share a $1.5 million
Microsoft External Research grant as part of the Safe and Scalable
Multicore Computing request for proposal (RFP). Microsoft's RFP is
intended to stimulate and enable bold, substantial research in multicore
software that re-examines the relationships between computer architecture,
operating systems, runtimes, compilers, and applications. The goal is to
propose new mechanisms and paradigms that will result in safe and scalable
concurrent systems and applications, with a focus on mainstream client
platforms. The ETH Zurich project, Reliable and Efficient Concurrent
Object-Oriented Programs (RECOOP), is the only non-U.S. project to receive
a grant. RECOOP's goal is to develop a practical formal semantics and
proof mechanism to enable programmers to reason abstractly about concurrent
programs and allow proofs of formal properties of these programs. RECOOP
researchers hope to enable precise reasoning on concurrent programs at a
level of abstraction comparable with what is possible on sequential
programs using modern languages and programming techniques. Microsoft says
integrating transactions into the design and implementation of modern
programming languages is surprisingly difficult, and the goal of the
research project is to remove such difficulties through advancements in
language semantics, compilers, runtime systems, and performance
evaluation.
House S&T Committee Reviews Federal IT R&D Program
Computing Research Association (08/01/08) Harsha, Peter
The federal Networking and Information Technology Research and Development
(NITRD) program was recently reviewed by the House Science and Technology
Committee. Committee Chairman Rep. Bart Gordon (D-Tenn.) said the program
"has made a substantial contribution to moving computation to an equal
place alongside theory and experiment for conducting research in science
and engineering" as well as playing a key role in the development of "the
computing and networking infrastructure needed to support leading edge
research and to drive the technology forward for a range of commercial
applications that benefit society broadly." No witness or member of
Congress challenged recommendations that the President's Council of
Advisors on Science and Technology (PCAST) issued in 2007, and there was
broad consensus that NITRD must improve its interagency planning. The
Computing Research Association's Dan Reed testified that the withdrawal of
the Defense Advanced Research Projects Agency's support for a sizable
portion of university computer science research represented a major loss of
what made the federal portfolio for IT R&D so successful--namely the
diversity of funding models and mission requirements. Gordon was
especially interested in how the NITRD program could comply with PCAST's
recommendation that it be rebalanced to stress more high-risk, long-range
research, with Reed contending that researchers should entertain more
adventurous thinking in their research proposals while reviewers ought to
be willing to reward proposals that are high-risk but potentially
high-payoff. Members' queries frequently touched on the issue of
cybersecurity, partly sparked by talks about the ubiquity of computer
devices and people's expanding access to them. There was agreement among
panelists that a lot of research must be performed on understanding how to
shield cyber-physical systems.
Study in CACM August Issue Finds Wikipedia Faces No
Limits to Growth
AScribe Newswire (07/31/08)
A study published in the August 2008 issue of Communications of the ACM
found that Wikipedia is likely to continue to grow while maintaining its
usability. The study's authors, Diomidis Spinellis and Panagiotis Louridas
from the Athens University of Economics and Business, identified a growth
pattern called preferential attachment; theirs is the first study to
observe this pattern live on a structure the size of Wikipedia.
The study considered two possible growth patterns for Wikipedia, which
currently contains 385 Gbytes of data. Either new concepts are added
without having corresponding articles, or the number of new concepts will
grow more slowly than the number of articles. In the first scenario,
Wikipedia's coverage will deteriorate as articles are drowned in an
increasing number of undefined concepts, while in the second case
Wikipedia's growth could stall. However, the study found that Wikipedia
actually fits in between the two extremes. The researchers found that the
ratio of undefined to defined concepts in Wikipedia remained stable over
time, and that articles are added to Wikipedia in a collaborative fashion,
with contributors often adding a new article when they encounter a missing
entry. The researchers concluded that Wikipedia grows by having new
articles linked to the most popular existing articles. The preferential
attachment growth pattern seen in Wikipedia has also been used to explain
the number of species per genus, the Internet, scientific citations, and
collaboration networks between people.
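The preferential-attachment process the study describes can be illustrated with a toy simulation (a minimal sketch, not the authors' actual model; the function name and parameters here are invented for illustration):

```python
import random

def grow_network(steps, m=2, seed=42):
    """Simulate preferential attachment: each new article links to m
    existing articles chosen with probability proportional to their
    current number of incoming links (plus one, so brand-new articles
    can be chosen at all)."""
    rng = random.Random(seed)
    in_links = [0]  # incoming-link count per article; start with article 0
    pool = [0]      # sampling pool: article i appears (in_links[i] + 1) times
    for new in range(1, steps + 1):
        chosen = set()
        while len(chosen) < min(m, new):
            chosen.add(rng.choice(pool))
        in_links.append(0)
        pool.append(new)
        for t in chosen:
            in_links[t] += 1
            pool.append(t)
    return in_links

links = grow_network(5000)
# A few early articles accumulate far more links than the median
# article -- the heavy-tailed "rich get richer" signature.
print(max(links), sorted(links)[len(links) // 2])
```

With a fixed seed the run is deterministic; the maximum in-degree ends up far above the median, which is the signature of preferential attachment.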
IETF Tackling P2P Data Traffic
Heise Online (Germany) (07/31/08) Ermert, Monika
An Internet Engineering Task Force (IETF) working group is searching for
ways to make data traffic on peer-to-peer (P2P) networks more effective.
At a recent developers' meeting in Dublin, Ireland, IETF's Jon Peterson
warned that the problem of P2P traffic needs to be handled. Internet
service providers say that 50 percent to 80 percent of all data traffic
comes from P2P networks. The focus of IETF's efforts is a mechanism that
can respond to a P2P user query for the best P2P node, which should solve
the problem of having the P2P client use an indirect route to get to a
server with the desired content. For example, the mechanism would prevent
a client in Dublin from downloading content from a server in Tokyo when the
same content is available on a server in London. Other groups working on
solutions include Yale University's P4P working group, and a European
Union-supported research project working on a solution called Network-Aware
P2P TV Application Over Wise Networks. IETF says the first step should be
standardizing the interface between P2P applications and the external
server that specifies which is the closest, most powerful, or fastest P2P
server.
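The interface the IETF group wants to standardize amounts to letting a client ask an ISP-provided oracle to rank candidate peers. A toy sketch of such an oracle follows; the cost values, region names, and function name are all invented for illustration:

```python
# Lower cost = closer or cheaper, from the ISP's point of view.
COST = {
    ("dublin", "london"): 1,
    ("dublin", "frankfurt"): 2,
    ("dublin", "tokyo"): 9,
}

def best_peer(client_region, candidates):
    """Return the candidate peer with the lowest ISP-reported cost;
    unknown region pairs get a large default cost."""
    return min(candidates,
               key=lambda peer: COST.get((client_region, peer["region"]), 99))

peers = [{"host": "p1", "region": "tokyo"},
         {"host": "p2", "region": "london"},
         {"host": "p3", "region": "frankfurt"}]
print(best_peer("dublin", peers)["host"])  # → p2
```

The point of standardization is exactly this query/response boundary: the P2P application supplies candidates, and the external server supplies the ranking.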
San Diego Supercomputer Center Director Urges Academia to
Make Cyberinfrastructure 'Real'
UCSD News (07/29/08) Zverina, Jan
University of California San Diego Supercomputer Center director Fran
Berman says creating a cyberinfrastructure (CI) is essential for future
research advancement and discovery. "Fundamental to modern research,
education, work, and life, CI has the potential to overcome the barriers of
geography, time, and individual capability to create new paradigms and
approaches, to catalyze invention, innovation, and discovery, and to deepen
our understanding of the world around us," Berman says. She says the
challenge is that in the research and education community, CI is
simultaneously a work-in-progress and a stable infrastructure driver for
invention and innovation, and the academic community struggles to provision
and sustain broad-use community CI within a traditional academic framework.
Finding a solution will involve a paradigm shift in how academics think
about designing, evolving, provisioning, and learning about CI, as well as
partnerships between academics, the government, and the private sector,
Berman says. Meanwhile, forward-looking academic institutions are
launching CI-based initiatives in the hopes of advancing their own research
capabilities and educational expertise. "Both research and education
initiatives will be critical to ensuring that the academic community can
conduct 21st century research and education with 21st century tools and
infrastructure," Berman says.
Experimental Networking Testbed Gets Bandwidth
Government Computer News (07/30/08) Jackson, William
The Internet2 consortium will provide a dedicated 10Gbps circuit to
support the Global Environment for Network Innovation's (GENI's) research
efforts. GENI aims to provide researchers with an environment to
experiment with networking problems and protocols without being hindered by
production networks and the requirements of real users. The Internet2's
nationwide, high-performance network for the education and research
communities was overhauled last year to increase its capacity tenfold to
100Gbps. The 10Gbps circuit being dedicated to the GENI effort will allow
GENI subcontractors and developers to access the circuit at every
connection point on the network to foster nationwide collaboration.
Internet2 CEO Douglas Van Houweling says the donation gives Internet2 an
opportunity to use its new capacity to support cutting-edge research in the
development of future generations of the Internet.
Mother Earth Gets Undressed
Nature (07/31/08) Charles, Katrina
Computer scientists and geologists backed by the United Nations'
International Year of Planet Earth have created the first digital
geological map of the globe. OneGeology integrated national and regional
geological maps from across the world to create the database. The
geological maps can be accessed for free, and will give users a chance to
see what rocks under their feet look like, as cities, forests, and soil are
stripped away from the maps. They will be able to see the various colors
of different types of rock on the maps, which could be used to find areas
for extracting minerals, supplies of water, and offshore territorial
boundaries. "Knowledge of the rocks that we all live on has become
increasingly important, and sharing knowledge at a time of global
environmental change is crucial," says the British Geological Survey's Ian
Jackson. OneGeology also helped accelerate the development of a new Web
language, Geoscience Markup Language, so that countries would be able to
share data and make it freely accessible.
IBM Software Acts as Human Memory Backup
Computerworld (07/31/08) Gaudin, Sharon
IBM researchers have developed Pensieve, software that helps people keep
track of what's happened in their lives. Pensieve uses images, sounds, and
text recorded on mobile devices to help people recall names, faces,
conversations, and events. The software collects and organizes pieces of
information, stores them, and helps the user extract the information later.
"Today, we're flooded with information. It's an information overload and
we're not capable of handling it," says IBM project leader Eran Belinsky.
"This would relieve us from the anxiousness or need to try to remember
everything." Pensieve is similar to the MyLifeBits project being
conducted by Microsoft researcher Gordon Bell, who is also developing a way
for people to remember different aspects of their lives. Bell is trying to
store his life on a laptop by collecting telephone conversations, music,
lectures, books he's written and read, and photographs. Belinsky says
hundreds of IBM employees are currently testing the software.
Academics to Get a Glimpse of Microsoft's Sphere
CNet (07/28/08)
Microsoft plans to display its spherical surface computer at its annual
Faculty Summit. The university researchers attending the event will be
among the first people outside of Microsoft Research to see Sphere, a
sphere-shaped, multitouch computer that is similar to the company's
tabletop Surface computer. A projector is used to beam the "screen" onto a
globe-like display, and input is picked up via infrared cameras. Gaming
and mapping are among the applications that benefit from the spherical
shape of the computer. For example, multiple users would be able to
rotate, stretch, and move pictures while using a photo-sharing application.
Sphere is largely the work of surface computing expert Hrvoje Benko.
Microsoft has not announced any commercial plans for Sphere, but
Microsoft's Bill Gates says surface computing could have a place in the
home and office.
CSIRO Develops Technology That Goes Where GPS
Can't
Computerworld Australia (07/31/08) Hendry, Andrew
Australia's Commonwealth Scientific and Industrial Research Organisation
(CSIRO) has developed a new wireless localization system that can track,
sense, and communicate in areas where global positioning systems (GPSs) and
other wireless technologies do not work. The terrestrial localization
system has received a grant to commercialize the technology for use by
Australia's emergency services. The technology will allow first-response
emergency workers to be tracked through dangerous environments such as
collapsed buildings or underground mines where other technologies may not
work. GPS only works outdoors where an adequate signal can be
received, meaning that canyons, cliffs, caves, dense urban areas, and
underground environments are often beyond its reach. GPS also depends on
satellite infrastructure operated by the U.S. Department of Defense, and is
unavailable wherever those signals cannot penetrate. CSIRO scientist Mark
Hedley says the system is based on radio frequency tracking technologies
that use a series of nodes placed in an environment, in addition to nodes
attached to emergency workers. Hedley says the system measures the
distance between the fixed nodes and the node attached to the emergency
worker. Using the measurement of the radio signals between the tags and
the command and control centers, the system can determine the location of
the emergency workers, as well as other useful information.
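The range-based positioning Hedley describes can be sketched as classic multilateration: given measured distances to fixed nodes at known positions, linearize the range equations and solve a least-squares system. This is a minimal 2-D illustration with invented coordinates, not CSIRO's actual algorithm:

```python
def locate(anchors, dists):
    """Estimate (x, y) from distances to fixed anchors by subtracting
    the first anchor's range equation from each of the others, which
    yields linear equations 2(xi-x0)x + 2(yi-y0)y = rhs_i."""
    (x0, y0), d0 = anchors[0], dists[0]
    rows, rhs = [], []
    for (xi, yi), di in zip(anchors[1:], dists[1:]):
        rows.append((2 * (xi - x0), 2 * (yi - y0)))
        rhs.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    # Solve the 2x2 normal equations by hand (no external libraries).
    a11 = sum(r[0] * r[0] for r in rows)
    a12 = sum(r[0] * r[1] for r in rows)
    a22 = sum(r[1] * r[1] for r in rows)
    b1 = sum(r[0] * v for r, v in zip(rows, rhs))
    b2 = sum(r[1] * v for r, v in zip(rows, rhs))
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Four fixed nodes at the corners of a 10x10 area; a worker at (3, 4).
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
true = (3.0, 4.0)
dists = [((ax - true[0])**2 + (ay - true[1])**2) ** 0.5 for ax, ay in anchors]
print(locate(anchors, dists))  # → approximately (3.0, 4.0)
```

In a real deployment the measured ranges are noisy, so more anchors and the same least-squares machinery yield a best-fit position rather than an exact one.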
Stream Computing Helps Monitor Sick Infants
Computerworld Canada (07/25/08) Lau, Kathleen
A University of Ontario Institute of Technology (UOIT) research project
will use advanced stream computing software developed by IBM to help
doctors detect subtle changes in critically ill premature babies. The
software is intended to help medical professionals make better decisions by
monitoring physiological data such as respiration, heart rate, and body
temperature, as well as environmental data gathered from sensors, and
combining the data with information collected through traditional
paper-based methods. UOIT professor Carolyn McGregor says observing subtle
physiological changes can help detect life-threatening conditions in
premature infants up to 24 hours in advance. The neonatal intensive care
unit at Toronto's Hospital for Sick Children will be the first to deploy
the technology, followed by the IWK Health Center in Halifax, Nova Scotia,
and another unit in Australia. However, the software will first undergo a
year-long testing period at UOIT's Health Informatics Laboratory using data
previously collected for neonatal intensive care units. When fully
developed, the software will be able to process the 512 readings per minute
generated by medical devices. IBM has given the researchers access to the
prototype software as part of its First of a Kind program, which connects
research teams with customers, says IBM T.J. Watson Research Center's Maria
Ebling.
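One kind of streaming check such a system might run over the 512 readings per minute: flag any reading that drifts several standard deviations away from a recent moving window. This is a toy stand-in, not IBM's software; the window size, threshold, and sample stream are invented:

```python
from collections import deque

def drift_alerts(readings, window=60, threshold=3.0):
    """Flag the index of any reading that deviates from the moving
    average of the last `window` readings by more than `threshold`
    standard deviations."""
    recent = deque(maxlen=window)
    alerts = []
    for i, x in enumerate(readings):
        if len(recent) == window:
            mean = sum(recent) / window
            var = sum((r - mean) ** 2 for r in recent) / window
            if var > 0 and abs(x - mean) > threshold * var ** 0.5:
                alerts.append(i)
        recent.append(x)
    return alerts

# A steady vital-sign stream with a sudden shift at index 100.
stream = [120.0 + (i % 3) * 0.5 for i in range(100)] + [135.0] * 20
print(drift_alerts(stream))
```

The first alert fires at the shift itself; real systems look for far subtler trends across multiple correlated signals, which is what the stream-computing platform is for.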
Attackers' Behavior Builds Better Blacklists
Security Focus (07/24/08) Lemos, Robert
Computer scientists at SRI International and the SANS Institute have
developed the Highly Predictive Blacklist algorithm, a technique that
determines an attacker's preference for victims' networks in order to
prioritize additions to blacklists. The technique allows network owners to
correlate attacks on their networks with attackers' preferences for other
networks, using an approach similar to Google's PageRank. The
researchers correlated attackers' choices in targets using firewall logs
contributed by participants in the SANS Institute's DShield service. By
matching the preferred victims of a known attacker, the researchers were
able to develop per-network blacklists that perform better than either
massive global lists or more focused local lists. "Our experiments
demonstrate that our Highly Predictive Blacklist algorithm consistently
creates firewall filters that are exercised at much higher rates than those
from conventional blacklist methods," says SRI's Phillip Porras. The
blacklists were created in three stages. First, the researchers removed
any unreliable alerts from the logs submitted by contributors. Next,
relevance-based rankings were used to prioritize attacks for each
contributor. Lastly, the system gave priorities to patterns that match
known malware propagation trends. The system was tested using 720 million
log entries and found to outperform global and local blacklists in more
than 80 percent of the cases.
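The relevance idea, that attackers who hit networks historically sharing attackers with yours get boosted onto your blacklist, can be sketched in a few lines. This toy scoring (the 0.1 weight, the sample data, the function name) is invented for illustration and is far simpler than the actual Highly Predictive Blacklist algorithm:

```python
from collections import defaultdict

# Shared firewall logs as (attacker_ip, contributor_network) pairs.
logs = [
    ("1.1.1.1", "netA"), ("1.1.1.1", "netB"),
    ("2.2.2.2", "netB"), ("2.2.2.2", "netC"),
    ("3.3.3.3", "netC"),
]

def relevance_blacklist(logs, target, top_n=10):
    victims = defaultdict(set)  # attacker -> contributors it has hit
    for attacker, contrib in logs:
        victims[attacker].add(contrib)
    # Correlate contributors by how many attackers they share.
    overlap = defaultdict(int)
    for hit in victims.values():
        for a in hit:
            for b in hit:
                if a != b:
                    overlap[(a, b)] += 1
    # Attackers seen directly by `target` score highest; attackers seen
    # by correlated contributors get a smaller, relevance-weighted boost.
    scores = defaultdict(float)
    for attacker, hit in victims.items():
        for c in hit:
            scores[attacker] += 1.0 if c == target else 0.1 * overlap[(target, c)]
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(relevance_blacklist(logs, "netA"))
```

Here netA's list ranks 1.1.1.1 first (it attacked netA directly), then 2.2.2.2 (it attacked netB, which shares an attacker with netA), ahead of the unrelated 3.3.3.3.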
Say Goodbye to Virtual Bureaucracy
AlphaGalileo (07/29/08)
The participants in the pan-European EUREKA project called Fidelity
(Federated Identity Management based on LIBERTY) are looking for commercial
uses for the new system. Fidelity is designed to make it easier for people
to do business online, without having to worry about remembering their
various passwords and user names, and whether their personal information is
secure from fraudsters. Internet users would have a single password with
an identity provider, who would take care of the virtual paperwork every
time they visited a Web site. The identity provider would operate in a
formal partnership with service providers, such as Internet stores, and
attribute providers that securely host the customers' personal attributes
to be shared. "The system gives customers much more control," says Vincent
Etchebarne, innovative services developer at France Telecom's Orange.
Customers will be able to give limited information, only what a company
needs, and at any time have their personal data removed from the records of
a company. European telecoms operators Orange, TeliaSonera, and Telenor
are behind the project.
SC08 Technical Program Registration Opens Aug. 4
HPC Wire (07/31/08)
The SC08 international conference on high performance computing,
networking, storage, and analysis, co-sponsored by ACM, will offer a
technical program that consists of two days of tutorials, three days of
technical paper presentations, six panel discussions, 13 workshops, several
birds-of-a-feather sessions, and poster presentations. There will be 20
Technical Papers sessions, in which 59 papers out of 277 submissions will
be presented. "A 21 percent acceptance rate definitely shows that we took
only the cream of the crop, and that's our goal every year," says Technical
Papers co-chair Darren Kerbyson. SC08 will feature papers on GPU
applications, petaflop architectures, e-science grids, OS kernels, and
10-gigabit wide-area networks, says Technical Papers co-chair DK Panda.
Selected workshops include Node Level Parallelism for Large Scale
Supercomputers, Grid Computing Environments 2008, Power Efficiency and the
Path to Exascale Computing, Bridging Multicore's Programmability Gap,
Advanced Modeling and Simulation for Fission Nuclear Energy, and
High-Performance Reconfigurable Computing Technology and Applications.
Registration opens Monday, Aug. 4. SC08 takes place Nov. 15-21 in
Austin.
Europe's Next-Generation Broadband
ICT Results (07/25/08)
The Multi-Service Access Everywhere (MUSE) project, a joint effort
involving Europe's leading technology firms and research institutes, has
finished the first phase of an effort to accelerate the rollout of
next-generation broadband services. The project has helped make broadband
service at tens of megabits per second available in many European
countries. MUSE has helped to establish standards and define a roadmap
that has gained industry consensus, helping limit the risks faced by main
stakeholders, improve stakeholder confidence, and increase broadband
investment. Belgium, the Netherlands, Britain, Germany, and other
countries are now deploying services with Very High Speed Digital
Subscriber Line (VDSL), an access technology that provides up to 100 Mbps.
MUSE researchers looked at broadband access architectures, access and edge
nodes, DSL, fiber optic, fixed wireless, back-end integration,
interconnection between public and home networks, and generic test suites.
"There is often misunderstanding; people think we were just looking at
improving the access bit-rate, but that aspect of the project accounted for
only 20 percent of our budget," says project coordinator Peter Vetter.
"The main challenge was to enable multi-service delivery through an
integrated end-to-end approach."
Girl Power! Summer Camp Grooms Tomorrow's Techies
Computerworld (07/28/08) Vol. 42, No. 30, P. 32; King, Julia
Girls' interest in computing starts to flag in middle school, and one
effort to maintain that enthusiasm is Technology Goddesses, a program of
technology camps that offers girls instruction in digital design, computer
graphics, Web site development, and digital moviemaking. Technology
Goddesses founder Cora Carmody with California's Jacobs Engineering Group
hopes the program will improve the pertinence and excitement of technology
to this group, and eventually reverse the declining numbers of women in
technology-related careers. She notes that girls' learning patterns differ
from those of boys, and the Technology Goddesses program reflects this by
placing participants in a "girl-friendly" learning environment that
encourages sociability, teamwork, and exposure to role models who tend to
be older girls rather than adults. Technology Goddesses has a partnership
with the Girl Scouts. "Through Technology Goddesses, the girls learn to
use technology and gain life skills and develop critical-thinking skills,"
says Jo Dee Jacob, CEO of Girl Scouts, San Diego-Imperial Council. "They
educate themselves and others." By bringing in professionals from a wide
spectrum of fields, the program demonstrates to girls the infinite career
possibilities a background in technology can nurture. IT professionals who
volunteer to help at Technology Goddesses camps or start camps of their own
can enhance their company's reputation as an IT employer of choice as well
as take a look at potential future members of the tech workforce, says
Science Applications International Corp.'s Susie Schmitt.
Flattened Butterfly Network Lets Data Fly Through
Supercomputers and Multicore Processors
IEEE Spectrum (07/08) Savage, Neil
A growing problem of increasingly powerful supercomputers and chips that
employ multicore processors is the danger that they will route data
inefficiently and waste time, energy, and money, and William Dally with
Stanford University's computer science department says he and colleagues at
supercomputer manufacturer Cray can solve this problem with the development
of a "flattened butterfly" architecture. The flattened butterfly upgrades
the butterfly architecture by mixing columns of routers and connecting each
router to additional processors, reducing the number of router-to-router
connections by 50 percent and allowing data traveling between the
processors to reach any other processor in fewer jumps even though the
physical route may be longer. The flattened butterfly can adaptively
detect congestion and overshoot only when necessary. Dally's simulations
demonstrate that in multicore processors the flattened butterfly
architecture can boost data throughput as much as 50 percent over a
standard mesh network, lower power consumption by 38 percent, and shave 28
percent off of latency. The architecture was unveiled by Dally at the
International Symposium on Computer Architecture (ISCA) last year, and in
June 2008 Dally's team presented the Dragonfly update--a scalable version
of the flattened butterfly for very-high-density networks, including
supercomputers with 1 million nodes--at this year's ISCA. Dragonfly is
capable of grouping 64 routers linked through a flattened butterfly into
one virtual router and then linking that to other virtual routers using
another flattened butterfly. Wires are used to electrically interconnect
the grouped routers, while an optical link is used to connect each virtual
router to every other one. The system relies on lower-cost electrical
links for short routes and can use the optical interconnects for long
routes, and Dally says this architecture is perfect for linking data
centers.
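The hop-count advantage is easy to see in a toy comparison of a 2-D mesh against a 2-D flattened butterfly, where every router connects directly to all routers in its row and all in its column (a minimal sketch, not Dally's simulator):

```python
def mesh_hops(src, dst):
    """Router-to-router hops on a 2-D mesh: one hop per grid step."""
    return abs(src[0] - dst[0]) + abs(src[1] - dst[1])

def flattened_butterfly_hops(src, dst):
    """In a 2-D flattened butterfly each dimension is crossed in at
    most one hop, since same-row and same-column routers are directly
    connected."""
    return (src[0] != dst[0]) + (src[1] != dst[1])

# Average hop count over all source/destination pairs in an 8x8 array.
n = 8
pairs = [((a, b), (c, d)) for a in range(n) for b in range(n)
         for c in range(n) for d in range(n)]
avg = lambda f: sum(f(s, t) for s, t in pairs) / len(pairs)
print(avg(mesh_hops), avg(flattened_butterfly_hops))  # → 5.25 1.75
```

The trade, as the article notes, is richer connectivity per router (more ports, possibly longer wires) in exchange for fewer hops end to end.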