PC Users Come to Aid of Scientists
Baltimore Sun (08/10/06) P. 1A; Stroh, Michael
Scientists facing declining research budgets and mounting volumes of data
are increasingly leaning on the computing power of interested amateurs to
help with their experiments. In one project, computer users are enlisted
to search for 50 microscopic clumps of interstellar dust captured by a
comet-chasing spacecraft. "It's like trying to find 50 ants on a football
field," said
physicist Andrew Westphal. It would take years for NASA scientists to find
them all, he said. The scope of the project led Westphal and his
colleagues at the University of California, Berkeley, to launch the
Stardust@home initiative, whereby anyone with a computer can volunteer to
scour the findings of the Stardust spacecraft in search of the microscopic
dust. That idea, known as distributed computing, originated with
the SETI@home program, which relies on volunteers to download software to
help with the search for extraterrestrial life. SETI@home proved that the scant
processing power of the PC can have a significant impact when multiplied by
thousands of users, and more than two dozen similar projects have since
been launched, developing new drugs for AIDS, forecasting global climate
change, and other applications. "I wouldn't go so far as to say
distributed computing has completely changed the scientific landscape. But
it's on the verge of doing that," said UC Berkeley computer scientist David
Anderson, who also directs the SETI@home project. "The groups that have
more computing power, they're just able to do research other people can't."
Before rolling out the Stardust@home project, Westphal and his colleagues
toyed with the idea of writing pattern-recognition software to detect the
comet dust, but they soon found that it travels at a speed too great
for any one lab to measure.
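The volunteer model works by carving one enormous search into many small, independent pieces. Below is a minimal sketch of that workflow in the spirit of SETI@home and Stardust@home; the WorkUnit structure, file names, and volunteer_scan() function are hypothetical illustrations, not the projects' actual software.

```python
# Minimal sketch of a volunteer-computing workflow: the full data set is
# carved into small, independent work units that many PCs (or pairs of
# eyes) can process in parallel.  All names here are hypothetical.
from dataclasses import dataclass
from queue import Queue

@dataclass
class WorkUnit:
    unit_id: int
    image_tile: str  # e.g. one small slice of the scanned dust collector

def make_work_units(tiles):
    """Split the full data set into independently searchable pieces."""
    q = Queue()
    for i, tile in enumerate(tiles):
        q.put(WorkUnit(i, tile))
    return q

def volunteer_scan(unit):
    """Stand-in for whatever one volunteer's PC does with a single tile."""
    return {"unit_id": unit.unit_id, "dust_candidates": []}

if __name__ == "__main__":
    queue = make_work_units([f"tile_{n}.png" for n in range(8)])
    results = [volunteer_scan(queue.get()) for _ in range(queue.qsize())]
    print(f"processed {len(results)} work units")
```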
Debate Over E-Voting Is Still Plaguing Elections
National Journal's Technology Daily (08/09/06) Casey, Winter
The controversy surrounding e-voting systems, which has been raging since
their rapid deployment in the wake of the disputed 2000 presidential
election, shows little sign of subsiding. New complaints have emerged in
Georgia about a primary election in which Rep. Cynthia McKinney was roundly
defeated. "Electronic voting machines are a threat to our democracy,"
McKinney said after the election. "So let the word go out: We aren't going
to tolerate any more stolen elections." Diebold has reiterated its
position that its machines are accurate and reliable. Meanwhile, several
states have yet to comply with the 2002 Help America Vote Act, according to
the Committee on a Framework for Understanding Electronic Voting. This
year's primaries mark the first large-scale deployment of electronic voting
systems, and the relationships between election officials and equipment
vendors have become increasingly strained, the committee has found. The
panel also warns that proper training for poll workers will be an important
issue for the November elections. For information on ACM's e-voting
activities, visit
http://www.acm.org/usacm.
Warnings Against Thwarting Technological
Innovation
Chronicle of Higher Education (08/09/06) Foster, Andrea L.
ACM's public policy committee chairman Eugene H. Spafford called on
Congress last month not to pass legislation that would force manufacturers
to build computers with technology designed to restrict the use of
copyrighted material. Spafford's letter to Senator Ted Stevens (R-Alaska),
chairman of the Senate Committee on Commerce, Science, and Transportation,
cautioned that "By mandating a technical approach that may be foiled,
consumers and innovation will suffer, while having little impact on
infringement." For more information on Spafford's comments, visit
http://www.acm.org/usacm.
Reinventing the Transistor
Technology Review (08/10/06) Graham-Rowe, Duncan
The increasing number of functions in cell phones and other portable
devices is taxing battery life, but Nokia is developing a new technique
that could lead to a tenfold decrease in energy consumption. By
operating transistors at lower-than-normal voltages, Nokia's technique
effectively places idle transistors or those executing low-performance
tasks in a sort of standby mode. "In computer design, power consumption is
getting to be a major driving force," said Nokia's Jamey Hicks. "The limit
on the size of the device gives us a limit on the total energy budget."
Nokia enlisted the help of MIT Microsystems Technology Laboratories
Director Anantha Chandrakasan to develop energy-efficient devices that use
transistors that operate at levels below the voltage threshold typically
required to switch on or off. Transistors can still hold a value of 1 or 0
below that threshold, but they are less stable, Chandrakasan says. Developing
subthreshold transistors requires input voltages to remain consistent.
Chandrakasan found that lowering the voltages of transistors leads to a
fivefold to tenfold reduction in energy consumption, though the speed of
the circuit drops dramatically. Chandrakasan is working with Nokia to
develop a video compression chip that employs that technique to conserve
power in devices such as digital cameras. The research could also have an
impact on RFID tags and medical applications.
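A rough way to see where the savings come from: first-order dynamic switching energy in CMOS scales as roughly E = C * V^2, so lowering the supply voltage toward the subthreshold region cuts energy per operation quadratically, at the cost of much slower switching. The sketch below uses invented capacitance and voltage figures, not Nokia's or MIT's numbers.

```python
# Back-of-the-envelope illustration of subthreshold savings: dynamic CMOS
# switching energy per operation is roughly E = C * V^2, so cutting the
# supply voltage gives a quadratic reduction in energy (while switching
# speed drops sharply).  The figures below are assumptions for illustration.
def switching_energy(capacitance_farads, supply_volts):
    return capacitance_farads * supply_volts ** 2

C = 1e-15             # 1 fF of switched capacitance (assumed)
nominal_v = 1.0       # nominal supply voltage (assumed)
subthreshold_v = 0.3  # below-threshold supply voltage (assumed)

ratio = switching_energy(C, nominal_v) / switching_energy(C, subthreshold_v)
print(f"roughly {ratio:.0f}x less energy per switch at {subthreshold_v} V")
```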
'Data Miners' at UCI Moving Beyond Google
Orange County Register (CA) (08/08/06) Stewart, Colin
Researchers at the University of California, Irvine, have developed a
text-mining technique that can search volumes of data without being told
what to look for. This type of text mining, known as statistical topic
modeling, could have broad implications beyond the Internet, such as
marketing, historical research, and medicine. Topic modeling arranges data
into categories by monitoring and recording words that are frequently
paired together. "To put it simply, text mining has made an evolutionary
jump," said David Newman, a computer scientist at UC Irvine. "In just a
few short years, it could become a common and useful tool for everyone from
medical doctors to advertisers, from publishers to politicians." Topic
modeling has yet to emerge from the university environment, though Newman
predicts that it will see commercial deployment in the next few years.
Newman has used topic modeling to create categories from related words and
names of people, places, and groups to analyze 330,000 newspaper stories,
taken mostly from the New York Times. Based on the number of words the
Times devoted to a particular topic from 2000 to 2002, Newman was able to
draw sweeping conclusions that could be of great interest to marketers. He
found that over that period, for instance, the popularity of football
increased, while interest in the Tour de France fell off slightly.
Similarly, an analysis of articles and ads published in the Pennsylvania
Gazette from 1728 to 1800 revealed an inverse correlation between interest
in fashion and trade and interest in religion. Researchers also used topic
modeling to make connections between the 250,000 emails that Enron turned
over to the Justice Department.
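The summary does not say which algorithm the UC Irvine group used; one widely used statistical topic model is Latent Dirichlet Allocation, sketched here with scikit-learn on an invented four-document corpus to show how topics emerge from word co-occurrence. The algorithm choice, corpus, and parameters are illustrative assumptions.

```python
# Toy statistical topic-modeling example (Latent Dirichlet Allocation via
# scikit-learn).  The algorithm, corpus, and parameters are illustrative
# assumptions, not the UC Irvine system.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "quarterback touchdown season football playoffs",
    "tour de france cycling stage peloton",
    "election senate vote campaign ballot",
    "football coach draft quarterback stadium",
]

vectorizer = CountVectorizer().fit(docs)
matrix = vectorizer.transform(docs)

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(matrix)

terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top_terms = [terms[i] for i in weights.argsort()[-4:][::-1]]
    print(f"topic {k}: {', '.join(top_terms)}")
```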
AI & Poker: A Smart Bet
Dr. Dobb's Journal (08/09/06) Erickson, Jon
The American Association for Artificial Intelligence (AAAI) held a computer poker
challenge for the first time this year during the organization's conference
in Boston. Researchers from the University of Alberta won the AAAI
Computer Poker Competition, as their computer program "Hyperborean" bested
four other bots in the two tournaments set up for one-on-one Texas Hold
'Em. The performance of Hyperborean was notable in that the bot made all
of its betting decisions instantaneously in both the normal and slower-paced
tournaments. Computer programs developed by researchers at Carnegie Mellon
University and Monash University in Australia, and by Teppo Salonen from
Irvine, Calif., and Morten Lynge from Ikast, Denmark, also competed in the
event. The demanding competition had the bots play more than a quarter of
a million games, play every series of deals twice, and run on identical
computer systems. Artificial intelligence researchers can learn much from
the game of poker, in which skill, chance, and uncertainty have to be taken
into consideration. "Poker is a nice well-defined problem for studying
some truly fundamental issues, like how to handle deliberate
misinformation, and how to make intelligent guesses based on partial
knowledge," says Darse Billings, lead designer for the Alberta team. "Good
solutions in this domain could have an impact in many other computer
applications."
Engineers: DC Power Saves Data Center Dough
eWeek (08/08/06) Burt, Jeffrey
Later this month, researchers at the Lawrence Berkeley National Laboratory
and roughly 20 technology suppliers will conclude a demonstration that they
claim shows how DC power distribution can save at least 15 percent on
energy consumption and cost in a data center. The program, set up at Sun
Microsystems' campus in Newark, Calif., gauges efficiency at the levels of
the rack and the facility at large. The facility has been available
through several open houses designed to evangelize the benefits of DC
power. The researchers will next begin looking for a major company willing
to become an early adopter. A combination of rising energy costs, smaller
and more powerful processors, and greater server densities has elevated
energy efficiency to a top priority, and many companies will soon be
spending more money powering and cooling their data centers than they will
on the actual products that go inside them, according to Bernie Meyerson,
CTO of IBM's Systems and Technology Group. With chip and system makers
developing software and management tools that give administrators a greater
degree of control over thermal issues, critics say that the DC conversion
might not be necessary or worth the expense of retrofitting an entire data
center. But the Berkeley Lab team showed that DC power can be used in
existing data centers simply by hardwiring the backs of the servers. In
testing, the system showed a 15 percent improvement in energy efficiency at
the facility level, though the improvement could be even more dramatic in
an actual data center with factors such as redundancy in play.
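The case for DC distribution is essentially about removing conversion stages, because losses multiply along the power path. The stage efficiencies below are assumptions chosen only to illustrate the arithmetic; they are not the Berkeley Lab measurements.

```python
# Illustrative loss arithmetic for AC versus DC distribution.  Each stage
# efficiency below is an assumption, not a Berkeley Lab figure; the point
# is only that losses multiply, so fewer conversions waste less power.
def chain_efficiency(stage_efficiencies):
    eff = 1.0
    for stage in stage_efficiencies:
        eff *= stage
    return eff

ac_chain = [0.90, 0.95, 0.88]  # double-conversion UPS, PDU, server AC supply (assumed)
dc_chain = [0.96, 0.92]        # one bulk rectifier, DC-DC at the server (assumed)

ac, dc = chain_efficiency(ac_chain), chain_efficiency(dc_chain)
print(f"AC path: {ac:.2f}, DC path: {dc:.2f}, relative saving: {(1 - ac / dc):.0%}")
```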
Green Pigment Spins Chip Promise
BBC News (08/09/06)
A team of researchers has found that a dye developed in the 18th century
could now be used in spintronic devices because it can hold its magnetism
at room temperature, unlike other materials that must be cooled. "The big
challenge is to develop materials that can perform these kinds of functions
not just at cryogenic temperatures but at practical temperatures," said
Daniel Gamelin, a University of Washington professor who worked on the
project. Cobalt green is a combination of zinc oxide and cobalt that was
always eschewed by the art community because it was expensive and produced
weak colors. Spintronic technology manipulates the magnetic properties in
electrons to boost computational power, potentially leading to faster, more
energy-efficient computers. Some hard disks already rely on spintronic
technology, and, theoretically, spintronics could be applied to sensors and
memory. As conventional fabrication techniques approach the limitations of
scaling, spintronic devices could offer a solution, though until now,
materials with useful spintronic properties have exhibited them only at
very low temperatures. To test the viability of the pigment, the
researchers doped the zinc oxide so that magnetic cobalt replaced some of
the zinc ions. They then aligned the cobalt ions by subjecting the
semiconductor material to a zinc metal vapor, causing the material to
become magnetic. The magnetism continued as the material was heated to
room temperature, but disappeared as it was heated further. While the
research shows promise, it is a long way from commercialization, and the
researchers will next try to integrate the materials with silicon
semiconductors.
Seeing Is Believing
The Engineer Online (08/07/06)
Researchers at Salford University are heading a consortium that is working
to make visual communications systems more natural by integrating
eye-tracking technologies with Immersive Projection Technology (IPT). The
idea is to give users of videoconferencing technology the ability to gaze
into the eyes of other people and see the direction in which they are
looking, which would allow them to pick up and interpret non-verbal
communication. Existing video technology only offers limited eye-tracking
capability. Reading, UCL, and Roehampton universities are participating in
the consortium, along with industry partners. The consortium has developed
a prototype eye-gaze system, and will install the technology in the CAVE
(Cave Automatic Virtual Environment) display system, a setup for projecting
stereo images on floors and walls to offer an illusion of reality.
"Videoconferencing allows you to look into another person's space, but this
will allow you to walk around in it," says the director of Salford's Center
for Virtual Environments, David Roberts, who likens the system to the Star
Trek holodeck. Users in different locations will be able to meet in the
virtual environment and walk around objects, such as an aircraft engine,
and discuss its features. The consortium will spend a year building the
system, and a second year studying and analyzing the performance and
potential of the technology.
Developing Data Solutions
Access Online (08/08/06) Baker, Trish
The National Center for Supercomputing Applications (NCSA) is partnering
with scientists at the National Optical Astronomy Observatory (NOAO) to
process, analyze, and share data collected from large-scale scientific
experiments, particularly the Large Synoptic Survey Telescope (LSST),
which is slated to begin operating in 2013. Though the vast quantities of
data collected from
such projects are critical to advancing scientific understanding, managing
and storing all that information is a distinct challenge. Once operational,
the LSST is expected to produce some 15 TB of raw data and 100 TB of
processed data every night. A close partnership with the scientific
community has been a key feature of the NCSA's approach to developing cyber
environments for distributed computing. The LSST project consists of three
major parts: the telescope, the camera, and the data management system.
The camera will be the largest ever built, and the telescope itself will be
large enough to provide a wide area of view--the camera and telescope will
produce images of the entire viewable sky every three days. In order to
optimize the processing of incoming image data and promptly alert astronomers to
interesting phenomena, the system will need to be able to process
information in almost real time. NCSA developed an archive replication
center built around middleware developed at the San Diego Supercomputer
Center. By storing data at multiple sites, the system provides security
through redundancy. "The NCSA/NOAO collaborative effort provides a
backbone for secure data access, which is a vital component for
astronomical portals and multi-location image archives," said Chris Miller,
an assistant astronomer at the Cerro Tololo Inter-American Observatory.
"These security measures are currently missing from...most astronomical
archive tools and services." Access to the LSST will be offered through a
Web-based community.
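A quick scale check using only the figures quoted above (15 TB of raw plus 100 TB of processed data per night) shows why archive replication is a storage-planning problem in its own right; the number of replica sites below is an assumption.

```python
# Scale check based on the figures in the summary: 15 TB raw and 100 TB
# processed data per night.  The number of replica sites is an assumption
# used only to show how redundancy multiplies the storage requirement.
raw_tb_per_night = 15
processed_tb_per_night = 100
nights_per_year = 365
replica_sites = 2  # assumed

yearly_tb = (raw_tb_per_night + processed_tb_per_night) * nights_per_year
print(f"~{yearly_tb / 1000:.0f} PB generated per year")
print(f"~{yearly_tb * replica_sites / 1000:.0f} PB stored across {replica_sites} sites")
```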
Carnegie Mellon Researchers Develop New Type of Mobile
Robot That Balances and Moves on a Ball Instead of Legs or Wheels
Carnegie Mellon News (08/09/06)
Mobile robotics researcher Ralph Hollis believes dynamically stable robots
have more potential for integration into human environments than current
legged robots. Hollis, a professor at Carnegie Mellon University, is the
driving force behind a new kind of robot that is about the height and width
of a person, weighs about 95 pounds, and balances and moves on a single
urethane-coated metal sphere. "Ballbot" can maneuver in small spaces
because of its long and thin shape and because it does not have to face the
direction it intends to move before it does so. Traditional robots that
use legs or wheels tend to have a wide base, which makes it difficult to
employ them among people and furniture, and operating them too fast or on a
slope can cause such robots to fall over. Hollis' self-contained,
battery-operated, omnidirectional robot makes use of internal sensors to
provide balancing data to an onboard computer, which uses the information
to activate rollers that mobilize the ball on which it moves. Adding a
head and arms could aid Ballbot further in rotation and balance, "but there
are many hurdles to overcome, like responding to unplanned contact with its
surroundings, planning motion in cluttered spaces, and safety issues," says
Hollis. A dynamically stable robot could one day work in close contact
with the elderly or disabled, or operate in an office environment, he believes.
Hollis has received grants over the past two years from the National
Science Foundation for his research on the Ballbot, which stands in place
on three retractable legs when not in operation.
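The balancing idea, reduced to its simplest form, is a feedback loop: internal sensors estimate how far the body is tilting, and the computer commands the rollers to drive the ball back under the center of mass. The sketch below is a toy one-dimensional PD loop with invented gains and dynamics, not CMU's actual Ballbot controller.

```python
# Toy one-dimensional balance loop: sensors report a tilt estimate, a PD
# controller turns it into a roller command, and the (invented) dynamics
# show the tilt being driven back toward zero.  Gains and physics are
# illustrative only; this is not CMU's Ballbot controller.
def pd_controller(tilt, tilt_rate, kp=25.0, kd=6.0):
    return -(kp * tilt + kd * tilt_rate)

def simulate(steps=200, dt=0.01):
    tilt, tilt_rate = 0.05, 0.0            # start 0.05 rad off vertical
    for _ in range(steps):
        command = pd_controller(tilt, tilt_rate)
        tilt_accel = 9.8 * tilt + command  # inverted-pendulum-like toy model
        tilt_rate += tilt_accel * dt
        tilt += tilt_rate * dt
    return tilt

print(f"tilt after 2 seconds: {simulate():.4f} rad")
```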
Computer Visualization Puts Cars Back on Buffalo's Main
Street
University at Buffalo News (08/08/06)
Researchers at the University at Buffalo have created an interactive,
real-time visualization of what the city's Main Street would look like if
it were reopened to vehicular traffic. Roughly 11 blocks of Main Street
were closed to vehicular traffic 20 years ago and replaced by
a light rail system and a pedestrian mall. The Buffalo researchers created
their detailed simulation so that residents could take an immersive look at
how a proposal for Main Street to support both cars and light rail might
work. "The three-dimensional, real-time traffic visualization allows the
public and planners to see how the proposed integration of car and rail
traffic would work on Main Street before any of the actual construction
begins," said Adam Koniak, an urban visualization expert at Buffalo's
Center for Computational Research. Koniak added that the public must have
an understanding of the proposal's potential impact before they decide to
move forward. The visualizations show, for instance, what the average
waiting time would be for cars stopped at an intersection, so that people
can get a sense for what types of bottlenecks might arise if the measure
passed. The urban-planning application developed from the center's
research in complex scientific visualization and simulation in medicine and
other data-intensive environments.
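One number such a visualization can surface is the average delay per vehicle at a signalized intersection. A standard back-of-the-envelope estimate is the uniform-delay term for a fixed-time signal, sketched below with invented cycle, green-time, and traffic figures that have no connection to the actual Main Street proposal.

```python
# Uniform-delay estimate for a fixed-time traffic signal (the first term of
# Webster's delay formula).  All parameters are invented for illustration
# and are unrelated to the actual Main Street study.
def uniform_delay(cycle_s, green_s, volume_vph, capacity_vph):
    g_over_c = green_s / cycle_s
    saturation = min(volume_vph / capacity_vph, 1.0)
    return (cycle_s * (1 - g_over_c) ** 2) / (2 * (1 - g_over_c * saturation))

delay = uniform_delay(cycle_s=90, green_s=40, volume_vph=500, capacity_vph=800)
print(f"average delay per vehicle: {delay:.1f} seconds")
```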
The Digital World Is the Real World
Computerworld (08/07/06) Anthes, Gary
In a recent interview, SAP Labs' Ike Nassi discussed his vision for the
future of wireless networking. Nassi predicts a convergence of the virtual
world and the real world, brought on in large part by the increasing
deployment of RFID technology and embedded microprocessors. Nassi is
working with the city of Palo Alto, Calif., to equip firetrucks with a host
of wireless communications devices to link them back to SAP's systems, for
example. Among other things, the project is seeking to find out why a
firetruck would take what would appear to be a nonoptimal route to a fire.
While microprocessors have long been a part of the automobile industry,
wireless networking has been slow to catch on, Nassi says. Network-enabled
cars could notify owners when they were in need of a software upgrade, for
instance. Nassi is also exploring the possibilities of an RFID-enabled
assembly line that could track parts and reduce the frequency of shutdowns.
Nassi recommends that IT managers adopt existing standards such as OSGi
(Open Services Gateway initiative) as soon as possible. Switching to a
service-oriented architecture can reduce costs and increase accuracy in a
variety of applications, Nassi says. Nassi envisions future business
applications being governed by a software language that can make explicit
models that would be understandable by a wider audience than just C++
engineers.
Creating a Science of the Web
Science (08/11/06) Vol. 313, No. 5788, P. 769; Berners-Lee, Tim; Hall,
Wendy; Hendler, James
While the Web has fundamentally altered the way that scientists interact
with each other, most of its development has been ad hoc, and many
researchers are beginning to realize that a comprehensive framework is
required to understand and model the Web as it continues to evolve. The
science of the Web would be an interdisciplinary system of understanding
the architectural principles that have guided the Web's growth and ensuring
that the basic tenets of privacy and trustworthiness are upheld. Thus far,
computer scientists have been principally concerned with developing better
information-retrieval algorithms, while non-computing researchers, though
increasingly dependent on the Web, are disconnected from the budding Web
research community and have no structured plan as to how to keep apprised
of the emerging trends in that sector. At a recent workshop at the British
Computer Society, Web researchers discussed the major engineering issues
facing Web science, including structure, topology, and scaling. One area
where the Web clearly needs improvement is mathematical modeling, as
current information-retrieval models are inadequate at such a large scale.
Another ongoing trend is the transition from text documents to data
resources, a semantic approach that enables computers to understand
information based on relational data and logical assertions. Researchers
are also exploring the application of logic-based languages to model data,
answer questions, and check hypotheses. Though the potential of the
Semantic Web has been widely touted, most of the world's data remain locked
in large vaults that are not available on the open Web, which has greatly
limited the reuse of information. Web scientists will have to address
issues such as how to map between different data models and how to search a
series of linked repositories. There are also the policy issues that
emerge over who should control access to data resources and legal
challenges in the areas of privacy and intellectual property.
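A toy illustration of the shift the authors describe, from documents to "relational data and logical assertions": facts stored as subject-predicate-object triples plus a small rule that lets software answer a question no single document states. The vocabulary and facts below are invented for this sketch.

```python
# Toy Semantic-Web-style example: facts as subject-predicate-object triples
# and one chained rule that infers an answer not stated in any single fact.
# The vocabulary and facts are invented for illustration.
triples = {
    ("telescope_x", "locatedIn", "chile"),
    ("chile", "partOf", "south_america"),
    ("probe_y", "operatedBy", "nasa"),
}

def continent_of(entity):
    """Chain 'locatedIn' with 'partOf' to infer a continent for an entity."""
    for subj, pred, obj in triples:
        if subj == entity and pred == "locatedIn":
            for subj2, pred2, obj2 in triples:
                if subj2 == obj and pred2 == "partOf":
                    return obj2
    return None

print(continent_of("telescope_x"))  # -> south_america
```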
The Robotic Economy
Futurist (08/06) Vol. 40, No. 4, P. 50; Brown, Arnold
Intelligent machines and systems form the linchpin of our economic
prosperity, but they must remain subservient to people in order for this
prosperity to be sustained. Technological advances, combined with the
trend of abstracting work and, by extension, workers, are driving automation
on a potentially vast scale; even creative work is not immune, as archived
knowledge can be harnessed without actually employing people to think it
up. Gartner's Neil MacDonald expects the rate of job loss that can be
credited to automation to be about twice that attributed to outsourcing
over the next decade. Our reliance on machines is expected to become even
greater as problems become too complex for human minds to cope with. A
renewed Luddite movement seems an inevitable consequence of mechanization,
and employers will be pressured to help those whose jobs will be lost to
automation through retraining programs or other measures. Assessing
people's value solely by their economic contributions is an attitude that
will likely have to change as robots proliferate. The relationship between
humans and machines could change radically as machines assume more humanoid
characteristics, both externally and internally. Central to this
development is the debate over whether such machines should be endowed with
rights similar to the basic freedoms people enjoy.
Intrusion-Tolerant Middleware: The Road to Automatic
Security
IEEE Security & Privacy (08/06) Vol. 4, No. 4, P. 54; Verissimo, Paulo
E.; Neves, Nuno F.; Cachin, Christian
The concept of intrusion tolerance, a methodology designed to fortify
computer systems against attacks and accidental faults by seamlessly
addressing both issues via a common security and dependability strategy, is
the heart of the Malicious- and Accidental-Fault Tolerance for Internet
Applications (MAFTIA) project. Intrusion tolerance is a measure of last
resort that acts after an intrusion but before a system failure; it relies
on automatic methods, built from local mechanisms and distributed
protocols, that combine detection, recovery, and masking tactics.
Intrusion-tolerance mechanisms are selectively employed by the MAFTIA
architecture to construct tiers of progressively more trusted components
and middleware subsystems from untrusted elements such as hosts and
networks. The architecture can be represented in at least three distinct
dimensions: a hardware dimension comprising the host and networking
devices that make up the physical distributed system; the local support
services supplied by the operating system and runtime platform in every
node; and distributed software, the middleware layers that piggyback on the
runtime and support each host's provided mechanisms as well as the native
MAFTIA services of authorization, intrusion detection, and trusted third
parties. The architecture can support components with different types and
severities of attacks, intrusions, and vulnerabilities concurrently via
architectural hybridization, which marries high performance at the level of
controlled failure systems to high resilience at the level of arbitrary
failure systems. This concept allows the realistic deployment of the
wormholes model, a hybrid distributed-system model that assumes the
existence of augmented distributed-system elements or wormholes that can
provide stronger behavior than is postulated for the rest of the system.
The MAFTIA middleware's layers, from lowest to highest, are the multipoint
network (MN) module, the communication support services (CS) module, and
the activity support services (AS) module. Each module feeds into failure
detection and membership management.
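One of the masking tactics mentioned above can be illustrated with a simple majority vote over replicated services: as long as fewer than half of the replicas are compromised, the intruded replies are masked and the client still sees the correct answer. This is a sketch of the general idea only, not MAFTIA's actual distributed protocols.

```python
# Illustration of intrusion masking by majority voting over replicas: a
# minority of compromised or faulty replicas cannot change the reply the
# client accepts.  A sketch of the general tactic, not MAFTIA's protocols.
from collections import Counter

def masked_reply(replica_replies):
    """Return the strict-majority reply, or None if no value has a majority."""
    value, votes = Counter(replica_replies).most_common(1)[0]
    return value if votes > len(replica_replies) // 2 else None

replies = ["balance=100", "balance=100", "balance=9999", "balance=100"]
print(masked_reply(replies))  # -> balance=100; the intruded replica is masked
```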
Lessons for the Future Internet: Learning from the
Past
Educause Review (08/06) Vol. 41, No. 4, P. 16; Roberts, Michael M.
Michael Roberts, the first president and CEO of ICANN, outlines four stages of
the Internet's growth, noting the role that academic contributions have
played. The first stage was characterized by federally funded research and
the creation of NSFNet II, while the next stage saw enthusiastic academic
usage and further development of the Internet, which led to the foundation
of what would eventually be Internet2. The third stage of Internet growth
saw Internet usage reach saturation both internationally and domestically,
and the U.S. government subsequently made ICANN responsible for
the network's technical administration; however, ICANN has for the most
part failed in its mission to function via broad consensus mechanisms,
owing to the growing politicization of the Web. The fourth stage of growth
involves the maturation of the Internet into a global and universal network
that reflects human society, and with it has come renewed national and
international concern over Internet policy, specifically the use of the
Internet to meet social objectives, the extent of governmental economic
Internet regulation, and the degree to which network users' expectations
for privacy should be preempted by national security priorities. The
existence of legislation dealing with each of these issues makes the
challenge to lawmakers twofold: They must determine the proper role for
governments to play as the Internet's growth and development continues, and
also how societies worldwide can move from antiquated technology and laws to
a new balance among society, technology, and politics. Roberts says the
academic community, on the strength of its open and collaborative nature,
can be a vital player in the Internet's continued evolution. The author
cites several areas where academic support and advocacy are critical,
including federal funding for university research into networking; the
provision of universal affordable broadband and middleware; the use of
academic network facilities as testbeds for advanced technologies, such as
converged voice, video, and data; and the preservation of the Internet
commons.