Election Glitches 'Could Get Ugly'
USA Today (09/14/06) P. 1A; Wolf, Richard
With the crucial midterm elections just eight weeks away, state and local
governments are scrambling to prepare voting machines and train poll
workers to fix the problems that are expected to arise. Glitches have
already occurred this year in numerous states, and election officials warn
that this year's election has more potential for technological difficulty
than any other since 2000, with some 30 percent of the nation's precincts
using new equipment. "If you're ever going to have a problem, it's going
to be that first election," said Kimball Brace, president of Election Data
Services. Almost half of all U.S. counties have upgraded their voting
systems to optical-scan or electronic voting since 2000, but they are still
largely dependent on poll workers with an average age of 72 who are
generally not experienced computer users. The principal concerns that
observers voice are a shortage of technical support staff at both the
precincts and the vendors, equipment deliveries delayed by heightened
demand, and touch-screen machines that lack a paper backup for audits and
recounts. "There are so many potential failure points this year
that some of it could get ugly," said R. Doug Lewis of the Election Center.
For information about ACM's e-voting activities, visit
http://www.acm.org/usacm
Simulated IT Attacks Reveal Response Flaws
eWeek (09/13/06) Hines, Matt
The U.S. Department of Homeland Security has released the results of its
Cyber Storm exercise, outlining the areas where government agencies and
enterprises need to shore up their responsiveness to new IT threats. The
exercise found that communication between the public and private sector in
the event of an attack on IT infrastructure is insufficient, and that those
groups could be hampered by their inability to discern the full scope of an
attack. The results did indicate that progress is being made on those two
fronts, however. Cyber Storm was intended to assess the
information-sharing capabilities and level of readiness for an attack
throughout the federal, state, and local levels of government. The testing
conditions were designed to be a controlled environment where participants
could simulate the coordination that would be required during a major cyber
event. More than 100 public and private organizations at more than 60
locations in five countries participated in the exercise, which aimed to
recreate the adverse effects that an attack or disaster could have on
critical infrastructure. "In many ways, this exercise was designed to push
the system to the maximum edge. That allows you to identify the greatest
points of vulnerability, and we're fundamentally working to update and take
lessons from Cyber Storm and Katrina and look at how we can improve
coordination," said Andy Purdy, acting director of the National Cyber
Security Division at the Department of Homeland Security. Cyber Storm
participants simulated cyberattacks against the nation's energy,
transportation, and IT infrastructures that would have the potential to
cause ripple effects throughout the government, economic, and social
environments of participating countries. Responders tended to handle
single threats effectively, but had trouble correlating multiple incidents
occurring throughout public and private infrastructure. The report did
find, however, that the existing communication platform among
international governments is relatively effective.
Researchers Reveal 'Extremely Serious' Vulnerabilities in
E-Voting Machines
Princeton University (09/14/06) Riordan, Teresa
A team of Princeton University computer scientists claims to have
developed software that can manipulate ballot counts in e-voting machines
and be installed in under a minute in the most commonly deployed systems.
"We have created and analyzed the code in the spirit of helping to guide
public officials so that they can make wise decisions about how to secure
elections," said Edward Felten, director of Princeton's new Center for
Information Technology Policy. In their examination of the Diebold
AccuVote-TS machine, Felten and his colleagues found that the machine is
vulnerable to numerous serious threats. In a brief video on their Web
site, the researchers outline how the vote-stealing software can disrupt a
mock election. The researchers show how the systems can fall prey to
viruses that can automatically transmit themselves from one machine to
another without being detected. Felten said that policymakers should take
the threat of malicious software infecting the machines seriously. "There
is reason for concern about other machines as well, even though our paper
doesn't directly evaluate them," Felten said. "Jurisdictions using these
machines should think seriously about
finding a backup system in time for the November elections." For
information about ACM's e-voting activities, visit
http://www.acm.org/usacm
Experimental AI Powers Robot Army
Wired News (09/14/06) Hambling, David
The U.S. Air Force is working to develop robots with navigational
abilities well beyond those required to complete the DARPA Grand Challenge.
Whereas those robots merely had to steer themselves over miles of desert
terrain, intelligent agents for the military would have to be able to
autonomously navigate into underground bunkers, map unfamiliar sites in
three dimensions, and determine what is inside those sites without being
detected. With those goals in mind, the Air Force Research Laboratory
(AFRL) is looking well beyond the capabilities of any existing system, and
staking its hopes on developing new software that would enable the robots
to learn, walk, and interact in a more sophisticated way than ever before.
The software is based on the concept of developing new ideas by building on
existing knowledge, and similar applications have already written music and
designed soft drinks. The software is a form of neural network with two
distinguishing features. One is the noise that is introduced into the network
to jumble existing ideas into new forms; the other is a filter to compare
the novel ideas with existing knowledge and discard what is deemed
unsuitable. Self-learning and adaptability will be central to the success
of the software. The research is based on Stephen Thaler's Creativity
Machine, which excels at adapting to new physical features and ferreting
out the most efficient way to perform a particular task. In his work for
AFRL, Thaler has been designing what he calls Creative Robots, which can
work together in a swarm to accomplish a common goal. "This approach has
less chance of getting stuck than any other" when negotiating unfamiliar
obstacles, said AFRL's Lloyd Reshard. Thaler's current project, called
CSMARRT (Creative, Self-Learning, Multi-Sensory, Adaptive, Reconfigurable,
Robotics Toolbox), is a software package built for designing and modeling
virtual robots that can control any type of robot hardware and handle
locomotion, sensors, and intelligent behavior to execute a mission.
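The generate-and-filter mechanism described above can be sketched in a few lines. This is a toy illustration, not Thaler's actual code: the patterns, noise level, and threshold are all invented for the example.

```python
import random

# Toy sketch of the two-part scheme described above: a generator
# perturbs stored patterns with noise to jumble existing ideas into new
# forms, and a filter discards candidates too far from known patterns.

KNOWN_PATTERNS = [[0.0, 1.0, 0.5], [1.0, 0.0, 0.5]]  # assumed "knowledge"

def perturb(pattern, noise=0.5):
    """Jumble an existing idea into a new form by injecting noise."""
    return [x + random.uniform(-noise, noise) for x in pattern]

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def plausible(candidate, threshold=0.6):
    """Filter: keep only ideas that stay near some existing knowledge."""
    return min(distance(candidate, p) for p in KNOWN_PATTERNS) < threshold

def generate_ideas(n=100):
    candidates = (perturb(random.choice(KNOWN_PATTERNS)) for _ in range(n))
    return [c for c in candidates if plausible(c)]

novel = generate_ideas()  # the surviving "new ideas"
```

Tuning the noise level trades exploration against plausibility: too little noise reproduces the training data, too much produces candidates the filter rejects.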
Why Johnny Can't Code
Salon.com (09/14/06) Brin, David
The programmers responsible for engineering the advanced capabilities of
today's PCs cut their teeth on line-programming languages such as BASIC,
but the absence of such languages from modern PCs and classrooms removes an
important tool for getting young people interested in programming, writes
David Brin. He argues that BASIC is priceless for teaching programming
because it is easy to pick up, yet textbooks are starting to exclude the
language because of a perception throughout the computer industry that it
is obsolete. Though BASIC may be tedious and arduous to work with, the
language has an undeniable power as an inspiration for modern-day
programmers and their innovations. The lack of such languages'
availability to children today translates into a lack of technological
empowerment for tomorrow's coders, according to Brin. He laments that
computer industry powerhouses such as Apple and Microsoft, "For all of
their high-flown education initiatives (like the '$100 laptop')...seem bent
on providing information consumption devices, not tools that teach creative
thinking and technological mastery." Brin writes that many kids wish to
learn programming skills using fundamental instruments such as BASIC, but
with BASIC becoming unavailable, these kids are at a disadvantage that
could stifle creativity.
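The kind of exercise Brin has in mind is the canonical type-in program: a few lines a student can write, run, and immediately see working. A rough modern equivalent, rendered in Python since BASIC interpreters no longer ship with consumer machines:

```python
# Tabulate the first ten squares -- the classic first program a
# textbook once asked students to type in and run themselves.
table = [(n, n * n) for n in range(1, 11)]
for n, square in table:
    print(n, square)
```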
Is Supercomputing Going Hetero?
HPC Wire (09/15/06) Vol. 15, No. 37, Feldman, Michael
An increasing number of big-name vendors are betting on heterogeneous
architectures as the next major stage of high-performance computing. Cray
has perhaps placed the largest bet on heterogeneous architectures, having
staked its future on its Adaptive Computing vision, which imagines systems
composed of multiple types of processing engines. Many industry observers
are already looking ahead to the obsolescence of today's architectures with
multiple processors and cores. Eventually, the simple addition of more
processors to a system will yield no further performance gains.
Heterogeneous architectures
improve efficiency by allowing specialized processing engines to be paired
more precisely with different application codes. For certain types of
code, one specialized chip can do the work of 100 conventional processors.
The transition to heterogeneous architectures is likely to be more
challenging than the conversion from single-core to multicore designs,
raising challenges such as coupling the disparate processing engines and
determining the ratio of different kinds of processors. But the major
challenge will be scaling software to a heterogeneous architecture. The
software will have to be able to intelligently map application code to the
various available processor resources. The central question revolves
around how to design such software, according to Ken Kennedy, director of
the Center for Scalable Application Development Software. "How do you
build software tools that are scalable from a system with a single
homogeneous processor to a high-end computing platform with tens, or even
hundreds, of thousands of heterogeneous processors?" Kennedy asks.
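At its simplest, the mapping problem Kennedy describes is an assignment: give each code region to the engine that runs it best. The following is an illustrative sketch, not any vendor's scheduler, and the profiling numbers are invented.

```python
# Assumed profiling data: estimated seconds per engine for each kernel.
PROFILE = {
    "fft":    {"cpu": 10.0, "fpga": 0.5, "vector": 2.0},
    "parse":  {"cpu": 1.0,  "fpga": 8.0, "vector": 6.0},
    "matmul": {"cpu": 12.0, "fpga": 3.0, "vector": 1.5},
}

def map_kernels(profile):
    """Assign each application kernel to its fastest available engine."""
    return {kernel: min(times, key=times.get)
            for kernel, times in profile.items()}

assignment = map_kernels(PROFILE)
```

Real systems must also weigh data-movement costs between engines and contention for shared accelerators, which is what makes the software problem hard at scale.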
UA Scientists Probe 'Dark Web' to Uncover Potential
Terrorist Threats
KVOA 4 (Tucson, AZ) (09/12/06) McNamara, Tom
For the past four years, scientists at the University of Arizona have been
aiding U.S. government intelligence agencies in their efforts to make sense
of the terrorist-related information that is floating around on the Web.
As part of the Dark Web project, University of Arizona Eller College of
Management professor Dr. Hsinchun Chen and his colleagues have worked out
formulas and algorithms for measuring social interactions of terrorists
online, and the degree of hatred and violence that is expressed in their
communications. Dark Web is now the largest computer database on terrorist
Web sites and chat forums, with Chen adding that the number of terror Web
sites has grown from hundreds when he started the project to about 5,000.
Chen, who is currently tracking about 400 known terrorists, weeds out
information that is unlikely to be useful to government agents, but passes
along relevant information to intelligence agency experts to conduct
sophisticated analysis on his leads. "This could be like a 'myspace' for
the terrorist group, how they're interlinking with each other on the Web,"
Chen says of Dark Web. "It's on the Web, but you need more sophisticated
technology to understand this phenomena." Chen is also assisted by
advanced computer science students, and some of the students and his staff
have been hired by the CIA and other agencies.
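One standard link-analysis measure behind the kind of "how they're interlinking" question Chen raises is degree centrality. The Dark Web project's actual formulas are not described in the article, and the site names below are invented.

```python
from collections import defaultdict

# Hypothetical cross-links between forums/sites in a small web graph.
LINKS = [("siteA", "siteB"), ("siteA", "siteC"),
         ("siteB", "siteC"), ("siteA", "siteD")]

def degree_centrality(links):
    """Fraction of the other nodes each node links to (undirected)."""
    degree = defaultdict(int)
    for a, b in links:
        degree[a] += 1
        degree[b] += 1
    n = len(degree)
    return {node: d / (n - 1) for node, d in degree.items()}

central = degree_centrality(LINKS)  # siteA emerges as the hub
```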
Why We Need More Mathematicians, Scientists and Engineers
to Win the Global Economic Battle
The Hill (09/13/06) Landrieu, Mary
The United States must produce more mathematicians, scientists, and
engineers if the nation is to remain competitive globally, writes Sen. Mary
Landrieu (D-La.), a member of the Small Business and Entrepreneurship and
Appropriations committees. The future of American workers is at stake,
considering more jobs in the years to come will demand technical degrees,
and the country must prepare its population for the changing employment
market. Studies continue to show that U.S. students are lagging behind
other nations in math literacy, and that more young Americans are
expressing a lack of interest in pursuing engineering studies, according to
Landrieu. At the same time, countries such as South Korea are now
graduating as many engineers as the United States, and some analysts are
predicting more than 90 percent of scientists and engineers will live in
Asia in the next five years. Meanwhile, federal investment in basic
research has been on the decline, but the business community is starting to
respond by collaborating on programs such as Tapping America's Potential
(TAP), or launching their own initiatives, says Landrieu. Congress needs
to address the issue of American competitiveness now, and the Protecting
America's Competitive Edge (PACE) Act is an example of the kind of
foresight that U.S. leaders must have if the nation is to remain the
scientific leader. PACE is committed to establishing high schools that
specialize in math and science, strengthening training for grade school
educators who teach math and science, creating new fellowships and offering
tuition support, investing in programs and internships at national
laboratories, supporting independent research, and starting the Advanced
Research Projects Authority in the Energy Department.
U.S. Likely to Keep Control of Internet Name
System
Reuters (09/13/06) Rothstein, Joel
State Department Bureau of Economic Affairs ambassador David Gross, U.S.
coordinator of international communications and information policy since
2001, said Wednesday that the U.S. is likely to retain control over the
Internet domain naming system once a memorandum of understanding between
ICANN and the U.S. Department of Commerce expires at the end of the month,
despite international criticism. The statement follows a July public
hearing chaired by Commerce acting assistant secretary for communications
and information John Kneuer where overall pessimism over ICANN's readiness
to operate independently once the MoU expires was expressed. The Center
for Democracy & Technology's David McGuire says, "I don't think the U.S.
government will relinquish control of ICANN if there is a risk that the
process could get subsumed by a UN-type organization."
Techies Hot on Concept of 'Wisdom of Crowds,' But It Has
Some Pitfalls
USA Today (09/13/06) P. 4B; Maney, Kevin
The idea behind James Surowiecki's popular 2004 book, "The Wisdom of
Crowds," is that thousands or millions of people make better collective
decisions than individual experts. The theoretical foundation for
democracy, the idea is not a new one, but its implications are magnified
when applied to the Internet. "The Internet provides a mechanism to get
lots of diverse opinions and aggregate it in a quick and cost-effective
way," says Surowiecki. By extension, the theory holds that Wikipedia,
which is the product of tens of thousands of unpaid contributors, should be
a better encyclopedia than one written by experts. Likewise, Internet
mechanisms
such as Digg, which allows readers to vote stories to the front page,
should do a better job of finding the best stories than professional
editors. The problem that Digg ran into was that groups of savvy users
began conspiring to artificially boost the popularity of certain stories.
In response, Digg has adopted programs to undermine the effectiveness of
block voting, a move that has drawn the ire of its regular users. The
notion that the wisdom-of-crowds principle needs structure to be effective
was demonstrated when the U.K.'s Department for Environment, Food and Rural
Affairs
enlisted the public to help write environmental contracts in the form of a
wiki. Despite the shortcomings of the theory, there remains a high level
of enthusiasm for the wisdom-of-crowds philosophy. Google, for instance,
employs the principle when ranking search results, and the forecasts of the
Hollywood Stock Exchange site, where users buy "stocks" of movies and
stars, are far more accurate predictors of a movie's success than the
internal predictions of studios.
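The statistical intuition behind Surowiecki's claim can be simulated directly. The sketch below assumes the conditions the theory itself requires, independent and unbiased guessers, with an invented true value and noise level; under those assumptions the averaged guess beats the typical individual.

```python
import random

random.seed(42)                 # for reproducibility
TRUE_VALUE = 100.0              # the quantity being estimated
NOISE = 20.0                    # assumed per-person error (std dev)

# 10,000 independent, unbiased, noisy guesses.
crowd = [random.gauss(TRUE_VALUE, NOISE) for _ in range(10_000)]

# The crowd's collective answer vs. a typical individual's error.
crowd_error = abs(sum(crowd) / len(crowd) - TRUE_VALUE)
typical_error = sum(abs(g - TRUE_VALUE) for g in crowd) / len(crowd)
```

The Digg episode above shows what happens when the independence assumption breaks: coordinated voters are correlated, so their errors no longer cancel.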
Researchers Find a Bigger Prime Number
St. Louis Post-Dispatch (09/13/06) Kumar, Kavita
Researchers at Central Missouri State University have used a stable of 850
computers to find the world's largest prime number. With 9.8 million
digits, the number found by math and computer science professor Curtis
Cooper and chemistry professor Steven Boone tops their discovery last
December of a prime number with 9.15 million digits. "It's another great
discovery," said Richard Crandall, a Reed College professor who
developed the algorithm behind the software that the researchers are using.
"They are to be commended for their good luck," he added. The Electronic
Frontier Foundation is offering a $100,000 prize to anyone who can find a
prime number with 10 million digits. With only 850 computers dedicated to
the search for prime numbers, of which there are an infinite number, the
researchers would only be expected to produce a breakthrough finding
roughly once a decade, Crandall said. The software is available for free
and can run on anyone's computer. The program runs whenever the computers
are on, but it is a low priority so it does not interfere with the
computer's other operations. Each computer receives an untested number
from a server in San Diego. Each computer takes about 30 to 40 days to
test a number on the order of 9 million digits. Before Cooper and Boone
made their breakthrough last December, just eight out of the thousands of
people around the world running the software had come up with record prime
numbers. Some 44,000 groups throughout the world are using the software on
71,000 computers. While Cooper and Boone have clearly had luck on their
side, they also are the group with the largest number of computers, and
they have limited their search to numbers in the 9-million digit range,
while other groups chasing the prize money could be searching in the
10-million digit range, the researchers say.
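The record numbers in question are Mersenne primes of the form 2^p - 1, hunted by the distributed GIMPS software; its core primality check is the Lucas-Lehmer test, sketched below. A real run uses FFT-based multiplication to make each squaring of a multimillion-digit number tractable.

```python
def lucas_lehmer(p):
    """True iff the Mersenne number 2**p - 1 is prime, for prime p."""
    if p == 2:
        return True            # 2**2 - 1 = 3 is prime
    m = (1 << p) - 1           # the Mersenne candidate
    s = 4                      # s_0 = 4; s_k = s_{k-1}**2 - 2 (mod m)
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0              # prime iff s_{p-2} is 0 mod m

# 2**13 - 1 = 8191 is prime; 2**11 - 1 = 2047 = 23 * 89 is not.
```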
Microsoft Building Security Language for Grids
eWeek (09/13/06) Taft, Daryl K.
Microsoft is developing a new language to improve the security of grid
environments through features such as decentralized authorization policies,
according to the company's Blair Dillaway. The Security Policy Assertion
Language (SecPAL) is a product of an ongoing Microsoft initiative to
develop solutions for access control in large-scale grid environments. The
need for tight control over trust relationships and delegated access rights
has become more important than ever with the development of broad-based,
decentralized distributed computing. The SecPAL prototype mimics a
multidomain grid environment, incorporating existing Microsoft products and
industry standards such as XML. The need for a new language to express
security policies comes from the difficulty of describing the multitude of
entities and relationships in large-scale grid environments. In addition
to access control, SecPAL is also a tool "for expressing trust
relationships, authorization policies, delegation policies, identity and
attribute assertions, capability assertions, revocations, and audit
requirements," Dillaway said in a white paper. The language also lessens
the reconciliation requirements for disparate security technologies and the
need for semantic translation. SecPAL enables a grid user to temporarily
delegate a subset of access rights to another user who needs them for a
particular job while keeping the rest of the rights restricted. Dillaway
claims that SecPAL is more efficient and usable than existing technologies.
In the future, SecPAL could be applied to automated access delegation, job
management rights, and constrained trust management, Dillaway said.
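The delegation scenario Dillaway describes can be modeled minimally as follows. This is illustrative only, not SecPAL syntax or semantics; the principal names, rights, and time units are invented.

```python
from dataclasses import dataclass, field

@dataclass
class Principal:
    name: str
    rights: set = field(default_factory=set)     # rights held outright
    granted: dict = field(default_factory=dict)  # right -> (grantor, expiry)

def delegate(grantor, grantee, rights, now, ttl):
    """Hand grantee a subset of grantor's rights until now + ttl."""
    subset = rights & grantor.rights   # cannot delegate rights you lack
    for r in subset:
        grantee.granted[r] = (grantor.name, now + ttl)
    return subset

def can_access(principal, right, now):
    if right in principal.rights:
        return True
    grant = principal.granted.get(right)
    return grant is not None and now < grant[1]  # expiry enforced

alice = Principal("alice", {"read", "write", "submit-job"})
bob = Principal("bob")
delegate(alice, bob, {"submit-job"}, now=0, ttl=3600)
```

The point of a dedicated policy language is that such rules become declarative assertions that can be checked and audited, rather than ad hoc code like this.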
CSAIL Director Brooks to Step Down by 2007
The Tech (09/12/06) Vol. 126, No. 37, Kim, Jihye
MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) is
losing its director, Rodney Brooks, who wants to focus more on teaching and
research at the lab. Brooks plans to relinquish his duties by the end of
June 2007. He served as the director of the Artificial Intelligence
Laboratory for six years and as its associate director for four years,
before MIT merged the AI Lab with the Laboratory for Computer Science in
2003 to form CSAIL, which he has directed since its inception. Over the
past 10 years, CSAIL has been involved in a number of smaller collaborative
research projects with outside companies, and it currently has a joint lab
with Nokia that is focused on developing cell phone software and hardware.
Brooks says pursuing long-term projects is a challenge for the lab because
all of its funding comes from external sources. Brooks says he was also
focused on bringing more women to CSAIL, considering there is a higher
percentage of undergraduate women at MIT than at the lab. His research
plans include a theoretical project that would bring the adaptability of
biological systems to computing, and a more practical, long-term project to
design a cost-effective, personal robot worker that would be as easy to
operate as a personal computer.
Higgins Lays Out Roadmap for Open Source Identity
Project
Network World (09/14/06) Fontana, John
IBM, Novell, and academic researchers have joined forces on an open-source
project that aims to integrate applications and identity systems. The
Higgins project, a framework with interface and middleware components,
seeks to integrate identity, profile, and relationship data from multiple
systems. The Higgins project framework will support applications with a
front-end that is based on a browser, rich client, or Web services. The
Higgins researchers hope to release the Identity Attribute Service
middleware that sits on top of identity repositories. In an attempt to
avoid having to move data around the network, the middleware continuously
aggregates data from multiple sources, combining them into one identity
credential. "It is very important for Higgins to enhance privacy," said
Parity Communications CEO and project lead Paul Trevithick. "We will
segregate information into distinct contexts." The Higgins group also
plans to develop an open-source Security Token Service to run on clients
and servers and facilitate the exchange of security tokens. The project is
also developing a user interface component, called I-Card, that displays a
list of digital identity cards for authentication and other purposes.
I-Cards will have read and write capabilities so that new information can
be supplied by technologies such as RSS. For the initial reference, the
researchers plan to develop Java binding and implementation, and some core
components will use the C programming language. In its enabling
components, the framework will support PHP, Python, and Ruby.
RFID Security Consortium Receives $1.1 Million NSF
Grant
RFID Journal (09/08/06) O'Connor, Mary Catherine
The NSF has issued a $1.1 million grant to the RFID Consortium for
Security and Privacy (CUSP) to explore the security and privacy
implications of RFID technology. CUSP is made up of academics and
representatives from private industry who will work together to examine
the ways that RFID technology can affect consumer privacy and security, as
well as potential deployment options that are safe for both customers and
corporations. The CUSP researchers will also attempt to develop
cryptographic protocols and partner with standards groups to improve the
quality of data-protection tools. "Our plan is to look at ongoing [RFID]
deployments and how to make them strong in respect to privacy and
authentication," said Kevin Fu, assistant computer science professor at the
University of Massachusetts and the leader of the consortium. Any security
tools that the group develops will be open source, Fu added. UMass and The
Johns Hopkins University will be the two academic institutions hosting the
research. RSA Laboratories, which has been researching security risks in
RFID payment and identification systems, will also be an integral part of
the project. While RFID technology can be used for security purposes such
as key fobs and contactless smart cards, the tags that are currently
deployed are insufficiently protected, according to RSA's Ari Juels.
Adding cryptography to tags will not be easy, however, particularly with
passive tags that only have a small amount of processing power. RSA and
California's Bay Area Rapid Transit (BART), which is interested in
improving the security of smart cards, are currently the only two members
of the consortium's advisory committee.
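One commonly proposed way to add the cryptography Juels mentions is symmetric challenge-response, modeled below with HMAC-SHA256. This is a hedged sketch: real passive tags have far less processing power and would need much lighter primitives, and the key is invented.

```python
import hashlib
import hmac
import os

SHARED_KEY = b"per-tag-secret"      # provisioned into both tag and reader

def tag_respond(key, challenge):
    """Tag proves knowledge of the key without ever transmitting it."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def reader_verify(key, challenge, response):
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)          # fresh nonce defeats replay attacks
response = tag_respond(SHARED_KEY, challenge)
```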
Where to Find the Freshest Air in Town
New Scientist (09/09/06) Vol. 191, No. 2568, P. 26; Reilly, Michael
Inexpensive sensors have become a means for simple and cheap pollution
monitoring projects ranging from volunteers riding around London on bikes
wearing specially equipped backpacks, to smoke detectors modified by
University of California researchers to detect ultra-fine particles, to the
mapping of pollution in lower Manhattan through handheld devices. "We were
going for simplicity and ease of use," notes digital media artist Brooke
Singer, co-creator of the third effort, known as the Area's Immediate
Reading (AIR) project. "This will help ordinary people participate in the
conversation about air quality issues--a conversation they don't usually
have access to." Next month, new-media artist Shannon Spanhake will launch
through San Francisco State University a project to monitor pollution in
San Francisco using a network of hundreds of volunteers equipped with cell
phone-sized devices outfitted with sensors. Spanhake's intention is to
teach students to construct the devices themselves and enable the network
using the city's planned municipal Wi-Fi infrastructure. Maintaining data
quality can become challenging because low-cost sensors allow collection of
so much information. Researchers aim to tackle this challenge by improving
their data-processing software. Low-cost sensors are envisioned as tools
for environmental science initiatives in addition to pollution monitoring
and public health studies.
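A sliding median filter is one common first pass at the data-quality problem the article raises, suppressing the transient spikes cheap sensors produce. This is an assumed approach for illustration, not any of these projects' actual code; the readings are invented.

```python
from statistics import median

def median_filter(readings, window=3):
    """Replace each reading with the median of its local neighborhood."""
    half = window // 2
    cleaned = []
    for i in range(len(readings)):
        lo, hi = max(0, i - half), min(len(readings), i + half + 1)
        cleaned.append(median(readings[lo:hi]))
    return cleaned

raw = [41, 40, 42, 180, 41, 39, 40]   # 180 is a transient glitch
clean = median_filter(raw)            # the spike is suppressed
```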
Will Your Vote Count?
CIO Insight (08/06) No. 71, P. 43; D'Agostino, Debra
Many problems with electronic voting systems persist six years after the
2000 presidential election illustrated the need for voting modernization,
and the government faces a tough challenge in improving confidence levels
in e-voting. Among the factors that have shaken people's faith in
e-voting's reliability is the miscounting or deletion of votes due to
malfunction; the potential of voter fraud because of insufficient security
measures; and error-rife statewide registered-voter databases. Johns
Hopkins University computer science professor Avi Rubin says, "The problem
is that technology makes it easier to manipulate elections in an invisible
way. Because the systems are less transparent, the attacks can scale."
But perhaps the most damaging contributor is a widespread feeling among
U.S. voters that the electoral process is broken. Experts say a voting
system that is truly fair and accurate is not an impossibility if certain
precautions are taken, most notably a voter-verifiable paper trail, random
post-election audits, parallel testing of systems on election day, a
prohibition on wireless capabilities, and stringent compliance with
detailed chain-of-custody procedures. There is disagreement among states
regarding which steps are actually needed. Rep. Rush Holt (D-N.J.) is
supporting federal legislation that would make all steps mandatory, but
whether such measures fly or fall may depend on American taxpayers'
willingness to foot the bill. Holt says, "I suspect there are many
thousands--maybe even millions--of Americans who don't believe the results
of some recent election or other. We have to do everything we can to
restore confidence in the mechanism of democracy." Carnegie Mellon
University's Michael Shamos argues that there is little money left for
additional voting systems security, since the bulk of the Help America Vote
Act's funding has been spent already.
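A random post-election audit of the kind the experts recommend hinges on a draw that any observer can reproduce. The sketch below assumes a seed announced publicly (e.g., from dice rolled at an open meeting); the precinct names and seed are invented.

```python
import random

def select_audit_precincts(precincts, sample_size, public_seed):
    """Draw the precincts whose paper trail will be hand-counted."""
    rng = random.Random(public_seed)   # reproducible by any observer
    return sorted(rng.sample(precincts, sample_size))

precincts = [f"P{i:03d}" for i in range(1, 101)]
audited = select_audit_precincts(precincts, 5, public_seed=20061107)
```

Because the seed is public, candidates and watchdog groups can rerun the selection themselves and confirm that officials audited the right precincts.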
Bursting Tech Bubbles Before They Balloon
IEEE Spectrum (09/06) Vol. 43, No. 9, P. 50; Gorbis, Marina; Pescovitz,
David
Projections of the trajectory of science and technology over the next 10
to 50 years through the examination of key trends point to the emergence of
certain technologies and the absence of others, according to a poll of over
700 IEEE Fellows jointly conducted by IEEE Spectrum and the Institute for
the Future (IFTF). The Fellows agree that bandwidth and computation will
continue to increase, enabling such innovations as faster and more accurate
modeling of complex systems, almost flawless handwriting recognition,
unstructured speech recognition, automatic real-time language processing,
better climate simulation, and advanced interactive computer graphics. A
transition from massive, centralized infrastructure networks to
lightweight, scalable, and modular grids through the emergence of new
materials and information technologies is anticipated in the next five
decades; among the technologies expected to play important roles are Wi-Fi,
Voice over Internet Protocol (VoIP), software-defined radio, distributed
power systems, and alternative energy sources. Ninety-five percent of the
respondents agree that the proliferation of radio frequency identification
(RFID) devices is probable as tiny sensors are increasingly embedded in
everyday objects and locations. Insect-sized microbots that can be used
for search-and-rescue operations and the application of MEMS to internal
medicine are expected over the next few decades by many Fellows, but there
is little agreement that supersmall robots that function inside the human
body will become a reality because of coordination, communication, and
control issues. The affordability and availability of synthetic biology
technology such as personal genetic profiles and cheap DNA synthesis is
given a one- to two-decade window, although there is heavy skepticism that
implantable brain-machine interfaces will be widely embraced. The results
of the survey appear to validate former IFTF President Roy Amara's
assertion that "We tend to overestimate the impact of a technology in the
short run and underestimate it in the long run."