E-Voting Systems Vulnerable to Viruses and Other Security
Attacks, New Report Finds
UC Berkeley News (08/02/07) Yang, Sarah
The source code in electronic voting machines contains security holes that
leave them vulnerable to attack, conclude University of California,
Berkeley researchers in a new report. The source code report was part of
California Secretary of State Debra Bowen's "top-to-bottom review" of
electronic voting machines. The researchers, led by UC Berkeley associate
professor of computer science David Wagner, said that many of the security
problems found were similar across the three systems examined, which
include machines from Diebold Election Systems, Sequoia Voting Systems, and
Hart InterCivic. "The most severe problem we found was the potential for
viruses to be introduced into a machine and spread throughout the voting
system," Wagner says. "In the worst-case scenario, these malicious codes
could be used to compromise the votes recorded on the machines' memory
cards or render the machines non-functional on election day." The
vulnerabilities on the machines could allow a virus on one machine to
infect an entire county's system when votes are uploaded to a central
computer to be counted. Wagner says the flaws found would allow an
attacker to defeat any technological countermeasures in the software.
"Unfortunately, these vulnerabilities are not trivial implementation bugs
that can be patched up," Wagner says. "The software just wasn't designed
with fundamental safeguards in place to make them resilient to intrusion."
The researchers also found flaws that could jeopardize voting anonymously
in two of the systems. Bowen is expected to make a decision regarding the
certification of the machines on Aug. 3, six months before the state's
primary election. For information about ACM's many e-voting activities,
visit
http://www.acm.org/usacm
Congress Eyes R&D Spending to Counter Offshoring of
Jobs
Computerworld (08/01/07) Thibodeau, Patrick
The two houses of Congress reached an agreement that will allocate $43
billion over the next three years to promote basic scientific research.
The agreement establishes the Advanced Research Projects Agency for Energy
(ARPA-E), modeled after the Defense Advanced Research Projects Agency and
with an initial budget of $300 million for the 2008 fiscal year. ARPA-E
will be part of the U.S. Department of Energy and is intended to fund
high-risk, high-reward technology. Rep. Bart Gordon (D-Tenn.), chair of
the House Science and Technology Committee, says, "If we're really going to
become energy independent, it's going to take a bump in technology, so this
may be the most important energy bill that we will pass." Funding for
ARPA-E is part of the America COMPETES Act, which allocates $17 billion to
the Energy Department and $22 billion to the National Science Foundation
over the next two years, with the objective of doubling the NSF's budget
over the next seven years. The legislation seeks to ensure the federal
government continues to fund technology research considered too risky for
venture capital. The America COMPETES Act also calls for the creation of
the Technology Innovation Program (TIP), a successor to the Advanced
Technology Program, that would receive $100 million next year, with the
funding increasing to $131.5 million and $140.5 million over the next two
years. TIP would provide funding for current projects and set aside $40
million per year for new projects, reserving its resources for small and
midsize companies.
New Report Finds States Not Doing Enough to Ensure
Accurate Count on Electronic Voting Machines
Brennan Center for Justice (NYU School of Law) (08/01/07) Rosen, Jonathan
The majority of states using electronic voting machines do not have
adequate security measures and are not equipped to find sophisticated and
targeted software-based attacks, non-systemic programming errors, or
software bugs that could alter an election's results, concludes a report
from the Brennan Center for Justice at NYU's School of Law and the
Samuelson Law, Technology and Public Policy Clinic at the University of
California, Berkeley's Boalt Hall School of Law. The report, "Post
Election Audits: Restoring Trust in Elections," says that more focused and
rigorous audits of paper records can improve the integrity of election
results. "No matter how long we study machines, we're never going to think
of every attack or find every bug. We can try to close up every hole we
find, but ultimately using paper records to check electronic tallies is the
only way we can trust these machines," says lead author of the report
Lawrence Norden, who is also head of the Brennan Center's Voting Technology
Assessment Project. To emphasize the importance of auditing, the Brennan
Center released data compiled by Common Cause that highlighted instances of
machine malfunctions altering vote tallies in 30 states. Common Cause
director of Voting Integrity Programs Susannah Goodman says, "We need
systemic, mandatory audits to ensure that voters choose candidates, not
software bugs or programming errors." The report found that of the 38
states that require or use voter-verifiable paper records, 23 do not
require audits after every election, and of the ones that do, none use
audit methods that would maximize the chances of finding targeted
software-based attacks, programming errors, or software bugs that would
affect the outcome of the election.
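A post-election audit of the kind the report advocates amounts to
hand-counting the paper records from a random sample of precincts and
comparing them with the electronic tallies. The Python sketch below shows
only that basic shape; the sample-size rule, precinct names, and vote
counts are invented for illustration and are not the report's recommended
audit models.

import random

# Randomly sample precincts, then report any whose hand count of paper
# records disagrees with the machine count. Illustrative only.
def audit(electronic_tallies, paper_tallies, sample_fraction=0.05, seed=None):
    rng = random.Random(seed)
    precincts = list(electronic_tallies)
    sample_size = max(1, round(sample_fraction * len(precincts)))
    sampled = rng.sample(precincts, sample_size)
    return [p for p in sampled if electronic_tallies[p] != paper_tallies[p]]

# Toy data: per-precinct vote counts for one contest.
machine = {f"precinct-{i}": {"A": 400 + i, "B": 380} for i in range(100)}
paper = {k: dict(v) for k, v in machine.items()}
paper["precinct-17"]["A"] -= 25   # simulate one precinct whose paper trail differs

discrepancies = audit(machine, paper, sample_fraction=0.10, seed=1)
print(discrepancies or "no discrepancies found in the sample")

As the report observes, the sampling method matters: a small fixed-percentage
sample like the one above can easily miss a problem confined to a handful of
precincts, which is why the authors call for more rigorous audit designs.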
Cutting-Edge Laser Technology Transforms Classic Video
Games to Giant Size at SIGGRAPH Technology Conference in San Diego
Business Wire (08/02/07)
ACM's SIGGRAPH conference will use a unique video game event to lead into
the Computer Animation Festival each night. The audience will be able to
watch celebrity players use a state-of-the-art laser projection system to
play Asteroids, Tempest, Star Wars, and several other classic arcade games
on an enormous projection screen. The laser technology will show real-time
vector graphics in vibrant, non-pixelated color. Jim Blinn, a computer
scientist who has worked at NASA's Jet Propulsion Laboratory, and Glen
Entis, senior vice president and chief visual and technical officer at
Electronic Arts, will be among the celebrity players. "Playing these
classic games like they've never been seen before is the perfect nod to the
early days of the video games industry as well as to the early days of
computer graphics," says Paul Debevec, chair of the Computer Animation
Festival, which is scheduled for Aug. 6-8, 2007, at the San Diego Civic
Center. "With our festival showcasing the most groundbreaking computer
animations from around the globe, including a record number of pieces from
the video game industry, it's a thrill to be able to start the show with
faithful, larger-than-life versions of the games that helped attract so
many of the SIGGRAPH audience to the field of computer graphics."
Sharing a Joke Could Help Man and Robot Interact
New Scientist (08/01/07) Reilly, Michael
At last week's American Association for Artificial Intelligence conference
in Vancouver, Canada, University of Cincinnati artificial intelligence
researcher Julia Taylor demonstrated a computer program that can recognize
when someone is joking. Taylor teamed with UC AI researcher
Lawrence Mazlack to create the bot, which makes use of a database of words,
knows how words can be used in different ways to create new meanings, and
can determine the likely meaning of new sentences. Robots will need to
determine whether someone has said something that was meant to be funny if
humans are to accept them as companions or helpers. Taylor and Mazlack
developed the bot to recognize jokes that turn on a simple pun, and they
are now working to personalize its sense of humor so it can take the
experiences of people into consideration when assessing whether their words
were meant to be funny. "If you've been in a car accident, you probably
won't find a joke about a car accident funny," Taylor explains. Meanwhile,
Rada Mihalcea is working with other experts at the University of North
Texas in Denton on a bot that is able to determine humor through the
frequency of certain words that are used in jokes.
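The word-frequency approach can be illustrated with a small Python sketch:
score a sentence by how much its words are over-represented in a corpus of
jokes relative to ordinary text. The two toy "corpora" and the scoring rule
below are stand-ins invented for illustration; they are not the North Texas
group's system.

import math
from collections import Counter

# Toy stand-ins for a labeled corpus of one-liners and ordinary sentences.
jokes = [
    "why did the chicken cross the road to get to the other side",
    "i used to be a banker but i lost interest",
]
non_jokes = [
    "the committee will meet on tuesday to review the budget",
    "interest rates rose slightly in the second quarter",
]

def word_counts(sentences):
    counts = Counter()
    for s in sentences:
        counts.update(s.split())
    return counts

joke_counts, plain_counts = word_counts(jokes), word_counts(non_jokes)
vocab = set(joke_counts) | set(plain_counts)

def humor_score(sentence):
    """Sum of per-word log-likelihood ratios (joke vs. non-joke), add-one smoothed."""
    joke_total, plain_total = sum(joke_counts.values()), sum(plain_counts.values())
    score = 0.0
    for w in sentence.lower().split():
        p_joke = (joke_counts[w] + 1) / (joke_total + len(vocab) + 1)
        p_plain = (plain_counts[w] + 1) / (plain_total + len(vocab) + 1)
        score += math.log(p_joke / p_plain)
    return score

print(humor_score("i lost interest in the bank"))      # positive with this toy data: leans joke
print(humor_score("the budget review is on tuesday"))  # negative with this toy data: leans not-a-joke

A real system would train on thousands of labeled examples and combine word
statistics with other cues, but the basic signal is the same: some words
simply show up far more often in jokes than in ordinary prose.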
Could Tiny Sensors Detect Bridge Crises?
Associated Press (08/03/07) Mygatt, Matt
Los Alamos National Laboratory scientists, working with the University of
California at San Diego, are developing a network of sensors that could
detect the early warning signs of structural failure in bridges. The small
sensors, about the size of a credit card, would be placed on bridges to give
enough warning to close a bridge or perform preventive maintenance before a
serious failure occurs. "The idea is to put arrays of
sensors on structures, such as bridges, and look for the changes of
patterns of signals coming out of those sensors that would give an
indication of damage forming and if it is propagating," says laboratory
civil engineer Chuck Farrar. The sensors might be powered by microwaves or
the sun, and would use radiotelemetry to send data to a computer for
analysis. The sensors would be monitoring for electrical charges caused by
stress on materials such as steel-reinforced concrete. It will probably be
several more years before the sensors are commercially available, Farrar
said. The researchers are currently trying to build in microprocessors and
wireless telemetry systems so the sensors can work as standalone monitoring
devices. Another bridge sensor project is being conducted at the
University of Michigan and Stanford University, and is experimenting with
using a remote-control helicopter to send a pulse to the sensors to provide
power and to take a reading. Drexel University is also researching bridge
monitoring. There is still significant work to be done on the projects,
and cooperation between civil engineers, electrical engineers, and computer
scientists is needed to bring the technology together. "The hardest part
is getting data from damaged structures to use in the study," Farrar says.
"Nobody wants to give you a very expensive bridge to just test a data
integration algorithm."
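The "changes of patterns of signals" idea Farrar describes can be sketched
in a few lines of Python: summarize each sensor record with simple
damage-sensitive features and flag the sensor when those features drift too
far from a healthy baseline. The features, threshold, and synthetic signals
below are illustrative assumptions, not the laboratory's actual algorithms.

import numpy as np

def features(signal, sample_rate):
    """Return (RMS amplitude, dominant frequency in Hz) for one sensor record."""
    rms = np.sqrt(np.mean(signal ** 2))
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return rms, freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin

def flag_damage(baseline, current, sample_rate, tolerance=0.10):
    """Flag if RMS or dominant frequency shifts by more than `tolerance` (fractional)."""
    b_rms, b_freq = features(baseline, sample_rate)
    c_rms, c_freq = features(current, sample_rate)
    return (abs(c_rms - b_rms) / b_rms > tolerance or
            abs(c_freq - b_freq) / b_freq > tolerance)

# Synthetic example: a healthy 12 Hz vibration mode vs. one that has shifted to 10 Hz.
rate = 200
t = np.arange(0, 10, 1 / rate)
healthy = np.sin(2 * np.pi * 12 * t) + 0.1 * np.random.randn(t.size)
damaged = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(t.size)
print(flag_damage(healthy, healthy, rate))  # False: no change from baseline
print(flag_damage(healthy, damaged, rate))  # True: dominant frequency shifted by more than 10%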
Halt 'High Risk' E-Voting: British Watchdog
Reuters (08/02/07) Griffiths, Peter
Britain's election watchdog says Internet voting trials are too risky to
continue, and that Britain was fortunate not to have had a security breach
during a pilot in May. In a new report, the Electoral Commission calls for
a halt to e-voting trials until the government comes up with a plan for
testing, securing, and assuring the quality of the voting strategy. "We
have learned a good deal from pilots over the past few years," says Peter
Wardle, chief executive of the Electoral Commission. "But we do not see
any merit in continuing with small-scale, piecemeal piloting where similar
innovations are explored each year without sufficient planning and
implementation time." In addition to concerns about fraud, transparency,
and public trust, the watchdog also says e-voting is costly. The Electoral
Commission cites a number of problems during the e-voting pilot in local
elections earlier in the year, including people forgetting the Internet
password needed to vote, and others believing they could vote over the
telephone.
Clarke Wants to Know, Where Did We Go Wrong?
Government Computer News (08/01/07) Jackson, William
Former U.S. counterterrorism czar Richard A. Clarke says the United States
lost its way sometime after the release of the National Strategy to Secure
Cyberspace in 2003. "I'd like to know why it was that we lost momentum in
solving the problem in more than a piecemeal manner," says Clarke, who gave
the opening keynote speech at the Black Hat Briefings. "There is no
leadership. There is no national plan implemented." Clarke says the
nation's industry, commerce, health care, and national defense are growing
increasingly dependent on an information infrastructure that cannot be
defended. Clarke says there was once a high level of awareness that there
was a problem, but that since then little progress has been made and some
has even been lost. Clarke believes the government has failed to serve
as the role model it was supposed to be, and the situation will probably
get worse before it gets better as federal funding for security R&D has
been reduced. The problem is a lack of congressional and presidential
leadership, Clarke says, compounded by a lack of executive initiative from
the private sector. Clarke believes that without government leadership,
corporations will not put forth the effort necessary for significant
improvement unless threatened by some imminent catastrophe. Clarke says
what is needed are more and better encryption practices, a secure Domain
Name System, service providers that filter out malware before it reaches
the local-area network and the end user, and a parallel network to provide
emergency services that uses IPv6 to prioritize traffic. Some progress has
been made: some companies have reduced the vulnerabilities in their
software, and IPv6 adoption has been slowly advancing.
IDC: Patents Inhibit Open Source Adoption
InternetNews.com (07/31/07) Kerner, Sean Michael
The potential for copyright and patent infringement is the primary
inhibitor for organizations considering adopting more open source software,
followed closely by the lack of available support, concludes an IDC survey
of IT end users. IDC's Matthew Lawton says the survey found that open source
appeals to IT users because of its low cost, low total cost of ownership, and
product functionality. Users were found to be most interested in product
functionality, scalability, and reliability while access to source code and
the ability to modify and redistribute source code were less important.
"The key take away is that end users care about what software does and how
well it does it, not how it's developed or distributed," says Lawton. As
for what open source software will look like in the future, Lawton believes
it will basically stay on the traditional path with infrastructure software
such as Linux and database programs such as MySQL. "Five years from now,
infrastructure software will still represent the majority of open source
software that is adopted, but we think the profile will go down in favor of
development, reporting tools, and application software," Lawton says.
"That will take a number of years to happen, though."
Researchers Set to Spark Up New More Secure Network,
Routers, Switches
Network World (07/31/07)
Stanford University researchers this summer are deploying and testing an
updated version of Ethane, an architecture for corporate networks that
provides a powerful and simple management model with strong security. Most
current corporate networks allow for open communication automatically,
which makes implementing effective security and privacy rules difficult.
Ethane establishes a set of simple-to-define access policies, all maintained
in one place, that are consistently applied across the network datapath and
ensure that users, switches, and end-hosts do not receive more information than
needed. A preliminary version of Ethane was built and deployed in the fall
of 2006. The new version of Ethane reportedly has better policy language
support and a more feature-rich datapath that can support more diverse
techniques such as NAC, MAC hiding, and end-to-end L2 isolation. Ethane
works because all complex features, including routing, naming, policy
declaration, and security checks, are performed by a central controller
instead of in the switches as is the common practice. All movement on the
network must first get permission from the controller, which verifies that
the communication is allowed under network policy. If the flow is allowed,
the controller determines a route for the flow, and adds an entry for that
flow in each of the switches along the path. Stanford researchers say
their Ethane project, which is funded by Stanford's Clean Slate Project,
closely complements multiple projects at the National Science Foundation,
including the Global Environment for Network Innovations project.
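The controller-centric pattern described above can be sketched in Python:
switches refer the first packet of each new flow to a central controller,
which checks the policy, computes a route, and installs per-switch
forwarding entries. The class, policy format, and topology model below are
illustrative assumptions rather than Ethane's actual implementation.

from collections import deque

class CentralController:
    def __init__(self, topology, policy):
        self.topology = topology          # switch -> set of neighbor switches
        self.policy = policy              # allowed (src_host, dst_host, service) tuples
        self.flow_tables = {sw: [] for sw in topology}  # per-switch installed entries

    def handle_new_flow(self, src, dst, service, src_switch, dst_switch):
        """Called when a switch sees a flow with no matching entry."""
        if (src, dst, service) not in self.policy:
            return None                   # deny: no entries installed, flow is dropped
        path = self._shortest_path(src_switch, dst_switch)
        if path is None:
            return None
        for switch in path:               # install a forwarding entry along the path
            self.flow_tables[switch].append((src, dst, service))
        return path

    def _shortest_path(self, start, goal):
        """Breadth-first search over the switch topology."""
        queue, seen = deque([[start]]), {start}
        while queue:
            path = queue.popleft()
            if path[-1] == goal:
                return path
            for nxt in self.topology[path[-1]] - seen:
                seen.add(nxt)
                queue.append(path + [nxt])
        return None

# Example: only HTTP from host A to host B is permitted by the policy.
topo = {"s1": {"s2"}, "s2": {"s1", "s3"}, "s3": {"s2"}}
ctrl = CentralController(topo, policy={("A", "B", "http")})
print(ctrl.handle_new_flow("A", "B", "http", "s1", "s3"))  # ['s1', 's2', 's3']
print(ctrl.handle_new_flow("A", "B", "ssh", "s1", "s3"))   # None: not allowed by policy

Because the policy lives in one place and every flow is admitted explicitly,
a flow that is not covered by a rule simply never gets a path, which is the
default-deny behavior the Stanford researchers describe.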
How Do You Build a New Internet?
Guardian Unlimited (UK) (08/01/07) Johnson, Bobbie
Researchers in the United States are asking for at least $350 million to
build the Global Environment for Network Innovations (GENI), a
next-generation research project to create a more secure and safer
replacement for today's Internet. Similar projects are also being
conducted in Europe as part of the European Union's Future Internet Research
and Experimentation (FIRE) program. No matter what replaces the current Internet,
there is a general agreement that more needs to be done about Internet
security and control. GENI supporters hope to create a solution in 10 to
15 years. Many potential solutions focus on "mesh networks" that link
multiple computers to create more powerful and reliable connections to the
Internet. In a mesh network, multiple computers connect to the Internet
through a single shared pipeline rather than separate parallel connections,
creating a more intelligent system that is less vulnerable to
attack. Rutgers University professor Dipankar Raychaudhuri has been
working on an alternative system, but says that it is a difficult task.
"People keep trying to evolve the network, but it hasn't really changed in
20 years," Raychaudhuri says. "Once you've built something as large and
complex as the Internet it is difficult to start over again." Another
solution is to organize the information differently. Instead of spreading
small pieces of the network over hundreds of millions of computers, systems
could be developed that keep a local copy of the Internet. Web users would
spend most of their time in a self-contained system instead of spread out
over the globe, making them less vulnerable to hackers and attacks.
The New Face of Identity Protection: You
University of Houston News (07/30/07) Holdsworth, Ann
University of Houston professor Ioannis Kakadiaris and researchers at the
school's Computational Biomedicine Lab have developed facial recognition
software that can be used for a variety of purposes, from securing
government facilities to making credit card purchases. The software,
called URxD, uses a three-dimensional image of a person's face to create a
biometric identifier. The Face Recognition Vendor Test, conducted by the
National Institute of Standards and Technology, found the URxD system to be
the best 3D face recognition system for examining face shape. Kakadiaris
says URxD's accuracy stems from the strength of the variables the system
uses to examine and describe a person's face, and that it would make an
excellent replacement for having to remember multiple passwords and PINs.
"Remembering dozens of personal identification numbers and passwords is not
the solution to identity theft," Kakadiaris says. "The solution is to be
able to tie your private information to your person in a way that cannot be
compromised." says Kakadiaris. Kakadiaris believes URxD will have a
positive impact on several of today's biggest issues and that someday
computers will be able to recognize the user sitting in front of them.
"Everything will be both easier and more secure, from online purchases to
parental control of what Web sites your children can visit," Kakadiaris
says.
What Will Next-Generation Multicore Apps Look
Like?
ZDNet (07/30/07) Foley, Mary Jo
Computing hardware is becoming increasingly ready to make the switch to
multicore processing, but software developers are still struggling with
ways to make their software work on multicore processors. "The world is
going to move more and more away from one CPU that is multiplexed to do
everything to many CPUs, and perhaps specialty CPUs," says Craig Mundie,
Microsoft's chief research and strategy officer. "This is not the world
that the programmers target today." He says such complex programming is
typically only done by those who write core operating systems or for
supercomputing. Mundie believes the whole concept of the application will
have to be restructured to take full advantage of more powerful, multicore
processors. Mundie says that although the microprocessor and hardware
systems on the whole have grown in capability, the fundamental concept of
the application has not changed. Multicore applications will be more
asynchronous, loosely coupled, concurrent, composable, decentralized,
resiliently designed, and personal, Mundie says. "We're moving to an era
where IT will make a lot of things more personal," he says. One of the big
changes will be that computers "become more humanistic," Mundie says.
"Your ability to interact with the machine more the way you would interact
with other people will accrue from this big increase in power."
How Far Could Cyber War Go?
Network World (07/26/07) Kabay, M.E.
Writing in the Winter 2001/2002 issue of NATO Review, former CERT senior
analyst Timothy Shimeall, former NATO fellow and University of Pittsburgh
professor Phil Williams, and former CERT intelligence analyst Casey
Dunleavy establish three distinct levels of cyber war and argue that
defense planning needs to account for the virtual world to minimize damage
in the real world. The first level of cyber war, described as "cyber war
as an adjunct to military operations," is intended to achieve information
superiority or dominance in the battle space and would include physical or
cyber attacks directed at military cyber targets with the objective of
interfering with C4I, or command, control, communications, computers, and
intelligence. The second level, limited cyber war, would attack cybernetic
targets with few real-world modalities but very real consequences by
launching malware, denial-of-service, and data distortion attacks. The
authors consider the third level, called unrestricted cyber war, to be the
most serious and possibly the most likely type of cyber war to occur.
Unrestricted cyber war attacks both military and civilian targets and
deliberately tries to create mayhem and destruction. Targets may include
any part of any critical infrastructure. The attacks could result in
physical damage, including injuries and deaths among civilians. The
authors suggest that improvements need to be made in anticipation and
assessment abilities, preventative and deterrent measures, defensive
capabilities, and damage mitigation and reconstruction measures.
NJ Town Planning Beach of the Future
Associated Press (07/25/07) Parry, Wayne
Ocean City, N.J., is planning a high-tech makeover that involves the
deployment of Wi-Fi and radio-frequency identification tags to deliver
Internet access and public services. Among the technologies to be
implemented are wristbands worn by visitors that debit their credit cards
or bank accounts to pay for beach access or other services, and
solar-powered, sensor-outfitted garbage cans that automatically email
cleanup crews when they are full. The technology would also make it
possible to link wristbands to each other, which, for example, would enable
parents to know where their children are. Sensors stationed throughout the
area could also send text messages to users' cell phones. Jonathan Baltuch
of Marketing Resources reports that Ocean City's mostly obstruction-free
beach should be very amenable to the undisrupted transmission of wireless
signals. The city would own the wireless network, which would allow
officials to know precisely how many beach visitors are present at one
time. Baltuch reckons that the network could produce $14 million in
revenue for Ocean City over the first five years of its operation. The
MuniWireless Web site estimates that close to 20 coastal municipalities
employ wireless Internet systems across the United States.
The Macleans.ca Interview: Jonathan Schaeffer
Maclean's (07/20/07) Lunau, Kate
In 1989, University of Alberta computer science professor Jonathan
Schaeffer wrote a program called Chinook that, after 18 years of
calculations over 500 billion billion possible positions, "solved" the game
of checkers by determining that a perfectly played match would always end
in a draw. In a recent interview, Schaeffer says the most significant
aspect of his research is that the solution is one million times larger
than any problem previously solved optimally. "This is like a quantum
leap forward in the size of problems that people have been able to solve,"
says Schaeffer. As for checkers, Schaeffer does not expect the fact that
there is a solution to lessen the popularity of the game. He notes that
people who enjoy playing the game can test themselves against a beatable
version of Chinook that is available on the university's Web site.
Schaeffer says that now that one of his long-term projects has been solved,
he will be paying more attention to another long-term project focused on
poker. Schaeffer started a poker playing program in 1991 that is still
under development. Schaeffer says it took until 2002 to develop a strong
poker program, but now the program is capable of competing against
professional human players, as it did at the recent first Man-Machine World
Poker Championship. While the human players are still superior, Schaeffer
expects that his software will one day win the Man-Machine World Poker
Championship.
Battle for the Future of the Net
Business Week (07/25/07) Schenker, Jennifer L.
In an effort to avoid falling behind the United States again, European
companies and policymakers are making a huge investment in Semantic Web
research. European researchers have seen some of their best ideas,
including the router and MP3 music compression, capitalized on by American
companies after European companies thought they were not worthwhile
investments, and European businesses and governments are anxious not to
make the same mistakes again. The Semantic Web goes beyond the relatively
static exchange of information on the Internet by adding more in-depth
media and support for massive amounts of unstructured data, essentially
making all of the world's information available online. Semantic Web users
will be able to connect information to discover correlations between
unrelated data, with potentially massive implications in fields such as
medical research, the military, and business intelligence. European
engineers have already significantly contributed to the development of
Semantic Web standards, and European governments are working to maintain
Europe's presence as an innovator in the field. "They want to create the
defining technologies for the Semantic Web and give European companies an
advantage in the market," says Mark Greaves, a scientist with the asset
management company Vulcan, which was started by Microsoft co-founder Paul
Allen. However, analysts question whether Europe's massive investment will
generate the same type of innovation that small U.S. startup companies such
as MySpace, Facebook, and Digg have with Web 2.0 applications.
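The core Semantic Web idea of connecting data so that correlations emerge
across sources can be shown with a short sketch using the Python rdflib
library (assumed installed); the vocabulary and facts below are invented for
illustration and do not come from any real data set.

from rdflib import Graph, Namespace

EX = Namespace("http://example.org/")
g = Graph()

# Facts that might come from two unrelated sources.
g.add((EX.drugX, EX.lowers, EX.bloodPressure))          # e.g., a pharmacology database
g.add((EX.bloodPressure, EX.riskFactorFor, EX.stroke))  # e.g., an epidemiology study

# Because both statements share the same identifier for blood pressure, a
# query can join them and surface a connection neither source states alone.
results = g.query("""
    PREFIX ex: <http://example.org/>
    SELECT ?drug ?condition WHERE {
        ?drug ex:lowers ?factor .
        ?factor ex:riskFactorFor ?condition .
    }
""")
for drug, condition in results:
    print(f"{drug} may be relevant to {condition}")

Scaled up to millions of statements drawn from many publishers, this kind of
join across independently produced data is what gives the Semantic Web its
promise in fields such as medical research and business intelligence.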
M'soft: Parallel Programming Model 10 Years Off
EE Times (07/23/07) No. 1485, P. 4; Merritt, Rick
A transition to a new parallel architecture for mainstream computers is
underway thanks to the advent of multicore processors, but Microsoft
experts predict that it could take as long as a decade for a parallel
programming model to emerge. Defining such a model while also introducing
a more formal and organized process for software development is a goal of
Microsoft's. "We need to be able to write programs that run on the next 20
generations of Dell computers, even if the number of processors in them
goes up by a factor of 16," says parallel computing maven Burton Smith, who
was hired by Microsoft to oversee research in parallel programming
architectures. "This field won't continue to grow, be vital and solve
society's problems unless we reinvent it." Smith maintains that parallel
programming languages are currently the most critical issue in computing,
and adds that it is reasonable to expect that there will be three or four
popular and strong parallel languages in 20 years' time. No widely
accepted parallel languages that can be utilized on general-purpose systems
currently exist, and Smith says he is increasingly convinced that the
proper approach to such languages is a hybrid of functional programming and
transactional memory. Smith says he is concentrating on a programming
model for processors featuring substantially more than eight cores; he
notes that chip designers have to conduct trials with a much wider spectrum
of multithreading methods than are currently used in hardware, as well as
figure out how to allow tasks to wait for resources without holding up
other processes. Furthermore, chips will have to deploy intelligent I/O
blocks that can convert virtual addresses into physical memory sites for
each core, and the CPUs may also need to commit hardware to expediting
atomic memory transactions.
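The functional-programming half of the hybrid Smith favors can be
illustrated in Python: when work is expressed as pure, side-effect-free
functions, a runtime can spread it across however many cores a machine has
without the programmer managing threads or locks. The example below is a
generic sketch of that style, not Microsoft's programming model, and it
omits the transactional-memory half that would coordinate any remaining
shared-state updates.

from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    """Pure function: count primes in [lo, hi). No shared state, so calls are independent."""
    lo, hi = bounds
    def is_prime(n):
        return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))
    return sum(1 for n in range(lo, hi) if is_prime(n))

if __name__ == "__main__":
    chunks = [(i, i + 100_000) for i in range(0, 800_000, 100_000)]
    # The same code works whether the machine has 2 cores or 32; the executor decides.
    with ProcessPoolExecutor() as pool:
        total = sum(pool.map(count_primes, chunks))
    print(total)   # number of primes below 800,000

Because the worker function has no side effects, the result is the same no
matter how the chunks are scheduled, which is exactly the property Smith
argues future parallel languages must make the default.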