Several Lawsuits Target E-Voting
USA Today (06/05/06) P. 1A; O'Driscoll, Patrick
With the primary election season on the horizon, voting rights groups have
filed lawsuits in at least six states to block the purchase or use of
computerized e-voting systems. The most recent challenge came in Colorado,
where the non-partisan advocacy group Voter Action filed suit last week
against the state and nine counties, following similar court actions
initiated by the group in California, Arizona, and New Mexico. Other
groups have filed lawsuits in Florida, Ohio, and Pennsylvania. Voting
rights advocates say the software in e-voting machines is prone to
tampering and ballot manipulation, and that election results are
unverifiable in the absence of a recountable paper trail. Under pressure
from a lawsuit, several counties in California have already dropped their
touch-screen voting machines in favor of systems with printed ballots read
by optical scanners. Six of eight states holding primaries on Tuesday will
use touch-screen systems, which are in use in approximately one-third of
counties throughout the United States. While there has never been a
confirmed incident of manipulating an actual election, a Finnish security
expert found significant flaws in a Diebold machine last month. Diebold
says the vulnerability is strictly theoretical, and that it will be fixed
later this year. E-voting defenders claim that problems typically occur
when poll workers are inadequately trained or when the systems are hastily
set up. "Certainly none of the allegations of security breaches on the
equipment have ever been demonstrated to be true," said R. Doug Lewis of
the Election Center. Many states began investing in e-voting systems after
Congress authorized more than $300 million to replace outdated voting
machines under the Help America Vote Act of 2002. For information on ACM's
e-voting activities, please visit
http://www.acm.org/usacm.
Group Seeks to Make Computer Science More
Attractive
Chronicle of Higher Education (06/09/06) Carnevale, Dan
A new coalition of 10 institutions is attempting to revamp the image of
computer science in an effort to reach out to women and underrepresented
minorities. The Stars Alliance recently won a three-year, $3 million grant
from the NSF. Many schools intentionally make introductory computing
classes so difficult that only the most serious students pass, which cuts
out intelligent students who could go on to become skilled computer
scientists, according to Larry Dennis, dean of the College of Information
at Florida State University. "We're looking at curricular and
infrastructure changes to make these courses more attractive to everybody,"
he said. "Not just women and minorities, but everybody." One approach
offers courses on more application-oriented skills, such as multimedia and
Web-site development, rather than intensive concentration on mathematics
and programming. The 10 participating institutions will try to market
their program to other schools to broaden the appeal of computer science
among women, who Dennis notes are typically more interested in the social
implications of computing. The consortium is also developing a program for
computer-science majors to mentor students in middle school and high
school. In addition to kindling interest in computers among younger
students, the mentoring program will teach undergraduates to discuss and
teach computing using everyday language.
GNU Radio Opens an Unseen World
Wired News (06/05/06) Norton, Quinn
With the aid of a Universal Software Radio Peripheral (USRP), Linux users
can harness the power of a general-purpose software radio to capture the FM spectrum, read
GPS transmissions, and decode HDTV. Eric Blossom came up with the idea for
the USRP when he set out to create a software HDTV receiver before Congress
passed broadcast flag legislation restricting the types of hardware that
could receive the high-definition signal. "We'd just go build one of those
things (in software) and moot (broadcasters') control over the hardware,"
Blossom said. When he ran into the problem of bringing the antenna to the
computer, Blossom teamed up with Matt Ettus, who secured funding from the
NSF to develop the USRP by billing it as a low-cost solution for widespread
radio deployment. Ettus' motivations were more technical, however, as he
was mainly drawn to the challenge of decoding HDTV. Ettus and Blossom hope
that decentralized control of the radio spectrum will lead to greater
innovation in a world no longer hemmed in by the constraints of bandwidth.
Just as blogging democratized content on the Web, universal access to radio
will give everyone the capability to be a broadcaster, Ettus says. Toby
Oliver's company PathIntelligence uses GNU radio and the USRP to monitor
pedestrian traffic to shopping centers in the United Kingdom, using the
mobile phones' control-channel signals to locate the position of a phone by
triangulation. Shopping-center owners can use the technology--in essence
an extremely localized GPS device--to monitor traffic flows to see which
stores are most popular. "Only recently, in the last 12 months, has
computing power enabled me to do what I need to in general-purpose software
without the expensive development of dedicated DSPs," said
PathIntelligence's Toby Oliver.
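The geometry behind that localization can be sketched with a textbook
trilateration step. The receiver positions and distance estimates below are
invented for illustration, and this is not PathIntelligence's actual signal
processing, which the article does not describe:

```python
import math

def trilaterate(anchors, dists):
    """Estimate a 2-D position from three receivers at known positions and
    estimated distances: subtract the first circle equation from the other
    two, which cancels the quadratic terms, then solve the resulting 2x2
    linear system by Cramer's rule."""
    (x0, y0), (x1, y1), (x2, y2) = anchors
    d0, d1, d2 = dists
    a1, b1 = 2 * (x1 - x0), 2 * (y1 - y0)
    c1 = d0 ** 2 - d1 ** 2 + x1 ** 2 + y1 ** 2 - x0 ** 2 - y0 ** 2
    a2, b2 = 2 * (x2 - x0), 2 * (y2 - y0)
    c2 = d0 ** 2 - d2 ** 2 + x2 ** 2 + y2 ** 2 - x0 ** 2 - y0 ** 2
    det = a1 * b2 - a2 * b1
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

# Receivers at made-up positions; a phone actually at (3, 4).
anchors = [(0, 0), (10, 0), (0, 10)]
dists = [math.hypot(3 - x, 4 - y) for x, y in anchors]
print(trilaterate(anchors, dists))  # -> approximately (3.0, 4.0)
```

In practice the distances would come from control-channel signal timing or
strength, with noise handled by using more than three receivers and a
least-squares fit.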
Government, Internet Firms in Talks Over Browsing
Data
Washington Post (06/03/06) P. D3; Ahrens, Frank
The U.S. Justice Department and FBI are holding discussions with leading
Internet companies to convince them to retain Web-surfing data for
possible use in child pornography and terrorism cases. Google,
Yahoo!, AOL, Microsoft, and others are involved in the negotiations, and
Microsoft has stated that the issue of consumer Internet privacy and
retaining data is best seen as a balance of privacy and law enforcement
concerns. The Justice Department and FBI may seek legislation from
Congress requiring data retention. They hope to base their request on a
potential industry consensus solution outlined in these ongoing
negotiations. However, in the first meeting between the Justice Department
and the companies, government officials were stern in their demands. The
second meeting featured more of a dialogue. Internet companies are wary of
changing the current process so drastically that it degrades consumer
privacy on the Internet. Currently ISPs and Internet companies refer
possible illegal activity online, such as viewing child pornography, to law
enforcement officials. Officials then must return with a warrant or
subpoena to obtain Web surfing records.
What a Difference Two Decades Can Make
Advanced Imaging Pro (06/01/06) Nelson, Lee J.
Over the past 20 years, imaging has rapidly emerged as a complex field
with many subdisciplines, including remote sensing, medical imaging, and
machine vision. In a recent interview, five imaging experts discussed the
evolution of the industry. Twenty years ago, medical imaging systems were
powered by custom hardware, sharing data was difficult and required custom
programming, and electronically transmitting metadata was impossible.
Research and development mainly occurred in national laboratories and large
corporations, and remote sensing systems only worked at very limited
distances. While hundreds of companies were racing to unlock the potential
of machine vision, most products fell short of customer expectations
because of a weak general purpose computer platform and cumbersome user
interfaces. While transporting metadata is still problematic, exchanging
medical imaging data through DICOM (Digital Imaging and Communications in
Medicine) is now standard practice. The areas of sensors, optics, and
software control have all seen major improvements in the last two decades,
and increases in chip efficiency and processing power have brought many
applications out of the lab and into practical use. Appearance-based
recognition has come to replace model-based methods, which has prompted a
flood of research in the areas of illumination effects, image acquisition,
and object detection. As it has matured, the imaging industry has become
fragmented into niche applications. Twenty years ago, the industry relied
on costly and expensive embedded systems, compared with the current
embedded systems built around DSPs and microprocessors. The emergence of
Microsoft as the singular operating system environment and the development
of low-cost plastic lenses both had a seismic impact on the development and
maturation of imaging.
Security Researchers to Produce New Tools
Concordia Journal (06/01/06) Vol. 1, No. 15, Black, Barbara
Cybersecurity has emerged as the most important challenge computing
researchers have ever faced, according to Mourad Debbabi, a Concordia
Research Chair
who is leading a security research project with almost $1 million in joint
funding from Bell Canada and the Canadian Department of National Defense.
"The tremendous success of Internet-related technologies, such as Web
services, voice-over IP, mobile telephony, and so on, coupled with advances
in hardware and software engineering are giving rise to challenging and
very interesting research problems," he said. The project's first
initiative will focus on securing free and open-source software. The
second phase will formulate tools and techniques for conducting
forensically sound investigations of cybercrimes, collecting evidence and
verifying and sequencing information to support the work of law
enforcement.
Online Throngs Impose a Stern Morality in China
New York Times (06/03/06) P. A1; French, Howard W.
The Internet is increasingly being used by Chinese users to investigate
others and mete out punishment for morality offenses both real and
imagined. For example, Chinese Internet users have used the Web to
scrutinize husbands suspected of cheating on their wives, investigate fraud
on Internet auction sites, examine the secret lives of celebrities, and
look into unsolved crimes. In one recent incident, a man used an Internet
bulletin board to accuse a college student of having an affair with his
wife. Within days, hundreds of thousands of anonymous Internet users
formed teams that hunted down the student, forced him to leave his
university, and caused his family to barricade themselves inside their
home. The phenomenon, known as Internet hunting in China, is setting off
alarm bells in the country. Many are comparing it to the Cultural
Revolution 40 years ago, when mobs of students taunted and beat their
professors. Mass denunciations and show trials were also common during
this period. In order to deal with the problem, the government is
considering registering all Internet users. However, free speech advocates
say there is no reason for the Chinese government to place such
restrictions on the Internet. "The Internet should be free, and I have
always opposed the idea of registering users, because this is perhaps the
only channel we have for free discussion," said Zhu Dake, a sociologist and
cultural critic at Tongji University in Shanghai. "On the other hand, the
Internet is being distorted. This creates a very difficult dilemma for
us."
Twelve Research Grants Awarded to Help Fund Innovation in
Search Technology
PRNewswire (06/01/06)
Microsoft Live Labs has named the winners of $500,000 in grant money for
its Accelerating Search in Academic Research request for proposal (RFP),
which will enable the recipients to continue their study of Internet search
technologies, and data mining, discovery, and analysis. "Through this RFP
process, we have found a wealth of academic talent and ideas for search and
algorithm development that we think will transform our ability to harness
the power of the Web in the years to come, allowing users to focus less on
the work of searching and instead reap the rewards of discovery," says Gary
William Flake, director of Live Labs. The 12 RFP winners will each receive
between $25,000 and $50,000, and have access to extensive data logs from MSN
and an increased quota of queries to the MSN Search software development
kit. Winners include "The Truth Is Out There: Aggregating Answers From
Multiple Web Sources," which involves information retrieval research from
Amelie Marian of Rutgers University; and "Vinegar: Leading Indicators in
Query Logs," which covers machine learning, human-computer interaction, and
data mining research from Eytan Adar, Brian Bershad, Steven Gribble, and
Daniel Weld of the University of Washington. "VISP: Visualizing
Information Search Processes," is a proposal by Lada Adamic and Suresh
Bhavnani of the University of Michigan focusing on natural language
processing and human-computer interaction research. "Entity and Relation
Types in Web Search: Annotation Indexing and Scoring Techniques," focusing
on machine learning, information retrieval and natural language processing
research, was proposed by Soumen Chakrabarti of the Indian Institute of
Technology, while University of Illinois at Urbana-Champaign researcher
Kevin Chang's "Deepening Search: From the Surface to the Deep Web" proposal
for information retrieval and information integration research was also a
winning RFP.
The Code That Keeps Your Fingerprints Secure
New Scientist (06/03/06) Biever, Celeste
Researchers at the Mitsubishi Electric Research Laboratories (MERL) have
developed a technique that secures biometric information by creating a
second code that cannot be used to recreate the biometric. The algorithm
comes at a time when the government and companies are storing the biometric
information of millions of people, and there are concerns that it would not
be difficult for thieves to gain access to a biometric and then use it to
steal a digital identity. Conventional biometric systems store the raw
details of fingerprints, iris scans, and facial images. However, the
algorithm from Emin Martinian and his colleagues at MERL does not store the
raw data, but manipulates the biometric code to produce a shorter code called a
syndrome. The algorithm is designed to manipulate the ones and zeros of a
biometric code and as a result, gaining access to a syndrome will not do a
hacker any good because he would have no idea how to find its match in
order to "correct the error" and reconstruct the original biometric. "The
only person who should have your fingerprint is you, on the end of your
finger," says MERL director Joe Marks. Martinian says the algorithm is
safer than the warped biometric system being developed by IBM researchers,
which he maintains would not be able to prevent a thief from using a warped
biometric to decrypt the data.
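The description maps onto what coding theorists call syndrome decoding.
Here is a toy sketch of the idea using the Hamming(7,4) parity-check
matrix; the 7-bit "fingerprint" and the specific code are purely
illustrative, and MERL's actual construction is more elaborate:

```python
import hashlib

# Parity-check matrix of the Hamming(7,4) code: column i is the binary
# representation of i+1, so a single bit flip at position i yields the
# syndrome value i+1.
H = [[(i + 1) >> bit & 1 for i in range(7)] for bit in range(3)]

def syndrome(bits):
    return [sum(h * b for h, b in zip(row, bits)) % 2 for row in H]

def enroll(bio):
    """Store only the syndrome plus a hash -- never the biometric itself."""
    return syndrome(bio), hashlib.sha256(bytes(bio)).hexdigest()

def verify(noisy, stored):
    """Correct up to one misread bit in a fresh scan, then compare hashes."""
    syn, digest = stored
    diff = [a ^ b for a, b in zip(syndrome(noisy), syn)]
    pos = diff[0] + 2 * diff[1] + 4 * diff[2]   # 0 means no detectable error
    fixed = list(noisy)
    if pos:
        fixed[pos - 1] ^= 1
    return hashlib.sha256(bytes(fixed)).hexdigest() == digest

bio = [1, 0, 1, 1, 0, 0, 1]            # toy 7-bit "fingerprint"
stored = enroll(bio)
noisy = [1, 0, 1, 0, 0, 0, 1]          # one bit misread at position 4
print(verify(noisy, stored))           # -> True
print(verify([0, 1, 0, 0, 1, 1, 0], stored))  # very different scan -> False
```

The stored syndrome is shorter than the biometric and, by itself, is
consistent with many possible originals; only a reading already close to
the enrolled one lets the error be "corrected" back to an exact match.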
mSpace: What Do Numbers and Totals Mean in a Flexible
Semantic Browser
University of Southampton (ECS) (06/01/06) Wilson, Max L.; Schraefel, M.C.
Browsers that employ current search models frequently use numeric volume
indicators (NVIs) based on the initial selection of a target, and the
researchers investigated how such indicators might be represented by
semantic browsers employing a flexible exploratory search interface such as
mSpace. The mSpace browser follows a columnar slice design in order
to explore intersections between domains, and the authors reason that NVIs
could represent three characteristics within the mSpace paradigm: The
number of items in the next column to the right, the number of items in the
column that the data is centered around, and the number of items in the
final column, if that column is understood to be the "goal" of the query.
The researchers organized a study to determine participants' immediate
expectations of what information NVIs represent in a flexible exploratory
search interface, and how the various cues affect participants' motivation
to interpret the meaning of the NVIs. None of
the participants selected the hypothesis that based the figures on the
final column, suggesting that such an approach should be discouraged. The
researchers drew three potential conclusions from the experiment: Most
immediate participant expectations for NVIs are oriented around their
representation of a number of specific artifacts that represent a domain
focus; the introduction of ambiguity into exploratory situations causes
participants to consider different representations; and the combination of
visual and numeric size cues appears to yield additional advantages outside
of NVIs by themselves. The authors note that little is yet understood
about how NVI expectations will shift during longer interaction with
Semantic Web-based browsers.
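The candidate NVI meanings can be made concrete with a toy columnar
dataset; the music domain, items, and counts below are invented for
illustration and are not the study's data:

```python
# Toy dataset as (era, composer, piece) triples -- the kind of columnar
# slice an mSpace-style browser exposes (hypothetical data).
triples = [
    ("Baroque",   "Bach",   "Toccata in D minor"),
    ("Baroque",   "Bach",   "Mass in B minor"),
    ("Baroque",   "Handel", "Messiah"),
    ("Classical", "Mozart", "Requiem"),
    ("Classical", "Mozart", "Jupiter Symphony"),
    ("Classical", "Haydn",  "The Creation"),
]

def nvi(selection_col, value, target_col):
    """Numeric volume indicator: distinct items in target_col reachable
    from selecting `value` in selection_col."""
    return len({t[target_col] for t in triples if t[selection_col] == value})

# Two of the interpretations the study contrasts, for a "Baroque" selection:
print(nvi(0, "Baroque", 1))  # items in the next column (composers) -> 2
print(nvi(0, "Baroque", 2))  # items in the final "goal" column (pieces) -> 3
```

The same selection yields different numbers under each interpretation,
which is exactly the ambiguity the study asked participants to resolve.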
The Enemy Within: Terror by Computer
New Zealand Herald (06/01/06) Shreeve, Jimmy Lee
If terrorists turn their attention away from the physical to the digital
world, there may be even greater damage than the Sept. 11 attacks, say
cyber-security experts. Computer network attacks are dangerous enough to
kill people and destroy companies, according to Scott Borg at the U.S.
Cyber Consequences Unit. "Up to now, executives and network professionals
have worried about what adolescents and petty criminals have been doing,"
says Borg. "In most cases, these kinds of cyber attacks aren't very
destructive. The reason is that businesses generally have enough inventory
and extra capacity to make up for short-term interruptions." In the past,
hackers focused on credit cards or personal information found on the Web,
but now they are starting to focus on databases. Borg gives examples of
possible scenarios such as the tampering of a pharmaceutical company's
database or changing specifications at a car factory, which may cause a car
to catch on fire. Those kinds of attacks could crash the economy with just
the click of a mouse, according to Borg. Officials say their biggest fear
is over electronic attacks that focus on the networks that make up the
critical national infrastructure. "People claim no one will ever die in a
cyber-attack, but they're wrong," says Richard Clarke, a former
cyber-security expert in the Bush Administration. "This is a serious
threat."
Escape the Software Development Paradigm Trap
Dr. Dobb's Journal (05/29/06) Bereit, Mark
IRIS Technologies product development director Mark Bereit refutes the
assumption that software development will always be difficult and
bug-ridden, noting his suspicion "that these limitations apply, not to all
possible software development, but solely to the software development
paradigm that we've followed, unchallenged, for decades." He proposes
reworking the software development model and studying other engineering
disciplines for inspiration, eschewing habits that impose the limitations.
Bereit cites the commonly accepted view that software systems can fall
apart from a single point of failure, and that every line of code is a
potential one. He does not point to a shortage or surplus of code reuse, but rather
its employment to do something entirely different from what developers
think they are doing, namely the construction of massive algorithms instead
of components. According to Bereit, what is needed is a way to divide
software development into more workable segments so that the CPU is not
overtaxed, though he is not proposing multithreading. The author uses the
basic principles of mechanical engineering as a jumping off point in his
suggestion that software development should incorporate the involvement of
"trustworthy components, specifications, and margins; that it should allow
assemblies of increasing complexity to be built from trustworthy lesser
components; it should involve a team approach to performing complex tasks;
and it should be something that can be generally dependable and
trustworthy." Splitting up a task between multiple processors is the
optimal teamwork strategy Bereit recommends. The new software development
model must include a new framework for communications and management of
common resources, which should point to a way to enable the same processor
to execute different tasks at different times.
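Bereit's teamwork model can be caricatured as an assembly line of isolated
components that communicate only through explicit channels. In this sketch,
Python threads and queues stand in for the separate processors and the
communications framework he calls for; the two-stage pipeline is invented
for illustration:

```python
import queue
import threading

def component(work, inbox, outbox):
    """A minimal 'trustworthy component': one focused job, no shared state,
    all communication through explicit message queues."""
    while (item := inbox.get()) is not None:
        outbox.put(work(item))
    outbox.put(None)                      # propagate shutdown downstream

# An assembly line of two components, each on its own "processor".
q_in, q_mid, q_out = queue.Queue(), queue.Queue(), queue.Queue()
stages = [
    threading.Thread(target=component, args=(int, q_in, q_mid)),
    threading.Thread(target=component, args=(lambda n: n * 2, q_mid, q_out)),
]
for s in stages:
    s.start()
for raw in ["1", "2", "3", None]:         # feed raw work into the line
    q_in.put(raw)

results = []
while (item := q_out.get()) is not None:
    results.append(item)
for s in stages:
    s.join()
print(results)                            # -> [2, 4, 6]
```

A failure inside one stage is confined to that stage rather than corrupting
a single shared algorithm, which is the property Bereit borrows from
mechanical assemblies.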
GPL Patent Rule Pending
SD Times (06/01/06) No. 151, P. 1; Handy, Alex
Corporate patent holders are concerned about a provision in the draft
update to the GNU General Public License (GPL) that could limit their
ability to defend their software patents. The current draft states that
companies that initiate litigation to block others from copying changes
they made to a program protected by the GPL forfeit all rights to use that
GPL code. The Free Software Foundation (FSF) has found that patent
retaliation clauses are generally ineffective, and thus only included one
mention of the problem in the draft, according to Eben Moglen, president of
the Software Freedom Law Center. The FSF is likely to address the concerns
of corporate patent holders during the initial comment period, said Diane
Peters, who is serving as general counsel for the Open Source Development
Labs (OSDL) and discussing the changes to the GPL with large corporations.
Peters noted that the controversial clause would force companies to choose
between suing and running the GPL software in question, and that it is "the
first time FSF has reached in and controlled private behavior." Peters
warns of a situation where a company could unknowingly have GPL software
somewhere in its stack. If it filed a suit to protect a patent, the target
of the suit could conceivably dig up the GPL code and invalidate the
lawsuit. Peters hopes that that sort of uncertainty will be cleared up
when the next iteration of the draft document appears around the beginning
of July. Moglen also notes that the update makes GPL code compatible with
code released through other licenses, such as Apache, though it remains
unclear how it would be determined which license is dominant after a
merger.
The User's View: Customer-Centric Innovation
Computerworld (05/29/06) Pratt, Mary K.
In an effort to bring a fresh perspective to the design of technology
solutions, some companies have begun hiring anthropologists to work with or
even lead their development teams. Companies value anthropologists because
they can look at technology from the user's perspective by asking questions
about how people work and the types of tools that they do and do not use.
While technologists can get wrapped up in adding more tools and automation
to an application, anthropologists can give them guidance on whether the
tools will actually be used or if they will just be an annoyance. While
observing systems administrators at IBM, anthropologist Jeanette Blomberg
found that they typically create their own local tools to help in the
management of their systems. IBM's Eser Kandogan then built a program to
support the systems administrators' tools based on Blomberg's observations.
"Technologists tend to look at the user and the user's relationships to
the technology. It tends to be very task-focused," says consultant
Patricia Sachs. "Anthropologists look at the missing layer." Research
about the way people interact and communicate with each other at Intel led
to a program for virtual collaboration that facilitates multiple methods of
communication, such as instant messaging and a shared white board. IT
anthropologists are still a rarity, though companies are increasingly
realizing the value of multiple perspectives when developing new
technologies. Adding an anthropologist to an IT department can also create
a cultural clash, as many IT workers might have difficulty accepting the
validity of an anthropologist's methods.
Designers Wrestle Media Fragmentation
EE Times (05/29/06) No. 1425, P. 45; Merritt, Rick
The digital media explosion is plagued by fragmentation that extends
throughout networking standards, security standards, and Linux standards,
making the meshing of these various elements a difficult proposition. "The
No. 1 challenge is to enable interoperability across a range of platforms
where consumers can enjoy content," notes CTO of Intel's digital home group
Brendan Traw. "There's a huge set of things engineers have to put in
place--digital rights management, media formats--and you have to have all
the pieces implemented before the content flows." Digital media
interoperability is the goal of standards-setting efforts by the Digital
Living Network Alliance (DLNA), the Universal Plug and Play group (UPnP),
and Intel's Networked Media Products Requirements (NMPR). DLNA is focused
on the referral rather than the creation of standards, and the group
usually adopts all of UPnP's work to create application programming
interface standards, with the aim of addressing issues UPnP and other
groups overlook; Intel, meanwhile, is developing its own slate of
interoperability standards through NMPR. Groups addressing DRM
interoperability issues include the Coral Consortium; DLNA recommends
Coral's approach to defining how a device can negotiate for rights to
content on another device because it enables interoperability even without
the involvement of major DRM vendors. Apart from a few companies such as
Intel, developers see little chance of a unified standard for consumer
Linux. Other areas calling for common standards include remote user
interface support over a home network, automated Wi-Fi set-up
configurations, and display interfaces.
Keeping U.S. Leadership in Engineering
Chief Executive (05/06) No. 217, P. 26; Khosla, Pradeep
The United States should focus more on managing the global process of
innovation than simply trying to boost its number of engineering students
in the years to come, writes Pradeep Khosla, dean of Carnegie Mellon's
College of Engineering. The nation is expected to continue to see its
engineering enrollments fall due to outsourcing and security restrictions,
but the country
has enough advantages, such as its research infrastructure and culture at
universities, to ensure that the most promising students from around the
world will be attracted to cutting-edge research opportunities. The United
States should continue to make a commitment to its research enterprise,
while improving undergraduate education in a manner that would better
prepare students for the higher level of work stateside. Universities
should come to view themselves as the research and development arm of
industry, which also means intellectual property policies will need to be
reevaluated. Carnegie Mellon is making changes in its curriculum that will
prepare students to oversee the management of innovation in an environment
that is multilingual, multicultural, and multinational. At the same time,
talented foreigners would have an opportunity to maximize their knowledge
of the U.S. industry. Such an approach would allow universities to be more
proactive in their response to global pressures, and forge ties with global
businesses and the top talent around the world.
From COM to Common
Queue (06/06) Vol. 4, No. 5, P. 20; Olsen, Greg
The specificity and niche applications of component software have evolved
into a seemingly boundless hodgepodge of products and methodologies,
although Coghead founder and CTO Greg Olsen notes that the basic rationales
underlying component software--reuse and integration--have not changed
much. He observes three trends that contributed to his reassessment of
component software throughout its decade-long progression from application
component to context-neutral, versatile, specification-written object: A
redistribution of complexity from custom-created components into the
framework; the emergence of special-purpose component frameworks and
frameworks contained in frameworks; and the increasing accessibility of
component technology to wider and wider audiences. Olsen notes that a
decade ago, a "broker" was the name of the core component of CORBA, the
leading component framework. "This terminology reflected a conceptual view
of communities of powerful and autonomous components interacting as peers,
with the framework acting only as a facilitator," he explains. "In 2006,
container is the descriptive noun of choice, and the conceptual view is one
where the framework provides a rich, nurturing environment to a community
of minimalist components that perform focused tasks and that live
blissfully ignorant of the many responsibilities that the framework is
managing for them." Olsen also points out that projects currently fueling
software development are smaller and more narrowly focused than they used
to be, creating an atmosphere where special-purpose component frameworks
and frameworks within frameworks thrive. The movement of software
components to the masses is a result of the growing simplicity and
increased ease of use of component frameworks. Olsen writes that with the
improvement of component software technology comes the challenge of
developing the techniques, training, curricula, and experience base
necessary for successful implementation.
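The "container" view Olsen describes can be illustrated with a toy
dependency-injecting container. This mirrors no real framework's API; the
names and wiring are invented, and the point is only the shape of the idea:
components declare what they need, and the framework resolves, creates, and
caches everything for them.

```python
class Container:
    """A toy component container: components register with declared
    dependencies, and the container handles wiring and instantiation."""
    def __init__(self):
        self._registry = {}
        self._instances = {}

    def register(self, name, cls, deps=()):
        self._registry[name] = (cls, deps)

    def get(self, name):
        if name not in self._instances:
            cls, deps = self._registry[name]
            # The container, not the component, resolves dependencies.
            self._instances[name] = cls(*(self.get(d) for d in deps))
        return self._instances[name]

# Minimalist components: focused tasks, ignorant of wiring and lifecycle.
class Logger:
    def log(self, msg):
        return f"[log] {msg}"

class Greeter:
    def __init__(self, logger):
        self.logger = logger
    def greet(self, who):
        return self.logger.log(f"hello, {who}")

c = Container()
c.register("logger", Logger)
c.register("greeter", Greeter, deps=("logger",))
print(c.get("greeter").greet("world"))   # -> [log] hello, world
```

Contrast this with the broker-era picture: here the components never locate
or negotiate with each other; the framework carries all of those
responsibilities.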