Panel Backs Guideline Favoring Voting-Machine
Verification
Washington Post (12/06/06) P. A9; Barr, Cameron W.
After failing earlier this week to pass a measure recommending that
e-voting machines be required to allow audits independent of their
software, the Technical Guidelines Development Committee (TGDC) has
unanimously agreed upon a new version of the resolution, which grandfathers
in existing systems but states that the "next generation" of e-voting
machines should have such independent audit capacities. Electionline.org
director Doug Chapin says, "This seems to mark the end of an era" for
e-voting without a paper trail, though many observers note that no money
remains to spend on new election systems. The TGDC, which advises the U.S.
Election Assistance Commission (EAC), gave no timeline, but members of
Congress and local politicians have pledged to begin exploring options
for carrying out the recommended reforms. Virginia General Assembly Delegate
Timothy D. Hugo said that "the committee recommendations...will really make
people stand up and pay attention" to the changes that must take place.
The report also stated that all disabled voters should be able to verify
their votes, and that election officials and voting machine manufacturers
should be responsible for implementing security measures. The National Institute
of Standards and Technology's Michael Newman said the panel has until July
to create a set of standards to submit to the EAC.
Spam Doubles, Finding New Ways to Deliver Itself
New York Times (12/06/06) P. A1; Stone, Brad
After anti-spam software had foiled them so effectively that spam was no
longer considered a major concern at the beginning of the year, spammers
have found new techniques for flooding mailboxes and consuming bandwidth.
Spam filtering firm Ironport claims that spam volumes worldwide have
doubled from last year, and that junk email now makes up over 90 percent of
emails sent. By embedding text in images, spammers found a loophole in
spam-blocking technology, which scans traditional email text to detect
telltale signs of spam. The use of botnets now makes blacklists unreliable
as well as allows spammers to send more messages without being charged for
generating the data traffic. Ironport's Patrick Peterson admits, "The
bad guys are simply outrunning most of the technology out there today." By
adding speckles or flowery patterns to images where text was embedded,
spammers have even confused programs designed to detect text in images. They
have also developed a way to change just a few pixels in each email sent
out, creating a unique "fingerprint" for each, so programs that identify a
message as spam and eliminate all copies no longer work. Linking violators
to incriminating Web sites has also gotten more complicated because of the
now-popular "pump and dump" technique, in which spammers purchase cheap
stock in an obscure firm, send out email touting the stock, and sell when
enough unsuspecting people buy it. Today's spammers operate out of
Russia, Eastern Europe, and Asia, according to experts, putting them beyond
the reach of strict U.S. anti-spam legislation.
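The pixel-tweaking trick is easy to see in miniature. The sketch below is a hedged illustration in Python, invented for this summary rather than taken from any vendor's filter, of why fingerprinting a message by hashing its image no longer works: flipping a single byte produces an entirely different digest, so every copy of the spam looks unique.

```python
# Illustrative only: why per-message pixel changes defeat hash-based
# duplicate detection. Altering one byte yields an entirely new digest.
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # A naive filter might recognize copies of a spam image by its hash.
    return hashlib.sha256(image_bytes).hexdigest()

# Stand-in for a spam image: a flat 100x100 8-bit grayscale buffer.
original = bytes([128]) * (100 * 100)

# The spammer's per-message variation: nudge one pixel by one level.
variant = bytearray(original)
variant[42] = 129

print(fingerprint(original))
print(fingerprint(bytes(variant)))
print(fingerprint(original) == fingerprint(bytes(variant)))  # False
```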
IT-User Services Staff Take Top SIGUCCS Award
UDaily (University of Delaware) (12/05/06) Hutchinson, Becca
A podcast created by the University of Delaware's IT-User Services
department, which gives students with computer-related trouble explanations
that could not be conveyed over the phone or even by detailed Web sites,
received top honors in the promotional video/audio category at the ACM
Special Interest Group on University and College Computing Services
(SIGUCCS) Conference, held Nov. 5-8 in Edmonton, Alberta,
Canada. The podcast, called "Consulting on Demand," was inspired by the
frustration the department faced as a result of outdated troubleshooting
practices, said IT-User Services manager Ronald Nichols. Begun last
November, the project resulted in a two-part Web video that walked students
through some of the more complicated aspects of connectivity in residence
halls, and received over 1,000 hits during the student move-in period at
the beginning of this school year. Given this initial success, the
department has made and published Web videos on various cyber-safety subjects.
Videos submitted at the conference addressed subjects such as spam,
encryption, and establishing a Web proxy.
Carnegie Mellon Researchers Uncover Online Auction
Fraud
AScribe Newswire (12/05/06)
By analyzing the publicly accessible transaction histories of online
auction sites, Carnegie Mellon University researchers have been able to
identify suspicious behavior and associations between users, using data
mining techniques. Such fraudsters, including those who take money for the
sale of an item and never mail it, accounted for 97,000 complaints passed
along to law enforcement by the federal Internet Crime Complaint Center, and
can now be located and purged from auction sites. Identifying accomplices
can also help prevent the emergence of new fraudsters. The
system, known as Network Detection via Propagation of Beliefs (NetProbe),
gives a numerical rating of trustworthiness that cannot be manipulated the
way reputation systems used by the auction sites can be. Accomplices, who
do not commit fraud directly, use their favorable reputation to boost the
feedback ratings of fraudsters, but this can be detected using a graph of
transactions, where users are represented as nodes and transactions as
lines connecting the nodes. The researchers found that in such a graph, the
transactions between accomplices and fraudsters form a
"bipartite core," meaning each group transacts heavily with the other but
not within itself; the accomplice group also deals with
honest users but mostly with fraudsters. This technique has been tested on
massive sets of data and is currently being used to examine about a million
eBay transactions.
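The structural signature itself is simple to express in code. The toy sketch below is not NetProbe, which propagates beliefs over the full transaction graph, but it checks for the bipartite-core pattern the researchers describe; all account names and transactions are invented for illustration.

```python
# Toy check for a "bipartite core": suspected accomplices transact with
# suspected fraudsters but never with one another. Not the NetProbe
# algorithm itself, which uses belief propagation.

transactions = [           # (seller, buyer) pairs from an auction site
    ("acc1", "fraud1"), ("acc1", "fraud2"),
    ("acc2", "fraud1"), ("acc2", "fraud2"),
    ("acc1", "honest1"),   # accomplices also deal with honest users
]

def is_bipartite_core(group_a, group_b, edges):
    """True if every cross-group pair transacts but no within-group pair does."""
    linked = {frozenset(e) for e in edges}
    cross = all(frozenset((a, b)) in linked for a in group_a for b in group_b)
    within = any(frozenset((x, y)) in linked
                 for grp in (group_a, group_b)
                 for x in grp for y in grp if x != y)
    return cross and not within

print(is_bipartite_core({"acc1", "acc2"}, {"fraud1", "fraud2"}, transactions))  # True
```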
Civil Libertarians Protest Privacy Policy
Washington Post (12/06/06) P. A11; Nakashima, Ellen
New privacy regulations, and the board created to oversee them, are
drawing criticism from various civil liberties groups, who have cited the
protections guaranteed by the Privacy Act of 1974. Electronic Privacy
Information Center executive director Mark Rotenberg points to "the
absence of transparency, the absence of oversight, and the inability for
individuals to know what information about them is being collected by the
federal government." The guidelines, drafted by the Office of the Director
of National Intelligence, state that information on "U.S. persons" must be
obtained legally, and be shared only if it is relevant to terrorism or
law enforcement; however, the guidelines do not require that those affected
be notified. Markle Foundation privacy task force member James Dempsey
says the privacy regulations also fail to address data-collection standards
or establish appropriate methods to deal with those who have been
mistakenly targeted. The privacy board, which has only five members, is
part of the executive branch and does not have the power of subpoena,
causing civil libertarians to call for the establishment of a more
independent, capable body. ACLU legislative director Caroline Fredrickson
said the board has no power to alter policy on some of the most pertinent
issues. Democrats poised to take over Congress next year have promised
broader oversight of the government's data-mining policies and terrorism
surveillance programs.
Segway Inventor Scoots to Bigger Matters
CNet (12/05/06) Wenzel, Elsa
Dean Kamen says his goal is to give "people better lives...in a
substantial way," and his latest projects, a filtration system that can
purify water from sewage and small generators that can produce electricity
from any fuel, show his belief in technological innovation's power to make
such changes. He laments the obsession of today's youth with sports and
entertainment, hoping that projects such as the For Inspiration and
Recognition of Science and Technology (FIRST) tournaments he founded for
schoolchildren will make science and math equally exciting. He says
that getting clean water to developing countries is the most pressing
concern facing the world, and that venture capitalists currently have a
great interest in "clean tech," but he hopes that this is not "a fad or a
mood." The portable energy device, which is currently further along than
the water purification system, has already been used in two villages in
Bangladesh where cow dung, the only available fuel, was burned to provide
electricity for 24 weeks. When asked about future energy technology, Kamen
says that solar energy is only waiting for the transition that will make
its implementation less expensive and more reliable, stressing the
difficulty a single generation has in changing its ways. He says that his
biggest fear is those who are anti-technology, who want to "replace this
technology with no technology."
Revival of the Supercomputer
EE Times (12/04/06) Merritt, Rick
DARPA is working with IBM's X10, Cray's Chapel, and Sun's Fortress
languages in order to find an ideal language to simplify the increasingly
intricate task of developing software for supercomputers. Sources indicate
that one language could be picked outright, or a hybrid could be developed,
but the decision should be made within 18 months. University of Tennessee
supercomputer researcher Jack Dongarra says the Message Passing Interface
(MPI) currently used on supercomputers "has just too much programming
complexity to get it all right." The three languages submitted are
intended to provide an enhanced view of complex systems to allow more
efficient communication between diverse processors. University of Illinois
professor Marc Snir, who helped design MPI, said of the three languages:
"They are still in a primitive state, and there is no evidence yet they
will be embraced by the application community." Snir says Cray's Chapel is
based on data parallelism, making it the most like MPI; IBM's X10 brings in
new ideas such as atomic data structures; and Sun's Fortress, which he
calls "Matlab on steroids," would require the greatest alterations to
compilers. Snir also points out that a programming language is only one of
many problems facing supercomputer development, and expresses doubt that a
hybrid of two or three of the languages in development will be the answer.
He says MPI has already proven its ability to scale up to today's large
systems, noting its deployment on one of IBM's BlueGene/L supercomputers.
"We may need to reconvene to write minor enhancements," he says, "but we
don't expect MPI to evolve significantly."
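For readers unfamiliar with MPI, even a trivial exchange shows the explicit rank-and-tag bookkeeping Dongarra alludes to. Here is a minimal sketch using the mpi4py bindings (one of several MPI interfaces; the HPC codes in question are typically C or Fortran):

```python
# A minimal point-to-point exchange with mpi4py.
# Run with: mpiexec -n 2 python this_file.py
# Even this trivial pattern requires explicit rank checks and exactly
# matched send/recv calls, the kind of bookkeeping that makes large MPI
# codes hard to "get all right."
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if rank == 0:
    data = {"payload": list(range(4))}
    comm.send(data, dest=1, tag=11)     # must match the recv below exactly
elif rank == 1:
    data = comm.recv(source=0, tag=11)  # a wrong source or tag would hang
    print(f"rank 1 received {data}")
```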
UCF Researcher's 3-D Digital Storage System Could Hold a
Library on One Disc
University of Central Florida (12/04/06) Kotala, Zenaida Gonzalez
University of Central Florida researchers have developed a method dubbed
the Two-Photon 3D Optical Storage system that is able to record and store
at least 1,000GB of data on multiple layers of a single disk. "For a
while, the community has been able to record data in photochromic
materials in several layers," said UCF chemistry professor Kevin D.
Belfield, who led the research. "The problem was that no one could figure
out how to read out the data without destroying it. But we cracked it."
The technology involves firing two different wavelengths of light onto the
recording surface. Using two layers produces an image sharper than any
current technique can. The color (wavelength) of
the light determines whether information is written on the disk or read
from it, so a user can control what information remains intact. Beyond
application in storing library or museum data, Belfield's department is
experimenting with the ability of this technology to identify and treat
certain types of cancer.
Q&A: Responsible Disclosure of Vendor Flaws and What It
Means
Computerworld (12/04/06) Vijayan, Jaikumar
Publicly disclosing vulnerabilities in software products is about
increasing the pressure on software vendors to improve the security of
their applications, according to vulnerability researcher H.D. Moore in an
interview with Computerworld. Moore, who has been involved with the
independent group of security researchers behind the controversial
Metasploit Project, says the various initiatives he has undertaken were
meant to raise awareness of flaws in software and the potential impact of
the vulnerabilities on an organization. Though Moore and other independent
security researchers have come under fire for making it easier for bad guys
to exploit software vulnerabilities, he says his critics are not facing
reality. He maintains that hackers exploiting the most problematic flaws
are often caught before word of the vulnerability goes public. And he
views responsible vulnerability disclosure as a flawed approach to software
security because not disclosing flaws publicly does not necessarily mean
software users will be safer. Moore believes his security efforts, from
posting vulnerability information to releasing the Metasploit Framework
tools, have largely been a success.
Health Hazard: Computers Spilling Your History
New York Times (12/03/06) P. 3-1; Freudenheim, Milt; Pear, Robert
While health insurance companies, tech companies, and the U.S. government
are all pushing to computerize the health records of Americans in order to
improve the ability of the medical profession to share information in the
name of making valuable advances, many people are wary of the potential
risks of making such information widely available. A Markle survey found
that 56 percent of respondents were very concerned about abuse by
employers, though nearly all respondents were eager to experience the
benefits Internet technology could bring to health care. Some employees
fear they could lose their job due to expensive medical conditions, as such
instances have been found to occur. In many cases, employees can decide
whether or not to submit their information to an electronic database, and
some companies even offer small sums of money to those willing to
cooperate. While most large companies claim
that personnel professionals do not have access to medical information of
employees, many suspect that there are companies where those in charge of
insurance claims also handle hiring and firing decisions. Unfortunately,
charges are rarely brought against those who illegally access medical
records. Due to fears of lawsuits resulting from sharing data, American
primary-care physicians make use of electronic health care information
systems far less than their counterparts in England or in the Netherlands,
according to the journal Health Affairs. The new Democratic Congress has
already pledged to address this issue of privacy once it takes power next
year, according to Rep. Edward J. Markey (D-Mass.).
Q&A: ITU Study Group Chair Talks About its Next
Generation Network Initiative
Network World (12/04/06) Marsan, Carolyn Duffy
John Visser, a Nortel executive serving as chairman of the International
Telecommunication Union's Study Group 19, says ITU's Next Generation
Network (NGN) global standards initiative is working to blur "the
boundaries between traditional voice telecommunications, data and broadcast
communications." With the blurring of boundaries between mobile and
Internet usage, Visser says the goal of NGN is to see that "the services
and how they function [are] not...influenced by the means of how you access
them." NGN, which most major telecoms have already signed on with, will
bring about "a coherent and well-integrated infrastructure," Visser says,
featuring integrated services including voice, and reduced costs. In order
for corporate networks to take advantage of NGN standards, Visser says "an
IP backbone is essential," as well as "good data rates to the individual,"
and "solid database access." As far as mobility goes, he stressed the
impact of tariffs and pricing set by providers, making this one concern
that still needs to be worked out. He also added that security is priority
number one for the NGN, and that it will support both IPv4 and IPv6.
Supercomputers: Strength in Numbers
Age (Australia) (12/05/06) Karena, Cynthia
Australia made huge strides in e-research this year and hopes to
experience even more success next year. A simulation program that used
networked computers to boost computing power has enabled Monash University
professor Amanda Lynch to obtain more in-depth analysis of what-if
scenarios involving the impact of burning the savannah grasslands on the
summer monsoons. Lynch used the Nimrod software to tap into the computing
power of machines in the United States and Asia, and the massive number
crunching performed on 1.6 TB of data left her with 100 different types of
fire statistics. Meanwhile, the Victorian e-research strategic initiative
(VeRSI) got underway in October and will spend $10 million to develop
e-research applications. At Monash's e-Research Center, professor Ah Chung
Tsoi is researching ways to manage the enormous amounts of data that
supercomputers produce in an Archer project that would allow researchers to
access and share information from a single Internet portal. "The uptake of
e-research enables [scientists] to set up large databases in large
collaborative environments," says Paul Davis, executive director of VeRSI.
Australia has plans for eight e-research projects next year, including one
that would allow researchers involved in the virtual beam line project at
the national synchrotron in Melbourne to control instrumentation remotely
from their desktop via a private, high-speed network.
Interview With Web Guru Tim O'Reilly: 'We're Moving Into
a New World'
Der Spiegel (12/04/06) Stocker, Christian
Tim O'Reilly, considered the father of the term Web 2.0, says he has
gotten sick of the term and wishes it were understood less as a
dotcom-bubble-type fad and more as "this idea of harnessing collective
intelligence." He believes that "open source communities create a lot of
value," and that programmers should not be, and are not, resentful about
not being paid for their work that companies use to make money because they
understand the benefit to their reputation. Wikipedia is an example of the
way Web 2.0 is built around trust: While anyone can submit an article or
make edits, there is still an inner circle of users who have proven their
loyalty and integrity. He praises online information gathering projects
for their attempts at accuracy, given that "anything we do is a selection
of reality. That's a great source of disorder in our society." O'Reilly
says the "wisdom of crowds" is somewhat represented by Google, which he
calls "the furthest we've come toward artificial intelligence," and while
individuals still make decisions based on the "quality of results," these
decisions, which go against the Google system itself, sometimes turn out to
be wrong. Pointing out that the best anti-spam measure is people's ability
to identify it as spam, he says, "We're moving into a world that's not just
about people expressing opinion--it is really about distributed data
gathering and real time intelligence." O'Reilly does not fear the entrance
of PR and advertising into sites like YouTube, because the success or
survival of such services rests on their quality. When asked about
predicting the next buzzword to have the impact of Web 2.0, he spoke of a
new magazine he is helping to create called "Make," which will focus on the
interaction of computing with the physical world, such as custom
manufacturing, synthetic biology, and the democratization of these types of
advances.
Open-Source Spying
New York Times Magazine (12/03/06) P. 54; Thompson, Clive
There is a glut of chatter for intelligence agencies to sift through to
find evidence of terrorist plots or other kinds of criminal activity, and
it is hoped that wikis or blogs might help ease the burden and
revolutionize analysis. The idea was spawned from an essay written by
Calvin Andrus of the CIA's Center for Mission Innovation, which posited
that it is the explosion of self-publishing in which the real power of the
Internet resides; Andrus noted that blogs and wikis are self-organizing,
and theorized that if agents or analysts posted blogs and wikis on the
Intelink network, then mob intelligence would ensue and facilitate a
democratic process of information sharing. Perhaps even more
significantly, the blogs and wikis could substantially enhance Intelink's
search engines. With such an approach, clues to a terrorist plot such as
the one behind the 9/11 attacks would inexorably come together and
gain authority in the intelligence community, Andrus suggested. A wiki's
usefulness to intelligence analysis is being tested with Intellipedia, a
prototype wiki for intelligence employees; agents are encouraged to add to
the wiki's content, which consists of hundreds of articles from
nonclassified documents. Thomas Fingar of the Office of the Director of
National Intelligence (DNI) admits that Intellipedia will not eliminate the
likelihood of false or erroneous reportage, but he thinks a sufficient
number of contributing analysts will catch major mistakes. Meanwhile, DNI
CIO Dale Meyerrose directed the creation of a test blog for intelligence
collection. New York University professor Clay Shirky says the success of
"social software" for intelligence agencies depends on convincing thousands
of analysts to start blogging and producing wikis, and key to this will be
shifting agents' secretive mindset to one that is more open to sharing.
But there are concerns that such an approach could expose potentially
dangerous information to the wrong people.
Tomorrow's Security Today
InformationWeek (12/04/06) No. 1117, P. 45; Greenemeier, Larry
Security technologies now under development stand out for their proactive
approach. The linkage between physical and IT security
technologies is a central component of video surveillance, and upcoming
innovations in this domain include IBM's Smart Surveillance middleware,
which embeds analytical capabilities into camera, chemical-sensor, radar,
and audio surveillance systems for the detection of suspicious activity.
3VR Security CEO Stephen Russell says the market for recording and managing
video surveillance was revolutionized by the storage of digital video on
hard drives. Jeff Platon with Cisco Security Solutions says the next few
years will see the availability of technology that can match images of
employees and visitors with video footage of people walking through a
business' front door, once facial-recognition software improves. Standards
for protecting systems and data from outside attacks and physical theft are
under development by the Trusted Computing Group: Examples include the
Trusted Network Connect standards for network access control technology and
the Trusted Platform Module for the special storage of user credentials off
the hard drive. Wave Systems CEO Steven Sprague predicts that within a
decade, "You will authenticate the human being to the machine, and the
machine will authenticate you to the network." Advanced fingerprint
authentication solutions from the likes of Nanoident Technologies are also
on the horizon. The biometric sensors Nanoident makes can reportedly scan
prints, tissue structure, and hemoglobin levels, while CEO Klaus Schroeter
says the wide implementation of fingerprint authentication technology
requires an upgrade in accuracy. Around the close of the decade, companies
will be capable of ascertaining whether criminals can blend together
seemingly harmless pieces of information about clients, employees, and
partners to access sensitive data through innovations pioneered by groups
such as the Palo Alto Research Center's security and privacy research unit,
which is working on privacy monitoring software with a data inference
assessment application.
Software Fault Avoidance Issues
Ubiquity (12/04/06) Vol. 7, No. 46; Saha, Goutam Kumar
Center for Development of Advanced Computing (CDAC) scientist and ACM
Ubiquity associate editor Goutam Kumar Saha highlights various issues of
software fault avoidance, which is a methodology to generate fault-free
software via techniques designed to lower the occurrence of latent defects
in software programs. Software fault avoidance strategies outlined by the
author include verification and validation, software testing, and proof
methodology. Fault avoidance seeks to prevent flaws from cropping up in
the operational system through fault prevention (eliminating the likelihood
of faults occurring in a system before it is up and running), fault removal
(pinpointing and removing the causes of errors), and fault forecasting (a
series of methods for estimating the presence, creation, and consequences of
faults). Fault prevention can be facilitated through quality control
techniques applied during the design and construction of hardware and
software, examples of which include structured programming, information
hiding, modularization, rigorous design rules, training, rigorous
maintenance procedures, and firewalls. Saha writes that reducing the
probability of faults occurring is the purpose of fault avoidance
techniques, while maintaining system operations despite the presence of
faults is the goal of fault tolerance techniques; tolerance techniques
include exception handling, watchdog timers, assertions, acceptability
checks, reasonableness checks, design diversity, and data diversity. The
author lists a number of software fault avoidance rules that should be
adhered to irrespective of the type of software structure installed. These
rules include the specification and analysis of all requirements via formal
techniques; the debugging and stabilization of the specification document
prior to the component development; the creation of a problem-solving
protocol; the formalization of all verification, validation, and tests that
reveal the absence of correlated faults; and the rigorous testing of all
specifications, design, and code. The use of robust design concepts in
conjunction with fault avoidance techniques during software system design
can help avoid faults caused by environmental changes or failure caused by
latent defects, according to Saha.
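Two of the fault-tolerance techniques Saha lists, assertions and acceptability (reasonableness) checks, are easy to illustrate. The sketch below is a made-up Python example, not code from the article, and the temperature bounds are arbitrary illustrative values:

```python
# Assertions catch impossible internal states; acceptability checks reject
# results outside a reasonable range before they propagate.

def mean_temperature(readings: list[float]) -> float:
    assert len(readings) > 0, "caller violated the precondition"  # assertion
    result = sum(readings) / len(readings)
    # Acceptability check: an average outside plausible physical bounds
    # signals a latent fault upstream, so fail loudly rather than return
    # garbage. Bounds here are illustrative surface temperatures in deg C.
    if not -90.0 <= result <= 60.0:
        raise ValueError(f"implausible mean temperature: {result}")
    return result

print(mean_temperature([21.5, 22.0, 20.8]))  # 21.43...
```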
Send in the Terminator
Scientific American (12/06) Vol. 295, No. 6, P. 37; Stix, Gary
Microsoft has found several termination bugs in drivers in its upcoming
Vista version of Windows using its new Terminator tool. Developed by
Microsoft Research, Terminator is designed to check for bugs by proving
that a driver will always complete its task. Terminator marks a
breakthrough for an industry that, thanks to research from mathematician
Alan Turing, has long had mathematical proof that no general algorithm can
verify that an arbitrary program will run to completion. "Turing proved that the problem was
undecidable, and in some sense, that scared people off," says Byron Cook, a
theoretical computer scientist at Microsoft Research in the Cambridge
laboratory who headed the Terminator project. Microsoft has been using the
automated verification tool for nine months, but outside developers of
Windows device drivers have not had access to it. Cook believes Terminator
may one day find proofs for 99.9 percent of commercial programs that do
finish executing. Still, Terminator does not prove that Turing was
wrong. "There will always be an input to Terminator that you can't prove
will terminate," adds Cook.
Philanthropy's New Prototype
Technology Review (12/06) Vol. 109, No. 5, P. 48; Surowiecki, James
The One Laptop per Child (OLPC) initiative echoes the spirit of
philanthropist Andrew Carnegie's library-building effort of encouraging
local participation in the hope of getting people to sign on for free
access to knowledge, in this case via a super-cheap laptop. The OLPC
project is the brainchild of MIT Media Lab cofounder Nicholas Negroponte,
who hopes to bridge the "digital divide" between Internet haves and
have-nots by giving all children in the developing world laptops that cost
just $100 to build and $30 a year to own and run. The design and marketing
of the $100 laptop is the responsibility of the OLPC nonprofit, while its
construction will be handled by an outside manufacturer. For the moment,
governments will be responsible for purchasing the computers. The
technical requirements of the $100 laptop include ruggedness,
functionality even in the absence of a steady power supply, easy Internet
access and networking, and a cheap, readable display; designers claim to
have met some of the most difficult challenges, including power generated
by a foot pedal or pull string, linkage of the computers into a mesh
network, and provision of an approximately $35 screen with a
high-resolution black-and-white mode as well as a backlit, lower-resolution
color mode. The first working models of the laptop will be tested in five
developing nations, but even successful tests are unlikely to ease
Negroponte's job of convincing governments to purchase the machines. But
if the OLPC effort works, it will help usher in a new philanthropic model
that stresses the funding of projects in areas where both business and
government have failed to meet a critical need, and that concentrates
more on the social returns that investments in charities yield. OLPC
embraces the "high-engagement philanthropy" model by working within the
market in its outsourcing of the laptop's manufacture to companies that
expect to earn profits, while its reliance on three distinct types of
enterprises--private companies, nonprofits, and governments--is another
differentiating factor; in addition, OLPC is taking on activist
responsibility in its goal of persuading governments to buy the
technology.