Olde Fashioned Legal Loopholes Allow Rigging of Hi-Tech
Elections
VoteTrustUSA (01/30/07) Stanislevic, Howard; Washburn, John
Elections can be fixed by exploiting legal loopholes in election reform
legislation, leading software test professional John Washburn and computer
network engineer Howard Stanislevic to conclude that any such legislation
should be rated according to its ability and intent to lower the risks of
such exploitation. Election management servers (EMS) can be linked to the
Internet even though Internet connections may be prohibited for voting
machines on which votes are cast, creating a situation in which a Trojan
horse program can be introduced or the ballot definition corrupted,
facilitating the election of the wrong candidate or the disenfranchisement
of voters, among other things. A second loophole permits equipment with a
high failure rate to remain in service rather than being disqualified. This
renders denial of service attacks indistinguishable from "normal" in situ
malfunctions that fall under federal standards, once again clearing the way
for voter disenfranchisement and the election of the wrong candidate.
Instructions directing voters to confirm voter-verifiable records are
insufficient or nonexistent, which means the wrong candidate could be
elected while discrepancies between DRE summary screens and the
voter-verifiable paper records cannot be spotted by voters even in the event of
a full recount. A fourth loophole is the lack of a requirement to conduct
statistically meaningful audits, which can lead once more to the election
of the incorrect candidate. Former ACM President Barbara Simons thinks
Internet access to election management systems is "a very bad idea," and
draws a distinction between Internet connections and the display of
election data on a Web site, which presents no danger. Washburn and
Stanislevic think that "anyone with a bona fide interest in election
integrity should be on the lookout for the above loopholes ... in any
current or proposed legislation and must fight to close them before it's
too late." For more information about ACM's e-voting activities visit
http://www.acm.org/usac
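The audit loophole is quantifiable: to be statistically meaningful, an audit must sample enough precincts to make a rigged outcome likely to be caught. A minimal sketch of that calculation (the function and its parameters are illustrative, not from the article):

```python
import math

def min_audit_sample(total_precincts: int, corrupt_precincts: int,
                     confidence: float = 0.95) -> int:
    """Smallest number of randomly chosen precincts to audit so that the
    chance of catching at least one corrupted precinct is >= confidence.
    Uses the exact hypergeometric probability of missing every corrupt one."""
    for n in range(1, total_precincts + 1):
        # Probability that an audit of n precincts misses all corrupt ones
        miss = (math.comb(total_precincts - corrupt_precincts, n)
                / math.comb(total_precincts, n))
        if 1.0 - miss >= confidence:
            return n
    return total_precincts

# E.g., fraud spread across 10 of 500 precincts, 95 percent confidence:
print(min_audit_sample(500, 10))
```

A fixed small-percentage audit can fall far short of the sample size this kind of calculation demands, which is the substance of the fourth loophole.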
A Lively Market, Legal and Not, for Software Bugs
New York Times (01/30/07) P. A1; Stone, Brad
Both hackers and security companies engage in the buying, selling, and
trading of software vulnerabilities, but for a researcher who has
discovered a bug, the black market is often more tempting. Companies such
as Microsoft encourage security researchers to report bugs rather than sell
them, but there is no monetary incentive to do so. "To find a
vulnerability, you have to do a lot of hard work," says Evgeny Legerov,
founder of a small security firm in Moscow. "If you follow what they call
responsible disclosure, in most cases all you receive is an ordinary thank
you or sometimes nothing at all." Instead, Legerov's company sells this
information directly to corporate customers at prices starting at $10,000
for periodic updates. In the 1990s, a sort of agreement was reached where
security researchers would inform software manufacturers of bugs and allow
time for them to release official patches before disclosing the flaw to the
public, as long as the manufacturer gave them credit for their discovery.
However, this era of researchers who were satisfied by simply being
recognized began to erode about five years ago as security companies began
to purchase vulnerabilities and provide clients with solutions, claiming
that both clients and manufacturers were notified before the public. The
criminal marketplace for vulnerabilities, and the high prices it can
generate, gained attention in January 2006, when a group of Russian hackers
was found to have sold a zero-day exploit targeting the Windows Metafile
(WMF) format that led to spyware and malware being planted on tens of thousands of
computers worldwide. "You will always make more [money] from [selling
vulnerabilities or malicious code to] bad guys than from a company like
3Com," says eEye Digital Security co-founder Mark Maiffret.
Took Time, But Finally Tech Jobs Are Rising
Investor's Business Daily (01/30/07) P. A5; Krause, Reinhardt
The U.S. tech sector has added a considerable number of jobs for the first
time since 2001, largely on the strength of venture capital investment. Joint Venture: Silicon
Valley Network released a study showing that the area added over 33,000
jobs in fields throughout the sector during the 12 months ending June 30,
2006. Startups and entrepreneurial activity contributed a great deal to
the growth, says Joint Venture President Russell Hancock. "VC is a leading
indicator, while employment still lags a bit. We're seeing a real upturn
in VC activity into new areas," Hancock says. He expects Silicon Valley
to add 30,000 to 50,000 jobs in 2007, numbers that he calls "hopefully more
sustainable" than previous bubble levels. Large companies such as Intel
and Hewlett-Packard have been cutting jobs, but have been able to increase
productivity, while others have been moving work to lower-cost areas. The
U.S. chip industry seems to be rebounding, as it added 1,100 jobs in the
first half of 2006, and currently has 75 percent of the workforce it did in
2001 and a global market share of 47 percent. Texas' "telecom corridor"
has seen recent success, thanks in part to RFID and VoIP companies.
Overall, the total jobs cut by tech and telecom companies reached a
six-year low of 131,200 in 2006, according to Challenger, Gray & Christmas.
Meanwhile, North Carolina is an emerging tech hotbed: Google will build a
$600 million data center in Lenoir that will employ over 200 people, and
Dell opened its Salem plant in late 2005, employing 1,100.
New York Halts E-Voting Machine Testing
Computerworld (01/29/07) Songini, Marc L.
The New York State Board of Elections has suspended the evaluation and
certification of e-voting machines because Ciber, the company contracted to
do the job, had not met the requirements established in 2005. A Ciber
representative says, "The issues found in the audit do not reflect on the
accuracy of tests conducted before the audit. Ciber was accredited at the
time those tests were conducted, and they met all of the standards set for
testing and accreditation at that time." The U.S. Election Assistance
Commission (EAC) confirmed that the failure to meet requirements was due to
problems in the company's documentation process, although the specific
problems were not identified. Ciber had requested a special interim
accreditation available to businesses whose applications for 2005
certifications had not yet been processed, but the EAC turned down the
request. Ciber has sent a portion of the audit report conducted by the EAC
to the New York State Board of Elections for review. The Ciber
representative said the issues initially raised in the audit have been
addressed and the company is waiting for further notification. However,
EAC Executive Director Thomas Wilkey wrote a letter to Ciber's Wally
Birdseye claiming that the company had failed to follow its own quality
management requirements. The Ciber representative said "we don't feel it's
appropriate to comment further on the [audit] process while it's still
underway. Our focus is to concentrate our resources on addressing any
issues, completing the process, and achieving accreditation."
Women in Tech: A Call to Action
InfoWorld (01/29/07) Nobel, Carmen
The changing character of IT is making the field more hospitable to female
employees, and while women should not be hired based simply on stereotypes,
they traditionally possess the attributes that many IT departments will
begin to need more and more. The U.S. Bureau of Labor Statistics found that
in 2006 women held just 26.7 percent of U.S. computer and mathematics jobs,
a percentage that has been falling for some time, a decline visible in
nearly every category of IT. Women have also been leaving IT jobs at an alarming
rate. Mobile technology allows for significantly increased flexibility for
IT employees, so women with children are not as restricted in their
abilities as an employee. "Goodness, if we can outsource call centers in
India, we can help people have virtual office and flexible work hours,"
says former Hewlett-Packard CEO Carly Fiorina. As IT becomes less of an
isolated area within a larger company, the range of skills required of IT
employees is widening. Fiorina says, "The ability to collaborate with
others, the ability to communicate clearly, and the ability to see the
forest and not get lost in the trees," are now more necessary than ever.
Fiorina says that when hiring, companies should look at previous experience
and increase training efforts in order to find and build employees that
provide valuable skills that are currently scarce within IT, and while
women should not be hired simply because they are women, most IT
departments could balance their overall skill sets by seeking out female
potential employees. The role of female IT professionals as mentors and
role models cannot be overestimated, as mentoring is considered one of the
most effective ways to cultivate interest among young women and current
female employees.
FBI Turns to Broad New Wiretap Method
CNet (01/30/07) McCullagh, Declan
A University of Colorado law professor recently detailed an FBI
surveillance technique where electronic information from thousands of users
is obtained and placed into a searchable database if a specific individual
or their IP address cannot be found. This "full-pipe" surveillance is
capable of recording all Internet traffic flowing through a network, with
interception occurring inside an ISP's network at the junction point of a
router or network switch. "You intercept first and you use whatever
filtering, data mining to get at the information about the person you're
trying to monitor," said Paul Ohm, who shed light on the practice at the
Search & Seizure in the Digital Age Symposium held at Stanford last week.
This new system is being called more invasive than the FBI Carnivore
surveillance program, which was abandoned two years ago. Federal law
states that the FBI must perform "minimization," whereby agents must
"minimize the interception of communications not otherwise subject to
interception" and inform the supervising judge as to what is happening.
However, another section of the law states that if the information being
obtained is in code or a foreign language and no expert in that language is
readily available, "minimization may be accomplished as soon as practicable
after such interception." Since digital communication is all encoded,
investigators can record as much as they please, without sorting it out
until later. The 1978 case U.S. v. Pine held that investigators are allowed
to keep listening to a tapped phone line in order to prosecute other
illegal activities not originally mentioned in the wiretap order,
suggesting that the same could be done with information obtained through
full-pipe surveillance.
Can Software Catch Up to ICs?
EE Times (01/29/07) Crawford, Catherine
Despite the continued advances in chip technology, the software necessary
to take advantage of them is not available, writes Catherine Crawford,
chief architect for next-generation systems software at IBM Systems Group's
Quasar Design Center. She writes that although the number of floating
point operations per second on processors and systems has not yet reached
its limit, the amount that most software can utilize is still behind
today's available capacity. Meanwhile, innovative multicore processor
architectures have shown ability to utilize considerable computing power in
entry-level servers with multiple processors. To make use of the computing
power, developers must focus on enabling new programming methodologies and
the corresponding application frameworks and tools. The world's first
supercomputer based on Cell Broadband Engine (Cell BE), known as
Roadrunner, is one such architecture that can reach over 1.6 petaflops.
Roadrunner's architecture, multiple heterogeneous cores with a multitier
memory hierarchy, is made completely from commodity parts. Opteron
processors will handle standard processing, and Cell BE processors will
handle the more mathematically and CPU-intensive tasks. The system is
based on a "division of labor" philosophy. A set of computational kernel
developers are maximizing performance from the microprocessor ISA; library
developers will use frameworks such as Roadrunner's to create multicore
memory hierarchy libraries from the kernels; and application developers
will then connect these libraries using standard compiler and linker
technology. The Roadrunner approach shows how costs can be significantly
lowered by using consistent API methodology across a variety of multicore
architectures without introducing new languages.
New UNH Model Measures Cyber Threat
Foster's Daily Democrat (NH) (01/26/07) Kressler, Thomas R.
Students and researchers of the University of New Hampshire's Justiceworks
Technical Analysis Group have developed a computer model that can be used
to evaluate the threat of a cyber attack posed by a specific terrorist
group, individual, or nation. The Cyber Threat Calculator produces a value
indicating the level of threat based on several variables, the two most
prominent of which are intent and capability. Actual groups and countries
were used as case studies when the Threat Calculator was demonstrated at
the Department of Defense cyber crime conference in St. Louis. A friendly
but technologically advanced country would receive a high score for its
capability to harm the United States, but its overall threat level would be
brought down by a low score for intent to do harm. The Threat Calculator is
designed to evaluate large-scale strategic threats, rather than tactical
threats such as viruses, worms, and identity theft. The United States must
prepare for attacks on targets such as power grids, emergency response
systems, financial services, and telecommunications, says Andrew Macpherson,
a UNH professor, Justiceworks researcher, and contributing researcher for
the Department of Homeland Security. Justiceworks plans for the Threat
Calculator to be on the Web by late summer so any organization can utilize
it.
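The actual formula behind the Cyber Threat Calculator is not published; as an illustration of how intent and capability can dominate the score, here is a hypothetical model in Python (the function name, the normalization to [0, 1], and the multiplicative form are all assumptions, not the UNH design):

```python
def threat_score(intent: float, capability: float, *,
                 other_factors: float = 1.0) -> float:
    """Hypothetical threat score. The article names intent and capability as
    the two dominant variables; multiplying them captures the idea that a
    capable but friendly actor (high capability, low intent) scores low."""
    for v in (intent, capability, other_factors):
        if not 0.0 <= v <= 1.0:
            raise ValueError("each variable must be normalized to [0, 1]")
    return intent * capability * other_factors

# A friendly but technologically advanced country: high capability, low intent
print(threat_score(intent=0.1, capability=0.9))
```

The same high-capability actor with hostile intent (say, intent=0.9) would score nine times higher, matching the behavior the demonstration described.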
The Vanishing IT Woman--System i Women Respond
IT Jungle (01/29/07) Roberts, Mary Lou
Mary Lou Roberts says readers have responded positively to her citation of
a Gartner study in an earlier story, which contends that the number of
female IT workers is declining faster than that of their male counterparts,
possibly because women's needs and desires are not being met in the IT
arena. "Many women ... are choosing to stay home, partly because
business has not understood that women have multiple priorities and that
women are still the primary caregivers," notes analyst Nate Viall, who adds
that women put less stock in their work than men. Enrollment of women in
computer science and IT management college programs is declining three
times faster than that of men. Bytware President Christine
Grant partly attributes the lower number of women entering IT to the fact
that although computers are more prevalent in education now, the methods of
teaching computer use and counseling people for tech careers have hardly
changed in two decades. "Perhaps girls ... see computers as a means of
efficiency for improving a job and not so much as a career," she reasons.
"Only through continued exposure to the career opportunities will we see an
increase in girls pursuing IT college degrees and career paths." Though
Grant acknowledges that men and women will always be subject to general
stereotyping, she notes that each person should be considered according to
his or her individual skills in relation to the job he/she is applying for.
Beverly Russell, IT director at E.D. Smith & Sons and past president of
COMMON, argues that many women choose not to enter IT because the
after-hours commitment does not appeal to them; on the other hand, women
whose caregiving and child-rearing responsibilities have wound down are
more likely to devote more time to work and ascend the career ladder. One of
the distinctions Russell makes between men and women is their expectations
of technology, observing that women are less tolerant of failures and
outages, and a combination of this attitude and good problem-solving and
communication skills can make them outstanding candidates for System i
management positions.
Fortress a Big Jump on Fortran
Australian IT (01/30/07) Gengler, Barbara
Sun Microsystems has released an open-source prototype of Fortress, a
high-performance, statically checked, nominally typed, and component-based
programming language intended for work requiring heavy use of mathematics.
Also available are a series of draft specifications of Fortress and
published formal calculi and soundness proofs of some of the language's
main features. Fortress is meant to be a secure replacement for the
five-decade-old Fortran language, although no backward compatibility with
existing Fortran versions has been built into the new language. Fortress
includes an object-oriented type system, first-class functions, a component
system supporting separate compilation, and secure upgrade of program
components, according to project leader Eric Allen. The language is only
part of a 10-year effort by Sun to design a supercomputer of the future,
says Sun's Guy Steele, who explains that too often the mistake is made of
trying to design a programming language "that is all things to all
people." Fortress's dynamic compilation capabilities will allow
optimization to be done while programs are running. Other features, such
as Fortress's component system and test framework, support program
assembly and testing while allowing compiler optimization across library
boundaries. The language will run on many different platforms such as
supercomputers that have large stores of addressable memory, commodity
clusters, and workstations. Sun's long-term goal is a compiler that can
improve performance by adjusting the compiler version of software as it
runs.
Tech's Dark Potential Troubles Terror Expert
Mercury News (01/29/07) Davies, Frank
Former U.S. terrorism czar Richard Clarke's new book, "Breakpoint,"
pictures a world at the mercy of cyber terrorism, where biotechnological
advancements allow wealthy parents to create ideal children and brain links
offer enhancements that could change the very nature of humanity. At a
recent book signing, Clarke cited a Chinese general who claimed that "China
could turn off the U.S. power grid (through a cyber attack) during a war."
Clarke, who said he spoke with futurist Ray Kurzweil about the technology
discussed in the book, claimed, "Some of these things sound like science
fiction, but they're not." He noted the difficulty in projecting when
certain technologies will emerge; even if they are banned in the United
States, they could be developed elsewhere. "We need to be aware of what's
coming, because sometimes new technologies burst on the scene before we
decide if we want them and what the consequences are," Clarke said.
Nanotechnological and neurological innovations currently allow brain
implants to provide hearing to the deaf and allow the paralyzed to move a
mouse-like device with their thoughts alone, and the military has already
tested brain-computer linkage and worked on exoskeleton body suits.
Bioethicist James Hughes, who admits that "the scenarios Clarke describes
are quite plausible," stresses the need to make such technology widely
available if it does become a reality, rather than allowing it to be
limited to those who can most afford it.
Survey: The Demise of Unix Is Exaggerated
InformationWeek (01/29/07) Gaudin, Sharon
Enterprises still prefer Unix, especially high-end versions, even though
x86-based systems continue to make significant strides with regard to
capabilities, scale, and availability, writes Gabriel Consulting Group
analyst Dan Olds in a new report. The research and analysis firm surveyed
277 data center managers and found that Unix usage is on the rise at almost
70 percent of the companies, most of which had 1,000 to 10,000 employees.
Between 22 percent and 26 percent of data center managers said Unix usage
was flat or actually decreasing. Also, usage of low-end Unix
systems remains steady. "We believe that the much discussed death of Unix,
like the death of the mainframe and the minicomputer, is more myth than
fact and will remain so for at least the next decade," writes Olds. "Unix
systems and their associated applications fulfill vital functions in many
organizations and the costs and risks of switching to a different system
architecture are too high to justify benefits that may not be as
significant as promised."
A Computer Program Wins Its First Scrabble
Tournament
Chronicle of Higher Education (01/26/07) Read, Brock
The performance of the computer program Quackle at the Toronto Computer
vs. Human Showdown in November represents another high point for artificial
intelligence in gaming. The open-source program defeated the former
Scrabble world champion, David Boys, in a best-of-five series. Boys, a
computer programmer who won the championship in 1995, took the first two
games, but Quackle won the last three, spelling words by inserting letters
in opportune spots that even excellent human players might not have
noticed. Massachusetts Institute of Technology student Jason Katz-Brown was
a chief designer of Quackle, which finished the event with an impressive
32-4 record. Quackle defeated another Scrabble-playing program, Maven, for
the opportunity to play Boys. Maven had a record of 30-6. Quackle's
victory follows the well-publicized loss of chess grandmaster Garry
Kasparov to Deep Blue.
Daylight Saving Changes: No Y2K, But There Could Be
Headaches
Network World (01/25/07) Mears, Jennifer
IT shops must gird themselves for an earlier than usual switch to
daylight saving time this year if they wish to avoid problems with
timestamp-reliant applications that could put record compliance assurance,
operating room scheduling, billing, contract deadlines, and other processes
at risk, according to industry experts. The Energy Policy Act of 2005
decreed an extension of the daylight-saving schedule by a month, with the
period commencing on the second Sunday in March and ending on the first
Sunday in November. This change could disrupt IT systems that
automatically implement daylight saving time according to the old
schedule, and thus IT professionals need to closely examine their systems
and applications to ascertain which could be thrown out of whack when the
change takes place and then take remedial action. "My fear is that a lot
of people aren't going to realize this is a big issue until months down the
road when they say, 'Oops, why aren't these dates lining up,'" says
TrueCredit CTO Scott Metzger. The majority of major IT vendors have Web
pages detailing what fixes are needed for their products to comply with the
new schedule, while smaller vendors are also taking steps to update their
products. Metzger says most of the updates his team has been working on
are focused on Java virtual machines. Advances in Technology CIO Rich
Debrino notes that there should not be many difficulties for systems
connected to external network time servers. IT consultant Mike Sly is
perplexed that so many people seem unaware of the daylight saving
switchover problem.
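The rule change itself is easy to state in code, which is also why hard-coded calendars break. A sketch of the old and new U.S. rules (the helper names are mine, not drawn from any vendor's patch):

```python
from datetime import date, timedelta

def nth_weekday(year: int, month: int, weekday: int, n: int) -> date:
    """n-th occurrence (1-based) of weekday (Mon=0 .. Sun=6) in a month."""
    d = date(year, month, 1)
    offset = (weekday - d.weekday()) % 7
    return d + timedelta(days=offset + 7 * (n - 1))

def last_weekday(year: int, month: int, weekday: int) -> date:
    """Last occurrence of weekday in a month."""
    nxt = date(year + (month == 12), month % 12 + 1, 1)  # first of next month
    d = nxt - timedelta(days=1)
    return d - timedelta(days=(d.weekday() - weekday) % 7)

SUNDAY = 6

def us_dst_window(year: int, new_rules: bool):
    """(start, end) of U.S. daylight saving time under either rule set."""
    if new_rules:  # Energy Policy Act of 2005, effective 2007
        return (nth_weekday(year, 3, SUNDAY, 2),   # second Sunday in March
                nth_weekday(year, 11, SUNDAY, 1))  # first Sunday in November
    return (nth_weekday(year, 4, SUNDAY, 1),       # first Sunday in April
            last_weekday(year, 10, SUNDAY))        # last Sunday in October

old_start, _ = us_dst_window(2007, new_rules=False)
new_start, _ = us_dst_window(2007, new_rules=True)
print(new_start, old_start)
```

Any timestamp falling between the two start dates (and likewise between the two end dates) is the window in which a system still on the old rule reports the wrong local time.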
Researchers Pave the Way for Canada's First Intelligent
Home
CNW Group (01/25/07)
Researchers in Toronto have tested in clinical trials a home-based
computer prompting system and an emergency response system that use
artificial intelligence. The computer prompting system was found to
increase hand-washing by about 25 percent, while the emergency response
system was found to have detected 77 percent of staged falls. Developed by
researchers at the Intelligent Assistive Technology and Systems Lab, a
joint venture between the Toronto Rehabilitation Institute and the
University of Toronto, the home-based systems were created with older
adults who have cognitive impairments in mind. The computer prompting
system essentially acts as a "talking" bathroom that provides video and
verbal cues from a computer screen. The emergency response system makes
use of ceiling-mounted cameras throughout a house to feed pictures to its
computer system in order to analyze images, as well as a voice recognition
system that asks occupants if they need help. "Our systems use computer
algorithms that act more like a human in terms of rational thought and
decision-making," says Toronto Rehab researcher Dr. Alex Mihailidis, who
adds that they are not meant to replace assistance from caregivers.
"However, the results from our studies are encouraging and show that the
use of artificial intelligence in a home setting can provide safety and
security and enhance the quality of life for older adults who would like to
remain in their homes as they age."
Adobe to Send PDF to Standards Group
CNet (01/28/07) LaMonica, Martin
Adobe Systems this week is expected to describe its plans for submitting
its Portable Document Format specifications to the International
Organization for Standardization (ISO). Although subsets of the PDF format
have already been standardized, Adobe has been told by customers that
making PDF an ISO-approved standard would increase confidence in the
product's longevity. "We've already been taking feedback and updating the
specification over time," says Adobe's Kevin Lynch. "Now we'll be doing it
in a more formal way, through a standards body." The specifications for
PDF Reader and Acrobat will be given to the Enterprise Content Management
Association, which will convene a working group and pass the specification
to a standardization technical committee hosted by the ISO. Lynch expects
the process to take from one to three years. Adobe plans to stay
compatible with any PDF standards in its own products, and for existing PDF
standards to comply with any ISO standard. "We are starting to see the
industry get more interested in these document formats being managed by
standards bodies," said Lynch. "We see Microsoft responding to that, and
we are certainly responding to that, too."
Winning Ways; Artificial Intelligence
Economist (01/27/07)
Computers have proved their superiority over humans in chess, draughts,
Othello, and backgammon, and it seems that yet another game, Go, which
humans have so far dominated, may soon fall to the machines. The
method used by Deep Blue to defeat Garry Kasparov in 1997 is known as "brute
force": rather than analyzing a single position to figure out the best
moves, as humans do, the computer considers all of its possible moves, all
of the opponent's possible replies, and all of its own responses to each
reply. The computer then searches the map it has created, which has
millions of branches, to find the move that would leave its opponent the
fewest chances of winning. However, this method
does not work in Go, a game that has many more possible positions than
chess. A technique known as the Monte Carlo method, originally devised for
the Manhattan Project, is now being used to let computers compete with
humans in Go. This method uses an algorithm that considers every possible
move and plays many random games to find out their outcome; a move is
considered good if it wins 80 percent of the time. Monte Carlo techniques
are actually a lot faster than brute force, since computers are very adept
at generating random sample games. New developments in the Monte Carlo
method have allowed programs using it to win computer tournaments on
9-by-9 and 13-by-13 Go boards. One program, MoGo, has even defeated strong
human players on 9-by-9 boards, an unfathomable feat only a year ago, and
it has reached a world ranking of 2,323 and a top-300 ranking in Europe.
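The flat Monte Carlo move selection the article describes can be sketched generically. A full Go engine is far out of scope, so this illustrative Python version applies the same idea to a trivial game, single-pile Nim; every name here is mine:

```python
import random

# Toy stand-in for Go: a single pile of stones, players alternately take
# 1-3 stones, and whoever takes the last stone wins.
def legal_moves(stones):
    return [n for n in (1, 2, 3) if n <= stones]

def playout(stones, my_turn):
    """Finish the game with uniformly random moves; True if our side wins.
    `my_turn` says whose move it is in the given position."""
    while stones > 0:
        stones -= random.choice(legal_moves(stones))
        if stones == 0:
            return my_turn      # whoever just moved took the last stone
        my_turn = not my_turn
    return not my_turn          # empty pile: the previous mover already won

def monte_carlo_move(stones, playouts=2000):
    """Rate each legal move by the fraction of random games won after making
    it; return the highest-scoring move (flat Monte Carlo, no tree search)."""
    best_move, best_rate = None, -1.0
    for move in legal_moves(stones):
        wins = sum(playout(stones - move, my_turn=False)
                   for _ in range(playouts))
        if wins / playouts > best_rate:
            best_move, best_rate = move, wins / playouts
    return best_move

print(monte_carlo_move(3))  # 3: taking the whole pile wins outright
```

Programs such as MoGo combine this random-playout rating with tree search, but the core loop of judging a move by the fraction of random games it wins is as above.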
Start of the Hologram Wars?
New Scientist (01/20/07) Vol. 193, No. 2587, P. 22; Graham-Rowe, Duncan
Competing brands of holographic discs that can store up to 300 times as
much data as current DVDs could fragment the market, which is already
smarting from the continuing format war between Blu-ray and HD DVD
standards. The holographic storage methods employed by DCE Aprilis and
InPhase Technologies follow the same principles, but the companies differ
in how they record and read data because the light-sensitive polymers the
companies use are different. Aprilis' polymer employs "cationic
ring-opening polymerization," while InPhase's uses "free-radical
polymerization." As a result, Aprilis' drives can read and record discs
rapidly--more than 1 Gbps--while InPhase drives have a maximum read speed
of 20 Mbps and an even slower record speed. InPhase's product counters its
speed disadvantages with market readiness, although the cost of the drives
and disk cartridges will probably appeal initially to businesses and
governments that generate massive volumes of data every day. Although
further behind in development, Aprilis and its partners are pushing for a
mass-market version of their product that could replace Blu-ray and HD DVD.
A lingering problem is the discs' lack of rewritability.
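The speed gap matters in practice. Assuming a 300 GB disc for illustration (InPhase announced that figure for its first-generation product, but treat it as an assumption here), the read times at the two quoted rates work out as follows:

```python
def transfer_time_hours(capacity_gb: float, rate_mbps: float) -> float:
    """Hours to read a disc of capacity_gb decimal gigabytes at a
    sustained rate of rate_mbps megabits per second."""
    bits = capacity_gb * 8e9            # gigabytes -> bits
    return bits / (rate_mbps * 1e6) / 3600

print(transfer_time_hours(300, 20))    # InPhase's 20 Mbps read speed
print(transfer_time_hours(300, 1000))  # Aprilis' 1 Gbps class drives
```

At 20 Mbps a full read takes more than 33 hours, versus roughly 40 minutes at 1 Gbps, which is why the speed difference is more than a spec-sheet detail.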
Cure for the Multicore Blues
IEEE Spectrum (01/07) Vol. 44, No. 1, P. 40; Goldstein, Harry
Writing code for multicore chips is the purpose of the RapidMind
Development Platform, the brainchild of University of Waterloo computer
science professor and RapidMind founder Michael McCool. A sore point for
programmers is the parallel processing inherent in multicore architecture,
and relieving the extra stress this entails is the goal of the RapidMind
platform. In theory, programmers could code their entire application to
run on multiple cores with the assistance of RapidMind, which takes the
most computationally heavy program sections, fragments them, and runs them
in parallel on several processor cores simultaneously. As a commercial
product, McCool's design for the RapidMind platform was "something that I
could teach in about 10 minutes, that you could use without mental overhead
so you can focus on the algorithms, not the details of the particular
processor." The first step in writing an application with RapidMind is the
identification of the components to be accelerated. Rather than calling
subroutines and functions from a garden-variety C++ library, the RapidMind
user employs terms from the RapidMind vocabulary that represent subroutines
and functions in the RapidMind library, which run in parallel. RapidMind
uses dynamic load balancing to accommodate
computationally intensive applications.
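RapidMind's own API is not shown in the article, so the sketch below uses Python's standard process pool as a generic analogue of the pattern described: take the most computationally heavy section, fragment it, and run the fragments on several cores at once. The function names are illustrative:

```python
from concurrent.futures import ProcessPoolExecutor

def heavy_kernel(chunk):
    """Stand-in for a computationally heavy program section; here it is a
    sum of squares over one fragment of the input."""
    return sum(x * x for x in chunk)

def parallel_map_reduce(data, n_fragments=4):
    """Fragment the work, run the pieces on several cores simultaneously,
    then combine the partial results."""
    size = max(1, len(data) // n_fragments)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ProcessPoolExecutor() as pool:
        return sum(pool.map(heavy_kernel, chunks))

if __name__ == "__main__":
    print(parallel_map_reduce(list(range(10))))  # 285 = 0^2 + 1^2 + ... + 9^2
```

What RapidMind adds over a sketch like this is that the fragmentation, scheduling, and load balancing happen behind its library vocabulary, so the programmer writes ordinary-looking calls and the platform decides how to spread them across cores.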