Democrats May Give Voting Machines More Scrutiny
National Journal's Technology Daily (11/14/06) Martinez, Michael
Paper trails for e-voting machines may become a reality under the
newly elected Democratic Congress. Concerns continue to be raised about
the security of e-voting machines, and the problems uncovered in several
races during the recent election gave voting-rights activists no reason to
abandon the issue. Rep. Rush Holt (D-N.J.) plans to reintroduce in the
next session his bill that would mandate e-voting systems leave paper
records, and the proposal has attracted more than 200 of his colleagues as
co-sponsors. The House got a late start in addressing the issue, as the
House Administration Committee did not hold its hearing until late fall.
Voting-rights activists are also optimistic because Rep. John Conyers
(D-Mich.), who was a major figure in investigating e-voting problems in
Ohio for the 2004 presidential election, is in line to become the next
chairman of the House Judiciary Committee. And in the Senate, Sen. Dianne
Feinstein (D-Calif.), who is poised to head the Rules Committee, plans to
introduce a companion bill to the House bill that would also require paper
records. "It will be a different environment," Holt's spokesman Patrick
Eddington says of Congress. For information about ACM's e-voting
activities, visit
http://www.acm.org/usacm
Productive Petascale Computing
HPC Wire (11/15/06) Vol. 13, No. 3
MIT Lincoln Laboratory's Jeremy Kepner, who chaired a panel discussing
"High Productivity Computing and Usable Petascale Systems" at SC06, said
government funding for the Defense Advanced Research Projects Agency's HPCS
program is necessary because there are needs that cannot be met by existing
commercially produced systems. Measuring innovations and determining
which ones feed into productivity are key to the creation of productive
petascale systems, according to Kepner. "I think from a technology
perspective, the
biggest roadblocks are steep memory hierarchies and requiring heterogeneous
parallel programming approaches to achieve performance," he reported. In
Kepner's opinion, the most productive manner in which to balance the
various demands of HPCS is to blend technologies so that they yield the
flattest memory hierarchy and let the users perceive the system in the
least complicated and most consistent way. Kepner described the ideal HPC
programming language as one that boasts "strong support for
multi-dimensional array constructs, PGAS, good single thread performance,
and an integrated interactive development environment." (PGAS stands
for partitioned global address space.) He was confident
that once the most beneficial features of the programming language have
been specified, the optimal adoption strategy will be determined in a
straightforward way.
Computer Industry 'Faces Crisis'
BBC News (11/17/06) Ghosh, Pallab
British Computer Society President Nigel Shadbolt, in his first major
interview since assuming the title earlier this month, expressed concern
that the U.K. is in danger of losing "its preeminent position as a
knowledge-based economy." Shadbolt, who is a professor of
artificial intelligence in the School of Engineering at the University of
Southampton, cites the fact that demand for IT and computer professionals
has doubled but the number of university students graduating with these
degrees has decreased by a third. He fears an inability to compete with
India and China, because "They are equipping their younger generation,
their graduates with substantial amounts of skills particularly in
computing and IT and we do not want to be faced with the situation in which
major corporates who have traditionally sought skills of that sort in this
country look to supply that demand offshore." It is not only the IT field
that would feel the repercussions of such a loss; fields such as
pharmaceuticals and transportation depend heavily on IT abilities.
Shadbolt places blame on schools and on a public image that portrays IT
professionals as "geeky." The economy of the 21st century, as he describes
it, is one where "information is one of the primary assets. So really
understanding the consequences of the technology and the society on
business is fundamental."
Cornell Robot Discovers Itself and Adapts to Injury When
It Loses One of Its Limbs
Cornell News (11/16/06) Steele, Bill
Cornell researchers have built a robot that learns how to walk by
analyzing its own parts, an ability that lets it adapt to changes such
as the loss of a limb. Cornell assistant professor of mechanical and aerospace
engineering Hod Lipson explains, "Most robots have a fixed model
laboriously designed by human engineers. We showed for the first time how
the model can emerge within the robot. It makes robots adaptive at a new
level, because they can be given a task without requiring a model. It
opens the door to a new level of machine cognition." The robot is given
only a list of its parts and an objective. It begins by generating
candidate models of how those parts might be arranged, then chooses the
motor commands that best distinguish among the competing models, sends
those commands, and analyzes the resulting movement to refine the
models. This cycle is repeated 16 times before the robot commits to a
method of moving forward. After completing its first task of reaching a
certain point, the researchers remove a leg, and the robot must go through
the 16 cycles that will let it find the most effective means of continuing
forward. The researchers consider the robot to have primitive
consciousness, because of its ability to consider actions before executing
them. They also believe that this project could provide insight into the
way humans use images of themselves and the imagined result of specific
physical movements when learning to walk.
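Below is a minimal Python sketch of the estimate-explore-refine loop the
article describes (the function names, scoring, and linear "physics" are
invented for illustration; the actual Cornell system relies on physics
simulation and evolutionary optimization):

    import random

    def predict(model, command):
        # Toy stand-in for a physics prediction of forward movement.
        return sum(w * c for w, c in zip(model, command))

    def disagreement(models, command):
        # Score a command by how much the candidate models disagree on it.
        p = [predict(m, command) for m in models]
        mean = sum(p) / len(p)
        return sum((x - mean) ** 2 for x in p)

    def self_model(num_parts, execute, cycles=16, pool=16):
        # Begin with random hypotheses about how the parts interact.
        models = [[random.uniform(-1, 1) for _ in range(num_parts)]
                  for _ in range(pool)]
        for _ in range(cycles):
            # Choose the motor command the models disagree about most...
            trials = [[random.uniform(-1, 1) for _ in range(num_parts)]
                      for _ in range(20)]
            test = max(trials, key=lambda c: disagreement(models, c))
            observed = execute(test)  # ...run it on the real robot...
            # ...and keep the hypotheses that best explain the outcome.
            models.sort(key=lambda m: abs(predict(m, test) - observed))
            models = models[:pool // 2] + [
                [random.uniform(-1, 1) for _ in range(num_parts)]
                for _ in range(pool - pool // 2)]
        return models[0]  # the best surviving self-model

In this toy version, removing a leg amounts to changing the behavior of
the execute callback; rerunning the 16 cycles lets the surviving model
reflect the robot's altered body.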
Election '08: Vote by TiVo
Wired News (11/14/06) Axline, Keith
While electronic voting has met its share of critics and difficulties,
many believe the technology should be worked with rather than completely
scrapped. VoteHere founder Jim Adler believes that elections could, and
should, be made completely electronic, with voting taking place online. He
says, "The technology is done. It's really an issue now of politics and
people's will." Online elections have been held in Arizona and Michigan,
as well as Estonia, Switzerland, Canada, and England; and head of elections
for Swindon, England, Alan Winchcombe, said the system performed very well,
and that "People did try to hack it, but no one got through. The security
levels were very high." Adler and others argue that every voting system
has inherent flaws, and that it makes little sense to reject the
technology on the assumption that it will fail outright. One way to
address the vulnerability of home PCs would be a set-top box running
open-source, verified, and digitally signed software; voters would be
given a receipt containing a serial number by which they can verify
ballot-box results.
Several scientists interviewed agreed that this set-top technology would
quell many of their e-voting concerns. While voting from home has been
shown to increase turnout, it not only raises issues of voter
confidentiality and coercion but also assumes that every voter has
Internet access. Meanwhile, other experts say online voting suffers
from a dependence on inherently insecure home PCs, the threat of
denial-of-service attacks, and database hacks. University of California at
Berkeley computer science professor David Wagner says voting "over the
Internet is crazy," while computer scientist David Jefferson says that
"there's really no way to secure the transmission of votes over the
Internet."
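The receipt idea can be illustrated with a deliberately simplified
Python sketch (hypothetical, not VoteHere's actual protocol): the
ballot box publishes a digest for every serial number, and a voter can
check that his or her ballot was recorded.

    import hashlib
    import secrets

    def cast(choice, bulletin_board):
        serial = secrets.token_hex(8)  # serial number printed on receipt
        digest = hashlib.sha256((serial + choice).encode()).hexdigest()
        bulletin_board[serial] = digest  # published for anyone to audit
        return serial

    def verify(serial, choice, bulletin_board):
        expected = hashlib.sha256((serial + choice).encode()).hexdigest()
        return bulletin_board.get(serial) == expected

    board = {}
    receipt = cast("candidate-a", board)
    assert verify(receipt, "candidate-a", board)

Note that this naive version lets anyone holding the receipt learn the
choice, which is exactly the coercion risk raised above; real end-to-end
verifiable schemes encrypt the ballot so a receipt proves inclusion
without revealing the vote.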
NSF Chief Urges Colleges to Build Better High-Speed
Networking Tools
Chronicle of Higher Education (11/16/06) Young, Jeffrey R.
National Science Foundation director Arden L. Bement Jr. shared his vision
of cyberinfrastructure and its potential impact on technology and higher
education in the United States during a speech to U.S. college leaders.
Bement said cyberinfrastructure would be key to innovation, and added that
universities and colleges should invest in shared high-speed networking
tools and protocols for research as soon as possible. "Leadership in
cyberinfrastructure may well become the major determinant in measuring
pre-eminence in higher education among nations," Bement said during his
address, "Cyberinfrastructure: The Second Revolution," which was delivered
at The Chronicle of Higher Education's Technology Forum near Las Vegas.
Innovations in cyberinfrastructure have the potential to make the same
impact that the Internet has had on society, he said. Bement equated
leadership in cyberinfrastructure to America's ability to innovate and
compete globally in the future. For Bement, cyberinfrastructure is a
"comprehensive phenomenon that involves creation, dissemination,
preservation, and application of knowledge," and would also require new
"norms of practice and rules, incentives, and constraints that shape
individual and collective action."
Not Lost in Translation
Technology Review (11/16/06) Ornes, Stephen
The National Institute of Standards and Technology (NIST) announced the
results of its annual evaluation of computer algorithms designed to
translate Arabic and Chinese text into English. Original documents are
sent to each competing team, which uses its algorithm to produce a
translation that is then scored with NIST's BiLingual Evaluation
Understudy (BLEU) metric. Google's algorithm ranked as the best
translator among 40 entrants. Its system works by taking small groups
of words and looking at how they have been translated before. The Google
software creates its memory by comparing the same document in both English
and Chinese or Arabic. The statistical approach it uses takes account of
the language as a whole, rather than referring to English language rules
for each word it encounters. Google's head of research, Peter Norvig,
says, "This is a more natural way to approach language." The team from
Kansas State University uses a combination of computer scientists,
anthropologists, language scholars, psychologists, and statistical methods
to establish a unique method of translation. NIST Machines Translation
Evaluations coordinator, Mark Przybocki, says that since some teams uses
statistics and other linguistics, the dialog provides experts in the field
with ample material for discussion and research. He says, "It's a hard
call to say any one technology is going to be the dominant force in the
future."
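BLEU's core idea, scoring machine output against reference translations
by n-gram overlap, can be sketched in a few lines of Python (a
simplified illustration; NIST's actual scoring pipeline is more
elaborate):

    import math
    from collections import Counter

    def ngrams(tokens, n):
        return Counter(tuple(tokens[i:i + n])
                       for i in range(len(tokens) - n + 1))

    def bleu(candidate, reference, max_n=4):
        cand, ref = candidate.split(), reference.split()
        precisions = []
        for n in range(1, max_n + 1):
            c, r = ngrams(cand, n), ngrams(ref, n)
            overlap = sum(min(count, r[g]) for g, count in c.items())
            precisions.append(max(overlap, 1e-9) / max(1, sum(c.values())))
        geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
        brevity = min(1.0, math.exp(1 - len(ref) / max(1, len(cand))))
        return brevity * geo_mean  # higher means closer to the reference

    print(bleu("the cat sat on the mat", "the cat is on the mat"))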
New Computer Software Enables Rapid Response to
Time-Critical Emergencies
Newswise (11/16/06)
Researchers from the U.S. Department of Energy's Argonne National
Laboratory and the University of Chicago presented specialized software
at SC06 that allows quick access to supercomputers and distributed
computational grids in
emergency situations. The system, called Special Priority and Urgent
Computing Environment (SPRUCE), "makes massive resources available on short
notice for critical applications," including public health, safety, and
security emergencies, according to SPRUCE project leader and Argonne
National Laboratory computer scientist Pete Beckman. The demonstration
at SC06 showed scientists requesting immediate access to the TeraGrid of
supercomputers at the University of Chicago to run time-critical
analyses of a developing weather emergency. Resources connected to
SPRUCE can either preempt currently running jobs for emergency response,
or run the emergency computation as soon as the current job finishes.
"Severe weather predictions can be computationally
intensive and naturally the workload is unpredictable," says Kevin
Droegemeier of the NSF Linked Environments for Atmospheric Discovery
project, an associate vice president for research at the University of
Oklahoma. Beckman's vision of the
future of emergency response is that "all of the nation's supercomputers
will be ready to provide urgent computing to support and protect the
nation."
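The two policies described above, preempting running jobs or jumping to
the head of the queue, can be sketched as follows (a hypothetical
illustration with invented names; SPRUCE's actual interfaces differ):

    import heapq

    URGENT, NORMAL = 0, 1  # lower value sorts first

    class UrgentQueue:
        def __init__(self, preemptive=False):
            self.preemptive = preemptive
            self.queue, self.running, self.counter = [], None, 0

        def submit(self, job, priority=NORMAL):
            if priority == URGENT and self.preemptive and self.running:
                # Policy 1: preempt the running job and requeue it.
                heapq.heappush(self.queue,
                               (NORMAL, self.counter, self.running))
                self.counter += 1
                self.running = None
            # Policy 2: urgent jobs always sort ahead of normal ones,
            # so they start as soon as the current job finishes.
            heapq.heappush(self.queue, (priority, self.counter, job))
            self.counter += 1

        def next_job(self):
            if self.running is None and self.queue:
                _, _, self.running = heapq.heappop(self.queue)
            return self.running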
IU Informatics Scientists Seek Tools to Shield Against
Wi-Fi Drive-Bys
Indiana University (11/15/06) Stuteville, Joe
Indiana University researchers recently conducted a study concerning the
weaknesses in the security of wireless routers used in homes and small
businesses. Hackers can gain access to vital information by exploiting
bad configurations or by replacing a router's firmware. Alex Tsow, a
visiting research associate at IU and computer science doctoral student,
explained,
"Once compromised, a wireless router spoils Internet access for all
clients. Clients would be vulnerable to pharming, password sniffing, and
other man-in-the-middle attacks. Since a compromised router can victimize
all connected clients, public hotspots become a high value target for this
kind of attack." Tsow's team estimated that about 10 percent of household
wireless routers do not have any changes to their widely known default
security settings, and that about 33 percent use open-source firmware that
is widely available, making them easy targets for hackers. No known
anti-virus software kits are able to stop the kind of attacks the study
outlined, but the best defense is "to use a strong administrative password,
disengage wireless administration when possible, or use wireless protected
access encryption methods," according to Markus Jakobsson, an IU
associate professor of informatics and a contributor to the study. More
can be learned about attacks on wireless routers through the use of
honeypots that emulate routers with apparent vulnerabilities.
Gender Gap: Women's Paychecks Still Lag Men's
Computerworld (11/13/06) Collett, Stacy
The job shortage in the information technology industry could worsen
considerably if the earnings gap between men and women is not closed.
According to Computerworld's 20th annual Salary Survey, women in IT average
$80,781 in total compensation, compared with $91,464 for men. Female CIOs
and vice presidents of IT make nearly $10,000 less than their male
counterparts, and women who are directors of IT earn an average of $109,446
while men make $114,045. The disparity in pay is a turnoff for many women
in the industry, says Gartner analyst Diane Morello. She warns that up to
40 percent of women in IT will leave the industry by 2012 if it does not
address its negative stereotypes about what women have to offer in the
workplace. But she also offers advice for them. "If I were a women trying
to advance, I would look at companies that have more global business and
put myself in positions for greater teaming and global projects," says
Morello. "Also, I would be asking if I have the right kind of
mentors--those who are tapped into business-based advancements and the
people who have high credibility."
Spend More, Get More in Tech R&D? Not Always, According
to Innovation Study
InformationWeek (11/16/06) Gardner, W. David
A six-month Booz Allen Hamilton study of research and development at
1,000 global companies found that fewer than 10 percent were getting
full value for the money spent. "People think there are predictable black boxes out
there," noted chief investigator Barry Jaruzelski. "They think if you put
money in, innovation comes out. If only it worked that way." Jaruzelski
said high-tech firms with successful innovation models, despite their
various R&D approaches, agree on one fundamental principle: An in-depth
understanding of customer needs. They also utilize an end-to-end,
multifunction product
development process. The chief investigator reported that Apple Computer,
with its exceptional understanding of users, is the poster boy for solid
R&D maximization. Jaruzelski said Apple appears to be aware that it can
only make a few major innovation investments because its R&D resources are
limited. According to him, the outstanding innovators "all viewed
innovation as an end-to-end process." Google was also cited in the Booz
Allen Hamilton study for encouraging its engineers to spend a
considerable amount of their time imagining new concepts.
Exterminating the Nuisance of Spam
CNet (11/15/06) McCullagh, Declan
The United Nations Internet summit in Athens, Greece, earlier in November
was beneficial because it brought NGOs, regulators, law enforcement, and
ISPs together and enabled the various stakeholders to share their ideas on
how to curb spam, according to Suresh Ramasubramanian in an interview with
CNet. Ramasubramanian, the head of antispam operations for Outblaze, says
convincing more email users not to click on attachments, persuading ISPs to
get involved in anti-spam mailing lists, and getting regulators and NGOs to
pass anti-spam laws would be a big help in reducing spam. He is also an
advocate for capacity-building for people, training sysadmins, promoting
open source, and improving connectivity. Developing countries have become
the source for a large percentage of spam, says Ramasubramanian. Outblaze
filters messages for sites such as Lycos, Mail.com, and Register.com, and
Ramasubramanian believes the ratio of spam to legitimate email is at least
10 to 1. He adds that even a spammer who launches 1 million messages a
day is likely to reach only a small fraction of legitimate email
addresses. Spammers keep their costs low by using botnets or open
relays, and they are able to make money off the 2 percent to 3 percent
of recipients who decide to buy their products.
Metadata Labelling for Multimedia Content
IST Results (11/15/06)
The goal of the IST-funded aceMedia project is to devise a way to organize
massive volumes of multimedia content for easy location, retrieval, and
sharing by enabling self-analysis, self-annotation, and self-adaptation
through advances in knowledge, semantics, and multimedia processing
technologies. "People don't want to have to spend time managing their
content manually, they just want to be able to view it whenever and however
they want," notes aceMedia project coordinator Paola Hobson. "For that to
happen, multimedia content needs to become intelligent." Content
pre-processing, image recognition, and knowledge analysis tools are
employed by the aceMedia system to provide metadata annotations on still
and dynamic images, as well as specific segments of those images. The
image can then be identified when the user enters any of the annotated
words, similar words, or combinations of them into a search engine. The
central concept behind the aceMedia system is an Autonomous Content Entity
(ACE), and the system comprises three tiers of technology: A
scalable content tier that adjusts to the user's device and its manner of
viewing, a metadata tier that performs semantic analysis and annotation,
and an intelligence tier that offers programmability and facilitates
autonomous action by the ACE. Content providers could give their business
a shot in the arm through the use of the aceMedia system, particularly in
instances where content is sold over the Internet.
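The effect of self-annotating content can be pictured with a small
Python sketch (an invented structure, not the aceMedia API): each item
carries its own metadata, so searching annotated words or combinations
of them requires no manual cataloging.

    from dataclasses import dataclass, field

    @dataclass
    class ACE:  # a toy Autonomous Content Entity
        content: str  # content tier: e.g., an image file
        annotations: set = field(default_factory=set)  # metadata tier

        def matches(self, query):  # intelligence tier: self-description
            return set(query.lower().split()) <= self.annotations

    photos = [ACE("beach.jpg", {"beach", "sea", "family", "summer"}),
              ACE("party.jpg", {"party", "friends", "indoor"})]

    print([p.content for p in photos if p.matches("family beach")])
    # -> ['beach.jpg']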
Global Developers to Number 17M by 2009, Report
Says
SD Times (11/15/06) No. 162, P. 1; Morgan, Lisa
Evans Data's first-ever Global Developer Population and Demographics
Report, completed in late October, predicts a 46 percent growth rate for
the global developer population for a total of over 17 million developers
by 2009. The Asia Pacific region (APAC) is expected to experience an 81
percent increase in developer population between 2005 and 2009, while the
rate predicted for North America is 15 percent. Management consultant
George Gilbert says, "This is part of a 40-year trend to push application
development farther out into the community." America creates great
demand for less-skilled developers for offshore tasks such as testing
and maintenance, but it is also witnessing a decrease in skilled
specialists. The cost of labor in India is increasing, so countries
such as Ireland, the Philippines, and China are becoming more popular
destinations for outsourced American work. "Ultimately, this won't be
about labor costs," according to
Gilbert. "The quality of the engineering talent coming out of the key
universities in these countries will determine just how high up the ladder
of economic value-add they can move. And it's clear they won't stop at
maintenance." Engineers around the world see top software companies in
America as their goal and have become a staple for these companies, but
they are still about 15 years younger on average than American developers
and less experienced, but this gap will most likely close within five
years.
Scholars Challenge the Infallibility of
Fingerprints
Chronicle of Higher Education (11/17/06) Vol. 53, No. 13, P. A14;
Monaghan, Peter
Scholars' warnings that fingerprint analysis is not faultless are falling
on mostly deaf ears, and key to their arguments is a dearth of scientific
scrutiny. University of California at Irvine professor Simon Cole, author
of "Suspect Identities: A History of Fingerprinting and Criminal
Identification," points out that because courts have not needed more
substantial scientific examination of fingerprint analysis techniques,
law-enforcement agencies "retain legal carte blanche to claim that
fingerprinting is validated and infallible. They have nothing to gain and
everything to lose from validation studies." Cole notes, for example, that
examiners use "latent" prints that often do not provide whole, undistorted
images, which are then compared to much clearer inked or scanned prints in
police databases. The obscuring of myriad details of the print can lead to
mistakes. Michigan State University computer science professor Anil K.
Jain believes fingerprint technology can only be improved upon, not
perfected, and he started a biennial competition to bring such improvements
to light. University of Southampton researcher Itiel Dror thinks
fingerprint examiners make mistakes because human cognition is not
infallible, and he has run experiments that show that the perceptions and
judgments of even expert analysts can be shaped and disrupted by cognitive
and psychological effects. Practitioners of fingerprint identification
have been nonresponsive to the researchers' findings, and Cole contends
that forensic scientists are convinced that the research "doesn't matter
because it doesn't hurt them. They operate in the courtroom, where the
scholarly literature is just ignored." However, there does appear to be
increasing pressure for reform.
Attack of the Bots
Wired (11/06) Vol. 14, No. 11, P. 171; Berinato, Scott
Autonomous software programs or "bots" can coalesce into networks that
execute all kinds of mischief on a global level, and this has emerged as
the latest threat to the Internet. Bots proliferate like viruses by
installing themselves on Net-linked computers; but while viruses follow a
rigid program and act individually, bots can be controlled externally from
a remote server and work in concert to perpetuate mayhem. Bots can
coordinate distributed denial-of-service attacks for the purposes of
extortion, distribute spam, facilitate identity theft and credit card fraud
by stealing passwords and other sensitive information via keystroke
logging, and automate the process of clicking on ads that generate
per-click revenue, to name a few strategies. Bots scan for susceptible
systems where they can spread, and command and control (C&C) software can
upgrade botnets with new abilities as they are devised. Former Arbor
Networks researcher Jeremy Linden says, "Bots are at the center of the
undernet economy. Almost every major crime problem on the Net can be
traced to them." Users usually rent botnets from an intermediary or
"bot-herder," whose forte is marketing. Without an effective defense
against botnet attacks, the Internet could become increasingly unfriendly
to online commerce, or spark more and more severe vigilantism by users,
fueling a botnet arms race. The continuing demand for better bots has
fueled an intense competition among bot software developers to innovate,
and their resulting code attracts a wide array of customers, including
organized criminals, political activists, and corporate spies. Meanwhile,
Symantec security director Vincent Weafer testified before Congress last
year that 20 nations now have ongoing computer attack programs.
Researchers are working on ways to defend against C&C programs, such as
alerting ISPs so they can disable C&C servers, but many botnets move too
fast for that approach, and the bot writers remain far ahead
technically, says SRA International's Adam Meyers.
Seeing With Superconductors
Scientific American (11/06) Vol. 295, No. 5, P. 86; Irwin, Kent D.
Fields that range from antiterrorism to quantum communications security to
astronomy could be dramatically affected by minuscule superconductors that
can detect photons and other particles, substantially boosting the
sensitivity of measurements across the electromagnetic spectrum.
Superconducting photon sensors could help spot materials that could be used
in a nuclear weapon, analyze defects in microchips, and gather images much
more rapidly, for example. The new sensors come in two varieties: Thermal
sensors that rely on how a photon's energy raises the temperature of the
detector material, and pair-breaking detectors that sense how a photon
disrupts some of the electron pairs that generate superconductivity.
Practical imaging is executed by large detector arrays, but all the output
signals emitted by the detectors must be blended into a smaller number of
data lines via multiplexing. The key to superconducting materials'
suitability as sensors lies in the fragility of superconductivity.
Currently available superconducting detectors boast 10 to 100 times more
sensitivity than conventional detectors functioning at room temperature,
and these devices have applications in homeland defense and nuclear
nonproliferation, submillimeter astronomy, and cosmology, to name a few
areas.
Embedded Multicore Needs Communications Standards
Embedded Systems Design (11/06) Vol. 19, No. 11, P. 16; Levy, Markus;
Brehmer, Sven
Communications application programming interfaces (APIs) are a sticking
point in embedded multicore systems, and Embedded Microprocessor Benchmark
Consortium President Markus Levy and PolyCore Software founder Sven Brehmer
discuss the Multicore Association's effort to address the problem through
the development of communications standards. Many communications standards
are available, but none are made to support closely distributed embedded
multicore systems, and areas of programming multicore systems that must be
accommodated include resource management, communications, and
synchronization. When data must be shared between cores or core
synchronization is called for, there has to be a physical transfer of
messages, which can happen synchronously or asynchronously; the
availability of both shared and local memory can facilitate the creation of
efficient communications structures. Levy and Brehmer note that
proprietary implementations and APIs are most frequently employed for
shared memory architectures that use simple communications schemes. The
majority of standard protocols and APIs target widely distributed
architectures (such as the Internet, wide area networks, local area
networks, servers, and single-chip processing devices), given the wide
distribution of multiprocessing implementations; some form of message
passing for data and command transfer is used, but programming difficulties
and a lack of support for incremental parallelization of an existing
sequential program complicate the situation. The authors push for the
creation of an API specially developed for multicore systems, one with
characteristics similar to MPI but minimizing overhead and leveraging
the proximity of multiple cores on one chip. The Multicore
Association has begun developing a message-passing API as well as a
resource-management API that offers the fundamental communication and
synchronization properties needed for embedded distributed systems. Levy
and Brehmer write that deciding which features and functions to leave
out is the major challenge in creating such APIs.
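What such a lightweight message-passing API might look like can be
sketched in Python (all names invented; the Multicore Association's
actual specification targets C and embedded runtimes): endpoints
identified by node and port exchange messages over what would, on real
hardware, be shared memory rather than a network stack.

    import queue

    class Endpoint:
        def __init__(self):
            self._inbox = queue.Queue()  # stand-in for a shared-memory ring

        def recv(self, timeout=None):
            return self._inbox.get(timeout=timeout)

    class Domain:
        """Registry of (node, port) endpoints, as on a single chip."""
        def __init__(self):
            self._endpoints = {}

        def create_endpoint(self, node, port):
            ep = self._endpoints[(node, port)] = Endpoint()
            return ep

        def send(self, node, port, payload):
            # A real API would add asynchronous and zero-copy variants.
            self._endpoints[(node, port)]._inbox.put(payload)

    dom = Domain()
    ep = dom.create_endpoint(node=1, port=0)
    dom.send(node=1, port=0, payload=b"sensor frame")
    assert ep.recv() == b"sensor frame"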