Computer Privacy Expert Warns of Growing Risks to Social
Security Numbers
AScribe Newswire (06/21/07)
Ana I. Anton, representing ACM's U.S. Public Policy Committee, testified
Thursday before the House of Representatives' Subcommittee on Social
Security that the theft of social security numbers (SSNs) has become the
primary method used to steal an individual's identity, allowing criminals
to fraudulently access and open credit cards, banking accounts, and other
financial services. Anton urged Congress to strengthen SSN privacy and
reduce the nation's reliance on SSNs for personal identification. Anton,
an associate professor of software engineering at North Carolina State
University, cited the fact that more than 36 million Americans have had
their identities stolen since 2003, and more than 155 million personal
records have been compromised since 2005. Anton said, "Two key factors
have enabled the explosion of identity theft in today's environment. One
is the common use of SSNs as a de facto national identification number; the
other is current computing technology that enables the collection,
exchange, analysis, and use of personal information on a scale
unprecedented in the history of civilization." Anton urged banks, credit
agencies, and government agencies to require stronger proof of identity,
such as passports, military IDs, or licenses with a photograph to verify
personal identity, after which a secondary authenticator, such as a secret
shared password or PIN, should be used for subsequent transactions. Anton
also suggested removing and prohibiting the display of SSNs in public
records, requiring secure or encrypted transmission of records or documents
containing SSNs and other personally identifiable information, requiring
electronic security for files and devices containing SSNs, and substituting
a unique number generated by the database management system for the SSN as
the primary key in databases. To read Anton's complete testimony, visit
http://www.acm.org/usacm
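Anton's final suggestion, substituting a database-generated unique number for
the SSN as a primary key, can be sketched briefly. The following is a minimal
illustration using Python's built-in sqlite3 module; the table and column
names are hypothetical, and a real system would also encrypt the stored SSN
and tightly control access to it:

```python
import sqlite3

# In-memory database for illustration; a real system would use a
# persistent, access-controlled store.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
        name        TEXT NOT NULL,
        ssn_cipher  BLOB            -- SSN stored only encrypted, never as a key
    )
""")

def add_customer(conn, name, encrypted_ssn):
    """Insert a customer; the database generates the identifier."""
    cur = conn.execute(
        "INSERT INTO customers (name, ssn_cipher) VALUES (?, ?)",
        (name, encrypted_ssn),
    )
    return cur.lastrowid  # surrogate key used in all foreign-key references

cid = add_customer(conn, "Jane Doe", b"<ciphertext>")
print(cid)  # 1
```

Because all cross-table references use `customer_id`, the SSN never needs to
appear in indexes, logs, or joined query results.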
E-Vote 'Threat' to UK Democracy
BBC News (06/22/07)
An Open Rights Group (ORG) report says the risk involved with replacing
paper ballots with touch screens far outweighs any benefit that may result
from the change. The group, which based its conclusions on observations of
local elections' e-voting trials in May, said e-voting should not be used
until it is more reliable, easier to oversee, and has proven its integrity.
Observations made during local elections using e-voting in England
and elections using electronic counting systems in Scotland led the ORG to
express "serious concerns" about e-voting. In England, kiosks, laptops,
touch screens, and mobile phones have all been tested for e-voting systems.
The ORG's primary concern is that e-voting is currently a "black box"
system that prevents voters from seeing how their votes are recorded or
counted, which the ORG argues makes election oversight impossible and
leaves elections wide open to error and fraud. The report criticized the
lack of a rigorous
certification method to ensure hardware and software systems are well
protected. The report also called for usability testing to ensure the
elderly and housebound can easily access e-voting schemes. The ORG said it
was a serious mistake to accept the conveniences of e-voting while ignoring
the risk that such systems could destroy confidence in voting as a whole,
and that all e-voting trials should be stopped so problems can be fixed
before e-voting is more widely used.
Workshop Explores Integrating CS Into Undergrad
Curriculum
HPC Wire (06/20/07)
A weeklong workshop at the Ohio Supercomputer Center will teach 19
professors from the United States and Puerto Rico how to integrate computer
science into undergraduate curriculums. The "Integrating Computational
Science into the Undergraduate Curriculum Workshop" is one of 11 hands-on,
summer seminars on campuses across the country sponsored by SC07, an
international conference sponsored by ACM and the IEEE Computer Society.
Each of the seminars addresses the use of computational science and
cyberinfrastructure in education. Steve Gordon, lead instructor for the
workshop and director of the Ralph Regula School of Computational Science,
an initiative of the Ohio Supercomputer Center, said computational
science's ability to solve complex business, technical, and academic
research problems has made it as important to scientific discovery as
theory and experimentation. "The diversity of the professors attending our
workshop showcases the pervasiveness of computational science," Gordon
said. "Attendees represent a breadth of disciplines, including astronomy,
chemistry, biology, pharmacy, engineering, computer science and natural
sciences." Workshop participants will prepare or adapt at least one
instructional module and develop an implementation plan for their
classroom. Participants will also receive continued support from a mentor
to incorporate computational science in their classrooms during the
academic year. For more information about SC07, visit
http://sc07.supercomputing.org/
School Fills Need for Game Designers
Inside Bay Area (CA) (06/18/07) Aaronson, Sean
The video game industry has become one of the most lucrative fields in
entertainment, with game industry revenues rivaling Hollywood, but video
game education has not kept pace with the industry. "I think schools are a
little behind the times," said Josiah Pisciotta, owner of computer-game
company Chronic Logic. However, some schools have recently developed
programs designed to teach the art and science of video games. The
University of California Santa Cruz's four-year undergraduate degree in
game design was quickly embraced by students. Although fewer than 10
students graduated with a video game degree this year, 90 freshmen have
already enrolled in the program for next year. By comparison, only 30
students have enrolled in the traditional computer science program.
University of Southern California also has a four-year video game program.
The UCSC program combines courses from computer science with digital arts
and film. The degree requires students to take five computer science
classes and five electives, complete a senior-year design seminar, and take
an ethics course that addresses the often violent nature of video games. While
the programs have been successful, not everyone in the industry believes
that teaching video game design at universities is a good idea. Jack
Emmert, the creative director for video-game company Cryptic Studios, said
students are misguided if they think they can learn game design in an
academic setting. "It's premature for universities to sell degrees when
the industry hasn't even figured out what the skill set is to be a
successful game designer," Emmert said.
One Easy Fix for Immigration
Business Week (06/21/07) Herbst, Moira
Talented foreigners come to the United States looking for the chance to
start their own business, only to be frustrated by protracted waits to
receive residency status--sometimes six or seven years long--and tempted by
more lucrative opportunities in other countries. "Increasingly, they're
getting fed up and going home," observes Duke University Pratt School of
Engineering professor Vivek Wadhwa, who adds that this situation is
particularly tough on aspiring tech entrepreneurs, who have a limited
window of opportunity to launch their businesses due to cutthroat
competition. Wadhwa has studied the problem extensively and concluded
that master's and doctorate degree holders--especially those with math,
technology, science, and engineering degrees--are most likely to start new
businesses. There also appears to be little correlation between these
entrepreneurs and the pedigree of the schools they graduate from. "What
was surprising is that it doesn't really matter which schools they come
from," Wadhwa says. Wadhwa calculates that 25 percent of the technology and
engineering companies founded in the United States between 1995 and 2005
had at least one key foreign-born founder; together, these companies
produced $52 billion in sales and employed 450,000 people. There is a
major lack of consensus over
the ultimate fate of millions of low-skilled illegal immigrants already in
the country, which could only further erode the chances of improving the
situation for highly skilled workers. Experts suggest that the United
States should retool its immigration policy and make a greater effort to
draw highly educated workers, for instance by admitting all foreign-born
students who receive advanced degrees from U.S. institutions.
SIGMETRICS 2007 Panel at FCRC
My Slice of Pizza (06/19/07) Metoo
At the SIGMETRICS panel at ACM's Federated Computing Research Conference,
several industry experts spoke about the successes and challenges of
performance modeling and about being researchers in industrial labs. Cathy
Xia from IBM described the life of a researcher at IBM and the history of
queuing theory, citing performance modeling for Web services as
an example. The next speaker, Albert Greenberg from Microsoft, offered a
near-live demo of Microsoft tools and discussed how almost every aspect of
computer science research was represented and relevant. Shubho Sen from
AT&T expressed the difficulty in managing large IP networks. Arif Merchant
from Hewlett-Packard said that traditional disk modeling methods are
inaccurate for large data centers, and that detailed modeling is necessary.
The audience asked, among other questions, how to succeed in labs, whether
industrial lab work compromises research principles, and what the
challenges are in moving research into products. The panel answered that
research labs generally provide great research careers, but do not provide
the flexibility to work on problems that could win medals and prizes,
although most labs encourage researchers to develop their own ideas.
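The queuing-theory modeling Xia mentioned can be illustrated with the classic
M/M/1 single-server model, a standard textbook result rather than anything
specific from her talk; the request rates below are hypothetical:

```python
def mm1_response_time(arrival_rate, service_rate):
    """Mean response time of an M/M/1 queue: W = 1 / (mu - lambda).

    Valid only when the server is not saturated (lambda < mu).
    """
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: arrival rate >= service rate")
    return 1.0 / (service_rate - arrival_rate)

# A web service handling 80 requests/s with capacity 100 requests/s:
w = mm1_response_time(80.0, 100.0)
print(w)  # 0.05 seconds (50 ms) mean response time
```

The formula makes vivid why response time explodes as utilization approaches
100 percent: the denominator shrinks toward zero.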
Searching Sportscasts
Technology Review (06/21/07) Graham-Rowe, Duncan
Massachusetts Institute of Technology researchers have developed a new
visual-search engine capable of automatically searching sports footage for
specific types of action and events. MIT computer scientist Michael
Fleischman says that despite advances in visual-search engines, accurate
video search is still challenging, particularly with sports footage. "The
difference between a home run and a foul ball is often hard for a human
novice to notice, and nearly impossible for a machine to recognize,"
Fleischman says. Some systems use automatic speech recognition to improve
search accuracy by generating text transcripts, but search terms are often
repeated out of context, particularly in sporting events when commentators
frequently talk about previous events regardless of what is happening in
the current game. To compensate for this problem, Fleischman and Deb Roy,
director of MIT's Cognitive Machines Group, developed a system that can
associate search terms with video footage, not just the audio track. "We
collect hundreds of hours of baseball games and automatically encode all
the video based on features, such as how much grass is visible and whether
there is cheering in the background," Fleischman says. Using
machine-learning algorithms, the researchers analyzed the video clips to
find events that were defined by the type of camera shots used, like a
camera panning up and then back down for a fly ball. The system then tries
to match these events to words that appear in the transcript text by
examining their probabilistic distributions. Fleischman and Roy say the
initial trials, searching through six baseball games for home runs, showed
promise.
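The word-to-event association step can be approximated with simple
conditional-probability estimates over co-occurrence counts. The sketch below
uses made-up clip data and is far simpler than the MIT system's actual
statistical model:

```python
# Hypothetical training data: each clip pairs a detected visual event
# (e.g., the camera panning up and back down for a fly ball) with the
# words spoken in the accompanying commentary.
clips = [
    ("pan_up_down", ["home", "run", "deep"]),
    ("pan_up_down", ["home", "run"]),
    ("static_shot", ["ball", "strike"]),
    ("static_shot", ["foul", "ball"]),
]

def word_given_event(word, event, clips):
    """Estimate P(word appears in commentary | event observed) by counting."""
    event_clips = [words for ev, words in clips if ev == event]
    hits = sum(1 for words in event_clips if word in words)
    return hits / len(event_clips)

# "home" co-occurs with the fly-ball camera pattern in both such clips:
p = word_given_event("home", "pan_up_down", clips)
print(p)  # 1.0
```

Scoring query terms against visual events this way lets a search for "home
run" rank clips by what the camera saw, not just by what was said.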
How the U.S. Has Kept the Productivity Playing Field
Tilted to Its Advantage
New York Times (06/21/07) P. C3; Goolsbee, Austan
Recent evidence from the Center for Economic Performance at the London
School of Economics suggests that the United States makes better use of
information technology than any other country and as a result has the
world's most productive workers. The popular explanation for the United
States' high productivity is the low cost of information technology. Lower
computer prices foster a rapid adoption of technology, boosting
productivity. However, John Van Reenen, a professor at the London School
of Economics, notes that technology prices in Europe have dropped as well, and
technology has been utilized just as much as in the United States, but that
Europe has not seen a productivity boom. Van Reenen suggests that
Americans are simply better at adapting to and utilizing new technology.
Van Reenen's paper, "Americans Do I.T. Better: U.S. Multinationals and the
Productivity Miracle," examines the experience of British companies when
taken over by companies with headquarters in other countries. In large
service sectors, such as financial services, retail trade, and wholesale
trade, American takeovers produced a marked productivity advantage over
non-American takeovers. When an American company takes over a business, the
business becomes significantly better at translating technology spending
into productivity. American companies in the service-based economy have
proven remarkably adept at adapting to and incorporating new technology. The
main concern is whether such an advantage will last. Van Reenen predicts two
possible outcomes. The first is that the productivity boom of the last 10
years was an aberration that allowed the United States to take advantage of
lower computer prices to pull away from the competition. The other
scenario is that the 1990s represent a fundamental shift in the global
economy and that those best able to adjust to change will flourish in the
21st century.
Chip Maker Intel Shows Off R&D Projects
San Jose Mercury News (CA) (06/21/07) Boslet, Mark
Intel's annual research day, on June 20, presented projects that included
energy-saving technology for laptops, a chip to prevent online gamers from
cheating, and more compact antennas for wireless computer network
connections, under an overall theme that computing is entering a more
personal age with new ways for humans and computers to interact. Justin
Rattner, who helps
oversee Intel's $5.4 billion research-and-development budget, said he
believes that computers will soon be able to recognize human expressions
and react accordingly. "What we're starting to look at is fusing the
virtual with the physical," Rattner said. Speaking computers that can
listen and respond are also possible, even if the vocabulary might be
limited and content restricted. Rattner's research group is focused on
creating technology for "ultra-mobile devices," or small, lightweight
devices that have significant computing power. Despite a focus on new
products, the majority of the research projects displayed are still several
years away from the market. One such project was "Mashmaker," which offers
a way to combine data from multiple Web sites onto a single page.
Mashmaker could be used to combine apartment listing sites with the yellow
pages so an apartment hunter could see what apartments are available near
restaurants or other places of interest.
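A mashup of the kind Mashmaker enables, joining apartment listings with
nearby restaurants, amounts to a spatial join over two data sources. The
sketch below uses hypothetical data and the standard haversine distance; it
does not use Intel's actual tool or any real Web site's API:

```python
import math

# Hypothetical data from two "sites": apartment listings and restaurants.
apartments = [
    {"address": "12 Oak St", "lat": 37.335, "lon": -121.89},
    {"address": "48 Elm Ave", "lat": 37.401, "lon": -121.95},
]
restaurants = [
    {"name": "Luigi's", "lat": 37.336, "lon": -121.891},
]

def distance_km(a, b):
    """Approximate great-circle distance via the haversine formula."""
    r = 6371.0  # Earth radius in km
    dlat = math.radians(b["lat"] - a["lat"])
    dlon = math.radians(b["lon"] - a["lon"])
    h = (math.sin(dlat / 2) ** 2
         + math.cos(math.radians(a["lat"])) * math.cos(math.radians(b["lat"]))
         * math.sin(dlon / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(h))

def mashup(apartments, restaurants, radius_km=1.0):
    """Annotate each listing with the restaurants within radius_km."""
    return [
        {**apt, "nearby": [r["name"] for r in restaurants
                           if distance_km(apt, r) <= radius_km]}
        for apt in apartments
    ]

result = mashup(apartments, restaurants)
print(result[0]["nearby"])  # ["Luigi's"]
```

The apartment hunter's view is then just the annotated listing: each record
carries its own list of nearby places of interest.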
Xerox Tool Analyzes Text to Improve Search Results
IDG News Service (06/20/07) Sayer, Peter
A new search tool from Xerox is able to yield better results by trying to
understand the content of documents. FactSpotter is designed to analyze
the grammar of a text so that it can determine which words ambiguous nouns,
verbs, or pronouns refer to, according to Frederique Segond, manager of the
parsing and semantics research group at Xerox Research Center Europe near
Grenoble, France. For example, FactSpotter would realize that "he" and
"the head of Microsoft" in the same document are likely to refer to the same
person, "Bill Gates," and would not confuse "Bill Gates said" with "a
friend of Bill Gates said" and return irrelevant results like other search
engines. Xerox wrote FactSpotter in the C programming language, and it can
interface with other applications via modules in Java and Python. The
company is working to expand its analysis capabilities beyond the written
language to include searches of radio or TV archives when linked with audio
transcription tools. Xerox researchers created a metalanguage to describe
the grammar of various languages, including English, French, and Spanish,
and are developing a Japanese metalanguage description in conjunction with
Fujitsu.
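The distinction FactSpotter draws between "Bill Gates said" and "a friend of
Bill Gates said" comes from grammatical analysis. As a toy illustration only
(FactSpotter itself is written in C and uses a full grammar, not pattern
matching), a crude direct-subject check might look like this:

```python
import re

def is_direct_subject(sentence, name, verb="said"):
    """Toy check: does `name` immediately precede `verb` at the start of
    a sentence?

    A real grammatical parser builds a full parse tree; this regex only
    distinguishes the direct-subject case from embedded phrases such as
    "a friend of Bill Gates said".
    """
    pattern = rf"(?:^|[.!?]\s+){re.escape(name)}\s+{verb}\b"
    return re.search(pattern, sentence) is not None

print(is_direct_subject("Bill Gates said the deal was done.", "Bill Gates"))      # True
print(is_direct_subject("A friend of Bill Gates said otherwise.", "Bill Gates"))  # False
```

Even this crude check shows why grammar matters: keyword search alone would
return both sentences for the query "Bill Gates said."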
Hitachi: Move the Train With Your Brain
Associated Press (06/22/07) Tabuchi, Hiroko
Researchers at Hitachi's Advanced Research Laboratory in Hatoyama, Japan,
have developed the "brain-machine interface," a device that uses optical
topography technology to enable people to control electronic devices using
their thoughts. The device analyzes slight changes in the brain's blood
flow and translates brain activity into electric signals. In a
demonstration, a reporter wearing the device's cap, which connected to a
control computer and a toy train set, was able to start and stop the toy
train by performing calculations in her head. Optical topography sends a
small amount of infrared light through the brain's surface to monitor
changes in blood flow. Traditionally such technology has been reserved for
medical uses, but Hitachi's scientists are striving to refine the
technology for commercial use. The company is ready to develop a TV remote
that lets users turn a television on or off and change the channels only by
thinking, and the technology could eventually replace remote controls and
keyboards and help disabled and paralyzed people operate electric
wheelchairs, beds, and artificial limbs. The brain-machine interface does
not require an implant, like some earlier technologies did, but the
interface still needs to be adjusted to more accurately detect intentional
signals and ignore background brain activity.
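Conceptually, turning a blood-flow signal into a start/stop command reduces
to detecting when activity rises above a resting baseline. The sketch below
is a simple thresholding illustration with invented numbers; it is not
Hitachi's actual signal processing:

```python
def classify_activity(samples, baseline, threshold=0.15):
    """Return 'start' when the mean blood-flow reading exceeds the
    baseline by more than `threshold`, else 'stop'.

    `samples` stand in for relative haemoglobin-concentration readings
    from the optical sensors; all values here are illustrative only.
    """
    mean = sum(samples) / len(samples)
    return "start" if mean - baseline > threshold else "stop"

baseline = 1.00
resting = [1.02, 0.99, 1.01, 1.03]       # idle thoughts: near baseline
calculating = [1.24, 1.31, 1.27, 1.22]   # mental arithmetic raises blood flow

print(classify_activity(resting, baseline))      # stop
print(classify_activity(calculating, baseline))  # start
```

The hard part Hitachi still faces lies precisely in choosing that threshold:
it must separate intentional signals from background brain activity.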
Security Study Pokes Holes in Advanced Authentication
Claims
Ars Technica (06/20/07) Hruska, Joel
A new study by researchers at Harvard University and the Massachusetts
Institute of Technology raises concerns about the potential effectiveness
of image authentication systems, which banks consider to offer better
security protection than simple passwords. Image authentication systems
reportedly offer an additional layer of security, as users are presented
with an image they previously chose, usually when password input is
required. For the study, the researchers divided the participants into
three groups. The first group was told they were doing normal banking
activities on a Sunday afternoon, while the second group was told to focus
on security. The third group used their own user ID and passwords at the
Web site of their own bank. The researchers tested the response of the
participants when the login page showed "http://" rather than "https://,"
and all 63 users provided their login data and password. Next, the
authentication images were removed and replaced with a generic "this
service is being upgraded" notice, and 58 out of 60 participants continued and
entered their data. Finally, the researchers created a dramatic warning
page warning that the Web site's security certificate might not be safe,
and 30 out of 57 people still proceeded to log in. Broken down by group,
22 members of the security-focused second group continued despite the
warning, as did eight of the 14 participants using their own credentials.
The research shows that 97 percent of the participants entered their login
information and continued when they were provided with a clear message that
there were problems with the image authentication system and that it may
not be secure.
Storing Light
Technology Review (06/20/07) Bullis, Kevin
High-speed computing and communications could be enabled by a minuscule
light storage device developed by Cornell University researchers led by
electrical and computer engineering professor Michal Lipson. The device
employs an optically controlled "gate" that can be opened and closed to
store and emit light, which could impose control over the order and timing
of data transmission. The gate mechanism is a pair of silicon rings
sandwiched between two parallel silicon tracks, and the ability to trap the
light in the rings--as well as release it--is derived from the researchers'
discovery that the rings can be tuned to divert different colors by
striking them with a brief light pulse. MIT physics professor Marin
Soljacic reports that the Cornell researchers' breakthrough is important
because it facilitates the storage of light in a very small device under
ambient conditions. Thus far the rings can only trap a portion of a pulse
of light, resulting in the loss of any data encoded in the shape of the
overall pulse. Furthermore, the length of storage time is limited under
the current arrangement, according to Lipson. MIT computer science
professor Mehmet Yanik says the first problem can be addressed through the
compression of the pulse and the employment of a cascade of rings, while
Lipson says the amplification of the light signal following its release
from the rings may solve the second problem.
Panelist Notes Politics of Putting Agency Information
Online
National Journal's Technology Daily (06/19/07) Sternstein, Aliya
OpenTheGovernment.org executive director Patrice McDermott used a workshop
appearance to argue to the technology community that politics plays a
significant role in the government's underutilization of the Internet. The
workshop, sponsored by the World Wide Web Consortium and the Web Science
Research Initiative and held at the National Academy of Sciences, united
government officials, computer scientists, academics, Web standards
leaders, and government vendors in an effort to facilitate the deployment
of Web standards across government Web sites, help create research agendas,
and guide officials in creating Web policy that increases access to
government information. Members of the technology community want the
government to explore the "Semantic Web" to provide deeper content
analysis, but McDermott said she would like to see the government use Web
1.0 first. McDermott said that at the workshop attendees told her the
government only needs to make databases available online, and the online
community will reformat the content so it is compatible with new
technologies. "What the people in there--mostly technology people--don't
understand is that it's not just a resource decision, it's a political
decision to expose that information," McDermott said. "It's really more
the politics than the policy."
Distributed Sensing
CIO (06/14/07) Hapgood, Fred
The concept of distributed sensing involves networking large numbers of
sensing points, fixed, mobile, or both, to collect and analyze data,
sharing information and results among all sensing points. An example might
be networking home weather stations in a neighborhood to form a
fine-grained local weather picture. Another is using GPS units in cars to
monitor the routes people drive and so provide better driving directions
online. One of
distributed sensing's greatest potential benefits is to make urban
planning, traffic engineering, and crowd control far more empirical.
Distributed sensing could help capture, preserve, and record historically
and scientifically important information that currently goes unrecorded,
ranging from the acoustic levels people experience throughout the day to
the way a city's landscape and buildings change over decades, by integrating
sensor networks into devices that people already carry, such as cell phones.
The Center for Embedded Networked Sensing at UCLA has been investigating
network architecture and has found that distributed sensors need to be
capable of monitoring themselves. A remote sensing network needs to be able
to examine each sensor's position and settings, much like how a microphone
needs to be properly oriented and adjusted. The sensors need to be able to
collect and use information on themselves to evaluate inputs and reset
configurations when needed. The network should be capable of verifying the
time and location of collected information and be able to verify the
reliability and accuracy of the source.
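The self-monitoring requirement described above can be sketched as a sensor
node that validates its own position and settings and stamps every reading
with time, location, and a self-check result. All field names and thresholds
here are hypothetical:

```python
import time

class SensorNode:
    """Minimal self-monitoring sensor node (illustrative fields only)."""

    def __init__(self, node_id, lat, lon, gain):
        self.node_id = node_id
        self.lat, self.lon = lat, lon
        self.gain = gain  # amplifier setting, like a microphone's level

    def self_check(self, expected_pos, max_drift_deg=0.001,
                   gain_range=(0.5, 2.0)):
        """Verify position and settings, much as a microphone must be
        properly oriented and adjusted before its input is trusted."""
        problems = []
        if (abs(self.lat - expected_pos[0]) > max_drift_deg
                or abs(self.lon - expected_pos[1]) > max_drift_deg):
            problems.append("position drift")
        if not (gain_range[0] <= self.gain <= gain_range[1]):
            problems.append("gain out of range")
        return problems

    def report(self, value, expected_pos):
        """Stamp each reading with time, location, and a self-check result
        so the network can judge the source's reliability."""
        return {
            "node": self.node_id,
            "time": time.time(),
            "pos": (self.lat, self.lon),
            "value": value,
            "problems": self.self_check(expected_pos),
        }

node = SensorNode("ns-07", 34.0522, -118.2437, gain=1.0)
print(node.report(62.3, expected_pos=(34.0522, -118.2437))["problems"])  # []
```

A reading that arrives with a non-empty `problems` list can be down-weighted
or discarded, which is exactly the source-reliability verification the
network needs.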
U.S. Science Policy: Congress Splits Over Plan to
Consolidate Intelligence Research
Science (06/22/07) Vol. 316, No. 5832, P. 1693; Bhattacharjee, Yudhijit
Director of National Intelligence (DNI) Michael McConnell has proposed
combining existing U.S. research and development programs at 14 agencies
into a new organization. Based on the Defense Advanced Research Projects
Agency (DARPA), the proposed Intelligence Advanced Research Projects
Activity (IARPA) would combine the Intelligence Technology Innovation
Center at the CIA, the Advanced Research and Development Activity at the
NSA, and the National Technology Alliance at the National
Geospatial-Intelligence Agency. McConnell said the unified research
program will stimulate long-term research for gathering and analyzing
intelligence that currently does not fit the mission of any particular
agency. "We are in a rut right now, turning the crank on the same
technologies," said IARPA acting director Steven Nixon. Although the plan
has received a mixed reaction in the House, it has more support in the
Senate. "We think IARPA can fill in gaps between the needs of single
agencies," said a Senate aide. "It's an invalid concern that IARPA is
suddenly going to become the program manager for all the science that's
done by the intelligence community." The IARPA would sponsor and provide
grants for basic and applied academic research on intelligence-related
issues such as machine translation of foreign languages, pattern
recognition, and quantum encryption. University of California computer
scientist Mark Steyvers says current programs funded by the CIA and NSA are
designed to produce immediate results, while he predicts IARPA-funded
research would stimulate "exciting new collaborations" among various
research fields that could offer broad applicability.
Murky Trade in Bugs Plays Into the Hands of
Hackers
New Scientist (06/16/07) Vol. 194, No. 2608, P. 30; Biever, Celeste
Computer security consultant Charlie Miller believes the security of the
Internet could be improved if researchers were offered financial incentives
to search for and report software bugs, as the increasing complexity of
software has made finding such vulnerabilities tougher and more
time-consuming. As a result, many "white-hat" hackers no longer feel
bragging rights alone are enough compensation for bug-hunting, which only
serves to improve the chances of "black hat" hackers finding and exploiting
the bugs for criminal purposes. Companies are offering money for zero-day
bugs, which they use to create patches for customers who use their
anti-intrusion products, but Miller says a typical payoff from these
firms--estimated by University of Cambridge researcher Andy Ozment to be
between $2,000 and $10,000--is not enough to coax the top researchers to
seek out bugs. Compensation for bugs is based on the severity of the
vulnerability as judged by the buyer, which requires the bug hunters to
disclose all their information on the bug to the company before an offer is
made. This is a situation where Miller says the researcher has "no
leverage at all." Compounding the problem is the existence of a black
market for bugs run by malevolent hackers willing to pay top dollar, which
can be a great temptation for researchers who feel they are not being
fairly compensated. One alternative to offering more money for bugs is for
companies to be more honest about how much they are willing to pay, giving
researchers a clearer picture of how much a bug is worth before attempting
to sell it. Rainer Bohme of Germany's Dresden University of Technology
says such a strategy could also encourage firms to produce less buggy
software.