Privacy Worries Over Web's Future
BBC News (05/24/06) Fildes, Jonathan
As researchers continue to develop the Semantic Web, major privacy issues
could arise because of the confluence of multiple sources of data about
people and places, according to Hugh Glaser of the University of
Southampton, though he admits that it will be several years before even the
Semantic Web programs that have already been developed become available to
the public. When Tim Berners-Lee invented the Web in 1989, it was
impossible to predict how integral it would become to everyday life. The
current Web has major limitations, however, including the fact that the
majority of its information cannot be read by a computer. Developers of
the Semantic Web are trying to bring order to the jumble of photographs,
calendars, public records, and other items so that computers can build a
coherent, composite picture of a person, place, or thing. "Imagine if
you can link real-time prescription data for flu remedies with geographical
data," said Nigel Shadbolt of the University of Southampton. "You can do
real-time epidemiology and see flu outbreaks as they happen." The Semantic
Web could also create personalized weather forecasts with information
provided by global positioning systems. While the extended reach of the
Semantic Web would make for a much smarter platform, it could also tap into
confidential information as it searches for multiple sources, such as
health records, purchasing histories, or contact information. "All of this
data is public data already," said Glaser. "The problem comes when it is
processed." That said, researchers will have many years to address the
security concerns, and some argue that rather than presenting an entirely
new problem, the Semantic Web merely complicates the security concerns that
already plague the existing Web.
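The aggregation risk Glaser describes can be sketched in a few lines of code. Each record below is hypothetical and individually innocuous, but joining the sources on a shared identifier yields a composite profile; the function and field names are ours, invented for the illustration.

```python
def merge_sources(*sources):
    """Join records from several datasets on their shared 'id' key."""
    profiles = {}
    for source in sources:
        for record in source:
            profile = profiles.setdefault(record["id"], {})
            profile.update({k: v for k, v in record.items() if k != "id"})
    return profiles

# Three independently public, individually unremarkable datasets:
prescriptions = [{"id": "p1", "remedy": "flu antiviral"}]
locations     = [{"id": "p1", "city": "Southampton"}]
purchases     = [{"id": "p1", "last_purchase": "thermometer"}]

composite = merge_sources(prescriptions, locations, purchases)
print(composite["p1"])  # one profile combining all three facts
```

The privacy problem is exactly this step: no single source is sensitive, but the merged profile is.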
Politicos Ponder Patent System Changes
CNet (05/23/06) Broache, Anne
The highly publicized lawsuits involving eBay and the BlackBerry have
prompted Congress to reconsider the technology industry's calls to fix what
it describes as a broken patent system. Though much of the Senate has been
focused on a pending immigration bill, an intellectual property panel that
included independent inventors, representatives from a major technology
company, a pharmaceutical and biotech manufacturer, and academics convened
within the Senate Judiciary Committee, whose leaders admitted that they had
a lot of work ahead of them to reach a solution. A spokesperson for Sen.
Orrin Hatch (R-Utah), who chairs the panel, said the senator has been at
work on a bill since at least last year but has no plan for when it will be
introduced. The session dealt primarily with reforming patent litigation,
including the creation of a method for the public to comment on the
validity of recently granted patents outside of the courts, though there
was little agreement on how such a system should be implemented. Though
the technology and pharmaceutical industries share little common ground in
the area of patent reform, they did agree on the need for some form of that
system to be administered by the U.S. Patent and Trademark Office, known as
"post-grant opposition." Under such a system, the public would have a
predetermined window in which to contest a patent, which supporters hope
would cut down on the time and resources consumed by litigation.
Representatives from the technology and financial services industries
argued that there should be a second window for disputing patents because
the proposed times of six to nine months would not give companies in those
industries enough of a chance to comb through the thousands of patents that
could relate to their own products.
Too Much for NSA to Mine?
Government Computer News (05/22/06) Vol. 25, No. 13, Wait, Patience
The controversy over the NSA's covert program of collecting data on
millions of phone calls placed by ordinary citizens raises the question of how
well the agency will actually be able to mine the vast quantities of
information it is amassing. Although the NSA is not revealing any details
about its databases or the technologies that it is using to maintain and
search them, the Electronic Frontier Foundation (EFF) reports that AT&T's
Daytona call detail record (CDR) database, which was reportedly made
accessible to the NSA, exceeds 312 TB. Assuming that figure is accurate
and that Verizon and BellSouth provided access to databases of similar
sizes, the NSA could have more than 900 TB of data on its hands, requiring
massive storage capacity, intense computing power, and sophisticated
analytical software. Access to the bulk of the database in real time is
critical for effective data mining, though some believe the NSA is frozen
out of much of its own information by virtue of its sheer size. "My
impression--strictly a professional guess--is that at least 75 percent of
what NSA 'knows' is...offline and not accessible," said Robert Steele, CEO
of OSS.net. "You cannot do good pattern analysis, including historical
comparisons, without massive online storage." SGI has begun developing
computers with terabyte-scale active memories, the largest containing 13
TB, which is not enough memory to handle even 1.5 percent of the three CDR
databases put together. Moreover, a computer's capacity for memory space
is limited by its amount of address bits on chips, according to SGI's Bill
Mannel. "Some of our customers who already have big-memory databases are
looking for something beyond [what they have], but they have power and
footprint problems," Mannel said, adding that the storage architecture must
be overhauled to incorporate enough RAM to access the entire database.
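The scale gap Mannel describes can be checked with back-of-the-envelope arithmetic. The figures are the article's (312 TB per carrier, 13 TB of active memory); the calculation itself is ours.

```python
import math

TB = 2**40  # one terabyte, in bytes

att_cdr  = 312 * TB      # EFF's estimate for AT&T's Daytona CDR database
combined = 3 * att_cdr   # assuming Verizon and BellSouth are comparable
sgi_ram  = 13 * TB       # largest SGI terabyte-scale active memory cited

print(f"combined databases: {combined // TB} TB")        # 936 TB
print(f"fraction in memory: {sgi_ram / combined:.1%}")   # about 1.4%

# Byte-addressing the combined data needs ceil(log2(size)) address bits:
print(f"address bits needed: {math.ceil(math.log2(combined))}")  # 50
```

Fifty address bits is well within a 64-bit architecture, so the limit Mannel points to is the physical address width and memory capacity of real chips, not the arithmetic.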
Embedded Software Made Simpler Yet More Powerful
IST Results (05/22/06)
A European research team has used high-level Constraint Logic Programming
(CLP) languages to achieve a major advance in the development of pervasive
computing systems. As part of the ASAP project, researchers at the
Technical University of Madrid, Heinrich-Heine University of Dusseldorf,
and Roskilde University have used the high-level declarative language Ciao
in a series of case studies, including one in which pervasive application
kernels were written in Ciao for a wearable computer system. The
researchers have developed the CiaoPP toolkit in an effort to create
specialized programs that are automatically optimized for specific
processing and resource needs. "Software created with the toolkit is
comparable in terms of resource demands to code written in C if it is
designed to do the same thing," says German Puebla, ASAP project
coordinator and a researcher at the Madrid university. Until now,
low-level languages such as C have been the focus of researchers due to
concerns about efficiency and resource demands of code. Software will need
to be interoperable and efficient if multiple distributed platforms are to
communicate at a high level, such as having computers integrated and
embedded in appliances around a home. Puebla says within five to 10 years
pervasive devices will be efficient and affordable for widescale
introduction in everyday objects and environments.
When It Comes to Privacy, Gender Matters
UW News (05/23/06)
Researchers at the University of Washington have found that women are more
concerned about privacy in public places than men are, challenging the
notion that people no longer expect their privacy to be respected once they
leave their homes. Indeed, almost a quarter of the men and women involved
in the study said that any amount of video capture is an invasion of their
privacy. Most people of either gender did not object to on-campus video
capture, though a majority of women found off-campus surveillance
unsettling. Of the nearly 900 people included in the survey, 780 were told
beforehand that a camera mounted at the top of a tall campus building was
monitoring their movements and relaying the image to a plasma screen set up
inside the building. Most men and women did not object to the display
within the office, but a majority of women expressed discomfort at the idea
of sending their images to an off-campus apartment or some other remote
location, suggesting that the university community is perceived as more
trustworthy than the outside world. Most men and women agreed that being
recorded, as opposed to having their images displayed in real time, would
make them uncomfortable, though nearly twice as many women as men had
reservations even about real-time display. "Over half (55 percent) of the
participants we surveyed expressed some concern for having their image in a
public place collected and displayed elsewhere," said Peter Kahn, associate
professor of psychology and one of the lead authors along with UW
Information School professor Batya Friedman, both of whom are co-directors
of the UW's Value Sensitive Design Research Lab. The study will be
published in next month's Journal of Human Computer Interaction.
Think About It, This Will Make Turning on Your Computer
Much Simpler
Times (UK) (05/22/06) Ahuja, Anjana
Researchers involved in the field of EEG (electroencephalogram) biometrics
say a brain has its own unique signature and that brain waves can reveal
the identity of an individual. Ramaswamy Palaniappan, a computer scientist
at the University of Essex, is working to take accurate brain fingerprints.
He has conducted a test on alcoholics, and found that their brain output
generates a distinctive pattern of electrical pulses in the frequency range
of 30 Hertz to 50 Hertz, the gamma band. Julie Thorpe of Carleton
University in Ottawa believes "passthoughts" will one day replace passwords
for accessing computers, while Swiss Federal Institute of Technology
imaging expert Touradj Ebrahimi says brain fingerprinting will rival DNA
fingerprinting in the future. However, EEG biometrics researchers may have
to find a better way to measure brain waves if brain fingerprinting is to
replace conventional fingerprint or iris recognition biometrics.
Currently, volunteers must wear a gel-smeared skullcap
sprouting electrodes that transmit the electrical pulses to a detector.
EEG biometrics potentially has high level security applications, such as in
military environments, says Palaniappan, who adds that "you can chop off
fingers but you can't forge a brain signal."
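As a rough illustration of the gamma-band measurement Palaniappan's work relies on, the sketch below uses the Goertzel algorithm to compare a synthetic signal's energy in the 30-50 Hz gamma band against the 8-12 Hz alpha band. The sampling rate and the 40 Hz test tone are invented for the example; real EEG analysis is far messier.

```python
import math

def goertzel_power(samples, sample_rate, freq):
    """Signal power at one frequency, via the Goertzel algorithm."""
    n = len(samples)
    k = round(n * freq / sample_rate)   # nearest DFT bin
    coeff = 2 * math.cos(2 * math.pi * k / n)
    s1 = s2 = 0.0
    for x in samples:
        s1, s2 = x + coeff * s1 - s2, s1
    return s2 * s2 + s1 * s1 - coeff * s1 * s2

rate = 256                                   # Hz, a common EEG sampling rate
signal = [math.sin(2 * math.pi * 40 * i / rate) for i in range(rate)]

gamma = sum(goertzel_power(signal, rate, f) for f in range(30, 51))
alpha = sum(goertzel_power(signal, rate, f) for f in range(8, 13))
print(gamma > alpha)  # True: the 40 Hz tone's energy sits in the gamma band
```

Comparing band powers like this is one simple way to extract the kind of per-person frequency signature the researchers describe.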
Robot Mimics Tongues, Trunks, Tentacles
Discovery Channel (05/22/06) Staedter, Tracy
Researchers working on the OctArm project report that the unconventional
robot with flexible joints has successfully grasped objects and clung to
them while submerged in rushing water in field tests. OctArm resembles an
elephant trunk, in that it is wider at the base and tapered toward the tip,
and its movements are based on the muscles that control tongues, trunks,
and tentacles. The scientists use a joystick to move the robot into a
coiled shape, or extend its position, through the manipulation of air
pressure. "These robots are invertebrate robots and are good at getting
into tight spaces and wriggling around," says Ian Walker, a professor of
computer and electrical engineering at Clemson University, where his team
has been involved in the project for about 10 years. More advanced
algorithms will be needed to get OctArm to curl around an object, lift, and
perform other complicated motions. The researchers envision a powerful
robotic arm that will be able to grasp objects like an elephant or move
like a snake, which would be useful in the rubble of a disaster zone or on
the surface of a distant planet. Walker's team consists of researchers
from eight other institutions, and they will focus on improving the robot's
precision and adding sensors and a camera this summer.
Sex, Politics and the Internet
International Herald Tribune (05/21/06) Shannon, Victoria
Opponents of ICANN's oversight of Internet governance are holding up the
organization's recent veto of the .xxx domain as further proof that the
U.S. government is meddling in the affairs of the Internet. "We see here a
first clear case of political interference in ICANN," a spokesman for
Viviane Reding, the EU commissioner for information society and media, said
after the vote. ICANN CEO Paul Twomey was quick to counter, admitting that
while objections to the domain were received from numerous governments,
including that of the U.S., what ultimately lay behind the decision was the
board's doubt that ICM Registry, the Florida company that applied to run
dot-xxx, could live up to its pledge to adhere to all international
regulations regarding pornography. "The question in the end was, how do you
scale that?" said Twomey. "The nine votes against were not satisfied that
the applicant could do it...It became clear that if we approved this, ICANN
would end up being the world's censor," a position the board refused to
accept. ICM has appealed the decision. Meanwhile, opponents of ICANN
oversight say that potential congressional action on "Net neutrality" is
yet another example of the U.S. meddling where it should not. But World
Wide Web Consortium director Tim Berners-Lee can see both sides. "I'm not
a great enthusiast for legislation or governments trying to control the
Internet," he said. "But this legislation is not about government control
as much as about preventing corporate control." Nevertheless, Berners-Lee,
speaking before the WWW2006 conference in Edinburgh this week, says that
technology can affect policy issues. He says, "The only reason for
introducing technology is a social reason, to support society better. But
the implications of technology are not always obvious to people making
policy."
'Google Hacking' Attacks Rising
Massey News (05/19/06)
Researchers at Massey University report that Google hacking attacks are on
the rise and that many Web sites in New Zealand are more vulnerable than
people suspect. Hackers who use Google's search engine to uncover
sensitive personal information pose a threat to businesses, governments,
and other organizations that store individuals' data. The study conducted
a vulnerability comparison of Web sites in New Zealand with those in
Australia, the United States, and the Czech Republic, and found New
Zealand's to be the least secure. Using carefully chosen keywords, the
researchers ran 170 queries each day for three months, and found that sites
with the organizational domain names .co and .org were the most vulnerable.
The vulnerabilities remained open for an average of 60.96 days, or 57
percent of the testing period, and the problem is not likely to solve
itself. "Security on the Web is likely to remain an ongoing battle," said
Ellen Rose, a senior lecturer at the Institute of Information and
Mathematical Sciences. "On the one side, hackers will continue to employ
new tactics, using tools like Google in unforeseen ways. Security experts
must try to minimize exposure by detecting problems and putting
countermeasures, such as security audits, in place. Google hacking
vulnerability should be included in these security audits."
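The "carefully chosen keywords" the researchers ran are typically composed from search operators that restrict results to a domain and to file types or page titles that often leak data. A minimal sketch of how such query sets are generated, using well-known illustrative operators and a hypothetical domain:

```python
# Well-known illustrative "dork" operators; real audits use far more.
OPERATORS = ["filetype:sql", 'intitle:"index of"', "inurl:admin"]

def build_queries(domain, keywords):
    """Combine a site restriction with each operator/keyword pair."""
    return [f"site:{domain} {op} {kw}"
            for op in OPERATORS for kw in keywords]

queries = build_queries("example.co.nz", ['"password"', '"backup"'])
print(len(queries))   # 3 operators x 2 keywords = 6 queries
print(queries[0])     # site:example.co.nz filetype:sql "password"
```

Running a fixed set of such queries daily, as the Massey team did, turns the search engine itself into a vulnerability scanner, which is why the authors recommend folding the same queries into routine security audits.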
Hacking Your Prius
CNet (05/22/06) Terdiman, Daniel
Toyota Prius owners are increasingly finding ways to hack into their
vehicles' systems to alter factory specifications in an attempt to get more
miles per gallon. "In the 1950s, it was all about getting more speed.
Now, instead of getting more horsepower, it's about getting more miles per
gallon," said Phillip Torrone of Make Magazine. Rising gas prices and
concerns over an emerging energy crisis have strengthened the incentive for
hybrid-car owners to override factory-set features. Hackers have been able
to modify the car so that it runs mostly on battery power, raising the
car's fuel efficiency to nearly 100 miles per gallon. Prius owners have
also executed hacks to alter other features, such as the beeping noise that
some late model cars make when put into reverse. Early Prius adopters have
formed a closely knit community to share information about methods for
hacking the car's systems, and as a class are likely to have the expertise
to execute such hacks, according to Dave Watson, president of Coastal
Electronics, a company that promotes Prius modification kits. Toyota
acknowledges that some owners will take steps to modify their cars, though
it does not condone the behavior, particularly the hack that enables users
to operate the GPS navigation system while the car is in motion. Watson
counters that Toyota made an arbitrary distinction when it decided which
features users could and could not operate while driving the car due to
safety concerns. The feature that enables the Prius to run almost entirely
on battery power at low speeds is available on models sold in Europe and
Asia, but Toyota claims that the U.S. regulation requiring it to offer an
eight-year warranty for its power system prevents it from including the
option in models sold in the United States.
Champion of Cyberspace Faces Its Biggest Case Yet
San Francisco Chronicle (05/23/06) P. A1; Egelko, Bob
The Electronic Frontier Foundation (EFF) will face what could be the most
significant case of its 16-year history when a federal judge hears
dismissal motions from both AT&T and the Bush administration in a suit
alleging that AT&T broke the law by handing over tens of millions of
communications records to the NSA. The EFF has been alternately praised
as a champion of the common man and condemned as the enemy of the free
market. "Their first instinct is to mistrust corporations, organizations
competing in the market, to not have faith that competition will solve
problems," said Patrick Ross of the Progress and Freedom Foundation. The
EFF counters that it is in favor of the free market, but it warns against
the alignment of government, private industry, and technology. "In
different moments, each of these are friends of civil liberties," said
Jennifer Granick, executive director of the Center for Internet and Society
and an EFF supporter. "Sometimes they conspire in some combination of the
three to be a challenge to civil liberties." The EFF has struggled to
convince the courts that it is attempting to safeguard essential freedoms,
such as the right to have a private conversation. The group was
outmaneuvered by the entertainment industry last June when it failed to
frame the argument over music downloading as one about stifling innovation
rather than stealing intellectual property. The foundation has vigorously
campaigned against the stipulations of the 1998 Digital Millennium
Copyright Act, including the rule barring users from bypassing piracy
protections, though the law has generally held up in the courts.
Quiet Slowdown in Computer Revolution
Fairfax New Zealand (05/24/06) Cleary, John
The speed aspect of Moore's law no longer appears to apply, with
researchers at Intel discovering that they were unable to make faster chips
at the end of 2004, writes Waikato University computer science professor
John Cleary. Meanwhile, he says other major chip manufacturers shut down
projects or stopped making announcements regarding increases in speed. In
1965, computer engineer Gordon Moore predicted that the number of
transistors on a chip would double about every two years, a forecast he
made only through 1975 but that has held up for 40 years; for most of that
time, rising transistor counts also brought rising clock speeds and falling
prices. Transistor counts continue to double and prices continue to fall,
and those factors still apply. The presence of more transistors on chips will make
some things go faster because they will be able to handle different tasks
at the same time. This will mean that the speed and quality of
applications such as games will continue to improve. Currently, the
fastest chips stop short of 4 GHz, and there is no indication that chip
makers will be able to produce anything that can top that speed. Computer
scientists will now need to focus on getting chips to do more things at the
same time, Cleary writes.
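The catch in "doing more things at the same time" can be made concrete with Amdahl's law, which bounds the speedup from parallelism by a program's serial fraction. The 90-percent-parallel figure below is an illustrative assumption, not a claim from the article.

```python
def amdahl_speedup(parallel_fraction, cores):
    """Overall speedup when only part of a program can run in parallel."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A program that is 90% parallelizable:
for cores in (2, 4, 16, 1024):
    print(f"{cores:5d} cores -> {amdahl_speedup(0.9, cores):.2f}x")
# The 10% serial remainder caps the speedup below 10x, no matter how
# many transistors are spent on extra cores.
```

This is why the shift from faster clocks to more cores is a genuine change of problem for computer scientists rather than business as usual.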
The Fight Against V1@gra (and Other Spam)
New York Times (05/21/06) P. 3-1; Zeller Jr., Tom
As email filtering technologies have become more sophisticated, bulk
emailers have begun sending larger, image-based messages in an attempt to
slip past antispam filters. While end users are no longer inundated with
the same volume of unwanted email that they faced just a few years ago,
spam is still a major problem for network operators. Inbox filters weed out
most of the bulk messages before users see them, though spam still
accounts for around 70 percent of all email traffic, in spite of the
numerous regulatory initiatives enacted throughout the world designed to
combat the problem. Between one-half and three-quarters of all spam is
produced by zombie computers. Spammers who work out of countries with lax
law enforcement such as Nigeria or Russia have little incentive to cease
their operations, particularly when they can turn handsome profits by
eliciting responses from less than 1 percent of the up to 200 million
messages that they send out daily. Antispam groups have developed
technologies to determine whether the borders of images in spam email have
been generated randomly, a tactic that bulk emailers have recently adopted
to evade filtering tools. "There are loads of different kinds of
obfuscation," said MessageLabs senior antispam technologist Nick Johnson.
"They've realized that people are looking for V1agra spelled with a '1' and
st0ck with a 'zero' and that sort of thing, so they might try some sort of
meaning obfuscation," he added, such as referring to a watch as a "wrist
accessory" rather than a "Rolex." Johnson also described a particularly
impressive spam trick in which spammers used incorrect spelling and HTML
code in such a way as to evade detection by software programs while still
appearing correct to viewers. MessageLabs' Matt Sergeant says the company has also
developed a database of "scam DNA" which uses pattern analysis to find spam
that uses language common enough to avoid detection otherwise.
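A minimal sketch of the character-substitution detection Johnson alludes to, assuming a single guessed letter per swapped character (here '1' maps to 'i'); real filters try many candidate mappings and combine this with the other layers the article mentions.

```python
import re

# One guess per common digit/symbol swap; a real filter tries alternatives
# too (e.g. '1' as 'l' as well as 'i').
SUBSTITUTIONS = str.maketrans("10354$@", "ioesasa")
BLOCKLIST = {"viagra", "stock", "rolex"}

def looks_like_spam(text):
    """Normalize common substitutions, then match against a blocklist."""
    normalized = text.lower().translate(SUBSTITUTIONS)
    words = re.findall(r"[a-z]+", normalized)
    return any(word in BLOCKLIST for word in words)

print(looks_like_spam("Cheap V1agra and hot st0ck tips"))  # True
print(looks_like_spam("Quarterly stocktaking report"))     # False
```

The "meaning obfuscation" Johnson describes ("wrist accessory" for "Rolex") defeats exactly this kind of lexical matching, which is why pattern-analysis approaches like MessageLabs' "scam DNA" are needed on top.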
Perils of Transitive Trust in the Domain Name
System
Cornell University (05/06) Ramasubramanian, Venugopalan; Sirer, Emin Gun
The complexity of the domain name system (DNS) is such that a
vulnerability in a little-known nameserver can have serious ramifications,
while trust relationships are difficult to enumerate and trace,
write Venugopalan Ramasubramanian and Emin Gun Sirer with Cornell
University's Department of Computer Science. A reliance on transitive
trust engenders a situation where trust relationships can change without
even the most assiduous name owners realizing it. The authors' survey of
the trusted computing base in DNS reveals its great extent and potential
inclusion of over 400 nodes; an average name relies on 46 nameservers,
while the average in certain top-level domains tops 200. One third of
domain names can be hijacked with publicly-known exploits through DNS,
enabling hackers to wreak mischief. The survey also finds that 10 percent
of the namespace is controlled by some 125 servers, one-fifth of which are
run by educational institutions that may lack sufficient inducements and
resources to practice integrity enforcement. Name security on the Internet
can be fortified by the implementation of DNSSEC, although the authors
caution that this solution still depends on the same physical delegations
as DNS during lookups. DNSSEC must be more widely adopted in order to be
truly effective, and even the support of DNSSEC by all nameservers cannot
eliminate the disruption of name resolution by denial of service attacks on
Web services. Ramasubramanian and Sirer reason that network administrators
must have more familiarity with DNS vulnerabilities and exercise greater
diligence over their trust relationships.
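The transitive-trust measurement the authors perform can be sketched as a reachability computation over a delegation graph. The graph below is hypothetical, but the traversal mirrors how a name's trusted computing base is enumerated: every server a name's servers depend on is, transitively, trusted.

```python
# Hypothetical delegation graph: each name or server maps to the
# servers it depends on for resolution.
DEPENDS_ON = {
    "www.example.edu":       ["ns1.example.edu", "ns.outsourced-dns.com"],
    "ns1.example.edu":       ["a.edu-servers.net"],
    "ns.outsourced-dns.com": ["a.gtld-servers.net", "b.gtld-servers.net"],
}

def trusted_computing_base(name, graph):
    """Every server the name transitively depends on (its DNS TCB)."""
    seen, stack = set(), [name]
    while stack:
        for dep in graph.get(stack.pop(), []):
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return seen

tcb = trusted_computing_base("www.example.edu", DEPENDS_ON)
print(len(tcb))  # 5 servers; compromising any one can hijack the name
```

Outsourcing DNS to an external provider, as in this toy graph, is exactly how a name picks up dependencies its owner never audits, which is the authors' point about trust changing without diligent owners noticing.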
Will Your Vote Count in 2006?
Newsweek (05/29/06) Vol. 147, No. 22, P. 14; Levy, Steven
With experts calling the recently reported vulnerabilities in e-voting
machines the most serious ever discovered, Americans' confidence in the
integrity of the election process is in jeopardy, writes Steven Levy.
Diebold claims that the flaw uncovered last month by Finnish security
expert Harri Hursti was designed to enable the machines to easily receive
software upgrades, though that feature also invites the possibility that
anyone with an elementary familiarity with the machines could install
malicious code in a matter of minutes. Hackers could program the machines
to fail on Election Day or, worse still, manipulate the ballot-counting
functions to switch votes from one candidate to another. That type of
software is capable of disguising itself so that even authorized
technicians would be unable to detect its presence. "If Diebold had set
out to build a system as insecure as they possibly could, this would be
it," said Avi Rubin, a professor of computer science at Johns Hopkins
University. Concerns over the security of e-voting machines have sparked
calls for including a mechanism to produce paper receipts in the event that
a manual recount is necessary. "When you're using a paperless voting
system, there is no security," said David Dill, a professor at Stanford
University. Twenty-six states have already moved to implement a
paper-recording mechanism, though a legislative initiative that would bar
paperless voting throughout the country is stalled on the House floor. Six
years after the disastrous election of 2000, U.S. voters will head to the
polls this year still uncertain if their votes will be accurately recorded,
Levy gloomily concludes. For information about ACM's e-voting activities,
including a recent report on Statewide Databases of Registered Voters,
visit
http://www.acm.org/usacm.
Certified Reputation--How an Agent Can Trust a
Stranger
University of Southampton (ECS) (05/16/06) Huynh, Trung Dong; Jennings,
Nicholas R.; Shadbolt, Nigel R.
Current approaches to building computational trust models, interaction
trust and witness reputation, are limited, and the authors propose
Certified Reputation (CR) as a trust model that circumvents these
shortcomings by addressing agents' lack of direct experience and the
difficulty in finding witness reports. Through CR, agents can dynamically
supply third-party references about their earlier performance in order to
build up trust among prospective interaction partners. This allows the
rapid establishment of trust relationships while keeping costs to the
involved participants low. The certified reputation of a target agent
consists of references (ratings) supplied by third-party agents on how that
agent has behaved on specific tasks; the target agent collects and retains
these ratings itself and makes them available to any other agent that
wishes to assess its trustworthiness for future interactions. Through these
ratings, the target agent can demonstrate its performance as judged by
previous interaction partners in order to earn potential partners' trust.
The authors acknowledge that the CR data will likely exaggerate an
agent's projected behavior because a rational agent, being able to select
which ratings to present, will only advertise its best ratings. Still, CR
spares agents various expenditures associated with tracking down witness
reports in terms of resources, time, and communication, and enables agents
to assess trust for themselves, removing the need for a centralized
service; this establishes compatibility between CR and open multi-agent
environments. The authors' evaluation of CR shows that the model can help
agents choose better interaction partners faster than with other
computational trust models. Their future plans include developing a
technique to automatically adapt the accuracy tolerance threshold while the
system is running via the analysis of recorded performance levels of
service providers with whom an agent has interacted to ascertain the
probable inconstancy of honest ratings.
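A minimal sketch of the CR mechanics as summarized above (the class and function names are ours, not the authors'): an agent retains ratings from past partners and presents its best ones, and an evaluator aggregates what it is shown. The best-k selection step reproduces exactly the exaggeration bias the authors acknowledge.

```python
class Agent:
    def __init__(self):
        self.ratings = []              # certified references from partners

    def receive_rating(self, score):
        self.ratings.append(score)

    def present_references(self, k):
        """A rational agent advertises only its best ratings."""
        return sorted(self.ratings, reverse=True)[:k]

def assess_trust(references):
    """Evaluator's trust estimate: mean of the references it is shown."""
    return sum(references) / len(references) if references else 0.0

agent = Agent()
for score in (0.9, 0.4, 0.8, 0.2):     # ratings from four past partners
    agent.receive_rating(score)

shown = agent.present_references(k=2)
print(assess_trust(shown))   # about 0.85, vs a true mean of 0.575
```

Because the target agent carries its own references, no centralized reputation service is needed, which is the property that makes CR suit open multi-agent environments.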
Women in IT Speak Out
MC Press Online (05/06) Ordonez, Sandy
In interviewing two women about their IT careers, Sandy Ordonez found that
while their jobs and experiences varied greatly, they had similar insights
into the opportunities and obstacles facing women in IT. Kristen Daebler,
a programmer for Quadrant Software, knew that she wanted to study science
and math from an early age, and decided on programming in high school. "I
always grew up thinking I was unusual for a woman because I was more
logical and less creative. My major in college was computer science and my
minor was math, and that was unusual for a woman," she said. While Daebler
never faced overt discrimination in the workplace, she allows for the
difficulty that women can face when trying to climb the corporate ladder,
given the time demands of working in IT that can interfere with family
life. Daebler also admits that she refused to give up the time with her
kids that would have been necessary to advance her career. She advises
career-minded women in IT to put off having children, unless they are
willing to spend a lot of time apart from them. Maria DeGiglio, currently
employed as a consultant for Experture, has worked as a trainer, an
analyst, an author, and a project manager throughout her more than two
decades in IT. DeGiglio took her bachelor's degree in anthropology, and
only came to IT after working for a New York accounting firm in the 1980s.
DeGiglio says that she never suffered discrimination for being a woman, and
that her liberal arts background did not stand in her way. "When the
momentum started with the PC revolution, [it attracted] a lot of people
from various walks of life, and those that weren't 'scientifically trained'
brought to it a great deal of imagination that revolutionized the whole
movement." While Daebler and DeGiglio both say that being female did not
hold them back, the field of IT continues to struggle to attract and
retain women. To learn about ACM's Committee on Women and Computing,
visit
http://women.acm.org.
Creating and Operating National-Scale Cyberinfrastructure
Services
CTWatch Quarterly (05/06) Vol. 2, No. 2, Catlett, Charlie; Beckman, Pete;
Skow, Dane
The authors use the TeraGrid project as an example of the costs and
functions associated with the provision of a national cyberinfrastructure,
with a focus on the software infrastructure and policies that are necessary
to combine a variety of elements into a reliable and persistent
national-scale facility. A grid facility's software components include
science applications, middleware, infrastructure support services, and the
tools for configuring community-developed systems or "Science Gateways,"
which most often take the form of Web portals. "We have partnered in the
TeraGrid project not only with gateway providers but also with other grid
facilities to identify and standardize a set of services and interaction
methods that will enable Web portals and applications to invoke
computation, information management, visualization, and other services,"
state the authors. A national-scale grid facility taps software
infrastructure that supplies a set of common services, an architecture that
enables unique facility exploitation, and the infrastructure required to
coordinate the user-supportive efforts of resource providers. A successful
grid facility involves close collaboration and cooperation between all
participating organizations, and the identification of specific common
service coordination and provision responsibilities, which in most grid
projects is a function executed by a system integration team. A
general-purpose grid facility must adjust to its user community's changing
ideas and requirements, and the optimal model for user support is one that
fully harnesses all available human links to users and their problem
domains, most frequently by having the user support personnel local to the
resource providers. Each of the facility's resource providers will supply
documentation and training for locally provided resources and services, and
proactively integrating these materials requires a communication framework
that offers structure and common interfaces and formats for the materials,
as well as the curation of the general systems. A national grid facility
must use an operational infrastructure as its platform, while collaboration
systems and processes that support virtual and distributed teams must be
carefully attended to.
While You Were Reading This, Someone Ripped You
Off
Wired (05/06) Vol. 14, No. 5, P. 166; Newitz, Annalee
Hackers are exploiting increasingly pervasive radio frequency
identification (RFID) technology to beat security measures and steal or
vandalize valuable information as well as physical items. The information
carried on most commercial, passive-emitting RFID chips is rarely encrypted
because encryption is so expensive, and this increases the danger that these chips
can be cloned or that the data they hold can be corrupted. Although
writable areas of RFID chips can be locked, many organizations fail to do
so because they are unfamiliar with the equipment's operation or because
the data fields must be regularly updated; using unlocked tags is often a
more convenient option. Examples of RFID hacking include the recording of
data on RFID-based price tags, which hackers can then upload to tags of
other items, and the disabling of car antitheft devices through the use of
a cloner to capture an encrypted RFID signal and a computer to crack it.
"The world of RFID is like the Internet in its early stages," explains RSA
Labs research manager Ari Juels. "Nobody thought about building security
features into the Internet in advance, and now we're paying for it in
viruses and other attacks. We're likely to see the same thing with RFIDs."
Next-generation, RFID-equipped digital passports will reportedly have
unbreakable encryption, but Juels thinks a brute-force attack could
compromise the data since the encryption keys rely on passport numbers and
birthdates that are structured and guessable.
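Juels's brute-force concern can be quantified with rough keyspace arithmetic. The field sizes below are illustrative assumptions, not the passport specification: the point is that a key derived from structured, guessable fields spans far fewer possibilities than a random key of comparable length.

```python
import math

passport_numbers = 10**9        # assume ~9-digit serial numbers
birthdates       = 366 * 100    # day of year x ~100 plausible birth years
expiry_dates     = 366 * 10     # passports valid for roughly 10 years

derived_keyspace = passport_numbers * birthdates * expiry_dates
random_keyspace  = 2**112       # a 112-bit random key, for comparison

print(f"derived key material: about 2^{math.log2(derived_keyspace):.0f}")
print(f"random key:           2^{math.log2(random_keyspace):.0f}")
# A keyspace near 2^57 is within reach of a determined brute-force search;
# 2^112 is not. The weakness is structure and guessability, not key length.
```

In practice an attacker who knows even a victim's approximate age or passport issue date shrinks the search space further still.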