House Panel OKs Digital Licensing Bill
CNet (06/08/06) Broache, Anne
A House panel has approved a measure introduced by Rep. Lamar Smith
(R-Texas) to update the copyright law that established a complex system of
"mechanical royalties" for the right to reproduce and distribute music, a
system deemed antiquated in the digital age. The new bill creates a
"blanket licensing" system in which a one-stop shop overseen by the U.S.
Copyright Office would be established for license approval. Currently,
companies wishing to sell music must negotiate separate agreements for each
song. Supporters hail the measure as a means to facilitate legal music
download services, which they say will eventually drive down prices and provide more
selection for consumers. "We now have the ability to give legal services
the tools to compete with and hopefully drive illegal music services out of
business," said Rep. Howard Berman (D-Calif.), a co-sponsor of the measure.
The Recording Industry Association of America, the Digital Media
Association, and the National Music Publishers Association voiced their
support for the bill in a joint statement but said it still needed to be
ironed out. Opponents of the measure fear it could eventually erode
consumer protections for material recorded for noncommercial purposes and
lead to duplicative fees for the "performance" of material and reproduction
or distribution of it, which now require separate licensing agreements.
They argue that the bill's intent could be shifted to other media and
adversely impact such technologies as TiVo.
Pentagon Sets Its Sights on Social Networking
Websites
New Scientist (06/10/06) Marks, Paul
The NSA is funding research into technologies that could extract meaning
from the mountains of personal data posted on social networking Web sites.
The NSA research could bring the vision of the Semantic Web closer to
reality, as it could combine information from social networking sites with
other data, such as banking, retail, and property records to create
comprehensive profiles of individual users. The focus on social networking
sites comes as Americans are still reeling from the revelation that the NSA
has been collecting phone call records since shortly after the Sept. 11,
2001, attacks. The NSA plans to conduct similar surveillance of the Web,
piecing together a composite picture of individuals by analyzing their
contact networks. The Semantic Web will make the comparison of data in
disparate formats possible thanks to the common structure known as the
Resource Description Framework (RDF). "RDF turns the Web into a kind of
universal spreadsheet that is readable by computers as well as people,"
said David de Roure, an advisor to the W3C. Every piece of numerical data
would have its own tag, and different references to the same concept would
link to each other. While the Semantic Web is expected to transform
Internet search, it will also make it much easier to snoop into people's
private lives. Nevertheless, the organization known as the Advanced
Research and Development Activity, charged with disbursing NSA funds, has taken
an active interest in harvesting social networking data to make meaningful
connections between people.
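As a rough illustration of how RDF tagging enables that kind of cross-source
linking, the sketch below uses Python's rdflib library to describe one
hypothetical person with triples that could come from a social networking
profile and a property record; all URIs and names here are invented for the
example.

    from rdflib import Graph, Literal, Namespace, URIRef

    FOAF = Namespace("http://xmlns.com/foaf/0.1/")  # standard "friend of a friend" vocabulary
    EX = Namespace("http://example.org/")           # hypothetical namespace for this sketch

    g = Graph()
    alice = URIRef("http://example.org/people/alice")
    # A triple that could come from a social networking profile...
    g.add((alice, FOAF.knows, URIRef("http://example.org/people/bob")))
    # ...and one that could come from a property record share the same
    # subject URI, so a machine can join them without knowing either schema.
    g.add((alice, EX.ownsProperty, Literal("12 Main St, Springfield")))

    print(g.serialize(format="turtle"))

Because both statements hang off the same URI, merging graphs from different
sources is a simple set union, which is the "universal spreadsheet" property
de Roure describes.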
Thomas Sterling Speaks to the Future of HPC
HPC Wire (06/09/06) Vol. 15, No. 23; Lazou, Christopher
In a recent interview, Louisiana State University computer science
professor Thomas Sterling shared his thoughts on the challenges currently
facing the high-performance computing environment. In his research,
Sterling is exploring the challenges of balancing efficiency, power,
reliability, and scalability. Sterling describes the dire need for
incorporating latency hiding features in the programming infrastructure.
He is developing a new computing model called ParalleX that he hopes will
lead to a programming methodology capable of operating on traditional
architectures, leading to improvements in scalability and hiding latency.
While federal funding for HPC research is scarce, Sterling is hopeful that
government agencies will soon begin to see the potential of the field.
Sterling believes that university research already under way will extend
the lifespan of silicon as the material that serves as the basis for
computing and take it down close to the nano-scale, though the studies will
require ongoing federal funding. The most likely successor to silicon will
be a new material that actually incorporates some silicon. While some
scientists are conducting intriguing research in optical computing,
Sterling believes that none of the technologies is close to making optical
logic and memory a viable alternative to conventional designs. Sterling
warns of the potential for creating impossibly complex architectures as
companies continue to explore heterogeneous systems, adding components such
as FPGAs and processors-in-memory. Most PCs are not programmed with
scalability in mind, Sterling says, and they are not equipped to deal with
problems such as latency hiding, parallel overheads, and resource
contention. Sterling believes that the language required to address these
issues has yet to be written.
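The article does not describe ParalleX's mechanics, but the general idea of
latency hiding can be shown with a toy Python sketch: overlap slow data
movement with useful computation so the processor never sits idle waiting.
The load_block and compute functions below are placeholder stubs.

    import queue
    import threading
    import time

    def load_block(i):
        # Stand-in for a slow fetch (disk, network, or remote memory).
        time.sleep(0.1)
        return [i] * 1000

    def compute(data):
        # Stand-in for useful arithmetic on one block.
        return sum(data)

    def compute_all(blocks):
        q = queue.Queue(maxsize=2)          # double buffer: fetch stays one block ahead
        def prefetch():
            for b in blocks:
                q.put(load_block(b))        # fetch block N+1 while the consumer works on N
            q.put(None)                     # sentinel: no more blocks
        threading.Thread(target=prefetch, daemon=True).start()
        total = 0
        while (data := q.get()) is not None:
            total += compute(data)          # computation overlaps the next fetch
        return total

    print(compute_all(range(10)))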
Specter Offers Compromise on NSA Surveillance
Washington Post (06/09/06) P. A4; Pincus, Walter
Sen. Arlen Specter (R-Pa.) has modified his position on the Bush
administration's surveillance programs, proposing legislation that would
make the procurement of a warrant from a federal court optional. Specter's
move backs away from his earlier stance that the NSA's warrantless
surveillance program targeting phone calls and emails of suspected
terrorists and associates should be subordinate to the secret court
mandated by the Foreign Intelligence Surveillance Act (FISA). The proposal
states that it cannot "be construed to limit the constitutional authority
of the President to gather foreign intelligence information or monitor the
activities and communications of any person reasonably believed to be
associated with a foreign enemy of the United States." The Bush
administration has claimed that its surveillance programs have
constitutional authority in their own right, arguing that there is no need
for additional legislation. Another provision in Specter's bill would
grant immunity to anyone who gave the order for warrantless surveillance
under presidential authority. Also, the 29 cases contesting the legality
of the NSA program pending in federal courts would be consolidated into a
single suit that would ultimately be reviewable by the Supreme Court.
Until this point, Specter had been unsuccessful in his attempts to secure
an opinion from the administration on a constitutional review of the NSA
surveillance program. "I think he [Vice President Dick Cheney] is serious
about trying to work something out," Specter said. "For the first time, he
said they are willing to consider legislation." Sen. Dianne Feinstein
(D-Calif.) has said that she cannot support a bill that would authorize
government eavesdropping without a court order, and has introduced her own
legislation that would only permit government surveillance under the
auspices of FISA.
43rd Design Automation Conference Features Leading-Edge
Theme on Multimedia, Entertainment and Gaming
Business Wire (06/06/06)
Multimedia, entertainment, and games (MEGa) will be the subject of a
number of demonstrations during the 43rd Design Automation Conference
(DAC), which is set to take place July 24-28, 2006, in San Francisco. The
best papers on multimedia from the International Solid-State Circuits
Conference will be featured in a special session at the conference, and
will include presentations from Renesas/DoCoMo, MediaTek, National
Chiao-Tung University, and Samsung. An invited session on CAD challenges
in multimedia design and a panel on design challenges for next-generation
MEGa platforms serve as complementary events. "There are rapid
advancements in home networking, digital-mobile TV, mobile-digital
convergence, and many other consumer-centric design areas," says Andrew
Kahng, DAC New Initiatives Chair. "Our 2006 MEGa theme is going to offer a
great chance for the DAC audience to hear about the roadmap for
applications, design challenges, and CAD challenges." ACM's Special
Interest Group on Design Automation (ACM/SIGDA), the Circuits and Systems
Society and Computer Aided Network Design Technical Committee of the
Institute of Electrical and Electronics Engineers (IEEE/CASS/CANDE), and
the Electronic Design Automation Consortium (EDA Consortium) are sponsoring
the conference. For more information about DAC, or to register, visit
http://www.dac.com/43rd/index.html.
IMSC Team Sees Way to Improve Game Testing by Analyzing
User "Immersadata"
USC Viterbi School of Engineering (06/06/06)
Engineers at the University of Southern California have developed a tool
to capture and analyze the experience of a video game player to look for
flaws and weak spots in the game, and it could soon measure the player's
emotional involvement. Though essential to creating new games, user
testing is still a subjective and unstructured process. Normally, "game
companies hire teenagers, and turn them loose trying to find flaws and gaps
in the game," said Tim Marsh, a post-doctoral researcher at USC. Marsh
believes that his method, which analyzes "immersadata," is more scientific
and systematic. Immersadata refers to the machine-readable log of commands
that computers receive from controls such as keyboards and joysticks,
measured alongside video of the player playing the game. The Immersadata
AnalySIS tool, or ISIS, indexes relevant data from the video, organizing it
into six categories: activity completion points, task completion points,
break points, wandering points, critical events, and navigation errors.
Developers could identify patterns where the same errors or design flaws,
such as running into a wall (a navigation error) or a long gap in the
action (a break point), occur in the experiences of multiple players. The
system does a good job of identifying the problems it is programmed to
search for, and Marsh and his colleagues are already working on
enhancements to the system, such as an application that could replay the
game from the player's point of view. Marsh is also exploring ways to use
immersadata to capture the emotional elements of the user's experience.
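The article names the six ISIS categories without detailing how the
indexing works, so the following toy Python sketch only illustrates the
general idea on an invented log format: scan timestamped input events and
flag two of the categories, break points (long gaps in activity) and
navigation errors (repeated wall collisions).

    # Invented log format: (timestamp_seconds, event) pairs from controller input.
    log = [(0.0, "move"), (1.2, "move"), (1.9, "collide_wall"),
           (2.1, "collide_wall"), (2.3, "collide_wall"), (14.8, "move")]

    BREAK_GAP = 10.0                        # seconds of inactivity flagged as a break point

    def index_events(log):
        flags = []
        for (t0, e0), (t1, e1) in zip(log, log[1:]):
            if t1 - t0 >= BREAK_GAP:
                flags.append((t1, "break point"))       # long gap in the action
            if e0 == e1 == "collide_wall":
                flags.append((t1, "navigation error"))  # repeated wall collisions
        return flags

    print(index_events(log))

Patterns that recur at the same point across many players' logs would then
point developers at a likely design flaw.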
Making Virtual Worlds More Lifelike
CNet (06/08/06) Terdiman, Daniel
Palo Alto Research Center (PARC) researchers have been studying the social
aspects of massively multiplayer online games (MMOs) with an eye toward
developing more realistic avatars and character interactions. The
researchers say the technology could be used by video game developers to
make their games more appealing, though they note that the video game
industry tends to favor content over the quality of the experience. "When
faced with the decision, 'Do I put in another dungeon or do I improve the
experience for (groups of players)?' developers will often say, 'I'll put in
another dungeon,'" said PARC's Nicolas Ducheneaut. "I think that's
incredibly shortsighted." The team believes that MMOs could be a valuable
platform for communication and socialization, as well as gaming. They
realize that retooling the structure of games to include a greater social
dimension could be costly and time consuming, but they feel that the
commercial potential is worth it. When looking at a cantina in "Star Wars
Galaxies," a place resembling a bar where players come to get healed, the
researchers found that players were not using the space to socialize. The
PARC team's Bob Moore believes that in designing the cantina, the
developers did not bother to include the small touches that would make it a
popular gathering spot. Chris Kramer of Sony Online Entertainment, which
publishes "Star Wars Galaxies," counters that the publisher's community
relations team spends a great deal of time soliciting feedback from
players. The researchers suggest that publishers might be out of touch
with who their paying customers are, and that many do a poor job of
analyzing the data that they do collect. Moore believes that the main
problem with designing games conducive to socialization is an issue of
"interactional realism," or the behavior and mannerisms of 3D avatars, and
that to effectively simulate real life requires skills in fields such as
sociology, politics, and urban planning.
Getting Computer Grids to Talk to Each Other
IST Results (06/08/06)
While grid computing has been one of the most significant developments in
the computer industry, its success has led to the creation of numerous
systems hamstrung by their inability to communicate with each other. The
IST-funded UniGrids project set out to develop an interoperability layer to
overcome that incompatibility, a problem exacerbated by the varying
strengths of grid middleware such as Globus and UNICORE. The UniGrids
Atomic Services interoperability function allows multiple grid systems to
exchange data and facilitates the creation of new grid applications
independent of the core grid infrastructure. The UniGrids project also
built out the UNICORE system, making it compatible with the Open Grid
Services Architecture standards. The UniGrids developers also coordinated
with the Organization for the Advancement of Structured Information
Standards, the Global Grid Forum, and other major standards organizations
to help develop the emerging standards for grid systems. To capitalize on
the economic potential of grid computing, the project is also developing
brokering software to allow commercial groups to rent vast computing
resources when necessary. The scientific community will be the principal
beneficiary of the project, though. "Scientists have told us that our work
now allows them to do things they couldn't do before. Where before they
might be able to study the genes of a fruit fly, now they can do genetic
studies on humans," said Daniel Mallmann, coordinator of the UniGrids
project.
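The story does not specify the UniGrids Atomic Services API, but the
interoperability-layer idea can be sketched with a plain adapter pattern:
application code targets one abstract job-submission interface, and each
middleware gets its own adapter. All class and method names below are
invented for illustration.

    from abc import ABC, abstractmethod

    class GridJobService(ABC):
        # One shared "atomic service" interface that applications program against.
        @abstractmethod
        def submit(self, executable, args):
            ...

    class UnicoreAdapter(GridJobService):
        def submit(self, executable, args):
            # A real adapter would translate this into a UNICORE job description.
            return "unicore-job:" + executable

    class GlobusAdapter(GridJobService):
        def submit(self, executable, args):
            # A real adapter would translate this into a Globus job request.
            return "globus-job:" + executable

    def run_on_all(services):
        # Application code sees only the shared interface, never the middleware.
        return [s.submit("simulate", ["--genome", "human"]) for s in services]

    print(run_on_all([UnicoreAdapter(), GlobusAdapter()]))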
Robot Device Mimics Human Touch
BBC News (06/08/06) Morelle, Rebecca
U.S. researchers have developed a tactile sensor that feels with the same
level of sensitivity as a human finger, potentially leading to improvements
in minimally invasive surgical procedures. "If you look at the current
status of these tactile sensors, the frustration has been that the
resolution of all these devices is in the range of millimeters," said
University of Nebraska engineer Ravi Saraf. "Whereas the resolution of a
human fingertip is about 40 microns, about half the diameter of a human
hair, and this has affected the performance of these devices." By placing
electrodes at the top and bottom of a thin film made from layers of
semiconducting nanoparticles and metal, the researchers exploited the effect
known as electroluminescence: when the layers of particles are squeezed
together, the current through the film changes and light is emitted. "The
beautiful thing is that we have managed to make the device in such a way
that the amount of current change, or light, that you get out is exactly
proportional to the stress that you apply," Saraf said. The film, in
addition to matching the sensitivity of the human finger, is flexible
enough to be used repeatedly without damaging it. Saraf hopes that the
technology could enable surgeons to conduct exploratory surgeries and
determine the condition of human tissue without physically touching it.
Saraf next hopes to develop a device capable of detecting temperature
changes to emulate human sensations even more closely.
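Saraf's key claim, that the current change or emitted light is exactly
proportional to the applied stress, means a single calibration constant
converts sensor readings into pressure. The constant below is an assumed
value, purely for illustration.

    K = 2.5e-3                          # assumed calibration: current change (mA) per kPa

    def stress_from_current(delta_i_ma):
        # Linear response: stress is the reading divided by the constant.
        return delta_i_ma / K

    print(stress_from_current(0.5))     # a 0.5 mA change implies 200 kPa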
Robo Pups Vie to Be Top Dog
Age (Australia) (06/08/06) Hearn, Louisa
RoboCup is scheduled to kick off in Germany on June 14, and Australia will
be represented by teams from the University of New South Wales (UNSW), the
University of Newcastle, and Griffith University, which will join 21 other
teams from around the world in pitting
their robotic dogs against one another in a game of soccer. Each year,
participants tweak software to give their Sony AIBO robotic dogs an
advantage in wireless communication and maneuverability during matches,
which are completely independent of any human control. Ahead of the 2000
RoboCup, UNSW changed the leg angles of AIBO, giving its robotic dogs a
speed advantage as its team claimed the title that year. Brad Hall,
development manager for the UNSW computing department, says the limitations
of available hardware often present the biggest challenges for the
participants. "Robots only have one camera and thus no depth perception so
finding the ball and the goal and the other robots 30 times per second is
very challenging," says Hall, manager of the UNSW Runswift team. "It
involves two-point triangulation, a lot of guestimation, and a lot of
sanity checks." There are some concerns about the future of the
Four-Legged League because of Sony's decision to discontinue the AIBO this
year in an effort to focus more on business robotics.
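Hall's point about one camera and no depth perception is usually handled
with the pinhole-camera relation: an object's known physical size and its
apparent size in pixels give its distance. The focal length and ball
diameter below are assumed numbers, not the AIBO's actual specifications.

    FOCAL_PX = 200.0        # assumed focal length of the camera, in pixels
    BALL_DIAMETER_M = 0.08  # assumed physical diameter of the league ball

    def ball_distance(pixel_diameter):
        # Pinhole model: distance = focal length * real size / apparent size.
        return FOCAL_PX * BALL_DIAMETER_M / pixel_diameter

    print(ball_distance(20))   # a ball 20 px wide reads as about 0.8 m away

Noise in the apparent size is why Hall adds "a lot of guestimation, and a
lot of sanity checks" on top of the geometry.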
The Future of the IT Organization: Software Trends for
the 21st Century
Computer Weekly (06/07/06) Richards, Justin
Processors will combine multiple CPUs with large caches in the years to
come because of power demands, according to Andrew Herbert, managing
director of Microsoft Research in Cambridge. "A large cache on a single
chip is the only way to improve performance when Moore's Law runs out,"
Herbert said during a lecture organized by the Royal Signals Institution
and the British Computer Society in which he discussed his vision of
software practices in the future. Although software is no longer limited
by hardware, Herbert said the quality of new code must improve, along with
the tools that help maintain old code. In addition to improvements in processor power,
increased memory capacity and developments in model checking and theorem
proving will allow for more formal methods. Herbert believes everyone will
carry around a terabyte of storage space by the time he retires, and that
research on digital tapestries could lead to new ways of organizing data,
including videos and images. "We can envisage a future in which we combine
new display formats with machine perception techniques that allow input via
handwriting, gesture, touch, speech, or placement of physical objects to
create interactive surfaces," he said of the possibility of combining
microelectronics-based screen and projection technologies with wireless
networking. Artificial intelligence will gradually lose ground to
"intelligent" applications, he added.
Hacktivists Mount Counter-Offensive to Internet
Censorship
IT World Canada (06/08/06) Arellano, Nestor E.
A group of socially conscious hackers, or "hacktivists," from the
University of Toronto has developed software to combat Internet censorship
in countries with repressive governments. The software, called Psiphon and
developed at the UT Citizen Lab, allows a third-party computer to function
as a proxy, enabling Internet users to view restricted content. The
Citizen Lab is currently focused on China and other countries that impose
censorship, though it is not ignoring Western countries. "Headlines like
the Great Firewall of China have spotlighted censorship in that country and
others such as Iran and Saudi Arabia, but filtering activities in Western
states or so-called democratic countries frequently fly under the radar,"
said Ron Deibert, head of the Citizen Lab. The researchers must act like
covert agents in their work as they coordinate with dissidents in
repressive countries where discovery is a constant danger. "Identities and
locations are kept secret and information is compartmentalized, just as any
spy agency would do because in most instances lives are at stake," Deibert
said. China maintains an elaborate system of routers and gateways, using
advanced technology to control its citizens' Internet activity. Internet
activists in nonrestrictive countries install Psiphon on their computers
and create a list of trusted users in repressed countries who can use the
computer's IP address to access banned Web sites without being detected.
Psiphon encrypts its traffic, which travels over the same type of secure
connection typically used by financial institutions.
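Psiphon's own code is not shown in the article; the sketch below is only a
conceptual Python illustration of the relay idea it describes, in which the
trusted host fetches a requested page on a remote user's behalf. A real
deployment would add the encrypted transport and the trusted-user list.

    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import urlparse, parse_qs
    from urllib.request import urlopen

    class ProxyHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Request form: /fetch?url=http://example.org/banned-page
            target = parse_qs(urlparse(self.path).query).get("url", [None])[0]
            if target is None:
                self.send_error(400, "missing url parameter")
                return
            body = urlopen(target).read()   # the trusted host does the fetching
            self.send_response(200)
            self.end_headers()
            self.wfile.write(body)          # relay the page back to the user

    # The censored user sees only traffic to a trusted friend's machine.
    HTTPServer(("127.0.0.1", 8443), ProxyHandler).serve_forever()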
Flesh and Machines
Weekly Dig (06/06/06)
When artificial intelligence failed to realize the lofty visions heralded
by its pioneers in the 1950s and 60s, the discipline entered a long cooling
period when funding dried up and many people lost interest. Though the
boom years of the 1990s breathed new life into the fledgling industry,
Rodney Brooks, director of MIT's Computer Science and Artificial
Intelligence Laboratory, had been able to thrive in the lean years by
recasting the objectives of artificial intelligence as a whole. Brooks
realized that it was fruitless to try to simulate human behavior in a
machine, and instead set out to model organisms such as amoebae and
insects. Brooks, who co-founded iRobot, created devices that took their
cues from actual perceptions of the real world instead of theories, and the
robots began to function. In 1997, the Mars Sojourner rover
autonomously explored the planet's surface according to its own
self-generated agenda. Brooks envisions the eventual convergence of
robotics and biotechnology, beginning with corrective devices for the
disabled. Brooks has also conceived of an interface that links human
neurons with the Internet, though he eschews the formulaic Hollywood notion
of robots becoming so powerful that they can dominate and enslave humans.
For the foreseeable future, Brooks believes that robotics will be confined
to helper robots that perform everyday tasks. He notes that manual labor
has not experienced a revolution similar to the effect that the
introduction of the PC had on knowledge workers 20 to 30 years ago, when
many repetitive and time-consuming tasks were automated. Developments in
graphical user interfaces have almost made that revolution a reality,
Brooks says, adding that prices just need to drop for robots to be deployed
on a much greater scale throughout industry.
Nation's Largest Computing Grid Announces Inaugural
Conference
AScribe Newswire (06/06/06)
Scientists and researchers from across the country will present some of
the work being done on the largest computer grid in the United States
during the national TeraGrid conference in Indianapolis June 12-15. The
TeraGrid takes advantage of the computing power and resources of the San
Diego Supercomputer Center, Texas Advanced Computing Center, University of
Chicago-Argonne National Laboratory, National Center for Supercomputing
Applications, Purdue University, Indiana University, Oak Ridge National
Laboratory, and the Pittsburgh Supercomputing Center. The information
technology resources available to researchers include more than 100
teraflops of computing power, which is comparable to the computing capability of
28,000 desktop computers. "In addition to providing some of the most
powerful computing resources in the world and the high-speed networking to
make them accessible, the TeraGrid is working to make existing data
collections easily accessible online to serve entire research communities,"
says Scott McCaulay, conference program co-chair and TeraGrid site leader
at Indiana University. TeraGrid has recently been used to simulate a 7.7
magnitude earthquake along the San Andreas Fault, forecast thunderstorms
within 20 miles and within 30 minutes of when they occur, create 3D
animated visualizations of blood flowing through arteries, and to develop
computer models for the spread of avian influenza.
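A quick back-of-envelope check of that comparison, assuming the desktops in
question are typical 2006-era PCs:

    total_flops = 100e12            # "more than 100 teraflops"
    desktops = 28_000
    print(total_flops / desktops)   # ~3.6e9, i.e. roughly 3.6 gigaflops
                                    # per desktop, plausible for a 2006 PC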
Domain Name Price Hikes Come Under Fire
CNet (06/07/06) McCullagh, Declan; Broache, Anne
In March, VeriSign received ICANN's blessing to raise fees on .com domains
by 7 percent yearly, a move that has provoked a backlash in Congress.
Members of the House Small Business Committee described the price hikes as
unreasonable at a Wednesday hearing, and ICANN's decision must be approved
by the Commerce Department before it can go into effect. Domain registrars
that sell .com names have been lobbying fervently against the VeriSign deal
because it entails higher prices for them. "I have no objection to
VeriSign's continuing to run the .com registry," said Network Solutions CEO
W.G. Mitchell at the hearing. "What I do have is an objection to it being
done in a manner that gives a perpetual monopoly to a company with
unregulated price increases." Mitchell says VeriSign could potentially
reap $1.3 billion in new revenue through the price hikes, and more than 50
percent of that amount would come from 10.5 million small businesses that
use the Internet; he added that the sanction of rate hikes is not in
keeping with the deal reached with VeriSign last year over the .net
registry, which lowered the domain name base price. Rep. Rick Boucher
(D-Va.) has already asked the Bush administration to scrutinize the
proposal because it could have "serious anti-competitive implications," and
last month Sen. Orrin Hatch (R-Utah) requested an investigation by the
Justice Department. ICANN and VeriSign have defended the deal, citing the
issue of Internet security, and ICANN general counsel John Jeffrey said
the agreed-upon percentage gains mean that .com fees would increase just
$1.86 in the next six years. New laws must be enacted for Congress to
block the approval of the settlement, but the White House could attempt to
negotiate new concessions with just the threat of congressional action
looming.
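Jeffrey's $1.86 figure is consistent with simple compounding, assuming (per
contemporaneous coverage) a $6.00 base fee and 7 percent increases permitted
in four of the six years:

    fee = 6.00                          # assumed .com registry base fee in 2006
    for _ in range(4):                  # four permitted 7 percent increases
        fee = round(fee * 1.07, 2)
    print(fee, round(fee - 6.00, 2))    # 7.86 1.86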
Data Grab
InformationWeek (06/05/06) No. 1092, P. 23; Greenemeier, Larry; Claburn,
Thomas; Hoover, J. Nicholas
As the federal government requests or demands more customer information
from businesses, ostensibly to battle terrorists and criminals, people are
becoming increasingly worried that the personally identifiable data
contained in such information could be exploited and abused. In addition
to the ethical quandary, many companies face additional costs, such as
money and manpower, in order to retain data for longer periods of time in
case it is called for in federal investigations. The government makes
requests for data via subpoenas or "national security letters," which are
used liberally, according to a 2005 Washington Post article. These
requests can entail significant costs for recipients, although the Justice
Department does not keep an account of those costs. Sometimes companies
will refuse to comply with government data requests, as Google did. The
company's refusal led to a court ruling that reduced the amount of data
Google had to share with Justice, and illustrated "that neither the
government nor anyone else has carte blanche when demanding data from
Internet customer," wrote Google associate general counsel Nicole Wong.
There are fears that the business and Internet data being collected by
government agencies will be fed into a database from which citizen profiles
can be extracted. Another source of concern is skepticism that the
government can effectively secure the data it collects.
On the Right Track
New Scientist (06/03/06) Vol. 190, No. 2554, P. 32; Kleiner, Kurt
A new search engine model is required to handle the massive, ever-growing
volumes of digital music tracks online, and universities and companies are
working on software that can acoustically analyze any music it hears. This
ability could be enabled through algorithms that can render a signal as a
blend of sine waves with distinct frequencies and amplitudes, and that can
study note patterns to determine attributes of melody, tempo, and harmony
in order to assign genre to a piece as well as facilitate comparison of
different tracks and characteristic-based groupings. MusicIP's MusicIP
Mixer software can scan the music in a user's personal library and then
organize playlists based on any song the user cares to select, making
recommendations for similar-sounding tracks; the program's performance
rides partly on a reference database of 17 million songs, according
to MusicIP CEO Matthew Dunn. Another company, Pandora, wants to make a
search engine capable of finding tracks a listener wishes to hear by
classifying songs according to a vaster range of parameters. "The best way
to describe it is a musical description, not someone's opinion about how
good a song is," explains Pandora founder Tim Westergren. Attributes are
assigned to each song by trained musicians, since computers cannot analyze
music with the sophistication of the human ear. The Semantic Interaction
with Music Audio Contents (SIMAC) project aims to enable computers to more
effectively pick out complex characteristics such as style, tempo, and
rhythm, and to assess how combinations of chords or notes generate
harmonies through a mix of human cognition studies, music analysis, and
artificial intelligence.
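The first step described above, rendering a signal as a blend of sine waves
with distinct frequencies and amplitudes, is classic Fourier analysis; the
minimal numpy sketch below recovers the two tones in a synthetic signal.

    import numpy as np

    rate = 8000
    t = np.arange(rate) / rate                  # one second of samples
    signal = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 660 * t)

    spectrum = np.abs(np.fft.rfft(signal))      # amplitude of each sine component
    freqs = np.fft.rfftfreq(len(signal), 1 / rate)
    strongest = freqs[np.argsort(spectrum)[-2:]]
    print(sorted(strongest))                    # [440.0, 660.0]

Higher-level attributes such as tempo and harmony are then inferred from
patterns among components like these, which is the harder part the SIMAC
project is tackling.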
Netcentric in a Snap
Government Executive (06/01/06) Vol. 38, No. 9, P. 48; Perera, David
The Pentagon envisions a network-centric battlefield of the future where
information awareness is used to keep forces coordinated and organized in
the face of rapidly shifting circumstances, environments, and foes.
Military planning, training, equipment, technology, and principles must
adapt to this new paradigm, while backers of the service-oriented
architecture (SOA) software development framework say the code underlying
netcentricity must become equally flexible. "We believe this is the best
way, and quite frankly, the only way going forward, in terms of addressing
the complexity we're dealing with," explained Rob Vietmeyer of the Defense
Information Systems Agency (DISA) at a March industry conference. "The
models that we used in the past for systems development aren't going to cut
it." Complex processes can be broken down into smaller, simpler, and
modular components with SOA software, eliminating the need for
specially-tailored middleware to function as an intermediary between
incompatible programs. Thus, SOA allows data to migrate horizontally
across networks, going to people who need it faster and enabling the fast
construction of new data applications in response to changing battlefield
conditions. Key to employing SOA interoperability is the military's
willingness to cede authority for data collection, and RABA Technologies
chief scientist John Reel says the interdependence this implies is a
tough challenge in life-or-death situations. Other obstacles to
battlefield netcentricity via SOA include the difficulty in articulating
the development framework's advantages to non-technical people, whose
support is often critical for investment.
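The article stays at the doctrine level, but the SOA idea it describes can
be sketched in a few lines: each capability is exposed behind a standard
interface (plain HTTP and JSON here), so any authorized consumer can use it
without custom middleware. The service and its fields are invented for
illustration.

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class PositionService(BaseHTTPRequestHandler):
        # One modular service: publishes unit positions over a standard protocol.
        def do_GET(self):
            payload = json.dumps({"unit": "alpha", "lat": 38.89, "lon": -77.03})
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(payload.encode())

    # Any consumer that speaks HTTP and JSON can read this feed directly,
    # and new applications can compose such services without bespoke glue.
    HTTPServer(("0.0.0.0", 8080), PositionService).serve_forever()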
Inside the Spyware Scandal
Technology Review (06/06) Vol. 109, No. 2, P. 48; Roush, Wade
Sony BMG's inclusion of a "rootkit" on its compact discs enabled the
company to spy on its customers while giving hackers an exploit through
which they could hijack people's computers, and has become symbolic of the
increasing distrust media companies seem to be exhibiting toward consumers.
This distrust threatens to strangle business as consumers view these
companies with equal suspicion. Analysis of the rootkit the record
company used to conceal its copy-protection software on CDs, thwarting the
software's detection and removal, showed that it could mask other files,
such as worms and Trojans, just as easily. Playing the CD on a computer
allowed such files to be installed in secret; and indeed, hackers
devised malware to exploit the Sony BMG rootkit shortly after its existence
was publicized. The scandal this revelation ignited has re-opened the
debate on how consumers should be permitted to use copyrighted digital
information, and just how far copyright holders should be allowed to go to
protect their intellectual property from unauthorized duplication. "When
you build computer systems where you're not protecting the user, but
something from the user, you have very bad security," argues Counterpane
Internet Security CTO Bruce Schneier. Princeton University computer
scientist J. Alex Halderman alleges that the rootkit's designers must have
known that malware writers were familiar with the masking technique they
were using. Computer security professionals say the debacle points to the
need for digital rights management (DRM) software that is transparent and
computer friendly, respectful of users' privacy and security, user
serviceable, and above all, flexible.
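One widely reported detail of the Sony BMG rootkit was that it cloaked any
file whose name began with "$sys$", which doubled as a quick manual test for
infection. The Python sketch below automates that check; it is a simplified
illustration, and on a clean machine the probe file simply stays visible.

    import os

    PROBE = "$sys$probe.txt"            # the rootkit hid names with this prefix

    with open(PROBE, "w") as f:
        f.write("visibility probe")

    visible = PROBE in os.listdir(".")  # cloaking hides it from directory listings
    print("probe visible:", visible)    # False would suggest the rootkit is active

    os.remove(PROBE)                    # clean up the probe file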