Tempting Data, Privacy Concerns
New York Times (08/23/06) P. C1; Hafner, Katie; Zeller Jr., Tom
The three months' worth of search data inadvertently released to the
public by AOL researchers poses a conundrum for researchers such as Cornell
University computer science professor Jon Kleinberg: They could sift
through a dataset that could offer academic researchers an unprecedented
glimpse into how people use the Web to retrieve information, or ignore the
data out of respect for the users' individual privacy. Kleinberg, whose
research focuses on algorithms for understanding and searching the Web,
downloaded the data immediately, but decided against using it due to
privacy concerns. The breach shines a spotlight on the long-running
frustration of academic researchers that raw data about Internet usage is
extremely difficult to come by, accessible only to a cadre of corporate
researchers working at the large companies where the data is locked up.
Many researchers claim that the data, which details the search queries of
some 650,000 AOL users, is too valuable to ignore. The users are not
personally identified in the data, but in some cases the search terms
reveal enough information to infer an individual's identity. AOL moved
quickly to take the data off of its research Web site, but numerous other
sites had already downloaded the data, reposted it, and made it searchable.
Academia, excluded from the fresh datasets routinely made available to
researchers at companies such as Google, has in essence made do with the
AltaVista and Excite datasets for almost a decade, though they shed scant
light on the habits of today's users. "The way people use search engines
now is totally different," Kleinberg said. "Partly because what you
expected to get out of a search engine back then was much less, so people
didn't try anything too fancy." Everyone can agree that protecting privacy
is important, said Jamie Callan, an associate computer science professor at
Carnegie Mellon University and chairman of the ACM's special interest group
on information retrieval. But, Callan claims, "there's also a strong
belief that it is very important for the scientific community to have
access to data of this kind in some anonymized form."
Wireless Pioneer Viterbi Headlines ACM MobiCom '06
AScribe Newswire (08/21/06)
ACM's Special Interest Group on Mobility of Systems Users, Data, and
Computing (ACM SIGMOBILE) is sponsoring MobiCom'06, which takes place at
the Marina del Rey Marriott in Los Angeles, Sept. 24-29, 2006. The 12th
annual International Conference on Mobile Computing and Networking will
draw researchers, industry professionals and executives, as well as
students from around the world; a panel will take an in-depth look at
wired and wireless access. Workshops on wireless and sensor network
dependability and security, underwater networks, vehicular ad hoc
networking, and wireless mobile services on local area network hotspots are
also planned for the gathering. Wireless trailblazer Andrew Viterbi, an
electrical engineering professor at the University of Southern California
who co-founded Qualcomm, will discuss his expectations for wireless
communications in the years to come in his keynote presentation on Tuesday,
Sept. 26. The same day, ACM's Athena Lecturer Deborah Estrin, a computer
science professor at the University of California, Los Angeles, will
present "Wireless Sensing Systems: From Ecosystems to Human Systems." The
panel, "Wired Vs. Wireless Access: the Race to Higher Speeds," is scheduled
for Thursday, Sept. 28, and will include executives from Verizon Wireless,
Ericsson, Intel, Cisco, T-Mobile, Cingular, AT&T Research and Sprint as
participants. MobiCom'06 will also feature papers, demonstrations,
exhibits, and an ACM Student Research Competition for students at the
undergraduate and graduate level.
Geeks Pray $100,000 Box Will Solve Software Crisis
Register (UK) (08/22/06) Vance, Ashlee
As the computing industry looks down the road to architectures with four,
eight, and even hundreds of cores per chip, fewer and fewer researchers
will have the resources to experiment with new software models on the
complex new designs. To address that problem, a group of researchers has
developed the RAMP (Research Accelerator for Multiple Processors) system--a
relatively inexpensive FPGA-powered machine consisting of 1,000 nodes that
can serve as a testbed for bleeding-edge system designs. "Little is known
on how to build, program, or manage systems of 64 to 1,024 processors, and
the computer architecture community lacks the basic infrastructure tools
required to carry out this research," wrote the RAMP researchers.
"Fortunately, Moore's Law has not only enabled these dense multicore chips,
it has also enabled extremely dense FPGAs." Today, a single FPGA can
accommodate up to two dozen cores, enabling the exploration of large,
sophisticated architectures. The RAMP project has received unusual
attention because it is partially led by David Patterson, the famed
University of California, Berkeley professor who was a driving force behind
the initial deployment of RISC designs. With multicore designs taking
over, Patterson sees the gulf between hardware and software designers
widening. "What's wrong with the multicore change is that no one is ready
for it," he said. "The pieces of the software stack are not ready for
thousands of CPUs per chip." Patterson argues that software developers
cannot take their customary approach of waiting until ample hardware floods
the market before adjusting their programming tactics. The performance of
today's code will suffer from having to crawl across numerous low-power
chips, and the problem will affect algorithms, programming languages,
compilers, libraries, and operating systems. Patterson says the RAMP
system should give developers a fairly accurate idea of how many clock
cycles a given operation takes to complete.
GNOME and Google Reach Out to Women
NewsForge (08/21/06) Brockmeier, Joe
When the GNOME project received 181 applications for Google's Summer of
Code (SoC) program, and all of them were from men, GNOME's Chris Ball and
Hanna Wallach proposed that some of Google's SoC money be used to reach out
directly to women. The result was GNOME's Women's Summer Outreach Program
(WSOP), which has more than twice as many projects underway as originally
planned, thanks to Google doubling its funding for the project.
Even though no women applied for the SoC program, WSOP received more than
100 applications, as well as more than 200 email messages from women who
wanted to contribute to GNOME somehow but did not have the necessary coding
skills. The call for WSOP applications reached a wide audience of women in
universities, computing groups, and online, whereas many of the applicants
had not heard of the SoC program. Confidence is also an issue for women,
Wallach said. "Many of the women who contacted us expressed concern about
their coding skills, yet were extremely well-qualified. Google's 'prove
you're the best person for the job' attitude may be off-putting to people
who aren't entirely confident in their skills," she said. In response to
questions about why GNOME would pursue women more enthusiastically than
men, Wallach says that because the majority of the world's population is
female, GNOME is initially targeting the group that could have the biggest
impact. Wallach hopes that GNOME's outreach efforts will serve as an
example for other free software projects. Already, several projects are
underway to boost female participation in Linux, Debian, Ubuntu, and other
open-source projects.
The "Data Valdez" Versus the Privacy Ceiling
The Flowing Candy Bees (08/12/06)
With a group of researchers preparing to present a paper on the economic
limitations of privacy violation at the ACM 2006 DRM workshop, which takes
place October 30, 2006, in Alexandria, Va., the exposure of the search
queries of some 658,000 AOL users seems almost prescient. The concept,
known as the privacy ceiling, argues that forward-looking companies would
scrupulously guard against privacy violations due to the liability of
amassing large repositories of sensitive information. Liability can come
from many sources, such as vicarious infringement, for which a company can
be held liable if it can be proven that infringement occurred, that the
company benefited from it, and that the company could have stopped it.
Librarians reacting to the Patriot Act purged the records of their patrons'
reading habits, creating their own privacy ceiling. Similarly, companies
can be liable from their customers for privacy violations. AOL's case,
which appears to have involved a simple miscalculation by a few employees,
illustrates the principle that companies can limit their liability by
reining in their data-collection practices. To curb the potential
liability from disclosing customers' data, the authors of the report
recommend that companies implement architectures with built-in monitoring
capabilities to safeguard sensitive data. They go a step further and
advise companies to actually control their users' activities to the fullest
extent that their architectures will allow. Finally, the authors recommend
that companies build their systems around privacy alone, rather than trying
to balance the demands of copyright holders.
Chips Promise to Boost Speech Recognition
CNet (08/23/06) Shankland, Stephen
Carnegie Mellon University researcher Rob Rutenbar believes the key to
making speech recognition a practical reality is using a custom computer
chip. At this week's Hot Chips conference in Palo Alto, Calif., Rutenbar
said, "It's time to liberate speech recognition from the unreasonable
limitations of software." A custom processor could be used for speech
recognition similar to the way in which special-purpose hardware has been
used for graphics. Speech recognition software continues to be plagued by
speed limitations and power demands. The "in silico vox" project at
Carnegie Mellon consists of custom ASICs and FPGAs. Rutenbar provided a
videotaped demonstration of Carnegie Mellon's speech recognition technology
using a low-end FPGA, and it recognized short sentences about two times as
fast as researchers were able to speak them and matched the university's
Sphinx speech recognition software in accuracy. First-generation custom
chips are expected to be about twice as fast as the rate of regular speech
for a 5,000-word vocabulary. One custom chip is being developed to work at
10 times the spoken rate, and there are plans to reach speed factors of 100
and 1,000.
Robotics Team Rolls Out Ballbot at Carnegie Mellon
Pittsburgh Post-Gazette (08/23/06) Templeton, David
Researchers at Carnegie Mellon University have developed a robot that can
travel across a room balanced atop a urethane-coated aluminum sphere. The
Ballbot is a self-contained robot that can travel in any direction,
compactly swiveling and turning without falling down. While there are
still many kinks left to work out, CMU research professor of robotics Ralph
Hollis said that it demonstrates that robots can capably operate with just
one wheel. Humanoid robots built with human-like legs are stable, but too
expensive and complex for practical use in the home or office. When he was
seeking to develop an alternative to the clunky three- or four-wheeled
robots that are prone to tipping over when they accelerate too quickly or
travel on ramps, Hollis concluded that a ball would be the best method of
propulsion, though he still had to figure out how to make a tall, thin, and
relatively heavy robot balance atop a soccer-ball-sized sphere.
Fiber-optic gyroscopes give the Ballbot an internal sense of balance by
measuring inertia, pitch, and roll angles, sending out hundreds of signals
every second to a computer that controls the rollers that turn the ball,
ensuring that the device is always in position to stand or roll, but not
tip over. "It's more stable than the typical robot," Hollis said. "It
doesn't like to tip over." Hollis is also working on a project involving
haptics, using techniques such as magnetic levitation to endow the computer
with a sense of touch. He kept the hardware as simple as possible,
shifting most of the Ballbot's workload to the software. The robot carries
the heaviest equipment at the top, drawing on the same principle that makes
it easier to balance a broom by holding the end without the bristles in
one's hand.
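The feedback loop described above, in which gyroscope readings drive the rollers that move the ball, can be illustrated with a minimal proportional-derivative sketch on a linearized inverted-pendulum model. All gains and constants here are illustrative assumptions, not CMU's actual controller:

```python
# Linearized inverted-pendulum model of the Ballbot's tilt dynamics:
# uncontrolled, the tilt angle grows; a PD law on the measured tilt
# and tilt rate accelerates the ball under the body to stay upright.
G_OVER_L = 14.0      # gravity / effective pendulum length (1/s^2), assumed
KP, KD = 40.0, 8.0   # proportional and derivative gains, assumed
DT = 0.005           # 200 Hz control period ("hundreds of signals per second")

def roller_accel(tilt, tilt_rate):
    """Ball acceleration commanded from measured tilt and tilt rate."""
    return KP * tilt + KD * tilt_rate

def simulate(tilt0=0.1, steps=2000):
    """Integrate the closed loop for 10 seconds; return the final tilt."""
    tilt, rate = tilt0, 0.0
    for _ in range(steps):
        u = roller_accel(tilt, rate)
        # Accelerating the ball toward the lean reduces the tilt.
        accel = G_OVER_L * tilt - u
        rate += accel * DT
        tilt += rate * DT
    return tilt
```

Starting 0.1 radian off vertical, the controller drives the tilt back toward zero rather than letting the body fall.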
US Government Lab Offers Grid Computing Toolkit
IDG News Service (08/22/06) Mullins, Robert
The Department of Energy's Argonne National Laboratory has developed an
open-source toolkit that harnesses the power of grid computing to improve
scientists' ability to collaborate remotely via the Internet. The Access
Grid Toolkit will enable real-time communication among scientists
throughout the world by developing programs that can share video, audio,
data, and text. Now in its third version, the toolkit supports wall-sized
displays and features detailed visualizations of simulations, a
consolidated user interface, and other enhancements. The toolkit has been
downloaded more than 20,000 times in 56 countries since its first offering.
This version of the toolkit, which is written in Python, offers improved
middleware that promises to make it easier to write applications to run on
the Access Grid, according to Thomas Uram, technical lead for the project.
The new version also includes some of the most commonly used Internet
protocols, such as SSL for security, FTP for transferring data, Jabber for
instant messaging, and XML for data description. InSORS Integrated
Communications has used the toolkit as a springboard for commercial
applications, including a research program at the National Institutes of
Health focusing on allergies and infectious diseases.
Capturing Online Video Pirates
Technology Review (08/22/06) Roush, Wade
Popular online video-sharing sites have been fighting a losing battle as
they try to curb the posting of material copied from movies and commercial
TV broadcasts without the permission of copyright holders. For the most
part, these services have been removing unauthorized content when they
receive a complaint from the copyright holder, after it has already been
posted. However, new technologies are emerging that can preemptively
ferret out copyrighted material, such as a system in use on the
video-sharing site Guba that compresses a video file to a mathematical
expression and compares its "fingerprint" with a database of commercial
videos, excluding any matches from the site. Another technology embeds a
watermark in movies, enabling studios to trace a bootlegged camcorder copy
back to the specific theater and showing where it was recorded.
While fingerprinting may not be able to keep pace with the volume of TV
programming broadcast every day, and watermarking does not actually catch
pirates, the techniques could be a valuable defense for video-sharing sites
against the same type of legal challenges that brought down Napster and
MP3.com. Roughly one-fifth of movie content on video-sharing sites is
pirated, according to Guba founder Tom McInerney. While pirated content
can drive Web traffic and create advertising views, it can also attract
unwelcome legal attention. The nuisance of dealing with a steady stream of
takedown requests compelled Guba to develop its fingerprinting system,
which uses wavelet technology to condense the video into compact
mathematical representations. Computer vision technology measures the
frequency of scene changes to generate a form of time stamp. The system
screens every video uploaded to Guba, singling out those that match a
fingerprint in the database for human review. The system identifies
pirated content with a 99 percent accuracy rate, McInerney says.
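A toy version of the fingerprint matching described above might look like the following. The brightness-difference heuristic stands in for Guba's wavelet analysis, and exact matching stands in for the real system's tolerant comparison; all names and thresholds are assumptions:

```python
def scene_change_times(frame_brightness, threshold=30):
    """Record the frame indices where brightness jumps sharply, a crude
    stand-in for measuring the frequency of scene changes as a time stamp."""
    return tuple(i for i in range(1, len(frame_brightness))
                 if abs(frame_brightness[i] - frame_brightness[i - 1]) > threshold)

def matches(fingerprint, database):
    """Return the titles of known commercial videos whose fingerprint
    equals the uploaded video's fingerprint (exact match here; a real
    system would tolerate re-encoding artifacts)."""
    return [title for title, known in database.items() if known == fingerprint]
```

An upload whose fingerprint matches a database entry would be singled out for human review rather than published automatically.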
Computers Take Over 2006 Minn. Elections
Associated Press (08/21/06)
Minnesota will count all votes cast in the state's upcoming elections
using electronic machines from Election Systems & Software and Diebold.
State officials are confident that Minnesota will not encounter any
problems in accuracy or tampering of the election results, although
electronic voting has its critics. Secretary of State Mary Kiffmeyer says
the 2004 election had an error rate of nearly zero for ballots counted
electronically, compared to an error rate of about 1.5 percent for ballots
counted by hand. Minnesota plans to retain original paper ballots and
randomly choose precincts for mandatory hand recounts as safeguard
measures. Errors and poorly written code have turned up, but analysts have
not uncovered any attempt to plant malicious commands in software to fix a
vote, says Brian Phillips, president of SysTest Labs, which serves as an
independent testing authority for voting machine software. Nonetheless,
critics say there are a number of examples of faulty programming,
mechanical failure, and human error having an impact on elections. "The
security standards are practically worthless, as is the certification
process," says David Dill, a professor of computer science at Stanford
University. "At the end of the day, that computer is no more trustworthy
than if you had one person count all the ballots with nobody watching."
Software Takes You Into World of Images
Seattle Times (08/21/06) Romano, Benjamin J.
Microsoft researchers working at the company's recently created Live Labs
division have developed a software program that cobbles together photo
tourism software, imaging applications, and a new display technology to
create an immersive, interactive tour of a remote location. Live Labs was
created to quickly develop new products in an attempt to compete with
Google and Yahoo in the arena of Web applications. A prototype for
Photosynth, for example, appeared only a few months after research began.
"It's kind of a new method for us for developing software and we're pretty
excited about the nimbleness that it will give us," said Microsoft's Adam
Sheppard. Live Labs is headed by Gary Flake, the architect of Yahoo
Research Labs before he was hired away by Microsoft. Photosynth, the first
project to emerge from Live Labs, begins with a collection of images of a
certain place and organizes it with the photo tourism software. The system
builds a basic 3D map of the space by analyzing the features that the
pictures have in common. The user can fluidly navigate through the images,
and clicking on a given feature will bring up other photos taken that
contain the same element. The software also allows users to add
annotations to images, and automatically applies one annotation to all
images that contain the same feature being described. The digital world
created through Photosynth could be linked to the point where the software
could recognize in a picture of someone's refrigerator a postcard from
Notre Dame, for instance, and, treating that image as a Web link, jump off
to a larger set of images of the cathedral. By only streaming data as it
is needed, the software conserves computing resources. While Microsoft has
yet to formulate a business model around Photosynth, the technology could
be used to enhance its Windows Live Local mapping software.
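The feature-linking behavior described above can be sketched with a simple inverted index. The photo names and feature labels are hypothetical, and the feature extraction itself is out of scope:

```python
# Each photo is tagged with the visual features it contains (hypothetical
# labels). Clicking a feature retrieves every photo sharing it, and one
# annotation is propagated to all images containing the same feature.
photos = {
    "img_001.jpg": {"rose_window", "facade"},
    "img_002.jpg": {"facade", "towers"},
    "img_003.jpg": {"rose_window", "postcard"},
}

def photos_with(feature):
    """All photos containing the given feature, in a stable order."""
    return sorted(p for p, feats in photos.items() if feature in feats)

def propagate_annotation(feature, note, annotations):
    """Apply one annotation to every image that contains the feature."""
    for p in photos_with(feature):
        annotations.setdefault(p, []).append((feature, note))
    return annotations
```

The "refrigerator postcard" jump described in the article amounts to following such a feature link from one photo collection into a much larger one.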
FTC to Examine Net Neutrality
IDG News Service (08/22/06) Gross, Grant
An Internet Access Task Force organized by the Federal Trade Commission
(FTC) will study whether Net neutrality proponents are justified in their
concern that large broadband providers will impede Web content from
rivals. Public Knowledge President Gigi Sohn said her group
welcomed this turn of events, noting that "We certainly look forward to the
analysis of an agency that exists to protect competition of the broadband
market in which 98 percent of customers receive their service from either
the telephone company or the cable company, if they have that choice at
all." On Monday, FTC Chairwoman Deborah Platt Majoras urged lawmakers to
exercise caution when considering Net neutrality legislation, as it could
ban broadband providers from charging Web sites higher fees for faster
service or prioritizing their own Internet content. In a speech at the
Progress & Freedom Foundation's Aspen Summit, Majoras said she had no
doubts about Net neutrality advocates' sincerity, but was questioning "the
starting assumption that government regulation, rather than the market
itself under existing laws, will provide the best solution to a problem."
Still, she promised that her agency will probe cases of discrimination by
broadband providers. In November the FTC will host a conference that
examines protecting consumers in the converged technologies era, with an
emphasis on trends, applications, products, services, and technology issues
expected to emerge in the next 10 years, according to Majoras.
2006 Horizon Awards Winner: Stanford University's Password Hash
Computerworld (08/21/06) Collett, Stacey
Seeking to stem the proliferation of phishing scams, researchers at
Stanford University have developed a technique that prevents a stolen
password from being used to access an authentic site. "Internet users
often use the same password at many sites," said Dan Boneh, an associate
professor of computer science and electrical engineering at Stanford. "A
phishing attack on one site will expose their passwords at many other
sites." The Anti-Phishing Working Group identified nearly 12,000 malicious
phishing sites in May 2006, up from 3,300 sites just one year earlier. The
technique, known as Password Hash, or PwdHash, simply adds "@@" to the
beginning of a password that a user types out when registering on a Web
page. That combines the password with the site's domain name in an
algorithm that creates a customized password that will not work even if it
is stolen and typed verbatim. Adding a cryptographic hash is not a new
idea, but the novel part of the researchers' work was to make it so easy
for end users to apply. Three years ago, Secret Service agents told
Stanford's engineering and computer science department that phishing was
the most serious threat that the researchers could help with. The most
difficult part of developing the application was training the software in a
browser to know when a site was prompting for a password. Today the
software is freely available, and versions exist for the Internet Explorer
and Firefox browsers.
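The mechanism described above can be sketched in a few lines of Python. This is a simplified illustration of domain-salted password hashing, not the actual PwdHash construction; the function name, hash choice, and encoding are assumptions, and the "@@" trigger and browser integration are omitted:

```python
import base64
import hashlib
import hmac

def site_password(password: str, domain: str) -> str:
    """Derive a site-specific password by hashing the user's password
    together with the site's domain (simplified sketch; the real PwdHash
    construction differs in its details)."""
    digest = hmac.new(password.encode(), domain.encode(), hashlib.sha256).digest()
    # Truncate and encode so the result is typeable as a password.
    return base64.b64encode(digest)[:12].decode()
```

Because the derived password depends on the domain, the string captured by a phishing site will not match the password the real site expects.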
Knocking Down the Barriers to the $100 Laptop
eWeek (08/21/06) Lundquist, Eric
The One Laptop per Child group is close to clearing the most significant
technology obstacle to its $100 laptop: the display. The screen needs to
be durable, inexpensive, and readable in both dim and brightly lit
conditions. In commercial products, it often costs more to replace a
broken display than to purchase a whole new notebook. But Mary Lou Jepsen,
CTO of the One Laptop per Child initiative, claims to have developed a
display that existing LCD factories can mass produce that will have a
higher resolution than 95 percent of the laptops commercially available
today. The display, which can be read in sunlight or in a room without
backlighting, consumes just one-seventh of the power required by
traditional screens. Such a display could have a major impact on
commercial applications. Jepsen described the technology in a recent
interview. Noting that new research usually takes 20 years to travel from
the lab to the market, Jepsen said that she opted for a variation of LCD
production techniques to bring the device into mass-production in record
time. The technology is already in place for every other aspect of the
laptops, Jepsen said, noting that she hopes to ship 5 million to 10 million
of the devices next year, and 50 million to 100 million the year after.
Jepsen reexamined the cost structure of the LCD and, when considering the
needs of the laptop's users, found that many spend a lot of time outside
with sparse access to electricity. She developed two modes of display: one
that is color and backlit, with a maximum power consumption of 1 W; the
other is black and white, reflects sunlight, and draws at most 0.2 W. The pixel
layout was changed to diagonal stripes of color so that the panel's
resolution could be adjusted both vertically and horizontally. Also,
Jepsen did away with a considerable amount of the costly interface
electronics and color filters.
Can US Control Over the Web Be Untangled?
SDA Asia (08/17/06) George, Priya
The U.S. government has renewed its contract with ICANN to govern the
Internet, delegating ICANN exclusive rights to manage the IANA function of
the Internet until 2011. The contract is subject to review and renewal
each year. Many in the international community have called for an
impartial, global organization such as the United Nations' Working Group on
Internet Governance to take over stewardship of ICANN, and thereby the
Internet. While the U.S. Department of Commerce launched ICANN with an
advertised plan to privatize it eventually, the renewed contract means this
will not happen until after 2011, if ever. The U.S. government has made
recent statements that it is interested in ceding control of ICANN, but
these have not been echoed by ICANN's current overseer, the U.S. Commerce
Department. Commerce official John Kneuer says the department is trying to
ensure the Internet's stability, and that the IANA function is
"extraordinarily technical in nature, and very explicitly tied to security
and stability." GoDaddy vice president of corporate development and
policy, Tim Ruiz, welcomes the extended contract for ICANN, calling talks
of a transition "premature." Others believe that ensuring that today's DNS
can handle the Internet's expected exponential growth in traffic is more
important than solving global control issues, at least in the near future.
NetChoice executive director Steve DiBianco says, "ICANN has definitely
made progress towards independence, but more needs to be accomplished
before a complete transition is appropriate." SRI International's Marcus
Sachs says DNS scalability and security are the two main challenges facing
ICANN right now.
Wrestle a Robot
New Scientist (08/12/06) Vol. 191, No. 2564, P. 39; Cho, Dan
Artificial muscle systems have not advanced to the point where a robot can
defeat a human in an arm-wrestling contest, although such a development
could be on the horizon thanks to recent advances in materials technology.
A key challenge is keeping the muscles reliable without the need for
constant repairs, heavy batteries, or large power input, and researchers
have been inspired by human muscles to come up with a fuel-powered
solution. Researchers at the University of Texas at Dallas' NanoTech
Institute were tasked by the Defense Advanced Research Projects Agency
(DARPA) to create such muscles, and their first project in this vein
employed muscular sheets assembled from carbon nanotubes and incorporated
into a fuel cell filled with sulfuric acid; electrons are consumed when
oxygen reacts with the acid to form water, resulting in a positive charge
that causes the nanotube sheet to contract. Another project the
researchers worked on involved a system with a wire built from a
platinum-coated "shape-memory" alloy that contracts when heated by exposure
to methanol vapor in air. Cooling down the wire causes it to return to its
original length. Winning an arm-wrestling contest cannot be accomplished
solely with a fuel-powered muscle; durability, precise control, and
biocompatibility also have to be added. More lifelike robots, lighter and
more nimble artificial limbs, and synthetic organs are some of the
potential breakthroughs that could be facilitated with the development of
reliable artificial muscles.
Collaborative Spam Filtering Using E-Mail Networks
Computer (08/06) Vol. 39, No. 8, P. 67; Kong, Joseph S.; Rezaei, Behnam
A.; Sarshar, Nima
UCLA and University of Florida researchers have come up with a
distributed, message-based system for filtering spam that allows users to
query all their email clients to see if another user in the system has
already flagged a suspect email as spam. This scheme permits users to make
information queries without inundating the network, keeping bandwidth cost
to a minimum while achieving a spam-detection rate of nearly 100 percent,
at least in simulation. To address the performance, scalability, and trust
issues inherent in collaborative spam filtering systems, the researchers
turned to complex networks theory--which facilitates network dynamics
analysis with statistical mechanics--to effectively exploit social email
networks' topological properties; this was accomplished through the use of
the percolation search algorithm, which supports the reliable retrieval of
content in an unstructured network by analyzing only a small portion of the
network, and a digest-based indexing scheme. Using social email networks
for spam filtering obviates the need for a central server and for a
traditional peer-to-peer overlay. Upon receipt of an email,
the client program first tries to ascertain whether the message falls into
the categories of DefinitelySpam or DefinitelyNotSpam, which can be done
via any traditional spam-filtering technique; if the message is determined
to be definitely spam, a digest for the message is generated and then
cached. If the email is suspected to be spam by the client program, the
system is queried to see whether the email has already been labeled as spam
by other network users through the implantation of each query message into
the digest, after which the query message is percolated through the email
contact network by nodes with an implanted query request. Hits are routed
back to the node from where the query originates via the same pathway by
which the query message arrived at the hit node, and then the client
program quantifies the volume of hits received, tagging the message as spam
if a constant threshold value is exceeded. The setup requires all nodes to
forward all messages exchanged in the system anonymously to prevent anyone
from employing the system to map out social connections.
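The query-and-threshold scheme described above can be sketched as follows. The digest function, node structure, and threshold value are illustrative assumptions, and the percolation search is reduced to querying a set of reachable nodes:

```python
import hashlib

def digest(message: str) -> str:
    """Hash of the normalized message body, standing in for the paper's
    digest-based index (the normalization here is a guess)."""
    normalized = " ".join(message.split()).lower()
    return hashlib.sha256(normalized.encode()).hexdigest()

class Node:
    """One email client in the network, caching digests of flagged spam."""
    def __init__(self):
        self.spam_digests = set()

    def flag_spam(self, message):
        self.spam_digests.add(digest(message))

    def query(self, d):
        return d in self.spam_digests

def is_spam(message, reachable_nodes, threshold=3):
    """Percolate a digest query to the reachable nodes and tag the message
    as spam if the hit count meets a constant threshold."""
    d = digest(message)
    hits = sum(node.query(d) for node in reachable_nodes)
    return hits >= threshold
```

Because only digests travel through the network, a queried node learns whether a message was flagged without seeing its content.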