A Move to Secure Data by Scattering the Pieces
New York Times (08/21/06) P. C5; Markoff, John
When Chris Gladwin, the software designer who sold his online music store
Music Now in 2004, set about trying to digitize and secure the 27 GB of
music, photos, and paper documents that he had been accumulating for years,
he turned to an old technique employed by early cryptographers. The result
was Cleversafe, an open-source project that secures data by breaking it
down into pieces so that the files can only be reassembled by the computers
that created them. The program could lower the cost of storing data on the
Internet, Gladwin claims. "If we distributed data around the world this
way, it would be a pretty resilient way to store data," said former ACM
President David Patterson, a computer scientist at the University of
California, Berkeley. Gladwin is banking on the continued proliferation of
digital data of all kinds, including new breeds of digital cameras that
will drive demand for more secure and private backup applications. In
developing Cleversafe, which will cut the amount of storage space required
for secure backup by more than half, Gladwin drew heavily on the landmark
paper "How to Share a Secret," written in 1979 by Adi Shamir, a co-inventor
of the RSA public-key cryptography algorithm. Gladwin designed a series of
software routines to copy PC data into fragments of distributed file
systems that could then be retrieved to reconstruct the original.
Currently, Cleversafe runs on an experimental research grid located at 11
sites throughout the world, though Gladwin hopes that eventually a
commercial network of tens of thousands or even hundreds of thousands of
sites will emerge. Unlike existing storage projects, Cleversafe
distributes data in encrypted chunks rather than making copies. The
approach is similar to the SETI@Home project, which collects idle
processing power from a network of computers to power a distributed
supercomputer.
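The technique can be sketched concretely with Shamir's threshold scheme
from "How to Share a Secret": a secret becomes n shares, any k of which
reconstruct it, while fewer than k reveal nothing. A minimal Python sketch
follows; Cleversafe's actual slice format is not described in the article,
so the field size, threshold, and share layout here are illustrative
assumptions.

    import random

    PRIME = 2**127 - 1  # a Mersenne prime large enough for small secrets

    def make_shares(secret, k, n):
        """Split secret (an int < PRIME) into n shares; any k reconstruct it."""
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
        # Share x is the degree-(k-1) polynomial evaluated at x (never at 0).
        return [(x, sum(c * pow(x, e, PRIME) for e, c in enumerate(coeffs)) % PRIME)
                for x in range(1, n + 1)]

    def recover(shares):
        """Reconstruct the secret by Lagrange interpolation at x = 0."""
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num = den = 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = num * (-xj) % PRIME
                    den = den * (xi - xj) % PRIME
            # pow(den, PRIME - 2, PRIME) is the modular inverse of den.
            secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
        return secret

    shares = make_shares(secret=42, k=3, n=5)
    assert recover(shares[:3]) == 42   # any 3 of the 5 shares suffice
    assert recover(shares[2:]) == 42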
Paper Trail Flawed in Ohio Election, Study Finds
Computerworld (08/21/06) Songini, Marc
A new study funded by the Board of Commissioners of Cuyahoga County, Ohio,
has once again called into question the reliability of electronic voting
machines. The study claims that even the voter-verified paper trail
produced by the Diebold machines was not reliable, noting that 10 percent
of the paper votes were "either destroyed, blank, illegible, missing, taped
together, or otherwise compromised." The study was conducted by the
Election Science Institute (ESI), a San Francisco-based nonprofit dedicated
to promoting the development of accurate, auditable election systems.
"What we found is that when you take this [technology] out of the lab and
put it in a real work environment with real voters, you're going to have
some issues you need to resolve," said ESI's Steven Hertzberg. In a letter
to Cuyahoga County commissioners, Hertzberg wrote that the systems do
provide some benefit for the voters, noting that they are easier to use
than the old punch-card ballot systems they replaced. However, he also
warned that the county should view the machines as a calculated risk,
citing the 72 percent of polling places in which the study found a
discrepancy between the paper ballots and the record on the machines'
memory cards. Forty-two percent of those discrepancies involved 25 or more
votes. The study also reported that 87 paper rolls and 28
voting machines were missing, and warned that printer malfunctions could
cause serious election problems. A Diebold spokesman challenged the
study's methods, claiming that the discrepancies resulted from matching
paper records with the wrong memory cards. Diebold also expressed dismay
that it was not allowed to participate in the analysis of the election.
Ohio Secretary of State Kenneth Blackwell says the machines meet both state
and federal requirements for certification, and that any problems are the
result of flawed procedures or inadequately trained workers.
San Diego Supercomputer Center Staff Help Nation's
Archivists With Digital-Preservation Expertise
University of California, San Diego (08/18/06) Mueller, Paul K.
Computer scientists are poised to work more closely with archivists in
helping the nation to preserve its digital records. At the recent joint
meeting of the Society of American Archivists (SAA), the National
Association of Government Archives and Records Administrators (NAGARA), and
the Council of State Archivists (CoSA), the SAA elected Richard Marciano,
director of the Sustainable Archives and Library Technologies (SALT) lab at
the University of California, San Diego, to a three-year term on the
steering committee of its Electronic Records Section (ERS). The election
of Marciano marks the first time a computer scientist will be a member of
ERS and help guide the archivist organization in its electronic
preservation initiatives. "These collaborations are a two-way street,"
said Marciano. "Not only do information technologists provide useful
insights for digital preservation, the problems archivists face in
preserving digital records are now also enriching computer science
research." Marciano introduced Chien-yi Hou, a colleague at the San Diego
Supercomputer Center (SDSC) who specializes in digital preservation, as the
type of professional today who can bring computer scientists and archivists
together. Reagan Moore, director of the Data Intensive Computing
Environments (DICE) group at SDSC, delivered the keynote address to the ERS
meeting, noting that data-grid technologies could be used to build
preservation tools and that iRODS (the Integrated Rule-Oriented Data
System) has potential for use as a data management application.
Speedy Silicon Sets World Record
BBC News (08/17/06)
A team of researchers from the University of Southampton has developed the
world's fastest transistor of its type simply by adding fluorine, a tweak
that they claim could lead to faster, less expensive mobile phones and
digital cameras while still using conventional manufacturing techniques.
"It just takes a standard technology and adds one extra step," said
Southampton professor Peter Ashburn. The team used a simple device known as
a silicon bipolar transistor, which consists of three layers of
semiconducting material in a sandwich structure: two layers of one material
on either side of a filling made of a different material, in this case
silicon around a boron-doped layer. This kind of chip is created
through a manufacturing process that heats and diffuses the boron layer,
which makes it thicker and slows the flow of electrons through it. Adding
fluorine implants to the silicon using a technique known as ion
implantation creates tiny clusters of missing silicon atoms that suppress
the diffusion of boron, leaving a thinner layer that keeps the electrons
moving quickly. "It's atomic engineering, even smaller than
nanotechnology," said Ashburn. The transistor tested at 110 GHz, which
means that a complete circuit built around the technology could likely
operate at around 11 GHz, accounting for the tenfold reduction from a
transistor's speed to the actual chips that they could power. That
eclipses the previous transistor record of 70 GHz held by electronics maker
Philips. Mobile phone circuits currently operate around 1 GHz.
How Biotech Is Driving Computing
CNN Money (08/18/06) Taylor, Chris
Biotechnology is emerging as the most challenging frontier for the next
generation of supercomputers. Japanese researchers have built a $9 million
supercomputer that has broken the petaflop barrier, performing at a rate
roughly three times faster than IBM's BlueGene/L. Pharmaceutical companies
require supercomputing power to test the thousands of chemical compounds
that could become the next miracle drug, as well as the ways that each will
interact with the trillions of proteins in the human body. Proteins, the
enormously complex strings of amino acids, must be mapped in 3D. The
Japanese computer, MDGrape-3, is not officially the world's fastest
computer because it cannot run the software required by the official
rankings. Nevertheless, one of Merck's subsidiaries has already asked the
researchers for some time on the computer. IBM is renting out time on Blue
Gene to QuantumBio, a company that provides protein-testing services to
pharmaceutical companies. As a result of the research, DNA sequencing
could become a standard procedure at a routine visit to the doctor's
office. As the biotech industry develops, its demands could have a similar
effect on supercomputing as the space race did on the development of
mainframe computers. Ultimately, genetic science could reshape the nature
of computing itself, as researchers have already demonstrated the ability
to replace silicon-based materials with DNA as the logic gates that power a
computer. At present, the main obstacle to DNA computing is speed, though
by the time that Moore's Law runs into its inevitable endpoint, scientists
could have a sufficient understanding of DNA computing to make it a viable
replacement.
'Electro-Spin' Trick Boosts Quantum Computing
New Scientist (08/16/06)
A team of researchers from Delft University in the Netherlands has created
a device that can manipulate the "up" and "down" spin positions of the
electrons in quantum dots using existing fabrication techniques. "This is
a breakthrough experiment," said Guido Burkard, a physicist at the
University of Basel who did not participate in the research. "The major
benefit of making a qubit using this method is that they are built upon
existing semiconductor technology." The resulting silicon chip could lead
to quantum computers capable of performing multiple applications
simultaneously. Using conventional lithography, the Dutch team created a
device in which two electrodes apply a voltage across two semiconducting
quantum dots, each 100 nm wide. The voltage prompts the
electrons to bounce back and forth between the dots, though each dot can
only accommodate one electron at any given time. Since electrons of the
same spin state cannot land on the same dot, electrons of different spin
states get jammed--one on each dot. Once the electrons were jammed, the
researchers isolated the dots from the circuit, and then altered the
electron's spin on the first dot using an electric field. Current then
flows through the circuit only if the spin state of the first electron has
been switched. Electron-spin qubits can now begin to catch up with more
mature areas of quantum computing. "I see no roadblocks to moving towards
the first implementation of small quantum algorithms using electron-spin
qubits," Burkard said.
With Its Future Uncertain, Bell Labs Turns to
Commerce
Wall Street Journal (08/21/06) P. A1; Silver, Sara
Lucent's storied Bell Labs, which over the past decade has been reduced to
a third of its former size, is now under the direction of entrepreneur
Jeong Kim, who took the reins at the research facility last year with the
singular goal of making it profitable. Kim has refocused the efforts of
many of the labs' scientists on projects that could have immediate
commercial applications--projects that are expected at a minimum to recoup
six times the expected cost of research. Funding is doled out in
accordance with a project's financial potential. Kim is also looking to
bring in more government grant money to accelerate the conversion of basic
research into marketable products. Kim's corporate-minded management style
represents the new face of research at Bell Labs, which, like other major
industrial labs, has seen its funding drop for projects without immediate
commercial potential. Lucent's pending merger with Alcatel could also have
an impact on the operations at Bell Labs, as many of the research
facility's scientists fear that they could be among the 9,000 workers
expected to lose their jobs after the merger is completed. Lucent CEO
Patricia Russo claims that there is "absolutely no intention of separating
Bell Labs from the company," and that the labs' funding will not be
affected by the reorganization. "Bell Labs will be an integral part of the
combined company, and is critical to its future success," she says.
Alcatel executives have not commented definitively on the role they expect
Bell Labs to play after the merger, but have said that they will work to
balance basic research with commercial considerations.
Eye Tracking Technology Poised to Be Next Trend to
Immerse Gamers
Queen's University (08/10/06)
Video-game companies see eye-tracking technology as a potential tool for
enhancing the gaming experience of players. Eye-tracking technology has
been around since the late 1960s, and has mainly been put to use by people
with limited mobility, pilots, and market researchers.
According to a new study from researchers at Queen's University in Canada,
playing a game with your eyes allows gamers to feel more immersed and have
more fun in a virtual environment. School of Computing associate professor
Nicholas Graham and PhD candidate David Smith integrated a Tobii 1750
desktop eye tracker with several commercial video games, and found that 83
percent of gamers playing Quake 2 and 92 percent of those playing
Neverwinter Nights felt more immersed in the games using the technology.
"Eye-tracking technology allows us to build interfaces that respond to
users' intentions rather than just their actions," says Smith. Although
eye-tracking technology feels more natural than playing a game with a
mouse, the feature presents control issues because subconscious eye
movements make for inadvertent selections of items or directions. The
researchers presented the study at ACM's International Conference on
Advances in Computer Entertainment Technology in June.
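A common mitigation for those inadvertent selections is a dwell-time
filter, which selects a target only after the gaze has rested on it for a
minimum duration. The sketch below illustrates the idea generically; it is
not the interface Smith and Graham built, and the GazeSample fields and the
0.5-second threshold are assumptions.

    from dataclasses import dataclass
    from typing import Iterable, Optional

    @dataclass
    class GazeSample:
        timestamp: float       # seconds since start
        target: Optional[str]  # UI element under the gaze point, if any

    def dwell_select(samples: Iterable[GazeSample], dwell: float = 0.5):
        """Yield a target once gaze has rested on it for dwell seconds."""
        current, since = None, 0.0
        for s in samples:
            if s.target != current:
                current, since = s.target, s.timestamp  # gaze moved: reset
            elif current is not None and s.timestamp - since >= dwell:
                yield current
                current = None  # require a fresh fixation before re-selecting

    # 60 Hz trace: ~0.35 s glancing at a wall, then a steady look at a door.
    trace = [GazeSample(t / 60, "door" if t > 20 else "wall") for t in range(80)]
    print(list(dwell_select(trace)))  # ['door']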
Network Time Protocol Works With IPv6
Network World (08/16/06) Marsan, Carolyn Duffy
A series of tests run last month by researchers involved with Moonv6, the
world's largest native IPv6 testbed, has demonstrated that the Network
Time Protocol (NTP) runs over IPv6, the upgrade to the Internet's main
protocol. As part of the tests, Moonv6 researchers set up a wide-area link
between the University of New Hampshire and the military's Joint
Interoperability Test Center in Fort Huachuca, Ariz., to run NTP--which is
used to synchronize the timing of network equipment for regular network
operations as well as anti-hacking and disaster recovery efforts--over both
IPv4 and IPv6. Capt. Jeremy Duncan, a communications interoperability and
integration officer with the U.S. Marine Corps, says Fort Huachuca had two
servers running NTP, one server running IPv6, and another running IPv4.
Both servers performed updates via NTP, which was tested in both native
IPv6 and dual-stack IPv4/IPv6 modes. Duncan said the tests went well.
Researchers also tested management services such as the use of Dynamic Host
Configuration Protocol (DHCP) version 6 with NTP running over IPv6, said
Glenn Burdett of Spectracom, a Moonv6 participant. "The bulk of the
testing was to prove out NTP, to make sure that machines could be synched
up over the LAN and the WAN," he said.
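At the code level, moving NTP to IPv6 changes little beyond the socket's
address family, which is roughly what the Moonv6 runs confirmed. Below is a
minimal SNTP query over IPv6 for illustration; the pool hostname is a
public server picked for the example, not one of the Moonv6 test machines.

    import socket, struct, time

    NTP_EPOCH_OFFSET = 2208988800  # seconds between 1900 (NTP) and 1970 (Unix)

    def sntp_time_v6(server="2.pool.ntp.org", timeout=5.0):
        packet = b"\x1b" + 47 * b"\0"  # LI=0, VN=3, Mode=3 (client request)
        with socket.socket(socket.AF_INET6, socket.SOCK_DGRAM) as s:
            s.settimeout(timeout)
            s.sendto(packet, (server, 123))  # resolves to an AAAA record
            data, _ = s.recvfrom(512)
        # The Transmit Timestamp's 32-bit seconds field sits at byte 40.
        secs = struct.unpack("!I", data[40:44])[0]
        return secs - NTP_EPOCH_OFFSET

    print(time.ctime(sntp_time_v6()))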
The Organic Automaton
Scientific American (08/17/06) Musser, George
The vision articulated by futurist Ray Kurzweil holds that computers will
eventually be able to run software that can fully simulate the human brain.
Critics argue that the theory is flawed because software does not develop
according to Moore's Law, and that software's uneven track record is not a
strong predictor of reliable simulations of living organisms. Software has
also become so complicated that even minor changes to one area of code
affect other parts in unintended ways because of the interrelated
complexity of many of today's applications. Most modern software is
developed through an evolutionary progression where programs move in
unanticipated and increasingly complex directions. One line of thought
holds that improved testing will solve the software reliability problem,
while another argues that making computers more like organisms is the
answer. Computers patterned after living organisms could continue to
function even when they encounter a problem. The idea is that instead of
trying to prevent crashes, developers should design systems that can
recover from them more quickly. Going a step further, IBM is promoting
autonomic computing, which attempts to make machines self-aware and able to
monitor and repair themselves.
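The recover-rather-than-prevent idea is often realized as a supervisor that
watches a worker and restarts it on failure instead of trying to guarantee
that it never fails. A minimal sketch of the pattern, assuming a
deliberately flaky worker; IBM's autonomic-computing tooling goes far
beyond this.

    import multiprocessing as mp
    import random, time

    def worker():
        """A deliberately flaky worker that eventually hits a fault."""
        while True:
            time.sleep(0.05)
            if random.random() < 0.1:
                raise RuntimeError("simulated fault")

    def supervise(max_restarts=3):
        """Restart the worker on each crash instead of preventing crashes."""
        for attempt in range(1, max_restarts + 1):
            p = mp.Process(target=worker)
            p.start()
            p.join()  # returns only when the worker process dies
            print(f"worker exited (code {p.exitcode}); restart {attempt}/{max_restarts}")
        print("restart budget exhausted; escalating")

    if __name__ == "__main__":
        supervise()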
Some Online Video Games Found to Promote 'Sociability,'
Researchers Say
University of Illinois at Urbana-Champaign (08/16/06) Lynn, Andrea
Online video gaming has come under criticism for encouraging isolation and
passive consumption of media, but new research suggests massively
multiplayer online video games (MMOs) should be viewed as places where
informal social interaction occurs. "Virtual worlds appear to function
best as bridging mechanisms, rather than as bonding ones, although they do
not entirely preclude social ties of the latter type," according to the
study "Where Everybody Knows Your (Screen) Name: Online Games as 'Third
Places,'" published in the early August issue of the Journal of
Computer-Mediated Communication. Constance Steinkuehler, a professor of
education at the University of Wisconsin at Madison, and Dmitri Williams, a
professor of speech communication at the University of Illinois at
Urbana-Champaign, came to the conclusion after playing games and conducting
random interviews of players on their reasons for playing, their in-game
social networks, and their life away from games. Although players may not
gain deep emotional support in a virtual third place, the venue provides an
opportunity for gamers to broaden their worldview. MMOs enable gamers to
interact with each other in-game through multiple real-time voice or text
conversations, cooperate as they play, and form long-term player groups in
which relationships can be built. More than 9 million people worldwide
spend about 20 hours a week playing MMOs, and they may be drawn to the
games because they have nowhere else to hang out. "Perhaps it is not that
contemporary media use has led to a decline in civic and social engagement,
as many have argued, but rather, that a decline in civic and social
engagement has led to a 'retribalization' through contemporary media," says
the study.
Romanian Rhapsody
Business Week (08/21/06) Matlack, Carol
Bucharest, Romania, is on the verge of becoming the Bangalore of Eastern
Europe because of the talent pool and low wages that the local market has
to offer. Companies such as Oracle, Microsoft, Hewlett-Packard, and
Accenture are investing millions of dollars in Bucharest by opening
offices, hiring thousands of IT workers, and buying startups and their
technologies. Romania is attractive because 60 percent of its workforce is
believed to speak at least one foreign language, and because a programmer
can be hired for about $500 a month, which is comparable to salaries in
India and is 50 percent less than those in Poland and the Czech Republic.
But what sets Romania apart is the excellent training and problem-solving
skills of its computer specialists, who can handle more advanced research
and development. Varujan V. Pambuccian, a computer scientist who is a
member of Romania's Parliament, says the country has about 16,000 software
engineers, and about half focus on research and development rather than
coding. Oracle will be increasing the number of people it employs for
software development and product support in northern Bucharest to 1,000 in
a few months, and the company also subsidizes IT courses at local
universities. Such investment is even prompting Romanians who have left
the country for employment abroad to return home.
Inside the Robot Factory
PC Magazine (08/16/06) Ulanoff, Lance
Attendees of the recent RoboBusiness Conference were treated to a look
inside Carnegie Mellon's storied Robotics Institute. In addition to the
institute, Carnegie Mellon also maintains two other robotics facilities:
Robot City and the National Robotics Engineering Consortium. The institute
is Carnegie Mellon's largest department. Its 98 graduate students and
technical staff of 200 explore autonomous systems, vision, speech, and
manipulation. CMU computer science and robotics professor Matthew Mason,
the director of the Robotics Institute, says, "Part of the fun of robotics is
that you're relating what machines can do to what humans can do."
Initially funded by private industry, the institute now relies more heavily
on government grants, though the Defense Department has cut its funding in
half since the Sept. 11 attacks. Among the projects underway at the
institute are robots designed to build and repair the International Space
Station, locate meteorites, and navigate the abandoned mines around
Pittsburgh. While the institute is home to experimentation and innovation,
the NREC is devoted to finding practical applications for technology. The
professional engineers working at NREC have developed a robot to strip the
paint off large ships and a system to inspect the conveyor belts in mines.
These days, their major focus is on military applications, such as
autonomous unmanned ground combat vehicles.
An Evaluation of Information Quality Frameworks for the
World Wide Web
University of Southampton (ECS) (08/16/06) Parker, M.B.; Moleshe, V.; De
la Harpe, R.
The retrieval of relevant information from the Internet is beset by a lack
of information quality standards for Web publishers. Researchers at the
University of Southampton and the Cape Peninsula University of Technology
assess World Wide Web information quality frameworks to identify the
components they have in common as well as the elements they are missing. An
evaluation of 13 frameworks reveals a series of common dimensions, including
accessibility, accuracy, objectivity, relevancy, consistency,
appropriateness, believability, representation, reputation, source,
security, speed, ease of manipulation, value-added, timeliness,
free-of-error, completeness, and understandability. The most frequently
occurring dimensions in the frameworks are accessibility and timeliness.
Accessibility focuses on technical accessibility and the issues of data
representation and data volume. Technical accessibility problems arise
when security controls and Web page permissions block access, while the
data-volume issue concerns delivering applicable data that adds value to
tasks in a timely way. Timeliness, and thus accessibility, problems can
crop up when large volumes of data must be uploaded to the Web site. The
least frequently occurring quality
dimensions are ease-of-manipulation and value-added, and the low occurrence
of the value-added dimension dovetails with the lack of information quality
in individual Web pages. The researchers conclude that a World Wide Web
information quality framework should feature the accessibility, timeliness,
accuracy, relevance, believability, completeness, objectivity,
appropriateness, representation, source, and understandability dimensions,
at minimum.
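The evaluation itself amounts to a frequency count of dimensions across the
13 frameworks. The sketch below shows that tabulation; only the dimension
names come from the article, and the per-framework assignments are invented
placeholders.

    from collections import Counter

    frameworks = {  # hypothetical contents for three of the 13 frameworks
        "framework_01": {"accessibility", "timeliness", "accuracy", "relevancy"},
        "framework_02": {"accessibility", "timeliness", "believability", "security"},
        "framework_03": {"timeliness", "accessibility", "value-added", "completeness"},
        # ...the remaining ten frameworks would be listed the same way
    }

    counts = Counter(dim for dims in frameworks.values() for dim in dims)
    for dim, n in counts.most_common():
        print(f"{dim:15s} appears in {n} of {len(frameworks)} frameworks")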
Who Said the Net Was Fair?
New Scientist (08/12/06) Vol. 191, No. 2564, P. 17; Biever, Celeste
Though there is some merit to "net neutrality" advocates' warning that the
openness of the Internet could be endangered without legislation requiring
equal prioritization of all Web traffic, the truth is that such equality
does not exist, and that unequal treatment of traffic has directly fueled
innovation, writes Celeste Biever. Net neutrality proponents' fears were
sparked by the AT&T/SBC and
MCI/Verizon mergers, which gave individual companies end-to-end control
over data packets for the first time. This development has opened the way
for the companies to charge their richer customers to prioritize packets,
which could be a crippling blow to small startups that cannot afford such
service. Yet large Web sites have been paying for faster data delivery for
a number of years; many users are unaware that 15 percent of all Web
traffic is sent to the Web user's computer from servers owned by Akamai,
not from the site the user is visiting. Not only does this method offer
smoother Web browsing, but it can also defend a company's own servers
against denial of service attacks. A lack of net neutrality has proven
beneficial to some innovative Internet applications, such as spam
filtering. Andrew Odlyzko of the University of Minnesota's Digital
Technology Center notes that blocking malicious data packets could become
difficult with net neutrality legislation in place. "How do you
define a spammer?" he queries. "Any kind of net neutrality legislation
would interfere with at least a few of the common practices on the Internet
today."
Wireless Works Wonders in Tibet
Wired News (08/17/06) Jardin, Xeni
Yahel Ben-David--a former Silicon Valley dot-commer--and members of the
underground security group Cult of the Dead Cow are working with Tibetan
exiles in Dharamsala, India, to build a low-cost wireless network in this
high mountain village near the Chinese border. The network, called the
Dharamsala Wireless Mesh, is an example of "light infrastructure," a
concept that is gaining popularity among tech developers. Light
infrastructure networks are decentralized, ad-hoc networks that can deliver
essential services faster than conventional means. However, the Dharamsala
Wireless Mesh is not available to everyone. Unlike similar community
wireless projects in the United States, the Dharamsala Wireless Mesh is not
open to laptop users. And since bandwidth is limited, costly, and comes
from government-controlled telecom provider BSNL, access to the network is
restricted mostly to schools, government offices, and nonprofits, which pay
a small fee and host equipment to broaden the network's reach. Nonetheless,
many in Dharamsala are hoping that the network will eventually be made
available to others so that it can help the area's economy grow.
Transformations of the Research Enterprise
Educause Review (08/06) Vol. 41, No. 4, P. 26; Braman, Sandra
The U.S. agenda for research and information technology continues to be
influenced by political, economic, and intellectual developments that
generate interconnected computational, networking, and data challenges for
the future. The modern research environment is characterized by a
number of developments, including the erosion of the boundary between basic
and applied research; computation's stature as the "third branch" of
science, along with theory and experimentation; the expanding role of
computation in all disciplines; and the growing transdisciplinary nature of
large research projects. The context for IT engagement with research in
higher education is determined by themes that have emerged at the point
where national policy, disciplinary developments, institutional habits, and
the evolution of research techniques converge. These themes include the
advent of computation as a basic, distinct research step, the effects of
globalization, research projects' increasing scope, a movement toward
interdisciplinary research collaborations, research democratization, and
the identification of the importance of knowledge reuse. These themes give
rise to tension between centralization and decentralization, between the
pace of technological innovation and that of institutional innovation,
between the requirements of academic institutions and those of external
funding agencies, between the needs of the many and the needs of the few,
and between faculty desires and administrator mandates, among other
competing pressures. New opportunities for IT
specialists, researchers, and administrators of higher education
institutions can be found in these research enterprise transformations.
Although national policies play a critical role in academic research,
appreciation of the knowledge economy does not always translate into
economic support for academic institutions as centers of knowledge
generation and circulation, even as the range of institutional players
expands.
research and IT challenges in the area of computation include the problem
of providing enough capacity, while networking challenges include
governance and dealing with assorted national and international grids; data
issues include the exponential growth of data and how it affects computing
capacity.