Net Neutrality Hearing Hits Silicon Valley
Washington Post (04/18/08) P. D2; Kang, Cecilia
The Federal Communications Commission yesterday heard testimony from legal
scholars, Web startups, the Christian Coalition, and the Songwriters Guild
of America during a debate on the impact Web regulation would have on
high-tech innovations and investments, copyright protection, and freedom of
speech. The primary issue is whether the Internet needs rules to mandate
that it remain open and unaltered by network operators. Supporters of
openness say allowing phone and cable companies to restrict content could
unfairly limit consumers. However, allowing network controls could prevent
the illegal sharing of copyrighted material online. None of the nation's
largest service providers attended the hearing despite requests by the FCC.
FCC Chairman Kevin J. Martin urged the commission to evaluate the issue in
the narrower context of specific attempts to limit access, but the Silicon
Valley panelists took a broader view of net neutrality and how such
policies could affect consumers. Some expressed concerns that venture
capitalists may invest less if they feel the Internet will be controlled by
the government and corporations, while others said that industry proposals
do not go far enough to protect consumer interests.
Paper Ballot Technology Drive Downshifts as House Nixes
Funding to Replace E-Voting Machines
Government Computer News (04/16/08) Dizard III, Wilson P.
The U.S. House of Representatives has voted against a bill that would have
provided funding to help states replace their electronic voting machines
with paper ballots. The White House and some Republican leaders expressed
concern about the Emergency Assistance for Secure Elections Act of 2008
because of its cost. The bill would have reimbursed states that convert
back to paper-based voting systems this year, and covered the cost of
recounting paper ballots to verify elections. "I'd like to ask the
opponents how much spending is too much to have verifiable elections in the
United States," says Rep. Rush Holt (D-N.J.), the chief sponsor of the
legislation. Some states have already gotten rid of their high-tech direct
recording electronic (DRE) voting terminals due to the rising number of
reports about their flawed software.
Reality TV, Web 2.0 and Mediated Identities
University at Buffalo Reporter (04/17/08) Vol. 39, No. 29, Donovan,
Patricia
A study by University at Buffalo researchers examining television viewing
and communication patterns among young adults has found a relationship
between television viewing and "promiscuous friending" on popular social
networking sites. The researchers link this behavior to ordinary people
copying the behavior exhibited by reality TV celebrities. Researchers say
such people are creating "mediated social selves" that are intended to be
identities created for, presented on, and "known" through the media. The
study says that heavy reality television (RTV) viewers spend more time on
sites such as Facebook, have larger social networks, share more photos, and
are more likely to become online "friends" with people they have no
off-line relationship with, a practice known as promiscuous friending. The
study suggests an erosion of the distinction between the everyday world and
the celebrity world in which ordinary people claim intimacy with the
completely mediated identities of celebrities through celebrity social
network profiles. Heavy RTV viewers also produce a larger number of
mediated selves and feel greater intimacy toward, and an urge to interact
with, the mediated social image of others. The study used social cognitive
theory as the theoretical foundation for a survey of 456 young adults. The
researchers analyzed the amount of time subjects say they spent every day
watching RTV, news, fiction, and educational programming, and the amount of
time they were logged online daily, the size of their online social
networks, the percentage of their online friends they had met face to face,
and the number of photos they shared online. The study will be published
in the proceedings of the 2008 ACM Conference on Hypertext and Hypermedia,
which takes place June 19-21 in Pittsburgh.
25 Radical Network Research Projects You Should Know
About
Network World (04/16/08)
There are many interesting network research projects of note that could
lead to significant breakthroughs in all kinds of areas, including a
Defense Department-funded initiative to create faster computers that
consume less power and are less costly to build through the development of
a hybrid material that blends computer memory functions usually carried out
by magnetic components and computer logic operations typically executed by
semiconductor components. A joint project between Penn State University
and Australia's Queensland University of Technology has yielded an
algorithm that can classify Web searches as informational, navigational, or
transactional, while Israeli researchers have furnished a topographical map
of the Internet using a program that tracks interactions between Internet
nodes. Air Force Institute of Technology researchers are engaged in a
project to spot and halt insider security threats and industrial espionage
using data mining and social networking methods, and the University of
Arizona's federally funded Dark Web project seeks to track and analyze the
activities of terrorists and extremists using the Internet to spread
propaganda, enlist members, and orchestrate attacks using a combination of
spidering, multimedia analysis, link analysis, and other techniques.
Easing the development and use of Web applications is the goal of the Fluid
Project, an effort by the University of Toronto, the University of
California, Berkeley, and others to build a software architecture and
reusable components. A model for measuring visual clutter has been created
by MIT researchers, while Penn State researchers say image searching could
be greatly enhanced via software that tags images upon uploading to Yahoo's
Flickr or other photo systems and also automatically updates the tags
according to people's interaction with the photos. Voice-based Web
navigation is the goal underlying University of Washington researchers'
design of "Vocal Joystick" software that uses sounds and voice volume to
execute tasks, and the National Science Foundation is underwriting research
to make computers sensitive and responsive to users' emotional states.
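The search-classification result mentioned above can be illustrated with a small sketch. The cue lists and rules below are invented assumptions for illustration, not the published Penn State/Queensland algorithm, which the article does not detail:

```python
# Toy query-type classifier: label a Web search as navigational,
# transactional, or informational using simple surface cues.
# The cue lists and precedence rules are illustrative assumptions.

NAVIGATIONAL_CUES = ("www.", ".com", ".org", "homepage", "login")
TRANSACTIONAL_CUES = ("buy", "download", "price", "order", "coupon")

def classify_query(query: str) -> str:
    q = query.lower()
    # Navigational: the user wants to reach a specific site.
    if any(cue in q for cue in NAVIGATIONAL_CUES):
        return "navigational"
    # Transactional: the user wants to perform an action.
    if any(cue in q for cue in TRANSACTIONAL_CUES):
        return "transactional"
    # Default: the user wants information on a topic.
    return "informational"
```

A real classifier would be trained on labeled query logs rather than hand-picked keywords, but the three-way output is the same.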
Problem-Solvers of the World Unite in Banff
Business Edge (04/18/08) Vol. 8, No. 8, Keenan, Tom
At the 32nd annual ACM International Collegiate Programming Contest (ICPC)
in Banff, Canada, 300 of the world's top IT students gathered to solve some
extreme challenges. In addition to the competition, students heard
speeches from event sponsor IBM on how technology will help society, with
advancements such as instant language translation and improving mass
transit systems. The challenges ranged from trying to measure a city's
skyline to trapping someone in underground tunnels. "These scenarios--that
you're a SuperSpy and there's a villain trapped in a labyrinth--disguise
some very real problems," says IBM Lotus division director of strategy Doug
Heintzman. "What we're looking for is how people think through problems,
how they divine the most efficient ways to find the answer to a particular
problem." Kevin Waugh, from the University of Alberta, this year's host of
the competition, says teams do not truly solve a problem together; instead
they divide their resources across problems, an arrangement that
frequently breaks down before members regroup to work together in the last
hour of the competition.
enough problems of varying complexity to fit into a five-hour period that
will allow the best team to distinguish themselves. However, he says "the
real problems ... are how do we solve global warming or reduce the impact
of a certain pandemic." For more information about ICPC, visit
https://cm2prod.baylor.edu/login.jsf
UW to Lead $6.25 Million Project Creating Electronic
Sherlock Holmes
UW News (04/16/08) Hickey, Hannah
The University of Washington will lead the Multidisciplinary University
Research Initiative, a seven-university research effort intended to improve
computers' ability to interpret data and predict the behavior of complex
systems. The Defense Department is backing the five-year initiative with a
$6.25 million grant. "A complex monitoring system has far too many pieces
of information for any one person to look at," says UW computer science
professor and principal investigator Pedro Domingos. "This award lets us
do the research to develop a system for the military to look at all the
available information that might be valuable and use it to predict
behavior." The new system will use the power of reasoning much like
Sherlock Holmes. The military has millions of possible clues, including
sensors on soldiers, satellite maps, road monitors, unmanned aerial drones,
and reports from reconnaissance missions. The system will use this
information to make decisions and predict an adversary's next move.
Domingos says existing systems only look at a single type of sensor data,
but more complex situations require going to a higher level and integrating
different types of information. For example, a computer could combine
X-rays, photographs, test results, and patient information to make a
tentative diagnosis automatically. The diagnosis could also be based on
information from sensors that track a patient's movements and heart rate
for weeks at a time.
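The core idea of fusing many heterogeneous clues into one belief can be sketched with a toy Bayesian odds update. This is a generic illustration of probabilistic evidence combination, not the project's actual method, and the likelihood ratios are invented:

```python
# Combine independent clues about a single hypothesis (e.g., "the
# adversary will move tonight") by multiplying likelihood ratios into
# the prior odds, then converting back to a probability.

def posterior_probability(prior: float, likelihood_ratios) -> float:
    """Bayesian odds update over a sequence of independent clues."""
    odds = prior / (1.0 - prior)
    for ratio in likelihood_ratios:
        odds *= ratio  # ratio > 1 supports the hypothesis, < 1 weakens it
    return odds / (1.0 + odds)
```

For instance, a 20 percent prior combined with two supporting clues (ratios 4 and 2) yields a posterior of two-thirds.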
ISPs Meddled With Their Customers' Web Traffic
InfoWorld (04/16/08) McMillan, Robert
Researchers at the University of Washington and the International Computer
Science Institute say that about one percent of the Web pages being
delivered on the Internet are being changed in transit, a practice that can
introduce security vulnerabilities. In July and August, the researchers
tested data sent to about 50,000 computers and found that a few Internet
service providers injected ads into Web pages on their networks. The
researchers also found that some Web browsing and ad-blocking software was
making Web surfing more dangerous by introducing security vulnerabilities
into Web pages. To obtain their data, the researchers wrote software that
would test if someone visiting a test page on the University of
Washington's Web site was viewing HTML data that had been altered in
transit. In 16 incidents ads were inserted into the Web page by the
visitor's ISP. The research also found that pages were sometimes changed
by popup blockers within security products, and that some products inserted
vulnerabilities into the pages they processed. The research will be
presented this week at the Usenix Symposium on Networked Systems Design and
Implementation in San Francisco. "One of the next steps for the community
is to create better and stronger mechanisms for understanding what is
happening," says UW professor Tadayoshi Kohno. "The Web is still very
young and we just don't know what's going to happen next."
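The measurement technique behind the study reduces to a simple idea: serve a reference page whose correct content is known, and have the client compare what it actually received against a known-good digest. The page and hash below are made up; this is a minimal sketch of the detection principle, not the researchers' tool:

```python
# Detect in-transit modification of a known reference page by comparing
# the hash of the HTML the client received with the publisher's digest.
import hashlib

REFERENCE_HTML = "<html><body>integrity test page</body></html>"
REFERENCE_DIGEST = hashlib.sha256(REFERENCE_HTML.encode()).hexdigest()

def page_was_modified(received_html: str) -> bool:
    """True if the received HTML differs from the reference page."""
    digest = hashlib.sha256(received_html.encode()).hexdigest()
    return digest != REFERENCE_DIGEST
```

Any injected ad, stripped popup, or rewritten script changes the hash and is flagged.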
Chronopolis Project Launched Under Library of Congress
Partnership to Preserve At-Risk Digital Information
UCSD News (04/14/08) Zverina, Jan
The Library of Congress' Chronopolis Digital Preservation Demonstration
Project is a digital preservation data grid framework being developed by
the San Diego Supercomputer Center (SDSC) at the University of California,
San Diego, the UC San Diego Libraries, the National Center for Atmospheric
Research in Colorado (NCAR), and the University of Maryland's Institute for
Advanced Computer Studies (UMIACS). A key goal of the project is to
provide cross-domain collection sharing for long-term preservation. The
partnership is intended to leverage the data storage capabilities at SDSC,
NCAR, and UMIACS using existing high-speed educational and research
networks and mass-scale storage infrastructure investments to provide a
preservation data grid that emphasizes heterogeneous and highly redundant
storage systems. "Chronopolis is part of a new breed of distributed
digital preservation programs," says UCSD librarian and principal
investigator on the project Brian E.C. Schottlaender. "We are using a
virtual organizational structure in order to assemble the best expertise
and framework to provide data longevity, durability, and access well into
the next century." The partnership calls for each Chronopolis member to
operate a grid node containing at least 50 TB of storage capacity for
digital collections related to the Library of Congress' National Digital
Information Infrastructure and Preservation Program.
Security From Chaos
U.S. Department of Homeland Security (04/16/08) Cleere, Gail
The Assistant for Randomized Monitoring Over Routes (ARMOR), a Department
of Homeland Security-sponsored project at the University of Southern
California, is improving security at LAX airport in Los Angeles by
predicting risk. The USC researchers have developed a computer model that
tells police where to go to conduct random checks based on calculated
probabilities of a terrorist attack at specific locations. The software
records the locations of routine, random vehicle checkpoints and canine
searches at the airport. Police then provide data on possible terrorist
targets, their relative importance, and any changing data such as security
breaches or suspicious activity. The software then produces random
decisions, creating security patterns that are difficult to predict. "What
the airport was doing before was not truly statistically random; it was
simply mixing things up," says computer science professor Milind Tambe at
the Center for Risk and Economic Analysis of Terrorism Events (CREATE), a
DHS Center of Excellence at USC. "What they have now is systematized, true
randomization." CREATE works with government agencies and researchers to
evaluate the risk, cost, and consequences of terrorism, helping policy
makers set priorities to find the best ways to counter threats and prevent
attacks. Tambe says humans cannot create purely random systems for an
extended period of time, as they will eventually make decisions based on
prior decisions and experiences. ARMOR recently completed a six-month
trial, and airport officials have given the university approval to transfer
the software to LAX on a more permanent basis.
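The idea of risk-weighted randomization can be sketched in a few lines: checkpoint locations are sampled in proportion to assessed risk, so placements are unpredictable yet concentrate on likely targets. The locations and weights below are invented, and ARMOR's actual game-theoretic model is far more sophisticated:

```python
# Sample a checkpoint location with probability proportional to its
# assessed risk weight, producing patterns that are hard to predict
# but still favor high-value targets.
import random

RISK_WEIGHTS = {
    "terminal_1": 0.5,   # assumed high-value target
    "terminal_2": 0.3,
    "cargo_road": 0.2,
}

def pick_checkpoint(rng: random.Random) -> str:
    locations = list(RISK_WEIGHTS)
    weights = [RISK_WEIGHTS[loc] for loc in locations]
    return rng.choices(locations, weights=weights, k=1)[0]
```

Over many draws the high-risk location is checked most often, yet no single day's placement can be anticipated.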
Predicting Stress
Sydney Morning Herald (Australia) (04/17/08) Randerson, James
University of Wisconsin-Madison professor Vadim Shapiro and his colleagues
have developed Scan and Solve, software that can determine the stress on an
object based only on its shape, which could help experts preserve pieces of
artwork or help treat people's physical problems. The software was used on
Michelangelo's David and determined the statue was under a significant
amount of stress, particularly around its left thigh, right shin, and
ankles. Scan and Solve's conclusions match the real cracks that have
started to appear in the marble sculpture. The program could help
archivists predict what areas of an ancient artifact may need to be
reinforced to prevent damage, even if the statue has not yet shown any
signs of stress or damage. "Understanding structural properties of
historical and cultural artifacts through computer simulations is often
crucial to their preservation," Shapiro says. The software converts a 3D
map of an object into a map of the stresses and strains it will experience
when subject to certain forces. Although the concept is not new, Shapiro's
software simplifies the process and eliminates a series of difficult,
error-prone calculations. Traditional computer simulations use a finite
element analysis, which breaks the object into a 3D mesh of tiny pieces
that approximate the shape. Shapiro's approach runs the analysis directly
on the 3D shape data. Shapiro says the technique could also be used on
scans of living bones in patients, for example by suggesting the best shape
for hip bone replacements.
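A drastically simplified version of the shape-to-stress mapping can be shown for the one-dimensional case: given cross-sectional areas along a column and an axial load, stress concentrates where the geometry is thinnest. This toy calculation is only an illustration of the principle; real 3D analysis of the kind Scan and Solve performs is far more involved, and the numbers are invented:

```python
# Map geometry (cross-sectional areas along a column, in m^2) and an
# applied axial load (N) to the stress (Pa) in each section; the
# thinnest section carries the peak stress.

def axial_stresses(areas_m2, load_newtons):
    """Stress in each cross-section of a column under an axial load."""
    return [load_newtons / area for area in areas_m2]

def max_stress_index(areas_m2, load_newtons):
    """Index of the section where stress peaks (the thinnest one)."""
    stresses = axial_stresses(areas_m2, load_newtons)
    return stresses.index(max(stresses))
```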
Ad Hoc Encyclopaedia for the Information Age
ICT Results (04/14/08)
The goal of the Diligent project is to tackle the processing challenges
posed by the enormous volume of raw data that virtual digital libraries
(VDLs) must contend with through the creation of a testbed to prove the
viability of VDL infrastructure on grid-enabled technology. A
grid-supported VDL would permit massive online data repositories to be
generated from distributed computing sources, and Diligent has also
established a system that blends digital libraries with grid computing to
deliver storage, content retrieval and access services, and shared data
processing capabilities. Diligent created an infrastructure through the
development of the g-Cube system, along with a pair of VDLs to validate the
infrastructure's functionality. One VDL was concentrated in the Earth
Observation community, while the other was centered in the Cultural
Heritage community. Through the Diligent system, scientists, engineers,
policy-makers, NGOs, and other experts or stakeholders will be able to team
up on an ad hoc basis to think about and exchange applicable information
around specific problems. "The system needs to be optimized to improve its
quality of service," says Diligent scientific coordinator Donatella
Castelli. "We need to develop a production infrastructure and deal with
issues like real infrastructure policies."
'I'm Listening' - Conversations With Computers
Queen's University Belfast (04/16/08) Mitchell, Lisa
The European Commission-funded SEMAINE project is working to develop the
Sensitive Artificial Listener (SAL), a system that will be able to talk
with humans by reacting to signals such as tone of voice and facial
expressions. SAL will be able to adapt its own performance and pursue
different actions based on the non-verbal behavior of the user. SEMAINE is
led by DFKI, a German center for research on artificial intelligence, and
its partner universities, including Queen's University Belfast, Imperial
College London, the University of Paris VIII, the University of Twente in
the Netherlands, and the Technical University of Munich. Queen's University
Belfast professor Roddy Cowie says when people communicate there is an
undercurrent of signals that show each other what interests them and what
bores them. He says computers cannot do that, which is one of the main
reasons communicating with them is so unlike communicating with a human and
is so frustrating. The SEMAINE project furthers the research developed by
Cowie's HUMAINE (Human-Machine Interaction Network on Emotion) project,
which developed interfaces for communicating with computers more naturally.
"Today when we use technology we adopt a style of communication that suits
the machine," Cowie says. "Through projects like HUMAINE, SEMAINE, and
others linked to them, we will develop technology that will eventually
communicate in ways that suit human beings."
Blind Users Still Struggle With 'Maddening' Computing
Obstacles
Computerworld (04/16/08) Wood, Lamont
The world of computing was not designed with the blind and visually
impaired in mind, but when the technology works, it gives them access to
more information than anything previously available. Blind computer
users primarily rely on screen reader software, which describes the
activity on the screen and reads the text in various windows, says
consultant Gayle Yarnell, who is blind. There are freeware screen readers,
and screen readers often come with operating systems such as Windows XP and
Vista, but they are generally not powerful enough for serious use. A
screen reader's output can generally be sent to computer speakers as a
synthesized voice or to a Braille display. Yarnell says Braille displays
are better than speech for editing because individual characters can be
isolated, and they are a requirement for the deaf-blind. Knowing what the
screen is saying is only the beginning, because the user must then issue
commands using keyboard shortcuts, which involves a lot of memorization.
Microsoft at one point tried to make sure there was a keystroke for every
possible action in Windows, but in Vista that usability started to
disappear, largely because the effect of a keystroke in Vista depends on
the situation about a third of the time. However, Vista does offer some
very useful tools, including a Start function that begins with a search
field, allowing users to type in the name of an application, command, or
document instead of searching for it. Finding ways for screen readers to
process new display technologies, particularly on the Web, is a constant
struggle, as different standards create new difficulties. Screen readers
also often do not work with in-house applications, which can cause many
otherwise qualified blind job applicants to be unable to perform necessary
tasks.
Google's KML Format Approved as Open Standard
InformationWeek (04/14/08) Claburn, Thomas
The Open Geospatial Consortium (OGC) has approved the Keyhole Markup
Language (KML) as an international standard and will take responsibility
for maintaining and extending it. The approval comes after Google's
decision to transfer ownership of KML to the international standards body.
"We believe that this is a major step forward for the OGC and for the
entire geographic information community, as it provides the first broadly
accepted standard for the visualization of geographic information," says
Galdos Systems CEO Ron Lake. Google uses the file format in Google Earth
and Google Maps. KML provides an XML schema for displaying and describing
geographic data in two or three dimensions. Microsoft started using KML in
its Virtual Earth geospatial application last October. The OGC includes
345 companies, government agencies, and academic organizations from around
the world with an interest in geographic data standards, including NASA and
the U.S. Department of Homeland Security.
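A minimal KML document of the kind described consists of a Placemark with a point given in longitude-latitude order. The sketch below builds one with Python's standard XML library; the placemark name and coordinates are made up for illustration:

```python
# Build a minimal KML document: one Placemark containing a 2D Point.
# KML lists coordinates as longitude,latitude[,altitude].
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"

def make_placemark_kml(name: str, lon: float, lat: float) -> str:
    ET.register_namespace("", KML_NS)  # emit KML's default namespace
    kml = ET.Element(f"{{{KML_NS}}}kml")
    placemark = ET.SubElement(kml, f"{{{KML_NS}}}Placemark")
    ET.SubElement(placemark, f"{{{KML_NS}}}name").text = name
    point = ET.SubElement(placemark, f"{{{KML_NS}}}Point")
    ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = f"{lon},{lat}"
    return ET.tostring(kml, encoding="unicode")
```

The resulting file opens directly in Google Earth or any other KML-aware viewer.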
Ensuring Security for Cognitive Radio Networks Goal of
CAREER Award Research
EurekAlert (04/15/08) Crumbley, Liz
Virginia Tech researcher Jung-Min Park has received a $430,000 National
Science Foundation Faculty Early Career Development Program Award to
investigate improving the security of cognitive radio technology. Park
says that cognitive radio technology could one day be used for two-way
communications between tactical military forces or emergency responders.
However, the advantages gained by cognitive radio technology are countered
by new security threats. "In a civilian cognitive radio network, the
motive of a malicious user might be to simply cause mayhem to other users
or to receive notoriety," Park says. In a military setting, hostile forces
may try to disrupt or disable a network to interfere with communications
and gain a tactical advantage. Park plans to conduct an in-depth
investigation of critical security issues in cognitive radio systems and
networks. The research will include investigations into cooperative
spectrum sensing, which occurs when multiple cognitive radio devices
collaborate to identify unused radio spectrum bands; on-demand spectrum
contention, which are protocols that enable multiple devices to work
together with minimum interference; and spectrum etiquette mechanisms,
which would prevent the malicious use of cognitive radio devices. "We hope
our findings will help service providers and manufacturers develop more
secure technology, and also benefit regulators involved in the
standardization of cognitive radio systems," Park says.
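Cooperative spectrum sensing, the first of the research thrusts above, can be sketched as a fusion rule over per-device reports: several cognitive radios each report whether they detect a licensed user in a band, and the reports are combined so no single faulty or malicious device can flip the decision. The majority-vote rule below is an assumption for illustration; real fusion schemes vary:

```python
# Majority-vote fusion of spectrum-occupancy reports from multiple
# cognitive radio devices: the band is declared occupied only if more
# than half of the devices report detecting a primary user.

def band_is_occupied(reports: list[bool]) -> bool:
    """Fuse per-device occupancy reports by strict majority vote."""
    votes_occupied = sum(reports)
    return votes_occupied * 2 > len(reports)
```

With this rule, one compromised device among three cannot force the network off a vacant band or onto an occupied one.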
Cybersecurity Issues Misunderstood, Experts Tell
Congress
Defense News (04/07/08) Vol. 23, No. 14, P. 53; Matthews, William
Since the late 1990s, some cybersecurity experts have been saying that the
nation faces an impending all-out Internet attack on its critical
infrastructure--one that shuts down the electricity grid and water systems
and drains bank accounts. However, those threats remain largely
hypothetical, said James Lewis, director of the Technology and Public
Policy Program at the Center for Strategic and International Studies, at a
recent House Armed Services subcommittee hearing. Lewis said the U.S.
should worry about Internet-based crime and espionage instead, which are
more serious threats. Lewis pointed to the fact that last year U.S.
government computers were repeatedly broken into during attacks that appear
to have been carried out by the Chinese. In addition to collecting
information from the computer systems, which belonged to the U.S.
departments of Defense, State, and Commerce, the attackers also likely
planted malware in the computer systems, Lewis said. Meanwhile, Internet
criminals have created a black market for buying malware, hiring hackers,
and renting botnets. In order to protect U.S. computer systems from these
threats, the government should put existing security practices to better
use, said cybersecurity expert Seymour Goodman, who also testified at the
hearing.
Details Gel for New Supercomputer
Nikkei Weekly (04/14/08) Vol. 46, No. 2332, P. 17; Jimbo, Shinichi
More than $1.1 billion has been reserved for Japan's next-generation
supercomputer project by the Ministry of Education, Culture, Sports,
Science, and Technology, with the goal of building a machine that can carry
out 10 quadrillion calculations per second by 2012. The supercomputer will
boast a novel architecture that integrates scalar and vector processors,
and combining these processors' operations is the focus of recently
commenced studies. The machine will reside in a three-story building on
Kobe's Port Island with approximately 17,500 square meters of floor space.
The decision now facing Japan is how the 10-petaflop computer can best be
used by the industrial sector, and the Foundation for Computational Science
will provide help in this regard with a team of full-time technical support
staff. The foundation will also site a support center at the Port Island
facility, and the center will be open and ready for business at the same
time the supercomputer comes online. The Kansai Economic Federation will
hold seminars for association members focused on using the 10-petaflop
computer for new businesses and industries, and it will submit proposals
for helping industry utilize the machine to the Education Ministry. A
current-generation supercomputer will be used for training and research by
a University of Hyogo graduate school for computational science that will
be located in the same building.
The Return of Ada
Government Computer News (04/14/08) Vol. 27, No. 8, Jackson, Joab
Lockheed Martin's successful delivery of an update to the FAA's
next-generation flight data air traffic control system under budget and
ahead of schedule is at least partially attributable to the Ada programming
language, which comprises approximately half the code in the En Route
Automation Modernization System (ERAM), according to the FAA's Jeff
O'Leary. Security and reliability are among the issues that Ada can
address, and AdaCore President Robert Dewar notes that a resurgence of
interest in the language is occurring. ERAM was required to be
fault-tolerant, easily upgradeable, and incapable of losing data; programs
had to be capable of recovering from crashes, and the system code must "be
provably and test-ably free" of errors, O'Leary says. Ada is distinct from
many modern and traditional languages through its strong typing feature,
which means that a programmer has to specify a range of all possible inputs
for every declared variable, ensuring both that a malicious hacker cannot
enter a long string of characters as part of a buffer overflow attack and
that the program will not crash as a result of an incorrect
value. Dewar says Ada is still utilized by the Defense Department,
especially for command and control systems, while NASA and avionics
hardware makers also use the programming language extensively. He points
out that in these instances component manufacturers are "interested in
highly reliable mission-critical programs," and Ada is also a solid
teaching language. Dewar says that while Ada originally had a heavy focus
on strong typing and provability, later incarnations have kept the language
up to date. The American National Standards Institute and the
International Organization for Standardization have ratified Ada as a standard.
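Ada's range-constrained types, the feature highlighted above, have no direct Python equivalent, but the idea of rejecting out-of-range values at assignment can be mimicked with an explicit check. In Ada one would write, roughly, "subtype Altitude is Integer range 0 .. 60_000;" and any out-of-range assignment raises Constraint_Error; the subtype name and bounds here are invented for illustration:

```python
# Approximate an Ada range-constrained subtype in Python: a validator
# that accepts only values inside the declared bounds and rejects
# everything else, instead of letting a bad value propagate.

def make_bounded(lo: int, hi: int):
    """Return a validator that accepts only values in [lo, hi]."""
    def check(value: int) -> int:
        if not (lo <= value <= hi):
            raise ValueError(f"{value} outside declared range {lo}..{hi}")
        return value
    return check

altitude = make_bounded(0, 60_000)  # mirrors the Ada subtype sketched above
```

Ada enforces such constraints in the compiler and runtime for every declared variable; the Python version must be applied by hand, which is precisely the discipline the language builds in.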