Japan's Warp-Speed Ride to Internet Future
Washington Post (08/29/07) P. A1; Harden, Blaine
Japan has the world's fastest Internet connections and delivers more data
at a lower cost than anywhere else in the world. In fact, broadband
service in Japan is eight to 30 times faster than in the United States and
much less expensive. Faster Internet speeds in Japan, South
Korea, and much of Europe will lead to Internet innovations that are likely
to remain unavailable in the United States for many years. The high
Internet speeds in Japan allow Internet users to watch broadcast quality,
full-screen television over the Internet, while most Americans can access
only wallet-sized, grainy images. Other Internet applications in
Japan currently unavailable to Americans include low-cost, high-definition
teleconferencing, which has been used by doctors in urban areas to diagnose
patients, and advanced telecommuting. Analysts say Japan's advancement is
largely due to better wire and more aggressive government regulation. In
2000, the Japanese government compelled its large phone companies to share
wires with startup Internet providers. As competition grew, the cost of
broadband in Japan fell by about half and broadband speed increased
33-fold. In 1996, a similar measure to allow access to phone company lines
was strongly endorsed by Congress, but federal support fell through in 2003
and 2004 when the Federal Communications Commission and a federal court
ruled major companies do not have to share their phone or fiber lines
with competitors, and the Bush administration did not appeal the decision.
"The Bush administration largely turned its back on the Internet, so we
have just drifted downwards," says former U.S. diplomat to Japan Thomas
Bleha.
Point, Click ... Eavesdrop: How the FBI Wiretap Net
Operates
Wired News (08/29/07)
Documents recently declassified under the Freedom of Information Act
indicate that the FBI has constructed a point-and-click surveillance system
capable of instantaneously tapping into almost any communications device.
The Digital Collection System Network (DCSNet) links FBI wiretapping
stations to switches run by landline operators, Internet-telephony
providers, and cellular companies. The system consists of software that
captures, filters, and stores phone numbers, calls, and text messages, and
directly connects FBI wiretapping rooms throughout the nation to a
wide-ranging private communications network. The outposts are connected
via a private, encrypted backbone that is independent of the Internet and
is run by Sprint for the government. Telecoms' installation of
telephone-switching gear that meets wiretapping standards was mandated in
1994 with the passage of the Communications Assistance for Law Enforcement
Act (CALEA), thus giving the FBI the ability to log directly into the
telecoms' networks. CALEA's coverage was recently extended to require
broadband ISPs and certain VoIP companies to enable their networks for
federal wiretapping. Since telecoms became more wiretap-friendly, the
volume of criminal wiretaps rose 60 percent from 1,150 to 1,839 in the past
10 years, and in 2005 92 percent of those wiretaps targeted cell phones,
according to a 2006 report. CALEA wiretaps and the processing of all calls
collected by DCSNet have racked up substantial costs, and security experts
are worried that the system introduces new vulnerabilities to the
telecommunications network. The declassified documents point to numerous
flaws in DCSNet that Columbia University computer science professor Steven
Bellovin finds appalling, especially because they indicate the FBI is
ignorant of insider threats. "The underlying problem isn't so much the
weaknesses here, as the FBI attitude towards security," he says.
OOPSLA 2007 Speakers -- An Embarrassment of Riches
Association for Computing Machinery (08/29/07)
The brightest minds in computer science will be at OOPSLA 2007, convening
in chic, sophisticated, metropolitan Montreal, Quebec. Peter Turchi,
author of "Maps of the Imagination," will use examples from writing and
cartography to explore the challenges of discovery, the challenges of
presenting those discoveries, and how the presentation itself is often the
key to discovery (think Impressionism). Jim Purbrick and Mark Lentczner,
also known as Babbage Linden and Zero Linden, will step away from reality
into the virtual world of Second Life. Two Turing Award winners, Fred
Brooks and John McCarthy, will be giving talks. Brooks will talk about
collaboration and telecollaboration in design. McCarthy will be presenting
Elephant 2000, a proposed programming language good for writing and
verifying programs that interact with people
(e.g., transaction processing) or interact with programs belonging to other
organizations. Gregor Kiczales will deliver a talk on how different
contexts affect developer perspectives on software. David Parnas examines
the problem of documenting the behavior of systems and their components,
and how precise documents can make validation and verification easier.
Pattie Maes, honored with the title "Global Leader for Tomorrow" by the
World Economic Forum, will be speaking from her background in media arts
and sciences. Also planned is a special reprise of Gabriel and Steele's
keynote "50 in 50" from HOPL III. OOPSLA 2007 runs October 21 through 25,
with reduced registration rates available until September 13, 2007.
Research at K-State, Partner Institutions, to Help
Homeland Security Make Sense of the Abundant Information in the Public
Domain
Kansas State University News (08/27/07) Barcomb-Peterson, Erinn
Kansas State University associate professor of computer and information
sciences William Hsu and other computer scientists with expertise in data
mining are contributing to a project to develop technology that would make
automated Internet searches simpler and more productive. "We're helping to
develop the next generation of Web search and crawling," Hsu says. "The
Department of Homeland Security wants to pull information that's available
to anyone in the public domain, like millions of articles from sources like
CNN and Al-Jazeera, and monitor them for security." Hsu's work for the
Department of Homeland Security project will focus on eliminating ambiguity
in Internet searches, including improved name recognition. The goal is to
create a search engine that could, for example, differentiate between
homeland security as a concept and Homeland Security as a government
agency. "The goal is to develop an automated system that can pick out
al-Quaida as an organization, Kandahar as a place, and Osama bin Laden as a
person, based upon rules developed from previously-seen documents," Hsu
says. The research will also work on solving another problem with finding
information on the Internet--inefficient crawling. Search engines provide
up-to-date results by looking through Web pages and archiving them, a
process known as crawling and sometimes referred to as "crawling in the
dark." Hsu says research in this area could create search engines that
could anticipate keywords and create virtual neighborhoods of information
by making connections between bits of information based on the results of
similar searches.
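
A minimal sketch of the kind of rule-based entity recognition Hsu
describes, in Python; the gazetteer entries and the capitalization
pattern below are hypothetical stand-ins, since the project's actual
rules (learned from previously seen documents) are not public.

    import re

    # Hypothetical gazetteer; a real system would derive entries and
    # rules from previously seen, hand-labeled documents.
    KNOWN_ENTITIES = {
        "al-Quaida": "ORGANIZATION",
        "Kandahar": "PLACE",
        "Osama bin Laden": "PERSON",
    }

    # A case-sensitive rule that separates "Homeland Security" the
    # agency from "homeland security" the concept.
    AGENCY = re.compile(r"\b(?:Department of )?Homeland Security\b")

    def tag_entities(text):
        """Return (surface form, entity type) pairs found in text."""
        found = [(name, etype) for name, etype in KNOWN_ENTITIES.items()
                 if name in text]
        found += [(m.group(0), "ORGANIZATION") for m in AGENCY.finditer(text)]
        return found

    print(tag_entities("Osama bin Laden was reported near Kandahar."))
    # -> [('Kandahar', 'PLACE'), ('Osama bin Laden', 'PERSON')]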
US Suspends Vast ADVISE Data-Sifting System
Christian Science Monitor (08/28/07) P. 1; Clayton, Mark
The Analysis, Dissemination, Visualization, Insight, and Semantic
Enhancement (ADVISE) system was used from 2004 to mid-2006 by the U.S.
Department of Homeland Security as a data-mining tool to hunt terrorists,
weapons of mass destruction, and biological weapons using Americans'
personal data and with little regard for federal privacy laws. Now, the
$42 million system capable of processing trillions of pieces of data has
been put on hold and may be terminated following data-privacy reviews,
according to a report submitted to Congress by the DHS' Office of Inspector
General (OIG). The OIG found that ADVISE failed to account for federal
privacy laws during its design, and the system used live data, including
personally identifiable information, from multiple sources without taking
steps required by federal law, and DHS' own internal guidelines, to prevent
the data from being misused. The failure to implement privacy safeguards
in the ADVISE program appears to be the result of confusion and
miscommunication about privacy requirements by ADVISE program managers and
the DHS' privacy office. The privacy office argues that until the ADVISE
system was connected to data it was not a data-mining program that needed
privacy review. However, unknown to the privacy office, the ADVISE pilot
programs had been operational and using real personal data for about 18
months before the privacy office made the report to Congress, the OIG
discovered. The DHS has not disclosed what type and how much personal data
was used, but DHS science and technology directorate spokesman Larry
Orluskie acknowledges that ADVISE may have been "too zealous in its
testing." Orluskie says the ADVISE system is back on track, though he is
unsure if the privacy assessment was complete or if operation had resumed.
The damage may be done, however, as the failure to follow privacy laws has
reduced interest in ADVISE within the DHS, the OIG reports.
C.U. Researcher Develops Info Sharing Application
Cornell Daily Sun (NY) (08/29/07) Manapsal, Elizabeth
Cornell Computer Science Department researcher Sandy Payette is founder
and co-director of Fedora Commons, an open-source software application that
could potentially revolutionize how scholars, institutions, and libraries
share information. Payette plans to use Fedora Commons to build an online
system that supports open collaboration between software developers and Web
site designers that could be used as a template for storing and preserving
different types of data. Fedora, an acronym for Flexible Extensible
Digital Object Repository Architecture, was created as part of Payette's
research in the late 1990s and is currently used by libraries, museums, and
universities as a way to manage content-based systems. Payette now wants
to expand Fedora to include open access publishing, eScience, and
eScholarship. "The idea is that this software is built by a community of
stakeholders who have a personal investment in its ongoing evolution," says
Payette. Fedora-backed systems could provide students with a level of
scholarship not available on open-access sites such as Wikipedia. While
both can be accessed by anyone, Wikipedia is an open-access content Web
site, whereas Fedora is an open-source software template for content
management, and the content on Fedora-based sites undergoes rigorous
examination and scrutiny. "Fedora can enable content management and
preservation," says Fedora Commons communications and media director Carol
Minton Morris. "If a student has a good idea on how to build a Web service
of some kind and wanted to use it, they could use it like anyone else."
Networking the Hudson River
Technology Review (08/29/07) Sauser, Brittany
IBM and the Beacon Institute will work with several other research
institutions to develop an environmental-monitoring system for all 315
miles of the Hudson River. The project entails deploying a network of
sensors that will collect biological, physical, and chemical information
and transmit it to a central location. Some sensors will be suspended
from buoys or fixed in place on the riverbed, but a few will be mounted on
robotic underwater vehicles developed by Rensselaer Polytechnic Institute
(RPI) and the Woods Hole Oceanographic Institution, both contributors to the
project. "In terms of having an integrated network of sensors, and given
the magnitude of it for the Hudson River, this project is without a doubt a
huge advancement and on a much larger scale than anything that has been
done before," says Sandra Nierzwicki-Bauer, a member of the Beacon
Institute and director of the Darrin Fresh Water Institute at RPI. The
massive amount of data collected by the extensive network will be analyzed
by a new data acquisition and analysis system developed by IBM. The system
contains both distributed-processing hardware and analytical software, and
is designed to receive heterogeneous data from multiple sources and process
it in real time. The software is capable of learning and recognizing
patterns and can prioritize processing power for useful data. If a data
stream shows only minor variations, the system automatically redirects
resources away from that stream. The system also has visualization
technologies
to create a virtual model of the river and simulate its ecosystem in real
time.
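
A toy sketch of that prioritization idea, in Python, assuming a simple
variance heuristic; IBM's actual analytical software and its learned
models are not described in the article, so treat this as illustration
only.

    import statistics

    def allocate_processing(streams, window=20):
        """Split processing power across sensor streams in proportion
        to how much each stream's recent readings vary; streams showing
        only minor variation are de-prioritized."""
        variances = {name: statistics.pvariance(readings[-window:])
                     for name, readings in streams.items()}
        total = sum(variances.values()) or 1.0
        return {name: v / total for name, v in variances.items()}

    readings = {
        "salinity_buoy_12": [8.0, 8.1, 8.0, 8.1],    # steady stream
        "turbidity_buoy_3": [5.0, 9.5, 2.1, 14.8],   # volatile stream
    }
    print(allocate_processing(readings))
    # the volatile turbidity stream receives nearly all of the share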
Are Drivers Ready for High-Tech Onslaught?
CNet (08/28/07) Lombardi, Candace
High-tech automobile options such as self-parking, auto-braking,
touch-screen displays, and Bluetooth communications, currently only
available in luxury cars, are becoming available in lower-cost models, but
some experts are concerned that all that technology could overwhelm
drivers. Northwestern University professor Don Norman says much of the
technology in cars is beneficial, but it can also be confusing, which is
dangerous in an automobile. Norman says automakers will have to teach
drivers how to use these new tools, and the interface will be drastically
different from familiar personal computer displays, often requiring no-look
coordination. "It's amazing how much of this is designed by engineers who
have no real understanding of the way average, everyday people behave,"
Norman says. Some early attempts to reinvent the driver-car relationship
have been disastrous, such as BMW's iDrive, which used an iPod-like click
wheel to control air conditioning, heating, navigation, and communications,
and was so complicated it spurred nicknames such as iCrash. "The last
thing you want to do is drive and push a bunch of buttons," says Volkswagen
of America's technical strategy manager Frank Weith. "If you can
manage--not through voice commands and keywords, but through natural
speech--that will be the most effective way to manage the information
that's in your vehicle." Weith also sees communications technology
changing the role of the car, including streaming real-time traffic data
into cars' navigational systems. As technology makes the roads safer, it
could also create potentially hazardous scenarios. For example, drivers
may turn off safety devices because they feel they are a nuisance, similar
to how an organ transplant patient stops taking his medicine because he
feels healthy, Norman says.
The Next Dimension of Digital
Paramus Post (NJ) (08/28/07) Sidener, Jonathan
ACM's SIGGRAPH conference is known for presenting the most advanced
technology in digital media, and the conference this year highlighted some
technologies that could become the future of television. A 3D holographic
image known as the jogger allowed attendees to walk around the display and
view the video from every angle. Another exhibit allowed people to put
their hand into an empty box and manipulate a virtual jack-in-the-box using
a virtual copy of their hand. Also on display was a flat-panel television
that viewers could use to watch 3D images without wearing special glasses.
Texas Instruments showcased another television capable of 3D images with
the help of special battery-powered glasses. TI says the 3D capability can
easily be added to its DLP HDTVs and it expects to sell 1 million 3D-ready
TVs by the end of next year. Researchers at the University of Southern
California's Institute for Creative Technologies, Fakespace Labs, and Sony,
the creators of the jogger, say if hardware prices continue to fall at
current rates, the technology used in the jogger could be available for
home use in 10 years. The jogger, technically called an "interactive
360-degree light field display," is created using modified off-the-shelf
hardware, mainly a digital projector that sends 5,000 frames per second
onto a rapidly spinning mirror. A SIGGRAPH spokesman says it is not a
question of whether the future will be in 3D and holographic, but whether
holograms and 3D images will be viewed from the wall, like traditional TVs,
or from the middle of the room with 360-degree viewing.
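
For a rough sense of the jogger's geometry (the article gives the
5,000-frame-per-second figure but not the mirror's rotation speed, so
the rotation rate here is an illustrative assumption): a projector
feeding 5,000 frames per second to a mirror spinning at, say, 20
revolutions per second would paint 5,000 / 20 = 250 distinct views per
revolution, or one view per 360 / 250 = 1.44 degrees of viewing angle
around the display.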
Feds Seeking Input on Networking Research Plan
Computing Research Association (08/27/07) Harsha, Peter
The National Coordinating Office for Networking and Information Technology
Research and Development (NITRD) is revising its draft plan for advanced
networking research and development, and is seeking comments, suggestions,
and additions from the networking research and development communities by
Sept. 30, 2007. The Draft Federal Plan for Advanced Networking Research
and Development will have a significant impact on the direction of
government networking research priorities over the next seven to eight
years. Comments from universities, federal laboratories, commercial
researchers, and developers will give NITRD a better understanding of
networking needs in the years to come and how to address those priorities.
The input will influence the final report, which will guide federal
agencies for fiscal year 2010 and the following budget planning cycles. A
Draft Interim Report appeared May 15, 2007. The report and instructions
for providing comments can be found at the NITRD Web site.
Stereotypes Turn Girls Off to Math, Science
LiveScience (08/27/07)
The National Science Foundation's Research on Gender in Science and
Engineering program found that although pop culture presents an image of
girls as just as interested in science and math as boys, new studies
suggest otherwise, and several myths about girls and science endure. The
first myth is that by the time children start school, girls are already
less interested in science than boys. Although a recent study of fourth
graders found that 66 percent of girls and 68 percent of boys said they
enjoy science, another study found that by the second grade, when asked to
draw a scientist, both boys and girls tend to draw a male in a lab coat.
Any female scientists that are drawn tend to look severe and rather unhappy.
This stereotype of isolated, stern, and unhappy female scientists is so
persistent in society that by the eighth grade, boys are twice as
interested in STEM (science, technology, engineering, and math) subjects
as girls. Another myth is that interventions intended to
interest girls in STEM might cause boys to become uninterested. In
reality, such interventions have been found to interest both girls and
boys, because when girls are shown images of female scientists, boys
realize that they can succeed in science as well. Yet another myth is that
at the college level, changing STEM classes could water down important
"sink or swim" courses. In reality, the process of "weeding out" weaker
science students, especially in more quantitative disciplines,
disproportionately weeds out women, not necessarily because women fail
more often, but because women often perceive Bs as inadequate grades and
drop out while men with Cs stay in the program. Effective mentoring and
"bridge programs" are needed to prepare students for challenging
coursework. Changing the curriculum often leads to better recruitment and
retention of both women and men in STEM programs.
A Computer Simulation Shows How Evolution May Have
Speeded Up
Weizmann Institute of Science (08/28/07)
Researchers at the Weizmann Institute of Science's Molecular Cell Biology
and Physics of Complex Systems Departments have developed computer
simulations that mimic natural evolution, allowing the researchers to
observe and manipulate the evolutionary process. Nadav Kashtan, Elad Noor,
and Uri Alon simulated a population of genomes evolving over time toward a
given goal. The researchers found in the simulation that changing
environmental conditions sped up the evolution of the genome. More complex
goals, which would take more generations to reach under fixed conditions,
benefited even more from changes in the goal. The evolution simulations ran
fastest when changes followed a
pattern similar to what the researchers believe may occur in nature. In
previous research, Kashtan and Alon showed that evolution may frequently be
modular, with individual parts changing rather than the entire organism
changing at the same time. The researchers theorized that the forces acting on
evolution may be modular as well, and that subgoals of evolution may change
as the forces do, without changing the primary goal. "We saw a large
speedup, for instance, when we repeatedly exchanged an 'OR' for an 'AND' in
the computer code defining our goals, thus changing the relationship
between subgoals," Kashtan says. The main purpose of the research was to
find answers to theoretical questions of evolution, but it may have some
practical implications as well, particularly in engineering fields where
evolutionary tools are used for systems design, and in computer science by
providing a possible way to speed up optimization algorithms.
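
A minimal genetic-algorithm sketch of the flavor of experiment described,
in Python; the bit-string genome, the two subgoals, and the AND/OR
alternation below are illustrative stand-ins, not the Weizmann team's
actual simulation.

    import random

    GENOME_LEN, POP, GENS, SWITCH_EVERY = 16, 100, 2000, 50

    def fitness(genome, use_and):
        # Two subgoals over halves of the genome; the top-level goal
        # combines them with AND or with OR, alternating over time.
        g1 = sum(genome[:8]) == 8       # subgoal 1: first half all ones
        g2 = sum(genome[8:]) == 8       # subgoal 2: second half all ones
        hit = (g1 and g2) if use_and else (g1 or g2)
        return sum(genome) + (10 if hit else 0)

    def evolve():
        pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
               for _ in range(POP)]
        for gen in range(GENS):
            use_and = (gen // SWITCH_EVERY) % 2 == 0   # the goal changes
            pop.sort(key=lambda g: fitness(g, use_and), reverse=True)
            survivors = pop[:POP // 2]
            children = []
            for parent in survivors:
                child = parent[:]
                child[random.randrange(GENOME_LEN)] ^= 1   # point mutation
                children.append(child)
            pop = survivors + children
        return pop[0]

    print(evolve())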
Microsoft's UI Ambitions Not Limited to Tables: A New
Windshield HUD Patent
Ars Technica (08/27/07) Haselton, Todd
Microsoft has filed a patent for an "adaptive heads-up user interface for
automobiles" that could make driving a car more like flying a fighter jet.
The automobile heads-up display (HUD) would replace the windshield and
display navigational information, car speed, weather information,
information on the driver's health, what music is playing, and possibly
where the nearest parking space is. The display could also be used to
control various functions in the car, such as climate control and a
communication system, and to serve as a general input device. Items on the
HUD could be viewed and controlled much like windows on a computer monitor,
with a hidden state, a collapsed state, a preview, and a full-screen view.
Microsoft says the HUD is intended to improve the driver's awareness of his
or her
surroundings by alerting the driver to information on road conditions or
accidents. The driver would be able to prepare and monitor road and
vehicle conditions without having to look down from the windshield. The
HUD would also be able to display information about the driver, like heart
rate, blood pressure, and body temperature, and would be able to defuse
potentially dangerous situations caused by the driver, for example, by
playing soothing music to calm road rage or sounding alarms to wake a
sleepy driver.
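
A minimal sketch of those display states, in Python; the class and item
names are hypothetical, since the patent filing describes the states but
not an implementation.

    from enum import Enum

    class HudState(Enum):
        HIDDEN = 0      # not drawn on the windshield
        COLLAPSED = 1   # reduced to an icon at the display's edge
        PREVIEW = 2     # small summary pane
        FULL = 3        # full-screen view

    class HudItem:
        def __init__(self, name):
            self.name, self.state = name, HudState.HIDDEN

        def alert(self):
            # Urgent information (an accident ahead, say) is promoted
            # so the driver never has to look down from the windshield.
            self.state = HudState.PREVIEW

    navigation = HudItem("navigation")
    navigation.alert()
    print(navigation.name, navigation.state)  # navigation HudState.PREVIEW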
Metasearch Engine Digs Deeper, Faster for News
University of Illinois at Chicago (UIC) (08/22/07)
A metasearch engine that searches more news sites over a wider geographic
area than other Internet search tools is now available. The new search
engine, Allinonenews, queries more than 1,800 news search engines in about
200 countries and territories. "A metasearch engine's results contain all
the content of underlying search engines, giving it much greater coverage,"
says University of Illinois at Chicago computer science professor Clement
Yu, who developed the metasearch engine. "Ours is designed to connect
automatically to each search engine and retrieve in a uniform format."
Allinonenews is able to produce timely searches because it filters out
exact duplicate results and automatically peruses news sites where the
answer is likely to be found. Allinonenews will tunnel through search
engine databases to find information from "deep Web" pages, and will also
perform a semantic match of retrieved documents.
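
A skeletal sketch of the metasearch pattern described, in Python; the
engine connectors below are hypothetical stand-ins for Allinonenews's
automatic connections to its 1,800-plus underlying engines.

    from concurrent.futures import ThreadPoolExecutor

    def metasearch(query, engines):
        """Send the query to every underlying engine in parallel,
        normalize results to (title, url), and filter exact duplicates."""
        with ThreadPoolExecutor(max_workers=len(engines)) as pool:
            result_lists = list(pool.map(lambda e: e(query), engines))
        seen, merged = set(), []
        for results in result_lists:
            for title, url in results:
                if url not in seen:        # exact-duplicate filtering
                    seen.add(url)
                    merged.append((title, url))
        return merged

    # Hypothetical connectors; real ones would issue HTTP requests and
    # translate each engine's response into the uniform format.
    engine_a = lambda q: [("Flood coverage", "http://a.example/1")]
    engine_b = lambda q: [("Flood coverage", "http://a.example/1"),
                          ("Storm update", "http://b.example/2")]
    print(metasearch("flood", [engine_a, engine_b]))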
NSF Grant Funds UNT Study of Global Software Development
Teams
University of North Texas News Service (08/22/07)
A National Science Foundation grant will enable University of North Texas
(UNT) computer science professor Kathleen Swigger to study a team of
programmers as members located in different countries collaborate to write
software. The project involves students at UNT and in Turkey, Panama, and
the United Kingdom who are engaged in large software projects. By studying
how the students interact and actually work, Swigger expects to provide a
better understanding of the concept of distributed programming. "Students
need to know how to use the technology to work in culturally mixed and
geographically distributed work teams, because distributed software
development is becoming the norm," says Swigger. She plans to develop
curriculum materials that will teach students how to work with people from
different cultures who are in different time zones.
The Color of Trustworthiness
Jerusalem Post (08/18/07) Siegel-Itzkovich, Judy
University of California, Santa Cruz associate professor of computer
engineering Luca de Alfaro has developed a program that analyzes
Wikipedia's entire editing history and estimates the trustworthiness of
each page. De Alfaro's program uses the longevity of the content to learn
which contributors are the most reliable. "The idea is very simple," de
Alfaro says. "If your contribution lasts, you gain reputation. If you
contribution is reverted [to the previous version], your reputation falls."
The program analyzes the user's editing history to assign a reputation
score. The trustworthiness of newly inserted text is computed as a
function of the reputation of its author. As more contributors examine the
text, their reputation contributes to the text's score. Working from the
copies of Wikipedia that the site distributes, the program is able to analyze
Wikipedia's seven-year editing history in about a week, and correctly flags
more than 80 percent of edits that turn out to be poor. After the initial
backlog of edits has been processed, de Alfaro says updating reliability
scores in real time should be relatively simple. The program prominently
displays the trustworthiness of each article, but keeps individual
contributors' scores hidden to avoid creating a competitive atmosphere that
would detract from Wikipedia's collaborative culture.
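
A toy sketch of the reputation rule de Alfaro outlines, in Python; the
numeric constants are arbitrary illustrations, and the actual system
computes author reputation and text trust with considerably more nuance.

    reputation = {}   # author -> reputation score

    def record_edit(author, survived):
        """Lasting contributions raise an author's reputation;
        reverted ones lower it."""
        r = reputation.get(author, 1.0)
        reputation[author] = r + 0.1 if survived else max(0.0, r - 0.5)

    def initial_trust(author):
        """New text starts at a trust level computed as a function
        of its author's reputation."""
        return min(1.0, reputation.get(author, 1.0) / 10.0)

    def endorse(text_trust, reviewer):
        """Each later contributor who leaves the text intact adds a
        share of their reputation to the text's trust score."""
        return min(1.0, text_trust + reputation.get(reviewer, 1.0) / 20.0)

    for survived in (True, True, False, True):
        record_edit("alice", survived)
    print(reputation["alice"], initial_trust("alice"))  # roughly 0.8 0.08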
The Tech Lab: Vint Cerf
BBC News (08/24/07) Cerf, Vint
The Internet is still very youthful in terms of being an intellectual
phenomenon, but its relative youth is greatly offset by its benefits and
people's reliance on it, argues Google's chief Internet evangelist Vint
Cerf. He contends that the Net is on the cusp of becoming "the greatest
communications platform humanity has ever known." The online population
will grow as the Internet expands to other corners of the globe, and as new
kinds of content that can be accessed by a wider variety of devices become
available, Cerf predicts. Likewise, greater reliance on the Net and its
services calls for higher levels of robustness and security, and the author
maintains that key areas of focus for near-term Internet development will
be improving the resilience and resistance to attack of the Domain Name
System and other vital infrastructure; he expects a higher priority to be
assigned to the introduction of DNSSEC and the digital signing of address
space by the Regional Internet Registries. Cerf says the Internet's
expansion faces dwindling address capacity, which will require a transition
to the IPv6 address space, no easy matter. Cerf anticipates the
increased importance of search engines, while various factors will
contribute to the problem of "information decay," in which digital objects'
accessibility diminishes as their source software ages. "With home, car
and office appliances all online and rich sensor networks as part of the
landscape of the Internet, it is easy to predict that people will be
looking for online services to manage these devices and systems, regardless
of where they happen to be," reasons Cerf. He says the spread of mobile
devices and improvements in their capability to access the Web will step up
information access, especially in the developing world.
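
To put the address transition in perspective: IPv4's 32-bit addresses
allow 2^32, or about 4.3 billion, unique addresses, while IPv6's 128-bit
addresses allow 2^128, roughly 3.4 x 10^38, which is why the transition
relieves the capacity pressure Cerf describes.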
The Bytes Behind Biology
Scientist (08/07) Vol. 21, No. 8, P. 44; Gawrylewski, Andrea
Hundreds of scientific papers on a wide range of subjects use data
produced by the Pittsburgh Supercomputing Center (PSC), which boasts a
system capable of performing 21 trillion calculations per second on the
strength of more than 4,000 processors. PSC's largest computer is BigBen,
which can execute in mere weeks tasks that would take the fastest current
desktop computers five to 10 years to carry out. BigBen was built for
close to $10 million provided by the National Science Foundation, while PSC
received $52 million to develop TeraGrid, a network of nine computing
centers with 280 teraflops of collective computing muscle. BigBen's
processor time is divided among the 2,000 researchers who utilize the
computer. PSC codirector Ralph Roskies says TeraGrid was used by a
scientist at the University of Illinois, Urbana-Champaign, to generate
detailed images of the nuclear pore complex, one of the most significant
findings the PSC supercomputer has contributed to. However, PSC
researchers must be cognizant of the fact that the perfect images produced
through supercomputing are simulations and not reality. More than 600
papers published in 200 journals have been produced by TeraGrid. In
September, the NSF will disclose whether it has approved PSC's request for
$200 million to obtain the hardware for a new computing system with 1
petaflop of processing power, which if built would likely be the fastest
system on earth, according to PSC director of strategic applications
Nicholas Nystrom.
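
For scale: 21 trillion calculations per second spread across more than
4,000 processors works out to roughly 21 trillion / 4,000, or about 5
billion, calculations per second per processor, and the proposed
1-petaflop system would be about 1,000 / 21, or roughly 48, times faster
than BigBen.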