'Radical Rethinking' of Internet Routing Under Way
Network World (09/28/07) Marsan, Carolyn Duffy
A new networking architecture that would upgrade the Internet's
scalability to support perhaps billions of new users in developing nations
is being sought by the Internet Research Task Force. "The new focus of the
[IRTF's Routing Research Group] is to work on a possible routing
architecture that includes new ways of addressing, new ways of doing
routing for the global Internet," says working group co-chair Tony Li.
"The IP address has both the identification of the node and the
location of the node. The question becomes: Can we separate the
identification from the locator semantics, and can we still run an Internet
with that kind of architecture?" Exponential growth in the Border Gateway
Protocol (BGP) routing table is a source of concern for experts because of
the demands it places on the processing power and memory of the Internet's
central routers. "What CIOs really care about is the cost of their
Internet connections, and if the cost of the service providers goes up
because the routing table becomes unwieldy, that will lead to incremental
costs for everybody," notes Li. Among the benefits that a slowdown in
routing table growth would offer enterprise network operators is a simpler
way to multihome their networks. Two alternative routing
proposals have drawn the most attention at the working group thus far. The
Locator/ID Separation Protocol delineates a method for splitting Internet
addresses into endpoint identifiers and routing locators through the use of
tunnel routers, while in the Six/One proposal each service provider would
give provider-dependent IP addresses to the enterprise, while hosts would
employ addressing spaces from all providers on an interchangeable basis.
Another possibility is the Routing Research Group's complete jettisoning of
BGP.
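The locator/ID split the working group is weighing can be made concrete with a small sketch. The following Python fragment is purely illustrative (the EID/RLOC names and mapping table are invented, and this is not actual LISP protocol code): a tunnel router looks up the current routing locator for a stable endpoint identifier, so renumbering or multihoming changes only the mapping, never the host's identity.

```python
# Illustrative mapping from stable endpoint identifiers (EIDs) to
# changeable routing locators (RLOCs). All names here are hypothetical.
MAPPING = {
    "eid:host-a": "rloc:provider-1",
    "eid:host-b": "rloc:provider-2",
}

def encapsulate(eid: str, payload: str) -> tuple:
    """Look up the current locator for an EID and 'tunnel' the payload
    toward it, as an ingress tunnel router might."""
    rloc = MAPPING[eid]
    return rloc, payload

def rehome(eid: str, new_rloc: str) -> None:
    """Moving to a new provider touches only the mapping entry; the
    endpoint identifier -- and every session bound to it -- is unchanged."""
    MAPPING[eid] = new_rloc
```

The point of the split is visible in `rehome`: the global routing system sees only locators, so the table churn caused by site renumbering and multihoming stays out of the core.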
US Video Shows Simulated Hacker Attack
Associated Press (09/27/07) Bridis, Ted; Sullivan, Eileen
A video made by the Idaho National Laboratory for the Homeland Security
Department depicts an electrical turbine catching fire to illustrate what
could happen if hackers launched an attack on the U.S. electrical grid.
The videotaped simulation, known as the "Aurora Generator Test," was
produced by researchers probing a hazardous vulnerability in U.S. utility
companies' computers; the programming flaw has since been repaired.
According to experts, the electrical equipment that runs the country's
water, power, and chemical plants is "very old technology." Moreover,
security issues were not taken into consideration when such systems were
originally designed. Years ago, top telecommunications advisers to
President Bush asserted that an organization could electronically carry out
an attack on the electric power grid from a remote location and with a
great deal of anonymity. The Idaho National Laboratory confirmed such a
possibility, dubbing it "the invisible threat." However, other industry
experts note that criminals would require specialized information--such as
how to deactivate warning systems--to conduct such an attack. Regardless,
the Homeland Security Department and electrical companies have been
collaborating to improve security measures, and to date "we've taken a lot
of risk off the table," says Robert Jamison of the Homeland Security
Department. In addition, the Federal Energy Regulatory Commission put
forward a series of standards in July 2007 that, if implemented, would
safeguard the nation's electric power supply system from cyberattacks by
mandating the creation of plans and controls.
Ohio to Test Its 5 Voting Systems Before Primary in
March
New York Times (09/27/07) P. A24; Driehaus, Bob
Ohio Secretary of State Jennifer Brunner announced that all five voting
systems used in Ohio, as well as next-generation systems, will be tested as
part of an overall effort to identify and correct serious problems with the
security and reliability of voting machines in time for the presidential
primary in March. The $1.8 million study will take place at SysTest Labs
in Denver, with assistance from professors and graduate students from
Cleveland State, Pennsylvania State, the University of Pennsylvania, and
the University of California, Santa Barbara. "Part of what SysTest is doing
is studying operating procedures in 11 counties," says Brunner. "Cleveland
State will visit those same counties for a second point of view." A
bipartisan group of county elections officials will add an additional layer
of oversight. The Hart and Diebold systems decertified in California
following an extensive test ordered by California Secretary of State Debra
Bowen are also in use in Ohio, though Brunner emphasizes that with so
little time before the March 5 primary, much of the Ohio study will focus
on short-term solutions and safeguards. The results of the testing are
expected by mid-December.
MIT Launches Kerberos Consortium
MIT News (09/27/07) Richards, Patti
MIT on Thursday announced the launch of the Kerberos Consortium, a joint
effort on the part of industry and academia to create a universal
authentication program based on Kerberos to protect computer networks. "By
establishing the Kerberos Consortium, MIT seeks to permit Kerberos to
continue to grow and develop as a stable and universal 'single sign-on'
mechanism for the users of modern computer networks," says Kerberos
Consortium executive director Stephen Buckley. Kerberos Consortium chief
technologist Sam Hartman says the objective is to make Kerberos more useful
and available. "We foresee a day when Kerberos-based authentication and
authorization will be as ubiquitous as TCP/IP-based networking itself,"
Hartman says. One of the consortium's primary objectives is to provide the
solutions it promotes as open source reference implementations that can be
used by consortium members in their products and organizations without
licensing fees. "We see a number of our customers asking for open source,
stable, and interoperable single-sign on technology, based on the Kerberos
protocol," says Sun Microsystems director Kathy Jenks. "The MIT Kerberos
Consortium is an outstanding way to address our customers' requirements,
and a continuation of the work we have been doing within the Kerberos
community over the last several years."
Yale Scientists Make Two Giant Steps in Advancement of
Quantum Computing
Yale University Office of Public Affairs (09/26/07) Emanuel, Janet Rettig
Yale University scientists have accomplished two major steps toward
achieving true quantum computing--sending a photon signal on demand from a
qubit onto wires and transmitting the signal to a second, distant qubit.
Applied physics professor Robert Schoelkopf and physics professor Steven
Girvin have spent several years exploring the use of solid-state devices
resembling microchips for use in a quantum computer. Their breakthrough
means that quantum computing has moved past simply "having information" to
"communicating information." Previously, information in quantum systems
was only able to move from qubit to qubit. Schoelkopf and Girvin have
engineered a superconducting communication "bus" to store and transfer
information between distant quantum bits, the first step to making the
fundamentals of quantum computing useful, according to Schoelkopf. The
first breakthrough is the ability to produce and control single, discrete
microwave photons as the carriers of encoded quantum information. "In this
work we demonstrate only the first half of quantum communication on a
chip--quantum information efficiently transferred from a stationary quantum
bit to a photon or 'flying qubit,'" says Schoelkopf. "However, for on-chip
quantum communication to become a reality, we need to be able to transfer
information from the photon back to a qubit." The researchers accomplished
that in their second breakthrough by adding a second qubit and using the
photon to transfer a quantum state from one qubit to another.
Tech Giants Chart Research Goals
InfoWorld (09/26/07) Hines, Matt
Leading researchers from Cisco Systems, Hewlett-Packard, and Intel at this
week's EmTech Conference at MIT provided a glimpse into their top research
projects, which included work on power consumption, parallelism, and mobile
communications. Intel's vice president of research Andrew Chien says that
despite the rapid development of the mobile space, Intel feels that
wireless devices, applications, and service providers are not as intuitive
and seamless as they could be. Chien predicts that future mobile devices
will offer the "seamless presentation" of more useful information,
including tools that use geolocation and onboard sensors to give users
information about their surroundings. HP Labs senior vice president of
research Prith Banerjee says HP is working in mobility, green IT, and
parallelism simultaneously in an effort to make data centers more efficient
in general. Some of HP's research focuses on improving performance, while
other projects examine power consumption by building sensors into prototype
servers to reduce costs and problems associated with cooling massive
hardware systems. Banerjee also notes that HP is hoping to break through
some of the barriers surrounding parallel programming. "We're very aware in
our research of the challenges of making parallel software applications; we
need engineers who can start writing code designed for multi-cores and to
help transition software designed to run on a single processor," says
Banerjee, adding that the possible benefits could be enormous. "We're
imagining a world where you plug in a computer and all the applications work
automatically, and users don't have to worry about patches and updates."
Computer Science Faculty Explore Thermal-Aware
Computing
Virginia Tech News (09/25/07) Daniilidi, Christina
Virginia Tech associate professors of computer science Kirk Cameron and
Dimitrios Nikolopoulos will use a $350,000 National Science Foundation
Computer Science Research award to develop runtime software support for
proactive heat management in advanced computing systems. The reliability
of computer processors can degrade rapidly when a "thermal emergency"
occurs, or when the machine's temperature rapidly increases above a safe
level. Some high performance processors can consume up to 100 watts and
produce temperatures exceeding those of a hot plate. "What we want is to
reduce the heat produced by large systems with lots of components in close
proximity such as those in a data center," Cameron says. "By first
studying the way applications produce heat, our hope is to identify places
where we can reduce heat while maintaining the high-performance required by
users." Cameron's research focus is on determining the thermal effects
software has on a system, which is done primarily by observing the effects
of various power reduction strategies on processors and system thermal
behavior. Nikolopoulos is researching new thermal reduction techniques
applicable to parallel scientific applications and systems. The ultimate
objective is seeing if programs or system software can be modified to avoid
a thermal emergency or react to overheating and try to control it without
reducing system performance.
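The reactive half of such heat management can be sketched as a simple control step. This is an illustrative model with made-up thresholds and frequencies, not the researchers' actual runtime system: when a reading crosses a thermal limit, the clock is backed off; once the component cools well below the limit, the clock creeps back up.

```python
def thermal_step(temp_c, freq_mhz, limit_c=85.0,
                 step_mhz=100, min_mhz=600, max_mhz=2400):
    """One iteration of a reactive thermal controller (all constants
    are invented for illustration). Returns the next clock frequency."""
    if temp_c >= limit_c:
        # Thermal emergency: trade performance for heat immediately.
        return max(min_mhz, freq_mhz - step_mhz)
    if temp_c < limit_c - 10:
        # Comfortably cool: recover performance gradually.
        return min(max_mhz, freq_mhz + step_mhz)
    # In the hysteresis band: hold steady to avoid oscillation.
    return freq_mhz
```

The proactive approach described above would go further, using a model of how the running application produces heat to lower the frequency before the limit is ever reached.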
System Enables Any Digital Camera to Produce Interactive,
Multibillion-Pixel Panoramas
Carnegie Mellon News (09/26/07) Spice, Byron; Watzman, Anne
Carnegie Mellon University researchers, working with NASA Ames Research
Center scientists, have developed an inexpensive robotic device that allows
any digital camera to take gigapixel panoramic photographs, known as
GigaPans. The technology is being used by students to document their
communities and by the Commonwealth of Pennsylvania to make Civil War sites
accessible on the Web. The system uses a tripod-like mount to allow
digital cameras to take hundreds of overlapping images of landscapes,
buildings, or rooms. Software developed by Carnegie Mellon and Ames is
used to arrange the images in a grid and digitally merge them together to
create a single image that could contain tens of billions of pixels.
Carnegie Mellon has also created a Web site so users can upload and
interactively explore the panoramic images in any format. "An ordinary
photo makes it possible to cross language barriers," says Illah Nourbakhsh,
an associate professor in the School of Computer Science's Robotics
Institute. "But a GigaPan provides so much information that it leads to
conversations between the person who took the panoramas and the people who
are exploring it and discovering new details." Nourbakhsh hopes that
GigaPan will help develop a community of producers and users. "GigaPan is
not just about the vision of the person who makes the image," Nourbakhsh
says. "People who explore the image can make discoveries and gain insights
in ways that may be just as important."
Researchers Double Cell Phone Memory Through Software
Alone
Northwestern University (09/26/07) Fellman, Megan
Northwestern University and NEC Laboratories America computer engineers
have developed a technique that doubles the amount of usable memory on cell
phones and other embedded systems without changing the hardware or
applications on the device by altering the operating system software. The
researchers say the technique, dubbed CRAMES (compressed RAM for embedded
systems), has a minimal effect on performance and power consumption. "The
technology we've developed automatically takes data and reduces it to less
than half its original size without losing any information while the
embedded system is running," says Robert P. Dick, an assistant professor of
electrical engineering and computer science in Northwestern's Robert R.
McCormick School of Engineering and Applied Science. "It is like putting
twice as much memory in the phone without increasing its cost or power
consumption." CRAMES works by dividing the memory into two different
regions, one for regular data and one for compressed data. When an
application needs data from the compressed region, the system briefly
pauses the application while the operating system decompresses the data
and transfers it to the uncompressed region, where the application can
access it.
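The two-region idea can be sketched in a few lines. This toy model is illustrative only (the class name, page handling, and use of zlib are assumptions, not details of the CRAMES implementation): pages evicted from the normal region are held compressed, and an access decompresses them back, trading a little CPU time for effectively larger memory.

```python
import zlib

class CompressedRegion:
    """Toy model of a compressed-RAM region: swapped-out pages are
    stored compressed and restored losslessly on access."""

    def __init__(self):
        self._store = {}

    def swap_out(self, page_id, data: bytes) -> None:
        # Compress the page on its way into the compressed region.
        self._store[page_id] = zlib.compress(data)

    def swap_in(self, page_id) -> bytes:
        # Decompress on access and return the page to the normal region.
        return zlib.decompress(self._store.pop(page_id))

    def compressed_size(self) -> int:
        # Bytes actually occupied by the compressed region.
        return sum(len(v) for v in self._store.values())
```

On typical in-memory data, which is highly redundant, the compressed copy occupies well under half the original size, which is the effect the researchers describe.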
Gloomy Forecast for IT Work Force
eWeek (09/25/07) Mark, Roy
Participants in an Institute for a Competitive Workforce workshop this
week expressed serious doubt about America's ability to produce a skilled
technology workforce to remain competitive in the years to come.
Microsoft's Fred Tipson called the situation "dire," and panel moderator
James Whaley, president of the Siemens Foundation, said, "We can no longer
assume the talent pipeline will be here." Meanwhile, Judy Moog, national
program director of the Verizon Foundation, cited statistics that show the
reading skills of eighth graders, the quality of high school graduates, and
the literacy skills of the adult population are declining. Literacy is key
to competitiveness because of the vast amount of information people will
encounter over different devices, Moog said. Americans are failing to turn
basic digital know-how into advanced skills, forcing companies to rely
on the H-1B visa program to find tech workers elsewhere. Still, some
participants said the business community could do more to support school
and training programs that focus on digital literacy, math, and science
skills. Whaley added that a lifelong "earning account" would make it
easier for workers to update their skills from time to time.
Toward a Music Search Engine That Lets You Type in
Regular Words and Returns Songs
University of California, San Diego (09/26/07)
University of California, San Diego electrical engineers and computer
scientists are collaborating on a search engine that will allow people to
search for music using ordinary language descriptions. This "Google for
music" search engine allows users to enter searches such as "high energy
instrumental with piano," "funky guitar solos," or "upbeat music with
female vocals" and get corresponding music. Instead of manually entering
annotations for as many songs as possible, the UCSD researchers have
developed a series of algorithms that allow a computer to automatically
annotate songs. Before the computer can annotate songs, however, it needs
to be trained through a process of machine learning, which requires a
significant amount of data. The developers launched an online matching
game to collect data to use in the process. Much like image matching games
being used to create annotations for pictures, the Listen Game asks players
to describe a piece of music they just listened to, awarding points for
every matching answer. Annotations created by Listen Game players are used
to teach the computer how to annotate previously unheard songs. "If you
look at a music review, there are so many words that are not relevant, you
want to filter them out to get the quality training data, to get words that
are acoustically describing the song," says UCSD computer science master's
degree student David Torres, an author on the paper.
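The retrieval side of such a system can be sketched with simple tag counting. This is an illustrative stand-in, not the UCSD approach, which learns acoustic models so it can annotate songs it has never seen: here, player annotations are aggregated per song, and a free-text query scores songs by how often its words were used to describe them.

```python
from collections import Counter, defaultdict

class TagSearch:
    """Toy description-based music retrieval over crowd-sourced tags.
    (Class and method names are invented for illustration.)"""

    def __init__(self):
        self.tags = defaultdict(Counter)  # song -> word frequencies

    def annotate(self, song, words):
        """Record one player's description of a song."""
        self.tags[song].update(w.lower() for w in words)

    def search(self, query):
        """Rank songs by how strongly their tags match the query words."""
        q = [w.lower() for w in query.split()]
        scores = {s: sum(c[w] for w in q) for s, c in self.tags.items()}
        return sorted((s for s in scores if scores[s] > 0),
                      key=lambda s: -scores[s])
```

Agreement between players is what makes the tags trustworthy training data, which is why the Listen Game awards points only for matching answers.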
Online Biometrics Flaw Gives Hackers a 'Fake
Finger'
New Scientist (09/24/07) Ananthaswamy, Anil
Researchers in Germany have discovered that the "fuzzy vault"
cryptographic scheme demands too much computing power to be practical and
can be broken in a day using a desktop computer. The biometrics strategy
was seen as a way
for people to use their fingerprints to log into online bank, email, and
other accounts. A more advanced level of cryptography, the "fuzzy vault"
made the transmission of an encrypted fingerprint possible because the
print scanned by a user's PC would not have to look exactly like the match
stored by a Web site. The system is designed to store a user's fingerprint
on a secure database as a list of coordinates for specific features, create
a list of number pairs comprised of the real coordinates and their
encrypted partners, and generate thousands of fake versions to disguise
them. Researchers had believed that a hacker would not be able to pick out
the real coordinates among the numerous fake pairs. However, an analysis
by Preda Mihailescu at the University of Göttingen that involved about 500
fake versions suggests otherwise. A hacker could use the coordinates to
create a fake finger and impersonate someone "for a lifetime," says
Mihailescu.
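The chaff-point construction at the heart of the fuzzy vault can be sketched as follows. This is a simplified illustration (a real vault additionally binds the genuine points to a secret polynomial, omitted here); it shows why the scheme's security rests entirely on an attacker being unable to tell genuine fingerprint coordinates from the fake ones.

```python
import random

def build_vault(real_points, n_chaff, seed=0):
    """Hide genuine minutiae coordinates among random chaff points.
    Coordinate ranges and counts are illustrative."""
    rng = random.Random(seed)
    chaff = set()
    while len(chaff) < n_chaff:
        p = (rng.randint(0, 300), rng.randint(0, 300))
        if p not in real_points:
            chaff.add(p)
    vault = list(real_points | chaff)
    rng.shuffle(vault)  # no positional clue about which points are real
    return vault

def match(vault, candidate_points, threshold):
    """Unlock if enough of a candidate scan's points appear in the vault --
    this tolerance is what lets two imperfect scans of one finger agree."""
    hits = sum(1 for p in candidate_points if p in vault)
    return hits >= threshold
```

The attack described above exploits exactly this structure: if statistical analysis can separate the genuine points from a few hundred chaff points, the attacker recovers the fingerprint template itself, and unlike a password it can never be revoked.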
IBM's Booch: The Developer's Developer
CNet (09/26/07) Bridgwater, Adrian
Grady Booch, one of IBM's most respected authorities on programming and
one of the original authors of the Unified Modeling Language, is an active
participant in IBM's quest for next-generation software engineering,
including experiments with Second Life and mashups. Booch believes that
Second Life and other virtual programming environments provide an
opportunity to change how companies do business, and could provide
significant economic and environmental benefits. For example, instead of
having to fly to conferences or meetings, a company representative could go
to the events virtually, saving the company thousands of dollars in airfare
and reducing the carbon footprint. "Mashups are on the edge, and
service-oriented architectures at the core are the economically viable and
technically viable choice for a large set of problems now," Booch says.
"Remember, the mashups themselves must be well-architected if they are to
endure, and remember also that SOA is really just a particular
manifestation of the classic message-based architectures." When asked how
he would respond to developers who distrust the simplicity of UML because
it does not convey the complexity of the underlying code, Booch responded
that "Models are always an abstraction of reality and, thus, to expect that
models address the complete truth of code and vice versa represents a
fundamental misunderstanding and misuse of models."
Happy Birthday, Sputnik! (Thanks for the Internet)
Computerworld (09/24/07) Anthes, Gary
The launch of the Soviet Union's Sputnik satellite half a century ago led
to the creation of the Advanced Research Projects Agency (ARPA) to "prevent
technological surprises" by funding research and development into new
technologies. IT innovations cultivated by ARPA included computer
graphics, graphical user interfaces, workstations, very large-scale
integration design, time sharing, parallel computing, microprocessors, and
the Internet, because leaders such as inaugural IT research director J.C.R.
Licklider had the talent to attract the brightest minds, a large budget,
and virtually no bureaucratic restrictions. "Licklider set the tone for
ARPA's funding model: Long-term, high-risk, high-payoff and visionary, and
with program managers that let principal investigators run with research
as they saw fit," notes Internet pioneer Leonard Kleinrock. Licklider
wrote an influential paper in 1960 that anticipated a "man-computer
symbiosis" that would be founded on such advances as "networks of thinking
centers," indexed databases, machine learning in the form of programs
capable of self-organization, dynamic linking of programs and applications,
speech recognition, tablet input and handwriting recognition, and improved
input/output techniques. Many experts warn that the United States may be
in for another technological bruising because the special culture that made
ARPA so successful in spawning revolutionary technologies is currently
absent from government. ARPA was renamed DARPA, and Kleinrock and other
researchers say the agency now concentrates primarily on practical,
short-term, classified military projects being developed by contractors
rather than university researchers. Such views are countered by Jan
Walker, a representative of DARPA director Anthony Tether, who insists that
"DARPA has not pulled back from long-term, high-risk, high-payoff research
in IT or turned more to short-term projects."
Wanted: Foreign Tech Workers
Fortune Small Business (09/26/07) Zimmerman, Eilene
Companies at every level of the tech industry rely on foreign workers to
fill the skills gap, but small companies, unable to hire away talent from
bigger companies, are particularly struggling due to the talent void.
Elizabeth Charnock, CEO of Silicon Valley software company Cataphora, tried
to hire several foreign tech workers to keep her small business growing.
"We did everything you're supposed to do," Charnock says. "We hired an
immigration lawyer. We filed the first day. It went into a lottery. Five
of our eight hires got visas." Two of the three potential employees who
did not get visas had already sold their homes in Europe to move to
California. "Their lives were turned upside down. They are stuck,"
Charnock adds, "and so are we." Charnock says the ability to hire H-1B
workers is critical to small businesses. A survey by the National
Foundation for American Policy for the National Venture Capital Association
found that one-third of private venture capital-backed companies say the
lack of visas influenced their decision to place more employees in
facilities abroad, and among respondents using H-1B visas nearly 40 percent
say the cap has "negatively impacted their company when competing against
other firms globally." Foreign-born engineers and computer scientists are
critical to growth and innovation in the United States, according to a
report by The Kauffman Foundation, Duke University, New York University,
and Harvard. The study found that immigrants founded one in every four
engineering and technology companies between 1995 and 2005, and by 2006,
those companies employed 450,000 workers and generated $52 billion in
revenue.
Purdue Project Will Help Attract Girls to
Computer-Related Careers
Purdue University News (09/19/07) Medaris, Kim
Purdue University professor and assistant head of the Department of
Computer and Information Technology Alka Harriger will lead SPIRIT
(Surprising Possibilities Imagined and Realized Through Information
Technology), a project designed to increase the number of young women
interested in computer-related studies. "Through the years, I and others
have witnessed the male-to-female student demographic shift from 50-50 to
90-10," Harriger says. "When our 2004 freshman incoming class of nearly
100 students included just one female, it was like a slap in the face to
wake up and take action." Funded by a $1.19 million grant from the
National Science Foundation, Harriger, along with associate professors Kyle
Lutes and Buster Dunsmore, will work to educate high school teachers and
counselors about a variety of options available to women in computer
fields, erase misconceptions people have about careers in the computer
industry, and instruct participants on how to use computer software to
create storyboards that convey technical material in an interesting and
engaging manner. SPIRIT will also include summer programs that will bring
high school teachers, counselors, and students together to teach them how
to use Alice, a 3D interactive software program created at Carnegie Mellon
University designed to help students understand concepts in science,
technology, engineering, and math. "We've found that a lot of girls are
turned off by careers in computer-related fields due to the misconception
that these jobs are boring, that they don't interact with anyone, that they
do the same work every day, and that people who hold these jobs don't
benefit society," Harriger says. "With this project, we'd like to present
a different view of information-technology careers and show girls that
there is a place for them in information technology and that they can make
a difference."
Does Our Universe Allow for Robust Quantum
Computation?
Science (09/28/07) Vol. 317, No. 5846, P. 1876; Bacon, Dave
The problem of quantum decoherence is a thorny one for scientists seeking
a large-scale quantum computer, although a theoretical solution exists in
the form of a "threshold" theorem stating that multiple quantum systems can
be employed to simulate a single quantum system that is free from error,
writes Dave Bacon of the University of Washington's Department of Computer
Science & Engineering. But he notes that the theorem excludes the question
of whether robust quantum computation is permitted by the universe. This
question is compounded by the fact that the number of experiments required
to characterize the properties of quantum systems useful for fault-tolerant
computation increases exponentially with the number of quantum systems.
Bacon points out that some researchers have proposed a new scheme involving
the symmetrization of a quantum system's time evolution to eliminate
undesirable operations, so that what remains is a smaller subset of more
relevant data. The lab method for characterizing a quantum system's
evolution over time is called quantum process tomography. "As more devices
are fabricated in which quantum theory dominates, accurate understanding of
quantum processes becomes vital," Bacon concludes. "The quantum process
tomography techniques ... represent a first step toward accurately
assessing the powers and limits of these new quantum machines."