Who Needs Hackers?
New York Times (09/12/07) P. H1; Schwartz, John
Though conceding that computer hackers are a clear threat, experts
maintain that some of the most serious and disruptive network problems can
be traced to non-malevolent sources, most notably a network's complexity.
"We don't need hackers to break the systems because they're falling apart
by themselves," says SRI International principal scientist Peter G.
Neumann. Nemertes Research's Andreas M. Antonopoulos says the transition
from relatively simple computing architectures to massively distributed and
connected networks has increased the difficulty of predicting, detecting,
and correcting flaws. A problem as simple as a defective network card can
have a cascading effect that leads to a network failure, such as the one
that shut down computers for the U.S. Customs and Border Protection Agency
and delayed flights at Los Angeles International Airport for hours last
month. "Most of the problems we have day to day have nothing to do with
malice," says Columbia University computer science professor Steven M.
Bellovin. "Things break. Complex systems break in complex ways." He
notes that it was a cascading series of failures that shut down the
electrical grid in the Eastern United States and Canada in the summer of
2003. The integration and interdependence of multiple computer networks
only makes system-wide vulnerability to a single weak link more likely,
according to Veracode CEO Matt Moynahan. Johns Hopkins University
professor Aviel D. Rubin says high-tech voting machines could be extremely
susceptible to glitches, and he entertains the possibility that the
emphasis on the hacker threat has eclipsed the threat of unintentional
problems. One way to minimize non-malicious disruptions is to strengthen
systems' capacity for recovery through backup protocols, while Neumann
believes the best strategy is to design security and stability into
computers from the very beginning. Peter G. Neumann moderates the ACM Risks Forum (http://www.risks.org/).
China's Eye on the Internet
University of California, Davis (09/11/07) Fell, Andy
Researchers at the University of California, Davis and the University of
New Mexico are developing ConceptDoppler, an automated tool that monitors
changes in Internet censorship in China. The tool uses mathematical
techniques to group words by meaning and identify words that are likely to
be blacklisted by the Chinese government. Many countries have some form of
Internet censorship, primarily using systems that block specific Web sites
or Web addresses, but China's system is unique in that it filters for Web
content or specific keywords to selectively block pages, according to UC
Davis graduate student Earl Barr. The researchers sent messages to
Internet addresses in China containing a variety of different words that
might be censored. Barr says if China's system were truly a firewall, most
of the blocking would take place at the border with the rest of the
Internet, but some messages passed through several routers before being
blocked. A firewall would also block all occurrences of a banned word or
phrase, but banned words were able to reach their destination about 28
percent of the time. By filtering ideas instead of specific Web sites, the
system prevents people from using proxy servers or "mirror" Web sites to
avoid censorship, but because it is not completely effective the system
probably acts more as an unseen watchman, encouraging self-censorship, Barr
says. When users in China see a word or phrase that is normally blocked,
they might choose to avoid that page, assuming someone is monitoring that
site. Work on ConceptDoppler will be presented at the ACM Computer and
Communications Security Conference in Alexandria, Va., on Oct. 29-Nov.
2.
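The probing methodology Barr describes can be pictured in a few lines of code: send requests carrying a candidate keyword toward a host behind the filter and record how often they are blocked. A minimal sketch, assuming a placeholder word list and target URL (neither is from the actual research):

```python
import requests

# Hypothetical probe loop: the word list and target URL are
# placeholders, not the actual ConceptDoppler corpus or hosts.
CANDIDATE_WORDS = ["candidate_word_1", "candidate_word_2"]
TARGET = "http://example.cn/search"  # assumed host behind the filter

def block_rate(word, trials=10):
    """Fraction of requests carrying `word` that fail to get through."""
    blocked = 0
    for _ in range(trials):
        try:
            requests.get(TARGET, params={"q": word}, timeout=5)
        except requests.exceptions.ConnectionError:
            blocked += 1  # the filter typically forges TCP resets
    return blocked / trials

for w in CANDIDATE_WORDS:
    print(w, block_rate(w))
```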
NASA Building Silicon Chips That Can Handle Massively High Heat and Then Some
Network World (09/11/07)
NASA has developed a new circuit chip capable of withstanding
significantly higher temperatures for longer periods of time than any other
chip has previously achieved. Normally, silicon-based electronics begin to
fail at about 350 degrees Celsius and integrated circuit chips cannot
withstand more than a few hours of high temperatures before degrading or
failing. The silicon carbide (SiC) chip developed by NASA can operate at
600 degrees Celsius (1,112 degrees Fahrenheit) and can exceed 1,700 hours
of continuous operation at 500 degrees Celsius, a
100-fold increase over previous technology. NASA says SiC chips could be
used for energy storage, renewable energy, nuclear power, and electrical
drives. The temperature-resistant chips will also allow increased power
density, smaller heat sinks, smaller overall size and mass, and
higher-frequency operation for filters and transformers.
"This new capability can eliminate the additional plumbing, wires, weight,
and other performance penalties required to liquid-cool traditional sensors
and electronics near the hot combustion chamber, or the need to remotely
locate them elsewhere where they aren't as effective," says Phil Neudeck,
an electronics engineer and lead researcher for the Aeronautics Research
Mission Directorate at NASA's Glenn Research Center.
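A quick arithmetic check of the cited figures (the roughly 17-hour baseline is inferred from the stated 100-fold gain, not quoted by NASA):

```python
# Rough arithmetic implied by the article's figures. The ~17-hour
# baseline is inferred from the stated 100-fold gain, not quoted.
celsius = 600
fahrenheit = celsius * 9 / 5 + 32
print(fahrenheit)  # 1112.0, matching the article

new_endurance_hours = 1700
improvement_factor = 100
print(new_endurance_hours / improvement_factor)  # ~17 hours previously
```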
U.S. Patent Bill Still Faces Obstacles
IDG News Service (09/10/07) Gross, Grant
Patent reform legislation that recently passed in the U.S. House of
Representatives still faces significant opposition as it enters the Senate,
particularly from small inventors, pharmaceutical companies, several small
technology vendors, and labor unions. The bill would change how courts
assess patent infringement damages, considering the value of the infringed
patent and not the value of the entire product. It would also add new ways
to challenge patents after they have been granted. Opponents of the bill
say the changes will allow large companies to steal patented inventions
from small companies without fear of repercussions. Symantec CEO John W.
Thompson says passing the bill in the House was a major victory for
innovation and competitiveness in the United States. However, attorney
Bobbie Wilson says the House bill ignores the largest problem with the
patent system: the Patent and Trademark Office is drastically
underfunded, lacks examiners, and often loses experienced people because
of poor salaries. Ronald Riley, president of the Professional Inventors
Alliance, a trade group opposed to the bill, says opponents will focus hard
on the debate in the Senate, and will target lawmakers who support the
legislation during the 2008 elections. Meanwhile, Rep. Howard Berman
(D-Calif.), a leading sponsor of the patent reform bill, has introduced
another bill calling for increased funding at the PTO.
Redefining the Architecture of Memory
New York Times (09/11/07) P. C1; Markoff, John
IBM research fellow Stuart S.P. Parkin may be on the brink of a
breakthrough that could increase the storage capacity of memory chips
anywhere from 10 to 100 times. Parkin's technology could begin to replace flash memory in
three to five years, and would allow every consumer to carry data
equivalent to a college library on small portable devices and could allow
engineers to develop totally new entertainment, communication, and
information products. Parkin's previous research resulted in the ability
to manipulate the alignment of electrons to alter the magnetic state of
tiny areas of a magnetic storage device, allowing for devices such as iPods
and Google-style data centers. His new technique, known as "racetrack
memory," uses billions of ultra-fine wire loops around the edge of a
silicon chip. Electric current is used to slide infinitesimally small
magnets up and down each of the wires to read and write digital ones and
zeros. The magnets are capable of moving at speeds greater than 100 meters
per second, making it possible to read and write magnetic regions in a
single nanosecond. "Finally, after all these years, we're reaching
fundamental physics limits," Parkin says. "Racetrack says we're going to
break those scaling rules by going into the third dimension." IBM Research
vice president for systems Mark Dean says racetrack memory will not only
change our ideas of storage, but how we view processing information,
blurring the line between storage and computing. "We're moving into a world
that is more data-centric than computing-centric," Dean says.
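The quoted speed and access time are mutually consistent: at 100 meters per second, a magnetic domain travels 100 nanometers in one nanosecond, roughly the scale of a single stored bit. A worked check, with the bit spacing as an illustrative assumption:

```python
# Check that 100 m/s domain motion supports ~1 ns access, assuming
# bits are spaced on the order of 100 nm (an illustrative figure,
# not one given in the article).
speed_m_per_s = 100
access_time_s = 1e-9
distance_m = speed_m_per_s * access_time_s
print(distance_m)  # 1e-07 m = 100 nm traveled per nanosecond
```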
Primate Behavior Explained By Computer 'Agents'
University of Bath (09/11/07) McLaughlin, Andrew
University of Bath researchers used artificial intelligence-based computer
agents to simulate the complex behavior of primates. The simulation
provided insight into why some primate groups are despotic while others are
egalitarian, and also provided support for a theory on how dominant
macaques manage to stay in a safer position at the center of a group
without being completely occupied with doing so. The computer agents were
given two rules--to stay in a group for safety and to pester subordinates
until they move away--and the researchers found that the dominant agents
naturally congregated at the center, indicating that harassing subordinates could be an
evolutionary mechanism to help protect the more dominant and successful
members of a group. "This kind of agent-based modeling is really a new way
of doing science," says Dr. Joanna Bryson, leader of the study from the
University of Bath's Computer Science Department. "Agent-based modeling
techniques let us invent and remove behaviors to test the explanations of
what we see in nature." Bryson says modeling makes it possible to change
the variables for various types of behavior and see their effect over
generations in just a few hours.
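The two rules translate almost directly into code. A minimal agent-based sketch, with grid size, rank structure, and movement parameters invented for illustration rather than taken from the Bath model:

```python
import random

# Minimal agent-based sketch: each agent has a dominance rank, seeks
# the group for safety, and displaces weaker neighbors. Parameters
# are illustrative, not those of the Bath model.
N, STEPS, SIZE = 20, 1000, 30
agents = [{"rank": r, "x": random.uniform(0, SIZE),
           "y": random.uniform(0, SIZE)} for r in range(N)]

def centroid():
    return (sum(a["x"] for a in agents) / N,
            sum(a["y"] for a in agents) / N)

for _ in range(STEPS):
    cx, cy = centroid()
    for a in agents:
        # Rule 1: drift toward the group for safety.
        a["x"] += 0.05 * (cx - a["x"])
        a["y"] += 0.05 * (cy - a["y"])
        # Rule 2: pester the nearest subordinate until it moves away.
        subs = [b for b in agents if b["rank"] < a["rank"]]
        if subs:
            tgt = min(subs, key=lambda b: (b["x"] - a["x"]) ** 2
                                        + (b["y"] - a["y"]) ** 2)
            tgt["x"] += random.uniform(-1, 1)
            tgt["y"] += random.uniform(-1, 1)

# Dominant agents should end up closer to the group's center.
cx, cy = centroid()
for a in sorted(agents, key=lambda a: -a["rank"])[:3]:
    dist = ((a["x"] - cx) ** 2 + (a["y"] - cy) ** 2) ** 0.5
    print(a["rank"], round(dist, 2))
```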
A New Way to Read Hard Disks
Technology Review (09/11/07) Patel-Predd, Prachi
As the data density on disks approaches one terabit per square inch, the
necessity for smaller sensors pushes sensor size toward its physical
limits. Researchers at the National Physical Laboratory in Teddington,
U.K., may have found a solution in a novel sensor design based on a
magnetic effect that current read heads do not use. The new sensor would
use slightly less power than existing read heads and could improve speeds
by three or four times, according to lead researcher Marian Vopsaroiu.
Current read heads rely on the magneto-resistance effect: as the head
flies over the disk, the magnetic fields of the bits cause a resistance
change in the head's sensor. The
resistance cannot be directly measured, so it is first converted into a
voltage using a direct current, which must continuously be run through the
sensor. The new sensor does not require a constant current because it uses
the magneto-electric effect. Materials with a magneto-electric effect have
coupled electric and magnetic fields, which change in response to one
another. In the new sensor, a data bit's magnetic field directly generates
voltage instead of resistance. The new sensor is also only seven layers
thick, compared with the 15 layers of current sensors. Vopsaroiu believes
the new sensing technique could lead to sensors thinner than 10 nanometers
capable of reading disks with a density of one terabit per square inch.
Vopsaroiu says that these numbers are only theoretical and that any
practical design would be challenging, but notes that current read heads
are just as complicated and manufacturers have found ways to produce them
easily.
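The contrast between the two readout schemes can be written out explicitly. In a magnetoresistive head the bit's field modulates resistance, which is observable only while a bias current flows and therefore dissipates power continuously; a magnetoelectric sensor generates the voltage directly. In sketch form, with generic symbols not drawn from the paper:

```latex
% Magnetoresistive readout: a bias current I must flow continuously,
% converting the field-induced resistance change into a voltage and
% dissipating power the whole time.
\Delta V_{\mathrm{MR}} = I \,\Delta R(H), \qquad
P_{\mathrm{bias}} = I^{2} R
% Magnetoelectric readout: the coupled electric and magnetic fields
% yield a voltage directly, with no standing bias current. Here
% \alpha_{\mathrm{ME}} is a generic coupling coefficient and t the
% sensor thickness.
V_{\mathrm{ME}} = \alpha_{\mathrm{ME}} \, t \, H
```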
EAC to Release Draft Voting-System Guidelines
Government Computer News (09/10/07) Jackson, William
The Election Assistance Commission plans to publish a new draft of
guidelines for certifying voting systems in the Federal Register by Sept.
20. Described as a complete rewrite of the Voluntary Voting System
Guidelines adopted in 2005, the new revision bars wireless connections for
electronic voting systems, addresses software independence, and updates
requirements for a voter-verifiable paper audit trail. The standards are
voluntary, but most states use them to certify their e-voting systems. A
new set of standards is unlikely to be available in time for the 2008
primary and general elections because the approval process consists of
comment periods after two and four months, and the guidelines could be
rewritten two times. The National Institute of Standards and Technology
assisted the EAC in developing the draft. The commission was created in
the wake of the voting problems of the 2000 presidential election and was
charged with overseeing certification standards.
Governors Throw Support Behind H-1B Increase
CNet (09/11/07) Broache, Anne
The U.S. Congress should not let the fate of the immigration bill keep it
from addressing the skilled visa issue, according to a letter 13 governors
sent Tuesday to Senate and House leaders. The governors urged Congress to
take up the issue soon because there is "a critical shortage of highly
skilled professionals in math and science to fill current needs."
Governors from tech-heavy states, including Arnold Schwarzenegger of
California, Rick Perry of Texas, Deval Patrick of Massachusetts, Chris
Gregoire of Washington, and Eliot Spitzer of New York, were among those
calling for an increase in the H-1B program's annual cap of 65,000
visas. Silicon Valley companies say allowing more foreign students and
workers to stay in the country and find jobs is needed to end the
technology industry's skills shortage. However, advocates for U.S. tech
workers have criticized an increase in the cap, and some politicians
believe tech companies are using the H-1B program to drive down wages.
Helping Computers to Search With Nuance, Like Us
New York Times (09/12/07) P. H5; Wayner, Peter
Private businesses, university researchers, and public search engines are
among the organizations trying to solve the problem of how to make richer,
more structured collections of data that can be searched and analyzed more
efficiently. Some efforts focus on ontologies, while others focus on
creating new technologies. Semantic Web technology allows everyone to
connect databases so customers and partners can integrate information.
David Beckett is a principal software engineer at Yahoo who is working on
adding structure to Yahoo's collected databases by using semantic Web tools
to make it easier for various silos to work together. "If you have a
recipe that mentions a chef, you can link to an article about the chef,"
says Beckett. "If you have a news article that mentions a record producer,
you can link across to the music site and see the records he's produced."
Other efforts focus on solving problems computers regularly have trouble
with, like nicknames, such as "Bob" for "Robert," or suffixes like "Jr." or
"III," which can be mistaken for last names. Creating a unified structure
for names helps in many different databases, such as those for banks and
insurance companies, but the problem is that the lists are constantly
changing and being updated. "We still haven't found a good way to make
structured or semistructured data work perfectly, and the database
community has been working on it for 50 years," says University of
Maryland, Baltimore County computer science professor Tim Finin. However, he says researchers
eventually will find ways of making machines understand these relationships
on their own.
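The nickname and suffix problems lend themselves to a simple normalization pass. A toy sketch; the lookup table is illustrative, and the constant churn of real name lists is exactly the difficulty the article notes:

```python
# Toy name normalizer: maps nicknames to canonical forms and strips
# generational suffixes so "Bob Smith Jr." and "Robert Smith" match.
# The lookup tables are illustrative; production lists are far larger
# and constantly updated, which is the hard part the article notes.
NICKNAMES = {"bob": "robert", "bill": "william", "liz": "elizabeth"}
SUFFIXES = {"jr", "jr.", "sr", "sr.", "ii", "iii", "iv"}

def normalize(name: str) -> tuple:
    parts = [p.lower() for p in name.split()]
    if parts and parts[-1] in SUFFIXES:
        parts = parts[:-1]          # "Jr."/"III" is not a last name
    parts[0] = NICKNAMES.get(parts[0], parts[0])
    return tuple(parts)

print(normalize("Bob Smith Jr.") == normalize("Robert Smith"))  # True
```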
Researchers Investigate Tracking, Sensors to Assist Air
Force
Louisiana Tech University (09/07/07) Roberts, Judith
Louisiana Tech assistant professor of computer science Sumeet Dua has been
developing fast and accurate computer algorithms to help the Air Force
create sensors that are better at automatically recognizing, identifying,
classifying, and tracking targets of interest. "Algorithms can be applied
to national defense in a variety of ways, including missions involving
air-to-ground, ground-to-ground, surface-to-surface, and air-to-air
scenarios," Dua says. "The algorithm is unique in its ability to use a
system-level approach to define both a target's signatures and movement.
It uses sophisticated data-mining techniques, a class of computer science
algorithms used to discover embedded, hidden patterns and anomalies in data
which are previously unknown but very useful." Remote sensors such as
cameras and radars are used to locate targets. Software then determines
the positions and features of the target using rotational and translational
variations. The algorithm uses patterns to obtain signature information on
unique targets. "The algorithm is novel in its ability to take a
system-level approach to achieve reinforced concurrent learning of both the
target's signatures and movement in a single run on the software program,"
says Dua, who notes that the algorithm can be used in metropolitan areas to
identify humans in irregular terrains or to identify and log the suspicious
movement of vehicles. Meanwhile, Louisiana Tech assistant professor of
electrical engineering Rastko Selmic has been researching the deployment
and control of wireless sensor networks, focusing on how to optimally
position and deploy a large number of sensors to cover a broad area while
still providing extensive coverage of specific targets.
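The rotational and translational invariance Dua mentions is conventionally achieved by normalizing features against position and orientation. A generic textbook construction, not Dua's algorithm (whose details the article does not give):

```python
import numpy as np

# Generic rotation/translation-invariant signature for a set of 2-D
# target points: center on the centroid, then keep only distances,
# which do not change under rotation or translation. A textbook
# construction, not the algorithm described by Dua.
def signature(points: np.ndarray) -> np.ndarray:
    centered = points - points.mean(axis=0)   # translation-invariant
    dists = np.linalg.norm(centered, axis=1)  # rotation-invariant
    return np.sort(dists)

shape = np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 1.5]])
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
moved = shape @ R.T + np.array([5.0, -3.0])   # rotate and shift
print(np.allclose(signature(shape), signature(moved)))  # True
```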
Coming Soon: A Supercomputer for the Rest of Us
Computerworld (09/09/07) Ames, Ben
University of Maryland researchers have built a prototype of a desktop
supercomputer, and now plan to shrink the license-plate-size board running
at 75 MHz to a version that is about the size of a fingernail and runs
between 1 GHz and 2 GHz. The Explicit Multi-Threading (XMT) computer makes
use of parallel computing algorithms and the large number of transistors in
modern processors to run 100 times faster than a PC. Three programmable
gate array chips from Xilinx implement a network of 64 ARM processors
controlling dozens of threads of simultaneous calculations,
says Uzi Vishkin, a professor in the university's school of engineering who
built XMT with the help of his graduate students. IBM is now manufacturing
a CMOS silicon application-specific integrated circuit (ASIC) with an
on-chip data interconnect network for Maryland. Vishkin maintains that
XMT, which is at least three years away, would be easy for the average
person to program because the operating system sees the XMT algorithm as a
single thread.
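XMT's selling point is that code that reads as a single serial thread is executed in parallel. As a loose analogy only (XMT uses its own PRAM-style programming model, not Python), a data-parallel map that keeps the serial shape of the program:

```python
from concurrent.futures import ProcessPoolExecutor

# Loose analogy to XMT's model: the programmer writes what looks like
# one serial pass over the data; the runtime fans it out to many
# workers. XMT itself uses a PRAM-style language, not Python.
def work(x: int) -> int:
    return x * x

if __name__ == "__main__":
    data = range(1_000)
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(work, data))  # reads like a serial map
    print(sum(results))
```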
'Smart Homes' Could Track Your Electrical Noise
New Scientist (09/10/07) Kleiner, Kurt
Instead of a house embedded with sensors, smart homes of the future may
track a homeowner's movements by monitoring the electrical noise made by
different devices throughout the house as they are turned on and off. "The
problem I see with a lot of ubiquitous computing research is that it
requires the creation of new infrastructure and technology," says Georgia
Institute of Technology computer scientist Gregory D. Abowd. "A lot of
what we have been focusing on is how you can achieve some of these things
without requiring Joe Blow to buy new stuff." Abowd and colleagues have
developed a device connected to a laptop that plugs into a standard wall
socket and monitors noise in the electrical supply caused by turning
devices on or off. Software analyzes the frequencies of noise created in
the power line and is trained to recognize noise from specific appliances.
The system was tested on 19 different electrical devices in six different
homes with 85 percent to 90 percent accuracy. The system could be used to
automatically adjust temperature controls and sound systems as people move
about the house, or monitor the activity levels of older people living
alone. The only downside to the system is that it takes about four hours
to calibrate a typical house, but installing networks of cameras and
sensors takes a long time as well, Abowd says. The researchers also need
to prove that the device can distinguish between multiple devices running
at once. Abowd will present his research at next week's International
Conference on Ubiquitous Computing in Innsbruck, Austria.
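Recognizing appliances from power-line noise is essentially spectral fingerprinting. A minimal sketch with synthetic signals and a nearest-template match on FFT magnitudes; the article does not describe the Georgia Tech pipeline at this level of detail:

```python
import numpy as np

# Toy spectral fingerprinting: classify a switching event by comparing
# the magnitude spectrum of power-line noise against stored templates.
# Signals here are synthetic; the real system learns from actual
# transients in the home's wiring.
FS = 10_000  # sample rate in Hz (assumed)

def spectrum(sig):
    return np.abs(np.fft.rfft(sig))

def make_noise(freq, n=1024):
    t = np.arange(n) / FS
    return np.sin(2 * np.pi * freq * t) + 0.1 * np.random.randn(n)

templates = {"lamp": spectrum(make_noise(1200)),
             "fan":  spectrum(make_noise(3100))}

event = make_noise(3100)  # an unknown switching event
guess = min(templates,
            key=lambda k: np.linalg.norm(spectrum(event) - templates[k]))
print(guess)  # expected: "fan"
```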
Rogue Nodes Turn Tor Anonymizer Into Eavesdropper's
Paradise
Wired News (09/10/07) Zetter, Kim
Swedish computer security consultant Dan Egerstad collected thousands of
private email messages from embassies and human rights groups worldwide by
simply hosting five Tor exit nodes as a research project. Civil liberties
groups, law enforcement, and government agencies use Tor, which is a
privacy tool created to thwart tracking of where a Web user goes on the
Internet and with whom a user converses. However, many Tor users
mistakenly believe that Tor provides end-to-end encryption, when in reality
Tor has an acknowledged weakness. Tor works by having volunteer-donated
servers bounce traffic around as it journeys to its destination, and
traffic is encrypted for all but the last leg of the route. When traffic
passes through the Tor network's final node, the communication must be
decrypted before it can be delivered to its final destination, which means
Web activity, instant messages, and email content are potentially disclosed
to any Tor server owner who is eavesdropping. The pool of potential
eavesdroppers is large, as the Tor network contains some 1,600 nodes, as
well as hundreds of thousands of users worldwide. Though the Tor Web site
cautions users about the last segment of unencrypted traffic, most users
seem to have ignored or missed this warning and have failed to take
necessary precautions to safeguard their Web activity, says Egerstad. When
Egerstad started monitoring the traffic through his Tor nodes, he was
surprised to find that 95 percent was unencrypted, and that many embassies
and government agencies were using Tor incorrectly. Egerstad also believes
this oversight is currently being exploited. Shava Nerad, development
director for the nonprofit organization that supports Tor, asserts that
embassies and other high-risk organizations should be encrypting their data
independently.
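The exit-node exposure follows directly from Tor's layered design: each relay strips one layer of encryption, so the final relay necessarily holds plaintext unless the application added its own encryption, such as HTTPS. A toy illustration of onion layering, using Fernet for brevity rather than Tor's actual cryptography:

```python
from cryptography.fernet import Fernet

# Toy onion routing: the client wraps the message once per relay;
# each relay peels one layer. The exit relay's final decryption
# yields plaintext unless the payload was separately encrypted.
# Fernet is used for brevity; Tor's real protocol is different.
relay_keys = [Fernet.generate_key() for _ in range(3)]
message = b"embassy mail: not secret after the exit node"

onion = message
for key in reversed(relay_keys):      # wrap: exit layer innermost
    onion = Fernet(key).encrypt(onion)

for key in relay_keys:                # each relay peels one layer
    onion = Fernet(key).decrypt(onion)

print(onion)  # the exit relay sees exactly this plaintext
```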
Spelman College Receives $2.5 Million National Science
Foundation Grant to Create 'Next-Level' STEM Disciplines
Spelman College (08/31/07)
The National Science Foundation has awarded a $2.5 million grant to
Spelman College that will enable the historically black women's college in
Atlanta to offer cross-disciplinary opportunities and informatics
expertise to its science, technology, engineering, and
mathematics students and faculty. The college will introduce the Advancing
Spelman's Participation in Informatics Research and Education (ASPIRE)
project in the 2007-2008 academic year, and will develop new
interdisciplinary informatics curricula to improve the computational
analytical skills of its students. "If you see what is happening with
companies such as Google or research in genomic medicine, there's a need
for students to be able to adequately analyze, organize, and extract
knowledge from data to work in interdisciplinary teams," says Andrew
Williams, an associate professor of computer and information sciences who
serves as a co-investigator for ASPIRE. For the project, Spelman partnered
with several research institutions and companies involved in informatics,
including the Georgia Institute of Technology, Indiana University,
University of Tennessee/Oak Ridge National Laboratory, Centers for Disease
Control and Prevention, and Coca-Cola. "Our long-range goal is to increase
the quality and quantity of African-American women who pursue STEM-related
advanced degrees and fields, particularly in interdisciplinary areas," adds
Tasha Inniss, an assistant professor of mathematics and co-investigator.
Ready, Set, Go
Washington Technology (09/03/07) Vol. 22, No. 15, P. 36; Jackson, Jacob
Instead of using Linpack, the benchmark used to judge the speed of the
world's fastest supercomputers, as part of its procurement process, the
Department of Defense relies on its High Performance Computing
Modernization Program (HPCMP) to decide which supercomputers to buy.
HPCMP issues a set of metrics that carefully codifies its
workloads. "We don't specify how big the machine is," says HPCMP head Cray
Henry. "We will run a sample problem of a fixed size and call the result
our target time. We then put a bid on the street and say, 'We want you to
build a machine that will run this twice as fast.'" The HPCMP allows
individual services in the DOD to buy a variety of machines that are better
able to handle a wide variety of tasks. HPCMP is unique because it defines
its users' workloads rather than a set of generic performance goals. Henry
says most workloads fit into one of about 10 categories, including
computational fluid dynamics, structural mechanics, chemistry and materials
science, climate modeling and simulation, and electromagnetics. To quantify
a computer's performance on these jobs, HPCMP created a program, known as
the linear optimizer, that calculates the overall system performance for
handling each job and compares the performance to how often that job is
performed, factoring in the price of each system and any existing systems
that can already execute the jobs. Usability, though hard to quantify, is
also considered, using factors such as third-party software available for
the platform and what compilers, debuggers, and other development tools are
available.
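The linear optimizer described amounts to scoring each bid by how much of the weighted workload it delivers per dollar. A simplified sketch; the job mix, runtimes, and prices are invented for illustration:

```python
# Simplified version of the described evaluation: score each bid by
# how much of the weighted workload it completes per dollar. Job
# frequencies, runtimes, and prices are invented for illustration.
job_frequency = {"cfd": 0.35, "structures": 0.25,
                 "chemistry": 0.25, "climate": 0.15}

# benchmark runtimes (hours) per system on the sample problems
runtimes = {
    "system_a": {"cfd": 2.0, "structures": 3.0,
                 "chemistry": 1.5, "climate": 4.0},
    "system_b": {"cfd": 1.2, "structures": 4.5,
                 "chemistry": 2.0, "climate": 3.0},
}
price = {"system_a": 8.0e6, "system_b": 9.5e6}

def score(sys):
    # weighted throughput (jobs/hour) per million dollars
    throughput = sum(job_frequency[j] / runtimes[sys][j]
                     for j in job_frequency)
    return throughput / (price[sys] / 1e6)

best = max(runtimes, key=score)
print(best, round(score(best), 4))
```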
The Promise of Parallel Universes
Science (09/07/07) Vol. 317, No. 5842, P. 1341; Miller, Greg
Artificial worlds created in the computer are emerging as useful petri
dishes for investigating the formation of social networks and human
behavior in the absence of real-world physical and social limitations. One
advantage is that virtual worlds allow social researchers to carry out
experiments that are ethically or practically unworkable in the real world,
while Dmitri Williams of the University of Illinois, Urbana-Champaign,
observes that computer-generated representations of users, or avatars, are
emerging as a significant tool for future human interactions. Research has
shown that real-world behavior is also reflected in virtual worlds. For
example, scientists reported in CyberPsychology and Behavior their finding
that pairs of female avatars tend to stand closer together and make more
eye contact than pairs of male avatars. There is also evidence suggesting
that the freedom of virtual worlds is helping participants deviate from
their normal behavior, with the same scientists reporting in Human
Communication Research that undergrad volunteers assigned a visually
appealing avatar more readily interacted with an avatar of the opposite sex
than those given a less attractive avatar. Northwestern University
researcher Noshir Contractor points out that social studies research can be
conducted in virtual worlds for a fraction of the cost and time of
real-world research. He notes that the conclusions of a three-year project
on the formation of social connections were nearly identical to those of a
similar study carried out in the World of Warcraft online game environment
that was completed in only a few months. Contractor is one of a number of
researchers who hope their work will eventually yield practical
applications, such as improved disaster management and enhanced
organizational creativity and collaboration.
Toward Recommendation Based on Ontology-Powered Web-Usage
Mining
Internet Computing (08/07) Vol. 11, No. 4, P. 45; Adda, Mehdi; Valtchev,
Petko; Missaoui, Rokia
Content adaptation on the Web shrinks available information to a subset
that aligns with a user's anticipated requirements, and recommender systems
depend on relevance scores for individual content items. Pattern-based
recommendation taps co-occurrences of items in user sessions to form a
basis for conjectures concerning relevancy. The authors propose using
metadata about the content, which they assume resides in a domain
ontology, to improve the quality of the discovered patterns. Their
methodology is composed of a dedicated pattern space constructed atop the
ontology, navigation primitives, and mining and recommendation techniques.
"To achieve a better trade-off between recommendation flexibility and
precision, our approach feeds the mining process with knowledge about
semantic links between objects," the authors explain. "Our basic
assumption is that co-occurrences between objects often reflect the
existence of a link between them. Hence, manipulating links explicitly can
increase the focus of concept-based recommendation while preserving its
flexibility." The authors have developed an effective mining process for
the new pattern space, and their next objective is the optimization of the
mining procedure's performance by investigating alternative approaches. It
is also their intention to study reduced representations of the frequent
pattern family.
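The core idea can be pictured as lifting raw page co-occurrences to shared concepts before counting patterns. A minimal sketch with an invented hierarchy and session log; the paper's pattern space and navigation primitives are considerably richer:

```python
from collections import Counter
from itertools import combinations

# Toy ontology-powered usage mining: lift each visited item to its
# concept, then count concept co-occurrences across sessions. The
# hierarchy and sessions are invented for illustration.
concept_of = {"pageA": "recipe", "pageB": "recipe",
              "pageC": "chef_bio", "pageD": "chef_bio"}
sessions = [["pageA", "pageC"], ["pageB", "pageD"], ["pageA", "pageD"]]

pairs = Counter()
for s in sessions:
    concepts = {concept_of[p] for p in s}
    pairs.update(combinations(sorted(concepts), 2))

# ("chef_bio", "recipe") co-occurs in all 3 sessions even though no
# single page pair repeats -- the generalization the ontology buys.
print(pairs.most_common())
```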