A Better Memory Chip
Technology Review (07/10/06) Greene, Kate
Freescale Semiconductor has unveiled the first commercial, magnetic-based
semiconductor memory. The new chip, based on magnetoresistive
random-access memory (MRAM), will compete with Flash, RAM, and other
established types of semiconductor memory, potentially leading to more
energy-efficient electronics. Freescale's announcement could signal a
watershed in consumer electronics, as it demonstrates that MRAM
manufacturing techniques and materials, after decades of research, are
finally ready for practical commercial deployment, according to Doug
Burger, professor of computer sciences and electrical engineering at Texas
A&M University. Unlike Flash memory and RAM, which store information as an
electronic charge, MRAM represents data through the magnetic orientation of
electrons. MRAM chips consist of hundreds of thousands of memory
cells, each containing one magnetic electrode with a fixed magnetic field
and one whose polarization can change. The binary number that a cell is
storing is a function of the resistance between the electrodes, which in
turn is dictated by the electrodes' polarization. The magnetic properties
create a "unique combination of characteristics that you can't get in any
other semiconductor material," said Freescale's Saied Tehrani. MRAM chips
are nonvolatile, meaning that they do not require a power supply to hold
data, and their data can be written and read an unlimited number of times
at rapid speeds. The property of nonvolatility and the potential to
eliminate a computer's boot-up time make MRAM an appealing candidate to
replace RAM. Its central advantage over Flash is that it can be rewritten
indefinitely. While MRAM could eventually emerge as the dominant form of
memory, Freescale's chip has a capacity of only 4 megabits, which pales in
comparison with current Flash chips that offer capacities of several
gigabits.
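In rough terms, the cell described above behaves as a two-terminal resistor whose value depends on whether the free electrode's magnetization is parallel or antiparallel to the fixed one. The following Python sketch illustrates only that read-out idea; the resistance values, threshold, and names are assumptions chosen for clarity, not details of Freescale's design.

    # Toy model of reading a bit from an MRAM cell, for illustration only.
    # Assumes two resistance levels: parallel (low) and antiparallel (high)
    # magnetization of the free layer relative to the fixed layer; the real
    # device physics is considerably more involved.

    PARALLEL, ANTIPARALLEL = "parallel", "antiparallel"

    # Hypothetical resistance values in ohms (not taken from the article).
    R_LOW, R_HIGH = 1000.0, 2000.0

    def cell_resistance(free_layer_orientation):
        """Resistance between the fixed and free electrodes."""
        return R_LOW if free_layer_orientation == PARALLEL else R_HIGH

    def read_bit(free_layer_orientation):
        """Map the measured resistance to a binary value."""
        threshold = (R_LOW + R_HIGH) / 2
        return 0 if cell_resistance(free_layer_orientation) < threshold else 1

    def write_bit(value):
        """Writing flips the free layer; the fixed layer never changes."""
        return PARALLEL if value == 0 else ANTIPARALLEL

    if __name__ == "__main__":
        state = write_bit(1)      # set the free layer
        print(read_bit(state))    # -> 1, read back nondestructively
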
FBI Plans New Net-Tapping Push
CNet (07/07/06) McCullagh, Declan
Sen. Mike DeWine (R-Ohio) intends to introduce legislation that would make
it a requirement for ISPs to set up wiretapping hubs for law enforcement
monitoring and for networking equipment manufacturers to incorporate
backdoors for surveillance, according to FBI agent Barry Smith in a private
conference with industry representatives on July 7. DeWine's bill would
amend the 1994 Communications Assistance for Law Enforcement Act (CALEA) to
the effect that any maker of "routing" and "addressing" hardware would be
required to offer upgrades or other "modifications" necessary to enable
Internet wiretapping; extend wiretapping requirements to
"commercial" Internet services if the FCC believes it to be within the
"public interest;" coerce ISPs to filter their customers' communications to
spot, for example, voice over Internet Protocol (VoIP) calls only; and
jettison the current legal requirement that the Justice Department annually issue a
public "notice of the actual number of communications interceptions" as
well as the "maximum capacity" needed to handle all the legally authorized
wiretaps that federal agencies will "conduct and use simultaneously." The
FBI says CALEA must be expanded in order to beat terrorists and other
criminals who are exploiting technologies such as VoIP. "The complexity
and variety of communications technologies have dramatically increased in
recent years, and the lawful intercept capabilities of the federal, state
and local law enforcement community have been under continual stress, and
in many cases have decreased or become impossible," states a summary
accompanying the draft bill. However, critics say the legislation
infringes on Internet users' privacy, while the bill's political outlook is
also muddied by continued debate concerning allegedly unlawful
eavesdropping by the National Security Agency.
Innovation and Competitiveness Authorization
Updates
Computing Research News (07/07/06) Camese, Erica
The Research Competitiveness Act, which would give the NSF and the Energy
Department a mandate to dole out career grants encouraging people to pursue
professions in the sciences, and the Science and Mathematics Education for
Competitiveness Act, which promotes mathematics, science, and technology
education through scholarships, stipends, and mentoring programs, have both
cleared the House Science Committee, though they failed to win a suspension
status that could have accelerated their passage. Instead, the two bills
will be subject to debate on the floor under an open rule, making their
passage a more distant prospect. Both bills are a product of the American
Competitiveness Initiative, which has already won funding approval for
fiscal year 2007, raising the question in the scientific community of the
utility of further debate. Meanwhile, the Senate Energy and Natural
Resources Committee has approved the PACE-Energy Act, which promotes basic
research at the Energy Department through an experiment-based internship
program, grants, and satellite summer programs at national labs. The
PACE-Energy Act is expected to reach the Senate floor in the near
future.
Researchers Teach Robots to Evolve Their Own
Language
InformationWeek (07/10/06) McDougall, Paul
Researchers in Europe are using the EC (embedded and communicating) Agents
program to teach robots linguistic and cognitive skills that they
ultimately will be able to develop on their own over time, without the
assistance of communications rules provided by humans. Scientists from
Sony's computer science labs in France are participating with researchers
from the European Commission's Emerging Technologies Initiative and the
Institute of Cognitive Science and Technology in Italy on the project, in
which Aibo robotic dogs are placed in a room with objects, some of which
are responsive to sound, to learn about their environment. Thus far, the
super-Aibos have directed more of their "barking" to the objects that
respond to them, and have learned which bark patterns generate certain
responses. Another application allowed the Aibos to develop their own
language, which would enable one dog to tell another the location of a ball
and ask it to go fetch. Such interactions are the building blocks for
advanced artificial intelligence that includes an innate language
capability, according to the researchers. Meanwhile, Viktoria Institute in
Sweden is using EC agents to give mobile devices the ability to talk with
each other, such as an MP3 player with a cell phone. And the Swiss Federal
Institute of Technology has embedded robots with EC agents, and the small,
wheeled units are being taught to provide assistance in search and rescue
operations. "We've managed to ground AI in reality, in the real world,
solving one of the crucial problems to creating truly intelligent and
cooperative systems," says Stefano Nolfi, coordinator of the EC Agents
project.
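The bark-and-response learning described above can be pictured as simple reinforcement of bark patterns that have paid off for a given object. The Python sketch below is a generic illustration of that idea with made-up names; it is not the EC Agents software.

    # Toy associative learner: for each (object, bark pattern) pair, track how
    # often the bark drew a response, then prefer the best-performing pattern.
    from collections import defaultdict

    success = defaultdict(int)   # (object, bark) -> responses heard
    attempts = defaultdict(int)  # (object, bark) -> times tried

    def record(obj, bark, got_response):
        attempts[(obj, bark)] += 1
        if got_response:
            success[(obj, bark)] += 1

    def best_bark(obj, barks):
        """Pick the bark pattern with the highest observed response rate."""
        def rate(bark):
            tries = attempts[(obj, bark)]
            return success[(obj, bark)] / tries if tries else 0.0
        return max(barks, key=rate)
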
UI Researchers Create Computer That Recognizes Body
Movement
News-Gazette (07/07/06) Kline, Greg
Nonverbal cues such as hand gestures, eye movements, and body language are
an integral part of the way that people interact with each other.
Researchers at the University of Illinois, recognizing that computers will
never interact naturally with humans unless they can infer the meaning of
nonverbal cues, have developed a system that lets computers detect shrugs.
"I guess shrug is the first step toward trying to
analyze body movement," said Thomas Huang, a professor of electrical and
computer engineering whose lab is also developing facial recognition
technologies that improve security and database search. "We're interested
in mainly audio and visual clues," he said, "but also body language."
Huang's software grabbed headlines last year when it was used to analyze
the mood of the Mona Lisa (she turned out to be mostly happy). One of the
researchers' goals is to improve computers' ability to understand what
their users want from them. In one project, the Illinois researchers are
using computers to track facial expressions in an attempt to help middle
school students who are learning some of the basic principles of science.
Huang's lab has also produced a hand-recognition system that reads and
tracks the fingertips and palms, enabling a user's hands to manipulate
objects in a virtual world. The shrug detector consists of a computer,
digital camera, and software developed by the Illinois researchers. While
the basic application of detecting a shrug was a relatively simple
programming task, getting the computer to do it in real time and repeatedly
was more difficult, Huang said, adding that more cameras may be required to
overcome the system's difficulty coping with poor lighting or a person
turned in profile to the camera instead of facing it.
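The article does not describe the Illinois software's internals, but a rule of the kind such a detector might apply can be sketched in a few lines: flag a shrug when both shoulders rise and then return to baseline within a short window. The landmark format, threshold, and window length below are assumptions for illustration only.

    # Hypothetical shrug detector over per-frame shoulder heights (normalized
    # 0..1, larger = higher); landmark extraction from the camera is assumed
    # to happen elsewhere. Returns the frame where a shrug completes, or None.

    def detect_shrug(left_y, right_y, rise=0.05, window=15):
        for start in range(len(left_y) - window):
            base_l, base_r = left_y[start], right_y[start]
            # frame where both shoulders are jointly highest above baseline
            peak = max(range(start, start + window),
                       key=lambda i: min(left_y[i] - base_l, right_y[i] - base_r))
            end = start + window - 1
            rose = (left_y[peak] - base_l > rise) and (right_y[peak] - base_r > rise)
            settled = (abs(left_y[end] - base_l) < rise / 2 and
                       abs(right_y[end] - base_r) < rise / 2)
            if rose and settled:
                return end
        return None
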
Seeking to Tighten the Net Against Attack
IST Results (07/10/06)
In an effort to shore up the Internet's defenses against cyberattacks in a
time of rapid broadband uptake, the IST-funded DIADEM Firewall project has
created a comprehensive security application for broadband services, with
particular emphasis on denial-of-service attacks and mitigating the effects
of an attack. Distributed denial-of-service (DDoS) attacks, which
overwhelm the target network by marshalling thousands of zombie computers
to flood its bandwidth with simultaneous requests, affected more
than 13 percent of businesses in the United Kingdom in 2004. DDoS attacks
can impose severe customer service costs on broadband service providers, as
well as disrupting the broadband experience for residential users. "There
is no doubt that denial-of-service attacks are a growing issue as more and
more services, such as online games, IP telephony, television over IP, and
e-shopping are provided to broadband users through the Internet," said
Yannick Carlinet, DIADEM Firewall project coordinator. "It is a crucial
and vulnerable aspect of broadband security and will become even more so in
the future as more users move over to broadband connections." The project
created a network-based distributed detection and reaction system to be
managed centrally by network operators, unlike the current system, where
each user is responsible for his own security. In shifting the burden of
security back to the network provider, the DIADEM Firewall project
developed new intrusion-detection algorithms and policy-based techniques
that enable automated configuration and decision making. The main
difficulty the project has encountered is convincing major network
operators to take responsibility for security through centrally
administered policies.
DHS Lags in Appointing Cybersecurity Czar
National Journal's Technology Daily (07/05/06) Greenfield, Heather
Nearly a year has passed since Homeland Security Department (DHS)
Secretary Michael Chertoff created the position of a high-level
cybersecurity czar to ensure the department can best address existing and
emerging threats, yet the post, first announced on July 13, 2005, remains
unfilled. The
effort to appoint someone to the position started two years ago in
Congress. Some see the delay as indicative of the lack of attention
cybersecurity receives at the most senior levels of government. "The
department is incompetent," says
Rep. Zoe Lofgren (D-Calif.). "When you say no one is home [at Homeland
Security] it's not a joke." Lofgren, along with a House cybersecurity
subcommittee, helped pass House legislation that would create a
cybersecurity czar with authority in Homeland Security. Now many members
of Congress and other groups are starting to question whether DHS National
Cyber Security Division director Andy Purdy could effectively manage a
response to an Internet disaster. "What we concluded is if there were a
major cyber disruption, our nation would not be able to restore or rebuild
the Internet," says Tita Freeman at the Business Roundtable. "Our CEOs
feel that the Internet is vital to the exchange of information that's vital
to our nation's economic security and to our security in general." Lofgren
says there needs to be a cybersecurity czar present at Cabinet meetings to
successfully rebuild the Internet if a cyber disruption does occur.
Virtual Reality Psychodrama Plays With Viewers'
Minds
University at Buffalo Reporter (07/06/06) Vol. 37, No. 40; Donovan, Patricia
An interdisciplinary team of University at Buffalo academics from fields
such as computer science, media arts, and drama has developed "Human
Trials," a virtual-reality psychodrama "in which players immerse themselves
in multidimensional virtual space and undertake an 'absurd quest,'"
according to Josephine Anstey, assistant professor in the Department of
Media Study. The participant in "Human Trials" interacts with both a
virtual-reality environment and two human actors who improvise reactions to
the participant's actions; the other actors in the drama are
computer-controlled agents. The challenge for the participant is to
overcome the rigged games, duplicitous characters, and the attendant sense
of disempowerment as they try to make decisions, Anstey says. The
participant, wearing a head-mounted virtual-reality system, enters the
environment through a virtual-reality projection system, where he is met by
the human actors, who confront him with a series of seemingly meaningless
challenges. The Human Trials project is an extension of the work in the
areas of intelligent agents and virtual-reality drama that Anstey and
assistant media study professor Dave Pape have been engaged in with Stuart
Shapiro, a professor in the Department of Computer Science and Engineering.
Human Trials is geared for immersive virtual-reality environments with 3D
displays, whether multi-screen projection systems such as the CAVE or
head-mounted displays.
Kansas State University Professors Working on
Sensor-Based System to Monitor Livestock Herds
U.S. Newswire (07/05/06)
Researchers at Kansas State University are developing a system to monitor
the health and behavior of animals on the range in an attempt to protect
against avian flu, mad cow disease, pneumonia, and other animal diseases.
"The primary goals of the project are to develop new technology to increase
meat quality by minimizing the impact of disease and to protect
human/animal populations by detecting disease early before local herds are
mixed with animals in large feedlots," said Steve Warren, associate
professor of electrical and computer engineering. It could be days before
a farmer notices that an animal is sick and calls out a veterinarian--ample
time for an epidemic to break out, says Kansas State's Dan Andresen. "We
hope through monitoring we can help the farmer detect disease earlier, have
fewer animals to treat, and positively impact national security," Andresen
says. The researchers are also looking at devices that can detect the
speed and direction of an animal's movement, a key to identifying when an
animal might be sick. The information collected by the system could enable
the government to quickly react to a disease outbreak, as well as helping
scientists better understand the effect of climate change and other
environmental factors on the health and behavior of animals. They have
developed two systems: a complex sensor application that would cost around
$100 per animal and would likely be deployed on only a few animals per ranch;
and a simpler system that would cost between $5 and $10 per animal. In the
near future, all livestock will be required to carry electronic ID tags, and
Andresen notes that a temperature sensor could be added for around $2. The
researchers are still collecting raw data and optimizing the wireless
connection between the animals and the base station.
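A very simple version of the cheaper system might flag an animal whose temperature or activity departs from its own recent baseline. The Python sketch below illustrates that idea with assumed sensor fields and thresholds; it is not based on the Kansas State design.

    # Hypothetical health check for one animal, comparing the latest reading
    # to a rolling baseline of earlier readings. Fields and limits are assumed.
    from statistics import mean

    def looks_sick(readings, temp_rise=1.5, activity_drop=0.5):
        """readings: list of dicts with 'temp_c' and 'meters_moved' per day,
        oldest first. Returns True if the newest reading deviates from baseline."""
        if len(readings) < 4:
            return False                      # not enough history yet
        baseline, latest = readings[:-1], readings[-1]
        avg_temp = mean(r["temp_c"] for r in baseline)
        avg_move = mean(r["meters_moved"] for r in baseline)
        feverish = latest["temp_c"] - avg_temp > temp_rise
        sluggish = latest["meters_moved"] < avg_move * activity_drop
        return feverish or sluggish
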
Scientists to Automate Thought
Computing (07/06/06) Brown, James
Researchers at University College London's Gatsby Computational
Neuroscience Unit have completed a brain-scanning experiment
mapping the processes at work when humans make decisions, potentially
leading to machines that could eventually replace human involvement in
analyzing data. The researchers scanned volunteers' brains with a
functional magnetic resonance imaging (fMRI) scanner while the volunteers
chose among a series of slot machines that paid out different amounts of
money. "If we can
understand how people solve problems using past experience we can design
better decision-making machine algorithms that could be used in something
like an autonomous robot, or in perfecting systems such as those used by
Amazon.com to price books with," said Nathaniel Daw, one of the project's
lead researchers. Machines that make better decisions could be of
tremendous benefit to businesses, according to Michele Bezzi of the
Accenture Technology Lab. Bezzi notes that while machines can sift through
enormous volumes of data, they do not have the intelligence to identify
patterns and make interpretations. Accenture has been developing an
intelligent surveillance system that could monitor multiple cameras more
effectively than a person. "An intelligent system could detect
abnormalities and suggest to a guard that something wrong is happening,"
Bezzi says. Futurist Ian Pearson believes self-aware machines could become
a reality as early as 2015, but cautions that they could eventually
displace people for many tasks.
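The slot-machine task in the experiment is essentially a multi-armed bandit problem, the textbook setting for learning from past payoffs. As a generic illustration of the kind of decision-making algorithm such work informs, here is a minimal epsilon-greedy bandit learner; it is a standard method, not the Gatsby group's model.

    # Minimal epsilon-greedy learner for a multi-armed bandit: mostly picks
    # the arm with the best average payoff so far, occasionally explores.
    import random

    class EpsilonGreedy:
        def __init__(self, n_arms, epsilon=0.1):
            self.epsilon = epsilon
            self.counts = [0] * n_arms
            self.values = [0.0] * n_arms      # running mean payoff per arm

        def choose(self):
            if random.random() < self.epsilon:
                return random.randrange(len(self.values))   # explore
            return max(range(len(self.values)), key=self.values.__getitem__)

        def update(self, arm, payoff):
            self.counts[arm] += 1
            n = self.counts[arm]
            self.values[arm] += (payoff - self.values[arm]) / n  # incremental mean
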
New Technology Could Measure the Pleasure of Playing
Games
Vancouver Sun (BC, Canada) (07/07/06) Griffin, Kevin
Electronic Arts is funding research that could help game companies
determine whether a new game is likely to become a hit with consumers. The
company has enlisted the services of Regan Mandryk, who is developing a
system that measures heart rate and facial muscle movements as people play
games. Physical sensors, with electrodes sewn into Velcro strips placed on
gamers' index and ring fingers, help reveal players' levels of excitement,
challenge, frustration, or boredom with a
game. Mandryk developed the system as part of her doctorate at Simon
Fraser University, and the computer scientist believes it could be helpful
in assessing the emotional response of players within an hour of finishing
a game. "If you have a systematic way of determining whether games are
actually fun and enjoyable to play in the developmental stage, developers
and companies would be able to save a lot of money and have a better,
risk-free way of developing good ideas," says Mandryk. Only 10 percent of
games in development are released to the public, and not all of those
become successes. Companies continue to develop games on the hunches of
programmers
and production managers, and sometimes rely on interviews and focus groups
after introducing a new game. Mandryk believes her technology also could
be used to analyze the emotional state of people in stressful jobs such as
air traffic control.
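One common way to turn such physiological signals into labels like excitement or boredom is to place them in a two-dimensional arousal-valence space. The sketch below shows only that general idea; the scaling is invented, and the signal processing that would produce the scores from heart-rate and facial-muscle sensors is omitted.

    # Illustrative mapping from normalized arousal and valence scores (both in
    # -1..1, assumed to be derived upstream from the physiological sensors) to
    # the coarse emotional labels mentioned in play testing.

    def label_state(arousal, valence):
        if arousal >= 0 and valence >= 0:
            return "excited"       # high arousal, positive feeling
        if arousal >= 0 and valence < 0:
            return "frustrated"    # high arousal, negative feeling
        if arousal < 0 and valence >= 0:
            return "relaxed"       # low arousal, positive feeling
        return "bored"             # low arousal, negative feeling
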
Another Spec About to Hatch
SD Times (07/01/06) No. 153, P. 1; Connolly, P.J.
WSDL 2.0 will edge closer to reality later this month when about a dozen
software companies meet at IBM's Toronto Software Lab in the first of two
events aimed at validating their implementations of the specification,
which is presently considered a "Candidate Recommendation" by the W3C. One
of the new features in the specification is the notion of interface
inheritance, which enables WSDL developers not only to define interfaces
but also to incorporate them into larger interfaces. The message
construction of WSDL 2.0 is simplified, with the bulk of the work being
performed by the XML schema. The fact that WSDL 1.1 is still usable is an
obstacle
to the W3C's effort to pass the new version. "The problem is that WSDL 1.1
has been around enough for a while, and is good enough for many purposes,"
said Jonathan Marsh, co-chair of the W3C's Web Services Description Working
Group, though that has given the working group the opportunity to make WSDL
2.0 as clean and functional as possible. A second working group event is
scheduled for the fall, giving developers two opportunities to test out
different implementations and pinpoint where the bugs are. Marsh is happy
with the amount of review that the specification has received, as well as
the coordination between the description and policy sides. HTTP binding
remains an open issue, as many substantive new features have been included
in the WSDL 2.0 binding. "The goal was not to provide a full HTTP
description language, but just to have the kinds of interactions you would
normally do in SOAP...[also] exposed through an HTTP Web service," Marsh
said.
DOE's Federated Model Aims to Identify Security
Threats
Network World (07/05/06) Garretson, Cara
Last fall, Argonne National Laboratory started the Federated Model, an
information-sharing project to be used by government, research labs,
universities, and organizations that want to share or view information on
attempts by particular IP addresses to access their networks and on how
organizations have dealt with those attempts. The Federated Model has about
a half-dozen
members and is steadily growing. The lab, a division of the Department of
Energy (DOE), is trying to add features to the project such as an RSS feed
that notifies members when new information has been added, according to
Scott Pinkerton, manager of network services for the lab, which is operated
by the University of Chicago. Members will eventually be able to stop
an attack by following the examples and actions of fellow members. If a
member of the Federated Model is the victim of an attack from a particular
IP address, then another member will be able to block that IP address from
its own network. "We're reinforcing the idea that we could be smarter, and
more prepared," says Pinkerton.
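The article describes members publishing the IP addresses they have seen probing their networks so that other members can block them. A bare-bones consumer of such a shared feed might look like the sketch below; the feed format, file name, and firewall command are assumptions, not details of Argonne's system.

    # Hypothetical consumer of a shared feed of hostile IP addresses: reads
    # one address per line and prints the firewall rule an operator might
    # apply. The feed format and the use of iptables are assumptions.

    def block_commands(feed_path="shared_blocklist.txt"):
        commands = []
        with open(feed_path) as feed:
            for line in feed:
                ip = line.strip()
                if ip and not ip.startswith("#"):
                    commands.append(f"iptables -A INPUT -s {ip} -j DROP")
        return commands

    if __name__ == "__main__":
        for cmd in block_commands():
            print(cmd)
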
A Virtual Roundtable
Fortune (07/10/06) Vol. 154, No. 1, P. 103; Hira, Nadira A.; Kirkpatrick,
David; Levenson, Eugenia
Director of MIT's Center for Collective Intelligence Thomas Malone says
inexpensive, Internet-enabled communication is allowing organizations to
enjoy the economic benefits of both large and small entities--economies of
scale in the case of the former and flexibility, freedom, creativity,
motivation, and innovation in the case of the latter--by giving people in
large organizations the information to make their own decisions. MySpace
CEO Chris DeWolfe foresees a gradual improvement of individual empowerment
through the enhancement of U.S. mobile networks and handset functionality,
noting that "With mobile access you're no longer limited to a static
depiction of a user's personality but have a real-time representation of
their lifestyle." WPP Group CEO Martin Sorrell says the nature of media
consumption is changing, arguing that it is folly to think you can
manipulate independent media such as blogging for your own benefit; he also
points out that traditional media users often have difficulty absorbing new
media. China Interactive Media Group CEO and blog writer Hung Huang has
found blogging to be a valuable tool for increasing awareness and
circulation of her firm's publications, and notes that through blogging
Chinese people are expressing themselves in ways they did not before.
Director of Microsoft's Live Labs Gary Flake compares the Internet's impact
to that of the Renaissance or the Industrial Revolution, in that it boosts
the fluidity of information exchange and creates new value, and he lists as
examples the accelerated sharing of scientific knowledge and
the emergence of communal intelligence to make the vast volume of data more
comprehensible to individuals. Global Voices Online co-founder Rebecca
MacKinnon says the Internet is emerging as a vehicle for social change,
though its effects on any society's balance of power are still unknown.
Wipro Chairman Azim Premji says increased connectivity and falling
communication costs support offshoring, which "is creating a level playing
field and making remunerative employment available to economically less
developed countries."
Taming the Digital Beast
Campus Technology (06/06) Vol. 19, No. 10, P. 40; Patrizio, Andy
Academic institutions are at the forefront of knowledge production, but
they often lag behind their own students in the implementation of
technology. Librarians are beginning to move up the curve, however, as
they are partnering with campus IT departments to develop digital
repositories to make resources available that would otherwise be
inaccessible. Faculty members are sometimes reluctant to take the trouble
to make their materials accessible to their colleagues at other
institutions, according to Denison University Assistant Provost Scott
Siddall. By way of incentive, Siddall says, "institutions have to say, 'If
you create a unique collection, digitize it, put it up online, and let
people access it, that is scholarship; that is valued, and we're going to
count it in promotion and tenure.' Then people will put it on their radar
screen." Early digital repository initiatives have largely skirted
copyright issues by placing online only materials that are in the public
domain. Convincing faculty that materials can and should be made freely
available online can be a major challenge, particularly with members of the
humanities disciplines. Creating a digital repository must be a
campus-wide effort that includes library sciences administrators, on-campus
IT, and the heads of various departments. Experts are in wide agreement
that digital repository initiatives are doomed to failure and
disorganization without metadata to make the resources searchable. When
creating a digital repository, universities work under tight budgets,
making open source software an appealing option to power repository efforts
such as DSpace, a system built for digital media that is in use at 138
institutions around the world. Back-end support can add cost to
open-source software, but data formatting is a more important concern. It
is important to use internationally recognized standards, such as Dublin
Core for metadata, JPEG 2000 for images, and Adobe PDF for documents.
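As a small illustration of what standards-based metadata buys a repository, the sketch below stores a few Dublin Core-style fields for each item and searches on them; the record contents and field selection are invented for the example.

    # Minimal Dublin Core-style records (title, creator, date, format,
    # subject) and a naive search over them; real repositories such as DSpace
    # index far richer metadata, but searchability rests on shared fields.

    records = [
        {"title": "Campus Oral History Project", "creator": "Example, A.",
         "date": "2005", "format": "application/pdf", "subject": "local history"},
        {"title": "Herbarium Image Collection", "creator": "Example, B.",
         "date": "2006", "format": "image/jp2", "subject": "botany"},
    ]

    def search(records, field, term):
        """Return records whose given Dublin Core field contains the term."""
        term = term.lower()
        return [r for r in records if term in r.get(field, "").lower()]

    print(search(records, "subject", "botany"))
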
Speech and J2EE--A Foundation for More Creative Dialog
Design
Speech Technology (06/06) Vol. 11, No. 3, P. 49; Chirokas, Steve
The ability to easily combine speech systems with dynamic data retrieval
to deliver a speech dialog is increasingly the foundation for more
distinctive and interactive speech applications. The Java 2
Platform, Enterprise Edition (J2EE) infrastructure can help enable
enterprise integration of VoiceXML and HTML, which have emerged as the de
facto speech content delivery language and the de facto Web content
delivery language, respectively. A Java foundation can imbue a hosted or
premises-based speech environment with flexibility, and provide a more
consistent programming resource for embedding security from multiple
vendors. In addition, J2EE can let speech application developers integrate
with enterprise systems as well as bundle their components for enterprise
integration into reusable elements. Speech applications based on VoiceXML
supply a general architecture to integrate with back-end systems, and also
forge connections between the application and agents and systems that
provide advanced call routing and screen pops. Through the Java
infrastructure associated with VoiceXML, organizations can package these
links to existing corporate resources, streamlining the coding of database
queries, CTI integration, and call transfers. As VoiceXML-based speech
integration penetrates the enterprise space, it will continue to reduce the
cost barriers to custom integration and the effort spent on troublesome
"one off" solutions.
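The article's setting is J2EE, but the core idea of combining dynamic data retrieval with a speech dialog can be sketched in a few lines of Python: a server-side handler looks up a value and emits a VoiceXML prompt for the speech platform to read aloud. The lookup table and prompt wording below are assumptions for illustration.

    # Illustrative dynamic VoiceXML generation: look up an account balance and
    # wrap it in a minimal VoiceXML document that a speech platform would
    # render as spoken output. Data and wording are invented for the example.

    BALANCES = {"12345": "two hundred dollars"}  # stand-in for a database query

    def balance_vxml(account_id):
        balance = BALANCES.get(account_id, "unavailable")
        return f"""<?xml version="1.0" encoding="UTF-8"?>
    <vxml version="2.0" xmlns="http://www.w3.org/2001/vxml">
      <form>
        <block>
          <prompt>Your account balance is {balance}.</prompt>
        </block>
      </form>
    </vxml>"""

    if __name__ == "__main__":
        print(balance_vxml("12345"))
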
The Semantic Web Revisited
IEEE Intelligent Systems (06/06) Vol. 21, No. 3, P. 96; Shadbolt, Nigel;
Hall, Wendy; Berners-Lee, Tim
The lack of large-scale, agent-based mediation might be construed as a
failure of the Semantic Web concept, write University of Southampton
professors Nigel Shadbolt, Wendy Hall, and Tim Berners-Lee, but they argue
that agents cannot thrive until standards are well entrenched, and progress
in the development of Web standards for expressing shared meaning has been
steady in the five years since the publication of the first Semantic Web
article in Scientific American. The authors further see the Semantic Web's
eventual success being foreshadowed by the e-science community's use of
ontologies. Standards and languages alone cannot support the Semantic Web;
also critical is uptake, the point where serendipitous reuse of data
becomes possible. The Semantic Web's semantic components are provided by
ontologies, and for that to work, communities of practice must devise,
manage, and endorse the ontologies. The effort that goes into developing
and managing ontologies depends on whether the ontologies are deep or
shallow: Deep ontologies require a substantial effort, while shallow
ontologies, which are composed of small numbers of unchanging terms used to
organize very big data volumes, are simpler. Accommodating "the next wave
of data ubiquity," as the authors put it, will constitute a major
challenge, and the Semantic Web's success will depend on elements that were
critical to the Web's success, social and design factors among them. Much
of that success is associated with the ladder of
authority, which the authors define as "the sequence of specifications
(URI, HTTP, RDF, ontology and so on) and registers (URI scheme, MIME
Internet content type, and so on), which provide a means for a construct
such as an ontology to derive meaning from a URI."