Game Designers Test the Limits of Artificial
Intelligence
Boston Globe (06/17/07) Kirsner, Scott
Bruce Blumberg, senior scientist at Blue Fang Games in Waltham, Mass., and
former professor at MIT's Media Lab, says game developers are doing some of
the most interesting work in artificial intelligence. "You have really
bright kids who are dealing with problems they don't realize are insoluble.
They're very motivated," Blumberg says. Today's video game developers are
working to create more intelligent and realistic characters by mimicking
human intelligence. Meanwhile, online worlds such as Second Life and
"massively multiplayer games" such as World of Warcraft are populated with
remarkably intelligent characters for a simple reason: those characters are
controlled by other humans. Players in these games create their own
characters and then spend most of their time interacting with other
people's characters. "In
some ways, all of these massively multiplayer games have shone a light on
the deficiencies of artificially intelligent characters in games to this
point," says Blue Fang CEO Hank Howie. One possible approach that could
create more human-like artificial intelligence is to have humans "train"
the AI software, which is what MIT Media Lab grad student Jeff Orkin is
doing. Orkin has developed The Restaurant Game, which has players assume
the role of a restaurant's wait staff. Orkin's objective is to capture
their behavior and dialog to build more realistic software-driven
characters, much like motion capture cameras are used to record and
replicate human movement. "Ideally, AI systems in the future will observe
as designers directly control characters, and learn to play roles and even
converse," Orkin says.
Students Unlock the World of Advanced Computing
Grid Today (06/18/07)
At the second annual TeraGrid conference, high school, undergraduate, and
graduate students participated in three competitions designed to increase
their technological prowess, improve their creative problem-solving skills,
and encourage young people to enter careers in science, technology,
engineering, and mathematics. One competition, called "Impact of
Cyberinfrastructure on Your World," allowed students to display their
knowledge and creativity through research papers, posters, and videos
focused on the impact of cyberinfrastructure in everyday lives, and how
scientific discovery and the global community benefit from
cyberinfrastructure. The "Student Research Competition" focused on
student-submitted posters describing the applications and benefits of grid
computing to specific research projects. Finally, the on-site competition
called "Advancing Scientific Discovery," new to the TeraGrid conference
this year, allowed students to use TeraGrid resources and TeraGrid Science
Gateways to solve science-driven problems while contending with challenges
such as time limits and working remotely with team members. University of Northern Iowa
associate professor of computer science Paul Gray said the competition was
definitely not a "pencil-and-paper exam," or a traditional programming
competition. "Students who participated in this particular competition
overcame a key impediment in computational science--'barrier of first
use'--a term used to describe people who would use computational tools for
scientific discovery, if not for the lack of understanding of how to use
the tool," Gray said. "These students were exposed to a new way of doing
science, and a new way of doing competitions."
ISS Computer Woes Concern Europe
BBC News (06/18/07) Klotz, Irene
The recent failure of two computer systems on the International Space
Station (ISS) has raised concern about the Columbus laboratory and the
Automated Transfer Vehicle (ATV), two additions with the same systems. The
Columbus laboratory is scheduled to be launched in December, and the ATV
large supply vessel is scheduled to be launched early next year, but the
European Space Agency (ESA) has opened an investigation to see if the
same problem might occur with those two systems. The ESA formed a team
that joined the multi-national effort to fix the ISS computers. The
Columbus laboratory has similar computers, but the ATV has completely
identical ones, so the ESA wants to ensure any corrective action is taken
before the two sections are launched. The computer systems on the ISS
control the rocket-steering system the station uses to maintain proper
alignment with the sun for heat and energy, and with the earth for
communications. The computers also control life-support equipment, though
that equipment can also be operated manually. Engineers have not yet been
able to identify a specific cause for the computer malfunction, but NASA
and Russian engineers believe the failure was caused by a change in the
electrically charged plasma field that occurred when astronauts from the
shuttle Atlantis attached a new metal beam with a huge pair of solar wings.
NASA space station program manager Mike Suffredini says such sensitive
problems could continue to occur as the space station continues to
change.
Developer Expectations Run High for Google Gears
IDG News Service (06/16/07) Perez, Juan Carlos
Many developers are working on ways to use Google's Gears, an open-source
browser plug-in that allows users to access Google's Web-hosted
applications while offline. Analysts say the enthusiastic response to
Gears shows the high demand for Web-hosted applications that do not require
a constant Internet connection. Gears contains three components--a local
server for storing and delivering "application resources" such as HTML and
JavaScript without a server connection, a database that allows information
to be stored and accessed within the browser, and a component that Google
calls a "worker thread pool" that boosts Web applications' responsiveness
by running operations in the background. Developer Dustin Hand is
considering Gears for a Web application he is developing for an American
Red Cross chapter in Florida. The application needs an offline component
so that it can be used even if a disaster knocks out the Internet
connection. Originally, Hand was considering hosting the database at the
Red Cross office for offline access, but the chapter does not have the
infrastructure to host a database on site. "Google Gears would allow us to
use our existing infrastructure--our off-site database--in the event of
Internet connectivity failure, allowing us to continue to provide expedited
service to the victims," Hand says. Although anyone can download Gears for
free, it is currently intended for developers who can give Google feedback
to improve the system. Google has acknowledged that Gears still needs
significant improvement, but hopes that it will become an industry standard
for offline access to Web-hosted applications.
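Gears exposes those three components to JavaScript running in the browser. The Python sketch below does not use the real Gears API; it only mirrors the same architecture conceptually (a local resource cache, an embedded database, and a background worker pool) to show why the combination makes offline use of a Web application possible.

```python
import sqlite3
import threading
import queue

class LocalResourceCache:
    """Stands in for Gears' local server: serves previously captured
    application resources (HTML, JavaScript) without a network."""
    def __init__(self):
        self._store = {}

    def capture(self, url, content):
        self._store[url] = content

    def fetch(self, url):
        return self._store.get(url)  # served even when offline

class LocalDatabase:
    """Stands in for Gears' embedded database: data is stored and
    queried inside the client rather than on a remote server."""
    def __init__(self, path=":memory:"):
        self.conn = sqlite3.connect(path, check_same_thread=False)
        self.conn.execute("CREATE TABLE IF NOT EXISTS notes (body TEXT)")

    def add_note(self, body):
        with self.conn:
            self.conn.execute("INSERT INTO notes VALUES (?)", (body,))

    def all_notes(self):
        return [r[0] for r in self.conn.execute("SELECT body FROM notes")]

class WorkerPool:
    """Stands in for Gears' worker thread pool: long-running jobs run
    in the background so the user interface stays responsive."""
    def __init__(self):
        self.jobs = queue.Queue()
        threading.Thread(target=self._run, daemon=True).start()

    def submit(self, fn, *args):
        self.jobs.put((fn, args))

    def _run(self):
        while True:
            fn, args = self.jobs.get()
            fn(*args)
            self.jobs.task_done()

# Offline flow: cached UI plus local writes, synced later by a worker.
cache, db, pool = LocalResourceCache(), LocalDatabase(), WorkerPool()
cache.capture("/app.html", "<html>offline-capable UI</html>")
db.add_note("Shelter at 5th St. needs water")   # written with no connection
pool.submit(print, "pretend sync of", len(db.all_notes()), "notes")
pool.jobs.join()
```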
Dalhousie Researchers Awarded Top Prizes
Dalhousie University (06/12/07) Smulders, Marilyn
Canada's Natural Sciences and Engineering Research Council (NSERC) has
awarded Dalhousie University graduate student Connie Adsett an Andre Hamer
Postgraduate Prize for her research into improving rules-based
text-to-speech computer programs. With exceptions being somewhat the rule
for the English language, Adsett is working to move beyond rules-based
systems and develop new tools for automatically breaking words into their
proper syllables, and ultimately to improve the way text-to-speech computer
systems talk. "There are a lot of good reasons for figuring out how to
make the computer pronounce words better," says Adsett. The technology
could be used to read electronic documents aloud to blind people and to
give a voice to people who are unable to speak. What is more,
handheld computers may one day read email messages to their users via the
text-to-speech technology. NSERC awarded the other prize to Erin Johnson,
a Queen's University Ph.D. student who performs chemical experiments on
computers instead of in the lab. A computational chemist, Johnson is
improving and developing new modeling methods used to predict the behavior
of chemicals.
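Adsett's specific techniques are not described here, but the gap between a fixed pronunciation rule and a data-driven lexicon can be sketched briefly. The Python example below is purely illustrative, with invented data, and is not her system: a naive vowel-based rule mis-splits even common words, while a learned lexicon of known syllabifications handles the exceptions.

```python
# Tiny lexicon of known syllabifications (hand-made for this example);
# a real data-driven system would learn from a pronunciation dictionary.
LEXICON = {
    "table": ["ta", "ble"],
    "syllable": ["syl", "la", "ble"],
    "reading": ["read", "ing"],
}

VOWELS = set("aeiouy")

def naive_rule(word):
    """Toy rules-based split: begin a new syllable whenever a vowel
    follows a consonant. English exceptions (silent 'e', vowel
    digraphs) break this constantly."""
    syllables, current, prev_was_vowel = [], "", False
    for ch in word:
        is_vowel = ch in VOWELS
        if is_vowel and not prev_was_vowel and current:
            syllables.append(current)
            current = ""
        current += ch
        prev_was_vowel = is_vowel
    syllables.append(current)
    return syllables

def syllabify(word):
    """Prefer learned data over the rule; fall back only for unseen words."""
    return LEXICON.get(word, naive_rule(word))

print(naive_rule("table"))   # ['t', 'abl', 'e']  -- the rule gets it wrong
print(syllabify("table"))    # ['ta', 'ble']      -- the lexicon handles it
```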
Conference Keynote: People Matter Most
HPC Wire (06/15/07) Schneider, Michael
Former director of Defense Research and Engineering for the Department of
Defense Anita Jones was the keynote speaker at the recent TeraGrid '07
conference, and she discussed the evolution of cyberinfrastructure since
the 1970s. She explained that today there is a renewed awareness of a
broader extension of access via the TeraGrid and the National Science
Foundation cyberinfrastructure program, while the measures of computational
research investment have likewise matured from the number of machines to
computational performance on benchmarks. The next set of measures, Jones
said, should go beyond cycles delivered to the computation and should
consider other variables, such as the effective use of memory and the
usability of visualization. She said the most important factor is people's
expertise, followed by software. "When lead scientists have funds to hire
the next post-doc, in almost every case they hire a discipline scientist,"
Jones explained. "This is often a mistake, because you have less than
first-class knowledge about how to do the computation." She noted that it
is the NSF's responsibility to strike a balance between the broad access
and the high-end, which means centers will still be a necessity, as "it's
important to keep high-end computing at educational sites." Jones pointed
out that high-end computing offers the "unfair advantage" that one looks
for in competition. She said TeraGrid embodies the idea that
interdependence is critical to decisions in cyberinfrastructure.
Watching Virus Behavior Could Keep PCs Healthy
New Scientist (06/15/07) Simonite, Tom
A prototype anti-virus system developed at the University of Michigan uses
the "fingerprint" of virus activity to more effectively identify viruses.
The system obtains such fingerprints by intentionally infecting a
quarantined computer with viruses. Conventional anti-virus software
monitors systems for suspicious activity and then tries to determine the
source by checking for virus signatures, which makes it difficult to spot
new pieces of malware and track different variations. The University of
Michigan team studied the files and processes malware created and modified
on an infected computer, and developed software that uses the information
gathered to identify malware. The prototype is capable of defining
clusters of malware that operate in similar ways, and can create a kind of
family tree that illustrates how superficially different programs have
similar methods of operation. In tests on the same collection of malware,
the prototype identified at least 10 percent more of the samples than five
leading anti-virus programs did. The prototype also always correctly connected
different pieces of malware that operate similarly, while the best
anti-virus program was only able to identify 68 percent of such links.
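The grouping the Michigan researchers describe can be loosely pictured as clustering samples by the similarity of their behavioral fingerprints, i.e., the files and processes they create or modify. The sketch below uses Jaccard similarity and a greedy single-link merge on invented fingerprints; it illustrates the general technique and is not the team's actual algorithm.

```python
from itertools import combinations

# Hypothetical behavioral fingerprints: the files, processes, and registry
# keys each sample touched while running on a quarantined machine.
fingerprints = {
    "worm.a":  {"create C:\\svch0st.exe", "modify hosts", "spawn smtp"},
    "worm.a2": {"create C:\\svch0st.exe", "modify hosts", "spawn smtp",
                "create C:\\tmp\\x.dll"},
    "troj.b":  {"modify registry Run", "open port 6667"},
    "troj.b1": {"modify registry Run", "open port 6667", "drop keylog.dll"},
}

def jaccard(a, b):
    """Similarity of two behavior sets: |intersection| / |union|."""
    return len(a & b) / len(a | b)

def cluster(fps, threshold=0.5):
    """Greedy single-link clustering: samples whose fingerprints are
    similar enough end up in the same 'family'."""
    families = {name: {name} for name in fps}
    for x, y in combinations(fps, 2):
        if jaccard(fps[x], fps[y]) >= threshold:
            merged = families[x] | families[y]
            for member in merged:
                families[member] = merged
    return {frozenset(f) for f in families.values()}  # deduplicate

for family in cluster(fingerprints):
    print(sorted(family))
```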
Today's Tutorial Will Take Place on the Virtual
Beach
Times Online (UK) (06/18/07) Frean, Alexander
British educational institutions are trying to gain a better understanding
of the potential of Second Life to serve as a tool for teaching and
research. Edinburgh University and Oxford University are among the
educational institutions that are already experimenting with the
Internet-based virtual world, where residents create avatars to interact
with each other. For example, Austin Tate, a professor at the virtual
University of Edinburgh (VUE) who specializes in using artificial
intelligence for search and rescue work, delivers lectures via an avatar
that wears a skydiving suit, and he plans to create a virtual diorama in
which students can practice extracting survivors from burning buildings or
blocked tunnels.
Andy Powell, of the research foundation Eduserv, believes things that are
too dangerous, expensive, or impossible in the real world could be done in
a virtual classroom, but says the educational establishment is just
scratching the surface with regards to what can be done in the virtual
world. "It is a bit like the early days of the Internet--everybody knows
that it has huge potential, but they are still figuring out what the best
uses will be," says Powell. London Knowledge Lab researcher Martin Oliver
believes the big challenge will be to devise entirely new ways to use the
virtual world as an educational tool.
Army, Air Force Seek to Go on Offensive in Cyber
War
GovExec.com (06/13/07) Brewin, Bob
The Air Force held its industry day for its Network Warfare Operations
Capabilities solicitation in San Antonio on June 14, 2007, after announcing
in April that it wants tech firms to provide technology that will enable
the service to go on the offensive against those who launch cyberattacks.
In May, the Army released a similar announcement, and the service expects
to receive responses from the computer industry by the end of June. The
cyberattacks that the Army and Air Force plan to launch are referred to as
offensive information operations (OIOs), and the services are also
interested in technology that will prevent enemy computer systems from
detecting and countering OIOs. According to the request for information
from the Air Force's 950th Electronic Systems Group, the technologies would
help to "disrupt, deny, degrade or deceive an adversary's information
system." The solicitations are consistent with the offensive cyberattack
capabilities that Marine Gen. James Cartwright, commander of the Strategic
Command, discussed during a hearing of the House Armed Services Committee
in March. If "we apply the principle of warfare to the cyber domain, as we
do to sea, air and land, we realize the defense of the nation is better
served by capabilities enabling us to take the fight to our adversaries,
when necessary, to deter actions detrimental to our interests," Cartwright
said.
Future Shock
Sydney Morning Herald (Australia) (06/14/07) Byrne, Seamus
A variety of technologies with potentially revolutionary applications are
on the horizon. Robots are poised to make a splash, but rather than the
versatile humanoid machines popularized in science fiction, they will
likely be utilitarian, free-roaming, task-specific devices embedded in the
home. Molecular nanotechnology promises to engineer molecule-sized
machines and systems, whose potential applications include the fabrication
of any object or substance out of ambient particles, implantable medical
devices that travel in the bloodstream to fight disease or maintain a
healthy metabolism, and a "utility fog" in which networked nanobots hang in
the air, performing tasks. The exchange of data through skin contact or
wireless transmission is also a future technology drawing interest, one
that dovetails with the emergence of "personal networks" that can share
such information with the user's mobile devices. Mind-controlled
interfaces that enable movement of virtual avatars by thought could
revolutionize the gaming industry, while buildings could be embedded with
smart systems for environmental control and power supply, among other
things. Smart clothing that can keep itself dry, can be outfitted with
electronic devices for navigation and other functions, and can diagnose the
wearer's health and treat injuries is under development. Gene therapy
promises to make hereditary medical conditions a thing of the past, while
advanced prosthetics could give people enhanced strength and endurance.
Ethical questions over the use of genetic research will undoubtedly
influence future applications.
Indian-American First Female Recipient of Robotics Tech
Award
Hindu (06/18/07)
Aeolean CEO Bala Krishnamurthy is a 2007 winner of the Engelberger
Robotics Award for Technology Development. Krishnamurthy is the first
female to receive the award. Over the past 25 years, she has designed and
developed programming languages, networked systems, and related
technologies that have contributed to the growth of robotics. A pioneer in electric
and hydraulic industrial robots in the early stages of her career at
Unimation, Krishnamurthy applied the VAL language to the hydraulic Unimate
robot and headed the firm's software design and development effort
involving the third generation UNIVAL controller. An Indian-American,
Krishnamurthy later developed software that allowed autonomous robots to
navigate hospitals, a mobile research base, and a 3D range sensor at
HelpMate Robotics. She has also helped develop algorithms for the
Tennessee Valley Authority's 21-axis Robotic Transmission Line Rover and
next-generation robots for a European manufacturer, and served as a member
of NASA's Office of Exploration Systems (OExS) research proposal review
panel for Human and Robotic Technology in 2004. Krishnamurthy and the
other 2007 winners have helped popularize robotics today, says Joseph F.
Engelberger. "Their innovations and perseverance have led to the use of
robots in new ways, in educational curriculum, and have made it possible
for companies to gain a foothold and prosper in the global economy they
compete in," he says.
FBI: Operation Bot Roast Finds Over 1 Million Botnet
Victims
Network World (06/13/07) Cooney, Michael
The FBI and the Department of Justice announced that their ongoing
cyber-crime investigations have so far detected over 1 million victims of
botnet crime. Operation Bot Roast aims to interrupt and dismantle
botherders and has caught three major botnet operators so far, including
"Spam King" Robert Alan Soloway. Bots are considered to be one of the top
industry scourges. Their destructiveness is illustrated by a report from
security vendor Mi5, which installed a Web security beta product at a company
with 12,000 nodes and identified 22 active bots, 123 inactive bots, and 313
suspected bots within one month. The discovered bots had caused 136
million bot-related episodes. In addition, after examining over 4.5
million Web pages, Google researchers reported that 10 percent of Web pages
were booby-trapped with malware, and 16 percent seemed to contain dangerous
code. By inadvertently allowing access to their machines, unwitting
computer owners let their systems be used as vehicles for crimes such as
denial-of-service attacks and phishing. Botnets are also
increasingly threatening to national security, due to their ability to be
widely distributed. Operation Bot Roast plans to inform the unwitting
owners of hijacked computers. Meanwhile, citizens can guard against
botnets by adhering to strong computer security practices.
It's Lean, But is it Agile?
SD Times (06/15/07) deJong, Jennifer
Kent Beck, the inventor of extreme programming (XP), says lean software
development and agile software development, which includes XP, are similar
approaches to developing software and are closely aligned in some respects.
Lean software development follows the methodology of lean manufacturing,
developed by Toyota in Japan as early as the 1940s. While the process of
building cars is rather different from software development, the basic
principles can be applied to both fields. For example, a key theme in lean
manufacturing is eliminating waste. In software development, waste is
considered anything that does not directly support the customer, such as
creating requirements that will need to be changed later. Another key
principle in lean manufacturing is to stop production when a flaw is noticed
and to fix it immediately, which in software amounts to fixing bugs as soon
as they are discovered. Alistair Cockburn, creator of the agile methodology Crystal,
said that he and colleagues knew little about lean manufacturing when they
authored the Agile Manifesto in February 2001, which coined the term
"agile." "But since then I have been doing a lot of reading about lean,
and I can't see that we in the agile community have added much of anything
to what Toyota was already doing," Cockburn said. However, Ward Cunningham,
a key contributor to the Agile Manifesto, believes that ideas from
manufacturing cannot be successfully applied to software development.
"Software development is a knowledge activity, not a material processing
activity," Cunningham said. "Manufacturing has been improved by insights
associated with lean, but software has never been improved by modeling its
processes on manufacturing." Agile or not, lean is being adopted by
business executives because it is based on a concept the business world is
familiar with, Beck said, whereas agile development has mostly been
marketed to IT executives.
Denial-of-Service Attacks: Street Crime on the Web
New Scientist (06/06/07) Vol. 194, No. 2607, P. 30; Giles, Jim
Malefactors are increasingly using denial-of-service (DoS) attacks--the
practice of crippling Web connections with a flood of traffic--to steal
money from unaware Web site owners, and the method's persistence is aided
by the fact that individual users and small companies generally cannot
afford anti-DoS safeguards. "There are more players, better players, in
the market than just a year ago," notes Arbor Networks computer security
specialist Jose Nazario. One of the most common techniques to launch DoS
attacks is to contaminate computers with bot software that lies dormant on
the compromised PC until it is instructed to link with the target Web site,
and the simultaneous accessing of the site by massive numbers of
bot-infected PCs can often cause the server to crash. University of
California, San Diego researchers determined that over half of the more
than 68,000 DoS attacks perpetrated between 2001 and 2004 targeted home
users or small businesses, and among the more serious kinds of attacks are
those used to hold sites for ransom. University of Washington computer
networks expert Tom Anderson thinks Web sites must be more selective about
whom they communicate with if DoS attacks are to be countered. He and his
colleagues have developed a protocol for online information exchange in
which sites insert a token in the code they share with visiting computers;
software installed at the site's ISP interprets the token as proof of
legitimate communication. The distribution of these tokens would be
halted if the site is attacked, spurring the ISP to impede incoming
connections upstream to prevent the site from seizing up.
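As described, Anderson's scheme amounts to a capability that the site hands out and the ISP checks upstream. The sketch below is a speculative simplification of that flow, assuming an HMAC-signed, short-lived token and a shared key between the site and its ISP; it is not the researchers' actual protocol.

```python
import hashlib
import hmac
import time

# Hypothetical shared secret between the Web site and its ISP (assumption).
SITE_ISP_SHARED_KEY = b"site-and-isp-shared-secret"

def issue_token(client_id, ttl=300):
    """Site side: hand a visiting client a short-lived signed token."""
    expires = int(time.time()) + ttl
    msg = "{}:{}".format(client_id, expires).encode()
    sig = hmac.new(SITE_ISP_SHARED_KEY, msg, hashlib.sha256).hexdigest()
    return "{}:{}:{}".format(client_id, expires, sig)

def isp_filter(token, under_attack):
    """ISP side: during an attack, admit only traffic that carries a
    valid, unexpired token; drop everything else upstream."""
    if not under_attack:
        return True                  # normal operation: let traffic through
    if token is None:
        return False                 # flood traffic carries no token
    try:
        client_id, expires, sig = token.split(":")
    except ValueError:
        return False                 # malformed token
    msg = "{}:{}".format(client_id, expires).encode()
    expected = hmac.new(SITE_ISP_SHARED_KEY, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and int(expires) > time.time()

# A legitimate visitor obtained a token before the attack; a bot did not.
token = issue_token("visitor-42")
print(isp_filter(token, under_attack=True))   # True: allowed upstream
print(isp_filter(None, under_attack=True))    # False: dropped at the ISP
```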
REAL Nightmare
Governing (06/07) Vol. 20, No. 9, P. 24; Perlman, Ellen
Although states are not bound to follow the 2005 REAL ID Act, a federal
law that aims to fight terrorism by improving security for state driver's
licenses, some have nonetheless been very vocal about what they say are
problems with the legislation. One of the biggest complaints among the
states is the high cost of following the REAL ID Act's recommendations,
which include verifying drivers' original identity documents--such as birth
certificates and Social Security cards--when they show up at DMV offices to
get a new license or renew their old one. According to the National
Governors Association, states are likely to spend at least $11 billion of
their own money over the next five years to get REAL ID up and running.
The biggest factor contributing to this expense is the more than 2.1
million hours of computer programming states will need to adapt their
systems for new requirements for things such as eligibility verification
and database design. Another concern is that REAL ID needs to be supported
by a variety of databases containing citizens' personal information if the
program is to work nationwide, a big worry for some states and civil
liberties groups. Though states can always opt out of REAL ID, as Montana
and Washington have already done, doing so could create major
inconveniences to their residents because they would not be able to use
their driver's licenses to board airplanes or enter secure federal
facilities. Although the legislation may be burdensome to states, it is
nonetheless important that states implement its recommendations because
they address the known vulnerability with state-issued driver's licenses:
the ability of criminals, such as terrorists, to use identity documents to
obtain a fraudulent driver's license, said the Department of Homeland
Security's Russ Knocke. "Shame on us if we don't take steps to fix it," he
said.
Saving the Internet
Harvard Business Review (06/07) Zittrain, Jonathan
Berkman Center for Internet & Society co-founder Jonathan Zittrain
comments that the openness of the Internet and PCs is responsible for both
their incredible success and their vulnerability to abuse, and he warns
that one solution to this vulnerability--"tethered appliances" that can be
instantly modified by vendors or service providers, but not users--could
rob the Internet of its creative connectivity, and endanger companies whose
business models rely on drawing and communicating easily with clients
online. Zittrain writes that the advantages and disadvantages of the
combined Internet/PC reside in its generativity, which he describes as "a
system's capacity to produce unanticipated change through unfiltered
contributions from broad and varied audiences." Generativity is defined by
four core elements: The strength of a system or technology's leverage on a
series of possible tasks; adaptability to a spectrum of tasks; ease of
mastery; and accessibility. The two chief benefits of generativity are
innovative output (new things that enhance people's lives) and
participatory input (the opportunity to link and collaborate with others,
and creatively express one's individuality). The dark side of generative
technologies is their potential for use in malevolent endeavors, which
include fraud, vandalism, malware, spam, pornography, and assaults against
Web sites and the Internet's integrity. "The fundamental tension is that
the point of a PC is to be easy for users to reconfigure to run new
software, but when users make poor decisions about what new software to
run, the results can be devastating to their machines and, if they are
connected to the Internet, to countless others," Zittrain explains.
Reducing or eliminating the role of the PC as the hub of the IT environment
by opting for tethered appliances will stymie technical innovation and
remove the "safety valve" that maintains the honesty of information
appliances, the author warns. Without such innovation, new social
networks, communities of interest, and experiments in collective
intelligence will be hindered, stunting the growth of new forms of culture,
political activism, and participation.
Jaron's World
Discover (06/07) P. 59; Lanier, Jaron
Jaron Lanier attributes the poor quality of most software to the onerous
way programmers are forced to think about software, and he supports a more
human-friendly approach to software and complexity called phenotropics.
"The core idea of phenotropics is that it might be possible to apply
statistical techniques not just to robot navigation or machine vision but
also to computer architecture and general programming," Lanier writes. He
explains that the phenotropic approach takes a cue from biological
evolution, whereby a small revision in DNA results in a small revision in
an organism that is often sufficient to enable gradual evolution; in
contrast, changes to computer code usually result in "shockingly random"
consequences. Lanier conceives of software composed of modules that can
identify each other with pattern recognition, which could possibly lead to
a large software system with no susceptibility to perpetual random logic
errors. One configuration the modules could take is that of a user interface
similar to the contents of a window on a Vista or Mac desktop. "Now that
machine vision and other pattern-based techniques are becoming reliable, it
is finally conceivable for Web pages to use each other at the
user-interface level, the same way that humans use them," Lanier notes,
adding that this could result in adaptable mashup constructions free of
random logic errors. The author says it is probable that pattern
recognition will play an increasingly vital role in digital
architecture.