Tartalo the Robot Is Knocking on Your Door
Basque Research (06/18/08) Bulegoa, Prentsa
University of the Basque Country researchers are developing a robot called
Tartalo that can find its way around on its own by identifying different
places, and that asks permission before entering new ones.
The university's Autonomous Robotics and Systems Research Team is working
to increase robot autonomy so that robots can make decisions for themselves
and carry out a wider variety of tasks without help. Tartalo, a
1.5-meter-tall robot, can sidestep any obstacle in its path, which it
detects using sensors and a laser that measures the distance to any object
within a 180-degree arc. The sensors and the robot's programming allow Tartalo to
wander without problems, but what the researchers truly want to achieve is
a robot capable of going anywhere it is told. The researchers are using
biomimetic systems as a basis for developing such capabilities. The result
would be that Tartalo would do the same thing a person would do in a new
environment--explore the terrain and note points of interest. Tartalo has
been programmed to recognize common structures in buildings, including rooms,
corridors, a front hall, and a "junction." In new environments, Tartalo
runs an auto-location process where the robot moves around the area to find
and memorize the location of these structures, creating a topological map
that the owner could use to teach the robot new places.
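The article gives no implementation details, but the topological map it describes, with places such as rooms, corridors, and junctions linked by traversals, maps naturally onto a labeled graph. The sketch below is only an illustration under that assumption; the class and place names are invented, not taken from the researchers' code.

```python
from collections import defaultdict, deque

class TopologicalMap:
    """Toy topological map: places are nodes, traversals are edges."""

    def __init__(self):
        self.kind = {}                 # place name -> structure type
        self.edges = defaultdict(set)  # adjacency between places

    def add_place(self, name, kind):
        self.kind[name] = kind

    def connect(self, a, b):
        self.edges[a].add(b)
        self.edges[b].add(a)

    def route(self, start, goal):
        """Breadth-first search for a sequence of places from start to goal."""
        frontier, seen = deque([[start]]), {start}
        while frontier:
            path = frontier.popleft()
            if path[-1] == goal:
                return path
            for nxt in self.edges[path[-1]] - seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
        return None

m = TopologicalMap()
for name, kind in [("hall", "front hall"), ("c1", "corridor"),
                   ("j1", "junction"), ("lab", "room"), ("office", "room")]:
    m.add_place(name, kind)
for a, b in [("hall", "c1"), ("c1", "j1"), ("j1", "lab"), ("j1", "office")]:
    m.connect(a, b)

print(m.route("hall", "lab"))  # ['hall', 'c1', 'j1', 'lab']
```

In the article's terms, the `kind` labels stand in for the structures Tartalo memorizes, and an owner "teaching the robot new places" amounts to adding nodes and edges to such a map.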
Southern Utah University Brings Supercomputing Into the
Classroom with Star-P
Sun Herald (Mississippi) (06/17/08)
Southern Utah University (SUU) believes that high-performance computing
(HPC) should be more than a tool available only to elite research labs, and
should instead be a common feature in a variety of undergraduate studies. It is
working with Interactive Supercomputing (ISC) to make HPC easy and
accessible for students and faculty at all levels as part of a program
developed by SUU's College of Computing, Integrated Engineering, and
Technology (CCIET) to create an integrated, interdisciplinary curriculum
that combines math, engineering, and computer science. SUU is using ISC's
Star-P software to provide students and faculty with access to the school's
powerful Dell 128-node parallel cluster. The platform allows SUU users to
easily program models and algorithms using familiar desktop languages by
automatically transforming the application to run on the parallel clusters,
while also eliminating the need to re-program the applications in complex
languages such as C and FORTRAN or the need to use a message passing
interface to run in parallel. "Our mission is to help students achieve
their academic goals and to compete on a global level for careers in
government, industry, secondary education, and acceptance to graduate
school," says CCIET dean and professor of mathematics Mikhail Bouniaev.
"Supercomputing is increasingly playing a critical role in those career
paths."
Man vs Machine Poker Challenge Announced
Online Casino News (06/17/08)
After losing a close contest during last year's inaugural Man Versus
Machine Poker Championship, a redesigned Polaris artificial intelligence
program will once again challenge some of the best poker players in the
world. Polaris 2 was developed by researchers from the University of
Alberta's Computer Poker Research Group, and the program will compete
against professional poker players Nick Grudzien, IJay Palansky, and Matt
Hawrilenko at the second Man vs. Machine Championship, to be held during
the 2008 Gaming Expo at the Rio All-Suite Hotel and Casino from July 3 to
July 6. Poker coach Bryce Paradis says the Polaris 2 team has made
significant improvements to the program since last year's match, with
perhaps the most incredible improvement being the program's ability to
learn from and adapt to its opponents as play progresses. "This year's Man
Versus Machine match is going to push our team to their limit," says
Paradis. Last year in Vancouver, Polaris 1 played professional poker
players Phil Laak and Ali Eslami; the match ended with two wins, one loss,
and one statistical tie in favor of the human players. Each round in the
match consisted of 500 duplicate hands, in which the same series of cards
was dealt twice and the humans and the computer swapped hands between the
two playings. At the end of the match, the total number of chips won or
lost by each team determined the winner. Organizers will use the same
format this year to reduce the element of random luck to a minimum.
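The duplicate-hand format can be made concrete with a small scoring sketch. Assuming, as the article describes, that every series of cards is played twice with the seats swapped, summing a team's chips across both playings cancels most of the luck of the deal; the chip numbers below are invented for illustration.

```python
def duplicate_score(results_a, results_b):
    """Score a duplicate match.

    results_a: chips the human team won on each deal in the first playing
               (negative = lost to the machine).
    results_b: chips the human team won on the same deals replayed with
               seats swapped, so the humans hold the machine's old cards.
    Returns the human team's net chip total; luck from the cards largely
    cancels because both sides played both sets of cards.
    """
    return sum(results_a) + sum(results_b)

# Hypothetical 3-deal round: humans drew strong cards in the first playing...
first = [120, -40, 75]
# ...and those same strong cards went to the machine in the replay.
second = [-90, 55, -60]
print(duplicate_score(first, second))  # 60
```

A team that merely rode good cards scores near zero under this format; only a genuine skill edge survives the summation.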
Intel Develops Programming Language for Multi-Core
Computers
InformationWeek (06/12/08) Gonsalves, Antone
Intel showed off a new programming language for multi-core computing,
called Ct, at the Computer History Museum in Mountain View, Calif., on June
11. An extension of C/C++, the programming language automatically
partitions code to run on specific cores. "With Ct, it's almost like
you're writing to a single-core machine," said Intel researcher Mohan
Rajagopalan during the open house for Intel labs. "You leave it to the
compiler and runtime to parallelize."  Intel developed the Ct compiler,
along with a runtime and an API; the compiler divides the code to run on
separate cores based on the type of data and the operations performed on
it. Less than 5 percent of Ct is new, so C/C++
programmers will find it easy to use. Rajagopalan also noted that programs
compiled in Ct can scale to the available number of cores. Intel is
relatively close to bringing to market a product developers will be able to
use to make financial analytics applications and software for processing
images or decoding video.
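Ct's syntax is not shown in the article, so the sketch below uses NumPy only to illustrate the same programming model: express an operation over a whole array "as if writing to a single-core machine," and leave the library or runtime free to decide how to execute it, instead of hand-coding the loop.

```python
import numpy as np

def scale_loop(xs, factor):
    """Explicit per-element loop: the programmer spells out iteration
    order, pinning the code to a sequential execution model."""
    out = [0.0] * len(xs)
    for i in range(len(xs)):
        out[i] = xs[i] * factor
    return out

def scale_array(xs, factor):
    """Whole-array expression: no iteration order is specified, so the
    underlying implementation (NumPy here; the Ct compiler and runtime
    in Intel's case) is free to split the work across cores or vector
    units without the programmer changing the source."""
    return np.asarray(xs) * factor

data = [1.0, 2.0, 3.0, 4.0]
assert scale_loop(data, 2.5) == list(scale_array(data, 2.5))
```

This is the sense in which a Ct program can "scale to the available number of cores": the source states *what* to compute over the data, and the parallelization decision is deferred to the compiler and runtime.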
NASA Tests Robots for Manned Move to Mars
Computerworld (06/12/08) Gaudin, Sharon
When astronauts set foot on Mars sometime in the future, they may be
greeted by robots and robotic rovers sent there in advance to help their
human counterparts on what will probably be NASA's most challenging mission
to date. Although the mission is still years away, NASA is testing the robots
now, and is evaluating the three robotic arms aboard the International
Space Station along with the results from space shuttle Discovery's recent
flight, one of the most robotically intensive missions yet. NASA's Allard
Beutel says NASA keeps
increasing the complexity of the robotics work, and such work has become
commonplace. Beutel says when astronauts get to Mars, they will use robots
as part of their everyday existence, including possibly building a
workstation or habitat structure on Mars. The crew of the space shuttle
Discovery helped deliver a 33-foot-long, 1,716-pound Japanese-built robotic
arm to the space station. The arm has six joints and is designed for use
outside the Japanese Experiment Module, specifically for moving materials
outside the airlock so scientists can see how they react when exposed to
space, for example. Beutel says a second Japanese-built robot arm, which
will be about six feet long and have a grapple on the end, is scheduled to
be delivered to the space station next year. Another robotic arm on the
space station, called Dextre, is designed to handle most of the exterior
maintenance jobs on the space station, reducing the number of dangerous
space walks the astronauts must make. On Mars, the Phoenix Mars Lander is
using a robotic arm to scoop soil and ice from the planet's north polar
region and deliver the material to its onboard analysis instruments.
National Conference on Sustaining the Science and
Engineering Workforce to Be Held at UMass Amherst June 24
University of Massachusetts Amherst (06/16/08)
The University of Massachusetts Amherst will host the national conference
"Best Practices for Science Education: Retaining Science and Engineering
Undergraduates, Sustaining the Science and Engineering Workforce" on June
24. Dean George Langford of the UMass Amherst College of Natural Science
and Mathematics says the conference is in response to a serious problem at
the national level, specifically the large percentage of students who enter
college interested in science and engineering disciplines but switch to
other majors, compounded by the fact that job opportunities in these fields
have quadrupled. "This disparity has led to outsourcing of talent, which
undermines the U.S. leadership position in educating and training the next
generation of innovators," says Langford. The conference's keynote address
will be given by the National Science Foundation's Linda Slakey, who worked
at UMass Amherst from 1973 until 2006 and served as dean of the College of
Natural Sciences and Mathematics, dean of the Commonwealth College, chair
of the department of biochemistry and molecular biology, and professor of
biochemistry. The conference is intended to identify activities that
increase the number of undergraduates in science and engineering
disciplines, and highlight curriculum reform and best practices for
retaining undergraduates. The ultimate goal is to expand the number of
graduates entering the science and engineering workforce, and to build
momentum and visibility for legislation changes and new funding
opportunities. The conference will feature four breakout sessions: Two
concurrent sessions will examine issues surrounding classroom
effectiveness, with a focus on large enrollment classes and creating a
community of scientists; another set of concurrent sessions will look at
methods for building research into the curriculum, and more effective
introductory lab courses and alternatives to labs.
Space Station Could Beam Secret Quantum Codes by
2014
Scientific American (06/08) Minkel, J.R.
Researchers hope to be running an experiment on the International Space
Station (ISS) by the middle of the next decade that would allow for
transcontinental transmissions of secret messages encoded using the quantum
property of entanglement, in which two particles, such as photons, created
by the same event remain correlated with each other no matter how far apart
they are. Transmitting entangled pairs of photons reliably is
the foundation of quantum key distribution, a procedure that converts
those pairs of photons into potentially unbreakable codes. Photons can
travel maybe 100 miles on modern fiber-optic cables before their quantum
character breaks down, but that limit disappears above ground. Last year,
a team of researchers led by physicist Anton Zeilinger from the University
of Vienna successfully transmitted quantum keys up to 89.5 miles between a
pair of telescopes in Spain's Canary Islands. Now they want to send
quantum keys hundreds of miles or even more. The group is leading an
international project called Space-QUEST, or Quantum Entanglement for Space
Experiments, with the intention to prove that a system for generating pairs
of entangled photons can fit the constraints imposed by the ISS. Quantum
keys distributed from the ISS could be transmitted to any two points within
the station's line of sight, limited only by the ability of transmitters
and receivers to maintain a tight lock on one another and isolate entangled
photons from background light. Earlier this year, Zeilinger and his
colleagues demonstrated they could detect single photons reflected off a
satellite 3,700 miles above Earth. Space-QUEST hopes to build a prototype
device and
gather preliminary data in time for a meeting of the European Space Agency
in November, where officials will decide what projects will receive funding
and earn the chance to be run in space.
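A heavily simplified model can make the key-distribution idea concrete: both parties measure shared pairs in randomly chosen bases, publicly compare bases (never outcomes), and keep only the matching-basis results. This toy sketch is an idealization that ignores noise, eavesdropping checks, and the actual Space-QUEST protocol details, none of which the article specifies.

```python
import random

def sift_keys(n_pairs, seed=7):
    """Toy model of entanglement-based quantum key distribution.

    Each entangled pair is measured by Alice and Bob in independently
    chosen random bases (0 or 1).  In this idealized model, measurements
    in the SAME basis yield perfectly correlated bits; mismatched-basis
    results are discarded during the public "sifting" step, leaving the
    two parties with an identical secret key.
    """
    rng = random.Random(seed)
    alice_key, bob_key = [], []
    for _ in range(n_pairs):
        basis_a, basis_b = rng.randint(0, 1), rng.randint(0, 1)
        outcome = rng.randint(0, 1)   # shared result when bases agree
        if basis_a == basis_b:        # kept after comparing bases publicly
            alice_key.append(outcome)
            bob_key.append(outcome)
    return alice_key, bob_key

a, b = sift_keys(1000)
assert a == b          # the sifted keys match exactly
print(len(a))          # roughly half the pairs survive sifting
```

The distance limits the article describes come from losing or decohering the photon pairs in transit; the sifting logic itself is the same whether the pairs arrive over fiber or from orbit.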
Can Computer Scientist Dream Team Clean Up
E-Voting?
Network World (06/10/08)
Electronic voting has become a source of concern and controversy, with
many e-voting systems proving to be security black holes. A Center for
Correct, Usable, Reliable, Auditable, and Transparent Elections (ACCURATE)
has received a $7.5 million National Science Foundation award to bring the
latest research, insights, and innovations from the lab to the voting booth
and make e-voting systems more secure and accurate. ACCURATE unites
computer experts from across the country and academic disciplines to find
areas that need additional research and to determine how to apply existing
technology and research findings to voting systems. ACCURATE members from
Rice University have designed and implemented a system called "Auditorium,"
which forms the base of a voting system prototype called "VoteBox."
Auditorium is a networked logging and auditing system built using timeline
entanglement and broadcast messages. Auditorium allows anyone to audit the
events, in the order that they occurred, with strong cryptographic
guarantees to protect against tampering with the timeline. Additional
research on secure logging is examining how log verification could be
scaled for an entire election in real time. ACCURATE members at the
University of California, Berkeley, are examining methods for building
trustworthy audit logs in electronic voting systems. Their goal is to
design a mechanism that records the entire user interaction between the
voter and the voting machine to allow auditors to replay a "movie" of the
interactions after the election. Challenges include ensuring that the
audit log does not compromise ballot secrecy and that the logging system
itself is trustworthy.
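Timeline entanglement, on which Auditorium is built, is a hash-chaining idea: each log entry commits to the hash of its predecessor, so editing or reordering past events breaks every later link. A minimal sketch of that general technique follows; it is not Rice's actual implementation, and the entry format is invented.

```python
import hashlib

def append_entry(log, event):
    """Append an event, chaining it to the hash of the previous entry."""
    prev = log[-1]["digest"] if log else "genesis"
    digest = hashlib.sha256((prev + event).encode()).hexdigest()
    log.append({"event": event, "prev": prev, "digest": digest})

def verify(log):
    """Recompute the chain; any edited or reordered entry breaks it."""
    prev = "genesis"
    for entry in log:
        expect = hashlib.sha256((prev + entry["event"]).encode()).hexdigest()
        if entry["prev"] != prev or entry["digest"] != expect:
            return False
        prev = entry["digest"]
    return True

log = []
for event in ["machine booted", "ballot cast", "polls closed"]:
    append_entry(log, event)
assert verify(log)

log[1]["event"] = "ballot altered"   # tamper with the timeline
assert not verify(log)
```

This is what gives an auditor "strong cryptographic guarantees" about event order: any party holding a later digest can detect retroactive changes without trusting the machine that produced the log.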
Doubling Laptop Battery Life
Technology Review (06/13/08) Greene, Kate
Intel researchers believe they have developed a technique that can double
a laptop battery's life without changing the battery. The method optimizes
power management across the system, including the operating system, screen,
mouse, motherboard chips, and USB port devices. Manufacturers and
researchers have been exploring a variety of ways to make mobile devices
more energy efficient: operating systems already deploy power-saving screen
savers and put the system to sleep when the user has been away for a while,
and Intel's upcoming Atom microprocessor for mobile Internet devices has
six different levels of sleep depending on the tasks being performed.
Intel's prototype power-management system, called advanced platform power
management, is aware of the power being used by all parts of the laptop, as
well as the power requirements of the user's current task, and it shuts
down operations accordingly, according to Intel's Greg Allison. Allison
says, for example, that when a person reads a static email, the screen
still refreshes 60 times a second with traditional systems, and peripherals
such as the keyboard, mouse, and USB devices continue to drain power while
waiting for instructions. In such a situation, Intel's system would save
power by essentially taking a snapshot of the screen that the user is
reading and saving it to buffer memory, so instead of refreshing, the
screen would maintain an image until the user tapped a button on the
keyboard or moved the mouse, both of which also stay asleep until
activated. Meanwhile, the operating system would monitor the use of other
applications, and limit operations for applications not actively being
used. Energy-monitoring circuits on the chips will also put unnecessary
parts of the microprocessor to sleep. Allison says it takes only 50
milliseconds for the entire system to wake up, an imperceptible time to
users.
'Electron Turbine' Could Print Designer Molecules
NewScientistTech (06/11/08) McAlpine, Kate
Researchers at Lancaster University in the United Kingdom have used
advanced computer simulations to test the design of a nanomotor, which
Adrian Bachtold of the Catalan Institute for Nanotechnology plans to build:
a carbon nanotube that spins in a current of electrons, much as a wind
turbine spins in a breeze. The design consists of three carbon nanotubes,
each 10 nanometers long and 1 nm wide, one suspended between the other two,
with its ends nested inside them to form a rotating joint. The central carbon
nanotube spins around when a direct current is passed along the tubes. The
electrons bounce off the spiral carbon rings of the nanotube turbine, which
redirects them into a spiral flow that causes the tube to move in the
opposite direction. The Lancaster team plans to design smoother nanotubes
to reduce friction. Such an electron turbine could serve as a tiny printer
or be used to make computer memory smaller. The June
issue of Physical Review Letters will feature a paper on the tiny, electron
windmills.
Whole Proves to Be Mightier Than the Parts
ICT Results (06/11/08)
Researchers in Europe have developed a strategy that will make it easier
for research and education networks to connect with each other, and also to
commercial telecommunications networks. The MUPBED project used an
automated control plane to cut down vertically through the layers a network
encounters, and established an automatic link between networks when a user
requests a connection of a given bandwidth. The networks communicate with each
other, provide the solution, then inform the parties at both ends and the
operators in between what has happened over the links. Moreover, network
resources are optimized, and capacity is only used when it is needed for a
specific task. "We developed a network solution which allows multi-domain
networking, and working with standards bodies tested it against emerging
standards," says Jan Spaeth, coordinator of the EU-funded project. "We
were able to influence the standards bodies, who had not previously been
aware of the research networks' requirements and had no input from that
source." The team believes commercial networks will be able to use its
research to develop more advanced services.
In Congress, H-1B Issue Pits Tech Workers Against Farm
Groups
Computerworld (06/12/08) Thibodeau, Patrick
The tech industry's effort to make it easier for skilled foreign workers
to remain in the United States came under criticism during a U.S. House
hearing on June 12, 2008. Rep. Zoe Lofgren (D-Calif.), chair of the
Subcommittee on Immigration, Citizenship, Refugees, Border Security, and
International Law, made a strong case for expanding the cap on the number
of H-1B workers, but committee member Rep. Luis Gutierrez (D-Ill.) said a
narrow approach to the issue of immigration would be taken. "I think we
should give the high-tech industry the innovators they need," said
Gutierrez, who added that something would have to be done about other
foreign workers, such as farm laborers. Immigration has become an all-or-nothing
battle for many lawmakers after last year's failed bid at reform, and
uncertainty surrounds the efforts of Lofgren, who has introduced three
bills related to hiring foreign tech workers. Groups such as the Institute
of Electrical and Electronics Engineers, the Semiconductor Industry
Association, and the Association of International Educators have expressed
support for improving the H-1B visa program. Sens. Barbara Boxer
(D-Calif.) and Judd Gregg (R-N.H.) introduced a companion bill in the
Senate earlier in the month.
Rummaging Through the Internet
Economist Technology Quarterly (06/08) Vol. 387, No. 8583, P. 14
Web browsing promises to be transformed by new methods for navigating and
collecting information online, and one such method is the freely available
Hyperwords browser add-on, which turns every word or phrase on a page into
a hyperlink. Meanwhile, the Cooliris startup has developed PicLens, free
software that gathers and displays images retrieved from Google, Flickr,
eBay, and other Web sites on a full-screen, 3D wall without any of the
clutter on each image's Web page. Such applications hint at one possible
future incarnation of Web browsing, in which users navigate through
groupings of pages that appear to float in space, pushing undesirable ones
away and organizing others in logical clusters. In late July, the Second
Life 3D virtual environment will incorporate a feature that lets
inhabitants post Web pages on walls, changing Web browsing from a solitary
to a social experience because users roaming virtually through the
environment can convene and chat next to Web pages, according to Linden
Lab executive Joe Miller. Another social browsing tool is 3B's 3B
browser, which arranges pictures of the results of product searches within
the aisles of a virtual shop, where shoppers can gather to see better and
chat through instant messaging with other shoppers looking for similar
items. Carnegie Mellon University computer scientist Dave Farber predicts
that the coolness of the visuals generated by such tools will eventually
give way to the realization of the need for 3D navigation.
Key Differences Between Web 1.0 and Web 2.0
First Monday (06/08) Vol. 13, No. 6, Cormode, Graham; Krishnamurthy,
Balachander
Among Web 2.0's key attributes are the growth of social networks,
bi-directional communication, diverse content types, and various "glue"
technologies, and the authors note that while most of Web 2.0 shares the
same substrate as Web 1.0, there are some significant differences.
Features typical of Web 2.0 Web sites include users as first class entities
in the system, with prominent profile pages; the ability to connect with
users through links to other users who are "friends," membership in various
types of "groups," and subscriptions or RSS feeds of "updates" from other
users; the ability to post content in various media, including blogs,
photos, videos, ratings, and tags; and more technical features, such as
embedding of various rich content types, communication with other users
through internal email or instant messaging systems, and a public API to
permit third-party augmentations and mash-ups. Web 1.0 metrics of similar
interest in Web 2.0 include the overall share of Internet traffic, the
numbers of users and servers, and the share of various protocols. About 500
million users reside in a few tens of social networks with the top few
responsible for the bulk of the users and traffic, and traffic within a Web
2.0 site is more difficult to measure without help from the site itself.
The challenges for streamlining popular sites for mobile users differ
slightly between Web 1.0 and Web 2.0, in that instant notification to users
through mobile devices can be facilitated because of the short or episodic
nature of most Web 2.0 communications. Most communication in Web 2.0 is
between users, so Web 2.0 sites have no easy way to prioritize traffic
during overload;
however, the sites apply varying restrictions to guarantee that overall
load and latency is reasonably maintained. Some of the Web 2.0 sites are
eager to maximize and retain members within an "electronic fence," which
can facilitate balkanization, although total balkanization is likely to be
prevented by a countercurrent stemming from the prevalent link-based nature
of Web users continuously connecting to sites outside the fence. The
authors point out that there are substantial challenges in permitting users
to comprehend privacy implications and to simply represent usage policies
for their personal data.
Machine Translation for the Most Part Targets Internet
and Technical Texts
Universidad Politecnica de Madrid (05/27/08)
President of the Association for Machine Translation in the Americas Mike
Dillinger says in an interview that machine translation (MT) is primarily
oriented around Internet and technical texts, so content creators must be
trained to ensure that documents are machine translatable. "The new
approach [to MT] uses statistical techniques to identify qualitatively
simpler rules" in a rapid, automatic, and scalable manner, says Dillinger.
He describes MT systems as mature for industrial applications but immature
for use by the general public, and says that people are almost always
disillusioned by translation systems because they have unrealistic
expectations about their capabilities. Dillinger outlines five steps in
the MT process: Document preparation, translation system adaptation,
document translation, translation verification, and document distribution.
He says MT follows the same stages as human translation except in two key
respects: translation systems can handle a much larger volume of documents,
but wording must be handled very carefully because
the systems, unlike human translators, do not possess the technical
knowledge to perceive erroneous wording and take corrective action.
Dillinger dismisses fears that MT systems will drive flesh-and-blood
translators out of work, and attests that "MT takes the most routine work
out of translators' hands so that they can apply their expertise to more
difficult tasks." Variability of vocabulary usage can be a major
impediment that commercial MT systems overcome by using the most common
words to generate a core system and then adding 5,000 to 10,000
customer-specific words, although Dillinger says this method is not
workable for Web applications. The development of systems to direct
authoring of Web content is therefore necessary, he reasons.
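The vocabulary problem Dillinger describes, a core lexicon extended with thousands of customer-specific terms, can be sketched as a simple coverage check that flags the words a pre-editor would need to rewrite or add to the glossary. All lexicon contents below are invented examples, not from any real MT system.

```python
def coverage(text, core_lexicon, custom_glossary=frozenset()):
    """Return the fraction of a document's words covered by an MT
    system's combined lexicon, plus the unknown words a pre-editor
    would need to rewrite or add to the customer-specific glossary."""
    words = text.lower().split()
    known = core_lexicon | custom_glossary
    unknown = sorted({w for w in words if w not in known})
    covered = sum(1 for w in words if w in known) / len(words)
    return covered, unknown

# Tiny invented lexicons: common words plus a customer-specific term.
core = {"the", "pump", "is", "broken", "replace"}
glossary = {"hydropump-3000"}

covered, unknown = coverage("replace the broken pump impeller", core, glossary)
print(round(covered, 2), unknown)  # 0.8 ['impeller']
```

In practice, as the article notes, commercial systems build the core from the most common words and add 5,000 to 10,000 customer terms; the flagged leftovers are exactly why content creators must be trained to write machine-translatable documents.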
Information Accountability
Communications of the ACM (06/08) Vol. 51, No. 6, P. 82; Weitzner, Daniel
J.; Abelson, Harold; Berners-Lee, Tim
Accountability for the misuse of personal information must be enforced by
systems and statutes, as the openness of the information environment makes
protection via encryption and access control impossible. "Information
accountability means the use of information should be transparent so it is
possible to determine whether a particular use is appropriate under a given
set of rules and that the system enables individuals and institutions to be
held accountable for misuse," write the authors. Rules are needed, both in
the United States and internationally, to address the permissible use of
certain types of information, in addition to simple access and collection
restrictions. The authors say that the information-accountability
framework is more reflective of the relationship between the law and human
behavior than the various initiatives to enforce policy compliance via
access control over information. Supporting information accountability
requires a technical architecture that features policy-aware transaction
logs, a common framework for representing policy rules, and
policy-reasoning tools. "One possible approach to designing accountable
systems is to place a series of accountable appliances throughout the
system that communicate through Web-based protocols," the authors suggest.
The authors conclude that perfect compliance should not be the standard for
evaluating laws and systems that aid the enforcement of information
accountability. "Rather we should ask how to build systems that encourage
compliance and maximize the possibility of accountability for violations,"
they write.
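The authors' policy-aware transaction logs can be sketched in miniature: rather than blocking access up front, every use of personal data is recorded with its stated purpose, and a reasoning step later flags uses that violate the rules. The rule table, field names, and events below are hypothetical illustrations, not the authors' design.

```python
# Purposes each data type may legitimately be used for (invented rules).
ALLOWED_PURPOSES = {
    "medical_record": {"treatment", "billing"},
    "email_address": {"account_notice"},
}

use_log = []

def record_use(data_type, purpose, user):
    """Transparently log a use of personal data with its stated purpose."""
    use_log.append({"data": data_type, "purpose": purpose, "user": user})

def audit(log):
    """Return the log entries whose stated purpose violates the rules,
    so the responsible parties can be held accountable after the fact."""
    return [e for e in log
            if e["purpose"] not in ALLOWED_PURPOSES.get(e["data"], set())]

record_use("medical_record", "treatment", "dr_adams")   # appropriate use
record_use("email_address", "marketing", "vendor_x")    # rule violation

violations = audit(use_log)
print([(e["user"], e["purpose"]) for e in violations])
# [('vendor_x', 'marketing')]
```

Note that nothing here prevents the misuse; in line with the authors' argument, the system makes use transparent and violations attributable rather than attempting perfect up-front access control.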
Can Machines Be Conscious?
IEEE Spectrum (06/08) Vol. 45, No. 6, P. 55; Koch, Christof; Tononi,
Giulio
Some people are convinced that a conscious machine could be constructed
within a few decades, including Caltech professor Christof Koch and
University of Wisconsin, Madison professor Giulio Tononi, who write that
the emergence of an artificially created consciousness may not take the
form of the most popular speculations. They note that consciousness
requires neither sensory input nor motor output, as exemplified by the
phenomenon of dreaming, and emotions are not a necessary component for
consciousness, either. Koch and Tononi also cite clinical data to suggest
that other traditional elements of consciousness--explicit or working
memory, attention, self-reflection, language--may not be essential, while
the necessary properties of consciousness depend on the amount of
integrated information that an organism or machine can produce. The
authors offer the integrated information theory of consciousness as a
framework for measuring different neural architectures' effectiveness at
generating integrated information and achieving consciousness, and this
framework outlines what they describe as "a Turing Test for consciousness."
One test would be to ask the machine to concisely describe a scene in a
manner that efficiently differentiates the scene's key features from the
vast spectrum of other possible scenes. Koch and Tononi suggest that the
building of a conscious machine could involve the evolution of an
abstracted mammal-like architecture into a conscious entity.