EU Wants to Take Lead in 'Web 3.0' Technology
IDG News Service (09/30/08) Meller, Paul
Europe is in an excellent position to become the leader in Web 3.0
technology because of its focus on open and pro-competitive telecom
networks and commitment to online privacy and security, says Viviane
Reding, the European Commission's (EC's) commissioner for information
society and media. "Web 3.0 means seamless 'anytime, anywhere' business,
entertainment and social networking over fast reliable and secure
networks," Reding says. "It means the end of the divide between mobile and
fixed lines." She says there could be a 10-fold increase in the scale of
the digital universe by 2015. The EC's consultation on the next generation
of the Internet, launched on Sept. 29, was accompanied by a roadmap.  The
report described social networking, online business services, nomadic
services based on GPS and mobile TV, and smart tags using RFID as trends
that would lead to Web 3.0. In a blog post, Vint Cerf welcomed Reding's
stance on free and open networks and on open standards. "For Europe to
keep up in the global online race, it needs to sprint ahead powered by an
openness recipe encompassing a neutral network, users' rights, and open
standards," Cerf wrote. "I'm delighted to see that Europe's policymakers
stress the successful ingredients to promoting a robust, healthy
Internet."
The 2008 Technology Innovation Awards
Wall Street Journal (09/29/08) Totty, Michael
The Wall Street Journal's 2008 Technology Innovation Award winners
included Salesforce.com's Force.com software tool suite, which enables
companies to build their own specially tailored business applications that
are developed and delivered over the Internet. The cloud computing service
enables companies to access computing power on an as-needed basis.
Globalstar's Spot unit earned an award in the consumer electronics category
for the Spot Satellite Messenger, a handheld device that transmits
preprogrammed messages such as "I'm OK," along with users' whereabouts.
Software that tests for security holes in new applications by searching for
flaws in binary code netted an award for Veracode, and company co-founder
Chris Wysopal says the method offers the accuracy of source code analysis
without the need for disclosing proprietary source code. The winner in the
network/Internet technologies category was Xsigo Systems for hardware and
software that allows the replacement of physical cables in a data center
with virtual connectors, each of which is capable of mimicking the
performance of up to 14 separate cables. Dispersed Storage software from
Cleversafe won for a technology that allows sensitive computer files to be
stored more securely and reliably by slicing them up and sending the
slices, which by themselves are unreadable to unauthorized parties, over
the Internet to multiple storage locations on a network. Swiss Federal
Institute of Technology professor Jane Royston says the software "could be
an important part of Internet data storage systems." The winner in the
wireless category was Tata Consultancy Services' mKrishi service, which can
supply crop advice to farmers in rural India via cell phones using a
combination of remote sensors, a voice-enabled text-messaging service, and
a camera phone.
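Cleversafe's product uses an information dispersal algorithm that, like erasure coding, can also survive the loss of some slices. As a minimal sketch of just the core idea described above, slices that are individually unreadable but jointly recover the file, here is a simple XOR-based split (function names are illustrative, not Cleversafe's API):

```python
import secrets

def disperse(data: bytes, n: int) -> list[bytes]:
    # n - 1 slices are pure random pads; the last is the data XORed with all
    # of them, so any single slice on its own is indistinguishable from noise.
    pads = [secrets.token_bytes(len(data)) for _ in range(n - 1)]
    final = data
    for pad in pads:
        final = bytes(a ^ b for a, b in zip(final, pad))
    return pads + [final]

def reassemble(slices: list[bytes]) -> bytes:
    # XOR every slice together to cancel the pads and recover the data.
    out = slices[0]
    for s in slices[1:]:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

secret = b"sensitive file contents"
slices = disperse(secret, 4)  # each slice would go to a different location
assert reassemble(slices) == secret
```

Unlike this sketch, which needs every slice back, a real dispersal scheme adds redundancy so the file can be rebuilt even when some storage locations are unreachable.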
ACM Visionaries to Address Women in Computing
Conference
AScribe Newswire (09/30/08)
ACM President Wendy Hall and ACM A. M. Turing Award recipient Fran Allen
will be two of the speakers at the Grace Hopper Celebration of Women in
Computing, which takes place Oct. 1 to Oct. 4 in Keystone, Colo. The
conference will illuminate the significant role that women play in creating
and utilizing technology to improve world conditions. Hall is a founding
director of the Web Science Research Initiative and Allen is an IBM Fellow
Emerita. Other speakers include ACM Committee of Women co-chair Elaine
Weyuker of AT&T Labs-Research Technical Staff, and ACM-W Turkish Ambassador
Reyyan Ayfer, department chair at Bilkent University. The Grace Hopper
Celebration of Women in Computing, which highlights the research and career
interests of women in computing, is run by the Anita Borg Institute for
Women and Technology and co-presented with ACM. Hall is the first ACM
president from outside North America and one of the first computer
scientists to undertake serious research in multimedia and hypermedia. The
influence of Hall's research can be seen in digital libraries, the
development of the Semantic Web, and the emerging research discipline of
Web Science. Hall will participate in a panel on European Women in Science
and Engineering, as well as a session on ACM's Membership Gender Study and
how to meet the needs of women in computing.  Allen, the first woman to
receive the A. M. Turing Award, works on computer languages and compilers,
contributing to advances in the use of high-performance computers to solve
problems through techniques that are already used in business and
technology.
See What I See--Machines With Mental Muscle
ICT Results (10/01/08)
The enhancement of machines' interpretive capabilities was the goal of the
European Union-funded MUSCLE project, which established a pan-European
network of excellence involving the participation of more than 30 academic
and research institutions from 14 nations. The initiative has devised a
broad spectrum of practical applications made possible by developments in,
and convergence of, techniques for generating, acquiring, and interpreting
metadata in complex multimedia environments. One application stemming from
the MUSCLE project is a virtual talking head that models what is happening
within the human mouth so that users can mimic the on-screen action as an
aid in pronouncing words and learning vocabulary. Another application
involves a Web-based, real-time object categorization system that can carry
out searches based on image recognition, as well as automatically
categorize and index images according to the objects within the images. A
third application can be used to spot and prevent piracy through the use of
an intelligent video method that employs software to detect any variation
from original recordings, says MUSCLE project coordinator Nozha Boujemaa.
"During the course of the project, we produced more than 600 papers for the
scientific community, as well as having two books published, one on
audiovisual learning techniques for multimedia and the other on the
importance of using multimedia rather than just monomedia," Boujemaa
says.
Computer Failure Hobbles Hubble, Derails Shuttle
Mission
Computerworld (09/30/08) Gaudin, Sharon
NASA scientists announced that a data formatter and control unit on the
Hubble Space Telescope has "totally failed," preventing data from being
sent to Earth and delaying a shuttle mission. The Science Data Formatter
is designed to collect information from five onboard instruments, format
the data into packets, put headers on the packets, and send the packets to
Earth. Hubble Space Telescope program executive Michael Moore says the
Hubble's problematic computer, which has been in orbit for more than 18
years, is a simple but vital part of the telescope's communications system.
NASA scientists are now working to switch the Hubble to onboard redundant
systems to resume services until a space shuttle arrives with a replacement
system. NASA postponed the space shuttle's planned October repair mission
so a replacement computer system can be obtained. Hubble manager Preston
Burch does not know what caused the failure, but notes that the unit runs
at a relatively high temperature compared to other components, and high
temperatures tend to accelerate the degradation process. Moore says
switching over to the redundant systems should take about 10 hours, and
technicians and scientists expect to complete the process at the end of the
first week of October. NASA's Ed Weiler says the switchover and subsequent
installation of new redundant systems should add another five to 10 years
to Hubble's life.
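The formatter's job, gathering instrument readouts, chopping them into packets, and prepending headers, can be sketched roughly as follows (the header layout here is invented for illustration and is not Hubble's actual format):

```python
import struct

def packetize(instrument_id: int, payload: bytes, max_len: int = 256) -> list[bytes]:
    """Split one instrument's readout into packets, each with a small header:
    instrument id (1 byte), sequence number (2), payload length (2)."""
    chunks = [payload[i:i + max_len] for i in range(0, len(payload), max_len)]
    packets = []
    for seq, chunk in enumerate(chunks):
        header = struct.pack(">BHH", instrument_id, seq, len(chunk))
        packets.append(header + chunk)
    return packets

# A 600-byte readout becomes three packets: 256 + 256 + 88 bytes of payload.
pkts = packetize(3, bytes(600))
assert len(pkts) == 3
```

With headers carrying the instrument id and sequence number, the ground station can reassemble each instrument's data stream even when packets from five instruments are interleaved.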
Paper Transistors Make for Disposable Electronics
New Scientist (10/01/08) Das, Saswato
New University of Lisbon researcher Elvira Fortunato has built a
paper-based transistor by coating both sides of a sheet of paper with metal
oxides before applying aluminum contacts. The paper acts as both a
flexible substrate and as an integral part of the semiconductor by helping
to amplify the current that passes through the transistor. "Using the
interstrate is a clear advantage," says Columbia University professor
Ioannis Kymissis. The transistors were produced at room temperature and
tested for two months without any deterioration in performance or
stability, making it plausible that they could be used to make disposable
microelectronics such as RFID tags and smart labels for everyday use. The
paper-based chips are susceptible to tearing or becoming soggy, but these
problems can be corrected by laminating the device. Researchers have been
looking for ways of making semiconductors without the need for expensive
manufacturing plants for some time. "This may go a long way toward
achieving a dream that many groups have pursued for very low-cost, flexible
organic electronics," says Georgia Tech Quantum Institute director Dick
Slusher.
R&D Chiefs Share Their Strategies
EE Times (09/30/08) Merritt, Rick
Computer science researchers are using new models and fresh subjects to
handle new and old challenges, according to a panel discussion among four
senior research and development (R&D) executives at ResearchFest. "One of
the differences about research at Google is the extent to which researchers
are embedded--everyone is mixed together," says Google director of research
Peter Norvig. IBM Almaden Research center director Jim Spohrer says
getting senior researchers to work on fresh topics is one of the ongoing
challenges in R&D. "We don't get anywhere near the one third of projects
based on out-of-the-box thinking we would like to see for long-term
impact," Spohrer says. "The most mature senior researchers do conservative
things to help the business along while the new researchers come up with
the wacky ideas we like." Norvig says Google often has research teams work
alongside engineering teams to encourage the underlying development of the
product. For example, Google's speech recognition researchers work with
the Google 411 product team and a machine translation research effort is
working with a separate product group. The panelists cited Web services as
a hot new area for computer research. "The ability to provide information
as a service will be the next big opportunity for the next 20 years," says
Xerox PARC's Mark Bernstein.
BYOC: Company Gives Workers Unusual Laptop Leeway
Associated Press (09/25/08) Madkour, Rasha
Citrix Systems is launching a new pilot program for its workers in which
employees are given a $2,100 stipend to buy a laptop of their choosing and
a three-year service plan. In exchange for the freedom to work on a
computer with the specifications they want, workers essentially take on the
company's technology purchasing and maintenance duties.  Citrix appears to
be the first large company to launch
such a program, although Gartner analyst Steve Kleynhans says other
technology companies have launched similar pilot programs but are doing so
under the radar. Allowing employees to choose their own computers presents
several technical challenges, including ensuring employees can access the
programs they need for their jobs, and raises corporate policy questions,
such as how sensitive information is protected on employee computers. For
Citrix, the program is being used to promote its virtualization technology,
which enables companies to run software programs they need from a central
data center. Employees can access applications by logging in remotely, but
the programs and information in the data center are never downloaded to a
worker's computer. Citrix CIO Paul Martine believes that such programs
will be the trend of the future. Analyst Tim Bajarin says
buy-your-own-computer programs are likely to be easier for companies such
as Citrix, which specializes in virtualization and understands how to make
the program work, while more traditional IT companies may find such
programs problematic.
SchoolBots Seeks and Destroys Maths Fatigue
Silicon Republic (10/01/08) Boran, Marie
Ireland's SchoolBots competition, run by the Tipperary Institute and
sponsored by Lenovo and Google, is a robot design competition designed to
encourage Transition Year and Leaving Certificate students to pursue
technology education. Information and communication technology (ICT) is
not on the curriculum at the second level, and educators say that events
such as SchoolBots are among the few ways to show young adults what a
third-level course in ICT may entail. "My experience from running
SchoolBots for the past two years would suggest that the new mobile phone
and game console generation of school kids appreciate and understand that
computer software is needed and used in their everyday lives," says the
Tipperary Institute's Liam Noonan. "The issue, however, is that ICT is not
taught as a second-level subject, so how can we expect them to know what
ICT or computer science is all about if they have no experience?" Noonan
says he would like to see a senior-cycle ICT Leaving Cert subject that
teaches the fundamentals of programming, multimedia, and technology and
operating systems concepts through fun programs such as SchoolBots. He
says the current ICT education courses do not address the ICT sector in an
adequate manner and do not show students what ICT is really about, such as
writing software, developing multimedia content, and other high-tech
activities.
Preventing Forest Fires With Tree Power
MIT News (09/23/08) Thomson, Elizabeth A.
Massachusetts Institute of Technology researchers are developing a new
sensor system that uses trees to power a network of sensors that could
detect forest fires and other events in forests. Each sensor is equipped
with an off-the-shelf battery that can be slowly recharged using
electricity generated by the tree. A single tree does not generate a lot
of power, but over time the "trickle charge" adds up, similar to a faucet
dripping into a bucket, says MIT researcher Shuguang Zhang. A tree
produces enough electricity to enable temperature and humidity sensors to
wirelessly transmit signals four times a day, or immediately if there is a
fire. Each transmission jumps from one sensor to another until it reaches
an existing weather station. Zhang and fellow MIT researcher Christopher
J. Love say trees produce electricity due to an imbalance in pH between a
tree and the soil. The researchers plan to place sensors on four trees per
acre, and note that the system is designed for easy installation by
unskilled workers. The group is now finalizing how the wireless sensor
network should be configured to minimize power usage and expects to begin a
trial of the system next spring.
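The hop-by-hop relay described above, each transmission jumping sensor to sensor until it reaches a weather station, is essentially multi-hop forwarding. A minimal sketch (a breadth-first route over an assumed link table, not the MIT team's actual protocol):

```python
from collections import deque

def route(links: dict, start: str, dest: str):
    # Breadth-first search from the originating sensor toward the weather
    # station, following sensor-to-sensor radio links.
    parent = {start: None}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if node == dest:
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        for nxt in links.get(node, []):
            if nxt not in parent:
                parent[nxt] = node
                queue.append(nxt)
    return None  # no route to the station

links = {"s1": ["s2", "s4"], "s2": ["s3"], "s3": ["station"], "s4": []}
assert route(links, "s1", "station") == ["s1", "s2", "s3", "station"]
```

In a power-starved network like this one, the routing choice matters: fewer hops and shorter radio bursts stretch the trickle-charged batteries further.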
Analyzing Music the Digital Way
Philadelphia Inquirer (09/22/08) Avril, Tom
Engineers, musicians, and computer researchers recently gathered at Drexel
University for the International Conference on Music Information Retrieval
to discuss using computers to analyze and manage the world of sound. The
event was first held in Plymouth, Mass., in 2000, with music theorists and
librarians heavily represented among the few dozen attendees. Now, the
event is far more technology oriented. Some of the technologies could be
incorporated into iPods in the next 18 months, possibly helping listeners
sort through an unruly music collection. A key part of the conference is
the announcements of results from a competition in which various
universities pit their music-analysis algorithms against one another.
Entrants from more than a dozen countries competed in 18 tasks, using their
computers to listen to selections of music and identify aspects such as
genre, mood, composer, and title. The goal is to eventually help people
search for music they might like by autonomously combing through millions
of audio files. University of Illinois at Urbana-Champaign professor J.
Stephen Downie was particularly impressed by the entrants' success at
identifying cover songs by different artists. Another task challenged the
algorithms to identify tunes someone hummed, which could eventually enable
karaoke machines and music shops to identify the song that someone is
humming and provide it to them.
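Systems in such contests typically start from low-level signal features. Two of the simplest, short-time energy and zero-crossing rate, can be computed as below; this is only a toy illustration, since real entries rely on much richer features such as MFCCs:

```python
def features(signal: list[float]) -> tuple[float, float]:
    # Short-time energy is a loudness proxy; zero-crossing rate is a crude
    # pitch/noisiness proxy. Both feed genre and mood classifiers.
    energy = sum(x * x for x in signal) / len(signal)
    zcr = sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0) / (len(signal) - 1)
    return energy, zcr

# A rapidly alternating signal: moderate energy, maximal crossing rate.
energy, zcr = features([0.5, -0.5, 0.5, -0.5])
assert energy == 0.25 and zcr == 1.0
```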
New Study Highlights Risk of Fake Popup Warnings for
Internet Users
NC State University News Services (09/22/08)
North Carolina State University (NC State) researchers have created phony
popup messages that were able to fool participants in their study 63 percent
of the time. The undergraduates opened up popup messages in a way that
could have exposed them to malevolent software, such as spyware or a
computer virus, says NC State professor Michael S. Wogalter. Simply
closing the message box would have been safer, considering phony popup
messages are sometimes designed to trick users into downloading harmful
software. Companies could add unique features to real messages to help Web
users differentiate between genuine warning messages and fake popups, says
Wogalter, who also expressed concern that these warnings might eventually
be duplicated as well. Internet users must be reminded to remain cautious
as they surf the Web, he says. "Be suspicious when things pop up,"
Wogalter advises. "Don't click OK--close the box instead."
Is This The Future of Internet Search?
Israel21c (09/22/08) Shamah, David
Danny Fine of BrainDamage in Haifa, Israel, says computers, not people,
should be carrying out Internet searches. He says searching is a form of
artificial intelligence that analyzes documents and creates a map of
keywords and their relationships to each other. "The search engine doesn't
really understand what you're asking, of course--it's just a dumb computer,
after all," Fine says. "The way it figures out what you're looking for is
by comparing your request to a long list of keywords that are indexed in a
database with other terms that could really be what you're looking for."
BrainDamage is developing Noesis, a new approach to Internet searching
based on natural thinking technology. Natural thinking technology puts the
burden of understanding search queries on the search engine, enabling it to
return more accurate results. Fine says Noesis gathers information and
develops it, guided by the user, to reach a conclusion using the same
patterns of logic and ideas that humans use. Noesis essentially teaches
machines to understand what humans have in mind when they make a request.
"Our system advances artificial intelligence far beyond where it is today,
enabling computers to truly understand what is being asked of them--and to
respond appropriately," Fine says.
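The keyword matching Fine criticizes, comparing a query against an index of terms rather than understanding it, is the classic inverted index. A toy version (the documents are made up):

```python
from collections import defaultdict

docs = {
    1: "semantic search engines map keywords to documents",
    2: "natural thinking puts the burden on the engine",
    3: "keywords indexed in a database drive classic search",
}

# Build the inverted index: term -> set of document ids containing it.
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.lower().split():
        index[term].add(doc_id)

def search(query: str) -> set:
    # Return ids of documents containing every query term (simple AND query).
    sets = [index.get(t, set()) for t in query.lower().split()]
    return set.intersection(*sets) if sets else set()

assert search("keywords search") == {1, 3}
```

The engine here never interprets the query; it only intersects term lists, which is exactly the gap an approach like Noesis claims to close.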
Scaling Mount Exaflops
Computerworld (09/22/08) Vol. 42, No. 38, P. 24; Anthes, Gary
The most recent Top500 list was headed by a machine with a measured speed of
1.026 petaflops, the first computer to break the petaflop barrier.  However,
the Roadrunner computer, built by IBM for the
Los Alamos National Laboratory, may seldom reach its peak operating
capacity. "The Top500 list is only useful in telling you the absolute
upper bound of the capabilities of the computers," says National Center for
Atmospheric Research (NCAR) director of supercomputing research Richard
Loft. "It's not useful in terms of telling you their utility in real
scientific calculations." Loft doubts that Roadrunner would operate at
more than 2 percent of its peak rated performance on NCAR's ocean and climate
models. He says the problem is that placement on the Top500 list is
determined by performance on the decades-old Linpack benchmark, which is
Fortran code that measures the speed of processors on floating-point math
operations, such as multiplying two long decimal numbers. Loft says the
HPC Challenge Benchmark, a suite of tests sponsored by the Defense Advanced
Research Projects Agency, more closely resembles what supercomputers are
used for. The test suite consists of the Linpack floating-point benchmark
and six others that measure performance factors such as integer math,
memory updates, sustainable memory bandwidth, and interprocessor
communications. "As long as we continue to focus on peak floating-point
performance, we are missing the actual hard problem that is holding up a
lot of science," Loft says.
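The contrast Loft draws can be seen in miniature: a multiply-accumulate loop stresses floating-point units the way Linpack does, while random table updates (as in the HPC Challenge suite's memory-update test) are bound by memory latency instead. A rough, illustrative sketch:

```python
import random
import time

def flops_kernel(n: int) -> float:
    # Floating-point bound: back-to-back multiply-accumulates, the kind of
    # work the Linpack benchmark rewards.
    acc = 0.0
    for i in range(n):
        acc += (i * 0.5) * 1.000001
    return acc

def memory_kernel(table: list[int], updates: int) -> list[int]:
    # Memory bound: updates at pseudo-random addresses, so caches help little
    # and performance reflects memory latency, not arithmetic speed.
    rng = random.Random(42)
    for _ in range(updates):
        j = rng.randrange(len(table))
        table[j] ^= j
    return table

t0 = time.perf_counter(); flops_kernel(200_000); t_flops = time.perf_counter() - t0
t0 = time.perf_counter(); memory_kernel([0] * 200_000, 200_000); t_mem = time.perf_counter() - t0
```

A machine ranked by the first kernel alone can look far faster than it will ever run on a workload dominated by the second, which is Loft's complaint about the Top500 ranking.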
Toward the Semantic Deep Web
Computer (09/08) Vol. 41, No. 9, P. 95; Geller, James; Chun, Soon Ae; An,
Yoo Jung
The Semantic Deep Web integrates Semantic Web components with the
employment of ontology-aware browsers to squeeze information out of the
Deep Web, which is nonindexable, invisible, and concealed online content
that is only accessible via Web services or Web-form interfaces, write New
Jersey Institute of Technology professor James Geller and colleagues. "The
primary goals of the Semantic Deep Web are to access Deep Web data through
various Web technologies and to realize the Semantic Web's vision by
enriching ontologies using this data," the authors note. To access the
Deep Web with Semantic Web technologies, the Semantic Deep Web utilizes
ontology plug-in search, a method for enriching a domain ontology with Deep
Web data semantics so that it can be used to refine user search queries
processed by a conventional search. Another key Semantic Deep Web process
is Deep Web service annotation, in which Deep Web services are annotated
with Deep Web data semantics so that they can be searched by a Semantic Web
search engine. It is simpler from a semantic perspective to obtain
ontologies from Deep Web data sources, especially well-structured
relational back-end databases, than from unstructured natural-language text
documents. Activities Geller lists as necessary for fusing Semantic Web
and Deep Web technologies together include the development of
ontology-aware, high-quality Web search engines; construction of large
ontologies from Deep Web sites, beginning with all e-commerce subdomains;
achieving acceptance of an "open source attitude" in the e-commerce space
to simplify the building of Deep Web ontologies by accessing securely
locked data sources; creation of libraries of semantic crawlers designed to
extract back-end database information; and assembly of comprehensive index
structures for Deep Web sites.
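The ontology-enrichment step the authors describe can be pictured with triples: structured records pulled from a Deep Web back-end database add new terms under existing concepts, which then refine search queries. A toy sketch (the domain, records, and predicate names are invented for illustration):

```python
# A toy domain ontology as (subject, predicate, object) triples.
ontology = {("Camera", "is_a", "Product"), ("Lens", "part_of", "Camera")}

def enrich(ontology: set, deep_web_records: list) -> set:
    # Enrichment in miniature: each structured Deep Web record contributes
    # a new subclass term under an existing concept.
    for record in deep_web_records:
        ontology.add((record["name"], "is_a", record["category"]))
    return ontology

records = [{"name": "DSLR", "category": "Camera"},
           {"name": "Camcorder", "category": "Camera"}]
enrich(ontology, records)

def refine_query(term: str) -> list:
    # Expand a search term with its known subclasses from the ontology,
    # as in the ontology plug-in search described above.
    return [term] + sorted(s for s, p, o in ontology if p == "is_a" and o == term)

assert refine_query("Camera") == ["Camera", "Camcorder", "DSLR"]
```

The point of the exercise is the direction of flow: the Deep Web's structured data populates the ontology, and the ontology in turn broadens what a conventional search can find.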