Programs Written in Old Code Pose Business Problem
Financial Times Digital Business (11/22/06) P. 6; MacKenzie, Kate
While legacy languages such as Cobol are still in relatively heavy use by
many companies, the number of programmers able to work with them is
dwindling. Between 12 and 15 percent of new development, mostly back-end
financial systems, is being written in these languages, says Gartner's Jim
Duggan, and a Computerworld survey showed that 58 percent of respondents
using Cobol are developing new applications in it. Gartner estimates that
50 percent of programmers who are skilled with mainframes, on which legacy
programs are run, will be eligible to claim their pension by 2007, and to
make matters worse, these older employees also expect higher salaries.
Many schools no longer teach legacy languages, favoring "object-oriented"
languages. Many companies are turning to internal training of recent
graduates, while some have even worked with nearby schools to develop
legacy language education curriculums. Some are choosing to keep
applications in legacy languages but run them within service-oriented
architectures (SOAs), which are easier to maintain; "clone" products
provide the additional option of reproducing
legacy versions of software from mainframes onto modern systems. The fact
that these languages have hung around shows the longevity of the mainframe,
which many thought was coming to an end in the 1990s; mainframe giant IBM
has established new educational programs to promote and simplify mainframe
administration.
Kurzweil: Computers Will Enable People to Live
Forever
InformationWeek (11/21/06) Gaudin, Sharon
In a keynote speech at SC06, "The Singularity Is Near" author and futurist
Ray Kurzweil explained his view that computer, or non-biological,
intelligence will allow humans to overcome illness and aging in just 25
years. By reverse engineering our own brains, scientists will be able to
develop intelligent machines that will surpass human intelligence. He
describes the medical revolution he predicts as replacing the "human body
version 1.0" through technology such as nanobots that can swim through our
bloodstream to make necessary repairs to keep us young, and computers that
can download and back up our memory. "$1,000 worth of computation in the
2020s will be 1,000 times more powerful than the human brain," Kurzweil
predicts. Computers will be integrated into the world around us, even our
bodies, he adds; imagine a computer in your eye that projects images upon
your retina. "We won't experience 100 years of technological advance in
the 21st century; we will witness on the order of 20,000 years of progress,
or about 1,000 times greater than what was accomplished in the 20th
century," he said. Kurzweil has received the National Medal of Technology
and the Lemelson-MIT Prize; his book has been endorsed by Bill Gates, a
robotics director at CMU, and a professor and a physicist at MIT.
Kurzweil says, "Supercomputing is behind the progress in all of these
areas," and predicts that non-biological intelligence created in 2045 will
be one billion times more powerful than all human intelligence today. He
also predicts that in 15 years virtual reality environments will routinely
be used to talk with others instead of making a cell call, and that
computers will be ubiquitous and pervasive.
Electronic Voting Trend May Be Short-Circuiting
Sarasota Herald-Tribune (FL) (11/19/06) Hull, Victor
An audit of the congressional election in Sarasota County, Fla., is at the
center of a push to require electronic voting machines to produce paper
records, or to even ditch electronic voting altogether. Support is rising
in Congress for legislation requiring a paper trail, and a bill has even
been filed that would require a hand count for the presidential election.
Twenty-seven states have already passed a paper-trail mandate, some also
requiring audits of the electronic voting process. The Sarasota election
was mentioned by Democrats in Congress and is seen by many citizens as a
clear indictment of e-voting, since there is really no way of figuring out
what went wrong, as e-voting expert Avi Rubin of Johns Hopkins University
points out. Votersunite.org executive director John Gideon feels that an
examination of the Sarasota problem will serve as "a death knell" for the
technology. Verifiedvoting.org's David Dill believes optical-scan voting
would have prevented the Sarasota problem, while others would only feel
comfortable with hand-counted paper ballots; but both of these systems have
been found to have their share of flaws as well. Officials such as
Charlotte County, Fla., elections supervisor Mac Horton, whose district
uses the same machines as Sarasota County, are reluctant to abandon the
costly system. "I've been very well pleased," says Horton. "If it's left
up to me, I'd stay right where I'm at." For information about ACM's
e-voting activities, visit
http://www.acm.org/usacm
Science Ph.D.'s Continue to Grow
Inside Higher Ed (11/20/06) Lederman, Doug
A new National Science Foundation report found that a record 27,974
science and engineering Ph.D.'s were handed out by American universities in
2005, eclipsing 1998's previous all-time high of 27,232, but this news will
not be enough to stem the growing fear of weakening American scientific
competitiveness. The numbers of Ph.D.'s awarded to women, Asian Americans,
and several underrepresented minority groups, as well as in "STEM" fields,
were also found to be at all-time highs. However, the greatest growth was in
the category of non-U.S. citizens, who earned 13.4 percent more doctorates
from American universities in 2005 than in 2004. Moreover, 41 percent of
all doctorate recipients in 2005 were foreign-born, up from 39 percent in
2004. More Ph.D.'s were handed out in the U.S. in computer science in 2005
than in any previous year, up significantly from 2002. Women accounted for
19.8 percent of computer science doctorates awarded in 2005, up from 15.1
percent in 1996.
Image Labeling for Blind Helps Machines 'Think'
Washington Post (11/21/06) P. A2; Goldfarb, Zachary A.
An online game has been designed to make image labeling fun and make
surfing the Internet easier for the blind. Screen-reading software used by
the blind reads Web pages aloud, but because such programs cannot identify
images and thus have no way of speaking them, many pages are effectively
inaccessible. The solution to this is image labeling, which would give
these programs a way to describe images verbally. Carnegie Mellon
University computer science professor Luis von Ahn developed the ESP game
for just this purpose: random visitors to ESPgame.org are paired up and
challenged to provide identical labels for the image they see; some people
have spent as much as 40 hours a week on the site. Programs such as the
ESP Game are known as human computation, where a computer asks a human a
question and the human does the answering. Teaching a computer using human
computation is a lot like the way children learn to identify things, but as
von Ahn says, "Nobody bothers to teach a computer." CMU's Manuel Blum, who
advised von Ahn's dissertation, explains, "What he's doing is mining the
ability of humans." Von Ahn aims to develop computer intelligence that
resembles that of humans and could perform language translation that
accounts for the subtleties of foreign languages, for example, or make fast
illness diagnoses in hospitals. Von Ahn says his goal is "To be able to
use all of this data and to have computers be able to do pretty much
everything we can do."
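The game's agreement rule is simple enough to sketch in a few lines of
Python. The function below is an invented illustration of the mechanic the
article describes (paired players label the same image, and a label sticks
once both have typed it; the real game also bans overused "taboo" words),
not von Ahn's actual code:

    def play_round(guesses_a, guesses_b, taboo=()):
        """Return the first label both paired players agree on."""
        seen_a, seen_b = set(), set()
        for a, b in zip(guesses_a, guesses_b):   # guesses arrive over time
            for guess, mine, theirs in ((a, seen_a, seen_b),
                                        (b, seen_b, seen_a)):
                guess = guess.strip().lower()
                if guess in taboo:               # taboo words force fresh labels
                    continue
                mine.add(guess)
                if guess in theirs:              # agreement: image gets this label
                    return guess
        return None                              # no agreement; image unlabeled

    print(play_round(guesses_a=["animal", "cat", "whiskers"],
                     guesses_b=["kitten", "pet", "cat"],
                     taboo=("kitty",)))          # -> cat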
Networks Could Self-Organize Sooner Than We Think
IST Results (11/21/06)
Autonomic communication, which is expected to help networks automatically
adapt to the growing complexity of the Internet, has received a boost in
recent years from technology known as "self-organization." The independent
Autonomic Communication Forum (ACF), established by European scientists in
2004, aims to advance this new form of self-organizing, self-managing,
context-aware, and autonomous networking through both market-related and
non-market-related initiatives. Autonomic communication encompasses these
"self" properties as well as many other concepts. According to Mikhail
Smirnov, who has played a large role in the ACF and coordinated the ACCA
project, a Future and Emerging Technologies initiative that came to an end
this September, these technologies can ensure that networks have an online
identity, allow ISPs to better manage wholesale interfaces, and remedy the
big problem of complexity in future networking technology by providing
self-management.
Self-management, says Smirnov, is the "top of the pyramid of
technologies...It's not about putting intelligence into hardware, but
making networks behave intelligently without human intervention." While
these communications are currently accessible through an ISP or gateway,
"in [the] future people will be connected directly to their contacts,"
which will require enhanced linkage between users, content, and service on
the part of telecoms, explains Smirnov.
Phishing Toolbars: All as Hopeless as One Another
Techworld (11/20/06) Dunn, John E.
Anti-phishing Web browser toolbars are not very effective, concludes a new
study conducted by Carnegie Mellon University researchers and supported by
the National Science Foundation and the U.S. Army Research Office. The
study, "Finding Phish: An Evaluation of Anti-Phishing Toolbars," looked at
10 browser toolbars to determine their anti-phishing abilities and
concluded that even the most capable toolbars (Earthlink, Google,
Cloudmark, MS Internet Explorer 7, and Netcraft) identified only 85 percent
of malicious Web sites, while the rest of the toolbars (eBay, Geotrust's
TrustWatch, Stanford University's Spoofguard, and McAfee's Site Advisor)
scored below the 50 percent mark. "Overall, we found that the
anti-phishing toolbars that were examined in this study left a lot to be
desired," said the authors of the study. "Many of the toolbars tested were
vulnerable to some simple exploits as well." Many of the toolbars tested
delivered a significant number of false positives, which the researchers
viewed as equally harmful because of the distrust this could breed in
users. The researchers concluded that all filters must be used with care,
and that the filter itself, not the browser it is used with, determines the
level of security; the accuracy of the heuristics used to detect fraudulent
sites and the usability of the software's design are the most important
aspects of security.
Hard-working Chips May Reveal Encryption Keys
New Scientist (11/20/06) Knight, Will
"Branch prediction" could put the modern microchip at risk to hackers,
according to Jean-Pierre Seifert of the University of Haifa in Israel and
the University of Innsbruck in Austria, and colleagues. Microchips second
guess the logical flow of a program before the actual execution from branch
to branch as a way to process information at a faster rate. However,
branch prediction can tip off hackers about encryption key details that are
processed, if there is a rapid increase in the work it performs and the
time required, which would result from a need to perform another operation
or a mistake. In a few thousandths of a second, Seifert and his team were
able to figure out a high-security 512-bit encryption key, which is often
used to protect online financial information and email messages from
eavesdroppers. "Security has been sacrificed for the benefit of
performance," says Seifert, who suggests the "Simple Branch Prediction
Analysis" attack method could be carried out by hiding a small piece of
software on a target computer. The researchers have posted their work
online, and will participate in the RSA Security conference in February
2007.
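The underlying leak is easy to demonstrate in miniature. The toy Python
sketch below illustrates key-dependent branch timing in general, not
Seifert's actual Simple Branch Prediction Analysis attack, and all numbers
in it are invented: the extra multiply performed on the 1-bits of a secret
exponent makes those loop iterations measurably slower.

    import time

    MOD = (1 << 512) - 1           # stand-in for a 512-bit modulus
    BASE = (1 << 511) + 12345      # large base, so the extra multiply is costly

    def timed_square_and_multiply(base, key_bits, mod):
        result, timings = 1, []
        for bit in key_bits:
            t0 = time.perf_counter()
            result = (result * result) % mod     # squaring: every iteration
            if bit:                              # key-dependent branch
                result = (result * base) % mod   # extra multiply on 1-bits only
            timings.append(time.perf_counter() - t0)
        return result, timings

    key = [1, 0, 1, 1, 0, 0, 1, 0] * 8           # pretend 64-bit secret exponent
    totals = [0.0] * len(key)
    for _ in range(200):                         # average out measurement noise
        _, timings = timed_square_and_multiply(BASE, key, MOD)
        totals = [t + d for t, d in zip(totals, timings)]
    cutoff = sum(totals) / len(totals)
    guessed = [1 if t > cutoff else 0 for t in totals]
    print(sum(g == k for g, k in zip(guessed, key)), "of", len(key),
          "bits recovered")                      # first bit is too cheap to time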
CMU Robot Car to Face Urban Traffic Challenges
Pittsburgh Post-Gazette (11/20/06) Templeton, David
The CMU Tartan Racing Team has one more year to perfect its robotic Chevy
Tahoe for the 60-mile DARPA Urban Challenge, to be held next November at a
yet-to-be-named Western location. Work is taking place at CMU's
state-of-the-art Robot City, a converted steel site, where the car has
already logged over 100 miles, including 50 "blind" miles. Although
Congress will not provide any prize money for the winning team, DARPA's
Track A teams, including CMU, can qualify for up to $1 million in funding.
The competition asks each team to design a robotic vehicle that can
navigate its way through a city, obeying traffic laws and dealing with
traffic and other obstacles; "It's a whole other layer of complexity," said
Tartan Racing's director of technology Chris Urmson. "People are working
on perception problems with city speeds and urban driving conditions. Not
a lot of work has been done in operating in those spaces." The vehicle
must even be able to tell when a broken-down competitor is in its way,
realizing that it is then acceptable to cross the double yellow line.
Technology being developed includes 360-degree vision and depth perception
as well as software that lets the car comprehend and react to what it sees.
CMU has 24 researchers working on the project, and Tartan Racing director
Charles "Red" Whittaker says, "Nothing less than full commitment will
succeed. We're involved in it because it matters to the future of
robotics, it matters to each of us individually, and we're in it for the
win." Whittaker says to meet the demands of the race it will require the
development of technology that is "a leap beyond what's already been done,"
but the ultimate payoff could be the creation of vehicles that do all or
most of the driving and much safer driving conditions.
Driving a Wheelchair With Your Shirt
Technology Review (11/20/06) Singer, Emily
Scientists at Northwestern University are creating a garment with built-in
sensors that can adapt to the movements quadriplegics are able to make,
unlike previous systems that required patients to "fit the capacity of the
machines," says Alon Fishbach, who works on the project. Control mechanisms
currently used
by quadriplegics include the sip/puff switch, which only allows two
commands, and a headswitch that registers head movements against the back
of the chair, but the Northwestern team's system is different. The shirt
contains 52 flexible, piezoresistive sensors, developed at the University
of Pisa, that change voltage as a result of being stretched at different
angles. An algorithm has been developed to analyze the signals from each
sensor in order to recognize a discrete set of movements, which are
translated into movement of the wheelchair. A virtual reality environment
allows the patient to orient himself with the controls and also suggests
ways to control the wheelchair more efficiently. The technology has been
successfully tested with one patient, who has use of his hands. Further
innovations are planned to bring this liberating technology to other
aspects of life, such as video gaming.
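The article does not say what the team's algorithm is, so the sketch below
is purely hypothetical: it shows how a 52-element sensor-voltage vector
could be mapped to a small set of wheelchair commands with simple
nearest-template classification. Only the sensor count comes from the
article; the commands and data are invented.

    import numpy as np

    N_SENSORS = 52   # sensor count from the article; the rest is invented

    def calibrate(samples_by_command):
        # Average labeled readings into one template vector per command.
        return {cmd: np.mean(s, axis=0) for cmd, s in samples_by_command.items()}

    def classify(reading, templates):
        # Pick the command whose template is nearest the live sensor reading.
        return min(templates, key=lambda c: np.linalg.norm(reading - templates[c]))

    rng = np.random.default_rng(0)
    ideal = {c: rng.random(N_SENSORS)
             for c in ("forward", "back", "left", "right", "stop")}
    samples = {c: [v + rng.normal(0, 0.02, N_SENSORS) for _ in range(10)]
               for c, v in ideal.items()}
    templates = calibrate(samples)
    live = ideal["left"] + rng.normal(0, 0.02, N_SENSORS)
    print(classify(live, templates))             # -> left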
Quest for the Last Word in Search
Times Online (UK) (11/19/06) Durman, Paul
Google's director of research Peter Norvig describes today's search
engines as "the very beginning of search," seeing the eventual possibility
for semantic search engines. Google is constantly tweaking its search
engine, not only combating companies trying to boost their own search
rankings, but improving the effectiveness of the search engine's ability to
provide users with appropriate, helpful results. Norvig, an artificial
intelligence expert who previously worked at NASA, says that most of his
colleagues are "disappointed" that searching still consists of entering a
few words into a box. Google's goal is to be able to search all types of
information, including books, pictures, and videos. The company's Google
Print project, for example, is an effort to digitize and make searchable
thousands of books, while it's also working on speech-recognition
technology that would allow a user to simply "tell" their cell phone what
they are looking for and have Google provide them with results. Norvig
says, "We need to understand where words are on a page, we need to
understand diagrams. We need to update our algorithms to understand what
the right answer is from a 500-page book. We are doing all that." One day
Norvig hopes to see the advent of semantic searches, where users can simply
pose a question and receive an answer. Meanwhile, other companies are
attempting to surpass Google's abilities, including Hakia, a New York firm
advised by Yorick Wilks of Sheffield University, which is creating what it
calls a "meaning-based" search engine. Norvig believes that
competition will only strengthen overall innovation, and that only Google
is fully capable of realizing and deploying such innovations on a large
scale.
Wanted: Solutions for Post-CMOS Era
EE Times (11/20/06) LaPedus, Mark
Researchers at the AVS International Symposium & Exhibition in San
Francisco expressed the need for a feasible replacement for CMOS-silicon
within the next decade, when Moore's Law is expected to come to an end.
Carbon nanotubes, nanowires, molecular electronics, quantum computing,
three-dimensional transistor designs, and spintronics are all
possibilities. No format has received an overwhelming share of funding,
but carbon nanotubes are being talked about the most; and a Massachusetts
company has built and tested a 22-nm nonvolatile random access memory
switch based on a carbon nanotube matrix structure set across an etched
trench. However, Intel says that although nanotubes can be developed and
utilized in a lab, mass production is still quite a challenge. Others
favor spin technology, and the Nanoelectronics Research Initiative plans to
develop devices with critical dimensions under 10 nm; but manufacturing of
this technology has its share of difficulties as well. Self-assembly is
one possibility: University of Wisconsin scientists discovered materials
called block copolymers that can spontaneously assemble into complex 3D
shapes when deposited onto special 2D surfaces. The future of the chip
industry is uncertain, as 450-mm and even 675-mm fabs have been
hypothesized but would entail enormous costs.
Sending Touch, Smell Over Net
Nikkei Weekly (11/13/06) Vol. 44, No. 2260, P. 16; Matsuda, Shogo
Researchers have set their sights on the senses of touch, smell, and taste as
they attempt to usher in a world of total perceptual communications.
Sensual interaction over the Internet is primarily limited to seeing and
hearing, but researchers across Japan are working to allow more sensory
feedback for users of communications technology. For example, a tactile
mechanism would enable surgical robots to transmit tactile sensations to a
remote surgeon who performs a procedure while controlling surgical
instruments on a video monitor. A pair of forceps, a scissor-like device
developed by researchers at Keio University, is designed to allow the
surgeon to "feel" the tissue and organs she is touching. "This promises a
huge jump in the safety of robotic surgery," says professor Kohei Onishi of
Keio. Meanwhile, the Tsuji Academy cooking-school chain has developed a
device that can produce artificial smells like beef stew and curry using a
process that is similar to the way in which printer ink is released from a
cartridge. And Intelligent Sensor Technology is already bringing to market
a taste sensing system that uses technology developed by Kyushu
University's Kiyoshi Toko.
Grid and Bear it
Government Computer News (11/20/06) Vol. 25, No. 33, Jackson, Joab
Although grid computing has been widely adopted by the research community,
it still struggles to find its place in the world of business. While grid
computing is still young, even those who use it admit how complicated it is
to effectively deploy, with problems ranging from high costs to a lack of
applications. The technology has always been rather esoteric, beyond the
reach of most IT shops, and as Ian Foster, who originally created the idea
of grid computing along with Carl Kesselman, admits, "There is no turnkey
solution, and there won't be one for a while." The Defense Information
Systems Agency (DISA) has chosen on-demand computing, rather than grid
computing, to meet its needs. The agency awarded contracts, that could be
worth as much as $700 for the development of on-demand processing
capabilities. Despite DISA's rejection of grid computing, the technology
is gaining popularity in the pharmaceutical industry, and oil companies and
financial firms also are using the technology to some degree. A major step
toward establishing a presence in private business is for grid computing to
cultivate partnerships with Web services. Kesselman is confident that
in-roads will be made as a wider range of groups realize the potential
benefits of grid computing. He says, "I think a lot of the early
perception was that grid computing was for doing science and
supercomputing. We've just begun to scratch the surface of real business
transformation."
Categorizing Web Search Results Into Meaningful and
Stable Categories Using Fast-Feature Techniques
ResourceShelf (11/21/06) Kules, Bill; Kustanowitz, Jack; Shneiderman, Ben
Bill Kules, Jack Kustanowitz, and Ben Shneiderman of the University of
Maryland's Human-Computer Interaction Lab and Department of Computer
Science propose a number of "fast-feature" methods to categorize Web search
results into stable and meaningful categories. These techniques were
developed to address the metadata challenge of increasing numbers of
unstructured and semi-structured digital documents, and the advantages such
techniques yield include the provision of overviews, navigation within
search results, and negative results. These methods use nothing beyond the
features available in the search result list (title, snippet, URL, etc.),
while credible knowledge resources (the Open Directory Project Web
directory's thematic hierarchy, a U.S. government organizational hierarchy,
personal browsing histories, DNS domain, and document size) are also
employed to augment search results with important metadata. The
researchers ran three tests in which the percentage of results categorized
for a quintet of representative queries was high enough to suggest that the
techniques were practically beneficial for such applications as general Web
search, government Web search, and the Web site of the Bureau of Labor
Statistics. A prototype search engine (SERVICE) incorporates fast-feature
techniques, and Kules et al. make suggestions about improving
categorization rates and how Web site designers could restructure their
sites to support rapid search result categorization. They note, for
example, that categorization engines would be capable of classifying pages
in precise accordance with the authors' intentions if sites published a
machine-readable site map and placed it in a standard location.
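As a rough illustration of the fast-feature idea (a hypothetical sketch,
not the authors' SERVICE prototype; the category rules here are invented),
a categorizer can assign each result to a stable category using only
features already present in the result list, such as the URL's DNS domain:

    from urllib.parse import urlparse

    def categorize(result):
        # Fast features only: nothing is fetched beyond the result list itself.
        host = urlparse(result["url"]).hostname or ""
        if host.endswith(".gov"):
            return "U.S. government"
        if host.endswith(".edu"):
            return "Education"
        tld = host.rsplit(".", 1)[-1]
        return {"org": "Organizations", "com": "Commercial"}.get(tld, "Other")

    results = [
        {"title": "Consumer Price Index", "url": "http://www.bls.gov/cpi/"},
        {"title": "Human-Computer Interaction Lab",
         "url": "http://www.cs.umd.edu/hcil/"},
    ]
    for r in results:
        print(categorize(r), "-", r["title"])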
Semiometrics: Applying Ontologies Across Large-Scale
Digital Libraries
University of Southampton (ECS) (11/10/06) McRae-Spencer, Duncan M.;
Shadbolt, Nigel R.
Duncan McRae-Spencer and Nigel Shadbolt of the University of Southampton's
School of Electronics and Computer Science stress the need for services
that can integrate and carry out inference calculations on the metadata
generated by increasingly populous, accessible, and complete online digital
libraries, and note that relational database management systems (RDBMS)
cannot always deal with concurrent data updates and retrieval at immense
scales. They think an alternative can be found in the expansion of RDF and
the growing interest in Semantic Web technologies, and propose a method for
large-scale metadata analysis and scalability testing that employs
real-world data. The authors detail how RDF data storage and SPARQL
querying offer performance levels at least as practical as standard SQL
tactics for complex queries; they also have the flexibility and speed
required for online digital library services. McRae-Spencer and Shadbolt
conducted tests that determined that most searches translated into SPARQL
were finished in a suitable time for use in Web services, while those that
were too slow were aligned with the few that SQL queries could respond to
in a reasonable timeframe from the open database. Blending the success of
the SPARQL semantic Web service querying model with the SQL queries to the
open database yielded the development of a Semiometric viewer application.
"The results...show that in practice, the only realistic way for the
SemioViewer application to work is to have both open SQL and SPARQL
queries," the authors note. "While not typical Semantic Web applications,
both the SemioViewer and the SPARQL-based Web services and client
pages...require both SQL and SPARQL queries in order to perform
effectively, if they are to remain open to having regular data updates."
McRae-Spencer and Shadbolt conclude that RDF technologies surpass RDBMS
strategies in terms of scalability and realistic execution, and offer
superior performance over traditional RDBMS for relationship-based queries
on large-scale metadata stores because they enable timely retrieval as well
as data updates.
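For readers unfamiliar with the technologies being compared, the following
minimal Python sketch shows RDF storage and SPARQL querying side by side,
using the rdflib library; the data and query are invented and are not the
authors' SemioViewer schema:

    from rdflib import Graph   # pip install rdflib

    g = Graph()
    g.parse(data="""
        @prefix dc: <http://purl.org/dc/elements/1.1/> .
        <http://example.org/paper1> dc:creator "D. McRae-Spencer" ;
                                    dc:title   "Semiometrics" .
    """, format="turtle")

    # A SPARQL query over the RDF store, analogous to a SQL SELECT with a join.
    for row in g.query("""
        PREFIX dc: <http://purl.org/dc/elements/1.1/>
        SELECT ?title WHERE {
            ?paper dc:creator "D. McRae-Spencer" ;
                   dc:title   ?title .
        }"""):
        print(row.title)       # -> Semiometrics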
Whence Data Management?
Dr. Dobb's Journal (11/06) No. 390, P. 79; Ambler, Scott W.
The adoption of agile database methods could address the shortage of
viable solutions to data quality problems that a July Dr. Dobb's Journal
poll found among most organizations. Though data is regarded as a corporate
asset by an overwhelming majority of the organizations polled, only 40.3
percent of respondents said their organization has a validation test suite
in place, while 63.3 percent of those organizations let the developers run
the test suite without restrictions. Sixty percent of organizations
reported having a data group, but two-thirds of respondents within those
organizations said developers sometimes circumvent the data group and
tackle data issues by themselves, which leads to dubious database design.
Just 34.2 percent of the organizations with data groups offer data-oriented
training to developers. Developer education can solve about 25 percent of
the problem of developers going around data groups, while 8 percent of
respondents are unaware of the data group's existence and 17 percent do not
know that they should be collaborating with the data group. However, 20
percent of developers cited problems working with data professionals; 36
percent called data groups very slow to respond; and 19 percent saw little
value in data groups. Teaching data management basics to developers and
modern development techniques to data professionals could bridge the gap in
understanding between these two groups. A hopeful sign is that 33 percent
of respondents said their organizations are approaching the fixing of data
sources with an evolutionary strategy.