Software Pioneer Peter Naur Wins Association for
Computing Machinery's Turing Award
AScribe Newswire (03/01/06)
Peter Naur has been named the recipient of ACM's 2005 A.M. Turing Award
for his groundbreaking work defining the Algol 60 programming language,
which would become the model for numerous subsequent languages, including
many that are indispensable to software engineering. Named for the British
mathematician Alan M. Turing and often recognized as the Nobel Prize for
computing, the award features a $100,000 prize. Naur edited the "Report on
the Algorithmic Language Algol 60," defining program syntax with what would
come to be known as Backus-Naur Form. The award also recognizes Naur's
work in compiler design, as well as the art and practice of programming.
"Dr. Naur's Algol 60 embodied the notion of elegant simplicity for
algorithmic expression," said Intel CTO Justin Rattner. "This award should
encourage future language designers who are addressing today's biggest
programming challenges, such as general-purpose, multi-threaded
computation, to achieve that same level of elegance and simplicity that was
the hallmark of Algol 60." The late Edsger Dijkstra, recipient of ACM's
1972 Turing Award, credited Naur's work with elevating automatic computing
to a legitimate academic pursuit. Until Naur's report, languages had been
defined informally by their support manuals and by the compiler code itself; the report, by contrast, gave precise and economical definitions of both syntax and semantics. After the publication of Algol 60, Naur co-authored the GIER
Algol Compiler. Microsoft's James Gray, who chaired the 2005 Turing
Committee and is the recipient of the 1998 Turing Award, hailed Naur's
contribution as a "watershed" that introduced many of the programming
conventions that are taken for granted today. Naur is also credited as a
pioneer in establishing software design as a discipline. Naur will receive
the award at the annual ACM Awards Banquet on May 20, 2006, in San
Francisco.
For more information, visit
http://campus.acm.org/public/pressroom/press_releases/2_2006/awards05.cfm
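As a rough illustration of the kind of definition Backus-Naur Form makes possible (the grammar fragment and recognizer below are invented for this summary and are not taken from the Algol 60 report), a BNF production maps naturally onto a recursive-descent recognizer, sketched here in Python:

    # Hypothetical BNF fragment (not from the Algol 60 report):
    #   <expr> ::= <term> | <expr> "+" <term>
    #   <term> ::= DIGIT
    # A minimal recursive-descent recognizer for that grammar,
    # handling the left recursion with a loop.
    def parse_expr(tokens, i=0):
        """Return the index just past an <expr>, or raise SyntaxError."""
        i = parse_term(tokens, i)
        while i < len(tokens) and tokens[i] == "+":
            i = parse_term(tokens, i + 1)
        return i

    def parse_term(tokens, i):
        if i < len(tokens) and tokens[i].isdigit():
            return i + 1
        raise SyntaxError("expected a digit at position %d" % i)

    if __name__ == "__main__":
        tokens = ["1", "+", "23", "+", "4"]
        assert parse_expr(tokens) == len(tokens)  # whole input is a valid <expr>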
Computing Error
New York Times (03/01/06) P. A24
The frequently sounded alarm that the U.S. computing industry is in
decline due to rampant offshore outsourcing is overstated, according to a
New York Times editorial that cited ACM's recently issued report. More
than anything, the future of the U.S. technology economy is imperiled by
the declining interest in computer science and engineering that results
from these gloomy predictions, as students are likely to pursue other
fields of study when they are told that all the computer jobs are migrating
overseas. The ACM study found that while between 2 percent and 3 percent
of U.S. technology jobs are moved overseas each year, the worldwide
computing boom leads to the creation of far more jobs than are lost. With
technology creeping into every aspect of business, the demand for workers
with technical skills defies industry classification. The U.S. technology
sector will be in trouble, however, if the declining interest among
students persists, depleting the domestic workforce. The ACM report warns
that jobs in research and other high-end technology positions are already
migrating overseas, and that initiatives to "attract, educate, and retain
the best IT talent are critical" to sustaining the industry. The Times
notes that the post-9/11 immigration policies and the condition of math and
science education are less than encouraging signs.
The complete report, "Globalization and Offshoring of Software--A Report of
the ACM Job Migration Task Force," is available at
http://www.acm.org/globalizationreport
Outsourcing: Silicon Valley East
Newsweek (03/06/06) Vol. 147, No. 10, P. 42; Naughton, Keith
U.S. consulting firms have overestimated the number of high-tech jobs that
would be outsourced to India, and there is no longer as much of a stigma
attached to shipping jobs to the country. Tech firms continue to maintain
call center and basic support operations in India and are looking to set up
R&D centers in the country, while tech employment in the United States is
on the rise. Over the next decade, the tech industry in the United States
will grow by 1 million jobs, or 30 percent, according to the Bureau of
Labor Statistics. And a study by the Association for Computing Machinery reveals that the number of tech workers in the United States is up 17 percent from 1999, at the height of the dot-com bubble. "Everyone was
worried about the offshoring bogeyman," says Moshe Vardi, an author of the
ACM study. "But the big whoosh of jobs to India never happened."
Infrastructure issues, such as bad roads and a patchy power grid, prevented
many tech companies from rushing head-first into outsourcing to take
advantage of the cheaper labor pool. Outsourcing estimates have now been
cut by at least half, but India still stands to gain a large number of jobs
from U.S. tech companies in the immediate future.
The complete report, "Globalization and Offshoring of Software--A Report of
the ACM Job Migration Task Force," is available at
http://www.acm.org/globalizationreport
Bush Names 14 New PCAST Members
Federal Computer Week (03/01/06) Sternstein, Aliya
In the latest sign of the Bush administration's about-face on the
strategic importance of IT, 14 new appointees to the President's Council of
Advisors on Science and Technology (PCAST) were announced yesterday, almost
nine months after Bush dissolved the President's Information Technology
Advisory Committee (PITAC). When the reconstituted PCAST met for the first
time last month, its membership was unchanged, fueling concerns among
researchers that IT remained at the bottom of the administration's list of
priorities. Those fears have been somewhat allayed by Bush's American
Competitiveness Initiative, which he unveiled at the State of the Union
address. "It's a new day since the president's State of the Union
message," said Ed Lazowska, PITAC co-chairman from 2003 until its
dissolution in June 2005. "The American Competitiveness Initiative gives
reason for hope that research and advanced education will receive
appropriate prioritization. That creates an opportunity for PCAST to be
effective." The new appointees are: F. Duane Ackerman, president and CEO
of BellSouth; Paul Anderson, chairman and CEO of Duke Energy; Robert Brown,
Boston University president; Nance Dicciani, president and CEO of Honeywell
Specialty Materials; Richard Herman, Chancellor of the University of
Illinois at Urbana-Champaign; Martin Jischke, president of Purdue
University; Fred Kavli, chairman of the Kavli Foundation; Daniel Reed,
director of the Renaissance Computing Institute; AMD Chairman, President,
and CEO Hector de Jesus Ruiz; VeriSign Chairman and CEO Stratton Sclavos;
John Slaughter, president and CEO of the National Action Council for
Minorities in Engineering; EMC President and CEO Joseph Tucci; University
of Alabama President Robert Witt; and GlaxoSmithKline's Tadataka Yamada.
Activists Warn of Rerun of Euro Software Patent
Fight
IDG News Service (02/27/06) Taylor, Simon
The Foundation for a Free Information Infrastructure (FFII) has warned
that a proposed general patent for the European Union could open the door
to software patents. "If you take the case law of the EPO (European Patent
Office) and apply it across the board, that means allowing software
patents," said FFII President Pieter Hintjens, who described the use of
patents in a technology-driven field as "obscene." In the wake of the
stalemate over the EU-wide community patent, the European Commission has
begun again to solicit input from interest groups and private industry on
how to make Europe's system workable. The Business Software Alliance's
(BSA) Francisco Mingorance disagrees with Hintjens, however, noting that
the consultations are not intended to lead directly to software patents,
but rather to address the question more globally, and that the commission's
questionnaire suggests more solutions than just the community patent. The
questionnaire also offers the options of leaving the system unchanged or modifying the existing system through other measures, such as agreeing to cut the
number of languages in which patent applications must be filed to three:
German, French, and English. Mingorance says the BSA seeks a more
transparent and cost-effective patent system, noting that the community
patent is unlikely to see ratification after six years of debate. Hintjens
is steadfast in his opposition to a system that makes it easy to obtain
junk patents or allows software to be patented.
Market Is Hot for High-Skilled in Silicon Valley
Wall Street Journal (02/28/06) P. B1; Tam, Pui-Wing
The Silicon Valley labor market has rebounded from the dot-com collapse,
though tech companies have shifted their hiring focus to more highly
skilled workers. Unlike previous tech-economy recoveries, this time
Silicon Valley firms of all sizes are almost exclusively hiring engineers,
designers, and other skilled workers, having moved the lower-skilled
positions to less expensive parts of the country, or outsourced them
altogether. Joint Venture Silicon Valley found that, for the first time in
four years, the region saw a net increase in jobs last year, with most of
the growth coming in the creative and innovative services category,
comprising research and development, consulting, and industrial design.
Conversely, jobs in semiconductor-equipment manufacturing and electronic-component manufacturing both sustained double-digit percentage
losses. The core design, engineering, and science sector accounts for 14
percent of all the jobs in Silicon Valley's economy, indicating a regional
shift from manufacturing and production to design and innovation. Average
income in Silicon Valley rose 2.7 percent from 2004 to $69,455 last year,
though it is still well off the $80,000-plus average of 2000. As
low-skilled labor steadily moves out of the region, Silicon Valley's base
of industries is consolidating, potentially leaving it vulnerable to
another downturn. Silicon Valley has, however, re-centered itself around higher-skilled jobs before, in response to a lean economy or increased competition from other regions of the country. Many companies
report that the increased wages improve their competitive position, as
revenue per employee has risen dramatically at firms such as Palm and
SanDisk as they have moved their hiring patterns up the skill curve.
Geekcorps: A Peace Corps for Techies
CNet (02/28/06) Kanellos, Michael
The U.S.-based not-for-profit Geekcorps is looking for volunteers to
develop the telecommunications infrastructure in Africa and bridge the
technological divide by equipping local radio DJs with PCs, digital
broadcasting equipment, and salvaged antennas that will enable them to
function as ersatz Internet service providers. In Mali, which has a 70 percent illiteracy rate and virtually no telecommunications infrastructure, Geekcorps provides the equipment for citizens to relay messages to friends in other parts of the country: a sender gives a message to a radio DJ, who forwards it to a station located closer to the recipient, where a second DJ broadcasts the recipient's name and invites him to come to the station to collect the message. Radio stations collect fees and earn advertising revenue from the
service. "The underlying goal with every implementation is: how can you
make sure this is a money maker for the community?" said Wayan Vota,
director of Geekcorps. Geekcorps has had operations in West Africa and
other areas for more than five years, and has provided some of the
technology that powers the durable, energy-efficient PCs that Via designed for emerging markets. Geekcorps typically sends its volunteers for
one-month deployments, and it is currently seeking experts in C++, Linux,
knowledge management, and object-oriented programming. Geekcorps'
long-term mission is to help develop a middle class and further social
mobility through technology, though immediate results can be seen in
providing the locals with access to information about vaccines and
coordinating transportation. Geekcorps has introduced publishing software
and digital printing to Ghana, and helped create a West African
agricultural database. The antennas in Mali were created through reverse
engineering Western-built devices that the group brought over, and ended up
costing around $1 each.
IT Proves to Be a Turnoff for Women
New Zealand Herald (02/28/06) Hembry, Owen
The perception of information technology as an industry not only for males but also for geeks is keeping many young women from pursuing IT
careers, according to women connected to the recent Computing Women
Congress in New Zealand. Sydney consultant Maggie Alexander says there
were more women filling information, communication, and technology roles in
Australia when she began her career in IT 25 years ago than there are
today. Young girls have few role models in IT, and many young women now
believe science and technology courses are too difficult for them, says
Alexander. "They often don't get the right information from their schools
about the kind of careers and variety of careers there are in IT," adds
Alexander. Annika Hinze, organizer of the congress and a senior lecturer
in computer science at the University of Waikato, also takes issue with the
way IT is marketed to young women. "It's the environment that is presented
in the sense of, 'Oh, the guys are really good at this,' so women get the
feeling that they are not wanted there [and] this is not for them," says
Hinze. At Waikato, the site of the gathering, women represent about 25
percent of computer science students. Statistics New Zealand figures show
that women account for 42 percent of the IT workforce, but that figure is inflated by data entry and desktop publishing positions, where women make up 82 percent and 72 percent of workers, respectively.
For information on ACM's Committee on Women in Computing, visit
http://www.acm.org/women
How to Digitize a Million Books
Technology Review (02/28/06) Greene, Kate
While Google has been cagey about the technical details of its plans to
digitize the book collections of Harvard, Stanford, the University of
Michigan, the University of Oxford, and the New York Public Library,
computer scientists at Carnegie Mellon University have faced similar
challenges in that school's Million Book Project, begun seven years ago.
The scope of Google's project covers an estimated 18 million books in
roughly 430 languages and a wide variety of fonts, all of which must be put into a
standardized format to make their text searchable and to simulate the
experience of browsing in a conventional library. Carnegie Mellon's
nonprofit project has established 40 scanning locations in China and India,
where low-paid workers turn each page manually, scanning roughly 100,000
pages each day. Raj Reddy, director of the Million Book Project, notes
that optical-character recognition (OCR) is a rapidly developing field, and
that Carnegie Mellon's Chinese and Egyptian partners are helping develop
tools to read languages written in different scripts and unusual fonts.
Reddy and his team are using software that creates structural metadata to
overcome the inconsistencies of pagination and other physical flaws of
monographs. Reddy says that creating the linkages between words in the
table of contents and a book's chapters is still a manual process.
Google's Daniel Clancy said that Book Search aims to provide the same
experience as a physical library by linking resources through a variety of
criteria, using new organizational algorithms that Clancy describes as a
major challenge that will take years to overcome. Reddy says his team has
been using a statistical approach, and that Google might also consider a
model similar to Amazon's collaborative filtering that would draw from the
results of previous searchers.
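To make the table-of-contents problem concrete, one semi-automatic approach is to match each contents entry against candidate chapter headings recovered by OCR and keep the best-scoring match. The sketch below is purely illustrative (it is not the Million Book Project's tooling) and scores candidates by simple word overlap:

    # Illustrative sketch: link table-of-contents entries to OCR'd chapter
    # headings by crude word overlap.  Not the Million Book Project's code.
    def normalize(s):
        return {w for w in s.lower().split() if w.isalpha()}

    def link_toc(toc_entries, headings):
        """Return {toc_entry: (page, heading)} using the best overlap score."""
        links = {}
        for entry in toc_entries:
            entry_words = normalize(entry)
            best = max(headings, key=lambda h: len(entry_words & normalize(h[1])))
            links[entry] = best
        return links

    toc = ["Chapter 1: The Scanning Pipeline", "Chapter 2: Character Recognition"]
    headings = [(17, "THE SCANNING PIPELINE"), (63, "CHARACTER RECOGNITION")]
    print(link_toc(toc, headings))  # a human reviewer confirms each link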
Want to Read E-Mail With Your Feet? Microsoft Is Working
on It
Seattle Post-Intelligencer (03/01/06) Bishop, Todd
Microsoft Research's annual TechFest gets underway Wednesday at the
company's campus in Redmond, Wash., where scientists in the research unit will present prototypes to product teams during the internal event. Microsoft Research, a unit of approximately 700 people, has researchers working on highly technical computer initiatives involving
machine learning and artificial intelligence, but many are also focusing on
more mundane projects such as using common surfaces as computer screens.
During a news conference on Tuesday, researchers showed a prototype
software program that allows people to view emails and digital photos by
moving and stomping their feet on a floor pad. Microsoft researcher Brian
Meyers deleted spam, scrolled through mail, opened messages and viewed them
on a large screen, and used a double-footed move to flag a message that
would require a response when he returned to the office. "It's just
amazing to stomp your email out," says A.J. Brush, another researcher
considering the use of feet as a way to interact with a PC. In addition to
the StepMail program, the Step User Interface prototypes included
StepPhoto. The researchers involved in the Step User Interface Project
Group view the programs as supplemental tools that could potentially
provide some relief from repetitive stress conditions, and not necessarily
as replacements for the keyboard or mouse.
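The press account describes the interaction only at the level of gestures and the mail actions they trigger; a toy dispatch table in that spirit might look like the sketch below. The gesture names and the mail-client methods are invented for illustration and are not Microsoft's StepMail code:

    # Toy sketch of a gesture-to-command mapping for a foot-pad mail client.
    # Gesture names and mail-client methods are invented for illustration.
    ACTIONS = {
        "stomp":        lambda mail: mail.delete_selected(),  # e.g. delete spam
        "slide_right":  lambda mail: mail.scroll(+1),         # next message
        "slide_left":   lambda mail: mail.scroll(-1),         # previous message
        "double_stomp": lambda mail: mail.flag_selected(),    # follow up later
    }

    def handle_gesture(gesture, mail_client):
        action = ACTIONS.get(gesture)
        if action is not None:
            action(mail_client)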
Scientist Brings Life to Cell Phones
Korea Times (02/27/06) Tae-gyu, Kim
Researchers affiliated with Samsung Electronics are preparing to give
three-dimensional avatars the ability to mate and have offspring. The
project represents the second phase of an initiative to bring cell phones
to life through the use of avatars that will have the ability to think,
feel, evolve, and interact with users. "It is possible because they have
chromosomes, or a set of computerized DNA codes with genetic data,"
explains Lee Kang-hee, a researcher involved on the project. "We will come
up with various ways how they can pass on their traits to the next
generation." Prof. Kim Jong-hwan of the Korea Advanced Institute of
Science and Technology says Samsung has almost finished the first phase of
the project, which involves installing the software. Lee describes the
three-dimensional avatar as a sophisticated creature, or a
software-incorporated robot, that can change as it interacts with the
owner of the cell phone. Although the owner can initially set the
personality, it can become better or worse based on the way the owner
treats the avatar, such as how users respond to a signal [popping up on the
screen] that the creature is lonely. The work involving the "artificial
chromosomes" could be finished in a year, and the new phones could be in
stores in 2007.
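The article does not say how the "artificial chromosomes" are encoded. As a hedged sketch, a chromosome could simply be a vector of personality-trait genes, with offspring produced by crossover and a little mutation; the encoding and operators below are assumptions for illustration, not Samsung's design:

    # Illustrative sketch of 'mating' two avatar chromosomes via uniform
    # crossover plus mutation.  The encoding is an assumption, not Samsung's.
    import random

    def mate(parent_a, parent_b, mutation_rate=0.05):
        """Uniform crossover of two equal-length gene lists, plus mutation."""
        child = []
        for gene_a, gene_b in zip(parent_a, parent_b):
            gene = gene_a if random.random() < 0.5 else gene_b
            if random.random() < mutation_rate:
                gene = min(1.0, max(0.0, gene + random.uniform(-0.1, 0.1)))
            child.append(gene)
        return child

    # Genes might stand for traits such as curiosity, sociability, moodiness.
    print(mate([0.9, 0.2, 0.4], [0.1, 0.8, 0.5]))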
Research Due for Course Correction
EE Times (02/27/06) No. 1412, P. 1; Merritt, Rick
University of California, Berkeley, researchers held a group session last week outlining their latest projects. At the session, ACM President David Patterson warned that software and hardware designers are moving in opposite directions as they work toward a definition of the next generation's multicore systems and software: hardware development is focusing on parallelism at the thread level, while the software camp is concentrating on data-level parallelism. "We desperately need a new microprocessor
architecture focused on parallel computing" to coalesce the two, said
Patterson, who is heading up the new Research Accelerator for Multiple
Processors (RAMP) project. Patterson expects RAMP to produce an FPGA-based
system by the end of next year that will be able to integrate different
parallelism styles and instruction sets. RAMP, which could settle the
debate between thread-level and data-level parallelism, has attracted
interest from IBM, Intel, Microsoft, Hewlett-Packard, and Sun, as well as
MIT, Stanford, and four other universities. The projected system will cost
around $100,000, providing 1,000 64-bit CPUs connected by a cache-coherent
interconnect capable of handling a variety of instruction sets.
Semiconductor development is expected to continue its dependence on CMOS
technologies, as alternative techniques, such as replacing silicon dioxide
with germanium and carbon nanotubes, will likely rejuvenate the technology
rather than replace it. FinFETs and other new structures are also likely
to extend the life of the transistor beyond most experts' expectations. With cell phone sales approaching 1 billion annually and the proliferation of devices such as cameras, MP3 players, and game machines showing no signs of abating, home networking is the next major challenge in the development of
wireless infrastructure, according to the Berkeley researchers, who also
emphasized the need for more secure and reliable software.
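Patterson's distinction can be made concrete with a toy example (illustrative only): thread-level parallelism splits independent tasks across threads, while data-level parallelism applies one operation across a whole array at once.

    # Toy contrast between the two parallelism styles (illustrative only).
    from concurrent.futures import ThreadPoolExecutor
    import numpy as np

    def scale_chunk(chunk):
        return [x * 2.0 for x in chunk]

    data = list(range(1_000_000))

    # Thread-level parallelism: independent threads each work on their own
    # chunk.  (CPython's GIL limits true CPU parallelism, but the structure
    # illustrates the style.)
    chunks = [data[i::4] for i in range(4)]
    with ThreadPoolExecutor(max_workers=4) as pool:
        threaded_result = list(pool.map(scale_chunk, chunks))

    # Data-level parallelism: a single vectorized operation over all elements.
    vector_result = np.asarray(data) * 2.0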
The Emergence of GeoSensor Networks
Location Intelligence (02/27/06) Stefanidis, Anthony
While collecting geospatial information has rapidly advanced due to new
technologies such as laser scanners, GPS sensors, and sophisticated
cameras, the emergence of geosensor networks promises to elevate the field
to new heights. The development of geosensor networks is driven by advances in nanotechnology, which have made it relatively easy and cost-effective to develop energy-efficient, semi-autonomous sensors that serve as basic
computing platforms. Because sensor networks depend on the raw data
collection of each node distributed throughout local environments, it is
common for networks to contain sensors with different capture, processing, and communication capacities. High-end networks can contain
hundreds of sensors with the ability to relay data at speeds in excess of
500 Kbps, and an array of gateway sensors to aggregate local data. The
software is typically powered by Linux or the energy-efficient open-source
system TinyOS. New sensor networks have been deployed in a variety of
applications, such as monitoring the quality of drinking water and
improving human/computer interaction. The spatial component of geosensor
networks appears either at the content level, where it is the prime characteristic of the data being collected, or at the analysis level, where
spatial considerations are applied to the data once they have been
gathered. Geosensors can be used for a variety of data collection
applications across vast areas, such as monitoring traffic patterns or
tracking a single car around a major city. While sensors have been
gathering spatial data for years, the old calibrated model that had to be
used in controlled areas has given way to wireless networks of varying
sensors that can produce homogeneous datasets. Geosensors also time-stamp
their data, enabling a quantifiable assessment of change over time.
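A minimal sketch of what a time-stamped geosensor reading and a gateway-style aggregation step might look like follows; the field names and structure are assumptions for illustration, not taken from any deployed network:

    # Minimal sketch of time-stamped geosensor readings and a gateway-style
    # aggregation step.  Field names are illustrative assumptions.
    from dataclasses import dataclass
    from statistics import mean

    @dataclass
    class Reading:
        sensor_id: str
        lat: float
        lon: float
        timestamp: float   # seconds since epoch
        value: float       # e.g. a water-quality measurement

    def aggregate_by_sensor(readings):
        """Average the values reported by each sensor node."""
        by_sensor = {}
        for r in readings:
            by_sensor.setdefault(r.sensor_id, []).append(r.value)
        return {sid: mean(vals) for sid, vals in by_sensor.items()}

    readings = [
        Reading("node-01", 37.87, -122.26, 1141000000.0, 7.1),
        Reading("node-01", 37.87, -122.26, 1141000060.0, 7.3),
        Reading("node-02", 37.88, -122.25, 1141000000.0, 6.8),
    ]
    print(aggregate_by_sensor(readings))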
Cyberthieves Silently Copy as You Type
New York Times (02/27/06) P. A1; Zeller Jr., Tom
Many computer users are already aware of the dangers of phishing attacks, but they may not be aware of keylogging programs that silently
copy the keystrokes of computer users and send that information to the
criminals. Recently in Brazil, federal police went to Campina Grande and
several surrounding states and arrested 55 people for seeding the computers
of Brazilians with keyloggers that recorded their typing whenever they
visited their banks online. The criminal ring stole about $4.7 million
from 200 different accounts at six different banks since it began
operations last May, according to the Brazilian authorities. Keylogging
programs work by exploiting security flaws and monitoring the path that carries data from the keyboard to the other parts of the computer. They
are often more intrusive than phishing attacks. The monitoring programs
can be hidden inside ordinary software downloads, email attachments, and
files. "These Trojans are very selective," says Cristine Hoepers, general
manager of Brazil's Computer Emergency Response Team. "They monitor the
Web access the victims make, and start recording information only when the
user enters the sites of interest to the fraudster." The bad news is that
these kinds of crimes are beginning to soar. The number of Web sites known to be hiding this kind of malicious code nearly doubled between November
and December to more than 1,900, according to the Anti-Phishing Working
Group. iDefense says there were over 6,000 different keylogger variants last year, a 65 percent increase from 2004. The SANS Institute estimates
that last fall, as many as 9.9 million machines in the United States were
infected with some kind of keylogger, putting as much as $24 billion in
bank account assets in the hands of crooks. To reduce the growing threats,
the Federal Deposit Insurance Corporation strengthened its guidelines for
Internet banking this past fall, requiring banks to do more than just ask
for a user name and password.
Implementing a Quantum Computation by Free Falling
Science (02/24/06) Vol. 311, No. 5764, P. 1106; Oppenheim, Jonathan
Quantum computers hold numerous advantages over traditional machines, and nowhere is the difference more pronounced than in the theory of
computation. While only a few quantum algorithms exist, interest in the
field is growing rapidly, and some have speculated that quantum computing
could eventually solve all the problems that can be verified with a
conventional computer, which would have a tremendous impact on our
understanding of physics. In gauging the efficiency of a computation, one seeks to determine whether the computation runs in polynomial time (efficient) or exponential time (inefficient). The physical mapping between the initial quantum states and the final states is known as a unitary evolution, and most unitary evolutions cannot be implemented efficiently. A computation is efficient if the number of steps used by the computer, each of which consists of a set of basic interactions known as a gate, grows only polynomially. The efficiency of a computation can therefore be measured once its unitary evolution is broken down into the smallest number of rudimentary gates. To determine
whether computations are running at optimal efficiency, a group of
researchers has essentially plotted the coordinates that a computation
should travel and calibrated the speed of a clock based on whether it takes
the most efficient (polynomial) route, or whether it follows a path that
invites unnecessary complications, such as the interaction of more than two
qubits. The goal is to send the computation along a geodesic, or the route
that an object will follow if free falling. Still, in the quantum
environment, there can be multiple geodesics, compelling scientists to
apply Riemannian geometry in the complex search for the shortest.
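In symbols (a hedged sketch of the geometric picture described above; the cost-functional form is an assumption drawn from this line of work rather than a quote from the article), the geodesic distance from the identity I to the target unitary U is the length of the shortest control path that generates U:

    % Sketch only: the gate count needed to synthesize U is tied to d(I, U).
    d(I, U) = \min_{H(\cdot)} \int_0^1 F\bigl(H(t)\bigr)\,dt,
    \qquad \text{subject to } \frac{dU(t)}{dt} = -i\,H(t)\,U(t),
    \quad U(0) = I, \quad U(1) = U

where the cost function F penalizes Hamiltonians that couple more than two qubits at once. In this picture a computation is efficient when d(I, U) grows only polynomially with the number of qubits, since the number of elementary gates needed to synthesize U is polynomially related to that geodesic length.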
Power Surge
InformationWeek (02/27/06) No. 1078, P. 38; Dunn, Darrell; Claburn, Thomas
Denser server clusters mean greater electricity consumption and higher
heat output, which often entails additional energy consumption for heat
management. Data centers currently consume around $3.3 billion a year in
electricity, and IDC expects the number of U.S. servers to increase by half
over the next four years; the research firm also estimates that electricity
bills for businesses rose by 20 percent between 2004 and 2005. Meanwhile,
a recent poll of 200 AFCOM members found that data centers suffer more than
one serious blackout each year on average, while one-fifth of all data
centers run at 80 percent or more of their power capacity. Technology
vendors are emphasizing new heat-management systems to address the problem,
while Web giants such as Yahoo! and Google are focusing on the development
of computer models that are both cost- and energy-efficient. "It used to
be that you wanted the fastest processor and to process as much as possible
in a small footprint," explains Yahoo! CIO Lars Rabbe. "But everyone has
realized that power pricing, and having low-power CPUs and low-power
systems in general, are becoming more important." A measure of relief can
be derived from the replacement of inefficient servers with power-efficient
technologies such as multicore processors and virtualization. Sun, AMD,
and the EPA hosted a summit in January designed to make the industry more
aware of cooling and energy issues. Andrew Fanara of the EPA's Energy Star program says his agency can help coordinate conferences among server
makers, cooling-equipment makers, microprocessor vendors, and data center
managers to develop plans for pinpointing and addressing problems through
the creation of a metric that enables buyers to more readily assess
computer products' efficiency.
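No such buyer-facing metric existed at the time of the article; a crude illustration of the kind of figure under discussion (the formula and numbers below are invented) is simply work delivered per watt drawn:

    # Crude illustration of an efficiency metric of the kind discussed above:
    # throughput per watt.  The metric and the numbers are invented examples.
    servers = [
        {"name": "server-a", "throughput_ops": 120_000, "power_watts": 450},
        {"name": "server-b", "throughput_ops":  95_000, "power_watts": 260},
    ]

    for s in servers:
        s["ops_per_watt"] = s["throughput_ops"] / s["power_watts"]

    best = max(servers, key=lambda s: s["ops_per_watt"])
    print("Most efficient:", best["name"], round(best["ops_per_watt"], 1), "ops/W")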
Knowledge Management and the Semantic Web: From Scenario
to Technology
IEEE Intelligent Systems (02/06) Vol. 21, No. 1, P. 53; Warren, Paul
The Semantic Web was originally envisioned as a tool for providing
services, but the concept has been reimagined as a complementary knowledge
management environment with unique requirements. The ability to
semiautomatically learn ontologies and extract metadata is one such
requirement. This ability would aid users as they create new knowledge and
help accommodate a massive volume of online legacy data. The original
vision of the Semantic Web assumes that service providers are highly driven
to manually generate metadata so the computer can interpret the service,
and that the quantities of information to be dealt with are relatively
limited; the knowledge management vision cannot support these assumptions,
so methods for reducing the knowledge creator's burden are needed. BT
Research's Paul Warren sees a need for automatic annotation of documents
with metadata via software capable of statistical and linguistic analysis,
and a user interface that facilitates easy and natural metadata insertion.
Automatic or semiautomatic ontology generation, again through statistical
and linguistic methods, is called for in scenarios requiring ontologies for
specialized domains or ontologies that must evolve in keeping with domain
changes. Ontology mediation can combine knowledge from different
ontologies, while visualization techniques can show users relationships in
an ontology and the affiliated metadata. The challenge of realizing the knowledge management vision of the Semantic Web is twofold: deploying knowledge management systems, and achieving organization-wide functionality. The first challenge might be addressed by supplementing the results of semiautomatic ontology-learning and metadata-generation methods with information derived from the context in which the user is working; the second might be tackled by business process software designed to encourage the user to save to, or retrieve from, the knowledge repository at major decision points.
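A toy illustration of what "semiautomatic" metadata extraction can mean in practice is a frequency-based keyword tagger whose suggestions the user then accepts or corrects; the sketch below is an invented example, not BT's system:

    # Toy illustration of semiautomatic metadata extraction: propose keyword
    # tags by term frequency and let the user accept or correct them.
    from collections import Counter

    STOPWORDS = {"the", "a", "an", "of", "and", "to", "in", "is", "for", "with"}

    def suggest_tags(text, k=5):
        words = [w.strip(".,;:").lower() for w in text.split()]
        words = [w for w in words if w and w not in STOPWORDS]
        return [word for word, _ in Counter(words).most_common(k)]

    doc = ("Ontology mediation can combine knowledge from different ontologies, "
           "while visualization techniques can show users relationships in an "
           "ontology and the affiliated metadata.")
    print(suggest_tags(doc))  # the user reviews these suggestions before saving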
Modern Performance Monitoring
Queue (02/06) Vol. 4, No. 1, P. 52; Purdy, Mark
A new performance monitoring and analysis paradigm is needed as the
computer world grows increasingly heterogeneous and decentralized, writes
PurSoft's Mark Purdy. Purdy has developed the PurSoft Analyst, a
performance analysis toolkit created to address performance monitoring
problems he has encountered, which could benefit diverse Unix environments.
The toolkit was designed to facilitate a consistent representation of data
on the graphical user interface regardless of whether the data comes from a
real-time thread or from a disk log file. Included within PurSoft Analyst
is a command-line interface-style logging binary that can be remotely run
on any server. This function can log a system crash, be scheduled for a
specific time, or capture the machine activity when certain user-settable
criteria are fulfilled; the logs can be compiled and catalogued for support
center analysis or for select vendor system engineer evaluation. Since
both the Unix engineer on the floor and primary commercial Unix vendors
receive the same look and feel from the logging tool, all participating
parties involved in a server incident get an identical incident metrics
perspective. PurSoft Analyst also features a profiling analysis tool that
helps the Unix engineer locate any item in the sampled Unix data that
diverges from any user-defined baseline. A RulesEngine that monitors the
computer and flags any variance can be defined by this profiler, according
to Purdy. The author predicts that AI-style profiling will accelerate progress toward real-time problem determination by enabling profilers to spot a hardware or software variance that exceeds a user-settable threshold, analyze the process data, and ascertain the
processes behind the incident.
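As a hedged sketch of the baseline-and-threshold idea described above (the rule structure is an assumption for illustration, not PurSoft's implementation):

    # Hedged sketch of a baseline/threshold rules engine of the kind described
    # above.  The rule structure is illustrative, not PurSoft's implementation.
    def check_sample(sample, baseline, thresholds):
        """Flag any metric that diverges from its baseline by more than the
        user-settable fraction given in `thresholds`."""
        flags = []
        for metric, value in sample.items():
            base = baseline.get(metric)
            limit = thresholds.get(metric)
            if base is None or limit is None or base == 0:
                continue
            if abs(value - base) / base > limit:
                flags.append((metric, value, base))
        return flags

    baseline   = {"cpu_user_pct": 20.0, "run_queue": 2.0}
    thresholds = {"cpu_user_pct": 0.50, "run_queue": 1.00}  # allowed variance
    sample     = {"cpu_user_pct": 85.0, "run_queue": 3.5}
    print(check_sample(sample, baseline, thresholds))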