In Patent Disputes, a Scramble to Prove Ideas Are Old
Wall Street Journal (01/25/06) P. A1; Squeo, Anne Marie
In defending its patents, Research In Motion (RIM) offered as evidence
eight reports issued by the Norwegian Telecommunications Administration
from 1986 to 1989 that detail a wireless email system that predates NTP's
patents, which are the subject of an infringement suit that could shut down
RIM's popular BlackBerry device in the United States. RIM's strategy of
searching for old research that could undermine the legitimacy of a rival's
patent claim, known as prior art, echoes the actions of many companies
around the country that are having to defend themselves in infringement
cases. RIM's discovery resonated with the Patent Office, which rejected
NTP's patents in a preliminary decision last year, citing the Norwegian
research. "All you need is one reference that was publicly available, and
that's sufficient to invalidate a patent," said patent attorney Dennis
Crouch. Until the mid-1990s, patent examiners searched manually for
evidence of prior art, but the Internet has streamlined the process
significantly: entire Web sites are now devoted to the discovery of prior
art, and obscure and out-of-print books and journals are available from
all corners of the world. The patent process requires
applicants to search for prior art, and examiners do the same, though with
the current backlog of applications the typical patent only receives about
20 hours of consideration. An industry official, whose name RIM would not
disclose, tipped off the company to the Norwegian reports, which were
issued at least two years prior to NTP's first patent. Although the Patent
Office has invalidated four of NTP's five patent claims, citing the
Norwegian research, the company's federal lawsuit is proceeding briskly,
with a judge having threatened to halt BlackBerry service in the United
States as early as next month.
NJIT's SmartCampus Project to Create Closer Connections
Among People and Places
New Jersey Institute of Technology (01/23/06)
The New Jersey Institute of Technology is preparing to launch SmartCampus,
an experimental program that will explore new ways for students to connect
with each other through cell phones and other wireless devices. The
program aims to link students together with shared interests and provide
information about campus news and events. "We'll use mobile tracers to
detect the places where students like to gather and use those places to
identify students' interests and patterns," said assistant professor of
information systems Quentin Jones. "SmartCampus is a unique social
computing research project that uses technology to unite an urban
environment--in this case the NJIT campus--into a community." The
development group draws from the disciplines of electrical engineering,
computer science, information systems, and human-computer interaction in an
effort to cultivate personal connections and a sense of place that could
eventually change the way people interact in urban areas. The National
Science Foundation is contributing $1.7 million to the SmartCampus project
over the next three years, some of which will be used to provide equipment
such as cell phones, laptops, and other wireless devices to program
participants. SmartCampus will begin with 100 volunteers equipped with the
technology to locate and interact with each other. The program will then
expand to 500 participants and eventually will include the entire campus.
Volunteers will have software that enables them to access a database
comprising the interests and activities of their fellow participants.
The researchers are aware of the privacy and safety concerns that surround
this initiative, and are requiring participants to specify which personal
information they are willing to make public. Collecting geotemporal data
is a central component of the project, though users can block the view of
their location at certain places and times.
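As an illustration of how such a place-and-time block might be enforced on
a participant's device, consider the minimal Python sketch below. The rule
structure, coordinates, and function names are hypothetical and are not
drawn from the SmartCampus software.

from dataclasses import dataclass
from datetime import time
from typing import Optional

@dataclass
class BlockRule:
    """A user-defined rule: hide my location near a point in a time window."""
    lat: float
    lon: float
    radius_m: float   # hide within this many meters of (lat, lon)
    start: time       # hide between start and end of the day
    end: time

def _within(rule: BlockRule, lat: float, lon: float) -> bool:
    # Crude flat-earth distance, adequate at campus scale.
    dlat = (lat - rule.lat) * 111_000   # rough meters per degree latitude
    dlon = (lon - rule.lon) * 85_000    # rough meters per degree longitude
    return (dlat * dlat + dlon * dlon) ** 0.5 <= rule.radius_m

def filter_report(rules: list[BlockRule], lat: float, lon: float,
                  now: time) -> Optional[tuple[float, float]]:
    """Return the location to publish, or None if a rule suppresses it."""
    for rule in rules:
        if rule.start <= now <= rule.end and _within(rule, lat, lon):
            return None   # blocked: publish nothing for this place and time
    return (lat, lon)

# Hypothetical example: hide location near the library every evening.
rules = [BlockRule(40.742, -74.178, radius_m=100,
                   start=time(18, 0), end=time(23, 59))]
print(filter_report(rules, 40.7421, -74.1779, time(20, 30)))   # None
print(filter_report(rules, 40.7421, -74.1779, time(9, 0)))     # published

Suppressing the report outright, rather than coarsening it, matches the
article's description of users blocking the view of their location
entirely.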
A New Way to Help Computers Recognize Patterns
Ohio State Research News (01/24/06) Gorder, Pam Frost
A pair of Ohio State University researchers has developed a technique for
improving pattern recognition software by determining in advance the most
appropriate algorithm for a given application. Aleix Martinez, assistant
professor of electrical and computer engineering, develops algorithms that
simulate human vision, a field that draws heavily on pattern recognition
and traditionally uses the body of techniques collectively referred to as
linear feature extraction. Martinez notes the limitations of that
approach, however, claiming that researchers can explore a given method for
weeks, only to discover that it does not work. Martinez developed a test
with doctoral student Manli Zhu that rates the effectiveness of a given
algorithm when applied to a particular application. Using an imperfect
algorithm does not necessarily yield incorrect information, but it usually
produces superfluous information that only adds to the scientist's
workload. The researchers applied algorithms to a data-sorting exercise
and ranked their effectiveness on a scale of zero to one. Martinez and Zhu
used each algorithm to sort through a database of facial images, and then
to distinguish items within a database of objects, such as apples and
pears. Scores closest to zero were the most accurate, meaning that an
algorithm that scored 0.68 had just a 32 percent accuracy rate, and would
probably not be worth a scientist's time.
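As a rough illustration of such a zero-to-one score (this is not Martinez
and Zhu's actual test), the Python sketch below rates candidate feature
extractors by the error of a simple nearest-centroid classifier, so that
zero is best and a score of 0.68 corresponds to 32 percent accuracy. The
toy data and all names are invented.

import numpy as np

def error_score(X: np.ndarray, y: np.ndarray) -> float:
    """1 - accuracy of a nearest-centroid classifier, on a 0-to-1 scale."""
    classes = np.unique(y)
    centroids = np.array([X[y == c].mean(axis=0) for c in classes])
    dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    predictions = classes[dists.argmin(axis=1)]
    return float((predictions != y).mean())

rng = np.random.default_rng(0)
# Toy two-class data: 100 samples, 10 features; classes differ in feature 0.
y = np.repeat([0, 1], 50)
X = rng.normal(size=(100, 10))
X[:, 0] += y * 3.0

# Candidate "feature extractions": all features vs. the informative one only.
candidates = {"all features": X, "feature 0 only": X[:, :1]}
for name, Xt in candidates.items():
    print(f"{name}: error score = {error_score(Xt, y):.2f}")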
After Subpoenas, Internet Searches Give Some Pause
New York Times (01/25/06) P. A1; Hafner, Katie; Bernstein, David;
Falcone, Michael
As the Justice Department aggressively pursues a court mandate for search
companies to disclose their customers' queries, many Internet users are
modifying their search habits. Google has been resistant to the Justice
Department's request, though MSN, AOL, and Yahoo! have all complied,
echoing the government's assurance that the search terms do not link back
to their original source, so there is no danger of compromising personal
information. The government has recently stepped up its monitoring
activities in the name of combating terrorism, though the revelation that
the National Security Agency has been intercepting emails and phone calls
without court-issued warrants has outraged many privacy advocates and civil
libertarians. Many Americans have no qualms about disclosing personal
information for online banking or other activities where there is a
perceived gain, but the notion of wanton government surveillance has
aroused acute resentment and anxiety among the same body of Internet users.
Still others are reassured by the ubiquity of the Internet, reasoning that
with nearly half of the almost 300 million people in the United States
using the Internet, the odds of being snared in the government's web are
slim. The motion to subpoena Google's search records has nonetheless
caused many Internet users to think more carefully about the search terms
that they select. Google claims that the subpoena would compromise its
trade secrets and impose an unreasonable burden of compliance, as well as
potentially expose users' personal information, an eventuality which Google
has said is untenable.
Internet Coalition Sets Up Anti-'Badware' Site
Washington Post (01/25/06) P. D4; Mohammed, Arshad
The Stop Badware Coalition, which consists of Google and institutes at
Harvard and Oxford universities, today will announce the launch of an
anti-spyware campaign designed to counteract the spread of malicious
computer programs that have the ability to steal personal information, spy
on users who are Web surfing, and overcrowd computers with pop-up ads. The
coalition will have a Web site, www.stopbadware.org, that catalogs programs
that are dangerous to users so they can know if a program is harmful before
downloading it. Companies that manufacture malicious software will be
targeted for possible class-action lawsuits. "For too long, unscrupulous
companies have made millions of dollars infecting our computers with
malicious software," says Stop Badware Coalition co-director John Palfrey.
"This is so dangerous because there are intruders in your house, but you
don't know that they are in there or how they got there." Harvard Law
School's Berkman Center for Internet and Society and the Oxford Internet
Institute are the two main groups involved in the project, which is
receiving funding from Google, Lenovo Group, and Sun Microsystems.
Consumer Reports WebWatch says it will be an unpaid advisor to the
coalition. Google VP Vinton Cerf says "our interest is very strong in
doing anything we can to help defend against this sort of abusive
behavior."
Can Video iPod Lead to DMCA Reform?
CNet (01/23/06) McCullagh, Declan
The growing popularity of Apple's video iPod could provoke a groundswell
of opposition to the Digital Millennium Copyright Act (DMCA) of 1998, which
prohibits the distribution of software that can rip content from a DVD.
"Our best hope for getting amendments to the DMCA is for more regular
customers to feel the pinch of the DMCA," said the Electronic Frontier
Foundation's Fred von Lohmann. Digital rights advocates have protested the
DMCA since its inception, though until the iPod, no mainstream consumer
device had been significantly affected by the legislation. Previous
DMCA-based legal actions against 321 Studios, maker of DVD-copying
software, and against many computer scientists and security researchers
passed under the radar of most Americans. Reps. Rick Boucher (D-Va.) and
Zoe Lofgren
(D-Ca.) have both introduced legislation seeking to address the iPod
problem. The Boucher bill is mired in ambiguous language that still
restricts the distribution of software that enables DVD ripping. The
Lofgren bill gives broader legal support to software that can circumvent
DVD encryption, though it has yet to garner much support. A broad swath of
the entertainment industry supports the DMCA as it stands, and the bill
passed through Congress with an overwhelming majority. Although many
hardware companies and Internet providers such as Intel, Sun, Verizon, and
Red Hat have pressed for reform, the outcome could be determined by how
loudly video iPod users complain.
Pleasing Plant Shapes Explained by New Computer
Model
EurekAlert (01/23/06)
A team of computer scientists from the University of Calgary has developed
an animated model that simulates the growth of plants as they assume
recognizable patterns. Theirs is the first detailed model of
phyllotaxis--the process, beginning at the molecular level, in which
lateral organs gather around a central axis, creating the familiar spiral
pattern in many
plants. "Biologists have many theories about why phyllotaxis exists but
have always wondered how it happens," said Richard Smith, a PhD student at
Calgary. "This model is exciting because it proposes a mechanism that
works and can be used to try and prove some of the biological theories
about the growth process." Smith and computer science professor Przemyslaw
Prusinkiewicz partnered with a team of Swiss botanists to develop
three-dimensional models of plant growth at the microscopic level,
revealing the process of cell division and the incrementally spaced
concentrations of the plant growth hormone auxin. The model showed the
clear development of spiral patterns that appear in many common plants,
such as daisies and sunflowers. The scientists hope that their research
will provide botanists with an additional tool to supplement their
biological experiments. Prusinkiewicz says, "This was a great example of
the synergy you can have between biology and computer science and how the
tools of one discipline can be used to answer questions in another."
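The spiral arrangements the model reproduces can also be described
compactly by Vogel's classic golden-angle formula. The short Python sketch
below generates such a pattern; it is a descriptive shortcut, not the
Calgary group's auxin-transport simulation.

import math

GOLDEN_ANGLE = math.pi * (3 - math.sqrt(5))   # about 137.5 degrees

def phyllotaxis_points(n: int, c: float = 1.0):
    """Positions of the first n organs: r = c*sqrt(k), angle = k*137.5 deg."""
    for k in range(n):
        r = c * math.sqrt(k)
        theta = k * GOLDEN_ANGLE
        yield (r * math.cos(theta), r * math.sin(theta))

# Print the first few organ positions of a sunflower-style spiral.
for x, y in phyllotaxis_points(5):
    print(f"({x:6.2f}, {y:6.2f})")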
GPL 3 Draft Draws Mostly Positive Response
IDG News Service (01/23/06) Martens, China
Preliminary reactions among lawyers and the open source community to the
Free Software Foundation's (FSF) draft of the third version of the GPL have
been generally favorable. Despite lingering concerns about software
patents and digital rights management (DRM), the industry has also been
receptive to the first update to the GPL in 15 years, generally crediting
the draft for taking a multilateral approach. The FSF estimates that the
majority of all free and open-source software, including MySQL and Linux,
is distributed under the GPL. The FSF formally released the draft at the
First International Conference for GPLv3 at MIT, which boasted some 262
registered attendees from a broad cross-section of companies and countries.
The FSF's Eben Moglen described the audience's reaction to his
presentation as attentive, and noted that many in the crowd seemed relieved
that the new version was not pushing copyleft--the provision where works
and their subsequent updates and revisions are free--too far. One
controversial element of the draft protects "downstream" software users
from patent infringement suits by large companies. Moglen hopes that
companies will either disavow their own licenses or employ new patent tools
such as the Open Invention Network created by IBM, Red Hat, and other
groups. While the FSF has long been critical of the software patent
process, a newer source of controversy in the update is DRM, which Moglen
describes as an endless cause of harm. The update also makes the GPL
compatible with the Eclipse and Apache licenses, two other principal
open-source licenses.
Moglen notes that the new GPL takes a more international approach than
previous versions, as it was written to untether itself from U.S. law,
though English will remain its only official language. Moglen and FSF
President Richard Stallman expect to release the second version of the
draft in late May.
Encryption Using Chaos
Technology Review (01/24/06) Greene, Kate
Security researchers are exploring a new method of encryption in which
the chaotic fluctuations of a laser beam encode messages passing over
fiber optic cable; decoding requires a receiving laser with almost
identical properties. The University of the Balearic Islands' Claudio
Mirasso used
chaotic laser encryption to transmit data at 1 Gbps, a speed equivalent to
the typical commercial transmission rate. To transmit data inside a
chaotic laser, the message must first be translated into an optical signal,
which is then funneled into the laser that emits it along with its beam.
The chaos of the beam is then accentuated, and the message is transmitted
to a receiving beam of near-identical properties. Upon receipt of the
message, the process gives way to chaotic synchronization, which, while
still not completely understood, pairs the sending and receiving lasers
together, and the receiving laser subtracts the chaos of the transmission
to recover the original message. Chaotic laser encryption will have to
prove its effectiveness if it is to supplant conventional optical signals,
though a body of scientists has already reported the successful
transmission of a chaos-encrypted message through an intermediate laser,
which is critical for commercial applications where messages would have to
travel great distances. While Mirasso admits that the basic technology is
not perfect, he will next turn his attention to developing smaller devices
for communication based on chaotic encryption, though he does not expect
commercial applications of the technology to appear for the next five
years.
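The synchronize-and-subtract step can be illustrated numerically. In the
toy Python sketch below, a shared logistic-map trajectory stands in for
the matched transmitter and receiver lasers; real systems rely on physical
chaotic synchronization rather than a shared seed, so this is only an
analogy.

def logistic_chaos(x0: float, n: int, r: float = 3.99) -> list[float]:
    """n samples of the chaotic logistic map x -> r * x * (1 - x)."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(x)
    return out

message = [0.01, 0.02, -0.01, 0.03, 0.0]   # small-amplitude signal
SEED = 0.123456                             # stands in for the matched lasers

carrier = logistic_chaos(SEED, len(message))
transmitted = [c + m for c, m in zip(carrier, message)]   # buried in chaos

receiver = logistic_chaos(SEED, len(message))             # synchronized copy
recovered = [t - c for t, c in zip(transmitted, receiver)]

print([round(v, 4) for v in recovered])   # matches the original message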
Manufacturing Gets Absolutely Fabulous
BBC News (01/23/06) Day, Peter
Long a center for innovation and cutting-edge research, MIT's Media Lab
under the direction of founder Nicholas Negroponte has extricated itself
from the day-to-day computing concerns to embrace the fantastic
possibilities of the digital future. In the same vein, MIT's Neil
Gershenfeld has been running MIT's Center for Bits and Atoms since 2002,
which harnesses an interdisciplinary talent pool to manufacture new and
innovative devices through unconventional techniques. By linking
innovation to manufacturing, Gershenfeld is attempting to level the barrier
between thought and action that he claims has bifurcated man into clearly
delineated camps since the Renaissance. "Computer science is one of the
worst things that ever happened to computers or to science, because it
prematurely froze the notion of computing on what was possible in 1950,"
Gershenfeld explains as he describes his vision of computers that digitize
production by acting as tools themselves, similar to how proteins direct
the human body, rather than simply serving as the control center for other
tools. Gershenfeld hopes that in 10 years, three-dimensional copiers will
be as commonplace as today's printers and conventional copiers, though the
price of mini-fabs currently ranges between $20,000 and $30,000. When
prices come down, Gershenfeld looks to the home-fab as the wrecking ball
that will level the digital barrier between rich and poor, carrying with it
the promise of a capacity for innovation that knows no economic or national
boundaries.
Privacy for People Who Don't Show Their Navels
New York Times (01/25/06) P. 7; Glater, Jonathan D.
There is growing interest today in software that protects the
confidentiality of Internet users sending emails and posting blogs. Tor, a
free anonymity software package, has seen increased downloads, while the
free Java Anonymous Proxy is another program for users who want to
communicate anonymously. While it is difficult to quantify how many people
have opted to conceal their Internet presence, the recent surge in Web
anonymity can be attributed to a growing number of users who want to
download music but are concerned about legal reprisals from the
entertainment industry, as well as those who use the Internet as a
confessional or a forum for political dissent. Electronic Frontier
Foundation technology manager Chris Palmer says, "People in the world are
more interested in anonymity now than they were in the 1990s." Many
software companies invested heavily in identity-protection technology
several years ago, ahead of demand, and have since moved away from it,
even as a rapidly growing number of Internet users grows concerned about
hackers looking to steal credit card numbers, bank accounts, and other
sensitive personal information. Despite the renewed interest, many
identity
protection ventures are still having difficulty turning a profit. Tor's
Defense Department funding has run out, and project leader Roger Dingledine
is now working without compensation as he searches for new backers. Tor
employs a technique called onion routing, in which a tiered structure of
relay servers separates the user from the sites he visits. The Privoxy
software
that comes with Tor prevents new cookies from being created and blocks a
computer from sending some personal information to Web sites, though the
package can slow browsing speeds.
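The layered wrapping behind onion routing can be sketched in a few lines
of Python. The example below uses the third-party cryptography package's
Fernet cipher purely for brevity; Tor's real circuit construction and
per-hop key agreement are far more involved.

from cryptography.fernet import Fernet

relay_keys = [Fernet.generate_key() for _ in range(3)]   # one key per relay

# The sender wraps the message once per relay; the layer encrypted last is
# peeled first, by the entry relay.
onion = b"GET http://example.org/"
for key in reversed(relay_keys):
    onion = Fernet(key).encrypt(onion)

# Each relay peels exactly one layer. In a real onion-routing network, no
# single relay learns both the sender and the destination.
for i, key in enumerate(relay_keys):
    onion = Fernet(key).decrypt(onion)
    print(f"relay {i} peeled a layer")

print(onion)   # b'GET http://example.org/'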
Web Surfers Decide a Site's Worth in Fraction of a
Second: Study
Canadian Press (01/22/06) Pacienza, Angela
Businesses would do well to invest more in the appearance of their Web
sites, suggests Gitte Lindgaard, a professor of human-computer interaction
in the department of psychology at Carleton University in Ottawa.
Lindgaard is the co-author of a study that reveals that Web surfers base
decisions on how they feel about a particular site according to what they
immediately see. At the university's Human-Oriented Technology Lab,
researchers compared the assessments of volunteers exposed to sites for 500
milliseconds, or half a second, with those of a new set of volunteers who
only had 50 milliseconds, or 1/20th of a second, to make a judgment. The
results were similar for both groups, and were completely based on visual
appearance because the volunteers had very little time to consider whether
they felt good or bad about a site. "People make up their minds very, very
quickly about how much they like what they see," says Lindgaard, adding
that those who like what they see will then continue at the site in an
attempt to prove that they made a good decision. "The message to Web
developers is, at this point, you better make sure you don't offend people
visually," says Lindgaard. The study can be found in the current edition
of the journal Behaviour and Information Technology.
They're Hiring in Techland
BusinessWeek (01/23/06) Ante, Spencer
The technology industry is no longer producing over 300,000 jobs a year as
it did in the late 1990s, but the industry is starting to create an average
of about 150,000 jobs a year. After adding some 125,000 tech jobs last
year, according to Moody's Economy.com, chief economist Mark Zandi is now
forecasting that 217,000 jobs will be created in 2006. "As the memory of
the tech bust fades, we seem to be getting better and better job growth,"
says Zandi. The steady pace of tech job growth comes at a time when
corporate America is starting to boost spending on software and business
equipment, and the hiring is expected to be somewhat broad-based.
Companies such as Google, Microsoft, Accenture, Amazon, Advanced Micro
Devices, and Infosys indicate they will be joining startups such as
NetSuite and OfficeTiger in hiring more tech workers in the United States
this year. The best opportunities will be available to high-level software
engineers, management consultants, and computer scientists, while low-level
engineers will continue to see jobs outsourced, and major sectors such as
telecommunications and enterprise software are expected to show more
weakness. Economy.com expects the average high-tech salary to rise in the
mid- to high-single digits this year, compared with a 5.1 percent increase
in 2005 to $69,000, and a 4.3 percent gain in 2004. Also, Silicon Valley
is no longer the hub of tech job growth: only 2,000 new tech positions
were added in the region last year.
Bringing Communities to the Semantic Web and the Semantic
Web to Communities
University of Southampton (ECS) (01/23/06) Lawrence, K. Faith; Schraefel,
M.C.
K. Faith Lawrence and M.C. Schraefel of the University of Southampton
argue that the key types of virtual communities existing within the
Semantic Web--Communities of Practice (COPs) and social networks--may not
necessarily qualify as communities when compared with definitions found in
other areas. The authors propose a hybrid COP/social network model, the
Internet Based Community Network (IBCN), in which the properties of both
kinds of networks are combined to fulfill the definition of a community;
applications and services can then be developed and operate with the
assumption that the network acts in ways specific to a community. In a
COP, the links are inferred, the focus is on practice, and social
interaction is not required; in a social network, links are explicit, the
focus is on people, and shared purpose and behaviors are not mandatory. An
IBCN features explicit as well as implicit links, and focuses on people as
members of the community. Lawrence and Schraefel focus on amateur
communities where the creation and sharing of data is incorporated into the
platform on which the community infrastructure rests, and collaborating
with such communities offers an opportunity to work in a dynamic, data-rich
environment. Careful study of community interaction allows researchers to
explore the transformation of data sharing from good practice to expected
behavior, which raises the issue of whether the community's governance
elements can be migrated to non-community networks. To support the
Amateur Online Writing Community by adding semantic data to its existing
work and interaction processes, the authors developed two ontologies, the
Fan Online Persona ontology and the OntoMedia ontology. The ontologies
are customized to the community's needs, and software is being developed
to allow community members to interact indirectly with the
ontology-constrained metadata.
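The COP/social-network distinction can be made concrete with a small
Python sketch; the class and field names below are invented for
illustration and are not taken from the authors' ontologies. An IBCN-style
member carries explicit, declared links alongside implicit links inferred
from shared practice.

from dataclasses import dataclass, field

@dataclass
class Member:
    name: str
    practices: set[str] = field(default_factory=set)       # shared activities
    explicit_links: set[str] = field(default_factory=set)  # declared contacts

def implicitly_linked(a: Member, b: Member, min_shared: int = 1) -> bool:
    """Infer a COP-style link from overlapping practice."""
    return len(a.practices & b.practices) >= min_shared

alice = Member("alice", {"fan-fiction", "beta-reading"}, {"bob"})
carol = Member("carol", {"fan-fiction"})

print("explicit link:", "carol" in alice.explicit_links)   # False
print("implicit link:", implicitly_linked(alice, carol))   # True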
A Semantic Solution to Finding Information Among
Peers
IST Results (01/24/06)
The IST-funded SWAP program has developed an open-source system for
retrieving information by infusing peer-to-peer networks with Semantic Web
technologies. The application advances the use of ontologies, or
vocabularies of formal description, by applying them to content in
peer-to-peer databases. File sharing applications have become an
enormously popular method for transmitting content over the Internet,
linking two users' computers together directly, rather than relaying
through a server. "However, because of the distributed nature of P2P
systems it can be hard to find the information you are looking for," said
Marc Ehrig, leader of the SWAP team. Semantic Web technologies enable
computers to retrieve targeted information more quickly, as two pilot SWAP
projects have demonstrated. One pilot program, XAROP, coordinated 30
tourism stakeholders in the Balearic Islands, providing instant access for
travel agents, tour operators, and other concerned parties to information
such as the number of guests staying in a particular facility. Bibster,
the second pilot program, is still in use by the academic community to
oversee bibliographic information, facilitating the search of scholarly
articles, journals, and papers through a peer-to-peer network.
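As a toy illustration of why ontologies help peer-to-peer search (this is
not SWAP's implementation), the Python sketch below lets a query for a
broad concept match content that a peer annotated with a narrower one; the
concept hierarchy and peer data are invented.

# A tiny is-a hierarchy: each concept maps to its broader concept.
IS_A = {"hotel": "accommodation", "hostel": "accommodation",
        "accommodation": "tourism"}

def broader(concept):
    """Yield the concept and every broader concept above it."""
    while concept:
        yield concept
        concept = IS_A.get(concept)

peers = {
    "peer-1": {"mallorca-hotels.pdf": "hotel"},
    "peer-2": {"island-tours.pdf": "tour"},
}

def semantic_search(query: str):
    """Match a document whose annotation is the query or narrower than it."""
    for peer, docs in peers.items():
        for doc, concept in docs.items():
            if query in broader(concept):
                yield peer, doc

print(list(semantic_search("accommodation")))   # finds the hotel document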
NSA Spy Program Hinges on State-of-the-Art
Technology
National Journal (01/20/06) Vol. 38, No. 3, P. 47; Harris, Shane
Cutting-edge data-mining technologies play a key role in the National
Security Agency's (NSA) plan to collect and analyze vast volumes of call
and email traffic to extract valuable data about terrorists and other
potential enemies. Data-mining not only spots key words but also unearths
hidden relationships between data points, and can even identify the
thinking patterns and biases of specific analysts and propose alternative
speculations. In 2002, the Advanced Research and Development Activity
(ARDA) group apportioned $64 million in research contracts for the Novel
Intelligence from Massive Data (NIMD) project, a program to develop an
early-warning system designed to prevent information overload--and thus the
overlooking of important data--among intelligence analysts. A "Call for
2005 Challenge Workshop Proposals" issued by ARDA says research funded by
NIMD is supposed to not only help analysts cope with the flood of data, but
also to "detect early indicators of strategic surprise, and avoid analytic
errors." The NIMD project and other ARDA-supported efforts are very
similar to the Defense Department's Total Information Awareness (TIA)
program, which sought to establish a system for uncovering terrorist plots
by mining intelligence databases as well as private databases; concerns
over TIA's potential to infringe on civil liberties led to the program's
suspension in 2003, but other agencies are continuing the development of
tools used in TIA. Also generating discomfort among lawmakers is the
unanswered question as to whether NSA's current data-mining programs, like
TIA, are making sizable investments in technology and policy research to
safeguard privacy. Tom Armour, a former program manager in the office of
ex-TIA manager John Poindexter, confirms that the NSA's interest in
pursuing projects such as NIMD lies in the analysis of call and email
traffic.
Expert Calls for Increased E-Voting Security
Computerworld (01/23/06) P. 14; Songini, Marc L.
In a Q&A with Computerworld, security specialist Herbert Thompson
describes his volunteer effort to hack into Diebold Elections Systems'
e-voting machines in Leon County, Fla., on Dec. 13, in response to fears
about accuracy and security expressed by local officials. Thompson,
director of research at Security Innovations in Wilmington, Mass., says he
wrote a five-line script in Visual Basic that provided access to the
central tabulator of the Diebold AccuVote optical scan device, and the
opportunity to change votes without leaving a log. He added that Finnish
security expert Harri Hursti was able to change the content of a memory
card, describing his effort as the equivalent of stuffing a ballot box. As
a security expert, Thompson views the issue more as a bad software matter
than as a political one. He says the exercise was not about Diebold,
because other vendors are also making tabulation software and optical scan
gear that is not open to independent audit and analysis. Thompson says the
security of e-voting pales in comparison to the standards of critical
business processes. "There should be much more severe security-testing
requirements," he says. "The key is you need to raise awareness that these
vulnerabilities do exist and can be exploited, and you need a way of
measuring security."
Salaries Stagnant for IT Workers
eWeek (01/16/06) Vol. 23, No. 3, P. 27; Hines, Matt
IT salaries appear to have been flat during the fourth quarter of last
year, according to the preliminary results of a survey of about 2,000 U.S.
companies by research firm Janco Associates. The January 2006 IT Salary
Survey reveals that the mean compensation for computer-industry
professionals rose to $74,636 in the last quarter of 2005, slightly up from
$69,579 during the final quarter of 2004. Salary winners included
management jobs in wireless communications and security, and positions
involving e-commerce operations, computer programming or systems
networking, and production operations. However, the average salary for
rank-and-file workers such as software engineers and database specialists
fell from roughly $95,000 in the third quarter of 2005 to about $94,000.
For the past eight quarters, salary levels have been relatively flat as
companies have looked to curb their IT spending. "There has been a
degrading of the demand for IT professionals because many companies aren't
looking at technology to gain a competitive advantage as much as they see
it as a cost center," says Janco CEO Victor Janulaitis. "Companies are
looking at IT more like any other business unit." Nonetheless, some top
executives and specialized workers could see more growth in their salaries
this year, according to Janco.
On Its Face, ALM's Appealing
Software Development Times (01/15/06) No. 142, P. 29; DeJong, Jennifer
Application life-cycle management (ALM) offers faster and more efficient
application development, although its wide adoption is being held up by a
lack of tool integration and the cultural challenge ALM presents to
developers, according to analysts. Although integrated ALM products can
help companies conform to federal regulations such as Sarbanes-Oxley, the
spreading adoption of service-oriented architectures is an even bigger
driver of ALM integration. "Organizations need to not only make sure they
have the technical processes to deliver application services, but they also
need to make sure they have the organizational capabilities to define,
capture, share, and manage service requirements, services delivery, and
ongoing services support," says Upside Research President David Kelly. He
notes that ALM can deliver enormous benefits to enterprise software
development teams when properly implemented, but discipline is necessary if
life-cycle management is to be effective. Forrester Research analyst Carey
Schwaber says the first thing that must be done is to recognize that ALM
is not merely an integrated tool set but a process in and of itself;
this means all the face-to-face interactions and other communications
necessitated by ALM must be factored in. Interarbor Solutions analyst Dana
Gardner says ALM also combines the traditionally separate processes of
development and deployment, noting that developers traditionally produced
applications based on requirements and handed them off to someone else.
Applications must now be designed to accommodate deployment and flexible
implementation, and this requires a change in thinking. Adding to the
cultural difficulty is the lack of interoperability between different
vendors' tools, according to Ovum analyst Bola Rotibi, who says the Eclipse
Application Lifecycle Framework project could potentially address the issue
by providing a loosely coupled means to link divergent products, based on
Web services.