There are several drivers that compel an IT organization to spend effort on knowledge transfer of legacy applications.
As a maturing workforce of Baby Boomers retires, organizations end up handing over the maintenance of these assets to an incoming
workforce of Gen Y employees.
As organizations strive to cut costs and outsource the maintenance of legacy applications to IT vendors, the mostly young engineers from outsourcing
vendors need to understand the legacy applications before they start support activities.
Mergers and acquisitions result in consolidation of IT systems for seamless functioning of the new organization. Application consolidation requires a
good understanding of existing IT applications so that embedded business rules and concepts are not lost in the process of consolidation.
In all these scenarios, organizations face the challenge of effective knowledge transfer from the experts who have been maintaining these
assets for more than 25 years to the young engineers. Most of these legacy applications embody business rules and business practices developed over
many years, yet are poorly documented. Adding to the problem, most of these legacy applications are mission critical, and organizations cannot afford
disruption to business due to inadequate knowledge transfer.
The traditional techniques of knowledge transfer, such as interviewing, going through source code and reviewing documentation, are proving to be
ineffective. It is not possible to capture, through interviewing alone, the rationale behind the many implementation decisions taken by programmers over
the course of an application's life. Going through the source code of an entire application that has undergone enhancements over many years is very
expensive, and few organizations can fund such an initiative in these difficult times. It is also most likely that documentation of the legacy applications
is out of sync with the executable code. Since knowledge transfer brings an additional cost to existing IT budgets, IT organizations are under pressure
to keep that cost to a minimum.
Use Parametric Estimation for Pragmatic Transition Timelines
Many times, we have observed that transition timelines are arrived at based on cost and/or schedule constraints. We recommend instead that transition
timelines be estimated from parameters such as application complexity, code quality as measured by the application's maintainability index, process
maturity of the IT organization, business criticality of the application, and the type and number of users of the application. Creating an
organization-specific estimation model that takes these parameters into account will help in arriving at a realistic estimate for knowledge transfer.
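Such a model can be combined into a simple scoring function. The sketch below is illustrative, not a calibrated industry formula: the `estimate_transition_weeks` function, its weights and its rating scales are all assumptions that would need to be tuned against an organization's own historical transition data.

```python
# Hypothetical parametric model for estimating knowledge-transfer duration.
# Weights and scales are illustrative assumptions; calibrate them against
# completed transitions before relying on the output.

def estimate_transition_weeks(
    complexity: int,        # application complexity, 1 (simple) .. 5 (very complex)
    maintainability: int,   # code quality via maintainability index, 1 (poor) .. 5 (good)
    process_maturity: int,  # IT organization process maturity, 1 .. 5
    criticality: int,       # business criticality, 1 (low) .. 5 (mission critical)
    user_count: int,        # number of application users
) -> float:
    base = 4.0                              # assumed floor for any hand-over
    effort = base
    effort += 2.0 * complexity              # complex code takes longer to absorb
    effort += 2.0 * (6 - maintainability)   # poor maintainability inflates effort
    effort -= 1.0 * process_maturity        # mature processes (docs, reviews) reduce it
    effort += 1.5 * criticality             # critical apps need deeper shadowing
    effort += 0.5 * (user_count // 100)     # more users, more scenarios to cover
    return max(effort, base)
```

A complex, poorly maintained, mission-critical application then yields a much longer estimated transition than a simple, well-maintained one, which is the behavior any such model should exhibit regardless of the exact weights chosen.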
Upfront and Honest Communication Helps Secure Cooperation During Knowledge Transfer
Most of the business knowledge related to legacy applications remains in tacit form. For an effective knowledge transfer, it is important to ensure the
cooperation of senior programmers to pass on this knowledge. Yet most organizations discount this human factor and mandate that senior programmers
transfer knowledge without clearly articulating the business rationale behind the transition initiative or providing any clarity on the programmers' future role.
Upfront and honest communication on the business rationale for the transition initiative, together with a well-defined plan to absorb senior programmers at
the end of the transition, goes a long way in motivating them to share their tacit knowledge without any sense of insecurity or hostility.
Guided Hands-On Learning Helps to Gain Firm Understanding of the Application Nuances
No amount of reading documentation or interviewing the senior developers will capture the important operational details related to
application maintenance or reveal the implementation assumptions made. We recommend that, in addition to the traditional techniques, teams design mock
exercises or simulate application maintenance enhancements and provide an opportunity for junior developers to work on them under the guidance of
senior programmers. This prepares the receiving team to understand the operational procedures and conventions followed for maintaining the legacy
application.
Involve Senior Developers in Selecting the Recipients of Knowledge Transfer
In most cases, the decision about who will take over the responsibilities is made by other people, which can put the two parties at odds with each other. If
the experienced person has a more active role in identifying and recruiting the junior person, the chances of success increase. We recommend that,
where possible, senior developers should be actively involved in identifying and recruiting the recipients of the knowledge transfer.
A Picture is Worth a Thousand Words
It is a well-known fact that visual models communicate much better than voluminous textual documentation, and model-driven development is an
accepted practice for modern application development. But what do we do with legacy applications that were not developed using visual models?
It is never too late to start this activity; in fact, junior developers can create visual models during the knowledge transfer phase to play back to senior
developers their understanding of the application. Reverse engineering tools can be leveraged in some cases to create visual models from the source code.
Seek Knowledge from Experts from Other Organizations Within the Same Industry Vertical
We often encounter situations where the original developers of the application are no longer with the organization and there exists
very little documentation. In these scenarios, we recommend tapping into experts in other organizations which belong to the same industry vertical.
Business vocabulary, business processes and business entities tend to remain almost the same within an industry vertical, though specific business rules
could differ across organizations. Nevertheless, interaction with people in the same industry vertical provides a high-level understanding of the business
context, which can then be supplemented by other techniques to understand the application. In these days of social networking, reaching such experts
should not be a difficult task.
Record and Replay Application Execution to Map User Actions to Source Code
While the tools to record user actions and replay the script have been in use for a while, most of these tools are aimed at automating user interface
testing. A new breed of tools has emerged in the recent past that not only replays the user actions but also presents the execution of source code in
response to each user-initiated event. This unique capability to map a user action to the specific statements in source code that execute in response
helps in understanding the application much better. Many of the traditional debugging tools also provide this capability. Record the application
execution flow using either an application problem resolution tool or a debugging tool, create one recording for each use case, and use the recordings to
understand the application flow in the source code as each use case executes.
These recordings can be archived. During the knowledge transfer phase, a developer can replay the recording associated with a use case, quickly
understand the source code execution flow in context, and gain valuable insights into the application logic and the business rules.
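As a minimal sketch of the record-and-replay idea, Python's built-in `sys.settrace` hook can capture which source lines execute during one use case; commercial problem-resolution tools do this far more comprehensively. The `apply_discount` routine below is a hypothetical stand-in for a legacy use case.

```python
# Record the execution flow of one use case using Python's tracing hook,
# standing in for a commercial problem-resolution or debugging tool.

import sys

def record_use_case(func, *args, **kwargs):
    """Run one use case and record every (file, line, function) executed."""
    trace = []

    def tracer(frame, event, arg):
        if event == "line":
            code = frame.f_code
            trace.append((code.co_filename, frame.f_lineno, code.co_name))
        return tracer  # keep tracing nested calls line by line

    sys.settrace(tracer)
    try:
        result = func(*args, **kwargs)
    finally:
        sys.settrace(None)  # always restore the interpreter state
    return result, trace

# Hypothetical legacy routine used as the "use case" under study.
def apply_discount(amount, is_gold_customer):
    rate = 0.10 if is_gold_customer else 0.02  # embedded business rule
    return round(amount * (1 - rate), 2)

result, trace = record_use_case(apply_discount, 100.0, True)
```

Archiving one such `trace` per use case gives the receiving team exactly the replayable, code-level walkthrough described above: replaying it shows which business rule fired for which input.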
Leverage MDM and DQ Tools to Explore Entity Relationships and Understand the Business Context
Most legacy applications are heavy on data processing. Hence, understanding the application data schema and the underlying entities, relationships
and constraints is an important step in the knowledge transfer phase.
The growing focus on master data management (MDM), data quality (DQ) and data warehouse (DW) initiatives within IT organizations has led to a
number of products that specialize in automated data relationship discovery across different data sources. Use of these data relationship discovery tools
can provide visibility into relationships between various entities. Data relationship discovery tools scan the multiple data sources, databases and files to
expose the linkages between different entities within the same domain, computations and derived fields, correlations stored in cross-reference tables, and
arithmetic relationships between columns. While these tools are traditionally used in MDM / DQ / DW initiatives, using these tools in the context of
knowledge transfer helps to understand the underlying data model for a given application.
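A toy version of such relationship discovery can be sketched with a value-overlap heuristic: flag a candidate foreign-key link when most values in one column also appear in a column of another table. The `candidate_links` function, the sample tables and the threshold are illustrative assumptions; real MDM/DQ tools use far richer profiling.

```python
# Illustrative data-relationship discovery via value overlap between columns.
# Real tools also detect derived fields, cross-reference tables, and
# arithmetic relationships; this sketch covers only key linkage.

def candidate_links(tables, threshold=0.9):
    """tables: {table: {column: [values]}} -> [(source, target, overlap)]."""
    links = []
    for t1, cols1 in tables.items():
        for c1, vals1 in cols1.items():
            for t2, cols2 in tables.items():
                if t1 == t2:
                    continue
                for c2, vals2 in cols2.items():
                    target = set(vals2)
                    if not vals1 or not target:
                        continue
                    # fraction of source values found in the target column
                    overlap = sum(1 for v in vals1 if v in target) / len(vals1)
                    if overlap >= threshold:
                        links.append((f"{t1}.{c1}", f"{t2}.{c2}", overlap))
    return links

# Hypothetical extract from two legacy data stores.
tables = {
    "orders":    {"order_id": [1, 2, 3], "cust_id": [10, 11, 10]},
    "customers": {"cust_id": [10, 11, 12], "name": ["Ann", "Bob", "Cy"]},
}
links = candidate_links(tables)
```

Run against the sample data, the heuristic surfaces `orders.cust_id` → `customers.cust_id` as the likely relationship, which is the kind of entity linkage a receiving team needs to see early.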
Use Configuration Management Tools to Identify Hot Spots in the Application
Application source code undergoes many changes over its lifetime. These changes could be bug fixes or enhancements. However, the changes to
source code are not uniform across the application. The 80/20 rule applies to software issues as well: eighty percent of the issues in software are caused
by 20 percent of the software. This 20 percent contains the code that is either unstable or addresses a business domain that is changing
frequently. Also, only certain modules of the application are business critical or contain sensitive business rules. These are called hot spots in an
application. Configuration management tools capture the changes done to the software each time it is checked in. Configuration management tools help us
to identify the source code that is frequently modified and reveal the hot spots in the application. The extent to which these changes are revealed depends
upon how sophisticated the configuration management tools are. At a minimum it is possible to obtain the version number of each configuration item.
A program with a higher version number has been checked in frequently, which implies either that it has undergone a relatively large number of
changes or that its design is unstable. During the knowledge transfer phase, instead of trying to understand the entire application and consume the
valuable time and effort of experienced developers, it helps to prioritize knowledge transfer efforts on hot spots in the application and be well grounded on
the functioning of those modules. This ensures that the most important parts of the application are well understood and minimizes the risk for future
enhancements and bug fixes.
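The hot-spot idea reduces to counting how often each configuration item appears in the change history. The `hot_spots` helper and the sample history below are hypothetical; the same counts can be extracted from any configuration management tool that records check-ins per file.

```python
# Identify hot spots by counting check-ins per configuration item.
# The history list stands in for output such as a flattened change log
# from the organization's configuration management tool.

from collections import Counter

def hot_spots(changed_files, top_n=3):
    """changed_files: one path per check-in touching it -> top changed files."""
    return Counter(changed_files).most_common(top_n)

# Hypothetical change history extracted from the SCM tool.
history = (
    ["billing/rating.cbl"] * 42      # frequently modified: likely hot spot
    + ["billing/invoice.cbl"] * 17
    + ["reports/monthly.cbl"] * 3    # stable module
)
top = hot_spots(history)
```

The ranked list tells the receiving team where to concentrate the senior developers' limited time first, before touching the long tail of stable modules.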
Use Code Mining Tools to Understand Program Structure and Data Usage
Code mining tools analyze the source code and provide a report on cross references of data items, business rules, and the snippets of code that
perform CRUD operations on the data. Analyzing the output of code mining tools is more effective than manual study of source code. The output of code
mining tools helps us to understand the program context, the broad structure of the program, and the important data elements used by the program.
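A minimal illustration of code mining is a CRUD cross-reference built by scanning source text for embedded SQL. The regexes, sample snippets, and `crud_matrix` function are simplifying assumptions; commercial code mining tools parse the code properly rather than pattern-matching it.

```python
# Build a CRUD cross-reference of which program units touch which tables
# by scanning embedded SQL. A deliberately simple sketch of code mining.

import re

PATTERNS = [
    ("C", re.compile(r"\bINSERT\s+INTO\s+(\w+)", re.I)),
    ("R", re.compile(r"\bSELECT\b.*?\bFROM\s+(\w+)", re.I | re.S)),
    ("U", re.compile(r"\bUPDATE\s+(\w+)", re.I)),
    ("D", re.compile(r"\bDELETE\s+FROM\s+(\w+)", re.I)),
]

def crud_matrix(sources):
    """sources: {unit_name: source_text} -> {(unit, table): set of C/R/U/D}."""
    matrix = {}
    for unit, text in sources.items():
        for op, pattern in PATTERNS:
            for table in pattern.findall(text):
                matrix.setdefault((unit, table.lower()), set()).add(op)
    return matrix

# Hypothetical source snippets from two legacy program units.
sources = {
    "ORD001": "SELECT total FROM orders WHERE id = ?; UPDATE orders SET total = ?",
    "ORD002": "INSERT INTO audit_log VALUES (?, ?)",
}
matrix = crud_matrix(sources)
```

Even this crude matrix answers the first questions a receiving team asks of an unfamiliar program: which tables it reads, and which it is allowed to change.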
By leveraging these best practices and techniques, the inherent risk involved in knowledge transfer from a senior developer with 25 years of experience
to a 25-year-old junior developer can be minimized.
Ravi Karanam is the CTO for Application Services business at Unisys, and Cem Tanyel is the Vice President for Application Services business