There are several drivers that compel an IT organization to spend effort on knowledge transfer of legacy applications. As a maturing workforce of Baby Boomers retires, organizations hand over the maintenance of these assets to an incoming workforce of Gen Y employees. As organizations strive to cut costs and outsource the maintenance of legacy applications to IT vendors, the mostly young engineers from those vendors need to understand the applications before they begin support activities. Mergers and acquisitions result in the consolidation of IT systems for seamless functioning of the new organization, and application consolidation requires a good understanding of the existing applications so that embedded business rules and concepts are not lost in the process.

In all these scenarios, organizations face the challenge of effective knowledge transfer from experts who have maintained these assets for more than 25 years to young engineers. Most legacy applications contain business rules and practices developed over many years and are poorly documented. Adding to the problem, most of these applications are mission critical, and organizations cannot afford disruption to the business due to inadequate knowledge transfer.

The traditional techniques of knowledge transfer, such as interviewing and going through source code and documentation, are proving ineffective. It is not possible to capture the rationale behind the many implementation decisions programmers have made over the life of an application through interviewing alone. Going through the source code of an entire application that has undergone years of enhancements is very expensive, and few organizations can fund such an initiative in these difficult times. It is also likely that the documentation of a legacy application is out of sync with the executable code.
Since knowledge transfer adds cost to existing IT budgets, IT organizations are under pressure to keep that cost to a minimum.

Use Parametric Estimation for Pragmatic Transition Timelines

We have often observed that transition timelines are arrived at based on cost and/or schedule constraints. We recommend instead that transition timelines be estimated from parameters such as application complexity, code quality as measured by the application's maintainability index, the process maturity of the IT organization, the business criticality of the application, and the type and number of its users. Creating an organization-specific estimation model that takes these parameters into account helps attain a realistic estimate for knowledge transfer.

Upfront and Honest Communication Helps Secure Cooperation During Knowledge Transfer

Most of the business knowledge related to legacy applications remains in tacit form. For effective knowledge transfer, it is important to secure the cooperation of the senior programmers who must pass on this knowledge. Yet most organizations discount this human factor and mandate that senior programmers transfer knowledge without clearly articulating the business rationale behind the transition initiative or providing any clarity on their future role. Upfront, honest communication of the business rationale for the transition, together with a well-defined plan to absorb senior programmers at the end of it, goes a long way toward motivating them to share their tacit knowledge without any sense of insecurity or hostility.

Guided Hands-On Learning Helps to Gain a Firm Understanding of the Application Nuances

No amount of reading documentation or interviewing senior developers will capture the important operational details related to application maintenance or reveal the implementation assumptions that were made.
We recommend that, in addition to traditional techniques, teams design mock exercises or simulated maintenance enhancements and give junior developers the opportunity to work on them under the guidance of senior programmers. This prepares the receiving team to understand the operational procedures and conventions followed in maintaining the legacy applications.

In most cases, the decision about who will take over the responsibilities is made by others, which can put the two people at odds with each other. If the experienced person plays an active role in identifying and recruiting the junior person, the chances of success increase. We therefore recommend that, where possible, senior developers be actively involved in identifying and recruiting the recipients of the knowledge transfer.

A Picture is Worth a Thousand Words

It is well known that visual models communicate much better than voluminous textual documentation, and model-driven development is an accepted practice for modern application development. But what do we do with legacy applications that were not developed using visual models? It is never too late to start: junior developers can create visual models during the knowledge transfer phase and play them back to senior developers to confirm their understanding of the application. Reverse engineering tools can in some cases be leveraged to create visual models from the implementation.

Seek Knowledge from Experts from Other Organizations Within the Same Industry Vertical

We often encounter situations where the original developers of an application are no longer with the organization and very little documentation exists. In these scenarios, we recommend tapping into experts at other organizations in the same industry vertical.
Business vocabulary, business processes and business entities tend to remain much the same within an industry vertical, even though specific business rules may differ across organizations. Interaction with people in the same vertical therefore provides a high-level understanding of the business context, which can then be supplemented by other techniques to understand the application. In these days of social networking, this should not be an insurmountable challenge.

Leverage Application Problem Resolution Tools to Map Source Code to User Actions

While tools that record user actions and replay the script have been in use for a while, most of them are aimed at automating user interface testing. A new breed of tools has emerged in the recent past that not only replays user actions but also presents the execution of the source code in response to each user-initiated event. This ability to map a user action to the specific statements executed in response to it helps in understanding the application much better; many traditional debugging tools provide the same capability. Record the application execution flow using either an application problem resolution tool or a debugging tool, create one recording per use case, and use the recordings to understand the application flow through the source code as each use case executes. These recordings can be archived: during the knowledge transfer phase, a developer can replay the recording associated with a use case, quickly follow the source code execution flow in context, and gain valuable insight into the application logic and business rules.

Leverage MDM and DQ Tools to Explore Entity Relationships and Understand the Business Context

Most legacy applications are heavy on data processing.
Hence, understanding the application data schema and the underlying entities, relationships and constraints is an important step in the knowledge transfer phase. The growing focus on master data management (MDM), data quality (DQ) and data warehouse (DW) initiatives within IT organizations has led to a number of products that specialize in automated data relationship discovery across different data sources. These tools scan multiple data sources, databases and files to expose the linkages between entities within the same domain, computations and derived fields, correlations stored in cross-reference tables, and arithmetic relationships between columns, providing visibility into how the various entities relate. While these tools are traditionally used in MDM/DQ/DW initiatives, using them in the context of knowledge transfer helps in understanding the underlying data model of a given application.

Leverage SCM Tools to Identify the Application Hot Spots and Prioritize Deep Dives

Application source code undergoes many changes over its lifetime, whether bug fixes or enhancements, but those changes are not spread uniformly across the application. The 80/20 rule applies to software issues as well: eighty percent of the issues are caused by twenty percent of the software, the code that is either unstable or addresses a frequently changing business domain. Likewise, only certain modules of an application are business critical or contain sensitive business rules. These are the hot spots of an application. Configuration management tools capture the changes made to the software each time it is checked in; they can thus identify the source code that is frequently modified and reveal the hot spots in the application.
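Where the legacy code base is kept under a modern version control system, this hot-spot analysis can even be automated with a short script. The sketch below is a minimal illustration, assuming the application lives in a local git repository; the repository path and any file names are purely illustrative.

```python
# Rank files by change frequency to surface application "hot spots",
# treating the git commit history as the configuration-management record.
import subprocess
from collections import Counter

def count_changes(git_log_output):
    """Count how often each file path appears in `git log --name-only` output."""
    return Counter(line.strip()
                   for line in git_log_output.splitlines()
                   if line.strip())

def hot_spots(repo_path=".", top_n=10):
    """Return the top_n most frequently changed files in a git repository."""
    log = subprocess.run(
        ["git", "-C", repo_path, "log", "--name-only", "--pretty=format:"],
        capture_output=True, text=True, check=True,
    ).stdout
    return count_changes(log).most_common(top_n)

# Usage (from within a checkout of the legacy application):
#   for path, changes in hot_spots():
#       print(f"{changes:5d}  {path}")
```

The files listed first are the natural candidates for prioritized knowledge-transfer deep dives; commercial configuration management tools provide richer views, but even this simple frequency count identifies where the change activity concentrates.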
The extent to which these changes are revealed depends on how sophisticated the configuration management tools are; at a minimum, it is possible to obtain the version number of each configuration item. A program with a high version number has been checked in frequently, which implies either that it has undergone a relatively large number of changes or that its design is extremely unstable. During the knowledge transfer phase, instead of trying to understand the entire application and consuming the valuable time and effort of experienced developers, it helps to focus knowledge transfer efforts on the hot spots and become well grounded in the functioning of those modules. This ensures that the most important parts of the application are well understood and minimizes the risk for future enhancements and bug fixes.

Leverage Code Mining Tools to Understand the Program Structure and Control Flow

Code mining tools analyze the source code and report on cross-references among data items, business rules, and the snippets of code that perform CRUD operations on the data. Analyzing the output of a code mining tool is more effective than a manual study of the source code; it helps us understand the program context, the broad structure of the program, and the important data elements it uses.

By leveraging these best practices and techniques, the inherent risk involved in knowledge transfer from a 25-year experienced senior developer to a 25-year-old young developer can be minimized.

Ravi Karanam is the CTO for the Application Services business at Unisys, and Cem Tanyel is the Vice President for the Application Services business at Unisys.