The new SAP Test Data Migration Server (TDMS) is an automated solution for creating test systems that can be a real alternative to full-system copies. It can pare down the size of your test system in a variety of ways, the most popular of which is to copy only a time slice of your production data to populate your test system. This article gives you an idea of the capabilities of TDMS, provides a detailed walkthrough of how to use it to create a reduced-size test system, describes its technology and process types, and gives you some hints on how to get it running in your landscape. Learn how TDMS's new technology can save you time, money, and storage in your non-production systems and benefit your organization.
Key Concept
Test Data Migration Server (TDMS) replaces the client-specific data of a client in a non-production system with current test data. For the extraction of new test data from a production system, for example, TDMS uses various rules to lessen the volume of data. This enables you to include, for instance, only the previous three periods of transactional data in the test data refresh. In addition, all cross-client configurations, such as interfaces and repository objects, as well as all user-related data will remain as they are in the non-production system to avoid time-consuming adjustments after a refresh.
Today's production system databases are growing rapidly. This growth drives up not only hardware requirements, but also the cost and technical effort of providing non-production systems with current test data. Even if you can still perform full-system copies or remote client copies, the manual effort required to refresh such a system, including all of the post-processing steps, calls for an easier, more automated way to create test systems. TDMS addresses exactly these issues and has become an appealing product for providing current test data in non-production environments.
This article gives you an overview of TDMS, as well as a detailed walkthrough of how to use it to set up a test system with a reduced volume of data. I often receive feedback in training classes and implementation projects that the “getting started” portion of TDMS would be a lot easier if a walkthrough describing the first data transfer with TDMS were available.
As an introduction, I would first like to give you an idea of the capabilities of TDMS and how you might benefit from using it. After that, I’ll describe the technology and the process types of TDMS, as well as give you some hints on how to get TDMS running in your landscape. In the main part of this article, I’ll follow a TDMS instance for a time-based reduction of data volume in the non-production system, the process type chosen for most TDMS data transfers. (This process type transfers customizing, master, and application data; the other available process types are explained in this article, too.) This article is based on TDMS 3.0 Service Pack 8.
Prerequisites
Anyone reading this article should know what an SAP landscape looks like and what the challenges are when you need to provide non-production systems with current test data. Typically, interest in TDMS rises when you face high data volume in the production system and higher costs for data storage. In these cases, you don’t want to create full copies of the production system anymore, but you do want to populate your test system with production data via TDMS 3.0.
TDMS Exploration
From a high-level perspective, TDMS performs a remote client copy with two exceptions: The data transfer excludes or significantly lowers the amount of certain kinds of data (e.g., transactional data, application logs) to reduce data volume, and it doesn’t change certain areas in the receiver system, such as user-related data. This minimizes the post-processing effort required after data transfer. You can set up test users and authorizations once and then use TDMS to refresh the test data again and again without needing to re-enter the username and authorization. From a closer perspective, TDMS is a powerful tool that brings current data to a non- production environment for testing, decreases data volume, and reduces the manual effort required to create test systems.
In a typical three-system landscape (i.e., containing development, quality assurance, and production systems), a full-system copy of the production system replaces the quality assurance system on a regular basis (typically, once or twice a year). Custom developments occur on the development system and are quickly transported to the quality assurance system. All changes are routed through the quality assurance system before they are transported to the production system: You test the changes there, and only if the tests execute successfully are the changes released to the production system. Changes that fail the tests in the quality assurance system are not transported to the production system.
Note
A full-system copy of the production system to the development system would mean completely overwriting the old development system. It's very hard to identify and export only those developments and settings from the development system that you haven't yet transported into the production system; something is always missing afterward. At the same time, you want current test data in the development system to test the changes that have not yet been transported to the production system.
A full-system copy of the production system doesn’t typically replace the development system because the danger of losing developments and customizations is too high. Therefore, developers and testers need to create their own test data or find a way to manually transfer test data into the development system. Because the development system lacks current test data, it usually does not reflect what is going on in the production system.
The only test data available is what the developers create manually. As a result, you often don't discover issues until the changes reach the quality assurance system: You transport them there, test them, detect the issues, go back to the development system, correct the problems, and transport again. If current test data were available in the development system, most of those transports would not be required.
To enable a system for reasonable testing, you replace your quality assurance system with a full copy of the production system. These copies include system-specific settings, such as users, authorizations, logical system names, and interfaces to other systems that require manual effort to adjust. In addition, testers don’t need the total volume of production system data; the latest periods of transactional data usually suffice.
However, if you use full-system copies, the storage for the quality assurance system’s database must be large enough to contain the entire volume of data in the production system. Also, the system’s hardware must be powerful enough to provide reasonable performance when working on this large volume of data. Test systems need to provide response times that don’t frustrate their testers. If the servers aren’t large enough, it could take the testers an inordinate amount of time to complete their work. TDMS can help you resolve these issues. For example:
- You could create an additional client in the development system and then use TDMS to provide a small amount of current test data for it without needing to replace your whole system with a full-system copy and without affecting other clients.
- You could refresh the quality assurance system with a reduced set of current test data using TDMS. Users, authorizations, and interfaces, as well as cross-client customizing, wouldn’t change in the data transfer, so you wouldn’t need to perform any manual adjustments.
- Using TDMS Shellcreation to create the quality assurance system based on a production system export is also an option: You would get a small system shell that is basically a copy of the production system but without the data that’s in the SAP ERP application tables. TDMS would then transfer the omitted SAP ERP volume of data according to your test requirements. (For more information on TDMS Shellcreation, see Nils Krugmann’s article, “Build a Reduced-Size System Copy for Testing with TDMS Shellcreation.”)
Although TDMS provides all of these features, you need to consider other things as well. For example, the runtime of a data transfer is often longer than it would be to perform a full-system copy, and you may already have scripts that automate the post-processing steps. There are a lot of pros and cons to using TDMS. However, a discussion of these pros and cons exceeds the scope of this article, so make sure you also check out other documents and sources (see the sidebar “TDMS Sources” below).
TDMS Technology
TDMS is written entirely in ABAP and runs on top of SAP Basis. For sender and receiver systems, TDMS supports all available SAP ERP releases starting with SAP R/3 4.6C. The TDMS server (which equals the TDMS central system plus the TDMS control system) requires SAP Web Application Server (SAP Web AS) 6.20 or above as a base. The systems involved don't have to be pure ERP standard systems; industry solutions and add-ons installed on them won't prevent you from using TDMS.
TDMS Shellcreation has its own dependencies on operating systems and databases; TDMS itself has none. For all of its communications among the sender, TDMS server, and receiver systems (see the sidebar "System Roles with TDMS" below), TDMS uses Remote Function Calls (RFCs). It generates and manages the required RFC destinations, so you don't need to create them manually. To work with TDMS, you need to provide communications users in all of the clients that TDMS accesses via RFC.
- Sender: This is the system (i.e., the client) where the data you plan to transfer is currently located. This can be a production system or a non-production system that contains good-quality test data (e.g., a recent copy of the production system).
- TDMS server: Basically, you can use any non-production system as the TDMS server; however, running the TDMS server on the receiver system is not supported. A system becomes the TDMS server when you start using it as such (i.e., you open the TDMS Migration Server Overview and load a transfer package). The TDMS server includes both the TDMS central and control systems, so the related RFC destinations must always point to the TDMS server.
- Receiver: This is the system (i.e., the client) on which the test data is refreshed. It can be any non-production system, but it must provide a cross-client environment (customizing and repository) similar to that of the sender system.
All the operations that you need to perform a data transfer take place on the TDMS server. Other than software installation and some post-installation steps, you typically don’t need to perform any activities on the sender and receiver systems to transfer the test data using TDMS. To start, you analyze the sender and receiver systems and then configure the data transfer. TDMS analyzes the systems looking for everything that might be relevant for the data transfer: table structure differences in sender and receiver, tables that are missing in one of the systems, and tables that are empty on the sender system. Also, TDMS checks system settings to make sure they work for TDMS.
You can adjust the standard settings of the chosen process type, for example, by selecting individual tables that you don't want TDMS to transfer to the receiver or delete from it. For the data you do want to transfer, you can adjust the data selection rules. This configuration is the basis for the generated code that reads data from the sender system and writes it to the receiver system. The TDMS server uses these objects (generated based on the configuration) to control the data transfer and to change selected data if you configured it to do so (e.g., if you changed logical system names).
For test data consistency, TDMS deletes all client-specific data from the receiver system when each data transfer starts. This prevents duplicate key errors when TDMS writes to the receiver tables. TDMS works in a purely client-specific fashion, so it only deletes data from and writes it to client-specific tables.
The data transfer process is divided into two parts: data selection and the actual data transfer. During data selection, TDMS reads the data to be transferred from the sender system and stores it in the sender-cluster table (i.e., a TDMS table where the system temporarily stores data before transferring it to the receiver system). TDMS then extracts the data from the sender-cluster table and sends it through the TDMS server to the receiver system.
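To picture this two-part process, consider the following ABAP sketch. It illustrates only the select-then-buffer-then-transfer idea and is not actual TDMS code; the demo table Z00HEAD (which also appears later in this article), the INDX-like cluster table Z00CLU, and the key PACK90001 are assumptions.

REPORT z_tdms_two_phase_sketch.
" Illustration only: buffer a time slice of Z00HEAD in an INDX-like
" cluster table Z00CLU (TDMS itself uses DMC_INDXCL), then read it
" back in a second, independent step.
DATA lt_head TYPE STANDARD TABLE OF z00head.

" Part 1 - data selection: read the time slice from the sender table
" and store it in the cluster table
SELECT * FROM z00head INTO TABLE lt_head
  WHERE datum >= '20080101'.
EXPORT head = lt_head TO DATABASE z00clu(td) ID 'PACK90001'.

" Part 2 - data transfer: read from the cluster table only; the
" application tables are no longer touched
CLEAR lt_head.
IMPORT head = lt_head FROM DATABASE z00clu(td) ID 'PACK90001'.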
A general rule for TDMS is that it won't touch any cross-client data; TDMS is purely client-specific. Therefore, all objects from the ABAP repository (which contains the data dictionary [DDIC] and development objects), as well as any cross-client customizing, remain unchanged in the receiver system. In addition, TDMS preserves user and authorization data in the receiver system, and it won't transfer tables containing data that typically doesn't need to be copied to test environments, such as change documents. The production system requires change documents to track changes; test systems, however, usually don't need the change documents created in the production system. (For further information, see SAP Note 1159279 – Objects that are not transferred with TDMS.)
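Whether TDMS considers a table at all thus depends on its client-dependency, which is recorded in the ABAP Dictionary. The following sketch (an illustration, not TDMS code) shows how you could check the flag yourself for the demo table Z00HEAD:

REPORT z_client_specific_check.
" Illustration only: check the client-dependency flag CLIDEP in the
" ABAP Dictionary table DD02L for the demo table Z00HEAD
DATA lv_clidep TYPE dd02l-clidep.

SELECT SINGLE clidep FROM dd02l INTO lv_clidep
  WHERE tabname  = 'Z00HEAD'
    AND as4local = 'A'.              " active version
IF sy-subrc = 0 AND lv_clidep = 'X'.
  WRITE: / 'Z00HEAD is client-specific: TDMS would process it.'.
ELSE.
  WRITE: / 'Z00HEAD is cross-client: TDMS would leave it unchanged.'.
ENDIF.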
Assign Process Types for TDMS
When you use TDMS to transfer data, the first step is always to create a new package on the Migration Server Overview (see the sidebar “Migration Server Overview” below). A package is an instance of a data transfer that contains a process tree with all required steps and configuration options, as well as all monitoring and logging information. After packages finish executing, they remain on the TDMS server for documentation purposes and as a source for package copies (i.e., you create a package either by copying it from the standard template in client 000 or by copying from a previously completed package).
You call the Migration Server Overview with transaction CNV_MBT_TDMS; by default, it shows all of the objects with which your user is involved. You can toggle the overview to display all objects using the button in the upper-left corner or by choosing Project > All objects from the Migration Server Overview.
To create a new project, subproject, or package, choose the related node in the tree structure on the Migration Server Overview:
- Select a Projects, Subprojects, or Packages node on the Migration Server Overview.
- Choose Create from the project/subproject/package menu, double-click the selected node, or right-click it and click the Create button.
- Begin all project and subproject IDs with the letter Z. Package IDs are assigned automatically; they all begin with the number 9.
- Choose the Description text for the project, subproject, or package individually.
SAP recommends that users create separate projects for each sender system and subprojects for each combination of sender and receiver systems. Experience shows that it’s the easiest way to clearly structure the data transfer packages.
To create a new package, you need to select a process type.
Figure 1 shows you the process types that come with TDMS (add-ons DMIS and DMIS_CNT).
Figure 1: Choose a TDMS process type
- ERP Shell Creation Package for SAP Release 4.6C and ERP Shell Creation Package for SAP Release higher 4.6C each perform a TDMS Shellcreation.
- ERP Package for Client Deletion enables you to run the standard TDMS deletion process for the receiver system decoupled from a data transfer (when you execute the activities of a regular TDMS package, the old receiver data is always deleted before the new data is transferred). Note that a package created with this process type cannot completely delete a client: The client itself, its user data, and some customizing remain in the system. You can use this process type, for example, to remove the data of a test client that is no longer in use; you can refresh this client later with current test data using TDMS without adding the runtime of the data-deletion steps to the data transfer package.
- ERP Initial Package for Master Data and Customizing copies client-specific customizing and master data to a receiver system. TDMS comes with the classification information of numerous tables and contains steps to classify additional tables in the sender system, including customer tables. These steps will handle all tables that are not already classified. Once TDMS has classified all of the tables, you can manually adjust this configuration. In the end, TDMS only transfers tables marked as customizing or master data, and it transfers every selected table completely. By excluding all tables with transactional data, this process type typically only copies a small amount of data to the receiver system. You may want to use this process type for test clients in development systems.
- ERP Initial Package for Time-Based Reduction is the most frequently used process type. Packages created with this process type basically copy all client-specific data from the sender system to the receiver system, but they select only a certain period of time from the transactional data tables. For example, you might elect to copy the data from the previous three time periods. TDMS then completely transfers the data of client-specific customizing and master data tables. Transactional data is either transferred completely (smaller tables and tables with no suitable date-related information), or TDMS selects just those records related to the previous three periods (tables with huge data volumes). To improve the consistency of test data, TDMS contains programs to resolve references of data objects within the transfer period to objects outside the period; this enables TDMS to include these additional objects in the data transfer.
- ERP Initial Package for Time-Based and Company Code Reduction is identical to the previous process type except that it has an additional configuration option to select company codes for the data transfer. For example, if a project requires test data only from a certain company code, you can configure TDMS with a package of this process type to transfer the data of the previous three periods from the specific company code. Again, these reduction rules apply to large transactional data tables. TDMS transfers all other client-specific data completely.
Install and Prepare for TDMS
When you decide to roll out TDMS in a system landscape, there are certain things you need to handle before you move forward. One especially important task is system preparation; you need to have the add-ons, user roles, SAP Notes, and system parameters completely in place, or TDMS just won’t work.
System Considerations
The receiver system must meet the release requirements mentioned above, and it cannot be a production system. Typically, the sender system is either a production system or a recent copy of one (e.g., the quality assurance system). TDMS puts some load on the sender system during data selection. Ideally, you can lock the sender system during data selection, but if you can’t, you should choose a time with little system activity to run the data selection for a data transfer with TDMS.
In general, the receiver system should be as similar to the sender system as possible, with the same cross-client data (the repository of an ABAP system and cross-client customizing) — identical, if possible. TDMS transfers only client-specific data. If the data environment in the receiver system differs too much from that in the sender system, the data won’t fit (e.g., if a business process relies on data in table extensions that do not exist in the receiver system), and TDMS won’t work properly. A system created with TDMS Shellcreation provides a perfect receiver system environment, as does a full-system copy of the sender system. You should also think about the load coming from TDMS during data deletion and data transfer; for example, if you’re using the system in parallel to the Transport Management System (TMS) or some other important operations, can you accept the additional load that TDMS will put on the system? The effect can be especially severe if you decide to use an existing multi-client system as a receiver system.
Basically, you can use any system based on SAP Web AS 6.20 or above as the TDMS server. SAP doesn't support running the TDMS server on the same system as the receiver system, because the data deletion on the receiver client could then affect the TDMS server itself. A common scenario is to use a separate client on, for instance, the SAP Solution Manager system as your TDMS server. Basically, you just need some SAP system on which you can install the TDMS software. SAP then recommends that you create a separate client and use it as the TDMS server; this clearly separates access to the TDMS server from the rest of your SAP system.
Software Installation
The TDMS installation procedure is the same for all systems involved (i.e., sender, receiver, and central server systems). Be sure that you have the latest version of the TDMS software and the same TDMS software level installed on all applicable systems when you prepare for a new data transfer; TDMS programs of different versions are not compatible, and mixing them leads to unforeseen issues.
To install TDMS:
- Import the add-ons DMIS 2006 (SAP Note 970531 – Installation and Delta Upgrade of DMIS 2006) and DMIS_CNT 2006 (SAP Note 970532 – Installation of DMIS_CNT 2006) into your systems. The TDMS add-ons don’t modify other software components.
- Import the latest available support package for the DMIS and DMIS_CNT add-ons to your systems.
- See SAP Note 1003051 – TDMS 3.0 corrections – Composite SAP Note to get the current list of necessary corrections to TDMS. To ensure a smooth TDMS operation, import all of the notes mentioned in SAP Note 1003051 into your systems. These notes contain corrections for TDMS code and customizing and are, therefore, mandatory to install. If you don’t install them, you will run into dozens of known issues you could have easily avoided.
System Preparation
To use TDMS on your systems, follow these steps:
- Log on to the relevant client systems: sender, central, and receiver.
- Generate the authorization profiles for the SAP_TDMS_USER role and the profile for the SAP_TDMS_MASTER role in the central client.
- Create communications users in all three clients, and assign the role SAP_TDMS_USER to these new users. Users of type communication are meant for system-to-system logons, such as the RFC connections that TDMS uses; you cannot use them for dialog logons.
- Assign the SAP_TDMS_MASTER role in the central client to the dialog user (i.e., a user that you can use to log on to a system) that will operate TDMS.
- Make sure that while you're using TDMS, the SAP system parameter rdisp/max_wprun_time is set to 1800 or more. This parameter defines the timeout for dialog work processes in an SAP system, measured in seconds. TDMS executes some steps via RFCs in dialog work processes, so with a lower value you might run into work-process timeouts. (A sketch for checking the current value follows this list.)
- Adjust the database parameters and other SAP system parameters according to SAP Note 890797 – SAP TDMS – required and recommended system settings.
- On the receiver system, reduce database logging and SAP table logging to a minimum, especially during data deletion and data transfer, for performance reasons. Database logging keeps a record of all of the changes that occur in a database, which enables you to restore the data if something happens to the system (e.g., a system crash); the SAP system itself logs all changes to customizing tables for documentation purposes. When you use TDMS, you delete and transfer a huge volume of data. If logging is still active in the receiver system, it completely logs the deletion and then completely records the new data as the system writes it to the database, which slows TDMS down significantly.
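You can check the current value of rdisp/max_wprun_time in transaction RZ11. If you prefer to check it programmatically, the following sketch uses the kernel call C_SAPGPARAM, which standard SAP parameter reports also use; treat it as an illustration under that assumption, not as an official API:

REPORT z_check_wprun_time.
" Illustration only: read rdisp/max_wprun_time via the kernel call
" C_SAPGPARAM and compare it with the recommended minimum of 1800
DATA: lv_name(60)  TYPE c VALUE 'rdisp/max_wprun_time',
      lv_value(60) TYPE c,
      lv_seconds   TYPE i.

CALL 'C_SAPGPARAM' ID 'NAME'  FIELD lv_name
                   ID 'VALUE' FIELD lv_value.
lv_seconds = lv_value.                 " the parameter value is numeric
IF lv_seconds < 1800.
  WRITE: / 'Parameter is below the recommended 1800 seconds.'.
ELSE.
  WRITE: / 'rdisp/max_wprun_time =', lv_seconds.
ENDIF.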
The next section provides a step-by-step guide to executing a data transfer with a time-based reduction of data.
Execute Time-Based Reduction
First, you need to create a project and subproject in the Migration Server Overview (it gives you a view of all of the projects, subprojects, and packages in your TDMS server and access to them).
- Open the Migration Server Overview and double-click the Projects node.
- Enter a project ID and a description text. Figure 2 shows the Migration Server Overview after the creation of the project Z_DEMO.
Figure 2: Project creation using the Migration Server Overview screen
Expand your new project's node and double-click the tree node in which you want to create a sub-element. Then, enter the ID and description, and expand the new subproject node. Double-click the packages node, and choose the ERP Initial Package for Time-Based Reduction process type (Figure 1). Enter a description, and the system loads the package (see the sidebar "Package Load Process" below).
Package Load Process
The TDMS control tables in client 000 store the standard configuration for all TDMS process types and packages. Process types define what the package looks like and how it is created; SAP also uses the term "process type" to refer to the different packages themselves. Because you never work in client 000 of an SAP system, the required control information must be copied to the client on which you are logged on: When you load a package using a certain process type, TDMS copies the data from client 000 to the current client and assigns a package number to this set of data. This is then "your" new package, in which you can change the configuration and execute the data transfer; it is also where you find your package number.
In client 000, the package IDs are the technical names of the process types, such as TDMS time-based reduction (TDTIM) and TDMS master data and customizing (TDMDC). You should not modify the standard configuration settings of TDMS in client 000: Importing a new TDMS support package might simply overwrite the old standard settings there, losing all of your manual changes.
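The principle behind the package load is a keyed copy between clients. The sketch below illustrates it with a hypothetical control table ZTDMS_CTRL (key fields MANDT and PACKID) and the assumed package number 90001; the real TDMS control tables and keys differ:

REPORT z_package_load_sketch.
" Illustration only: copy the control data of process type TDTIM from
" client 000 to the logon client under a new package number
DATA lt_ctrl TYPE STANDARD TABLE OF ztdms_ctrl.
FIELD-SYMBOLS <ls_ctrl> TYPE ztdms_ctrl.

SELECT * FROM ztdms_ctrl CLIENT SPECIFIED INTO TABLE lt_ctrl
  WHERE mandt  = '000'
    AND packid = 'TDTIM'.

" Re-key the rows for the current client and the new package number
LOOP AT lt_ctrl ASSIGNING <ls_ctrl>.
  <ls_ctrl>-mandt  = sy-mandt.
  <ls_ctrl>-packid = '90001'.
ENDLOOP.

INSERT ztdms_ctrl CLIENT SPECIFIED FROM TABLE lt_ctrl.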
Phase 1: Package Settings
After TDMS loads the package, it directs you to your new package's process tree (see the sidebar "The Process Tree" below). In a new package, the Define RFC Destinations activity shows a red traffic light in the State column, as shown in Figure 3; the status remains red until you define the destinations, and it also turns red if your tests of the RFC destinations are unsuccessful.
Figure 3: View of the process tree of a package
The Process Tree
The process tree contains all of the activities required to configure and run a data transfer with TDMS. Activities are structured into phases that you need to execute sequentially. In general, you also need to run the activities sequentially (with some exceptions where parallel execution is possible). Only when all of a phase’s activities have executed successfully can you continue with the activities of the next phase. Once you start an activity in a new phase, you cannot jump back to the activities of the previous phase. These rules ensure that you run the activities in the correct sequence.
To execute an activity in the process tree, select the activity text and press F8, or click the Execute button to the left of the button row. You can execute all activities marked with the Execute symbol to the left of the activity text in the process tree. To get more information about any activity, select its title and click the Documentation button. Some activities have an information symbol next to their title. You can’t execute these activities; they are merely placeholders for documentation.
Using the magnifier button or by selecting Settings > Change Process Tree View, you can switch to an extended view of the process tree. Here, you can expand all collective activities to access the individual activities and their logs on lower hierarchy levels. In addition, the extended view shows some optional activities that you don’t need for a standard data transfer with TDMS (e.g., data-scrambling). However, you should always work from the standard view and switch to the extended view only when it’s necessary (e.g., to solve issues with activities on lower levels of the process tree).
Execute the Define RFC Destinations activity to access the screens that are used to maintain the RFC destinations. Expand the Maintenance area in the middle of the screen, and click the Definition tab, as shown in Figure 4. Activate all checkboxes and enter host names, system numbers, log-on languages, client IDs, communication usernames, and passwords.
Figure 4: Define the RFC destinations
RFC destinations always point to clients on SAP systems; TDMS simply refers to the sender, receiver, central, and control systems rather than to individual clients. Figure 5 shows you how these different systems interact with one another. Note that the control system and the central system should always be identical; they are separated into two layers for technical reasons only. Click the Apply button on the upper left of the Definition tab to apply the technical information for your RFC destinations to your TDMS package configuration. TDMS then generates RFC destinations on the TDMS server for every involved client: destinations that point to itself (control and central) and to the remote clients (sender and receiver). Of all possible combinations, the only ones missing are direct connections between the sender and receiver systems.
Figure 5: Interactions among TDMS systems
Click the Synchronization tab, and click the Synchronize button, as shown in Figure 6. In this step, TDMS uses the recently generated RFC destinations to log on to the sender and receiver systems and runs programs there to generate further RFC destinations. Afterward, the sender and receiver systems have RFC destinations that point to themselves and back to the TDMS server.
Figure 6: Synchronize the RFC destinations
If at least one of your systems is an SAP NetWeaver Application Server (SAP NetWeaver AS) 7.0 or higher system, TDMS shows an additional screen during synchronization. You need to decide whether to enter the communications user passwords again and have TDMS transfer them unencrypted through the network to the remote systems, or manually log on to the sender and receiver systems and enter the passwords there (see the sidebar “Troubleshooting RFC Destinations” below).
Troubleshooting RFC Destinations
The naming convention for all RFC destinations created by TDMS is MBT_&lt;package number&gt;_&lt;role&gt;, where &lt;role&gt; can be CEN (central), PCL (control), RCV (receiver), or SND (sender).
If the definition or synchronization of RFC destinations fails, check this list of common issues:
- You have not assigned user role SAP_TDMS_USER or SAP_TDMS_USER_EXT.
- You have not generated an authorization profile for the assigned role.
- The user is not of type communication.
- The user is locked.
- The password contains lowercase or special characters or is longer than eight characters.
Make sure that you have activated the checkbox on the line (Figure 4) of the Definition tab when you apply new settings (e.g., to correct a wrong entry such as username, password, IP address, and so on). Also, double-check any password fields before clicking the Apply button because they may have been reset.
If required, you can manually create and maintain RFC destinations using the Get Destination and transaction SM59 (RFC Destinations [Display/Maintain]) buttons in the Maintenance area (Figure 4).
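Once a destination has been created, a quick connection test tells you whether logon data and network settings are correct. The sketch below pings a destination with the standard function module RFC_PING; the destination name MBT_90001_SND is only an assumed example following the naming convention above:

REPORT z_rfc_ping_sketch.
" Illustration only: test a TDMS-generated RFC destination
DATA lv_msg(80) TYPE c.

CALL FUNCTION 'RFC_PING' DESTINATION 'MBT_90001_SND'
  EXCEPTIONS
    communication_failure = 1 MESSAGE lv_msg
    system_failure        = 2 MESSAGE lv_msg.
IF sy-subrc <> 0.
  WRITE: / 'Connection test failed:', lv_msg.
ELSE.
  WRITE: / 'Destination MBT_90001_SND is working.'.
ENDIF.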
The next activity on the process tree, as shown in Figure 7, checks whether the sender and receiver systems have the same system ID; if they don't, TDMS adds the activities required for the related data conversion to the Package Settings phase. The phase also contains two optional activities:
- Exclude HR Objects from Transfer (Optional), as shown in Figure 3, lets you exclude Human Resources (HR) tables from the data transfer, either to create a test system that doesn't contain sensitive HR data or to copy these tables separately.
- Specify Drop/No Drop of SAP Office tables (Optional) allows you to configure TDMS to delete SAP Office data from the receiver system. SAP Office, which includes transaction SBWP (SAP Business Workplace), is used for workflows and communications via email, fax, and so on. Since SAP Office data is user-related, the default setting is to keep it. However, if it's not required for the test system, deleting it saves additional space in the receiver database.
Figure 7: Determine whether sender and receiver IDs are the same
A standard post-processing step after a system or client copy is to convert the logical system names via transaction BDLS (Conversion of Logical System Names). Using the Assign Logical System Names in Receiver System TDMS activity, you can define a mapping table for the logical system names that you want to convert during the data transfer (typically from the names of the sender system to the names used in the receiver system), as shown in Figure 8.
Figure 8: Convert logical system names
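In effect, the mapping drives an update of every client-specific table field that holds a logical system name. The sketch below shows the idea for a single field of the demo table Z00HEAD; the field LOGSYS and the names P01CLNT100 and Q01CLNT200 are assumptions:

REPORT z_logsys_conversion_sketch.
" Illustration only: convert one logical system name in one table, as
" the mapping of this activity (or transaction BDLS) does for all
" affected tables and fields
UPDATE z00head SET logsys = 'Q01CLNT200'
  WHERE logsys = 'P01CLNT100'.
WRITE: / sy-dbcnt, 'records converted in Z00HEAD.'.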
The last activity on the standard view of the process tree is Start Programs for Package Settings. It is a collective starter for multiple activities that don’t require user interaction, such as entering values, answering questions on pop-ups, and so on. All of the information required to run the Phase 1 activities on TDMS is already available in the control tables.
Phase 2: System Analysis
Now that Phase 1 is complete, expand the System Analysis (Active Phase) node, as shown in Figure 9. This phase contains activities to configure the data transfer, generate required objects, and perform a simulation of the data transfer.
Figure 9: System analysis phase
Execute the Analyze Table Sizes activity, which collects information about the sizes of the tables in the sender system. This helps you decide whether the data transfer requires additional configuration for the next steps; for example, when you have a list of the biggest tables, you can decide whether you want to exclude some of them from the data transfer to reduce the volume of data.
The Select Tables for Reduction, Assign Selection Groups, and Analyze and Specify ‘From Date’ activities begin with a pop-up dialog in which you can determine whether you want to execute the functions on these screens in Change or View mode. If you execute them in Change mode, the system forces you to execute any dependent activities again. So, if you just want to look at the settings on these screens, you should choose the View mode.
For the Select Tables for Reduction activity, you can choose the SAP standard settings or your customized settings, as shown in Figure 10.
Figure 10: Use standard or custom table reduction settings
If you click the SAP Standard button, you just return to the process tree, and then you can proceed with the next activities. Clicking the Customize button leads you to a screen that displays all tables that TDMS transfers completely. If you can’t find a particular table, be aware that TDMS filters the list to show only transparent tables (i.e., they look the same in SAP and the database) that are larger than 1024KB and client-specific; remember, TDMS doesn’t handle cross-client data.
Using the top button row in Figure 11, you can choose the displayed table content: SAP Tables, All Tables, Customer Tables, and Unreduced Tables. If you want to change the transfer behavior of a certain table, activate its checkbox in the left column under Reduce and save your settings. This enables you to use the dependent Assign Selection Groups activity to define exactly what TDMS needs to do with the table.
Figure 11: Select tables for reduction
Back in the process tree for the Identify Selection Groups activity, TDMS analyzes all of the tables that you chose to reduce to determine whether you can apply available reduction rules. For example, if you have selected a table for reduction and it contains a date field, TDMS will propose to decrease the data in that table by date.
The Assign Selection Groups activity makes the final determination on how TDMS should transfer each table. TDMS shows you the tables that have been selected for reduction and how it will transfer them: A yellow triangle on the right symbolizes the full transfer of a table, while a green light stands for a reduced transfer. Double-click a line in the upper table to see its available reduction methods in the lower table, as well as the fields to which you can apply them.
In the lower table, you can use the checkboxes to determine which selection group (i.e., reduction rule) to use and which table fields it uses. In Figure 12, TDMS uses the selection group G_DATUM on field Z00HEAD-DATUM to reduce the table Z00HEAD. This selection group ensures that TDMS only transfers those records where Z00HEAD-DATUM is greater than or equal to the chosen From Date.
Figure 12: Assign selection groups
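The WHERE condition that such a selection group effectively produces is straightforward. The following sketch illustrates the reduced selection for the table from Figure 12; it is an illustration, not the code that TDMS generates:

REPORT z_selection_group_sketch.
" Illustration only: the effect of selection group G_DATUM on field
" Z00HEAD-DATUM for a given From Date
PARAMETERS p_fromdt TYPE d DEFAULT '20080101'.   " the From Date

DATA lt_head TYPE STANDARD TABLE OF z00head.

" Reduced transfer: only records within the time slice are selected
SELECT * FROM z00head INTO TABLE lt_head
  WHERE datum >= p_fromdt.
WRITE: / sy-dbcnt, 'records of Z00HEAD fall into the time slice.'.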
If TDMS can apply at least one selection group to one of your tables, it automatically assigns and activates the selection group. Therefore, make sure you carefully check all additional selected tables to determine whether TDMS has chosen the selection groups and table fields that make sense for your data. For example, if a table contains multiple date fields with different meanings, you need to make sure that the selection group is attached to the correct date field.
If you have selected a table for reduction in the previous activity, you can now configure that table for either full transfer or exclusion from the data transfer. Activate the No Transfer checkbox or the Full Transfer checkbox in the upper table to define your configuration.
In the Assign Selection Groups activity, you can choose the content you want to display. When you switch to All Tables, you can change the configuration for the tables that TDMS reduces by default, just as you can change it for the tables in the Customer Tables view. Because TDMS also has some more complex reduction methods, some tables can only be configured together, while some tables cannot be configured at all. In Figure 13, table NAST (i.e., a message status database table) has been excluded from the data transfer; TDMS would typically have transferred it in a reduced form. When you are done with the configuration, save your changes and return to the process tree.
Figure 13: Change to a standard configuration
The Analyze and Specify 'From Date' activity lets you enter the start date of your test data selection. Enter the date and press F8 to proceed, and then use one of the Choose as From Date buttons to save the selected date (Figure 14).
Figure 14: Choose the From Date
The activity also analyzes the sender system's organizational structures by reading the sender system's customizing, which allows you to evaluate the date in more detail. The more fiscal-year periods in the sender system that start on your selected date, the better the rating TDMS gives it. This rating visualizes how well the selected From Date corresponds with the start dates of the periods in the sender system.
As a final step in the system analysis phase, you need to use the De-/Activate Size Prediction for Receiver System activity to determine whether you want to simulate the data transfer to estimate how much data it will create in the receiver system. If you activate this simulation, TDMS performs the complete test-data selection without writing the data anywhere, which typically takes just as long as the data selection for the real data transfer (at least a couple of hours). From my experience, I recommend running this simulation whenever you use TDMS on a sender client for the first time. Besides estimating the data volume, you get runtime figures for the data selection activities. More importantly, if TDMS has problems during data selection, you'll find out about them in the simulation run; then, you can fix them before the real data selection and data transfer take place.
The Start Programs for Generation and Receiver Settings activity is a collective starter for all activities that generate the required ABAP objects for the data transfer: objects used for data selection, for the transfer itself, and for writing the data to the receiver system, as well as objects used to delete the data in the receiver client before the transfer starts. When you run this activity, TDMS also executes the size prediction, if (and only if) you switched it on. If the size prediction is activated, you can display its results with the Display Data Transfer Volume (Optional) activity. The display includes the sizes of the table indexes in the receiver system and, therefore, provides an estimated projection of the data growth in your database. Note that this is still just an estimate; it isn't 100% exact. (See SAP Note 894307 – TDMS: Tips, tricks, general problems, error-tracing for further information on the data-volume calculation.)
Finally, the Confirm Settings activity completes the system analysis phase with a status check of all mandatory activities. If some of these activities are not yet complete, TDMS will give you an error here. You need to execute all of the mandatory activities of the system analysis phase before you can successfully run Confirm Settings and proceed to the next phase.
Phase 3: Preparations for Data Transfer
Figure 15 shows the activities in the Preparations in the Systems for Data Transfer (Active Phase).
Figure 15: Preparations for data transfer phase
To improve consistency and avoid errors while writing the test data, TDMS deletes all client-specific data from the receiver client before it executes the data transfer. Obviously, you shouldn’t use the receiver client after this deletion. You can either manually keep the users away from your receiver client or use TDMS to lock them until your package has finished post-processing.
You can use the optional Lock users in receiver system activity to lock all users in the receiver client with TDMS. Technically, TDMS backs up the current lock status and then locks all users. Only DDIC, SAP*, and the TDMS communications user remain unchanged. When TDMS unlocks the users in the post-processing phase, it restores the original lock status.
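The lock-and-restore logic can be pictured as follows. This sketch backs up the lock flags from table USR02 and locks the remaining users with the standard function module BAPI_USER_LOCK; the communications user name Z_TDMS_COMM and the backup key are assumptions, and the actual TDMS activity differs in detail:

REPORT z_lock_users_sketch.
" Illustration only: back up the current lock status (USR02-UFLAG)
" and lock all users except DDIC, SAP*, and the communications user
DATA: lt_usr02  TYPE STANDARD TABLE OF usr02,
      ls_usr02  TYPE usr02,
      lt_return TYPE STANDARD TABLE OF bapiret2.

SELECT * FROM usr02 INTO TABLE lt_usr02
  WHERE bname NOT IN ('DDIC', 'SAP*', 'Z_TDMS_COMM').

" Keep the old lock flags so the original status can be restored later
EXPORT usr02 = lt_usr02 TO DATABASE indx(zl) ID 'TDMS_LOCK_BACKUP'.

LOOP AT lt_usr02 INTO ls_usr02.
  CALL FUNCTION 'BAPI_USER_LOCK'
    EXPORTING
      username = ls_usr02-bname
    TABLES
      return   = lt_return.
ENDLOOP.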
Based on the data volume distribution in the receiver system, TDMS automatically determines the fastest deletion method (a sketch contrasting the two methods follows this list). Available methods are:
- Array-Delete: The data is deleted from the receiver client using normal SQL statements. TDMS chooses this option when only a small amount of data needs to be deleted.
- Drop-Insert: If a high volume of data is to be deleted, TDMS will choose this method. TDMS then drops (deletes) and recreates the whole table. This action affects the data of all clients in the specific table. To preserve the data of clients other than the receiver client, TDMS temporarily copies their data into a shadow table and restores it to the table after TDMS has recreated it.
See also SAP Note 1054180 – TDMS 3.0: Change deletion scenario for a table.
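The difference between the two methods can be sketched for the demo table Z00HEAD and an assumed receiver client 200 as follows; the drop-and-recreate step of Drop-Insert happens on the database/DDIC level and is only indicated here:

REPORT z_tdms_deletion_sketch.
" Illustration only: the two deletion approaches for a client-specific
" table
DATA lt_other TYPE STANDARD TABLE OF z00head.

" Array-Delete: a normal Open SQL delete, fine for small volumes
DELETE FROM z00head CLIENT SPECIFIED WHERE mandt = '200'.

" Drop-Insert: for large volumes, preserve the other clients' rows ...
SELECT * FROM z00head CLIENT SPECIFIED INTO TABLE lt_other
  WHERE mandt <> '200'.
" ... then drop and recreate Z00HEAD via DDIC utilities (not shown)
" and restore the preserved rows:
INSERT z00head CLIENT SPECIFIED FROM TABLE lt_other.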
The Start Programs for Data Deletion in Receiver System collective starter activity runs other activities that delete the receiver system data, prepare the sender system for data selection and data transfer, and perform some final consistency checks on the generated objects. For example, if table structures have changed since the objects were generated, TDMS would need to regenerate the related objects.
Phase 4: Data Transfer
The Create Cluster in Sender System activity sets up the TDMS sender-cluster table. In a database with tablespaces, the activity moves the table DMC_INDXCL into the tablespace that has the most available space. TDMS temporarily stores all of the data selected in this table before it transfers the table to the receiver (see the section “Data Selection” below). Check the logs of this activity after execution to see where TDMS put the table and manually correct the location in the sender system, if required.
Figure 16 shows the activities of the data transfer phase.
Figure 16: Data transfer phase
Now, you have completed all steps necessary to prepare for data selection and data transfer. You can directly proceed to run the data selection or wait for a particular state in your sender system (e.g., when the system load decreases after finishing scheduled batch jobs).
Data Selection
The data selection process consists of the following activities: Fill header tables (Collective Exec.), Start Data Selection for Header Tables, and Start Data Selection. When all three activities have finished and all of the data to be transferred is stored in the sender-cluster table, the data selection is complete. From then on, TDMS no longer accesses any SAP ERP application table in the sender system; it reads the selected data directly from the sender-cluster table.
You begin data selection with the collective activity Fill header tables (Collective Exec.), which initiates a lot of activities on a lower level of the process tree to fill the TDMS internal-header tables in the sender system. You don't need to initiate the Start Data Selection for Header Tables activity manually: As soon as one of the programs started in the sender system finishes execution, it triggers this activity automatically. Start Data Selection for Header Tables enables you to monitor the data-selection progress of all conversion objects containing TDMS internal-header tables. When it is complete, run the Start Data Selection activity, which selects the data for the remaining conversion objects (those without TDMS internal-header tables).
Data Transfer
Once the data selection is complete, the Start Data Transfer activity begins reading the selected data from the sender-cluster table and writing the data to the SAP ERP tables in the receiver client. The data is routed through the memory of the central system, but it’s not written to the central system’s database. If you have assigned any data conversion rules, they execute now on the central system during the data transfer of the relevant conversion objects.
As the final activity of the data transfer phase, the Check consistency of data load activity verifies that all conversion objects have successfully completed data transfer.
Phase 5: Postprocessing
The final phase of the process tree is postprocessing, as shown in Figure 17.
Figure 17: Postprocessing phase
The collective starter, Postprocessing – background processing, runs various activities that transfer the number ranges, reset the temporary data, adjust the users' address and SAP Office data, and finalize the conversion of logical system names.
The Unlock Users in Receiver System activity restores the original status of every user before TDMS locked them. The Refresh Data Selection Cluster of Current Transfer activity deletes all of the current package’s data from the sender-cluster table. If the table doesn’t contain data from any other packages, TDMS just drops it and recreates it.
As the final activity, Complete Package deactivates your package. You can still access all logging information as documentation of the data transfer, but you can’t execute any more activities.
When you release your new test system to the users, make sure you tell them about the reduced amount of data. Also, collect any test-data issues that users report so that you can adjust the TDMS configuration for the subsequent data transfer.
New Process Types
Now, you should have some insight into TDMS and its time-based reduction process type. In addition to the process types mentioned, some others might also prove interesting for your test-data requirements: TDMS Business Process Library (with add-on Data Migration Server Extension [DMIS_EXT]) is already available and allows for an object-based transfer of data and the creation of custom scenarios with a drag-‘n’-drop UI. TDMS for SAP Customer Relationship Management (SAP CRM) and TDMS for SAP NetWeaver Business Intelligence (SAP NetWeaver BI) have just been released as well. Last, but not least, TDMS for SAP ERP Human Capital Management (SAP ERP HCM) addresses the special requirements of HR test systems, regarding system access, authorizations, and test-data scrambling.
Manfred Gonschor
Manfred Gonschor works as a consultant for ENERGY4U GmbH. He graduated from the Berufsakademie Mannheim, Baden-Württemberg, Germany, with studies in business information technologies. He was part of the System Landscape Optimization (SLO) consulting team at SAP Germany from 2005 to 2008 with a focus on SLO Analytic Services and TDMS. In the past few years, Manfred ran numerous TDMS implementation projects and conducted official training classes around the globe.
You may contact the author at manfred.gonschor@energy4u.org.